BeI

Member
Dec 9, 2017
6,035
On the other side of the AI spectrum:

www.nbcnews.com

Promising new AI can detect early signs of lung cancer that doctors can't see

The tool, Sybil, looks for signs of where cancer is likely to turn up so doctors can spot it as early as possible.

The new AI tool, called Sybil, was developed by scientists at the Mass General Cancer Center and the Massachusetts Institute of Technology in Cambridge. In one study, it was shown to accurately predict whether a person will develop lung cancer in the next year 86% to 94% of the time.
The tool, experts say, could be a leap forward in the early detection of lung cancer, the third most common cancer in the United States, according to the CDC. The disease is the leading cause of cancer death, according to the American Cancer Society, which estimates that this year there will be more than 238,000 new cases of lung cancer and more than 127,000 deaths.

But early detection is difficult, Sandler said. Since the lungs can't be seen or felt, the only way to spot the cancer early is with a CT scan. By the time symptoms appear, including persistent coughing or trouble breathing, the cancer is usually advanced and the most difficult to treat.

Past research has shown that screening with low-dose CT scans can reduce the risk of death from lung cancer by 24%, because they can help detect cancer sooner, when it's more treatable.

But an AI tool could potentially increase the rates of early detection of lung cancer — and potentially increase survival rates as well, Sandler said.
There have been cases, Fintelmann added, where Sybil has detected signs of cancers that radiologists did not detect until nodules were visible on a CT scan years later.

Fintelmann said he sees a future in which the AI tool is helping radiologists make important treatment decisions — not replacing radiologists altogether.

"The future of radiology is going to be AI-assisted," he said. "You will still need a radiologist to identify where the cancer is, identify the best possible treatment and actually do the treatment."

There were a lot of quotable bits, but the short version is that this is a promising new tool (still in clinical trials) that could potentially join the many AI tools already FDA-approved for radiology. The sooner cancer can be detected and treated the better, and it seems kinda crazy that a tool like this could spot signs of cancer potentially over a year before a doctor could in a scan.

It's also good that they touch on the idea of "AI-assisted" work instead of replacement, because in the medical / health field in particular, there still seems to be a ton of room for growth in how AI-assisted work can improve public health.

This particular article is new, but I think it's based on a story from January, although either way I didn't quite see a thread for it.
 

Midramble

Force of Habit
The Fallen
Oct 25, 2017
10,490
San Francisco
Oct 27, 2017
1,151
Finland
There are actually a lot of companies working on AI solutions to help find different types of cancer, and hospitals around the world have been and are adopting these solutions. It's really exciting, and as a person working at one of those companies, it feels good to be part of that.

There's this in the article in the OP too:
AI, however, is still far from perfect.

One issue that Madabhushi, of Emory University, said he worries about is the type of data being used to train AI.

"A lot of data that comes from medical institutions or clinical trials do not represent the diversity of our country," he said, adding that he believes the AI tools are not being developed in a way that is tailored to help Black and brown people.

The scientists who developed Sybil have acknowledged that the data used to create the AI tool does not include "sufficient Black or Hispanic patients to have confidence in broad applicability yet."

The FDA has already taken a step to address this issue, announcing last year that it would soon require researchers and companies seeking approval for medical products to submit a plan that ensures diversity in clinical trials.

"We have to make sure that AI does not reflect or capture our biases," Madabhushi said.
At least my company is very aware of it too, and we genuinely want to do our best to ensure that our tools work for diverse sets of people. As the quote points out, that's not always easy, though, since good diverse data is hard to come by.
 
Oct 26, 2017
3,532
It looks like this tool shows promise, but the fact that they don't fully understand how it achieves its accuracy is a bit concerning, and the lack of diversity in test subjects is an issue as well. With proper time and study, I'm sure they can work out those problems, though.
 

trashbandit

Member
Dec 19, 2019
3,912
This is probably the gold standard for positive use of AI: an augmentative tool that eases a previously manual or error-prone task.
 

gozu

Banned
Oct 27, 2017
10,442
America
AI's power for good is vast, and this is a perfect example... as is its power for evil. It's our job to make laws and regulations that help ensure more equitable access (such as the FDA rule mentioned above), as well as punish bad actors who profit off other artists' work without permission, or who create convincing propaganda and fake news, illegal deepfakes, etc.

Asking people to do new tech or science research without AI today is akin to asking them to do it without computers or electricity.