Northwestern Medicine and Google use AI to improve lung cancer detection
May 22, 2019
A new study from Google and Northwestern Medicine found that a deep-learning model, a type of artificial intelligence, spotted cancerous lung nodules in CT scans more accurately than expert radiologists, opening the door to earlier detection and treatment of lung cancer.
The deep-learning model learned to identify the disease by training on large numbers of past CT images, using Google’s computing power to improve its accuracy over time. The system confirmed evidence of cancer more quickly and produced fewer false positives and false negatives than its human counterparts.
Lung cancer resulted in an estimated 160,000 deaths in the United States in 2018, making it the leading cause of cancer-related death. Shravya Shetty, a technical lead at Google, said lung cancer’s impact makes the discovery especially consequential.
“This area of research is incredibly important, as lung cancer has the highest rate of mortality among all cancers, and there are many challenges in the way of broad adoption of lung cancer screening,” Shetty said in a University release. “Our work examines ways AI can be used to improve the accuracy and optimize the screening process, in ways that could help with the implementation of screening programs.”
The deep-learning system analyzes a patient’s current CT scan alongside prior scans to assess the growth rate of suspicious lung nodules. It also performs regional analysis of the lung tissue to identify regions of interest.
“Radiologists generally examine hundreds of two-dimensional images or ‘slices’ in a single CT scan, but this new machine learning system views the lungs in a huge, single three-dimensional image,” said study co-author Mozziyar Etemadi, a professor of anesthesiology and engineering, in the release. “AI in 3D can be much more sensitive in its ability to detect early lung cancer than the human eye looking at 2-D images. This is technically ‘4D’ because it is not only looking at one CT scan, but two (the current and prior scan) over time.”
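For readers curious what a 3D, two-time-point model looks like in practice, here is a minimal, hypothetical sketch in PyTorch. It is not the Google/Northwestern system; the architecture, input shapes, and names below are illustrative assumptions, showing only the general idea of feeding the current and prior CT volumes into a single 3D convolutional network that outputs a malignancy-risk score.

```python
# Hypothetical sketch only: a tiny 3D CNN that takes two CT volumes
# (current and prior scan) and produces a single malignancy-risk score.
# Shapes, layer sizes, and names are illustrative assumptions, not the
# architecture used in the Google/Northwestern study.
import torch
import torch.nn as nn

class TinyLungModel(nn.Module):
    """Toy 3D CNN that stacks the current and prior CT volumes as two channels."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(2, 8, kernel_size=3, padding=1),   # 2 input channels: current + prior scan
            nn.ReLU(),
            nn.MaxPool3d(2),                              # downsample the 3D volume
            nn.Conv3d(8, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),                      # global 3D average pooling
        )
        self.classifier = nn.Linear(16, 1)                # single risk logit

    def forward(self, current_scan, prior_scan):
        # Stack the two scans along the channel dimension: (batch, 2, depth, H, W)
        volume = torch.stack([current_scan, prior_scan], dim=1)
        feats = self.features(volume).flatten(1)
        return torch.sigmoid(self.classifier(feats))      # risk score in [0, 1]

# Example: two synthetic 64-slice CT volumes of 128x128 pixels
current = torch.randn(1, 64, 128, 128)
prior = torch.randn(1, 64, 128, 128)
model = TinyLungModel()
print(model(current, prior))
```

Stacking the two scans as input channels is one simple way to give a network the temporal context Etemadi describes; the actual system’s scan registration, nodule detection and risk modeling are far more involved.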
Northwestern researchers provided 2,763 de-identified CT scans to Google, where the company’s scientists applied the deep-learning model. Etemadi said it took his research team more than a year to compile and prepare the data.
The chance to combine Northwestern’s data with Google’s computing power produced a result with significant potential, Etemadi said in the release.
“The ability to collaborate with world-class scientists at Google, using their unprecedented computing capabilities to create something with the potential to save tens of thousands of lives a year is truly a privilege,” he said.
Twitter: @birenbomb