Using AI to Detect Cancer, Not Just Cats


SHAOKANG WANG AND his startup, Infervision, build algorithms that read X-ray images and identify early signs of lung cancer. The company's technology, Wang says, is already running inside four of the largest hospitals in China. Two are merely running tests, but according to Wang, the two others—Shanghai Changzheng and Tongji, both in Shanghai—are installing the technology across their operations. "It's installed on every doctor's machine," he says.
 
To what extent these doctors are actually using the technology is another question. In the world of health care, artificial intelligence is still in its infancy. But the idea is spreading.
 
At two hospitals in India, Google is now testing technology that can identify signs of diabetic blindness in eye scans. And just last week, the data science competition site Kaggle announced the winners of a $1 million contest in which more than 10,000 researchers competed to build machine learning models that could detect lung cancer from CT scans. The winning algorithms will feed into work at the National Cancer Institute aimed at more rapidly and effectively diagnosing lung cancer, the leading cancer killer in the US among both men and women. "We want to take these solutions further," says Keyvan Farahani, a program director at the institute.
 
Deploying such AI on a large scale—across hospitals, for instance—is still enormously difficult, says Dr. George Shih, a physician and professor at Weill Cornell Graduate School of Medical Sciences and co-founder of MD.ai, a company that participated in the Kaggle contest. Aggregating all the necessary data is complicated enough, not to mention the difficulty of plugging the technology into existing systems and day-to-day operations. But Shih believes that today's best algorithms are already accurate enough to drive commercial products. "We're probably only a few years away from more massive deployments," he says.
 
The rise of these systems is powered by the rise of deep neural networks, complex mathematical systems that can learn tasks on their own by analyzing vast amounts of data. This is an old idea, dating back to the 1950s, but now that operations like Google and Facebook have access to such enormous amounts of data and computing power, neural networks can achieve far more than they could in the past. Among other things, they can accurately recognize faces and objects in photos. And they can identify signs of disease and illness in medical scans.
 
Just as a neural network can identify a cat in a snapshot of your living room, it can identify tiny aneurysms in eye scans or pinpoint nodules in CT scans of the lungs. Basically, after analyzing thousands of images that contain such nodules, it can learn to identify them on its own. Through the Kaggle contest, run in tandem with the tech-minded consultancy Booz Allen, thousands of data scientists competed to build the most accurate neural networks for the task.
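To make that concrete, here is a minimal sketch of the kind of training loop involved, written in PyTorch. Nothing in it comes from the winning Kaggle entries: the tiny NoduleNet model, the 32-voxel patches, and the random stand-in data are illustrative assumptions, meant only to show how labeled examples drive the learning.

```python
# Minimal sketch of training a small 3D CNN to flag lung nodules in CT patches.
# Illustrative only: NoduleNet, the patch size, and the random "scans" stand in
# for the kind of labeled data the Kaggle teams actually worked with.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

class NoduleNet(nn.Module):
    """Tiny 3D convolutional classifier: scan patch in, nodule score out."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
        )
        self.classifier = nn.Linear(16 * 8 * 8 * 8, 1)  # 32^3 patch pooled twice -> 8^3

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))  # raw logit; sigmoid is applied by the loss

# Dummy dataset: 64 random 32x32x32 patches, half labeled "contains a nodule".
patches = torch.randn(64, 1, 32, 32, 32)
labels = torch.cat([torch.ones(32), torch.zeros(32)]).unsqueeze(1)
loader = DataLoader(TensorDataset(patches, labels), batch_size=8, shuffle=True)

model = NoduleNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(3):  # real entries trained far longer, on real CT data
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

In a real system, those random tensors would be replaced by patches cut from actual CT scans, with the labels supplied by radiologists.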
 
Before a neural network can start learning the task from a collection of images, trained doctors must label them—that is, use their human intelligence and knowledge to identify the images that show signs of lung cancer. But once that's done, building these systems is more computer science than medicine. Case in point: The winners of the Kaggle prize—Liao Fangzhou and Zhe Li, two researchers at Tsinghua University in China—have no formal medical training.
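As a rough sketch of that hand-off, the snippet below assumes a doctor-produced label table (the scan IDs and the cancer flags are invented for illustration) and shows the purely computational step that follows: splitting the labeled scans into training and validation sets.

```python
# Sketch of the hand-off from radiologists to data scientists. The CSV text
# below is a made-up stand-in for the labels doctors produce (a scan ID plus a
# cancer flag); once it exists, the rest is plain computer science.
import csv
import io
import random

doctor_labels = io.StringIO(
    "scan_id,cancer\n"
    "scan_001,1\n"
    "scan_002,0\n"
    "scan_003,0\n"
    "scan_004,1\n"
)
rows = [(r["scan_id"], int(r["cancer"])) for r in csv.DictReader(doctor_labels)]

random.seed(0)
random.shuffle(rows)
split = max(1, int(0.8 * len(rows)))
train, val = rows[:split], rows[split:]
print(f"{len(train)} labeled scans for training, {len(val)} held out for validation")
```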
 
Physician's Assistant
 
Still, these AI technologies won't completely replace trained doctors. "This is still only a small part of what radiologists or doctors do," Shih says. "There are dozens of other pathologies that we are still responsible for." Instead, new AI systems will examine scans faster and with greater accuracy, flagging potential problems before doctors explore a patient's situation in more detail. Ideally, these AI assistants will reduce health care costs, since screenings demand so much of doctors' time, and doctors sometimes make mistakes.
 
According to Shih and others, doctors don't make many false negative diagnoses—failing to identify signs of cancer in a scan. But false positives are a problem. Hospitals often end up spending time and money tracking the progress of patients who don't need such close care. "The issue with lung cancer screening is that it's very expensive," Shih says. "The big goal is: How do you minimize that?"
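The trade-off Shih describes can be shown with a toy calculation. The scores and ground-truth labels below are made up; the point is only that raising the threshold at which a model flags a scan cuts false positives, the costly follow-ups, while risking missed cancers.

```python
# Made-up illustration of the false-positive trade-off: a higher decision
# threshold on a model's scores means fewer needless follow-ups but more
# missed cancers. None of these numbers come from a real model.
scores = [0.05, 0.10, 0.35, 0.40, 0.55, 0.62, 0.80, 0.91]   # hypothetical model outputs
truth  = [0,    0,    0,    1,    0,    1,    1,    1   ]   # 1 = confirmed cancer

for threshold in (0.3, 0.5, 0.7):
    flagged = [s >= threshold for s in scores]
    false_pos = sum(f and not t for f, t in zip(flagged, truth))
    false_neg = sum((not f) and t for f, t in zip(flagged, truth))
    print(f"threshold {threshold}: {false_pos} false positives, {false_neg} missed cancers")
```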
 
Shih's company aims to build services for collecting and labeling data that researchers and companies can then use to train neural networks, not just for cancer detection but for many other tasks as well. He acknowledges that this kind of AI is only just getting started. But he believes it will fundamentally change the field of health care, particularly in the developing world, where trained doctors aren't as prevalent. Over the next few years, he says, researchers aren't likely to build an AI that's better at detecting lung cancer than the very best doctors. But if machines can top the performance of even some of them, that could change the way hospitals operate, one scan at a time.