AI shown to harvest information hidden in chest X-rays

Gradient-weighted class activation map (Grad-CAM) of the anatomy contributing to the CXR-risk score. The left panel shows the Grad-CAM and the right panel shows the chest radiograph of a man in his 60s from the Prostate, Lung, Colorectal, and Ovarian (PLCO) trial who died of respiratory illness within 2 years. Grad-CAM highlights an enlarged heart with prominent pulmonary vasculature indicating pulmonary edema (very-high-risk CXR-risk score). Image courtesy of JAMA Network Open, from Lu MT et al. JAMA Netw Open. 2019;2:e197416. doi:10.1001/jamanetworkopen.2019.7416.

The most frequently performed imaging exam in medicine, the chest X-ray, holds ‘hidden’ prognostic information that can be harvested with artificial intelligence (AI), according to a recently published study by scientists at Massachusetts General Hospital (MGH) (Lu MT et al. Deep Learning to Assess Long-term Mortality From Chest Radiographs. JAMA Netw Open. 2019;2:e197416. doi:10.1001/jamanetworkopen.2019.7416).

Most chest radiographs are reported as normal, in that they rule out a specific diagnosis such as pneumonia. However, even normal radiographs may show minor abnormalities, such as aortic calcification or an enlarged heart, that could provide a new window into prognosis and longevity, with the potential to inform decisions about lifestyle, screening, and prevention. Whereas physicians may interpret thousands of chest radiographs during a career, they rarely learn the outcomes of these patients a decade later. It is therefore difficult for them to develop an intuition for which features carry long-term prognostic value.

AI has already driven major advances in medicine; for example, several groups have applied it to automate the diagnosis of chest X-rays for the detection of pneumonia and tuberculosis.

“If AI technology can already make such diagnoses,” asked radiologist Dr M Lu, “could it also identify people at high risk for future heart attack, lung cancer, or death?”

Lu, who is director of research for the MGH Division of Cardiovascular Imaging, and his colleagues developed a convolutional neural network, known as CXR-risk, to analyze visual information. CXR-risk was trained by having the network analyze more than 85,000 chest X-rays from 42,000 subjects who took part in an earlier clinical trial. Each image was paired with a key piece of data: did the person die over the following 12-year period? The goal was for CXR-risk to learn the features, or combinations of features, on a chest X-ray image that best predict health and mortality.
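The pairing of each image with a 12-year mortality outcome can be sketched as follows. This is an illustrative reconstruction, not the authors' actual code; the function name `label_12yr_mortality` and the date fields are hypothetical.

```python
from datetime import date, timedelta

def label_12yr_mortality(exam_date, death_date, horizon_years=12):
    """Return 1 if the subject died within `horizon_years` of the exam, else 0.

    `death_date` is None for subjects known to be alive at follow-up.
    The cutoff uses an average year length (365.25 days) as a simple
    approximation of the follow-up horizon.
    """
    if death_date is None:
        return 0
    cutoff = exam_date + timedelta(days=round(horizon_years * 365.25))
    return 1 if death_date <= cutoff else 0

# Example: a death 7 years after the exam falls inside the 12-year horizon.
print(label_12yr_mortality(date(1993, 5, 1), date(2000, 3, 15)))  # 1
```

Each radiograph, tagged with such a binary label, would then serve as one training example for the network.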

Next, Lu and colleagues tested CXR-risk using chest X-rays from 16,000 patients in two earlier clinical trials. They found that 53% of the people whom the neural network identified as “very high risk” died within 12 years, compared with fewer than 4% of those it labeled as “very low risk.” The study found that CXR-risk provided information that predicts long-term mortality independent of radiologists’ readings of the X-rays and of other factors, such as age and smoking status.
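The kind of stratified result reported above, observed mortality per risk category, can be sketched as follows. This is an illustrative reconstruction with made-up data, not the study's actual analysis; the helper `mortality_by_category` is hypothetical, and only the category labels mirror those quoted in the text.

```python
from collections import defaultdict

def mortality_by_category(records):
    """records: iterable of (category, died) pairs, where died is 1 or 0.

    Returns a dict mapping each risk category to its observed
    mortality rate (deaths / total) over the follow-up period.
    """
    totals = defaultdict(int)
    deaths = defaultdict(int)
    for category, died in records:
        totals[category] += 1
        deaths[category] += died
    return {c: deaths[c] / totals[c] for c in totals}

# Toy example with fabricated outcomes, just to show the computation:
records = [("very high", 1), ("very high", 1), ("very high", 0),
           ("very low", 0), ("very low", 0), ("very low", 1)]
rates = mortality_by_category(records)
```

In the study itself, such rates came out at 53% for the “very high risk” group versus under 4% for the “very low risk” group.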

Lu believes this new tool will be even more accurate when combined with other risk factors, such as genetics and smoking status. Early identification of at-risk patients could result in more patients taking part in preventive and treatment programs. “This is a new way to extract prognostic information from everyday diagnostic tests,” says Lu. “It’s information that’s already there that we’re not using, that could improve people’s health.”