Illinois CS professors David Forsyth and Sanmi Koyejo worked with Ayis Pyrros, MD, of DuPage Medical Group to broaden the knowledge extracted from X-rays, producing more accurate modeling of COVID-19 cases.
Dr. Pyrros, a radiologist at DuPage Medical Group, knew Forsyth only by reputation as a leading expert in artificial intelligence. The lack of an existing relationship didn't dampen his excitement about forming a collaborative research effort, though. His goal was to guide medical imaging in a new direction, one that could offset a few growing trends in this country’s healthcare system.
Gaps in healthcare – especially inconsistencies in electronic medical record keeping and difficulties in sharing patient data – impact a provider’s ability to diagnose and treat conditions efficiently. They also affect a patient’s ability to access needed care.
As COVID-19 spread this past year, the pandemic shed more light on these gaps, increasing the need for creative solutions.
Three years later, that “shot in the dark” call has become what Dr. Pyrros now likes to call serendipity. A quick call back from Forsyth led to an introduction to fellow Illinois CS professor Sanmi Koyejo. Since then, the group has worked together to use AI and machine learning to bolster efforts in biomedical imaging.
Dr. Pyrros recently presented the group’s latest project, titled “Predicting Comorbidities Associated with COVID-19 Admissions from Frontal Radiographs Utilizing a Multipart Neural Network Classifier,” at the Radiological Society of North America’s (RSNA) 2020 Annual Meeting.
Additional collaborators included Nasir Siddiqui, MD, of DuPage Medical Group, professor Alexander Schwing of Illinois ECE and Adam Flanders, MD, of Thomas Jefferson University Hospital. The group also credits Illinois CS graduate students Patrick Cole and Andrew Chen.
The tool they have created uses AI to evaluate an X-ray and account for six health variables. Typically, Dr. Pyrros said, radiologists focus on the lungs. Their program accounts for diabetes and other chronic conditions to more accurately predict the severity of a patient’s COVID-19 case.
“I wanted to help any provider pick up an X-ray and have a greater chance of noticing a patient at high risk for heart disease or diabetes complications,” Dr. Pyrros said. “The program we built accounts for the heart and soft tissue areas, primarily. Initially, this work was designed to help providers more accurately diagnose chronic conditions and reduce treatment costs.
“As COVID-19 spread, we adapted the model to address this patient population.”
When Koyejo first joined Dr. Pyrros on this project, he began reviewing COVID-19 patient data with graduate students. They wanted to first understand the way comorbidities influenced varying demographics.
As they plugged this data into their model, they found discrepancies between the outcomes predicted from the patient information and those predicted by their AI. Their first assumption was that the model had made a mistake. This is typical, according to Koyejo, as AI models usually need refinement to adjust for biases inherent in the data.
But further inspection revealed that the actual problem stemmed from missed diagnoses. In a reversal, Koyejo and his students realized that the AI was detecting discrepancies in diagnosis.
“The reporting we found indicated a bias that appeared to be, thus far at least, based on English versus non-English speaking patients,” Koyejo said. “In these instances, it can be harder to diagnose properly. Thus, the correct information might not make it onto the medical record. It was the AI associated with these X-rays that helped us uncover outcomes that would not have been as obvious if we only looked at medical records.”
Specifically, the AI reads an X-ray and accounts for six medical variables:
- Diabetes with chronic complications
- Obesity, or a BMI over 40
- Congestive heart failure, or an enlarged heart on the X-ray
- Cardiac arrhythmia, such as atrial fibrillation
- Vascular disease, including atherosclerosis, calcifications, plaques, etc.
- Lung disease, such as COPD
The group also trained the model to predict age, gender, and a risk adjustment factor (RAF), which provides an overall score.
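As a rough illustration of how a multipart classifier of this kind is typically structured, the sketch below uses a shared feature representation feeding separate output heads: one sigmoid head per binary comorbidity, plus heads for age, gender, and an overall RAF score. This is a hypothetical toy with made-up layer sizes and untrained random weights, not the team's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# The six binary conditions the article describes (labels are illustrative).
CONDITIONS = [
    "diabetes_with_complications", "obesity", "congestive_heart_failure",
    "cardiac_arrhythmia", "vascular_disease", "lung_disease",
]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MultipartClassifier:
    """Toy multipart model: one shared layer over image features, then
    separate heads for each comorbidity plus age, gender, and RAF."""

    def __init__(self, n_features=256, n_hidden=64):
        # All weights are random stand-ins; a real model would be trained.
        self.W_shared = rng.normal(scale=0.1, size=(n_features, n_hidden))
        self.W_cond = rng.normal(scale=0.1, size=(n_hidden, len(CONDITIONS)))
        self.w_age = rng.normal(scale=0.1, size=n_hidden)     # regression head
        self.w_gender = rng.normal(scale=0.1, size=n_hidden)  # sigmoid head
        self.w_raf = rng.normal(scale=0.1, size=n_hidden)     # regression head

    def predict(self, image_features):
        h = np.tanh(image_features @ self.W_shared)  # shared representation
        return {
            "conditions": dict(zip(CONDITIONS, sigmoid(h @ self.W_cond))),
            "age": float(h @ self.w_age),
            "p_female": float(sigmoid(h @ self.w_gender)),
            "raf": float(h @ self.w_raf),
        }

model = MultipartClassifier()
x = rng.normal(size=256)  # stand-in for CNN features of a frontal radiograph
out = model.predict(x)
```

Because each comorbidity gets its own sigmoid head rather than a single softmax, the model can flag several conditions at once on the same X-ray, which matches the multi-comorbidity prediction the project describes.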
Their results combined the forecast from the medical record information with the AI model’s prediction, yielding the more accurate overall prediction.
“So often radiologists are looking for very specific things, so we don’t typically serve as the inspiration for expanding what we view,” Dr. Pyrros said. “It was exciting to think bigger picture and to work with people like David, Sanmi and Alex who are so tremendously helpful.
“There is a reason for the ranking Illinois Computer Science earns. The students are amazing, while the faculty push them and provide opportunity that motivates.”
To further validate the model, the group began applying it to a patient population from the University of Illinois Chicago. Thus far, findings have again shown promising results in more accurately predicting the severity of COVID-19 cases.
Dr. Pyrros said the effort wouldn’t have been possible without the cooperation of the Illinois CS work group, as well as his own health system. Koyejo, meanwhile, credited the collaborative spirit of the work group including faculty colleagues and students, as well as collaborators from DuPage Medical Group and Edward-Elmhurst Institutional Review Board (IRB).
“The idea is to support the decision making of experts, and to help prevent any potential blind spots,” Koyejo said. “The machine has its own blind spots, of course, but our hope is that this combination produces a better, more consistent approach.
“Ayis deserves huge kudos for recognizing this need. Maybe our work has something to do with serendipity, but I think there is a bit more to it than that.”