Stanford researchers have developed an algorithm that can spot 14 types of disease among hundreds of chest X-rays in a matter of seconds.
“Usually, we see AI algorithms that can detect a brain hemorrhage or a wrist fracture — a very narrow scope for single-use cases,” said Matthew Lungren, MD, assistant professor of radiology. “But here we’re talking about 14 different pathologies analyzed simultaneously, and it’s all through one algorithm.”
The algorithm, CheXNeXt, is the first to simultaneously evaluate X-rays for a multitude of maladies and return results that are consistent with readings by radiologists. The tool’s diagnostic accuracy matched or exceeded that of radiologists for 11 of the 14 diseases.
It read 420 X-rays and returned diagnoses in about 90 seconds; the same task took radiologists about three hours. A study of the technology, which could be especially helpful for patients without access to a radiologist’s expertise, was published Nov. 20 in PLOS Medicine.
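The key design point the article describes is that one network scores all 14 pathologies at once rather than running 14 single-purpose detectors. A minimal sketch of that idea, in Python with NumPy: each image yields 14 raw scores, and an independent sigmoid per pathology lets several diseases be flagged simultaneously (multi-label rather than multi-class). The pathology names below are taken from the ChestX-ray14 label set and are assumed here to match the study's targets; the function and threshold are illustrative, not the actual CheXNeXt implementation.

```python
import numpy as np

# The 14 thoracic pathology labels from the public ChestX-ray14
# dataset (assumed here; not copied from the study itself).
PATHOLOGIES = [
    "Atelectasis", "Cardiomegaly", "Effusion", "Infiltration",
    "Mass", "Nodule", "Pneumonia", "Pneumothorax", "Consolidation",
    "Edema", "Emphysema", "Fibrosis", "Pleural_Thickening", "Hernia",
]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def predict_pathologies(logits, threshold=0.5):
    """Turn one image's 14 raw network outputs into per-disease flags.

    `logits` is a length-14 array of real-valued scores. A sigmoid maps
    each score to an *independent* probability, so multiple diseases can
    be flagged on the same X-ray -- the multi-label setup the article
    contrasts with narrow single-use detectors.
    """
    probs = sigmoid(np.asarray(logits, dtype=float))
    return {name: float(p)
            for name, p in zip(PATHOLOGIES, probs)
            if p >= threshold}

# Illustrative scores: strongly positive for two pathologies, negative elsewhere.
scores = np.full(14, -3.0)
scores[1] = 2.5   # Cardiomegaly
scores[2] = 1.8   # Effusion
print(sorted(predict_pathologies(scores)))
```

Because each output is thresholded independently, the example flags both Cardiomegaly and Effusion on the same image, whereas a softmax-style multi-class head would force a single answer.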
Lungren, who shared senior authorship of the study with Andrew Ng, PhD, adjunct professor of computer science, said a new version of CheXNeXt is being developed that he hopes can expedite the X-ray-reading process for urgent care or emergency doctors.