First-of-a-kind clinical use of AI/ML in radiology shows promise and is being commercialized now


Around the world, the application of artificial intelligence (AI) and machine learning (ML) in medical imaging is a hot topic. While research work will undoubtedly continue and expand, there are also growing examples of AI and ML being used in real-world, clinical settings.

Importantly, these early use cases are helping both technology vendors and providers understand how to integrate cutting-edge AI- and ML-based imaging systems with the entire healthcare system and its workflow. Currently, less than 3 percent of the data being produced by hospitals each year is actionable, tagged or analyzed [1]. Automating the detection, localization and quantification of critical conditions promises tangible improvements.

A glance at the agenda of the RSNA 2018 Annual Meeting revealed that AI and ML were in the spotlight. Both were the focus of plenary sessions, educational courses and special interest sessions.

“The real power is not from looking at data from any individual system,” said Jonathan Gleason, MD, vice president of clinical advancement and patient safety at Carilion Clinic, a panelist at the Executive Roundtable during the American Hospital Association's Leadership Summit in San Diego this summer, sponsored by GE Healthcare, “but from looking at data across multiple systems, and from outside of health systems, and being able to pick out trends and patterns that otherwise would not emerge.” [2]

Just as important, intelligent imaging systems are needed to handle ever-increasing workloads. Modalities like computed tomography (CT), magnetic resonance imaging (MRI), ultrasound and X-ray produce massive amounts of digitized data. This volume of imaging content (more than 74 million CT and 36 million MRI procedures were performed in the United States in 2017 [3]) is contributing to an alarming 45 percent burnout rate among radiologists, according to the 2018 Medscape National Physician Burnout and Depression Report [4].

AI for head CT

At the Imaging Centre in Nagpur in the central Indian state of Maharashtra, a deep learning system is being used to review head CT scans and triage those with critical abnormalities. The product, from Qure.ai, detects critical abnormalities on head CTs with greater than 95 percent area under the curve (AUC), according to a study conducted with Mayo Clinic, the findings of which were recently published in The Lancet.
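What does that figure mean in practice? The short sketch below shows how AUC is computed for a binary triage model; it uses made-up labels and scores rather than anything from the Qure.ai study, and simply illustrates that AUC measures how reliably abnormal scans are ranked above normal ones.

```python
# Minimal illustration of the AUC metric cited above, using made-up labels and
# scores; this is not data or code from the Qure.ai / Mayo Clinic study.
from sklearn.metrics import roc_auc_score

# Hypothetical ground truth: 1 = critical abnormality present, 0 = normal scan
y_true  = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]
# Hypothetical model output: predicted probability of a critical abnormality
y_score = [0.92, 0.10, 0.85, 0.70, 0.30, 0.05, 0.35, 0.40, 0.15, 0.95]

# AUC is the probability that a randomly chosen abnormal scan is ranked above a
# randomly chosen normal one: 1.0 is perfect ranking, 0.5 is chance.
print(f"AUC: {roc_auc_score(y_true, y_score):.2f}")  # prints AUC: 0.96
```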

The system “automatically picks up all eligible scans as soon as they are delivered to the PACS,” said Mustafa Biviji, MD, owner of the Imaging Centre, which conducts 10-12 brain studies per day on average. “Since it is a triage software, it runs in the background and only alerts when there is a scan with a critical abnormality.”

Biviji, who currently is the only physician at the center receiving alerts from the algorithm (either through a worklist notification or through a messaging app), said the experience of being an early adopter has been great. “I already see it helping in my day-to-day practice,” he said, adding that to date he hasn’t found any instances of misreporting.
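That "runs in the background and only alerts" pattern can be pictured in a few lines. The snippet below is a hypothetical outline of such a triage loop, with an assumed PACS client, model, notifier and alert threshold; it is not Qure.ai's actual interface.

```python
# Hypothetical sketch of the triage pattern Biviji describes: pick up new head CT
# studies from the PACS, score them in the background, and alert only on likely
# critical cases. The pacs, model and notifier objects are assumed stand-ins.
import time

ALERT_THRESHOLD = 0.9  # assumed operating point; a real product tunes this clinically

def triage_loop(pacs, model, notifier, poll_seconds=60):
    seen = set()
    while True:
        for study in pacs.new_head_ct_studies():   # eligible scans delivered to the PACS
            if study.uid in seen:
                continue
            seen.add(study.uid)
            score = model.predict_critical_probability(study.pixel_data)
            if score >= ALERT_THRESHOLD:
                # Surface the case via a worklist notification or messaging app
                notifier.alert(study.uid, score)
        time.sleep(poll_seconds)                    # otherwise stay quiet in the background
```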

Collapsed lung study

Chest radiography, the most commonly performed radiological test, is the focus of a partnership between the University of California San Francisco and GE Healthcare. Clinicians are working to develop a library of deep-learning algorithms to revolutionize the speed at which scans are interpreted and patients receive care. The collaboration is initially focused on high-volume, high-impact imaging to create algorithms that reliably distinguish between what is considered a normal result and what requires follow-up or acute intervention. One early example is Critical Care Suite,* a set of algorithms on the mobile Optima XR240amx X-ray system that can alert the clinical team to potential pneumothorax (collapsed lung) cases as soon as patients are scanned, so those exams can be prioritized for reading.
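A simple way to picture that prioritization step: once each chest X-ray carries a pneumothorax score, the reading worklist can be reordered so the highest-risk exams surface first. The example below is illustrative only, with a made-up data structure and scores rather than Critical Care Suite's interface.

```python
# Illustrative worklist prioritization: exams flagged as likely pneumothorax are
# read first. The data structure and scores are made up for this sketch.
from dataclasses import dataclass

@dataclass
class ChestXray:
    accession: str
    pneumothorax_score: float  # model's estimated probability of pneumothorax

worklist = [
    ChestXray("A1001", 0.03),
    ChestXray("A1002", 0.91),  # likely pneumothorax: should jump the queue
    ChestXray("A1003", 0.12),
]

# Sort so the highest-risk studies are presented to the radiologist first
for exam in sorted(worklist, key=lambda x: x.pneumothorax_score, reverse=True):
    print(exam.accession, exam.pneumothorax_score)
```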

For AI to succeed in healthcare, more than strong data science is needed. Clinical domain expertise is key, according to Keith Bigelow, general manager of analytics at GE Healthcare. “Depending on how you annotate a scan before you feed it to an algorithm, it can either have meaningful impact or not make the algorithm better,” he told HIMSS TV [5] earlier in November. By embedding AI into imaging equipment and sharing it via the cloud at scale, the resulting models will be available to rural and specialty hospitals alike. These AI-driven approaches have the opportunity to “augment that radiologist, help them to see the entire picture and drive a quality outcome for the patient,” Bigelow said.
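Bigelow's point about annotation can be made concrete with a toy example: the same chest X-ray can be labeled at the image level (finding present or absent) or at the region level (where the finding is), and that choice bounds what an algorithm can learn. The schema below is hypothetical, not GE Healthcare's.

```python
# Toy example of two annotation choices for the same chest X-ray; the schema is
# hypothetical. Image-level labels support classification only, while
# region-level annotations also support localization.
image_level = {"study": "A1002", "pneumothorax": True}

region_level = {
    "study": "A1002",
    "findings": [
        {"label": "pneumothorax", "bbox": [412, 88, 660, 310]},  # x1, y1, x2, y2 in pixels
    ],
}

print(image_level)
print(region_level)
```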

* 510(k) pending at FDA. Not available for sale.

References

[1] Cisco report: The Internet of Things. How the Next Evolution of the Internet Is Changing Everything. http://www.cisco.com/web/about/ac79/docs/innov/IoT_IBSG_0411FINAL.pdf

[2] American Hospital Association's Leadership Summit. http://www.healthforum.com/field-insights/content/ge-2018-1011-ar-ai-powered-precision-medicine.shtml

[3] Imaging Technology News. Nov. 5, 2018. “State of the Industry: Medical Hybrid Imaging Market.” https://www.itnonline.com/article/state-industry-medical-hybrid-imaging-market

[4] Medscape National Physician Burnout & Depression Report 2018. https://www.medscape.com/slideshow/2018-lifestyle-burnout-depression-6009235#3

[5] HIMSS TV. http://himsstv.brightcovegallery.com/detail/videos/analytics/video/5804929980001/understanding-how-ai-can-advance-precision-medicine?autoStart=true