If there were an unofficial theme of SIR 2024, it might be artificial intelligence—what it is, when to use it and where it might go next. From dedicated sessions to keynote lectures, the potential of AI and robotics in interventional radiology was a frequent topic of discussion. As one session title summarized it, AI and robotics in IR are coming, whether we like it or not.
According to Bruce J. Tromberg, PhD, the director of the National Institute of Biomedical Imaging and Bioengineering (NIBIB) at the National Institutes of Health (NIH), AI is changing the way physicians practice medicine. Since 2019, AI and imaging funding and research have exploded, particularly in the cancer space.
Brittany Palmer Nielsen
Bruce J. Tromberg, PhD, delivers speech on AI integration at SIR 2024.
“Engineers are tuned into this area, and there are a lot of opportunities for enhancing and deepening physician and engineer collaborations,” he said during one of the SIR 2024 plenaries. “IR, as a clinical and technologically complex specialty, is an ideal use case for building those partnerships.”
The potential applications of AI in medicine are limitless, he said, ranging from workflow efficiency and automation protocols to device selection, navigation, risk assessment, disease prediction and even drug selection.
With so many opportunities and technologies evolving, there may soon be a delineation in the workforce between those who embrace AI and those who don't, said Dania Daye, MD, PhD, who won the 2024 Gary J. Becker Young Investigator Award for her work with AI applications in IR. "We have data showing that AI is affecting patient outcomes, and all physicians will eventually have to use some AI tools if they want to keep up."
AI is here, and there’s more to come. But what is it, and how can you use it?
Extended reality
When it comes to AI integration, one area that IRs are likely already familiar with is extended reality (XR)—though, as Ali Dhanaliwala, MD, PhD, said, "XR is like a solution looking for a problem."
According to Dr. Dhanaliwala, XR—specifically XR headsets that can be worn during procedures—offers a solution to common inconveniences, such as having to maintain a line of sight to monitors that may be across the room during procedures. XR headsets can also provide real-time maps of the body and better image registration, rather than requiring the operator to navigate imaging on a 2D monitor. Recording features can improve reporting, eliminating memory lapses that come from doing dictation several hours after a procedure. Headsets can even provide live updates on inventory. But convincing someone to wear a headset on the possibility that they may need to look up inventory mid-procedure can be a hard sell, said Zachary L. Bercu, MD, FSIR. "To make XR worth it, you have to wear the headset," he said, "and to make the headset worth wearing, it has to be as indispensable as a cell phone."
In his practice, Dr. Bercu has built and used a variety of XR apps. They range from simple, such as a life-size ruler app, to complex—a floating virtual monitor with EMR access that can provide prior imaging, labs and clinical notes and is controlled by voice, gesture or eye movement. Another application allows the user to stream ultrasound and fluoroscopy imaging to a virtual slate.
“Using this, you can look at your hands and imaging at the same time,” Dr. Bercu said.
Though these applications are designed to improve outcomes and experience, Dr. Bercu acknowledged that there is a barrier to implementation of XR in practice.
“People are naturally hesitant,” he said. “Some people think the headset looks stupid, and don’t want to look stupid in front of a patient. And if it slows an IR down by even 3 seconds, no one will use it.” These devices and interfaces are still early iterations, he said—though he feels confident that as user interface improves, the technology will solidify its role in practice.
Robotics
Robotics and IR go hand in hand, so it’s not surprising that many IR practices have already folded robotic-assisted care into their toolbox. Several sessions at SIR 2024 were dedicated to sharing use-case experiences, such as with robotic-assisted ablation or interventional oncology applications.
"We expect, when using a robot, to have better outcomes and accuracy as well as higher degrees of freedom," said Thierry de Baère, MD, who has used robotic assistance for ablation procedures. According to Dr. de Baère, robotics can improve treatment planning, assist with probe placement, ensure a single push from skin to target and even provide spacing and ablation confirmation.
Meanwhile, researchers and engineers at the University of Maryland and Johns Hopkins Hospital have developed soft robotic microcatheters—made of multi-lumen tubing and 3D-microprinted soft robotic unidirectional microcatheter tips—to aid IRs navigating tortuous or complex anatomy that challenges conventional microcatheters. These soft robotic microcatheters were able to accurately navigate complex and delicate vasculature. According to presenter Christopher Bailey, MD, this initial trial shows huge promise for the future of steerable robotic microcatheters, which may improve success rates, decrease intervention time and reduce complications.
Another presentation looked at the feasibility and safety of CT-guided bone biopsies performed on cancer patients via a robotic system. In a review of data from 40 biopsies in which the robot advanced the needle on demand, there was a 100% technical success rate. According to presenting author Agnieszka Witkowska, MD, the robotic assistance meant that the operators did not have to wear lead, and the median procedure time was 30 minutes—a short procedure time that also reduced patient radiation exposure.
Researchers from Japan presented similar findings from a study evaluating the safety of a fully robot-assisted needle placement system for CT-guided biopsy. In that study, physicians used a targeting device in 11 patients; the robot moved the needle to the designated spot and inserted it before an IR took over the procedure.
Machine learning
Machine learning and its subset "deep learning" are branches of AI that use algorithms to essentially mimic human reasoning and function. Deep learning uses neural networks to train algorithms on existing data. According to many presenters at SIR 2024, radiomics—in which an algorithm extracts data from medical imaging to make suggestions—currently has the most far-reaching applications in IR.
At the University of Wisconsin, researchers developed an AI algorithm that suggests puncture pathways utilizing data from volumetric lung CTs by segmenting the thoracic anatomy and detecting areas of emphysema. The pathways were then evaluated by physicians to determine their safety.
According to Meridith A. Kinsting, MA, who presented the findings, all the ideal pathways found by the algorithm were determined to be safe by the physicians. In effect, the algorithm successfully generated high-quality pathways and accurately judged their safety, based on literature-derived rules.
IRs at Temple University Hospital have begun using AI software that utilizes machine learning to interpret imaging and alert members of a pulmonary embolism response team (PERT) if there are critical findings. According to members of the PERT, this software has enhanced accuracy and speed of diagnosis, while improving communication and risk stratification.
“The integration of [AI software] exemplifies the potential of AI to facilitate and optimize patient care, particularly in emergent clinical settings including PE management,” said Temple University Hospital resident Daniel Kushner, MD.
Large language models
For the general public, large language models (LLMs) have become synonymous with AI, thanks to the popularity of models like ChatGPT. Though LLMs pose some ethical considerations, especially in education and peer-reviewed research spaces, they can be an excellent tool when applied to patient education, logistics or continuing education.
@iRadPulse, an AI-powered tool that uses an LLM to streamline the dissemination of IR literature, is one way to use AI for continuing education. @iRadPulse uses the OpenAI GPT-4 model to take abstracts from five major IR journals, including SIR's Journal of Vascular and Interventional Radiology, and create brief, scientifically accurate summaries, which are posted to social media. This enables physicians to easily stay up-to-date with emerging literature and quickly determine what research is relevant to their practice area.
There are hundreds, if not thousands, of potential applications for AI in IR—but just because a solution exists doesn't mean it will work for your practice.
“AI can’t solve every problem, so it’s important to define your actual clinical problem and understand if it’s worth solving,” said Julius Chapiro, MD, PhD. “Don’t create solutions in the absence of problems. Start with simple issues and frequently encountered barriers.”
Dr. Chapiro, who published a paper on the topic with Olivia Gaddum, shared some key questions to ask when implementing AI into your practice. First, think critically about the practical value of the solution. Are you implementing AI to improve accuracy or reduce risks? Is it for streamlined patient education or experience? Identify the value, and ensure you have the metrics to track potential success.
Once the problem and end goal are identified, make sure the algorithm you're using is tailored to your data type and needs, Dr. Chapiro said. LLMs and deep learning algorithms require large data sets, so you need to ensure that you have enough high-quality data to train the algorithm. And finally, determine whether the software or technology you're considering is realistic for your practice, or whether your goal can be achieved through other software.
The future of AI
According to Dr. Daye, the sky is the limit for AI possibilities—but the real challenge is application. While many algorithms for patient selection, triage and preprocedural planning are already on the market, she said solutions for intraprocedural planning are still in the research phase.
Dr. Tromberg agreed, and said he expects to see a spike in devices and algorithms that will improve the information content of imaging—such as reducing noise in ultrasound or using imaging to extract vascular features and classify tumors.
In the coming years, Dr. Tromberg expects more devices and algorithms to be submitted for FDA approval. AI is quickly going to be unavoidable, Dr. Daye said, and IRs will have to choose whether to adapt or fall behind.
“It’s clear that AI absolutely has a place in medicine, both in management and clinical practice,” said Dr. Dhanaliwala. “And it’s already here.”