The ever-growing processing power of computers in smaller and more accessible packages has created a space for technologies such as virtual reality (VR), augmented reality (AR) and mixed reality (MR)—collectively known as extended reality (XR).
VR has thus far been implemented by replacing a user's visual field with a screen, whether through a mobile device (as in Google's Cardboard) or a complex headset connected to a powerful PC (such as the Oculus Rift). AR and MR are more ambiguous terms that generally refer to the projection of virtual imagery into the real world or an alteration of how the user sees or interacts with their environment. One example is the "placement" of virtual furniture in a room, a feature many mobile shopping applications offer so customers can preview a product before purchasing it. XR serves as the umbrella term for this constantly evolving spectrum of technologies, which straddles the boundary between science fiction and reality. With these advances, the possibilities for medical application, especially in an innovative and high-tech field like interventional radiology (IR), have never been broader.
What XR could mean for IR
Enhanced engagement within XR-driven virtual environments offers unique potential within IR, a field where strong spatial understanding of often complex patient anatomy is essential. Several field-relevant applications of VR and AR have been developed in recent years showing promise for shifting paradigms within the field.
Training
At the SIR 2018 Annual Scientific Meeting, Ziv J Haskal, MD, FSIR, demonstrated possible XR applications in medical training with a 360-degree video walking through a transjugular intrahepatic portosystemic shunt (TIPS) case. Dr. Haskal made this immersive demonstration available to anyone in the world with a phone, which can be converted to a stereoscopic VR viewer using affordable mobile device accessories like Google Cardboard. This demonstration was an engaging way to attract aspiring trainees, while also providing patients with a better understanding of their planned procedures. Additional studies since then have demonstrated the value of AR-guided endovascular puncture for successful TIPS placements.2
For many medical students, and even interested premedical students, IR is an abstract and nebulous specialty. The opportunity to immerse oneself in a virtual IR suite, interact with different instruments or even observe procedures could help improve outreach and spread awareness of the field.3
For residents, fellows and attending staff, continued exposure to IR tools and procedures is important. Not all institutions and IR training programs see the same volume of procedures, nor do those procedures necessarily fall within the IR scope of practice at a given institution. Through virtual simulation, trainees can experience and learn those procedures, and attending staff can continue to hone their skills as a form of continuing medical education.
Patient imaging can also be utilized for collaborative surgical planning and practical scenario training. With this training modality, there are no lasting real-world complications, and the simulated environment allows a safe space in which to hone technical capabilities.4–6
Patient education
XR has also been utilized in patient education and communication. One notable example is for patients with cancer who will undergo chemoradiation. Because cancer diagnoses are understandably anxiety-inducing, many studies have used XR technologies to augment patient education, introduce patients to the treatment process and set expectations.7 This approach has the potential to improve physician–patient communication while reducing patient fears. Similar efforts have taken place in vascular surgery, where VR has been used to educate patients about abdominal aortic aneurysm treatment.8
IR could adopt this approach by developing patient education modules for more advanced procedures, especially within the interventional oncology, hepatobiliary and vascular realms. The technology could also be used to show patients the progression of their treatment at follow-up clinic visits, whether that is shrinking fibroids, regression of ablated malignancies or stabilization of endografts. This may also have the welcome side effect of enhancing the general population's knowledge of IR.
Pain management
To date, VR has primarily been utilized in the medical setting for psychiatric disorders. More recent research has utilized VR as a positive distractor for acute pain during medical procedures and hospitalizations. Relying on its ability to provide an immersive environment, VR is thought to overwhelm the patient with sensory input to help limit the patient’s processing of nociceptive pain.9 These findings have spurred further research into whether VR can also be utilized for the treatment of neuropathic pain, and initial findings are promising.10 Studies have shown the technology may be an effective adjunct in multimodal pain management with the hopes that it can help decrease the use of pain medications, specifically opioids.
IR's expanding repertoire of techniques for oncology-related pain management, including cement and/or screw fixation and ablation, is already applied to minimize opioid dependence, and such pain management could one day be supported by VR technologies. Recent studies have also used VR for symptom management in palliative care.11,12 VR can also be a valuable tool pre- and postprocedurally, helping to reduce anxiety before a procedure and to manage postoperative pain.
Quality improvement
Many advancements and IR use cases have come from the interventional oncology space, particularly percutaneous needle insertion and biopsies.13,14 One study demonstrated that AR guidance decreased the number of needle passes, reduced radiation dose by decreasing the number of interval CT scans and shortened procedural time. There have also been early studies using AR for tumor ablations.15,16
Barriers to use
Though XR has progressed from fledgling beginnings to iterative medical innovation and accumulated use cases globally, several barriers still stand in the way of routine clinical adoption.
The most important concern moving forward is the economy of scale and how to encourage more widespread adoption of this technology. A systematic review of wearable heads-up display devices in the operating room found that IR had the fewest published studies among 10 procedural specialties.5 As minimally invasive, image-guided physicians, we should support further experimentation, development and adoption of these technologies. While high financial costs are currently a barrier to adoption and use, costs should fall proportionately as demand for the technology increases.
Another important consideration is XR's physical footprint: the equipment must not impede the quality of clinical care, as large displays and wires could limit tabletop movement and C-arm mobility. A further hurdle to adoption is that the burden of creating institution-specific content, needed to provide the most appropriate and familiar experience, will inevitably fall on the institution itself. While that may seem a tall order, some institutions have already taken the initiative in integrating VR/AR technology within their health systems.3
On the patient side, a substantial barrier to entry is the ergonomic limitation of prolonged XR use, otherwise known as cybersickness. It is a multifactorial problem without clear causes or solutions. Each subsequent generation of devices attempts to mitigate cybersickness through hardware that better accounts for physical differences, such as pupil distance and accommodation and convergence distances, while also maintaining high frame rates and reducing latency. Visual-spatial-motor differences in acceleration, field of view, environmental and contextual blurring, and dynamic focus points can also contribute to cybersickness. These act via sensory mismatch and overload, ultimately creating postural instability as the virtual environment fails to adjust dynamically to how the user perceives their surroundings.17 Nonetheless, ongoing research and development will hopefully allow a more comfortable viewing experience, especially in procedural situations where taking periodic breaks may not be feasible.
Where to expand next
AR smartphone applications already exist that teach by walking users through the specific steps needed to perform a procedure from beginning to end. However, the available IR training modules have covered only basic procedures such as paracentesis, thoracentesis and central line placement. As this technology is further integrated into trainee education, more advanced IR procedures that teach fine-motor skills will hopefully be introduced. AR and VR technologies within IR have been predominantly visual, with little to no ability to interact with the environment or receive tactile feedback from it; current hand-tracking devices provide only point-and-click functionality. Moving forward, it will be important to increase the realism of these simulations with accurate haptic feedback and improved immersion within the environment.
Different technologies already exist that refine the visualization of patient anatomy by building on existing radiographic imaging modalities, helping physicians navigate to their target locations. Improving the efficiency of reaching those targets and better visualizing the pathways to them makes treatment safer by reducing radiation exposure and bolstering treatment confidence. Moving forward, it will be paramount to refine these technologies to account for common problems such as needle and wire bending, patient respiration, soft tissue deformation, and organ movement and mobilization.5
Conclusion
IR has been at the forefront of medicine by integrating cutting-edge technology into its scope of practice. The utility that these display technologies can provide is appealing on many levels through training, education, procedure planning, perioperative management and even symptom control. As these digital technologies improve, the promising future of IR looks more virtual and feels more augmented. Although our base reality has been used to create XR, the roles may reverse in the future as XR may redefine our reality.
Acknowledgements: Riyaz Abidi, MS3
References
- Cipresso P, Giglioli IAC, Raya MA, Riva G. The past, present, and future of virtual and augmented reality research: A network and cluster analysis of the literature. Front Psychol. 2018;9:2086. doi:10.3389/fpsyg.2018.02086.
- Yang J, Zhu J, Sze DY, et al. Feasibility of augmented reality–guided transjugular intrahepatic portosystemic shunt. J Vasc Interv Radiol. 2020;31(12):2098–2103. doi:10.1016/j.jvir.2020.07.025.
- Uppot RN, Laguna B, McCarthy CJ, et al. Implementing virtual and augmented reality tools for radiology education and training, communication, and clinical care. Radiology. 2019;291(3):570–580. doi:10.1148/radiol.2019182210.
- Gelmini AYP, Duarte ML, de Assis AM, Guimarães Junior JB, Carnevale FC. Virtual reality in interventional radiology education: A systematic review. Radiol Bras. 2021;54(4):254–260. doi:10.1590/0100-3984.2020.0162.
- Park BJ, Hunt SJ, Martin C, Nadolski GJ, Wood BJ, Gade TP. Augmented and mixed reality: Technologies for enhancing the future of IR. J Vasc Interv Radiol. 2020;31(7):1074–1082. doi:10.1016/j.jvir.2019.09.020.
- Javaid M, Haleem A, Singh RP, Khan S. Understanding roles of virtual reality in radiology. Internet Things Cyber-Phys Syst. 2022;2:91–98. doi:10.1016/j.iotcps.2022.06.002.
- Wang LJ, Casto B, Luh JY, Wang SJ. Virtual reality-based education for patients undergoing radiation therapy. J Cancer Educ. 2022;37(3):694–700. doi:10.1007/s13187-020-01870-7.
- Pandrangi VC, Gaston B, Appelbaum NP, Albuquerque FC, Levy MM, Larson RA. The application of virtual reality in patient education. Ann Vasc Surg. 2019;59:184–189. doi:10.1016/j.avsg.2019.01.015.
- Spiegel B, Fuller G, Lopez M, et al. Virtual reality for management of pain in hospitalized patients: A randomized comparative effectiveness trial. PLoS ONE. 2019;14(8):e0219115. doi:10.1371/journal.pone.0219115.
- Austin PD, Siddall PJ. Virtual reality for the treatment of neuropathic pain in people with spinal cord injuries: A scoping review. J Spinal Cord Med. 2021;44(1):8–18. doi:10.1080/10790268.2019.1575554.
- Martin JL, Saredakis D, Hutchinson AD, Crawford GB, Loetscher T. Virtual reality in palliative care: A systematic review. Healthcare. 2022;10(7):1222. doi:10.3390/healthcare10071222.
- Guenther M, Görlich D, Bernhardt F, et al. Virtual reality reduces pain in palliative care–A feasibility trial. BMC Palliat Care. 2022;21(1):169. doi:10.1186/s12904-022-01058-4.
- Long DJ, Li M, De Ruiter QMB, et al. Comparison of smartphone augmented reality, smartglasses augmented reality, and 3D CBCT-guided fluoroscopy navigation for percutaneous needle insertion: A phantom study. Cardiovasc Intervent Radiol. 2021;44(5):774–781. doi:10.1007/s00270-020-02760-7.
- Faiella E, Messina L, Castiello G, et al. Augmented reality 3D navigation system for percutaneous CT-guided pulmonary ground-glass opacity biopsies: A comparison with the standard CT-guided technique. J Thorac Dis. 2022;14(2):247–256. doi:10.21037/jtd-21-1285.
- Gadodia G, Yanof J, Hanlon A, et al. Early clinical feasibility evaluation of an augmented reality platform for guidance and navigation during percutaneous tumor ablation. J Vasc Interv Radiol. 2022;33(3):333–338. doi:10.1016/j.jvir.2021.11.014.
- Solbiati M, Ierace T, Muglia R, et al. Thermal ablation of liver tumors guided by augmented reality: An initial clinical experience. Cancers. 2022;14(5):1312. doi:10.3390/cancers14051312.
- Stanney K, Lawson BD, Rokers B, et al. Identifying causes of and solutions for cybersickness in immersive technology: Reformulation of a research and development agenda. Int J Hum-Comput Interact. 2020;36(19):1783–1803. doi:10.1080/10447318.2020.1828535.