Journal of Medical Education
Abstract
Ultrasound-guided procedures demand realistic and interactive training tools to develop critical skills. However, traditional simulators often fall short, limiting effective clinical application. This article introduces a metaverse-powered extended reality (XR) neck simulator that addresses these limitations. Our XR neck model leverages high-fidelity 3D models, haptic feedback, and real-time dynamic simulations to create an immersive, collaborative, and eco-friendly training environment for healthcare professionals. Gamified metrics assess skill proficiency, while AI-powered guidance standardizes techniques. Our findings demonstrate significant improvements in procedural knowledge, skill proficiency, and safety awareness. This innovative XR model not only reduces costs and reliance on animal models but also aligns with carbon reduction goals. By setting a new standard in medical education, the metaverse-driven XR simulator offers promising advancements in patient safety and care quality.
First Page: 176
Last Page: 181
DOI: 10.6145/jme.202409_28(3).0006
Recommended Citation
Kuo, Ting-Chun; Smith, Shana; Yang, Chih-Wei; Chen, Kuen-Yuan; Xu, Yu-Chen; Huang, Tzu-Hsuan; Lee, Chien-Chang; Chen, Jin-Shing; Lin, Ming-Tsan; and Wu, Ming-Hsun (2024) "A Metaverse-Driven Extended Reality Neck Model for Ultrasound-Guided Invasive Procedures and Surgical Training," Journal of Medical Education: Vol. 28: Iss. 3, Article 2.
DOI: https://doi.org/10.6145/jme.202409_28(3).0006
Available at: https://jme.researchcommons.org/journal/vol28/iss3/2