April 3, 2017

BodyExplorer Shows Students What They’re Made of

Pitt Team Awarded “Best in Show” for Augmented Reality Training System that Allows Students to Visualize the Inside of the Human Body

ORLANDO, FL (April 3, 2017) … Imagine you are a medical or nursing student who wants to learn how to effectively and safely anesthetize a patient prior to surgery. You walk up to the patient and are guided by a virtual instructor’s voice and hands projected onto the body. You open up viewports that enable you to see through the skin to visualize the position of the breathing tube you are inserting into the trachea. All medications you inject are measured, and you are alerted if you administer an incorrect dose—and if you make such a mistake, no one is harmed: you can “push the reset button” and try again.

This is the guided learning experience provided by BodyExplorer, a next-generation medical simulator developed by a multidisciplinary team at the Simulation and Medical Technology R&D Laboratory in the Department of Bioengineering at the University of Pittsburgh. The entire system, including a highly sensorized physical model of a human body and an augmented-reality projection system, can easily fit on a table in a classroom or a nurses’ break room in a hospital unit. The system demonstrates advanced simulation-based healthcare training with automated instruction, real-time feedback and round-the-clock accessibility for trainees.  

At the Serious Games and Virtual Environments (SG/VE) Showcase during the International Meeting on Simulation in Healthcare (IMSH) in Orlando, BodyExplorer won the “Best in Show” award in the student project category. Douglas Nelson, Jr., a PhD student in bioengineering at the University of Pittsburgh, presented BodyExplorer to the judges.

“We’ve been developing BodyExplorer over the past five years to help students learn about medicine, nursing, pharmacy and clinical procedures,” Nelson said. “We designed the system to make simulators easier to use for students and instructors, which seemed to impress the judges looking to the future of healthcare simulation. BodyExplorer is particularly useful because its automated instruction can allow trainees to practice without supervision while still receiving feedback on proper technique. This has the potential to provide more efficient simulation-based healthcare training by reducing the workload on educators while increasing availability of such training to students.”

Using BodyExplorer’s augmented-reality (AR) visualization, students can manipulate an image projected onto the mannequin torso. Trainees can use a simple, pen-like tool to open “windows” into the underlying anatomy, revealing muscles, bones and organs, including breathing lungs and a beating heart. The trainees can also see patient vital signs or other data; for example, they can pull up an electrocardiogram (ECG) graph to see how the ECG relates to the sound and motion of the heart and how it is affected by injected drug simulants.

Joseph Samosky, assistant professor of bioengineering at Pitt, is the originator and principal investigator of the BodyExplorer project and faculty advisor for Nelson’s PhD research. “If a student wants to explore the effects of medications on cardiac function, the student can inject simulated drugs and the system will automatically respond with changes in heart rate that can be seen, heard and visualized on the ECG displayed directly beside the beating heart,” Samosky said. “We want to maintain a focus on the patient. In BodyExplorer, the body itself becomes a tangible user interface (TUI), sensing inputs from and displaying information back to the trainee. The system enables you to interact naturally with the simulated patient and see the internal consequences of your external actions.”

BodyExplorer is highly interactive. It features a novel drug-simulant recognition system that encodes an identity, or “signature,” directly in the fluid itself, so simulated drugs can be injected in a naturalistic way and automatically recognized by the system. If the trainee administers a medication too quickly, BodyExplorer may respond with a loud scream of pain. Likewise, if the trainee administers a medication that causes the heart to beat faster, BodyExplorer’s digitally animated heart will pulse more quickly and the pounding sound of heartbeats will also quicken.

John O’Donnell, professor and chair of the Department of Nurse Anesthesia, joined the BodyExplorer team in 2013 as a clinical co-investigator and faculty advisor on the project. He co-chaired the IMSH conference, which drew more than 3,500 healthcare educators and students, the highest attendance of any international simulation conference to date. O’Donnell has been assisting with the development and validation of curriculum for the system and notes that “students in healthcare training programs want and need the chance to practice their skills and get immediate feedback. BodyExplorer has the potential to revolutionize the current model of training by offering ‘just in time’ and ‘on-demand’ access to key simulation experiences.”

The broadening of access for students is another key goal of the BodyExplorer project, Nelson explained. “Current healthcare simulation training is very resource intensive, requiring technicians, instructors and often specially-designed rooms. We want to bring simulation technology and training into everyday classrooms or hospitals and make it usable by students on their own without special training in simulator operation. The current BodyExplorer prototypes fit in the trunk of my car, and we would like to make commercialized models even more compact and easy to set up as we redesign them for manufacturability.”

Nelson, who will complete his PhD in April, plans to launch a newly founded company after graduation to bring a commercial version of the BodyExplorer simulation system to market. Development of the technologies integrated into the BodyExplorer system has been funded principally by the University of Pittsburgh Departments of Anesthesiology and Bioengineering. Additional funding has been provided by the U.S. Army’s Telemedicine and Advanced Technology Research Center (TATRC) and a Coulter Translational Research Award, with additional resources from the School of Nursing’s Department of Nurse Anesthesia.

Follow this link to see a video of how BodyExplorer works: http://www.innovation.pitt.edu/innovations/bodyexplorer/

###

Author: Matt Cichowicz, Communications Writer

Contact: Paul Kovach