Bioengineering's BodyExplorer research to be featured at first annual ACC Smithsonian Creativity and Innovation Festival
University of Pittsburgh Release
Virginia Tech and the Smithsonian’s National Museum of American History present the first annual ACCelerate: ACC Smithsonian Creativity and Innovation Festival on October 13-15, 2017. The festival, programmed by Virginia Tech’s Institute for Creativity, Arts, and Technology and the Museum’s Lemelson Center for the Study of Invention and Innovation, is a three-day celebration of creative exploration and research at the nexus of science, engineering, arts, and design (SEAD). Visitors to the festival will interact with innovators and experience new interdisciplinary technologies developed to address global challenges. The event is free and open to the public.
The ACCelerate festival will be an opportunity for all ACC schools in partnership with the Lemelson Center to showcase their work to the public, each other, students, alumni, companies, legislators, and invited guests from the nation’s capital.
Learn more about the University of Pittsburgh projects that will be on display:
BODYEXPLORER: A NEXT-GENERATION SIMULATOR FOR HEALTHCARE TRAINING, PROVIDING HANDS-ON LEARNING AND PRACTICE VIA AUGMENTED REALITY VISUALIZATION
BodyExplorer is a next-generation medical simulator designed to enhance the ability of healthcare trainees to learn anatomy and physiology and practice treating patients through naturalistic interaction with an augmented reality-enhanced, full-body simulated patient. Simulation has been recognized as the most prominent innovation in healthcare education in the past two decades, but current systems require substantial resources, including technicians to run the simulator and instructors to lead scenarios, assess student performance, and provide guided feedback. Learning how to operate current simulators requires advanced training, so students typically cannot use them on their own for self-learning. BodyExplorer was designed to enable 24/7 on-demand training and self-learning for students by providing an intuitive interface, autonomous operation, and automated instruction using a highly sensorized physical body, projected augmented reality (AR), and an integrated virtual instructor. AR enables X-ray vision views inside the body, so trainees can see the internal effects of administering simulated medications or performing procedures, such as inserting a breathing tube. BodyExplorer is designed to expand access to the benefits of simulation-based learning for medical and nursing students, first responders, combat medics, and other healthcare practitioners, enabling them to practice skills and receive quantitative feedback on their performance before treating actual patients.
Researchers: Joseph Samosky, Douglas Nelson, and John O'Donnell
MOBILITY ENHANCEMENT ROBOTIC WHEELCHAIR
The Mobility Enhancement Robotic Wheelchair (MEBot) will tackle both curbs and challenging terrains. The large center driving wheels can reposition themselves to simulate front-, mid-, or rear-wheel driving. The four smaller caster wheels are controlled with compressed air and move up and down freely and independently. To climb a curb, the front caster wheels lift up onto it, and the driving wheels then lift up and forward, raising the chair onto the curb. This is done automatically whenever MEBot senses a curb or step. The ultimate goal is for MEBot to climb a set of stairs. The same general function is used to operate on icy or slippery surfaces. A traditional power wheelchair can get stuck on this kind of terrain. MEBot, however, uses its front and rear caster wheels to inch forward on the slick surface by extending its front casters, moving the seat forward, bringing the rear casters forward, and then repeating the process. Meanwhile, the seat stabilization system keeps the driver safely upright.
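The repeating "inching" gait described above can be sketched as a simple step sequence. This is a minimal illustrative sketch only; the function and step names are assumptions for clarity and are not taken from the actual MEBot control software.

```python
# Illustrative sketch of MEBot's inching gait on slippery terrain,
# following the sequence described in the text. All names are hypothetical.

def inching_gait_cycle():
    """One cycle of the inching gait, in order."""
    return [
        "extend front casters",
        "move seat forward",
        "bring rear casters forward",
    ]

def inch_forward(cycles):
    """Repeat the gait cycle the given number of times, listing each step."""
    steps = []
    for _ in range(cycles):
        steps.extend(inching_gait_cycle())
    return steps

print(inch_forward(2))
```

Each cycle advances the chair a short distance while the seat stabilization system keeps the rider upright.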
Researchers: Rory A. Cooper, Brandon Daveler, Ben Gebrosky, Garrett Grindle, Andrea Sundaram, Hongwu Wang, and Jorge Candiotti
OUR TIME IS UP: AN IMMERSIVE AUDIO DRAMA
This multichannel sound installation tells the story of Jake and Helen McCleary, an elderly couple struggling to save their troubled marriage. The story unfolds across a series of weekly therapy sessions in which Jake and Helen sort through the messy details of their relationship. Unlike a conventional audio drama, the characters’ voices are constructed from fragments of oral history recordings of two people who have died—and who never met. Using a manual process of concatenative speech synthesis, the archival voices have been digitally disarticulated and recombined to create a new, fictional story and an uncanny encounter between living and dead, human and machine. This project brings together an interdisciplinary team of writers, designers, historians, and engineers and invites the audience to enter a mock therapist’s office and inhabit the experience of the absent characters, with each character’s voice emitted from a directional speaker. A screencast of the multi-track audio session reveals the secret behind the drama’s construction, and individual headsets provide access to the original oral histories. This immersive experience offers a reflection on the precarious temporality of human lives and relationships and the paradoxical potential for reinvention that sound recording affords.
Researchers: Erin Anderson and Brandon Barber
Contact: Paul Kovach