Adoption of simulation in healthcare education has increased tremendously over the past two decades. However, the resources necessary to perform simulation are immense. Simulators are large capital investments, and both instructors and simulation support staff require specialized training to develop curricula for a simulator and to use it to train students. Dedicated staff must operate the simulator, and instructors must always be present to guide and assess student performance. Because current simulators do not support self-learning by students, these expensive devices sit idle most of the day. Furthermore, simulators are minimally customizable, so programs are often forced to purchase simulators that have more functions and features than needed or that cannot be upgraded as needs change.
This dissertation presents the development of BodyExplorer, a system designed to address the limitations of current simulators by reducing the resources required to support simulation in healthcare education, enabling self-use by students, and providing an architecture that supports modular, extensible simulator development and upgrades. It discusses BodyExplorer’s initial prototype design, integration, and verification, as well as the development of the modular architecture for integrating sensors, dynamic displays of anatomy and physiology, and automated instruction.
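As a minimal sketch of how such a modular architecture could be organized, the following Python illustrates a plugin-style pattern in which sensors, displays, and instruction logic share a common interface and are driven by a core update loop. The names (SimModule, SimulatorCore) and structure are illustrative assumptions, not BodyExplorer’s actual API.

```python
from abc import ABC, abstractmethod

class SimModule(ABC):
    """Common interface for pluggable components: sensors,
    dynamic anatomy displays, or automated-instruction logic."""

    @abstractmethod
    def update(self, dt: float) -> None:
        """Advance this component by one time step of dt seconds."""

class SimulatorCore:
    """Registers modules and drives them on a shared clock, so
    components can be added, swapped, or upgraded independently."""

    def __init__(self) -> None:
        self._modules: list[SimModule] = []

    def register(self, module: SimModule) -> None:
        self._modules.append(module)

    def tick(self, dt: float) -> None:
        for module in self._modules:
            module.update(dt)
```

Under this kind of pattern, adding a new sensor or display would mean implementing the shared interface and registering it with the core, rather than modifying the simulator itself.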
Novel sensor systems were integrated into a simulation mannequin to measure user actions during (simulated) medication administration, cricoid pressure application, and endotracheal intubation. Dynamic displays of anatomy and physiology, showing, for example, animations of breathing lungs or a beating heart, were developed and integrated into BodyExplorer. Projected augmented reality shows users the underlying anatomy and physiology on the surface of the mannequin, allowing them to see the internal consequences of their actions. A mobile display provides an interface that supports self-use and shows additional views of anatomy. Using the projected images, mobile display, and audio output, a virtual instructor was developed to provide automated instruction based upon real-time sensor measurements.
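To make the sensor-to-instruction loop concrete, the sketch below shows one plausible way a virtual instructor could map real-time force measurements to guidance during cricoid pressure training. The force thresholds, function names, and simulated sensor reading are assumptions for illustration only, not the dissertation’s implementation.

```python
import random
import time

# Hypothetical acceptable force range for cricoid pressure, in newtons;
# real targets would come from the training curriculum.
TARGET_LOW_N, TARGET_HIGH_N = 30.0, 40.0

def read_force_sensor() -> float:
    """Stand-in for a real sensor driver: returns applied force in newtons."""
    return random.uniform(20.0, 50.0)  # simulated reading for this sketch

def feedback_for(force_n: float) -> str:
    """Map a real-time force measurement to instructional guidance."""
    if force_n < TARGET_LOW_N:
        return "Press more firmly on the cricoid cartilage."
    if force_n > TARGET_HIGH_N:
        return "Ease off; you are applying too much force."
    return "Good. Hold this pressure."

def instruction_loop(steps: int = 10, hz: float = 2.0) -> None:
    """Poll the sensor at a fixed rate and issue feedback each cycle.
    In a system like BodyExplorer, this text would be routed to audio
    output, the projected display, or the mobile interface."""
    for _ in range(steps):
        force = read_force_sensor()
        print(f"{force:5.1f} N -> {feedback_for(force)}")
        time.sleep(1.0 / hz)

if __name__ == "__main__":
    instruction_loop()
```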
Development of BodyExplorer was performed iteratively and incorporated feedback from end-users throughout, following user-centered design principles. The mixed-methods results from three usability testing sessions with end-users at two academic institutions will be presented, along with the rationale for the design decisions derived from those results. Building upon feedback received during usability testing, results from two automated instruction scenarios will be provided, demonstrating examples of learning to apply cricoid pressure and learning to administer (simulated) medications to control heart rate. Discussion will also be provided regarding how these automated instruction techniques can be extended to provide training in other healthcare applications.