Mass-casualty scenarios are inherently dangerous, with many risks to those on site. Subjecting novices to such scenarios prematurely could create additional risks due to inexperience and poor decision making. Such scenarios are also often far too expensive to replicate at any complexity or scale that mimics the real world. For example, staging a training scenario that faithfully recreated the 9/11 events would be a production rivaling a blockbuster Hollywood movie, as fire, smoke, debris, and victims would all need to be staged and coordinated. Even then, one would introduce additional dangers to the trainees and lose the possibility of every trainee experiencing *exactly* the same scenario.
This project was an extension of earlier work done for the CDC and Department of Homeland Security, in which first responders were trained for a specific disaster scenario to great effect, running on an Onyx2 back in 2002. In that earlier version of the simulator, a mass-casualty scenario was staged at a well-known athletic stadium on game day. This presented unique development challenges as we tried to squeeze a stadium and crowd into the very low polygon and texture budget of the Onyx2. Additionally, because this was intended as a fully immersive virtual reality experience, textures and objects would be seen at life scale, which demanded higher resolutions.
In 2008 the project was revived as a last-minute study targeting the needs of the Emergency Medicine residency program at the University of Michigan, while also making significant advances in visual quality and immersion. It became important for trainees to identify wounds quickly and effectively, so advanced shaders were used to provide greater detail per surface, allowing burns and lacerations to be rendered convincingly. Skeletally animated characters were also introduced to allow full articulation.
As with the first iteration of the project, I was personally responsible for code and artwork; however, this revision was under much tighter staffing and time constraints. Sean Petty and Ted Hall contributed significantly to the programming effort, while I created the environments, 13 fully textured and animated characters in 12 days, and various sounds, and added physical properties to the scene.
Ultimately, the project was a great success, as the related studies showed the simulator was just as effective at training residents as using standardized patients. As someone responsible for both projects' development, it was satisfying to see these results. Knowing how much was left undone due to time and resource constraints, I am convinced that the UM3D Lab could create something far more effective than standardized patients if given the opportunity.
The project was recently featured on the Big Ten Network as part of the "Blue in Brief" segments. (slide 3)
Researcher: Dr. Pamela Andreatta
Special Thanks: Sean Petty, Ted Hall, Woojin Shim