Virtual Reality Revolutionizes High-Stress Training

The number of calls to fire departments continues to rise each year, but most of those calls are medical emergencies. The number of fires has actually fallen over the past decade, according to the National Fire Protection Association. Today, America’s 27,000 fire departments respond to an average of fewer than 50 fires apiece each year.

Most of those fires are residential – meaning firefighters get precious little practice combating the full range of fires and situations they could face in life-or-death circumstances on the job – from battling blazes in high-rise towers to containing explosions in industrial buildings and evacuating burning hospitals.

Conventional smoke house trainers do little to solve that challenge. They simulate only a small range of scenarios – and environmental regulations limit how frequently they can be used.

Enter virtual reality. The Orange County (Fla.) Fire Rescue Department, one of the largest in the country, is turning to VR to improve the way it trains fire department lieutenants managing fire situations in the county. With support from a grant from the Federal Emergency Management Agency (FEMA), the department worked with the University of Central Florida’s Institute for Simulation and Training (IST) to develop a new Incident Command training system.

But don’t get the wrong idea. This is not firemen gone wild with the latest gaming technology. Department leaders initially wanted to go all in with the highest-end VR headsets they could find, but contracting partner Eileen Smith nixed that idea. As director of IST’s E2i Creative Studio, she says the appeal of new technology is often more about “wonderment and awe” than whether it can make a real difference in improving training.

The main challenge was that lieutenants had widely different levels of experience. While older Orange County battalion chiefs, now in their sixties, had fought roughly 100 fires before they were promoted to lieutenant, younger lieutenants typically had fought just three. Some couldn’t even read smoke.

The existing training was insufficient: a collection of dull lecture slides and bound instruction books. The test at the end was a video-based simulation with limited interaction. What was needed was more intensity and more stress, so lieutenants could truly learn to work under the same pressures they would face in real life. They needed to assess a scene in real time, take in visual cues and live audio from other participants – and see the consequences of their actions. In the resulting simulations, every decision lieutenants make – or fail to make – is reflected. Simulations can also be paused to correct major mistakes on the go.

Rather than don head-mounted VR displays, Smith’s team opted for a 10x18-foot screen, allowing participants to wear breathing masks that measure stress. “I’m putting you inside a 3-D environment, even if it’s on a screen, that you are interacting with and it is dependent on you,” Smith explains. “All events stop except in our case, the fire keeps burning. All events stop until you make a decision. In a sense, there is almost a co-creation of the experience. You have to be involved.”
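To picture how that decision-gated pacing might work, here is a minimal sketch – hypothetical Python, not the actual IST/E2i software – in which scripted scenario events wait on the trainee’s input while a toy fire model keeps intensifying in the background.

```python
# A minimal sketch of the interaction model Smith describes -- not the IST/E2i system.
# Scripted events wait on the trainee's decision, but the fire keeps growing.

class FireModel:
    """Toy fire that intensifies every tick, regardless of trainee input."""
    def __init__(self, intensity=1.0, growth_rate=0.05):
        self.intensity = intensity
        self.growth_rate = growth_rate

    def step(self):
        self.intensity *= 1.0 + self.growth_rate


def run_scenario(events, poll_decision):
    """events: list of callables taking (decision, fire).
    poll_decision: returns the trainee's choice, or None if they have not acted yet."""
    fire = FireModel()
    for event in events:
        decision = None
        while decision is None:     # the scenario holds here...
            fire.step()             # ...but the fire does not
            decision = poll_decision()
        event(decision, fire)       # consequences of the decision play out


# Example usage, with canned input standing in for the trainee console:
if __name__ == "__main__":
    choices = iter([None, None, "call second alarm", "order ventilation"])
    log = lambda decision, fire: print(f"{decision!r} at intensity {fire.intensity:.2f}")
    run_scenario([log, log], lambda: next(choices))
```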

Dr. Mark Nesselrode, a retired Navy captain and now Solution Architect for the Training Sector of General Dynamics Information Technology’s Global Services Division, used to test sailors’ capacity for stress by putting them in gas masks and introducing chemical smoke. Those who couldn’t handle the claustrophobic experience were the first to rip off their masks.

Now Nesselrode pushes the envelope in simulating civilian firefighting by experimenting with sound, darkness, realistic depictions of flame and smoke, and haptics – that is, the simulated physical response of inanimate objects – to bring sailors closer to real-world stressors.

“Firefighting is really, really hard,” he says. In addition to the visual, aural and haptic elements, “there’s a huge amount of physics involved,” including understanding multiple modes of heat transfer in a real fire, particle interactions with surfaces and turbulence due to cooling. Pairing physics-based models with game engines that stay true to the sensory experience of fire has, Nesselrode says, “permitted us to look at ‘composability’ of a potential task and we can assess the time, cost, and workflow required.”
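As a rough illustration of the physics such engines must approximate – and emphatically not GDIT’s model – the following sketch combines two heat-transfer modes, convection and radiation, to estimate how quickly a surface heats up when exposed to fire. The material properties and coefficients are illustrative assumptions.

```python
# Hedged illustration only: combined convective + radiative heating of a surface.
STEFAN_BOLTZMANN = 5.67e-8  # W / (m^2 K^4)

def surface_heat_flux(t_gas_k, t_flame_k, t_surface_k,
                      h_convective=25.0, emissivity=0.9):
    """Net heat flux (W/m^2) onto a surface from hot gases and flame radiation."""
    convective = h_convective * (t_gas_k - t_surface_k)
    radiative = emissivity * STEFAN_BOLTZMANN * (t_flame_k**4 - t_surface_k**4)
    return convective + radiative

def heat_surface(t_surface_k, t_gas_k, t_flame_k, dt=0.1,
                 thickness=0.012, density=700.0, specific_heat=1700.0):
    """Advance the surface temperature of a thin slab (e.g. a wooden door) one step."""
    flux = surface_heat_flux(t_gas_k, t_flame_k, t_surface_k)
    heat_capacity_per_area = thickness * density * specific_heat  # J / (m^2 K)
    return t_surface_k + flux * dt / heat_capacity_per_area

# Example: a door facing a 900 K flame in 500 K gases, starting at room temperature.
t = 293.0
for _ in range(600):          # one simulated minute at 0.1-second steps
    t = heat_surface(t, t_gas_k=500.0, t_flame_k=900.0)
print(f"Surface temperature after 60 s: {t:.0f} K")
```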

The same firefighting simulations, he adds, could also be applied to other service branches, as well as to power plants, heavy manufacturing, airports and municipal needs.

Of course, immersive simulators aren’t always necessary. Nesselrode and GDIT have had significant success with systems that animate shipboard engineering spaces, using conventional display technologies, rather than VR.

The ultimate measure of success in any training regimen is the extent to which it improves understanding and preparedness. That’s especially true in fields such as firefighting, close combat and emergency medicine where drills are designed to help trainees maintain their cool under intense pressure and in the face of sometimes horrific scenes.

Rob Parrish, Deputy Director at the U.S. Army’s Program Executive Office for Simulation, Training and Instrumentation, says VR “has been demonstrated to significantly increase knowledge acquisition of tactical casualty care skills when compared to other training treatments that do not employ VR technologies.”

VR is particularly useful in training applications where spatial awareness is important. Courtney McNamara, a computer scientist with the Advanced Gaming Interactive Learning Environment (AGILE) Team at the Naval Air Warfare Center Training Systems Division in Orlando, cites aircraft carrier flight decks as a case in point.

“There are aircraft moving and launching, a lot of big equipment,” McNamara says. “As these engines are on, the intakes and [exhaust] … can blow sailors over …. It’s almost like a ballet of movement as these operations are happening – where it’s safe to move, where it’s not safe to move – and you have to be able to look 360 degrees, because things are coming in all directions.”

You can’t really simulate that environment with conventional cameras and monitors, McNamara explains. But “if you could put a user in a VR headset, drop them in and let them move around and see the chaos in all its glory,” the training can be powerful and informative. If trainees approach an unsafe area, the system can alert them, or trainers can “pause the sim, tell them to look to the right to see that there’s a jet coming.”
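A hypothetical sketch of that kind of safety check – not the AGILE team’s actual system – might model each hazard as an unsafe radius on the deck and warn or pause the simulation when the trainee’s tracked position enters one:

```python
# Simplified, hypothetical safety-zone check for a simulated flight deck.
from dataclasses import dataclass
from math import hypot

@dataclass
class Hazard:
    name: str
    x: float       # deck coordinates, meters
    y: float
    radius: float  # unsafe radius around the hazard

HAZARDS = [
    Hazard("jet exhaust, catapult 1", x=40.0, y=12.0, radius=15.0),
    Hazard("rotor arc, helo spot 3",  x=-25.0, y=-8.0, radius=10.0),
]

def check_position(x, y, hazards=HAZARDS):
    """Return the hazards whose unsafe radius the trainee has entered."""
    return [h for h in hazards if hypot(x - h.x, y - h.y) < h.radius]

def on_headset_update(x, y, pause_sim, warn):
    """Called each frame with the trainee's tracked deck position."""
    entered = check_position(x, y)
    for hazard in entered:
        warn(f"Unsafe area: {hazard.name}")
    if entered:
        pause_sim()   # trainer can freeze the scene and point out the danger

# Example usage with simple stand-ins for the sim controls:
on_headset_update(32.0, 10.0, pause_sim=lambda: print("sim paused"), warn=print)
```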

Indeed, VR simulations are so good at transporting trainees and helping them learn appropriate responses to stress-inducing stimuli that researchers have begun to use the technology therapeutically. The Virtual Reality Medical Center in San Diego uses VR sessions to treat various anxieties and phobias, as well as PTSD experienced by veterans. The center uses exposure therapy, gradually re-immersing patients in simulations of traumatic episodes so they can confront them. The trick was getting patients past the psychological stigma of being seen as damaged. “We worked hard to get over that,” says Dr. Mark Wiederhold, the center’s CEO. “We didn’t call it therapy. We called it training. They’d come in and do a couple ops. We had much, much higher acceptance.”

Doctors measure patients’ heart rate and skin conductivity, which can help determine their emotional state. This helps set what Wiederhold calls the “velocity of exposure.” He also recruited veterans from the Art Institute of San Diego to design the VR environments. The extra verisimilitude helped: Wiederhold says VRMC’s success rate is 80 percent, while the VA managed only 46 percent.
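One way to picture that biofeedback loop – purely as an illustration, not VRMC’s protocol – is a simple rule that nudges exposure intensity up while measured arousal stays below a threshold and backs it off when heart rate or skin conductance climbs too high. The thresholds and step sizes below are assumptions.

```python
# Hedged sketch of biofeedback-paced exposure; thresholds are illustrative assumptions.

def next_exposure_level(level, heart_rate_bpm, skin_conductance_us,
                        hr_ceiling=110, sc_ceiling=12.0, step=0.05):
    """Return the exposure intensity (0.0-1.0) for the next session segment."""
    if heart_rate_bpm > hr_ceiling or skin_conductance_us > sc_ceiling:
        return max(0.0, level - step)      # back off: the patient is over-aroused
    return min(1.0, level + step)          # otherwise, advance the exposure gently

# Example: arousal climbs mid-session, so the "velocity of exposure" slows and reverses.
level = 0.3
for hr, sc in [(92, 6.1), (101, 8.4), (118, 13.2), (112, 11.0)]:
    level = next_exposure_level(level, hr, sc)
    print(f"HR {hr} bpm, SC {sc} uS -> exposure level {level:.2f}")
```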

VR, Wiederhold says, is “one of the most powerful techniques we have available to us if used intelligently and correctly.”

Not all VR needs to incorporate head-mounted displays, however. Gregory Welch, professor of computer science at the University of Central Florida College of Nursing, is developing what he calls a physical-virtual patient simulator – an advanced, interactive mannequin. Conventional mannequins have long been used to teach students how to start an IV or intubate patients. But Welch aims to take mannequins to a whole new level of reality.

Welch’s mannequins are powered by computers and can visually simulate capillary response through translucent skin. They can also modify pulse and produce variations in temperature and breathing patterns. Race or gender can be changed in a flash. And the mannequins can move and communicate, if needed. Welch’s team is now focused on perfecting facial features, such as drooping motions in the lips and eyelids that might indicate the warning signs of a stroke.

Welch says there is still much work to do before these technologies will be widely accepted, but he sees a future in which such technologies work with, rather than replace, conventional training. “In my mind these are complementary,” he says. “They allow you to give different experiences to nurses or medical students.”

The need for training tools that can be tailored to the particular requirements of different users extends to just about every field. Indeed, one of the promises of developing these technologies is that, as they are spread across larger and larger user bases, the costs begin to fall. The Navy plans to begin shifting up to half of its naval nuclear reactor prototype training from textbook to hands-on simulation early in 2017, Nesselrode says.

“Instead of reading a tech manual and referring to either a separate print or schematic, you’ll be able to go through the virtual engine room and see the entire plant, and as the capability expands, operate the equipment either individually or as part of a watch team,” Nesselrode says. “That’s a training revolution.”
