Predictive Analytics and the Battle Against Wildfires

It’s autumn and California’s Santa Ana winds sweep in from the desert, carrying the threat of wildfires across the mountains to a land already tormented by drought.

The Santa Ana Wildfire Threat Index (SAWTI) tracks and predicts the extent of the threat, calculating daily just how dangerous the situation may become. Developed by the U.S. Department of Agriculture Forest Service’s Predictive Services, SAWTI helps agencies and citizens anticipate risk and prepare accordingly.

It’s one of a whole new range of digital tools in use today to help curb wildfires. Developed and managed by numerous agencies, often on shoestring budgets, these systems pull together data from many databases and are just beginning to leverage the computational power of cloud computing. They can’t do much once the fires are burning, but they can be powerful tools for planning and preparation in the nation’s forests and at the wildland-urban interface where people and nature collide.

Data-driven Decisions

Several federal agencies share a stake in managing wildfires: the Department of the Interior, the Department of Agriculture, the National Park Service, the U.S. Forest Service, the U.S. Fish and Wildlife Service, the U.S. Fire Administration, the National Weather Service and the Federal Emergency Management Agency. State and local agencies also get involved.

“There’s a lot of data from multiple disciplines and multiple areas that we look at and try to bundle into nice, neat packages for decision-makers,” says Ed Delgado, National Predictive Services program manager at the Bureau of Land Management’s National Interagency Fire Center (NIFC) in Boise. “Technology is big for us.” Delgado said the center relies on technology for a lot of number crunching, data mining and analysis work. “We try to keep up with the latest science and technology and we work with a pretty extensive and growing research community in the fire area to try to develop the tools we need.”

The data Delgado crunches includes inputs from satellites, aircraft, and ground observers. That information, plus data on weather, climate, vegetation, lightning strikes, past fire patterns, and the state of flammable materials (what fire insiders call fuels), combines to create a clear picture of wildfire vulnerability and a sense of where fire is most likely to strike.

“We try to establish patterns and probability in terms of historical data,” Delgado explains. “Our primary mission right now is to identify those areas where there is the greatest risk, so that resource managers and fire management folks can determine where they’re going to get the most benefit by putting and moving resources appropriately.”
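Delgado’s description suggests the general shape of such an analysis. The sketch below is a minimal, hypothetical Python illustration of ranking areas by the historical frequency of significant fires; the records, regions, and function names are invented for illustration and do not reflect NIFC’s actual models or data.

```python
from collections import defaultdict

# Hypothetical historical records: (region, month, significant fire that season?)
history = [
    ("Northern Rockies", 8, True),
    ("Northern Rockies", 8, False),
    ("Northern Rockies", 8, True),
    ("Southern California", 10, True),
    ("Southern California", 10, True),
    ("Alaska", 6, False),
]

def empirical_fire_probability(records):
    """Estimate P(significant fire) for each (region, month) from past seasons."""
    counts = defaultdict(lambda: [0, 0])  # (region, month) -> [fire seasons, total seasons]
    for region, month, burned in records:
        counts[(region, month)][0] += int(burned)
        counts[(region, month)][1] += 1
    return {key: fires / total for key, (fires, total) in counts.items()}

# Rank areas so managers can see where pre-positioning resources pays off most.
risk = empirical_fire_probability(history)
for (region, month), p in sorted(risk.items(), key=lambda kv: -kv[1]):
    print(f"{region} (month {month}): significant fire in {p:.0%} of past seasons")
```

In practice the inputs would be the satellite, fuels, and lightning data Delgado describes rather than a handful of booleans, but the principle of ranking by historical patterns is the same.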

Among their critical tools:

  • ROMAN, the Realtime Observation Monitor and Analysis Network, which collects observational weather data from across the United States and makes it available in real time. Originally developed by the National Oceanic and Atmospheric Administration’s Cooperative Institute for Regional Prediction with support from the Bureau of Land Management and the National Weather Service, ROMAN is now supported by the U.S. Forest Service and used by wildfire officials around the country
  • The National Digital Forecast Database, produced by the National Weather Service along with the National Centers for Environmental Prediction, which forecasts weather elements, such as cloud cover and maximum temperature, that analysts can use to help predict fire risk
  • FARSITE, the Fire Area Simulator, created by the U.S. Department of Agriculture’s Rocky Mountain Research Station, a fire growth simulation system that combines spatial information with weather, wind, and fuel data to model wildfire behavior (a simplified sketch of the idea follows this list)
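To make the fire-growth idea concrete, here is a deliberately simplified grid-based spread model. FARSITE itself uses a more sophisticated approach based on propagating the fire perimeter as a wave front; this toy cellular automaton, with every value invented for illustration, shows only how fuel and wind inputs can be combined to project fire spread over time.

```python
import random

random.seed(42)
SIZE = 10
# Hypothetical fuel map: 0.0 = bare ground, 1.0 = heavy fuel load.
fuel = [[random.random() for _ in range(SIZE)] for _ in range(SIZE)]
burning = {(SIZE // 2, SIZE // 2)}  # single ignition point at grid center
burned = set()
wind = (0, 1)  # hypothetical wind blowing east, as a (dy, dx) offset

def step(burning, burned):
    """Advance the fire one time step: each burning cell may ignite neighbors."""
    new = set()
    for y, x in burning:
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if (0 <= ny < SIZE and 0 <= nx < SIZE
                        and (ny, nx) not in burning and (ny, nx) not in burned):
                    # Spread probability rises with fuel load and downwind alignment.
                    p = fuel[ny][nx] * (1.5 if (dy, dx) == wind else 0.5)
                    if random.random() < p:
                        new.add((ny, nx))
    return new, burned | burning

for hour in range(5):
    burning, burned = step(burning, burned)
print(f"Cells burned or burning after 5 steps: {len(burned | burning)}")
```

Real fire-growth models layer terrain slope, fuel moisture, and spotting on top of this basic loop, which is why they need the spatial and weather databases the list above describes.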

Unlike large, structured government programs such as military weapons systems, these wildfire programs were developed piecemeal over the years as agencies have contributed capabilities.

John Horel, professor of atmospheric sciences at the University of Utah, said the approach has worked for ROMAN.

“It has grown organically over the past 15 years,” he said. “It really works so well because it is so distributed and heterogeneous.” Horel added the program is cost-effective and frequently used.

He estimated ROMAN was developed during a 12-year stretch at a cost of $700,000. Also available for public use, ROMAN logged 1.1 million sessions and received 7.3 million page views from 180,000 users in the first 10 months of 2015.

The FARSITE tool logged one million page views, with 216,000 unique users, during the same period.

Mark Finney, a research forester at the Fire Sciences Laboratory in Missoula, Mont., estimates the total cost of FARSITE since its 1991 inception at $2 million. Initially funded by the National Park Service and later by the departments of Agriculture and Interior, the program costs about $50,000 a year, Finney estimates. That’s pennies compared to the $1.5 billion the U.S. Forest Service says it spent to suppress fires in 2014 alone.

The most pressing need is increased server capacity, especially to handle the spikes in data traffic that come with fire season.

Having an Impact

Across the nation, fire safety officials are either incorporating predictive analyses into their forecasts or are planning to do so.

The National Institute of Standards and Technology seeks to develop a scale for wildfires similar to the Richter Scale for earthquakes or the Saffir-Simpson Scale for hurricanes. It would allow safety officials and the public to categorize wildfires based on risk and intensity. But the task is difficult because wildfires are subject to so many variables, including weather, wind, terrain, and fuels.
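No such scale yet exists, but the concept is straightforward to illustrate. The sketch below shows, in purely hypothetical terms, how a fireline-intensity reading might be binned into one of five categories, the way the Saffir-Simpson Scale bins hurricane wind speeds; the thresholds are invented for illustration and are not NIST’s.

```python
# Hypothetical sketch only: NIST has not published a wildfire scale, and
# these category cutoffs are invented to illustrate the binning concept.
def wildfire_category(fireline_intensity_kw_m: float) -> int:
    """Map fireline intensity (kW per meter of fire front) to a 1-5 category."""
    thresholds = [500, 2_000, 4_000, 10_000]  # hypothetical cutoffs
    return 1 + sum(fireline_intensity_kw_m > t for t in thresholds)

print(wildfire_category(3_500))  # -> category 3 under these invented thresholds
```

The hard part, as the article notes, is not the binning but choosing inputs that stay meaningful across wildly different weather, terrain, and fuel conditions.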

Jeremy Sullens, a fuels analyst on the National Predictive Services (NPS) Subcommittee, uses the FireFamilyPlus application, built by the Fire Sciences Laboratory in Missoula, Mont., to analyze fuel dangers. Sponsored by the U.S. Forest Service’s Fire and Aviation Management and developed by the Rocky Mountain Research Station, the application is now in its fifth version.

“A lot can go into what makes a fuel burn at any given point,” Sullens said. “From a fuels perspective, a lot of what we’re doing is building and refining our models as we go.”

According to Sullens, the impact of modeling and simulation was clear during the 2015 fire season.

“I think we did generally very well,” he said. “We noticed very early, around the end of March, that the Alaska snowpack was basically nonexistent in the southern half of the state and we knew that there were going to be some problems there.”

Tim Mowry, statewide public information officer for the Alaska Division of Forestry, said those predictions created a sense of urgency in fire season preparations. Firefighter training was intensified and the state requested more aircraft and smokejumpers earlier, receiving them a month sooner than usual.

Ultimately, as the season progressed, 2,800 firefighting personnel were brought in from the lower 48 states.

Sullens said predictions were on the mark elsewhere as well. “As we moved into the summer, we knew where our hotspots were going to be: the northern Rocky Mountains and portions of the Northwest,” he said. “We did a pretty good job of identifying the areas where significant fires were likely to occur.”

Forests in Oregon and Washington roared during the summer, with more than 3,800 fires burning some 1.6 million acres at a firefighting cost of at least $560 million.

Looking Ahead

Beyond coping with the next season’s fires, fire analysts are looking to do more than just evaluate existing conditions and make short-term predictions.

Tim Sexton manages the Wildland Fire Management Research Development and Application Program (WFM RDA), an interagency effort to coordinate wildfire analytics at the U.S. Forest Service Rocky Mountain Research Station in Fort Collins, Colo. He said a future step is to add artificial intelligence to enhance these systems’ predictive capability.

“To get there, we would have to collect vast amounts of information because of the high amount of variability on the landscapes and because of the way fire burns and the effectiveness and lack of effectiveness of different types of suppression actions on fire,” Sexton said.

For example, at the University of California at San Diego (UCSD), researchers and firefighters collaborated with others in 2013 to create WIFIRE, a real-time, data-driven simulation, prediction and visualization tool to understand and predict wildfire behavior. The team included the San Diego Supercomputer Center, the Qualcomm Institute at the California Institute for Telecommunications and Information Technology, UCSD’s Jacobs School of Engineering and the University of Maryland’s Department of Fire Protection Engineering.

“The real key advances in the future will be providing more real-time information to folks on the ground, as well as decision-makers in command posts and decision-makers back in dispatch centers and forest offices,” Sexton said, adding that crowd-sourced information, such as video, texts, or voice calls, can contribute instant insight.

“I think we’ll be crowdsourcing sooner rather than later in terms of folks taking pictures on the fire line of what the fire’s doing. Anemometers, hygrometers, other weather devices attached to mobile devices could then report every five or 10 minutes and help us with our weather forecasts for specific areas,” he said.
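No such reporting pipeline exists today. Purely as a hypothetical sketch of what Sexton envisions, one crowd-sourced field report might look like the data structure below; every field name and value is invented for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class FieldWeatherReport:
    """One hypothetical reading from a phone-attached sensor on the fire line."""
    latitude: float
    longitude: float
    wind_speed_mps: float         # from an anemometer
    relative_humidity_pct: float  # from a hygrometer
    observed_at: datetime

report = FieldWeatherReport(
    latitude=46.87, longitude=-113.99,  # invented fire-line location
    wind_speed_mps=6.2, relative_humidity_pct=18.0,
    observed_at=datetime.now(timezone.utc),
)
# Reports arriving every five or 10 minutes could feed spot-weather
# forecasts for the specific area, as Sexton describes.
print(report)
```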

Sexton uses Amazon Web Services (AWS) to leverage cloud computing for large analysis projects, dramatically cutting the time it takes to crunch the numbers. Additionally, his program is using IBM’s cloud infrastructure for maintenance and development of the Wildland Fire Decision Support System (WFDSS), which has paid off in improved reliability.

The heavy 2015 fire season accelerated the move to cloud solutions, Sexton said, because demand on the system was so great “in August that some of our servers couldn’t keep up,” and periodic failures necessitated the switch. “They dropped out numerous times; I don’t know if it was 10, 20 or 30 times. We brought them right back up, but it was very frustrating for the users,” he said. “By getting stuff in the cloud with more capacity and less likelihood of server failures, I think we’ll be able to avoid those kinds of things in the future.”

NIFC’s Delgado said the longer-term goal is to predict wildfires with the same certainty as weather.

“We’d like to be able to pinpoint where we’re actually going to have a fire – a specific location – kind of the way we do with other weather elements like thunderstorms and things like that,” said Delgado. “But those are far down the road.”

Getting there will take further investment in research and development by the fire community and by land management agencies, Sullens said. “To improve our science and forecasting – really, that’s the future and the way we’re going to be more responsive and 80 percent effective in preventing significant fires.”

Though wildfire prediction and analytics continue to make great strides, a fundamental conundrum remains for the analysts who crunch the data and look ahead for future hotspots.

“The hardest part about it is, if we prevent a fire, that’s a fire that never occurred,” Sullens said. “But we have no way of saying that we prevented that fire. We may very well have prevented a $50 million fire this year but we can’t say that. You can’t really measure success because you never really know when you were successful.”

Though success is difficult to measure, the cost of fires is great. Firefighting itself hasn’t changed much over the years. But with people living closer to where the fires flare up, the stakes keep rising.

“We were using Pulaskis [the basic wildfire-fighting tool] to fight fires a hundred years ago, and shovels and hoses, and we’re still doing that today,” WFM RDA’s Sexton said. “I think we’ll be using Pulaskis and shovels a hundred years from now to do the same kind of things, only with better information.”

The payoff will be in how that information is applied, he explained. “We’ll be needing to do less manual work and more of what we do with those firefighters on the ground will be more effective – and less of what we do will be putting those folks in dangerous places where they might get in trouble.”

David Silverberg is a veteran government and technology journalist and a consulting editor with GovTechWorks.
