
How the Air Force Changed Tune on Cybersecurity

Peter Kim, chief information security officer (CISO) for the U.S. Air Force, calls himself Dr. Doom. Lauren Knausenberger, director of cyberspace innovation for the Air Force, is his opposite. Where he sees trouble, she sees opportunity. Where he sees reasons to say no, she seeks ways to change the question.

For Kim, the dialogue the two have shared since Knausenberger left her job atop a private-sector tech consultancy to join the Air Force has been transformational.

“I have gone into a kind of rehab for cybersecurity pros,” he says. “I’ve had to admit I have a problem: I can’t lock everything down.” He knows. He’s tried.

The two engage constantly, debating and questioning whether the decisions and steps designed to protect Air Force systems and data are having their intended effect, they said while sharing a dais at a recent AFCEA cybersecurity event in Crystal City. “Are the things we’re doing actually making us more secure or just generating a lot of paperwork?” asks Knausenberger. “We are trying to turn everything on its head.”

As for Kim, she added, “Pete’s doing really well on his rehab program.”

One way Knausenberger has turned Kim’s head is her approach to security certification packages for new software. Instead of developing massive cert packages for every program – documentation that’s hundreds of pages thick and unlikely ever to be read – she wants the Air Force to certify the processes used to develop software, rather than the programs themselves.

“Why don’t we think about software like meat at the grocery?” she asked. “USDA doesn’t look at every individual piece of meat… Our goal is to certify the factory, not the program.”

Knausenberger says the Air Force is now trying to apply the same thinking to acquisition contracts, accepting the idea that since finding software vulnerabilities is inevitable, it’s best to have a plan for fixing them rather than hoping to regulate them out of existence. “So you might start seeing language that says, ‘You need to fix vulnerabilities within 10 days.’ Or perhaps we may have to pay bug bounties,” she says. “We know nothing is going to be perfect and we need to accept that. But we also need to start putting a level of commercial expectation into our programs.”

Combining development, security and operations into an integrated process – DevSecOps, in industry parlance – is the new name of the game, they argue together. The aim: Build security in during development, rather than bolting it on at the end.
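
To make the idea concrete, here is a minimal sketch of what building security into the build itself can look like, assuming a Python codebase and commonly used open-source scanners (bandit, pip-audit); the tools, paths and gate structure are illustrative, not anything the Air Force has specified:

```python
"""Minimal DevSecOps gate: run security checks inside the build, not after it.
Tool choices (bandit, pip-audit) and paths are illustrative stand-ins."""
import subprocess
import sys

SECURITY_CHECKS = [
    ["bandit", "-r", "src/", "-ll"],   # static analysis for common security bugs
    ["pip-audit"],                     # flag dependencies with known vulnerabilities
    ["pytest", "tests/security/"],     # project-specific security regression tests
]

def main() -> int:
    for cmd in SECURITY_CHECKS:
        print(f"running: {' '.join(cmd)}")
        if subprocess.run(cmd).returncode != 0:
            # Findings block the build immediately -- security is a merge gate,
            # not a post-deployment audit.
            print(f"security gate failed: {' '.join(cmd)}", file=sys.stderr)
            return 1
    print("all security gates passed")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```

The design point is the early exit: a failed check stops the build before the code ships, which is the “built in, not bolted on” distinction in miniature.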

The takeaway from the “Hack-the-Air-Force” bug bounty programs run so far: every such effort yields new vulnerabilities – and thousands of pages of certification documentation didn’t prevent them. As computing power becomes less costly and automation gets easier, hackers can be expected to use artificial intelligence to break through security barriers.

Continuous automated testing is the only way to combat that persistent threat, Kim said.

Michael Baker, CISO at systems integrator General Dynamics Information Technology, agrees. “The best way to find the vulnerabilities is to continuously monitor your environment and challenge your assumptions,” he says. “Hackers already use automated tools and the latest vulnerabilities to exploit systems. We have to beat them to it – finding and patching those vulnerabilities before they can exploit them. Robust and assured endpoint protection, combined with continuous, automated testing to find vulnerabilities and exploits, is the only way to do that.”
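
A sketch of what that continuous, automated checking might look like in miniature, with the advisory list, the inventory source and the 10-day remediation window from the contract language above all stood in as illustrative assumptions:

```python
"""Continuous vulnerability sweep (illustrative). A real deployment would pull
advisories from a CVE feed and the inventory from an asset database; both are
hard-coded stand-ins here."""
import time
from typing import NamedTuple

class Advisory(NamedTuple):
    package: str
    bad_version: str
    fix_deadline_days: int  # e.g. contract language: fix within 10 days

ADVISORIES = [
    Advisory("openssl", "1.0.1f", 10),
    Advisory("struts", "2.3.31", 10),
]

def installed_inventory() -> dict[str, str]:
    # Stand-in for a real asset-inventory query
    return {"openssl": "1.0.1f", "nginx": "1.25.3"}

def sweep() -> list[Advisory]:
    inventory = installed_inventory()
    return [a for a in ADVISORIES if inventory.get(a.package) == a.bad_version]

if __name__ == "__main__":
    while True:  # continuous, not point-in-time: re-check on a schedule
        for finding in sweep():
            print(f"VULNERABLE: {finding.package} {finding.bad_version} "
                  f"- remediation due within {finding.fix_deadline_days} days")
        time.sleep(3600)
```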

“I think we ought to get moving on automated security testing and penetration,” Kim added. “The days of RMF [risk management framework] packages are past. They’re dinosaurs. We’ve got to get to a different way of addressing security controls and the RMF process.”

Design Thinking and DevOps Combine for Better Customer Experience

How citizens interact with government websites tells you much about how to improve – as long as you’re paying attention, said Aaron Wieczorek, digital services expert with the U.S. Digital Service team at the Department of Veterans Affairs.

“At VA we will literally sit down with veterans, watch them work with the website and apply for benefits,” he said. The aim is to make sure the experience is what users want and expect, he said, not “what we think they want.”

Taking copious notes on their observations, the team then sets to work on programming improvements that can be quickly put to the test. “Maybe some of the buttons were confusing or some of the way things work is confusing – so we immediately start reworking,” Wieczorek explained.

Applying a modern agile development approach means digital services can immediately put those tweaks to the test in their development environment. “If it works there, good. Then it moves to staging. If that’s acceptable, it deploys into production,” Wieczorek said.
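
In sketch form, the promotion flow Wieczorek describes is a simple gate at each environment; the function and environment names below are hypothetical:

```python
"""Environment promotion gate: a change advances dev -> staging -> production
only if checks pass at each step. All names here are hypothetical."""
from typing import Callable

def run_checks(env: str) -> bool:
    # Stand-in for the real test suite run against the given environment
    print(f"running checks in {env}...")
    return True

def promote(change_id: str, pipeline: list[str],
            check: Callable[[str], bool] = run_checks) -> str:
    deployed_to = "nowhere"
    for env in pipeline:
        if not check(env):
            print(f"{change_id} stopped: checks failed in {env}")
            return deployed_to
        deployed_to = env
        print(f"{change_id} deployed to {env}")
    return deployed_to

if __name__ == "__main__":
    promote("rework-confusing-buttons", ["development", "staging", "production"])
```

A change that fails in staging never reaches production, which is what lets the team push tweaks within days rather than batching them into big releases.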

That process can happen in days. Vets.gov deploys software updates into production 40 times per month, Wieczorek said, and, agency-wide, to all kinds of environments 600 times per month.

Case in point: Vets.gov’s digital Form 1010 EZ, which allows users to apply for VA healthcare online.

“We spent hundreds of hours watching veterans, and in the end we were able to totally revamp everything,” Wieczorek said. “It’s actually so easy now, you can do it all on your phone.” More than 330,000 veterans have applied that way since the digital form was introduced. “I think that’s how you scale things.”

Of course, one problem remains: Vets.gov is essentially a veteran-friendly alternative site to VA.gov, which may not be obvious to search engines or veterans looking for the best way in the door. Search Google for “VA 1010ez” and the old, mobile-unfriendly PDF form still shows as the top result. The new mobile-friendly application? It’s the third choice.

At the National Geospatial-Intelligence Agency, developers take a similar approach, but focus hard on balancing speed, quality and design for maximum results. “We believe that requirements and needs should be seen like a carton of milk: The longer they sit around, the worse they get,” said Corry Robb, product design lead in the agency’s Office of GEOINT Services. “We try to handle that need as quickly as we can and deliver that minimally viable product to the user’s hands as fast as we can.”

DevOps techniques, where development and production processes take place simultaneously, increase speed. But speed alone is not the measure of success, Robb said. “Our agency needs to focus on delivering the right thing, not just the wrong thing faster.” So in addition to development sprints, his team has added “design sprints to quickly figure out the problem-solution fit.”

Combining design thinking – which focuses on using design to solve specific user problems – with DevOps is critical to the methodology, he said. “Being hand in hand with the customer – that’s one of the core values our group has.”

“Iterative development is a proven approach,” said Dennis Gibbs, who established the agile development practice in General Dynamics Information Technology’s Intelligence Solutions Division. “Agile and DevOps techniques accelerate the speed of convergence on a better solution. We continually incorporate feedback from the user into the solution, resulting in a better capability delivered faster to the user.”

Can Micro Certifications Match Government’s Real-World Needs?

Certifications are the baseline for assessing proficiency in information technology and cybersecurity skills. But many career cyber professionals say required certs underestimate the value of on-the-job training and experience.

Others complain about the cost of obtaining and maintaining certifications that often don’t match up to the narrow responsibilities of a given job. They see emerging micro certifications – short, online training programs followed by knowledge exams – as more practical in many cases.

At the same time, new national standards are taking root to try to better tie knowledge requirements to specific job categories and certifying organizations are revising their programs.

“Nothing replaces real world expertise,” said Anil Karmel, founder and chief executive at IT startup C2 Labs of Reston, Va., and a former deputy chief technology officer at the Energy Department’s National Nuclear Security Administration (NNSA). “Having on-the-job skills is invaluable, especially in the cyber realm when you are entrusted to protect our nation’s critical IT assets.” The aim is to strike the right balance between real-world experience, certifications and training, Karmel said.

Certifications were crucial early in his career, Karmel said. As he moved into mid-to-senior-level positions, however, his needs changed, and so did the qualifications required for those jobs. The higher you go, the more experience and past performance define your capabilities.

Karmel was a solutions architect at the Energy Department’s Los Alamos National Laboratory in Los Alamos, N.M., where he helped develop and launch a private cloud that let researchers automatically request virtual servers on demand.

“As I grew and became a systems administrator, I focused on industry or vendor-specific certifications, such as VMware, which enabled me to build the private cloud at Los Alamos.”

Certifications can be seen as a baseline, onto which you may want to add additional skills. Micro training programs are one way to do that, Karmel noted.

For instance, a security analyst might need to move beyond incident response – reacting to events that could have a negative impact on an organization’s network – to learn about incident management. It focuses on preparing formal policies and procedures, as well as having the necessary tools in place, to thwart cyber threats. An online micro certification class on the Incident Management Lifecycle could meet that need.

Micro Certification: A Growing Trend?
Micro certifications are narrowly focused, non-traditional skills training in which participants can earn a credential within a matter of days, versus months or years for traditional technical certification programs.

Micro certifications are well-liked by workers – and supervisors – according to a January 2017 Linux Academy/Cybrary survey of 6,000 IT pros. They allow employees to rapidly gain specific knowledge sets that answer specific needs.

“The growing micro certification trend is driven predominantly by industries such as IT and cybersecurity that have a workforce skills gap where jobs can’t be filled because of a lack of qualified applicants,” according to Anthony James, CEO and founder of Linux Academy, an online Linux and cloud training firm based in Fort Worth, Texas.

The survey, conducted in partnership with Cybrary, a provider of no-cost, open source cybersecurity Massive Open Online Courses (MOOCs), found IT professionals use micro certification programs to keep up with changing technologies and also learn at their own pace. Some 86 percent of respondents said they prefer learning and testing in small increments to receive IT skill credentials.

Thirty-five percent of respondents said that micro certifications have either helped them get or advance in a job; 70 percent think their company would benefit from partnering with micro certification providers; and 85 percent would most likely pursue micro certifications if employers facilitated the offering.

Opinions on micro certification versus traditional IT training varied. More than half – 58 percent – of respondents said micro certifications convey the same level of technical proficiency as traditional training and more than 94 percent believe that micro certifications give entry-level job candidates an edge in competing for jobs.

In terms of costs, 82 percent of respondents said micro certifications are more affordable than traditional IT training. Fifty-eight percent of those surveyed paid $25 or more for their own micro certification courses. Most respondents believe their company spends an average of up to $25,000 annually on IT skills training for employees.

Difference Between Certificates and Certification
Many government and government contractor jobs require certifications from established organizations, such as the Certified Information Systems Security Professional (CISSP) certification conferred by (ISC)2, which offers a portfolio of credentials that are part of a holistic, programmatic approach to security. Candidates must have a minimum of five years of paid full-time work experience in two of the eight domains of the CISSP Common Body of Knowledge (CBK). It covers application development security, cloud computing, communications and network security, identity and access management, mobile security, risk management and more.

CISSP certification is costly, ranging from $2,000 to $4,000, depending on the choice of study – CISSP Boot Camp, regular classroom or online training. The six-hour exam alone costs $599.

But Dan Waddell, managing director for North America with (ISC)2, doesn’t see such certifications going away.

“I don’t believe the certification requirement is overkill and I believe most cybersecurity executives in the government would agree,” Waddell said.

According to the recently released federal results of the 2017 Global Information Security Workforce Study, 73 percent of federal agencies require their IT staff members to hold information security certifications. The survey of over 19,600 InfoSec professionals includes responses from 2,620 U.S. Department of Defense (DOD), federal civilian and federal contractor employees. The study was conducted by The Center for Cyber Safety and Education and sponsored by (ISC)2, Booz Allen Hamilton, and Alta Associates. Findings of the report will be released throughout 2017 in a series of dedicated reports.

To effectively retain existing InfoSec professionals and attract new hires, federal respondents indicated that offering training programs, paying for professional cybersecurity certifications, boosting compensation, and providing more flexible and remote work schedules and opportunities were among the most important initiatives.

Still, Waddell acknowledged that traditional certifications must evolve over time, and that (ISC)2 must develop ways to support government efforts to move toward a more performance-based certification system.

Micro certifications aren’t necessarily a replacement for baseline job requirements, however. Scott Cassity, senior director at the Maryland-based SANS Institute Global Information Assurance Certification (GIAC) center, said there is room for both in the complex and rapidly evolving world of cybersecurity.

“I can appreciate folks saying [they need] more bite-size micro certifications,” Cassity said. “I can appreciate that there might be some particular bite-size training we need on a particular tool, a particular technique.

“But if you back up and say: ‘Hey, I want someone who can be a defender. I want them to have a broad range of skills.’ Then, we don’t think that is something that will be absorbed in bite-size chunks,” Cassity continued. “It is going to be very rigorous and challenging training. It is studying above and beyond that training.”

Take the GIAC program, for example, which offers several dozen certifications for a range of skill sets. Courses typically run four months and cost $1,249. Most students spend 40 to 50 hours studying outside the classroom, Cassity said. Like (ISC)2, GIAC is a DOD-approved credentialing body, and its programs meet requirements laid out in DOD Directive 8570, which sets training, certification and management requirements for government employees involved with information assurance and security.

(ISC)2’s Waddell agrees there is a difference between a certificate covering practical cybersecurity knowledge or a specific skill set, and professional certification that more rigorously assesses a broader range of knowledge, skills and competencies.

The cybersecurity industry keeps evolving, Cassity said. Fundamental skills for information security will stand the test of time. But with mobile security, forensics and other rapidly growing technologies, job functions and certifications must change, as well.

Building a Skills-based Workforce
Federal agencies are looking to adopt the skills-based workforce definitions developed under the National Initiative for Cybersecurity Education (NICE), a partnership between government, academia, and the private sector that’s managed by the National Institute of Standards and Technology (NIST). NICE aims to standardize expectations for cybersecurity education, training, and workforce development across the industry to level-set expectations for employers and employees alike.

“We are not in favor of check-the-box for knowledge and skills,” said Rodney Petersen, NICE director at NIST.  “We really want a robust process for validating an employee’s knowledge, skills and abilities or a job seeker’s knowledge, skills, and abilities.”

The NICE Cybersecurity Workforce Framework (NCWF) – released by NIST in November 2016 – is the centerpiece, describing seven broad job categories: securely provision; operate and maintain; protect and defend; analyze; collect and operate; oversee and govern; and investigate. It also includes 31 specialty areas and 50 work roles, each predicated on specific knowledge and skills, Petersen said.
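
The framework’s shape – categories containing specialty areas, which define work roles and the knowledge behind them – maps naturally onto a nested structure. The sketch below uses a small, paraphrased subset of entries for illustration, not the full framework:

```python
"""Sketch of the NCWF's shape: categories -> specialty areas -> work roles and
the knowledge behind them. Entries are an abbreviated, paraphrased subset."""

NCWF = {
    "Protect and Defend": {
        "Incident Response": {
            "work_roles": ["Cyber Defense Incident Responder"],
            "knowledge": ["incident handling methodologies",
                          "network defense tools"],
        },
    },
    "Operate and Maintain": {
        "Systems Administration": {
            "work_roles": ["System Administrator"],
            "knowledge": ["operating system hardening",
                          "identity and access management"],
        },
    },
}

def roles_for_category(category: str) -> list[str]:
    """Flatten the work roles under one of the seven broad categories."""
    return [role
            for specialty in NCWF.get(category, {}).values()
            for role in specialty["work_roles"]]

print(roles_for_category("Protect and Defend"))
```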

NICE aims to improve education programs, co-curriculum experiences, training and certification to increase the quality of those credentials, he added.

NICE also impacts certifications. Defense Department Directive 8140, Cyberspace Workforce Management, issued in August 2015, sets the stage for replacing DOD’s certification-based requirements with skill-based assessments rooted in NICE.

According to the 2017 Global Information Security Workforce Study, 30 percent of federal respondents said their organizations have at least partially adopted the NICE Cybersecurity Workforce Framework.

The U.S. Department of Homeland Security (DHS) is using the NICE framework to build up its cybersecurity workforce. As a government-wide workforce framework, NICE “helps us to implement best practices, to identify, find and recruit the really good people,” Phyllis Schneck, DHS deputy undersecretary, told GovTechWorks last year.

Some certifying organizations are starting to develop new “performance-based” certifications that are more in line with the NICE standard: ISACA unveiled its Cyber Security Nexus Practitioner (CSXP) certification, which tests a candidate’s skills in a live, virtual cyber-lab; CompTIA’s A+, Network+, Security+ and CompTIA Advanced Security Practitioner (CASP) certifications also include performance-based assessments.

Both ISACA and CompTIA are building their new hands-on programs around the NICE standards and definitions. NICE doesn’t undo the call for certifications, but instead emphasizes functional roles to better align candidates’ skills with specific job functions.

(ISC)2 began mapping CISSP certification requirements to the NICE Cybersecurity Workforce Framework last year, Waddell said.

“Certification is just the beginning,” he added. “You are now required to maintain that certification. You are required to set aside a certain number of hours per year to maintain that certification.” Those Continuing Professional Education (CPE) hours can include hands-on training or even skilled-based micro certs.

“In a perfect world,” Waddell said, “a certification program and certificate program can co-exist in a healthy way.”

Gotta Get Mobile: Citizens Drive Changes in .Gov Web Design

More than one in three visitors to Federal Websites use mobile or tablet devices today, following a national trend that saw mobile Web usage surpass personal computers for the first time in 2014, and has continued growing since.

Many Federal agencies are unprepared for the shift, however. Some have recently updated their sites to be mobile-friendly, but others are scrambling to update systems developed before mobile was a significant concern.

The heat rose last April, when Google changed its search engine to give preference to mobile-friendly sites. At the time, the government IT news site NextGov tested two dozen of the government’s largest websites using Google’s mobile-friendly webmaster tool. Eleven sites failed, including defense.gov, DHS.gov, IRS.gov and VA.gov. Nine months later, defense.gov is the only one of the four that passes the Google mobile test.

Upgrading Websites is a challenge to begin with, and adding mobile to the mix is just one more wrinkle to be ironed out. The Department of Homeland Security’s U.S. Citizenship and Immigration Service (USCIS) has been working on its site update for more than a year, focusing on mobile first and using responsive design technology that will adapt to different screen sizes on the fly.

At the same time, the General Services Administration’s (GSA) 18F organization is working with the White House U.S. Digital Services group to develop government-wide standards for Federal Websites. The draft U.S. Design Standards, which are voluntary, aim to provide federal IT developers and industry partners with a roadmap and toolset to help spread clear design and navigation standards across all government sites.

“When the American people go online to access government services, they’re often met with confusing navigation systems, a cacophony of visual brands, and inconsistent interaction patterns,” explained project leader Mollie Ruskin, announcing the standards on an 18F blog post.

The new standards include open-source user interface (UI) components, a visual style guide and industry-standard guidance for accessibility and design.

Going Mobile

At USCIS, leaders pursued responsive design because mobile was what their constituents wanted, says Jeffrey Levy, chief of E-Communications in USCIS’ Office of Communications. The new site will give mobile users access to all the same features and functions as the desktop version, which isn’t the case today.

“You get a layout that works on any size screen so you don’t have to maintain two servers, two web sites,” he says.

Responsive design is part of a broader strategy to make it easy for customers to get answers to their questions and apply for services without requiring help from a representative in person or on the phone. The more customers can help themselves with routine matters, either online or using automated phone systems, the more time representatives will have to resolve more complex issues that require human intervention. Toward that end, USCIS launched a virtual assistant feature in December. Named “Emma,” it helps Web visitors quickly answer questions and locate information, easing call center workload.

Unhappy Customers

Improving citizen Web services to be as simple and intuitive as consumer sites, such as Amazon or Netflix, is a major goal of the Obama administration. Executive Order 13571, Streamlining Service Delivery and Improving Customer Service, declared in 2011 that “government managers must learn from what is working in the private sector and apply these best practices to deliver services better, faster, and at lower cost.” The order emphasizes the need to offer online and mobile solutions and to reduce “the overall need for customer inquiries and complaints.”

In contrast, citizens’ satisfaction with government services has declined steadily. According to an American Customer Satisfaction Index report released last year, citizen satisfaction with Federal government services plunged in 2014 to its lowest level since at least 2007.

[Chart: Citizen Satisfaction With Federal Government Services]

“Overall, the services of the federal government continue to deliver a level of customer satisfaction below the private sector and the downturn this year exacerbates the difference,” the report states.

Alan Webber, a research director with IDC Government Insights, says government agencies don’t benefit from the competitive pressures seen in consumer markets. The Social Security Administration, for example, is the only source of information about your benefits, so there’s little natural incentive for the agency to upgrade its online services.

For the public, Webber says, this plays out in diminished expectations. “We have the expectation that any engagement or any interaction with government isn’t necessarily going to be easy,” he says. “We want it to be easy, but it is not necessarily going to be that way.”

New Federal Standards

Changing those expectations is the driving force behind a joint effort of the General Services Administration’s 18F organization and the White House’s U.S. Digital Services group. In September 2015, the pair introduced a first-ever effort to provide federal IT developers and industry partners with clear design and navigation guidance for government sites.

Much of the guidance in these draft Web Design Standards incorporates open source designs, code and patterns from other civic and government organizations.

“Like any true alpha, this is a living product,” Ruskin wrote in the 18F blog. As a result, the 18F and U.S. Digital Services team will continue to test its UI decisions and assumptions with real-world feedback, letting the standards evolve over time. Designers are encouraged to explore the U.S. Web Design Standards, contribute their own code and ideas, and leave feedback on GitHub. The project team will use this input to improve the standards and make regular releases over the coming months, according to Ruskin.

Designers seeking other sources of insight and best practices can also look to the Content Managers Forum and the Social Media Community of Practice, both managed by GSA; the Federal Communicators Network; and the Federal User Experience Community, Levy says.

The new U.S. Web Design Standards do not come with a mandate or requirement; their use is entirely voluntary. But already some sites have begun to incorporate the standards, including a voter registration portal for USA.gov accessible at vote.usa.gov. USAJobs.gov has also incorporated some of the design standards into its own system, according to 18F officials.

Whether every agency website should have the same look and feel, or be distinctive in its own right, is open to debate. Some say yes, others no; still others argue for a middle ground in which a basic set of building blocks, such as a narrow selection of available fonts, is used to establish a modicum of order and consistency. But even that is easier said than done, Levy says. Changing the font on websites can require programming and layout changes, which cost time and money. Within USCIS alone, he says, four or five platforms would have to be changed just to standardize fonts.

Putting the User First

Simply changing fonts on existing sites may not be worth the effort. Rather than looking back, agencies are better served putting current effort into future improvements that not only meet, but exceed customer expectations.

David Simeon, chief of USCIS’s Innovation and Technology Division and myUSCIS Product Manager within the agency’s Customer Service and Public Engagement Directorate, says any site redesign needs to start with the customer.

“We start with user research, interviewing customers, understanding their needs and motivation. Then we determine what types of products [USCIS can develop] that would suit those needs,” Simeon says. The web development teams are made up of USCIS designers and contractors.

Once initial apps are designed, usability testing helps ferret out problems.

Simeon’s team has worked with Levy to build tools and applications for the USCIS website and add new capabilities to the my.uscis.gov page, enabling customers to check the status of their cases, or determine what options and benefits are available to them by entering information into a pull-down menu.

Based on that information, USCIS narrows the benefits and options to each customer’s specific needs and provides links to resources for beginning the application process. For instance, by identifying themselves under Explore My Options as a citizen, a Green Card holder, an employer, a foreign national or an “individual without lawful immigration status,” customers see a customized menu of choices that gets them the information they need more quickly.
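
Functionally, the behavior described above reduces to a lookup table keyed on a self-identified user type. A toy sketch, with placeholder menu entries rather than actual USCIS content:

```python
"""'Explore My Options' as a lookup: a self-identified user type keys a
tailored menu. Menu entries are placeholders, not actual USCIS content."""

OPTIONS_BY_USER_TYPE = {
    "citizen": ["petition for a family member", "replace a certificate"],
    "green card holder": ["renew or replace a green card",
                          "apply for naturalization"],
    "employer": ["petition for a worker", "verify employment eligibility"],
    "foreign national": ["extend or change status", "check case status"],
}

def explore_my_options(user_type: str) -> list[str]:
    # Unknown user types fall back to the full catalog
    return OPTIONS_BY_USER_TYPE.get(user_type.lower(),
                                    ["see all benefits and options"])

print(explore_my_options("Green Card Holder"))
```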

Other new tools help users complete citizenship applications and study for the citizenship exam, maintain appointments and receive text-message alerts. Coming soon: A secure message app will let visitors communicate online with immigration officers.

All of these improvements evolved from research, including the site’s emphasis on mobile devices, Simeon says. “We had to go responsive because mobile is the way they access the Internet.”

Now Simeon’s team is working to improve the citizenship e-filing experience so customers can submit immigration applications and petitions online. USCIS conducted usability testing for a new naturalization form by using real applicants from around the country. “They have given us pointers about what works and what doesn’t work,” he says.

The objective, USCIS E-Communications chief Levy says: “Help people to use our online resources in the way they think of it, rather than forcing them to think of it the way we think of it.”

Predictive Analytics and the Battle Against Wildfires

It’s autumn and California’s Santa Ana winds sweep in from the desert, carrying the threat of wildfires across the mountains to a land already tormented by drought.

The Santa Ana Wildfire Threat Index (SAWTI) tracks and predicts the extent of the threat, calculating daily just how dangerous the situation may become. Developed by the U.S. Department of Agriculture’s Forest Service Predictive Services, SAWTI helps agencies and citizens anticipate risk and prepare accordingly.

It’s one of a whole new range of digital tools in use today to help curb wildfires. Developed and managed by numerous agencies and often on shoestring budgets, these systems pull numerous databases together and are just beginning to leverage the computational powers of cloud computing. They can’t do much once the fires are burning, but they can be powerful tools for planning and preparation in the nation’s forests and at the wildland-urban interface where people and nature collide.

Data-driven Decisions

Several Federal agencies share a stake in managing wildfires: the Department of the Interior, the Department of Agriculture, the U.S. Park Service, the U.S. Forest Service, the U.S. Fish and Wildlife Service, the U.S. Fire Administration, the U.S. Weather Service and the Federal Emergency Management Agency. State and local agencies also get involved.

“There’s a lot of data from multiple disciplines and multiple areas that we look at and try to bundle into nice, neat packages for decision-makers,” says Ed Delgado, National Predictive Services program manager at the Bureau of Land Management’s National Interagency Fire Center (NIFC) in Boise. “Technology is big for us.” Delgado said the center relies on technology for a lot of numbers crunching, data mining and analysis work. “We try to keep up with the latest science and technology and we work with a pretty extensive and growing research community in the fire area to try to develop the tools we need.”

The data Delgado crunches includes inputs from satellites, aircraft, and ground observers. That information, plus data on weather, climate, the state of flammable materials – what fire insiders call fuels – vegetation, lightning strikes, and past fire patterns, all combine to create a clear picture of wildfire vulnerability and a sense of where fire is most likely to strike.

“We try to establish patterns and probability in terms of historical data,” Delgado explains. “Our primary mission right now is to identify those areas where there is the greatest risk, so that resource managers and fire management folks can determine where they’re going to get the most benefit by putting and moving resources appropriately.”
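
As a toy illustration of that patterns-and-probability approach, a score might weight current conditions against historical fire frequency; every input, weight and scale below is invented for illustration and far simpler than the operational models:

```python
"""Toy fire-risk score blending current conditions with historical frequency.
All inputs, weights and scales are invented for illustration."""

def fire_risk_score(fuel_dryness: float,        # 0..1, from fuels sampling
                    wind_speed_kph: float,       # forecast wind
                    humidity_pct: float,         # forecast relative humidity
                    historical_fire_rate: float  # 0..1, from past fire patterns
                    ) -> float:
    """Return a 0..1 score; higher means prioritize resources here."""
    return (0.4 * fuel_dryness
            + 0.2 * min(wind_speed_kph / 60.0, 1.0)
            + 0.2 * (1.0 - min(humidity_pct / 100.0, 1.0))
            + 0.2 * historical_fire_rate)

score = fire_risk_score(fuel_dryness=0.8, wind_speed_kph=45,
                        humidity_pct=12, historical_fire_rate=0.6)
print(f"risk score: {score:.2f}")
```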

Among their critical tools:

  • ROMAN, the Realtime Observation Monitor and Analysis Network, which collects observational weather data from across the United States and makes it available in real time. Originally developed by the National Oceanographic and Atmospheric Administration’s Cooperative Institute for Regional Prediction with support from the Bureau of Land Management and the National Weather Service, ROMAN is now supported by the U.S. Forest Service and used by wildfire officials around the country
  • The National Digital Forecast Database, produced by the Weather Service, along with the National Centers for Environmental Prediction, forecasts weather elements, such as cloud cover and maximum temperature that analysts can use to help predict fire risk
  • FARSITE (the Fire Area Simulator), created by the U.S. Department of Agriculture’s Rocky Mountain Research Station, is a fire growth simulation system that combines spatial information with weather, wind and fuel data to model wildfire behavior; a simplified analogue appears in the sketch below
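
For a sense of what a fire-growth simulator does, here is a drastically simplified analogue of the FARSITE idea: fire spreads across a grid of fuel cells, biased by wind. The probabilities, wind treatment and grid are invented; real models couple terrain, weather streams and fuel moisture:

```python
"""Toy fire-growth model: fire spreads to adjacent fuel cells each step,
more readily downwind. A drastically simplified analogue of FARSITE."""
import random

FUEL, BURNING, BURNED = "F", "*", "x"

def step(grid: list[list[str]], wind_bias: float = 0.3) -> list[list[str]]:
    """One time step; an eastward wind makes eastward spread more likely."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != BURNING:
                continue
            new[r][c] = BURNED  # burning cells burn out after one step
            for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols and grid[rr][cc] == FUEL:
                    if random.random() < 0.5 + (wind_bias if dc == 1 else 0.0):
                        new[rr][cc] = BURNING
    return new

grid = [[FUEL] * 12 for _ in range(5)]
grid[2][1] = BURNING  # ignition point on the west edge
for _ in range(8):
    grid = step(grid)
print("\n".join("".join(row) for row in grid))
```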

Unlike large, structured government programs such as military weapons systems, these wildfire programs were developed piecemeal over the years as agencies have contributed capabilities.

John Horel, professor of atmospheric sciences at the University of Utah, said the approach has worked for ROMAN.

“It has grown organically over the past 15 years,” he said. “It really works so well because it is so distributed and heterogeneous.” Horel added the program is cost-effective and frequently used.

He estimated ROMAN was developed during a 12-year stretch at a cost of $700,000. Also available for public use, ROMAN logged 1.1 million sessions and received 7.3 million page views from 180,000 users in the first 10 months of 2015.

The FARSITE tool logged one million page views, with 216,000 unique users, during the same period.

Mark Finney, a research forester at the Missoula, Mont., Fire Sciences Laboratory, estimates the cost of FARSITE since its 1991 inception at $2 million. Initially funded by the U.S. Park Service and later by the departments of Agriculture and Interior, the program now costs about $50,000 a year, Finney estimates. That’s pennies compared to the $1.5 billion the U.S. Forest Service says it cost to suppress fires in 2014 alone.

The most pressing need is increased server capacity, especially for spikes in data during fire season.

Having an Impact

Across the nation, fire safety officials are either incorporating predictive analyses into their forecasts or are planning to do so.

The National Institute of Standards and Technology seeks to develop a scale for wildfire similar to the Richter Scale for earthquakes or the Saffir-Simpson Scale for hurricanes. It would allow safety officials and the public to categorize wildfires based on risk and intensity. But the task is difficult because wildfires are subject to so many variables, including weather, wind, terrain, and fuels.

Jeremy Sullens, a fuels analyst on the National Predictive Services (NPS) Subcommittee, uses the FireFamilyPlus application, built by the Missoula, Mont., Fire Sciences Laboratory, to analyze fuel dangers. Sponsored by U.S. Forest Service Fire and Aviation Management and developed by the Rocky Mountain Research Station, the application is now in its fifth version.

“A lot can go into what makes a fuel burn at any given point,” Sullens said. “From a fuels perspective, a lot of what we’re doing is building and refining our models as we go.”

According to Sullens, the impact of modeling and simulation was clear during the 2015 fire season.

“I think we did generally very well,” he said. “We noticed very early, around the end of March, that the Alaska snow pack was basically nonexistent in the southern half of the state and we knew that there were going to be some problems there.”

Tim Mowry, statewide public information officer for the Alaska Division of Forestry, said those predictions created a sense of urgency in fire season preparations. Firefighter training was intensified and the state requested more aircraft and smokejumpers earlier, receiving them a month sooner than usual.

Ultimately, as the season progressed, 2,800 firefighting personnel were imported from the lower 48 states.

Sullens said predictions were on the mark elsewhere as well. “As we moved into the summer, we knew where our hotspots were going to be: the northern Rocky Mountains and portions of the Northwest,” he said. “We did a pretty good job of identifying the areas where significant fires were likely to occur.”

Forests in Oregon and Washington roared during the summer, with more than 3,800 fires burning some 1.6 million acres, and at a firefighting cost of at least $560 million.

Looking Ahead

Beyond coping with the next season’s fires, fire analysts are looking to do more than just evaluate existing conditions and make short-term predictions.

Tim Sexton manages the Wildland Fire Management Research, Development and Application Program (WFM RDA), an interagency effort to coordinate wildfire analytics at the U.S. Forest Service Rocky Mountain Research Station in Fort Collins, Colo. He said a future step is to add artificial intelligence to enhance these systems’ predictive capability.

“To get there, we would have to collect vast amounts of information because of the high amount of variability on the landscapes and because of the way fire burns and the effectiveness and lack of effectiveness of different types of suppression actions on fire,” Sexton said.

For example, at the University of California at San Diego (UCSD), researchers and firefighters collaborated with others in 2013 to create WIFIRE, a real-time, data-driven simulation, prediction and visualization tool to understand and predict wildfire behavior. The team included the school’s Supercomputer Center, the Qualcomm Institute at the California Institute for Telecommunications and Information Technology, UCSD’s Jacobs School of Engineering and the University of Maryland’s Department of Fire Protection Engineering.

“The real key advances in the future will be providing more real-time information to folks on the ground, as well as decision-makers in command posts and decision-makers back in dispatch centers and forest offices,” Sexton said, adding that crowd-sourced information, such as video, texts or voice calls, can contribute instant insight.

“I think we’ll be crowdsourcing sooner rather than later in terms of folks taking pictures on the fire line of what the fire’s doing. Anemometers, hydrometers, other weather devices attached to mobile devices could then report every five or 10 minutes and help us with our weather forecasts for specific areas,” he said.

Sexton uses Amazon Web Services (AWS) to leverage cloud computing for large analysis projects, dramatically cutting the time it takes to crunch the numbers. Additionally, his program is using IBM’s cloud infrastructure for maintenance and development of the Wildland Fire Decision Support System (WFDSS), which has paid off in improved reliability.

The heavy 2015 fire season accelerated the move to cloud solutions, Sexton said, because demand on the system was so great “in August that some of our servers couldn’t keep up,” and periodic failures necessitated the switch. “They dropped out numerous times; I don’t know if it was 10, 20 or 30 times. We brought them right back up, but it was very frustrating for the users,” he said. “By getting stuff in the cloud with more capacity and less likelihood of server failures, I think we’ll be able to avoid those kinds of things in the future.”

NIFC’s Delgado said that longer term, the goal is to predict wildfires with the same certainty as weather.

“We’d like to be able to pinpoint where we’re actually going to have a fire – a specific location – kind of the way we do with other weather elements like thunderstorms and things like that,” said Delgado. “But those are far down the road.”

Getting there will take further fire community investment in research and development and investments by land management agencies, Sullens said. “To improve our science and forecasting – really, that’s the future and the way we’re going to be more responsive and 80 percent effective in preventing significant fires.”

Though wildfire prediction and analytics continue to make great strides, a fundamental conundrum remains for the analysts who crunch the data and look ahead for future hotspots.

“The hardest part about it is, if we prevent a fire, that’s a fire that never occurred,” Sullens said. “But we have no way of saying that we prevented that fire. We may very well have prevented a $50 million fire this year but we can’t say that. You can’t really measure success because you never really know when you were successful.”

Success may be hard to measure, but the cost of fires is great. Firefighting itself hasn’t changed much over the years. But with people living ever closer to where fires flare up, the stakes keep rising.

“We were using Pulaskis [the basic wildfire-fighting tool] to fight fires a hundred years ago, and shovels and hoses, and we’re still doing that today,” WFM RDA’s Sexton said. “I think we’ll be using Pulaskis and shovels a hundred years from now to do the same kind of things, only with better information.”

The payoff will be in how that information is applied, he explained. “We’ll be needing to do less manual work and more of what we do with those firefighters on the ground will be more effective – and less of what we do will be putting those folks in dangerous places where they might get in trouble.”

David Silverberg is a veteran government and technology journalist and a consulting editor with GovTechWorks.
