New Cyber Standards for IoT Ease – But Won’t Solve – Security Challenge

The first independent cybersecurity standard for the Internet of Things (IoT) was approved earlier this month, following two years of debate and discussion over how to measure and secure such devices.

The American National Standards Institute (ANSI) approved UL 2900-1, General Requirements for Software Cybersecurity for Network-Connectable Products, as a standard on July 5. The effort was spearheaded by Underwriters Laboratories (UL), which is preparing two more standards to follow: UL 2900-2-1, which defines requirements for network-connectable components of healthcare systems, and UL 2900-2-2, which does the same for industrial control systems.

Together, the three standards establish the first testable security requirements for software-controlled IoT devices, including access controls, industrial controls for lighting and mechanical systems, internet-connected medical devices and more. They also offer potential answers to major worries about the lack of security built into such devices thus far.

The Internet of Things promises unparalleled opportunities to track, control and manage everything from lights and security cameras to pacemakers and medication delivery systems. But concerns about security, driven by real-world events in which unsecured IoT devices were co-opted in coordinated botnet attacks, have raised anxiety levels about the risks posed by connecting so many devices to the internet.

Those concerns have prompted government leaders from the Pentagon to Congress to call on industry to embrace security standards as a mark of quality and establish voluntary independent testing programs to assure customers that products are safe. The underlying warning: Either industry figures out how to police itself or government regulators will step in to fill the void.

Whether that is enough to inspire more companies to step up to the standards challenge remains unclear. “The market – that is, individual, corporate and government customers – has yet to put a price on IoT security in the same way that other markets have to determine the relative value of energy-efficient appliances or crash-worthy automobiles,” said Chris Turner, solutions architect with systems integrator General Dynamics Information Technology. “The market would benefit from standards. They’d help vendors back up product claims and integrators speed up adoption and implementation, which in turn would increase security and probably drive down prices, as well.”

Steven Walker
Acting director of DARPA

A standards regimen could change that equation, suggests Steven Walker, acting director of the Defense Advanced Research Projects Agency (DARPA).

“What if customers were made aware of unsecure products and the companies that made them?” he asked at the AFCEA Defensive Cyber Symposium in June. “I’m pretty sure customers would buy the more secure products.”

As recently as Oct. 21, 2016, the Mirai botnet attack crippled domain name service provider Dyn via an international network of compromised security cameras that launched an onslaught of bogus data requests on Dyn servers, peaking at about 1.2 terabits per second. The attack brought down many of the most popular sites on the Internet.

Kevin Fu, director of the Archimedes Center for Medical Device Security and the Security and Privacy Research Group at the University of Michigan and the co-founder and chief scientist at Virta Labs, a startup medical device security firm, told the House Energy and Commerce Committee that the underlying problem is one of market failure.

“We are in this sorry and deteriorating state because there is almost no cost to a manufacturer for deploying [IoT] products with poor security to consumers,” he said at a November hearing. “Has a consensus body or federal agency issued a meaningful IoT security standard? Not yet. Is there a national testing lab to verify and assess the pre-market security of IoT devices? No. Is there a tangible cost to any company that puts an insecure IoT device into the market? I don’t think so.”

Could UL 2900 answer that need? Though Fu isn’t quite ready to endorse it, he did suggest the concept is sound.

“We know from the mathematician Gödel that it’s impossible to have both a sound and complete set of standards for any non-trivial problem,” Fu told GovTechWorks. “However, standards are important to improve security and simplify the problem to make it more tractable. No approach will completely solve security, but standards, sound engineering principles and experience gained through failure are necessary ingredients for reasonable defense.”

Developing the Standard
UL 2900 provides guidelines for how to evaluate and test connected products, including a standard approach to software analysis, efforts to root out embedded malware and process and control requirements for establishing IoT security risk controls in the architecture, design and long-term risk management of the product.

Rather than focus on hardware devices first, UL focused on software after initial conversations with the Department of Homeland Security (DHS), said Ken Modeste, leader of cybersecurity services at UL. “One of DHS’s biggest challenges was their software supply chain,” he said. DHS was concerned about commercial software products running on computer systems, as well as industrial control software running the agency’s operational technology, such as air conditioning, lighting and building or campus security systems.

Examining the problem, UL officials found clear similarities between the systems and sensors used in factory automation, enterprise building automation and security technology. “The majority of these cyber concerns – 90 percent – were in software,” Modeste told GovTechWorks. “So we realized, if we can create a standard for software, we can apply that to many, many products.”

UL invited representatives from industry, government and academia to participate in developing the standard. “We started looking at industry standards that make software better,” Modeste said. “A firmware file has a multitude of components. How can those be broken down and understood? How can they be protected?”
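Breaking a firmware image into components and checking each one is, at bottom, an inventory-and-verification exercise. The sketch below is a minimal illustration of one small piece of that idea, hashing each unpacked component and comparing it against a set of known-bad signatures; the directory layout and the hash set are hypothetical, and this is not UL's actual test procedure.

```python
import hashlib
from pathlib import Path

# Hypothetical set of SHA-256 digests for known-malicious components.
KNOWN_BAD_HASHES = {
    "0" * 64,  # placeholder digest; a real check would use threat-intel feeds
}

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_firmware_components(unpacked_dir: str) -> list:
    """Hash every file extracted from a firmware image and flag any
    component whose digest matches a known-bad signature."""
    flagged = []
    for component in Path(unpacked_dir).rglob("*"):
        if component.is_file() and sha256_of(component) in KNOWN_BAD_HASHES:
            flagged.append(str(component))
    return flagged

if __name__ == "__main__":
    hits = scan_firmware_components("firmware_unpacked")  # hypothetical directory
    print(f"{len(hits)} known-bad component(s) found")
```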

Participants studied every imaginable attack vector that threat actors could use to compromise a product, and then incorporated each into the testing process. Recognizing that new threats and vulnerabilities arise all the time, the testing process was designed to be fluid, incorporating follow-up testing after initial approval.

At first, industry was slow to respond. “I thought we’d have more support early on,” Modeste said. “But there was an initial reluctance. It took a while for us to engage and get them to see the advantages.”

Now it seems interest is on the rise. Among the first movers with the standard: Electric Imp, an IoT software firm based in Los Altos, Calif., and Cambridge, U.K., which provides a cloud-based industrial IoT platform for fully integrating hardware, operating system, APIs, cloud services and security in a single flexible, scalable package. The Electric Imp platform is the first IoT platform to be independently certified to UL 2900-2-2.

Hugo Fiennes, co-founder and CEO at Electric Imp and former leader of Apple’s iPhone hardware development efforts (generations one through four), credits UL’s approach.

“For security, UL has come at it at the right angle, because they’re not prescriptive,” Fiennes told GovTechWorks. “There are many ways to get security, depending on the application’s demands, latency requirements, data throughput requirements and everything like that. [But] the big problem has been that there has been no stake in the ground so far, nothing that says, ‘this is a reasonable level of security that shows a reasonable level of due diligence has been performed by the vendor.’”

What UL did was to study the problems of industrial control systems, look at the art of the possible, and then codify that in a standard established by a recognizable, independent third-party organization.

“It can’t be overstated how important that is,” Fiennes said. UL derives its trust from the fact that it is independent of other market players and forces.

Although UL 2900 “is not the be all and end all last word on cybersecurity for IoT,” Fiennes said, “it provides a good initial step for vendors.”

“They haven’t said this is one standard forever, because that’s not how security works,” he said. “They’ve said IoT security is a moving target, here is the current standard. We will test to it, we’ll give you a certificate and then you will retest and maintain compliance after.” The certification lasts a year, after which new and emerging threats must be considered in addition to those tested previously.

“This doesn’t absolve the people selling security products, platforms and security stacks from due diligence,” Fiennes warned. Firms must be vigilant and remain ready and able to react quickly to threats. “But it’s better than nothing. And we were in a state before where there was nothing.”  He noted that his product’s UL certification expires after a year, at which point some requirements are likely to change and the certification will have to be renewed.

Still, for customers seeking proof that a product has met a minimum baseline, this is the only option short of devoting extensive in-house resources to thoroughly test products on their own. Few have such resources.

“Auto makers and other large-scale manufacturers can afford that kind of testing because they can spread the cost out across unit sales in the hundreds of thousands,” says GDIT’s Turner. “But for government integration projects, individually testing every possible IoT product is cost-prohibitive. It’s just not practical. So reputable third-party testing could really help speed up adoption of these new technologies and the benefits they bring.”

Standards have value because they provide a baseline measure of confidence.

For Electric Imp, being able to tell customers that UL examined its source code, ran static analysis, performed fuzz testing and penetration testing and examined all of its quality and design controls, has made a difference.
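Fuzz testing, one of the techniques in that list, simply means throwing malformed or random input at a program's parsing code and watching for anything other than a clean rejection. The sketch below illustrates the idea with a hypothetical command parser; it is not Electric Imp's code or UL's actual test harness.

```python
import json
import random

def parse_command(raw: bytes) -> dict:
    """Hypothetical stand-in for an IoT device's command parser.
    (It naively assumes the payload is a JSON object, an assumption
    fuzzing will quickly expose as fragile.)"""
    msg = json.loads(raw.decode("utf-8"))
    return {"action": msg["action"], "level": int(msg.get("level", 0))}

def random_payload(max_len: int = 64) -> bytes:
    """Produce a random byte string to use as fuzz input."""
    return bytes(random.getrandbits(8) for _ in range(random.randint(0, max_len)))

def fuzz(iterations: int = 10_000) -> None:
    """Feed random inputs to the parser. A clean rejection (ValueError,
    KeyError, UnicodeDecodeError) is expected; anything else is a finding."""
    findings = 0
    for _ in range(iterations):
        data = random_payload()
        try:
            parse_command(data)
        except (ValueError, KeyError, UnicodeDecodeError):
            pass  # well-behaved rejection of bad input
        except Exception as exc:
            findings += 1
            print(f"Unexpected {type(exc).__name__} on input {data!r}")
    # Purely random input rarely survives the JSON decoder; real fuzzers
    # mutate known-good samples to reach deeper logic.
    print(f"{findings} unexpected failure(s) in {iterations} runs")

if __name__ == "__main__":
    fuzz()
```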

For UL and Modeste, the recognition that no single standard can solve the IoT security problem proved something of an “aha moment.”

“Within cybersecurity, you have to recognize you can’t do everything at once,” he said. “You need a foundation, and then you can go in and take it step-by-step. Nothing anyone comes up with in one step will make you 100 percent cyber secure. It might take 10 years to come up with something perfect and then soon after, it will be obsolete. So it’s better to go in steps,” Modeste added. “That will make us increasingly secure over time.”

Secure, High-Quality Software Doesn’t Happen By Accident

Time, cost and security are all critical factors in developing new software. Late delivery can undermine the mission; rising costs can jeopardize programs; security breaches and system failures can disrupt entire institutions. Yet systematically reviewing system software for quality and security is far from routine.

“People get in a rush to get things built,” says Bill Curtis, founding executive director of the Consortium for IT Software Quality (CISQ), where he leads the development of automatable standards that measure software size and quality. “They’re either given schedules they can’t meet or the business is running around saying … ‘The cost of the damage on an outage or a breach is less than what we’ll lose if we don’t get this thing out to market.’”

In the government context, pressures can arise from politics and public attention, as well as contract and schedule.

It shouldn’t take “a nine-digit defect – a defect that goes over 100 million bucks – to change attitudes,” Curtis says. But sometimes that’s what it takes.

Software defects and vulnerabilities come in many forms. The Common Weakness Enumeration (CWE) lists more than 700 types of security weaknesses organized into categories such as “Insecure Interaction Between Components” or “Risky Resource Management.” CWE’s list draws on contributions from participants ranging from Apple and IBM to the National Security Agency and the National Institute of Standards and Technology.

By defining these weaknesses, CWE – and its sponsor, the Department of Homeland Security’s Office of Cybersecurity and Communications – seek to raise awareness about bad software practices by:

  • Defining common language for describing software security weaknesses in architecture, design, or code
  • Developing a standard measuring stick for software security tools targeting such weaknesses
  • Providing a common baseline for identifying, mitigating and preventing weaknesses

Software weaknesses can include inappropriate linkages, defunct code that remains in place (at the risk of being accidentally reactivated later) and avoidable flaws – known vulnerabilities that nonetheless find their way into source code.

“We’ve known about SQL injections [as a security flaw] since the 1990s,” Curtis says. “So why are we still seeing them? It’s because people are in a rush. They don’t know. They weren’t trained.”
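Curtis’s example is easy to make concrete. The sketch below, using Python’s built-in sqlite3 module and a hypothetical users table, shows the decades-old mistake of splicing user input into a query string alongside the parameterized query that avoids it.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0), ('bob', 1)")

user_input = "alice' OR '1'='1"  # attacker-controlled value

# Vulnerable: splicing input into the SQL text lets the OR '1'='1'
# clause match every row in the table.
leaked = conn.execute(
    f"SELECT name FROM users WHERE name = '{user_input}'"
).fetchall()

# Safe: a parameterized query treats the input purely as data.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()

print("string-built query returned:", leaked)  # both rows leak
print("parameterized query returned:", safe)   # no rows match
```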

Educated Approach
Whether students today get enough rigor and process drilled into them while they’re learning computer languages and logic is open to debate. Curtis, for example, favors a more rigorous engineering approach, worrying that too many self-taught programmers lack critical underlying skills. Indeed, a 2016 survey of 56,033 developers conducted by Stack Overflow, a global online programmer community, found 13 percent claimed they were entirely self-taught. Even among the 62.5 percent who had studied computer science and earned a bachelor’s or master’s degree, the majority also said some portion of their training was self-taught. The result is that some underlying elements of structure, discipline or understanding can be lost, increasing the risk of problems.

Having consistent, reliable processes and tools for examining and ensuring software quality could make a big difference.

Automated tools that identify weak or risky architecture and code can help overcome that, says Curtis, a 38-year veteran of software engineering and development in industry and academia. Through a combination of static and dynamic reviews, developers can obtain an overall quality score for their code, along with alerts about potential system weaknesses and vulnerabilities. The lower the score, the riskier the software.

CISQ is not a panacea, but it can screen for 22 of the 25 Most Dangerous Software Errors as defined by CWE and the SANS Institute, identifying both code-level and architectural-level errors.

By examining system architecture, Curtis says, CISQ delivers a comprehensive review. “We’ve got to be able to do system-level analysis,” Curtis says. “It’s not enough just to find code-level bugs or code-unit-level bugs. We’ve got to find the architectural issues, where somebody comes in through the user interface and slips all the way around the data access or authentication routines. And to do that you have to be able to analyze the overall stack.”

Building on ISO/IEC 25010, an international standard for stating and evaluating software quality requirements, CISQ establishes a process for measuring software quality against four sets of characteristics: security, reliability, performance efficiency and maintainability. These are “nonfunctional requirements,” in that they are peripheral to the actual mission of any given system, yet they are also the source of many of the most damaging security breaches and system failures.

Consider, for example, a 2012 failed software update to servers belonging to Knight Capital Group, a Jersey City, N.J., financial services firm. The update was supposed to replace old code that had remained in the system – unused – for eight years. The new code, which updated and repurposed a “flag” from the old code, was tested and proven to work correctly and reliably. Then the trouble started.

According to a Securities and Exchange Commission filing, a Knight technician copied the new code to only seven of the eight required servers. No one realized the old code had not been removed from the eighth server nor that the new code had not been added. While the seven updated servers operated correctly, the repurposed flag caused the eighth server to trigger outdated and defective software. The defective code triggered millions of “buy” orders totaling 397 million shares in just 45 minutes. Total loss as a result: $460 million.

“A disciplined software configuration management approach would have stopped that failed deployment on two fronts,” said Andy Ma, senior software architect with General Dynamics Information Technology. “Disciplined configuration management means making sure dead code isn’t waiting in hiding to be turned on by surprise, and that strong control mechanisms are in place to ensure that updates are applied to all servers, not just some. That kind of discipline has to be instilled throughout the IT organization. It’s got to be part of the culture.”

Indeed, had the dead code been deleted, the entire episode would never have happened, Curtis says. Yet it is still common to find dead code hidden in system software. Indeed, as systems grow in complexity, such events could become more frequent. Large systems today utilize three to six computer languages and have constant interaction between different system components.

“We’re past the point where a single person can understand these large complex systems,” he says. “Even a team cannot understand the whole thing.”

As with other challenges where large data sets are beyond human comprehension, automation promises better performance than humans can muster. “Automating the deployment process would have avoided the problem Knight had – if they had configured their tools to update all eight servers,” said GDIT’s Ma. “Automated tools also can perform increasingly sophisticated code analysis to detect flaws. But they’re only as good as the people who use them. You have to spend the time and effort to set them up correctly.”
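The discipline Ma describes can be boiled down to a simple pre-activation check: no release goes live until every server in the pool reports the expected build. The sketch below illustrates that invariant with simulated fleet data; the function and version names are hypothetical, not Knight’s or any particular deployment tool’s.

```python
# Simulated fleet state: the eighth server never received the new build.
DEPLOYED = {f"server-{i}": "v2.0.0" for i in range(1, 8)}
DEPLOYED["server-8"] = "v1.0.0"  # stale code still in place

def get_deployed_version(server: str) -> str:
    """Stand-in for querying a server's deployed build ID; a real tool
    might read a release manifest or call a health-check endpoint."""
    return DEPLOYED[server]

def verify_rollout(servers, expected_version: str) -> bool:
    """Return True only if every server reports the expected build,
    the invariant a seven-of-eight deployment violates."""
    mismatched = {}
    for server in servers:
        version = get_deployed_version(server)
        if version != expected_version:
            mismatched[server] = version
    for server, version in sorted(mismatched.items()):
        print(f"{server}: expected {expected_version}, found {version}")
    return not mismatched

if __name__ == "__main__":
    ok = verify_rollout(sorted(DEPLOYED), "v2.0.0")
    print("safe to enable the new code path:", ok)
```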

Contracts and Requirements
For acquisition professionals, such tools could be valuable in measuring quality performance. Contracts can be written to incorporate such measures, with contractors reporting on quality reviews on an ongoing basis. Indeed, the process lends itself to agile development, says Curtis, who recommends using the tools at least once every sprint. That way, risks are flagged and can be fixed immediately. “Some folks do it every week,” he says.

J. Brian Hall, principal director for Developmental Test and Evaluation in the Office of the Secretary of Defense, said at a conference in March that the concept of adding a security quality review early in the development process is still a relatively new idea. But Pentagon operational test and evaluation officials have determined systems to be unsurvivable in the past – specifically because of cyber vulnerabilities discovered during operational testing. So establishing routine testing earlier in the process is essential.

The Joint Staff updated systems survivability performance parameters earlier this year and now include a cybersecurity component, Hall said in March. “This constitutes the first real cybersecurity requirements for major defense programs,” he explained. “Those requirements ultimately need to translate into contract specifications so cybersecurity can be engineered in from program inception.”

Building cyber into the requirements process is important because requirements drive funding, Hall said. If testing for cybersecurity is to be funded, it must be reflected in requirements.

The Defense Department will update its current guidance on cyber testing in the development, test and evaluation environment by year’s end, he said.

All this follows the November 2016 publication of NIST Special Publication 800-160, the systems security engineering standard that is “the playbook for how to integrate security into the systems engineering process,” according to one of its principal authors, Ron Ross, a senior fellow at NIST. That standard covers all aspects of systems development, requirements and life-cycle management.

IoT Security Risks Begin With Supply Chains

The explosion of network-enabled devices embodied in the Internet of Things (IoT) promises amazing advances in convenience, efficiency and even security. But every promising new device generates new seams and potential opportunities for hackers to worm their way into networks and exploit network weaknesses.

Figuring out which IoT devices are safe – and which aren’t – and how to safely leverage the promise of that technology will require looking beyond traditional supply chain and organizational boundaries and developing new ways to approve, monitor and review products that until recently weren’t even on the radar of information security officials.

Dean Souleles
Chief technology officer at the National Counterintelligence & Security Center

Conventional product definitions have fundamentally changed, said Dean Souleles, chief technology officer at the National Counterintelligence & Security Center, part of the Office of the Director of National Intelligence. To illustrate his point, he held up a light bulb during the recent Institute for Critical Infrastructure Technology Forum, noting that looks can be deceiving.

“This is not a light bulb,” he said. “It does produce light – it has an LED in it. But what controls the LED is a microcircuit at the base.” That microcircuit is controlled by software code that can be accessed and executed, via a local WiFi network, by another device. In the wrong hands – and without the proper controls – that light bulb becomes a medium through which bad actors can access and exploit a network and any system or device connected to it.

“When my light bulb may be listening to me,” Souleles said, “we have a problem.”
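Souleles’ point is that the bulb is simply another networked computer that will act on whatever commands reach it. The sketch below is a generic, hypothetical illustration of that kind of local control traffic; the address, port and JSON payload are invented rather than any real product’s API, and the security question is whether the device authenticates the sender at all.

```python
import json
import socket

# Hypothetical address of a WiFi-connected bulb on the local network.
BULB_ADDR = ("192.168.1.42", 55443)

def send_bulb_command(method: str, params: list) -> None:
    """Send a JSON command to the bulb over a plain TCP socket. If the
    firmware accepts this without authenticating the sender, anything
    on the same network segment can control or probe the device."""
    message = json.dumps({"id": 1, "method": method, "params": params}) + "\r\n"
    with socket.create_connection(BULB_ADDR, timeout=2) as sock:
        sock.sendall(message.encode("utf-8"))
        reply = sock.recv(1024)
        print(reply.decode("utf-8", errors="replace"))

if __name__ == "__main__":
    send_bulb_command("set_power", ["off"])  # hypothetical command name
```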

What’s On Your Network?
Indeed, the whole government has a problem. Asset management software deployed across 70 federal agencies under the Department of Homeland Security’s Continuous Diagnostics and Mitigation program has uncovered the surprising extent of unknown software and systems connected to government networks: At least 44 percent more assets are on agency networks than were previously known, according to CDM program documents, and in some agencies, that number exceeded 200 percent.
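At its core, the CDM finding is a set-difference problem: compare what the official inventory says should be on the network with what discovery scans actually observe. A minimal sketch with invented host names follows.

```python
# Hypothetical data: what the official asset inventory lists versus
# what a network discovery scan actually observed.
inventoried = {"mail-01", "web-01", "web-02", "db-01"}
discovered = {"mail-01", "web-01", "web-02", "db-01",
              "camera-07", "printer-3f", "hvac-controller"}

unknown = discovered - inventoried   # on the network but not in inventory
missing = inventoried - discovered   # in inventory but not responding

surplus_pct = 100 * len(unknown) / len(inventoried)
print("Unknown assets:", sorted(unknown))
print("Unreachable inventoried assets:", sorted(missing))
print(f"Assets on the network exceed the inventory by {surplus_pct:.0f}%")
```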

IoT will only make such problems worse, because IT leaders rarely have a comprehensive view of the many products acquired and installed in their buildings and campuses. It’s hard enough to keep track of computers, laptops, printers and phones. Reining in facilities managers who may not be fully aware of cyber concerns or properly equipped to make informed decisions is a whole different story.

“You have to have a different way of thinking about the internet of things and security,” Souleles said. “When you think about infrastructure today, you have to think beyond your servers and devices.”

Securing IoT is now a supply chain risk management issue, greatly expanding the definition of what constitutes the IT supply chain. “That risk management has got to [focus on] software risk management,” Souleles said. “You have to begin with the fact that software now includes your light bulbs. It’s a different way of thinking than we have had before. And things are moving so quickly that we really have to stay on top of this.”

The Intelligence Community and the technology companies that support it may be best equipped to define the necessary best practices and procedures for ensuring a safe and secure supply chain. Chris Turner, solutions architect at General Dynamics Information Technology, said the supply chain attack surface is vast, and that the risks among technology products can be significant as well – if left unattended.

Indeed, Jon Boyens, a senior advisor for information security at the National Institute of Standards and Technology (NIST), says as much as 80 percent of cyber breaches originate in the supply chain, citing a 2015 study by the SANS Institute.

Boyens cites two notorious examples of supply chain failures: In the first, supplier-provided keyboard software gave hackers access to the personal data of 600,000 Samsung Galaxy smartphones; in the second, supplier-provided advertising software let attackers snoop on browser traffic on Lenovo computers. Counterfeit products, devices compromised in transit and component-level vulnerabilities are other supply chain risks that can lead to devastating consequences.

Maintaining sufficient controls to minimize risk and maximize transparency requires close relationships with vendors, clear understanding of the risks involved and strict adherence to procedure. Organizations should be able to identify their lower-tier sub-contractors as well as the extent to which their suppliers retain access to internal systems and technology.

For companies that routinely support highly classified programs, this kind of diligence is routine. But many firms are not sufficiently experienced to be so well equipped, says GDIT’s Turner, whose company treats supply chain risk management as a core competency.

“How do you protect the supply chain when everything comes from overseas?” Turner asks rhetorically. “You can’t know everything. But you can minimize risk. Experienced government contractors know how to do this: We know how to watch every single component.”

That’s not just hyperbole. For the most classified military systems, source materials may be tracked all the way back to where ore was mined from the Earth. Technology components must be understood in all their infinite detail, including subcomponents, source code and embedded firmware. Minimizing the number of suppliers involved and the instances in which products change hands is one way to minimize risks, he said.

Certified Cyber Safe
Making it easier to secure that supply chain and the software that drives IoT-connected devices is what’s behind a two-year-old standards effort at Underwriters Laboratories (UL), the independent safety and testing organization. UL has worked with the American National Standards Institute (ANSI) and the Standards Council of Canada (SCC) to develop a series of security standards that can be applied to IoT devices from lights, sensors and medical devices to access and industrial controls.

The first of these standards will be published in July 2017 and a few products have already been tested against draft versions of the initial standard, UL 2900-1, Software Cybersecurity for Network-Connectable Products, according to Ken Modeste, leader of cybersecurity services at UL. The standard covers access controls, authentication, encryption, remote communication and required penetration and malware testing and code analysis.

Now it’s up to users, manufacturers and regulators – the market – to either buy into the UL standard or develop an alternative.

Mike Buchwald, a career attorney in the Department of Justice’s National Security Division, believes the federal government can help drive that process. “As we look to connected devices, the government can have a lot of say in how those devices should be secured,” he said at the ICIT Forum. As one of the world’s biggest consumers, he argues, the government should leverage the power of its purse “to change the market place to get people to think about security.”

Whether the government has that kind of market power – or needs to add legislative or regulatory muscle to the process – is still unclear. The United States may be the world’s single largest buyer of nuclear-powered submarines or aircraft carriers, but its consumption of commercial technology is small compared to global markets, especially when considering the global scale of IoT connections.

Steven Walker, acting director of the Defense Advanced Research Projects Agency (DARPA), believes the government’s role could be to encourage industry.

“What if a company that produces a software product receives something equivalent to a Good Housekeeping Seal of Approval for producing a secure product?” he said in June at the AFCEA Defensive Cyber Operations Conference in Baltimore. “What if customers were made aware of unsecure products and the companies that made them? I’m pretty sure customers would buy the more secure products – in today’s world especially.”

How the government might get involved is unclear, but there are already proven models in place in which federal agencies took an active role in encouraging industry standards for measurement and performance of consumer products.

“Philosophically, I’m opposed to government overregulating any industry,” Walker said. “In my view, overregulation stifles innovation – and invention. But the government does impose some regulation to keep Americans safe: Think of crash tests for automobiles. So should the government think about the equivalent of a crash test for cyber before a product – software or hardware – is put out on the Internet? I don’t know. I’m just asking the question.”

Among those asking the same question is Rep. Jim Langevin (D-R.I.), an early and outspoken proponent of cybersecurity legislation. “We need to ensure we approach the security of the Internet of Things with the techniques that have been successful with the smart phone and desktop computers: The policies of automatic patching, authentication and encryption that have worked in those domains need to be extended to all devices that are connected to the Internet,” he told the ICIT Forum. “I believe the government can act as a convener to work with private industry in this space.”

What might that look like? “Standard labeling for connected devices: Something akin to a nutritional label, if you will, for IoT,” he said.

Langevin agrees that “the pull of federal procurement dollars” can be an incentive in some cases to get the private sector to buy in to that approach.

But the key to rapid advancement in this area will be getting the public and private sectors to work together and to accept that security is not the sole purview of a manufacturer, a customer or someone in IT. It is the responsibility of everyone involved in the process, from product design and manufacture through software development, supply chain management and long-term system maintenance.

As IoT expands the overall attack surface, it’s up to everyone to manage the risks.

“Pure technological solutions will never achieve impenetrable security,” says Langevin. “It’s just not possible. And pure policy solutions can never keep up with technology.”

Why Modernization Is Key to National Cyber Strategy

President Trump’s cybersecurity strategy hinges on modernizing legacy computer systems that sap resources and hold back agencies from updating security policies. The approach views cloud-based services not only as more flexible and less costly, but also as inherently more secure.

Jeanette Manfra
National Protection and Programs Directorate,
Department of Homeland Security

“It’s not always a fact that IT modernization and cybersecurity have to go hand in hand,” said Jeanette Manfra, acting deputy undersecretary for Cybersecurity and Communications for the National Protection and Programs Directorate in the Department of Homeland Security (DHS). “But that is very important and something this administration recognized immediately: that the lack of modernization is itself a big vulnerability for the government.”

Combining hundreds of federal networks into one or more virtualized networks is one part of that strategy. But just as essential is replacing aging infrastructure that’s both expensive to operate and difficult to protect. Manfra said the government must graduate from today’s perimeter-based security approach, which concentrates on protecting the network itself, to a data-centric approach designed to protect the information assets residing on it.

That’s hardly a new concept to security experts, but it reflects a massive cultural shift for government. Retired Air Force Maj. Gen. Dale Meyerrose, a former chief information officer for the director of national intelligence and now an information security professor and consultant, said protecting the network is a lost cause; it’s best to assume the enemy is already inside your network.

“Every industry sector has a time lag between the infiltration of the evil-doers into your enterprise and their discovery,” Meyerrose said in March at the Cyber Resilience Summit in Reston, Va. The video game industry is the most skilled at rooting out infiltrators, he said, finding intruders in less than a week, on average. The worst? “The United States Government,” he continued. “The average time between infiltration and discovery is almost two years.”

The security paradigm is broken because the focus is in the wrong place, Meyerrose said. “The evil-doers don’t want your network. They want the stuff that’s in your network.”

By definition, cloud-based services blur the lines of conventional perimeter security, forcing CIOs and chief information security officers (CISOs) to focus on securing organizational data, as opposed to protecting the system itself. Legacy systems weren’t necessarily built with the Internet, remote access and external links in mind. Cloud, on the other hand, exists solely because of that global connectivity.

This is the opportunity at hand, DHS’ Manfra said at the Institute for Critical Infrastructure Forum June 7: “The promise of modernization is that it also allows us to modernize our security processes.”

Michael Hermus, DHS chief technology officer, agrees. “Software-defined infrastructure really helps us in this modernization journey towards a better security posture, towards being able to adapt to changing needs,” he said. Legacy systems lack the flexibility to respond to rapidly changing threats or situations, but virtualized networks and architectures are infinitely – and almost instantly – reconfigurable. “If the infrastructure is flexible enough to meet those changing needs, you are going to be in a much better security posture.”

Jim Routh
Chief Security Officer, Aetna

Legacy IT problems aren’t unique to government. Large institutions such as banks, power companies, airlines and insurers also operate what some call “Frankenstein networks,” amalgamations of legacy systems, often with modern frontends, that make change challenging. Take insurance giant Aetna, for example, where Jim Routh is the chief security officer. Although the private sector has different financial incentives and opportunities, many of the issues are similar.

“The reality is that fragile systems are the most expensive to maintain,” Routh said. “And often fragile systems aren’t even core business systems, because they don’t get a lot of attention. So a new capability that actually has security designed into it is actually much more cost-effective from an economic standpoint than keeping these legacy systems around.”

His recommendation: “Take legacy systems that are the most expensive to operate and divide it into two categories: the ones that get a lot of attention and support core business needs, and those that aren’t part of the core business. Then take the ones that aren’t part of the core business and decommission them.”

Hermus said DHS applies a decision framework across the organization to make a similar evaluation. “We’re creating a framework that can identify systems that need to be updated or modernized” based on existing best practices, such as Gartner’s TIME model, which stands for Tolerate, Invest, Migrate and Eliminate. “Having a consistent framework for evaluating your assets, that’s something we all need to consider, particularly at the large enterprise level.”
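A framework like TIME is essentially a two-axis sort, with business value on one side and technical health or cost on the other. The sketch below is a hypothetical illustration of how such a triage might be encoded, using one common reading of the quadrants; it is not DHS’s actual tool.

```python
from dataclasses import dataclass

@dataclass
class System:
    name: str
    business_value: int    # 1 (low) to 5 (high), hypothetical scoring
    technical_health: int  # 1 (fragile, costly) to 5 (modern, cheap to run)

def time_category(s: System) -> str:
    """Map a system onto the TIME buckets under one common reading
    of the quadrants."""
    if s.business_value >= 3 and s.technical_health >= 3:
        return "Invest"     # valuable and sound: keep improving it
    if s.business_value >= 3:
        return "Migrate"    # valuable but fragile or costly: modernize
    if s.technical_health >= 3:
        return "Tolerate"   # sound but low value: leave it alone for now
    return "Eliminate"      # low value and costly: decommission

portfolio = [
    System("benefits-portal", 5, 2),
    System("legacy-timecard", 1, 1),
    System("case-management", 4, 4),
]
for s in portfolio:
    print(f"{s.name}: {time_category(s)}")
```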

Christopher Wlaschin, CISO at the Department of Health and Human Services (HHS), says his agency also applies an organizational framework to prioritize modernization needs. The agency spent $12 billion on IT last year across 11 divisions that include the Food and Drug Administration, the Centers for Medicare and Medicaid Services and the National Institutes of Health, among others.

“If the definition of a Frankenstein network is mismatched parts cobbled together over a central nervous system, I think, yeah, HHS has that,” he said. The 11 operating divisions all have unique threat and risk profiles and substantially different users and needs. The objective is to move toward shared services wherever that makes sense and, where it doesn’t, to replace proprietary systems with cloud solutions.

“The challenge is the immensity of the problem,” said Mark Sikorski, vice president of Homeland Security Solutions at systems integrator General Dynamics Information Technology. “Everything is interconnected. Modernizing one system affects all the others. Determining where your greatest vulnerabilities are – and then prioritizing them – is just the first step. The next is to carefully assess each decision so you understand the downstream impact on the rest of the enterprise ecosystem.”

Get that wrong and the complexity can quickly snowball.

“You have to decide if you want to take the ‘Big Bang’ approach and modernize everything at once, or go very slowly, carefully untangling the knot of legacy systems without disrupting anything,” Sikorski added. “If you do that, you’ll pay a premium for maintaining an increasingly outdated environment. Like the old TV commercial says, you can pay now, or pay later. But you can’t avoid paying forever.”

Citizens Expect a Consumer-Quality Customer Experience

Citizens expect government to provide the same quality of customer experience as they receive from the best consumer businesses. The only difference: Unlike the commercial world, in many cases, they don’t have another option to turn to.

“We are influenced by the experiences we have every day,” said Michele Bartram, chief customer service officer at the U.S. Census Bureau at a recent discussion on government service. “So, we think, Google: Go one place, get the answer really quick, in and out. Security: Influenced by online banking, and the convenience of that – I can scan a check at my house and make a deposit. Amazon: I can find everything in one place and have it delivered to my home.”

So when citizens turn to government Web sites or agencies for help, the bar is already set – high. Government employees live in that same world, so they too know what those consumer experiences are like. But translating a consumer customer experience to a government context isn’t easy.

Technology, security, the complexity of government programs, legal language and, of course, established culture can all get in the way.

Take language, for instance. Government has a way of looking in at itself, developing its own terminology and relying on acronyms and jargon that are well understood on the inside but may not be the language of a given customer. When Matthew T. Harmon, director of Web Communications at the Department of Homeland Security, worked with the Department of Technical Communication in Mercer University’s School of Engineering on a website usability study not long ago, students there identified excessive use of unexplained acronyms as a hurdle for target visitors.

Harmon’s advice: “Implement the Plain Language Act.” The 2010 law, formally the Plain Writing Act, aimed “to improve the effectiveness and accountability of Federal agencies to the public by promoting clear government communication that the public can understand and use.” It’s a simple concept, but hard to execute. By eliminating jargon, acronyms and complicated language, the act aims to make government communication more accessible to the public.

The Center for Plain Language, a non-profit dedicated to those principles in government at all levels, measures participating agencies’ performance and scores them annually. In the most recent results, from 2016, DHS received a B for plain language use – up from Ds when the annual report cards began. Overall, improvement has been dramatic: In 2013, of 20 agencies participating, one failed and nine received Ds for plain writing; only the Social Security Administration managed an A grade. But by 2016, among 18 participating agencies, eight scored an A- or better and only one scored as low as a C.

But with only 18 agencies participating, it’s possible some laggards would rather pass than risk getting a low score.

Clear language has other benefits beyond just making it easier for customers to understand. As it turns out, clarity also helps on the Internet. “It’s a by-product, but plain language also helps your search-engine optimization,” Harmon said. That means when customers search the agency’s website, they’re more likely to get the answers they’re looking for.

Measure, measure

Hala Maktabi is director of measurement and performance improvement in the Veterans Experience Office within the Department of Veterans Affairs, which has struggled to improve its ability to respond to veterans’ needs. The agency collapsed hundreds of different websites into a unified vets.gov web service that “aspires to be honest, transparent, respectful and accessible to all visitors.”

VA’s customer experience efforts have extended across the enterprise, looking at form letters and web pages. In one project last year, former Military Times newspaper writers were brought in to help rewrite bureaucratic language into more understandable, veteran-friendly terms.

Maktabi said the agency is now turning a corner, including better ways to query and measure veterans’ responses to their VA experience. “We did not have a system to understand and listen to our customers,” she said. Her enterprise measurement office was established specifically to gather that data and use it to improve. “It really measures what matters to veterans, to catch early signals and strategic insight,” Maktabi said. In the future, VA will be able to be more proactive in anticipating veterans’ needs.

David Meyer, vice president of Military Health and Veterans Affairs with General Dynamics Health Solutions, agrees. “Veteran insights and associated data are critical to delivering responsive services that can continuously adapt to the veterans, families, survivors and caregivers needs,” he said. “Organizations are often rich in data, but find it a challenge to use that data in truly meaningful ways. The key is understanding what data to extract and what action can be taken to actually improve customers’ experiences. Business intelligence tools are available, but are only part of the solution. It really comes down to ensuring the technology is aligned to the mission, with the customer at the forefront of the discussion.”

Empathy is critical

At U.S. Citizenship and Immigration Services (USCIS), listening to customer feedback helped officials optimize the agency’s website for mobile users and provide better service in response to questions. “You’re not doing this just to be more efficient and effective,” said Mariela Melero, associate director of the Customer Service and Public Engagement Directorate at USCIS. “You have to be empathetic.”

USCIS serves a vast and varied constituency, including many who are not native English speakers. Some are already deep into the immigration or citizenship process; others are just trying to get the lay of the land. Both are important customers, she said, and each has different needs. Understanding the different customers means anticipating those different needs.

“This is where personas and customer journey mapping come in,” said Tish Falco, senior director of customer experience at General Dynamics Information Technology. “These two tools help an organization better understand the customer by seeing things from the customer’s perspective and understanding key ‘Moments that Matter’ along the different touch points – both across each business silo interaction and across the organization. Together, personas and journey maps – based on real customer insights – help build empathy and provide focus around key strategic initiatives.”

Falco said it is important to understand and define the different customer personas coming to one’s agency, and then to map these individual customer experience journeys as the customer moves from initial engagement to actually completing their objective. “Understanding the distinct needs of segments within your customer base and dependencies across the journey will help you create a more personalized, engaging and easy-to-do-business-with experience.”

At USCIS, for example, the agency found that applicants in a waiting process often want the reassurance that comes from speaking to a person. And when they check on status, they want answers that reflect their particular circumstances. “They want an answer that’s for someone like them,” Melero said, because they know situations vary depending on many issues, such as country of origin, family status, prior history and other issues. “They tell us: Please personalize this experience for me.”

Personalizing service is something commercial industry has gotten better and better at. Online vendors remember your preferences and serve up content related to the things that interested you in the past.

Government agencies aren’t there yet. But they are making progress – by focusing on data to drive decisions. “I will back up anyone on my team who makes a change based on data – and a little common sense,” said Mark Weber, deputy assistant secretary for public affairs for Human Services at the Department of Health and Human Services. He advocates an agile approach to improvement: Make a change, measure its effect, then see what can be done to improve further.

To draw in more perspectives, he established an Engagement Team that meets regularly to talk about these issues and to build wider understanding across functional lines within the agency. He said he’s constantly inviting new people to join. “This is where we talk about everything,” Weber explained. “It’s not a decision point. But it’s a connecting point. And that’s important, too.”
