
IoT Security Risks Begin With Supply Chains

The explosion of network-enabled devices embodied in the Internet of Things (IoT) promises amazing advances in convenience, efficiency and even security. But every promising new device generates new seams and potential opportunities for hackers to worm their way into networks and exploit network weaknesses.

Figuring out which IoT devices are safe – and which aren’t – and how to safely leverage the promise of that technology will require looking beyond traditional supply chain and organizational boundaries and developing new ways to approve, monitor and review products that until recently weren’t even on the radar of information security officials.

Conventional product definitions have fundamentally changed, said Dean Souleles, chief technology officer at the National Counterintelligence & Security Center, part of the Office of the Director of National Intelligence. To illustrate his point, he held up a light bulb during the recent Institute for Critical Infrastructure Technology Forum, noting that looks can be deceiving.

“This is not a light bulb,” he said. “It does produce light – it has an LED in it. But what controls the LED is a microcircuit at the base.” That microcircuit is controlled by software code that can be accessed and executed, via a local WiFi network, by another device. In the wrong hands – and without the proper controls – that light bulb becomes a medium through which bad actors can access and exploit a network and any system or device connected to it.
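Souleles' point is that the bulb is really a small networked computer. A minimal sketch of what "controlling the LED" looks like in software terms follows; the command format and field names are invented for illustration, not any real bulb's API:

```python
import json

def build_bulb_command(on: bool, brightness: int) -> str:
    """Serialize a state-change command for a hypothetical WiFi bulb.

    In practice a controller on the same network would send this as the
    body of a short HTTP request to the bulb's local address.
    """
    if not 0 <= brightness <= 254:
        raise ValueError("brightness out of range")
    return json.dumps({"on": on, "bri": brightness})

# Anything that can reach the bulb's IP can issue the same command --
# which is why an unsecured bulb is a network endpoint, not a fixture.
cmd = build_bulb_command(True, 200)
print(cmd)
```

The same reachability that makes the bulb convenient is what makes it an attack surface: there is no physical switch mediating who may execute that code.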

“When my light bulb may be listening to me,” Souleles said, “we have a problem.”

What’s On Your Network?
Indeed, the whole government has a problem. Asset management software deployed across 70 federal agencies under the Department of Homeland Security’s Continuous Diagnostics and Mitigation program has uncovered the surprising extent of unknown software and systems connected to government networks: At least 44 percent more assets are on agency networks than were previously known, according to CDM program documents, and in some agencies, that number exceeded 200 percent.

IoT will only make such problems worse, because IT leaders rarely have a comprehensive view of the many products acquired and installed in their buildings and campuses. It’s hard enough to keep track of computers, laptops, printers and phones. Reining in facilities managers who may not be fully aware of cyber concerns or properly equipped to make informed decisions is a whole different story.
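The CDM-style inventory gap is, at bottom, a set-difference problem: compare what a scan actually observes on the network against the official asset list. A toy sketch, with invented addresses:

```python
# Hypothetical example: the official inventory versus what discovery finds.
known = {"10.0.0.2", "10.0.0.3", "10.0.0.4", "10.0.0.5"}
observed = known | {"10.0.0.17", "10.0.0.52"}  # scan turns up two extras

# Devices on the wire that nobody was managing.
unknown = observed - known
gap_pct = 100 * len(unknown) / len(known)
print(f"{len(unknown)} unmanaged assets ({gap_pct:.0f}% more than inventoried)")
```

Scaled up across an agency, that percentage is the "44 percent more assets than previously known" figure the CDM program reported: each unmanaged device is an unpatched, unmonitored entry point.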

“You have to have a different way of thinking about the internet of things and security,” Souleles said. “When you think about infrastructure today, you have to think beyond your servers and devices.”

Securing IoT is now a supply chain risk management issue, greatly expanding the definition of what constitutes the IT supply chain. “That risk management has got to [focus on] software risk management,” Souleles said. “You have to begin with the fact that software now includes your light bulbs. It’s a different way of thinking than we have had before. And things are moving so quickly that we really have to stay on top of this.”

The Intelligence Community and the technology companies that support it may be best equipped to define the necessary best practices and procedures for ensuring a safe and secure supply chain. Chris Turner, solutions architect at General Dynamics Information Technology, said the supply chain attack surface is huge, and that the risks among technology products can be huge, as well – if left unattended.

Indeed, Jon Boyens, a senior advisor for information security at the National Institute of Standards and Technology (NIST), says as much as 80 percent of cyber breaches originate in the supply chain, citing a 2015 study by the SANS Institute.

Boyens cites two notorious examples of supply chain failures: In the first, vulnerable supplier-provided keyboard software exposed the personal data on some 600 million Samsung Galaxy smartphones; in the second, supplier-provided advertising software let attackers snoop on browser traffic on Lenovo computers. Counterfeit products, devices compromised in transit and component-level vulnerabilities are other supply chain risks that can lead to devastating consequences.

Maintaining sufficient controls to minimize risk and maximize transparency requires close relationships with vendors, clear understanding of the risks involved and strict adherence to procedure. Organizations should be able to identify their lower-tier sub-contractors as well as the extent to which their suppliers retain access to internal systems and technology.

For companies that routinely support highly classified programs, such diligence is standard practice. But many firms are not sufficiently experienced to be so well equipped, says GDIT’s Turner, whose company considers supply chain risk management a core competency.

“How do you protect the supply chain when everything comes from overseas?” Turner asks rhetorically. “You can’t know everything. But you can minimize risk. Experienced government contractors know how to do this: We know how to watch every single component.”

That’s not just hyperbole. For the most classified military systems, source materials may be tracked all the way back to where ore was mined from the Earth. Technology components must be understood in exhaustive detail, down to subcomponents, source code and embedded firmware. Minimizing the number of suppliers involved and the instances in which products change hands is one way to minimize risks, he said.

Certified Cyber Safe
Making it easier to secure that supply chain and the software that drives IoT-connected devices is what’s behind a two-year-old standards effort at Underwriters Laboratories (UL), the independent safety and testing organization. UL has worked with the American National Standards Institute (ANSI) and the Standards Council of Canada (SCC) to develop a series of security standards that can be applied to IoT devices from lights, sensors and medical devices to access and industrial controls.

The first of these standards will be published in July 2017 and a few products have already been tested against draft versions of the initial standard, UL 2900-1, Software Cybersecurity for Network-Connectable Products, according to Ken Modeste, leader of cybersecurity services at UL. The standard covers access controls, authentication, encryption, remote communication and required penetration and malware testing and code analysis.

Now it’s up to users, manufacturers and regulators – the market – to either buy into the UL standard or develop an alternative.

Mike Buchwald, a career attorney in the Department of Justice’s National Security Division, believes the federal government can help drive that process. “As we look to connected devices, the government can have a lot of say in how those devices should be secured,” he said at the ICIT Forum. As one of the world’s biggest consumers, he argues, the government should leverage the power of its purse “to change the market place to get people to think about security.”

Whether the government has that kind of market power – or needs to add legislative or regulatory muscle to the process – is still unclear. The United States may be the world’s single largest buyer of nuclear-powered submarines or aircraft carriers, but its consumption of commercial technology is small when compared to global markets, especially so when considering the global scale of IoT connections.

Steven Walker, acting director of the Defense Advanced Research Projects Agency (DARPA), believes the government’s role could be to encourage industry.

“What if a company that produces a software product receives something equivalent to a Good Housekeeping Seal of Approval for producing a secure product?” he said in June at the AFCEA Defensive Cyber Operations Conference in Baltimore. “What if customers were made aware of unsecure products and the companies that made them? I’m pretty sure customers would buy the more secure products – in today’s world especially.”

How the government might get involved is unclear, but there are already proven models in place in which federal agencies took an active role in encouraging industry standards for measurement and performance of consumer products.

“Philosophically, I’m opposed to government overregulating any industry,” Walker said. “In my view, overregulation stifles innovation – and invention. But the government does impose some regulation to keep Americans safe: Think of crash tests for automobiles. So should the government think about the equivalent of a crash test for cyber before a product – software or hardware – is put out on the Internet? I don’t know. I’m just asking the question.”

Among those asking the same question is Rep. Jim Langevin (D-R.I.), an early and outspoken proponent of cybersecurity legislation. “We need to ensure we approach the security of the Internet of Things with the techniques that have been successful with the smart phone and desktop computers: The policies of automatic patching, authentication and encryption that have worked in those domains need to be extended to all devices that are connected to the Internet,” he told the ICIT Forum. “I believe the government can act as a convener to work with private industry in this space.”

What might that look like? “Standard labeling for connected devices: Something akin to a nutritional label, if you will, for IoT,” he said.

Langevin agrees that “the pull of federal procurement dollars” can be an incentive in some cases to get the private sector to buy in to that approach.

But the key to rapid advancement in this area will be getting the public and private sectors to work together and buy into the idea that security is not the sole purview of a manufacturer or a customer or someone in IT, but rather of everyone involved in the entire process, from product design and manufacture through software development, supply chain management and long-term system maintenance.

As IoT expands the overall attack surface, it’s up to everyone to manage the risks.

“Pure technological solutions will never achieve impenetrable security,” says Langevin. “It’s just not possible. And pure policy solutions can never keep up with technology.”

Secure, High-Quality Software Doesn’t Happen By Accident


Time, cost and security are all critical factors in developing new software. Late delivery can undermine the mission; rising costs can jeopardize programs; security breaches and system failures can disrupt entire institutions. Yet systematically reviewing system software for quality and security is far from routine.

“People get in a rush to get things built,” says Bill Curtis, founding executive director of the Consortium for IT Software Quality (CISQ), where he leads the development of automatable standards that measure software size and quality. “They’re either given schedules they can’t meet or the business is running around saying … ‘The cost of the damage on an outage or a breach is less than what we’ll lose if we don’t get this thing out to market.’”

In the government context, pressures can arise from politics and public attention, as well as contract and schedule.

It shouldn’t take “a nine-digit defect – a defect that goes over 100 million bucks – to change attitudes,” Curtis says. But sometimes that’s what it takes.

Software defects and vulnerabilities come in many forms. The Common Weakness Enumeration (CWE) lists more than 700 types of security weaknesses, organized into categories such as “Insecure Interaction Between Components” or “Risky Resource Management.” CWE’s list draws on contributions from participants ranging from Apple and IBM to the National Security Agency and the National Institute of Standards and Technology.

By defining these weaknesses, CWE – and its sponsor, the Department of Homeland Security’s Office of Cybersecurity and Communications – seek to raise awareness about bad software practices by:

  • Defining common language for describing software security weaknesses in architecture, design, or code
  • Developing a standard measuring stick for software security tools targeting such weaknesses
  • Providing a common baseline for identifying, mitigating and preventing weaknesses

Software weaknesses can include inappropriate linkages, defunct code that remains in place (at the risk of being accidentally reactivated later) and avoidable flaws – known vulnerabilities that nonetheless find their way into source code.

“We’ve known about SQL injections [as a security flaw] since the 1990s,” Curtis says. “So why are we still seeing them? It’s because people are in a rush. They don’t know. They weren’t trained.”
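The flaw Curtis is describing is simple enough to show in a few lines. A minimal sketch (the table and input are invented for the example): the first query builds SQL by string concatenation, so attacker-supplied text rewrites the query; the second uses a parameterized placeholder, so the same text is treated strictly as data.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: concatenation lets the input become part of the SQL itself,
# turning the WHERE clause into a condition that is always true.
rows_bad = conn.execute(
    "SELECT * FROM users WHERE name = '" + attacker_input + "'").fetchall()

# Safe: a parameterized query binds the input as a literal value.
rows_ok = conn.execute(
    "SELECT * FROM users WHERE name = ?", (attacker_input,)).fetchall()

print(len(rows_bad), len(rows_ok))  # injection returns a row; the safe query returns none
```

The fix has been standard practice for two decades, which is exactly Curtis' point: the weakness persists through haste and lack of training, not lack of a known remedy.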

Educated Approach
Whether students today get enough rigor and process drilled into them while they’re learning computer languages and logic is open to debate. Curtis, for one, favors a more rigorous engineering approach, worrying that too many self-taught programmers lack critical underlying skills. Indeed, a 2016 survey of 56,033 developers conducted by Stack Overflow, a global online programmer community, found 13 percent claimed to be entirely self-taught. Even among the 62.5 percent who had studied computer science and earned a bachelor’s or master’s degree, the majority also said some portion of their training was self-taught. The result is that some underlying elements of structure, discipline or understanding can be lost, increasing the risk of problems.

Having consistent, reliable processes and tools for examining and ensuring software quality could make a big difference.

Automated software developed to identify weak or risky architecture and code can help overcome that, says Curtis, a 38-year veteran of software engineering and development in industry and academia. Through a combination of static and dynamic reviews, developers can obtain a sense of the overall quality of their code and alerts about potential system weaknesses and vulnerabilities. The lower the score, the riskier the software.

CISQ is not a panacea, but it can screen for 22 of the 25 Most Dangerous Software Errors as defined by CWE and the SANS Institute, identifying both code-level and architectural-level errors.

By examining system architecture, Curtis says, CISQ delivers a comprehensive review. “We’ve got to be able to do system-level analysis,” Curtis says. “It’s not enough just to find code-level bugs or code-unit-level bugs. We’ve got to find the architectural issues, where somebody comes in through the user interface and slips all the way around the data access or authentication routines. And to do that you have to be able to analyze the overall stack.”
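The architectural flaw Curtis describes, a path that "slips all the way around" the authentication layer, can be sketched as a whole-stack check over allowed call edges. The layer names and edges below are invented placeholders, not CISQ's actual analysis:

```python
# Toy system-level check: code-level scanners see individual bugs, but
# only a view of the whole stack catches a call path that bypasses a layer.

# The intended architecture: UI may call auth; auth may call data access.
ALLOWED = {("ui", "auth"), ("auth", "data_access")}

# Call edges actually observed in the codebase -- the third one is the
# kind of shortcut that skips authentication entirely.
observed_calls = [("ui", "auth"), ("auth", "data_access"), ("ui", "data_access")]

violations = [edge for edge in observed_calls if edge not in ALLOWED]
print("architectural violations:", violations)
```

No single code unit in such a system is buggy in isolation; the defect only exists at the level of how the pieces connect, which is why Curtis insists on analyzing the overall stack.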

Building on ISO/IEC 25010, an international standard for stating and evaluating software quality requirements, CISQ establishes a process for measuring software quality against four sets of characteristics: security, reliability, performance efficiency and maintainability. These are “nonfunctional requirements,” in that they are peripheral to the actual mission of any given system, yet they are also the source of many of the most damaging security breaches and system failures.

Consider, for example, a 2012 failed software update to servers belonging to Knight Capital Group, a Jersey City, N.J., financial services firm. The update was supposed to replace old code that had remained in the system – unused – for eight years. The new code, which updated and repurposed a “flag” from the old code, was tested and proven to work correctly and reliably. Then the trouble started.

According to a Securities and Exchange Commission filing, a Knight technician copied the new code to only seven of the eight required servers. No one realized the old code had not been removed from the eighth server, nor that the new code had not been added. While the seven updated servers operated correctly, the repurposed flag caused the eighth server to trigger outdated and defective software. The defective code instantly triggered millions of “buy” orders totaling 397 million shares in just 45 minutes. Total loss as a result: $460 million.

“A disciplined software configuration management approach would have stopped that failed deployment on two fronts,” said Andy Ma, senior software architect with General Dynamics Information Technology. “Disciplined configuration management means making sure dead code isn’t waiting in hiding to be turned on by surprise, and that strong control mechanisms are in place to ensure that updates are applied to all servers, not just some. That kind of discipline has to be instilled throughout the IT organization. It’s got to be part of the culture.”

Indeed, had the dead code been deleted, the entire episode would never have happened, Curtis says. Yet it is still common to find dead code hidden in system software. Indeed, as systems grow in complexity, such events could become more frequent. Large systems today utilize three to six computer languages and have constant interaction between different system components.

“We’re past the point where a single person can understand these large complex systems,” he says. “Even a team cannot understand the whole thing.”

As with other challenges where large data sets are beyond human comprehension, automation promises better performance than humans can muster. “Automating the deployment process would have avoided the problem Knight had – if they had configured their tools to update all eight servers,” said GDIT’s Ma. “Automated tools also can perform increasingly sophisticated code analysis to detect flaws. But they’re only as good as the people who use them. You have to spend the time and effort to set them up correctly.”
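The verification step Ma describes can be sketched in a few lines: after a rollout, compare a content hash of what each server is actually running against the expected artifact, and block the deployment if any host disagrees. Server contents here are simulated stand-ins, not Knight's actual system:

```python
import hashlib

def digest(content: bytes) -> str:
    """Content hash used to verify that a server runs the intended artifact."""
    return hashlib.sha256(content).hexdigest()

new_code = b"v2: repurposed flag handled safely"
old_code = b"v1: dead code keyed to the old flag"

# The Knight Capital pattern: seven servers updated, the eighth missed.
servers = {f"srv{i}": new_code for i in range(1, 8)}
servers["srv8"] = old_code

expected = digest(new_code)
stale = sorted(name for name, code in servers.items() if digest(code) != expected)

if stale:
    print("deployment blocked, stale servers:", stale)
```

A check this simple, run automatically as the last step of every deployment, is the "strong control mechanism" in question: it turns a silent partial rollout into a loud, immediate failure.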

Contracts and Requirements
For acquisition professionals, such tools could be valuable in measuring quality performance. Contracts can be written to incorporate such measures, with contractors reporting on quality reviews on an ongoing basis. Indeed, the process lends itself to agile development, says Curtis, who recommends using the tools at least once every sprint. That way, risks are flagged and can be fixed immediately. “Some folks do it every week,” he says.

J. Brian Hall, principal director, Developmental Test and Evaluation in the Office of the Secretary of Defense, said at a conference in March that the concept of adding a security quality review early in the development process is still a relatively new idea. But Pentagon operational test and evaluation officials have determined systems to be un-survivable in the past – specifically because of cyber vulnerabilities discovered during operational testing. So establishing routine testing earlier in the process is essential.

The Joint Staff updated systems survivability performance parameters earlier this year and now include a cybersecurity component, Hall said in March. “This constitutes the first real cybersecurity requirements for major defense programs,” he explained. “Those requirements ultimately need to translate into contract specifications so cybersecurity can be engineered in from program inception.”

Building cyber into the requirements process is important because requirements drive funding, Hall said. If testing for cybersecurity is to be funded, it must be reflected in requirements.

The Defense Department will update its current guidance on cyber testing in the development, test and evaluation environment by year’s end, he said.

All this follows the November 2016 publication of NIST Special Publication 800-160, which one of its principal authors, NIST senior fellow Ron Ross, calls “the playbook for how to integrate security into the systems engineering process.” That standard covers all aspects of systems development, requirements and life-cycle management.

Securing Health Data Means Going Well Beyond HIPAA


A two-decade-old law designed to protect patients’ privacy may be preventing health care organizations from doing more to protect vulnerable health care data from theft or abuse.

The Health Insurance Portability and Accountability Act (HIPAA) established strict rules for how health data can be stored and shared. But in making health care providers vigilant about privacy protection, HIPAA may inadvertently distract providers from focusing on something just as important: overall information security.

“Unfortunately I think HIPAA has focused healthcare organizations too much on data privacy and not enough on data integrity, data loss, disrupted operations and patient safety. You can get your identity back at some point, but not your life,” warns Denise Anderson, president of the National Health Information Sharing and Analysis Center (NH-ISAC). “Many of the attacks we are seeing, such as WannaCry, are disruptive attacks and are not data theft attacks. Organizations should be driven to focus on enterprise risk management and it should come from the Board and CEO level on down.”

“Cybersecurity in Health Care crosses a wide spectrum of issues,” adds Sallie Sweeney, principal cyber solutions architect in the Health and Civilian Solutions Division of systems integrator General Dynamics Information Technology (GDIT). “It’s not just protecting patient data. It includes protecting their financial data and making sure the medical equipment works the way it’s supposed to, when it’s supposed to, without potential for error. Think about the consequences of a Denial of Service attack aimed at the systems monitoring patient vital signs in the ICU. You have to look at the whole picture.”

Many public health agencies and smaller businesses are under-resourced or under-skilled in cyber defense, leaving them reliant on products and service solutions they may not fully understand themselves.

NH-ISAC members have access to support and services, such as Cyber-Fit, a non-profit set of services ranging from simple information services to benchmarking assessments of organizations’ cyber health and security posture; shared risk assessments; and cyber services, including penetration testing, vulnerability management and incident response.


Maggie Amato, deputy director of security, design and innovation at the Department of Health and Human Services (HHS), believes increased sharing is at least part of the answer.

“We have to build alliances of threat-sharing capabilities,” Amato says. “The speed, ferocity and depth of attack cannot be dealt with by individual agencies alone.”

Indeed, improved information sharing of threats, weakness and mitigation is one of the key recommendations of the June 2017 Health Care Industry Cybersecurity Task Force.

But getting companies to share threat data is a challenge. Built-in financial incentives drive some firms to minimize publicity and the potential risk it might pose to their businesses. But Anderson says she can see progress.

“I think the public and private sector came together well during the WannaCry incident,” Anderson says. Though gaps clearly still exist, the swift response was encouraging.

Anderson’s NH-ISAC could play a key role in improving that response further and narrowing the gaps. NH-ISAC is a non-profit, member-driven organization linking private and public hospitals, providers, health insurance firms, pharmaceutical and biotech manufacturers, laboratories, medical device manufacturers, medical schools and others.

The group is one of 21 non-profit information sharing centers designed to help protect specific industries against cyber threats.

“I think within the NH-ISAC the membership did a phenomenal job of sharing indicators, snort signatures, hashes, mitigation strategies, malware analysis, patching issues and other best practice information. We tried as well to get the information out broadly beyond our membership,” she says. “NH-ISAC is a stellar example of how a community can pull together during an incident to help each other out.”

What HIPAA’s Security Rule Requires

The Office of the National Coordinator for Health Information Technology, which is responsible for overseeing the standards and rules applying to electronic health records, writes in its Guide to Security of Electronic Health Information that the HIPAA Security Rule requires:

  • Administrative actions, policies and procedures to prevent, detect, contain and correct security violations and ensure development, implementation and maintenance of security measures to protect electronic personal health information (ePHI).
  • Physical measures, policies and procedures to protect electronic information systems and related buildings and equipment from natural and environmental hazards and unauthorized intrusion to protect and control access to ePHI.
  • Reasonable and appropriate policies and procedures to comply with government requirements, including requirements for contracting with IT services providers, for maintaining data over time and for periodically reviewing policies and procedures.

The NH-ISAC has a long way to go, however. While health care represents one of the largest sectors, the organization has garnered only about 200 members since its founding in 2010. By contrast, the financial services ISAC has more than 6,000 members.

Anderson joined the health ISAC from the finance sector ISAC in part to help drum up participation.

“One of the greatest challenges for the NH-ISAC and all ISACs is the lack of awareness amongst the critical infrastructure owners and operators – particularly the smaller owners and operators – that the ISACs exist and are a valuable tool,” Anderson told the House Energy and Commerce subcommittee on oversight and investigations in April. “Numerous incidents have shown that effective information sharing amongst robust trusted networks of members works in combatting cyber threats.” She suggests tax breaks for new members might help encourage wider participation.

“Protecting highly sensitive information – whether it’s patient records, financial data or sensitive government information – is something that has to be baked into every information system,” said GDIT’s Sweeney. “Too often, we have a health care IT system where security is an afterthought – and trying to bolt on the kinds of protections we need becomes painful and expensive.” Sweeney, whose background includes securing large-scale health care information databases and systems for government clients, concluded: “Health care systems should be no less secure than financial systems in banks.”

Another new tool for promoting intelligence and threat sharing among health providers is the Healthcare Cybersecurity and Communications Integration Center (HCCIC), launched by HHS in May.

Modeled after the Department of Homeland Security’s National Cybersecurity and Communications Integration Center (NCCIC), the new HCCIC (pronounced “aych-kick”) has been criticized as potentially duplicating the NCCIC and other organizations. But Anderson defends the new center as a valuable community tool for funneling information from the many fragmented parts of HHS into a central healthcare information security clearing house.

She concedes, however, that HCCIC will have to prove itself.

“One potential downside of pulling together HHS components into one floor could be a slowdown of sharing from the private sector as ‘government’ is involved,” she wrote in a written follow-up to questions posed by Rep. Tim Murphy (R-PA). “Another downside could be that even though all of the components are brought together, sharing could still take place in a fragmented, unproductive manner. There could be risk of inadvertent disclosure or risk of post-hoc regulatory penalties for a reported breach. Finally, if efforts are not effectively differentiated from the NCCIC environment, duplication of effort and additional costs for staffing and resources can result.”

HCCIC, in fact, played a key role in the government’s response to May’s WannaCry ransomware attacks. “HCCIC analysts provided early warning of the potential impact of the attack and HHS responded by putting the secretary’s operations center on alert,” testified Leo Scanlon, deputy chief information security officer at HHS, before a House Energy and Commerce subcommittee June 8. “This was the first time that a cyber-attack was the focus of such a mobilization,” he said. HCCIC was able to provide “real-time cyber situation awareness, best practices guidance and coordination” with the NCCIC.

Anderson sees further upside potential. Based on her prior experience with the financial services ISAC, “the HCCIC should be successful if carried out as envisioned and if it is voluntary and non-regulatory in nature,” she told GovTechWorks. “This will result in improved dissemination within the sector. In addition, by bringing all of the components of HHS under one roof, increased situational awareness and cyber security efficiencies will result.”

Wanted: Metrics for Measuring Cyber Performance and Effectiveness


Chief information security officers (CISOs) face a dizzying array of cybersecurity tools to choose from, each loaded with features and promised capabilities that are hard to measure or judge.

That leaves CISOs trying to balance unknown risks against growing costs, without a clear ability to justify the return on their cybersecurity investment. Not surprisingly, today’s high-threat environment makes it preferable to choose safe over sorry – regardless of cost. But is there a better way?

Some cyber insiders believe there is.


Acting U.S. Federal Chief Information Officer (CIO) Margie Graves acknowledges the problem.

“Defining the measure of success is hard sometimes, because it’s hard to measure things that don’t happen,” Graves said. President Trump’s Executive Order on Cybersecurity asks each agency to develop its own risk management plan, she noted. “It should be articulated in that plan how every dollar will be applied to buying down that risk.”

There is a difference though, between a plan and an actual measure. A plan can justify an investment intended to reduce risk. But judgment, rather than hard knowledge, will determine how much risk is mitigated by any given tool.

The Defense Information Systems Agency (DISA) and the National Security Agency (NSA) have been trying to develop a methodology for measuring the actual value of a given cyber tool’s performance. Their NIPRNet/SIPRNet Cyber Security Architecture Review (NSCSAR – pronounced “NASCAR”) is a classified effort to define a framework for measuring cybersecurity performance, said DISA CIO and Risk Management Executive John Hickey.

“We just went through a drill of ‘what are those metrics that are actually going to show us the effectiveness of those tools,’ because a lot of times we make an investment, people want a return on that investment,” he told GovTechWorks in June. “Security is a poor example of what you are going after. It is really the effectiveness of the security tools or compliance capabilities.”

The NSCSAR review, conducted in partnership with NSA and the Defense Department, may point to a future means of measuring cyber defense capability. “It is a framework that actually looks at the kill chain, how the enemy will move through that kill chain and what defenses we have in place,” Hickey said, adding that NSA is working with DISA on an unclassified version of the framework that could be shared with other agencies or the private sector to measure cyber performance.

“It is a methodology,” Hickey explained. “We look at the sensors we have today and measure what functionality they perform against the threat.… We are tracking the effectiveness of the tools and capabilities to get after that threat, and then making our decisions on what priorities to fund.”

Measuring Security
NSS Labs Inc. independently tests the cybersecurity performance of firewalls and other cyber defenses, scoring products’ performance annually. The Austin, Texas, company evaluated 11 next-generation firewall (NGFW) products from 10 vendors in June 2017, comparing their security effectiveness as well as their stability, reliability and total cost of ownership.

In the test, products were presumed to be able to provide basic packet filtering, stateful multi-layer inspection, network address translation, virtual private network capability, application awareness controls, user/group controls, integrated intrusion prevention, reputation services, anti-malware capabilities and SSL inspection. Among the findings:

  • Eight of the 11 products tested scored “above average” in both security performance and cost-effectiveness; three scored below average
  • Overall security effectiveness ranged from a low of 25.8 percent to a high of 99.9 percent; average security effectiveness was 67.3 percent
  • Four products scored below 78.5 percent
  • Total cost of ownership ranged from $5 to $105 per protected megabit per second, with an average of $22
  • Nine products failed to detect at least one evasion attempt; only two detected all evasion attempts
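Results like these can be folded into a rough value comparison, weighing security effectiveness against cost. The sketch below is illustrative only – the product names and figures are hypothetical examples patterned on the ranges above, not actual NSS Labs data:

```python
# Illustrative value ranking from NSS-style firewall test results.
# Product names and figures are hypothetical, not NSS Labs data.
products = [
    # (name, security effectiveness %, TCO in $ per protected Mbps)
    ("Firewall A", 99.9, 22),
    ("Firewall B", 78.5, 5),
    ("Firewall C", 25.8, 105),
]

def value_score(effectiveness, tco):
    """Security effectiveness delivered per dollar of TCO per protected Mbps."""
    return effectiveness / tco

# Rank products by effectiveness-per-dollar, best first.
ranked = sorted(products, key=lambda p: value_score(p[1], p[2]), reverse=True)
for name, eff, tco in ranked:
    print(f"{name}: {eff:.1f}% effective at ${tco}/Mbps -> score {value_score(eff, tco):.2f}")
```

A ranking like this makes one tradeoff visible: the most effective product is not necessarily the best value once cost per protected megabit is factored in.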

NSS conducted similar tests of advanced endpoint protection tools, data center firewalls, and web application firewalls earlier this year.

But point-in-time performance tests don’t provide a reliable measure of ongoing performance. And measuring the effectiveness of a single tool does not necessarily indicate how well it performs as part of a suite of tools, notes Robert J. Carey, vice president within the Global Solutions division at General Dynamics Information Technology (GDIT). The former U.S. Navy CIO and Defense Department principal deputy CIO says that while such tests are valuable, they still leave it hard to quantify and compare the performance of different products in an organization’s security stack.

The evolution and blurring of the lines between cybersecurity tools – firewalls, intrusion detection and prevention, gateways, traffic analysis tools, threat intelligence, anomaly detection and so on – mean it’s easy to add another tool to one’s stack. But as with any multivariate function, it is hard to isolate each tool’s individual contribution to threat protection, or to know which tools you could do without.

“We don’t know what an adequate cyber security stack looks like. What part of the threat does the firewall protect against, the intrusion detection tool, and so on?” Carey says. “We perceive that the tools are part of the solution. But it’s difficult to quantify the benefit. There’s too much marketing fluff about features and not enough facts.”

Mike Spanbauer, vice president of research strategy at NSS, says this is a common concern, especially in large, managed environments — as is the case in many government instances. One way to address it is to replicate the security stack in a test environment and experiment to see how tools perform against a range of known, current threats while under different configurations and settings.

Another solution is to add one more tool to monitor and measure performance. NSS’ Cyber Advanced Warning System (CAWS) provides continuous security validation monitoring by capturing live threats and then injecting them into a test environment mirroring customers’ actual security stacks. New threats are identified and tested non-stop. If they succeed in penetrating the stack, system owners are notified so they can update their policies to stop that threat in the future.

“We harvest the live threats and capture those in a very careful manner and preserve the complete properties,” Spanbauer said. “Then we bring those back into our virtual environment and run them across the [cyber stack] and determine whether it is detected.”
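Conceptually, continuous validation of the kind Spanbauer describes reduces to a loop: capture a live threat, replay it against a mirror of the customer’s security stack, and alert on anything that gets through. The sketch below is a hypothetical illustration of that workflow – the class names and detection logic are invented for clarity and are not the actual CAWS implementation:

```python
# Conceptual sketch of continuous security validation.
# All names and logic here are hypothetical illustrations,
# not the actual NSS CAWS system.
from dataclasses import dataclass

@dataclass
class Threat:
    name: str
    payload: bytes

class MirroredStack:
    """Stand-in for a test environment mirroring a customer's security stack."""
    def __init__(self, blocked_signatures):
        self.blocked = set(blocked_signatures)

    def detects(self, threat: Threat) -> bool:
        # Real stacks inspect traffic; this toy version matches signatures.
        return threat.name in self.blocked

def validate(stack: MirroredStack, threats):
    """Replay captured threats; return those that penetrated the stack."""
    return [t for t in threats if not stack.detects(t)]

stack = MirroredStack(blocked_signatures={"known-exploit-1"})
captured = [Threat("known-exploit-1", b"..."), Threat("novel-evasion", b"...")]
for threat in validate(stack, captured):
    print(f"ALERT: {threat.name} penetrated the stack; update policy")
```

The value of the approach is in the loop running non-stop: every newly harvested threat becomes a fresh test case against the mirrored stack, and only the misses generate work for the system owner.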

Adding more tools and solutions isn’t necessarily what Carey had in mind. While that monitoring may reduce risk, it also adds another expense.

And measuring value in terms of return on investment is a challenge when every new tool adds real cost and results are so difficult to define. In cybersecurity, though managing risk has become the name of the game, actually calculating risk is hard.

The National Institute of Standards and Technology (NIST) created the 800-53 security controls and the cybersecurity risk management framework that encompass today’s best practices. Carey worries that risk management delivers an illusion of security by accepting some level of vulnerability depending on the level of investment. The trouble is that it drives a compliance culture in which security departments focus on following the framework more than on defending the network and securing its applications and data.

“I’m in favor of moving away from risk management,” GDIT’s Carey says. “It’s what we’ve been doing for the past 25 years. It’s produced a lot of spend, but no measurable results. We should move to effects-based cyber. Instead of 60 shades of gray, maybe we should have just five well defined capability bands.”

The ultimate goal: bring compliance into line with security so that doing the former delivers the latter. But the evolving nature of cyber threats suggests that may never be possible.

Automated tools will only be as good as the data and intelligence built into them. True, automation improves speed and efficiency, Carey says. “But it doesn’t necessarily make me better.”

System owners should be able to look at their cyber stack and determine exactly how much better security performance would be if they added another tool or upgraded an existing one. If that were the case, they could spend most of their time focused on stopping the most dangerous threats – zero-day vulnerabilities that no tool can identify because they’ve never been seen before – rather than ensuring all processes and controls are in place to minimize risk in the event of a breach.

Point-in-time measures based on known vulnerabilities and available threats help, but may be blind to new or emerging threats of the sort that the NSA identifies and often keeps secret.

The NSCSAR tests DISA and NSA perform include that kind of advanced threat. Rather than trying to measure overall security, they’ve determined that breaking it down into the different levels of security makes sense. Says DISA’s Hickey: “You’ve got to tackle ‘what are we doing at the perimeter, what are we doing at the region and what are we doing at the endpoint.’” A single overall picture isn’t really possible, he says. Rather, one has to ask: “What is that situational awareness? What are those gaps and seams? What do we stop [doing now] in order to do something else? Those are the types of measurements we are looking at.”
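The layered view Hickey describes – perimeter, region and endpoint, each assessed against an adversary’s movement through the kill chain – can be pictured as a coverage matrix, with gaps surfacing wherever no layer adequately addresses a phase. The toy sketch below is purely illustrative: the phases, layers and coverage values are hypothetical stand-ins, not the classified NSCSAR framework:

```python
# Toy coverage matrix: how well each defensive layer addresses each
# kill-chain phase. All values are hypothetical, not NSCSAR data.
phases = ["recon", "delivery", "exploitation", "exfiltration"]
layers = ["perimeter", "region", "endpoint"]

coverage = {
    ("perimeter", "recon"): 0.6,
    ("perimeter", "delivery"): 0.8,
    ("region", "delivery"): 0.5,
    ("endpoint", "exploitation"): 0.7,
    # no layer covers exfiltration in this toy example
}

def gaps(threshold=0.5):
    """Kill-chain phases where no layer reaches the coverage threshold."""
    return [
        phase for phase in phases
        if not any(coverage.get((layer, phase), 0.0) >= threshold
                   for layer in layers)
    ]

print("Gaps and seams:", gaps())
```

Even this crude picture shows why the layered framing is useful: it turns “how secure are we?” into the answerable question of which phases and layers are under-covered, which is exactly the “gaps and seams” question Hickey raises.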

IT Staffs Lag in Job Satisfaction vs. Non-IT Workers

Information Technology staff are more likely than other workers to feel disconnected from the missions of their overall organizations, a principal reason for diminished job satisfaction, according to a new study of 5,000 employees in 500 different technology organizations.

TinyPulse, a Seattle, Wash., specialist in employee morale and company culture, surveyed workers about job satisfaction, happiness at work and company values. It found tech employees less satisfied than others.

“What we found to be the most surprising was technology workers’ misalignment with their organization’s purpose and values,” TinyPulse CEO David Niu told GovTechWorks. “Only 28 percent of them know their company’s mission, vision and values, versus 43 percent for non-IT employees.”

Other significant disparities included the extent to which their personal values matched those of their employers.

“For non-IT employees, 45 percent responded with a [top score of] 9 or 10,” Niu said, versus 34 percent for IT workers. “That’s surprising, given how much we see in the popular press about how technology companies like Google and Facebook preach their culture and work-life balance.”

Among areas of concern:

  • Only 19 percent of IT employees gave a strongly positive answer when asked how happy they were on the job. That compares to 22 percent among non-IT workers, “which is a statistically significant difference that makes us worry,” the report stated. Employee engagement is key: “The creativity and passion we need from workers in the tech space can’t thrive without it. So when IT employees, some of our best and brightest, tell us that they’re so much unhappier than people in other industries, we need to pay attention and find out why.”
  • IT employees are less likely to see a clear career path ahead of them. Roughly half of all non-IT employees see clear promotion and career paths ahead, versus slightly more than 1 in 3 IT employees.
  • Only a slim 17 percent of IT employees feel strongly valued at work, compared with 22 percent for non-IT employees. “We asked employees if they would reapply for their current job, then compared those answers to how valued they feel at work,” the report claimed. “The two go hand in hand: Even if they stick around, an unappreciated worker is not a motivated one. Recognition communicates to employees that their work matters, driving them to keep putting in that effort.”
  • Only 47 percent of IT employees say they have strong relationships with their coworkers, versus 56 percent for non-IT employees. “Peers are the number one reason that motivates employees to excel,” reads the report. “It’s not their salary, it’s not their boss — it’s not even their own passion for the field. Tactics like awarding raises and measuring job fit are important, but they can’t substitute for colleagues.”

IT staff working for government contractors and embedded in government offices may face particular challenges. They have to support both the government customer’s mission and their mission as a contractor. While most of the time those two challenges are aligned, sometimes they are not.

“Open and honest communication and trusted relationships are critical,” says Collen Nicoll, director of talent acquisition at systems integrator General Dynamics Information Technology (GDIT). “If the relationship is strong and built on mutual transparency from the beginning, whatever disconnects might arise can be dealt with and eliminated quickly and easily. When it’s not, that’s when problems arise. Onsite managers are there to ensure alignment, make sure they are meeting the customer’s needs and work through problems when and if they occur. For most employees, there should be no question about the alignment between the company’s values, their work and that of the government customer.”

Getting that relationship and tone right is especially important for younger, less experienced employees – the heart of the future workforce. Job satisfaction and career progression are the most critical factors in determining their propensity to stay with the same employer.

“One of the most pressing concerns for employees is to know where they’re going at a company,” the TinyPulse report states. “Our internal research found that among millennials — the largest generation in the workplace — 75 percent would consider looking for a new job if they didn’t have opportunities for professional growth.”

Daniel Todd, CEO and founder of Affinity Influencing Systems in Kirkland, Wash., said, “Keeping people motivated is oftentimes a mix of giving them clear, detailed direction while simultaneously talking about the big picture and how each element of what they are working on fits into the big picture.”

What can leaders do to improve IT staff morale?

  • Foster professional growth. Make sure employees fit with their jobs and know where they’re going in the organization. Managers should routinely discuss career development with employees.
  • Build the right team. Leaders should understand what kind of culture they want to create, and hire with it in mind. They should understand how a new hire will fit in before they bring them aboard.
  • Prioritize positive feedback. There’s an epidemic of feeling undervalued at work, leading to disengagement and attrition. Acknowledging employees’ accomplishments every day and talking to them when things go right, as well as wrong, builds confidence and trust.
  • Align employees with the company mission. If the mission isn’t clear to the team, the team won’t pull in the same direction. Clearly communicating core values and hiring the people who fit them helps ensure everyone is on the same page.

Unhappy employees “directly impact others with their work, so disengagement and unhappiness has ripple effects throughout [an organization],” the report concludes. Helping unhappy employees improve their situation – and solving the underlying causes – are among the most important things leaders do.

But that doesn’t mean leaders need to do it all by themselves. Admir Hadziabulic, knowledge supervisor at Heavy Construction System Specialist (HCSS), which creates system software in Sugar Land, Texas, leans on employees to spread the company culture to new hires.

“HCSS evolved over time to develop its culture,” says Hadziabulic, “and we try to ensure that everyone who works here has a hand in that culture.”

Each new employee receives a “Culture Book,” a document written by employees and designed to help new hires integrate “into our established culture,” Hadziabulic says. New employees also go through a culture overview class that “explains why we do things the way we do.”

By helping employees understand and buy into that culture, he says, they’re more likely to stick around.

“Job satisfaction starts with the hiring process and then the employees’ start in the workplace,” says Nicoll of GDIT. “Cultures are hard to change, but morale is fluid and always a function of leadership. Hiring the right people is the first, best step. Next comes aligning them with our company values, this includes giving them the tools, training and support they need to succeed. And finally, celebrating their successes – and helping them to learn from their failures – is also important. Morale is just higher when leaders follow that approach.”

How Employers Try to Retain Tech Talent

As soon as Scott Algeier hires a freshly minted IT specialist out of college, a little clock starts ticking inside his head.

It’s not that he doesn’t have plenty to offer new hires in his role as director of the non-profit Information Technology-Information Sharing and Analysis Center (IT-ISAC) in Manassas, Va., nor that IT-ISAC cannot pay a fair wage. The issue is Algeier is in an all-out war for talent – and experience counts. Contractors, government agencies – indeed virtually every other employer across the nation – values experience almost as much as education and certifications.

As employees gain that experience, they see their value grow. “If I can get them to stay for at least three years, I consider that a win,” says Algeier. “We have one job where it never lasts more than two years. The best I can do is hire quality people right out of college, train them and hope they stick around for three years.”

The Military Context
An October 2016 white paper from the Air Force University’s Research Institute says churn is even more dire among those in the military, particularly in the Air Force, which is undergoing a massive expansion of its cyber operations units.

The present demand for cybersecurity specialists in both the public and private sectors could undoubtedly lead the Air Force to be significantly challenged in retaining its most developed and experienced cyber Airmen in the years ahead, writes Air Force Major William Parker IV, author of the study.

“In the current environment, shortages in all flavors of cyber experts will increase, at least in the foreseeable future. Demand for all varieties of cybersecurity-skilled experts in both the private and public sectors is only rising.”

Meanwhile, there are an estimated 30,000 unfilled cybersecurity jobs across the federal government today, writes Parker. According to the International Information System Security Certification Consortium (ISC2), demand for cyber-certified professionals will continue to increase at 11 percent per year for the foreseeable future. Some estimates place the global cyber workforce shortage at close to a million.

The military – both a primary trainer and employer in cyber – offers some interesting insight. A recent survey of Air Force cyber specialists choosing between re-enlistment or pursuit of opportunities in the civilian world indicates those who chose to reenlist were primarily influenced by job security and benefits, including health, retirement and education and training.

“For those Airmen who intended to separate, civilian job opportunities, pay and allowances, bonuses and special pays, promotion opportunities and the evaluation system contributed most heavily to their decisions [to leave the military],” Parker’s paper concluded.

Indeed, several airmen who expressed deep pride and love of serving in the Air Force stated they chose to separate because they felt their skills were not being fully utilized.

“Also, they were aware they had the ability to earn more income for their families in the private sector,” adds Parker. The re-enlistment bonuses the Air Force offered were not enough to make up the pay differences these airmen saw.

“It is also interesting that many of those who say that they will reenlist, included optimistic comments that they hope ‘someday’ they may be able to apply the cyber skills they have attained in the service of the nation.”

Tech companies present a different set of competitive stresses, competing with high pay, industrial glamor and attractive perks. Apple’s new Cupertino, Calif., headquarters epitomizes the age: an airy glass donut that looks like it just touched down from a galaxy far, far away, filled with cafés, restaurants, a wellness center, a child care facility and even an Eden-like garden inside the donut hole. Amazon’s $4 billion urban campus is anchored by the improbable “spheres”: three interlocking, multistory glass structures housing treehouse meeting rooms, offices and collaborative spaces filled with trees, rare plants, waterfalls and a river that runs through it all.

While Washington, D.C., contractors and non-profits do not have campus rivers or stock option packages, they do have other ways to compete. At the forefront are the high-end missions in which both they and their customers perform. They also offer professional development, certifications, job flexibility and sometimes, the ability to work from home.

“We work with the intelligence community and the DoD,” says Chris Hiltbrand, vice president of Human Resources for General Dynamics Information Technology’s Intelligence Solutions Division. “Our employees have the opportunity to apply cutting-edge technologies to interesting and important missions that truly make a difference to our nation. It’s rewarding work.”

While sometimes people leave for pay packages from Silicon Valley, he admits, pay is rarely the only issue employees consider. Work location, comfort and familiarity, quality of work, colleagues, career opportunities and the impact of working on a worthwhile mission, all play a role.

“It’s not all about maximizing earning potential,” Hiltbrand says. “In terms of money, people want to be compensated fairly – relative to the market – for the work they do. We also look at other aspects of what we can offer, and that is largely around the customer missions we support and our reputation with customers and the industry.”

Especially for veterans, mission, purpose and service to the nation are real motivators. GDIT then goes a step further, supporting staff who are members of the National Guard or military reservists with extra benefits, such as paying the difference in salary when staff go on active duty.

Mission also factors into the equation at IT-ISAC, Algeier says. “Our employees get to work with some of the big hitters in the industry and that experience definitely keeps them here longer than they might otherwise. But over time, that also has an inevitable effect.

“I get them here by saying: ‘Hey, look who you get to work with,’” he says. “And then within a few years, it’s ‘Hey, look who they’re going to go work with.’”

Perks and Benefits
Though automation may seem like a way to replace people rather than entice them to stay, it can be a valuable, if unlikely retention tool.

Automated tools spare staff from the tedious work some find demoralizing (or boring), and save hours or even days for higher-level work, Algeier says. “That means they can now go do far more interesting work instead.” More time doing interesting work leads to happier employees, which in turn makes staff more likely to stay put.

Fitness and wellness programs are two other creative ways employers invest in keeping the talent they have. Gyms, wellness centers, in-house yoga studios, exercise classes and even CrossFit boxes are among the offerings. Exercise relieves stress, and stress can trigger employees to start looking elsewhere for work – so reducing it eases the strain of the job and boosts productivity. Keeping people motivated helps keep them from the negative feelings that might lead them to seek satisfaction elsewhere.

Providing certified life coaches is another popular way employers can help staff, focusing on both personal and professional development. Indeed, Microsoft deployed life coaches at its Redmond headquarters more than a decade ago. They specialize in working with adults with Attention Deficit Hyperactivity Disorder (ADHD), and can help professionals overcome weaknesses and increase performance.

Such benefits used to be the domain of Silicon Valley alone, but not anymore. Fairfax, Va.-based boutique security company MKACyber was launched by Mischel Kwon after posts as director of the Department of Homeland Security’s U.S. Computer Emergency Readiness Team (US-CERT) and as vice president of public sector security solutions for Bedford, Mass.-based RSA. Kwon built her company with what she calls “a West Coast environment.”

The company provides breakfast, lunch and snack foods, private “chill” rooms, and operates a family-first environment, according to a job posting. It also highlights the company’s strong commitment to diversity and helps employees remain “life-long learners.”

Kwon says diversity is about more than just hiring the right mix of people. How you treat them is the key to how long they stay.

“There are a lot of things that go on after the hire that we have to concern ourselves with,” she said at a recent RSA conference.

Retention is a challenging problem for everyone in IT, Kwon says, but managers can do more to think differently about how to hire and keep new talent, beginning by focusing not just on raw technical knowledge, but also on soft skills that make a real difference when working on projects and with teams.

“We’re very ready to have people take tests, have certifications, and look at the onesy-twosy things that they know,” says Kwon. “What we’re finding though, is just as important as the actual content that they know, is their actual work ethic, their personalities. Do they fit in with other people? Do they work well in groups? Are they life-long learners? These types of personal skills are as important as technical skills,” Kwon says. “We can teach the technical skills. It’s hard to teach the work ethic.”

Flexible Work Schedules
Two stereotypes define the modern tech age. One is the all-night coder working in a perk-laden office, fueled by free food, lattes and energy drinks. The other is the virtual meeting populated by individuals spread across the nation or the globe, sitting in home offices or bedrooms, working on their laptops. For many, working from home is no longer a privilege; it’s either a right or, at least, an opportunity to balance work and life. Have to wait for a plumber to fix the leaky sink? No problem: dial in remotely. In the District of Columbia, the government and many employers encourage regular telework as a means to reduce traffic and congestion – as well as for convenience.

For some, working from home also inevitably draws questions. IBM, for years one of the staunchest supporters of telework, is now backtracking on the culture it built, telling workers they need to be in the office regularly if they want to stay employed. The policy shift follows similar moves by Yahoo!, among others.

GDIT’s Hiltbrand says because its staff works at company locations as well as on government sites, remote work is common.

“We have a large population of people who have full or part-time teleworking,” he says. “We are not backing away from that model. If anything, we’re trying to expand on that culture of being able to work from anywhere, anytime and on any device.”

Of course, that’s not possible for everyone. Staff working at military and intelligence agencies don’t typically have that flexibility. “But aside from that,” adds Hiltbrand, “we’re putting a priority on the most flexible work arrangements possible to satisfy employee needs.”
