September 2017

Securing Health Data Means Going Well Beyond HIPAA


A two-decade-old law designed to protect patients’ privacy may be preventing health care organizations from doing more to protect vulnerable health care data from theft or abuse.

The Health Insurance Portability and Accountability Act (HIPAA) established strict rules for how health data can be stored and shared. But in making health care providers vigilant about privacy protection, HIPAA may inadvertently distract providers from focusing on something just as important: overall information security.

“Unfortunately I think HIPAA has focused healthcare organizations too much on data privacy and not enough on data integrity, data loss, disrupted operations and patient safety. You can get your identity back at some point, but not your life,” warns Denise Anderson, president of the National Health Information Sharing and Analysis Center (NH-ISAC). “Many of the attacks we are seeing, such as WannaCry, are disruptive attacks and are not data theft attacks. Organizations should be driven to focus on enterprise risk management and it should come from the Board and CEO level on down.”

“Cybersecurity in Health Care crosses a wide spectrum of issues,” adds Sallie Sweeney, principal cyber solutions architect in the Health and Civilian Solutions Division of systems integrator General Dynamics Information Technology (GDIT). “It’s not just protecting patient data. It includes protecting their financial data and making sure the medical equipment works the way it’s supposed to, when it’s supposed to, without potential for error. Think about the consequences of a Denial of Service attack aimed at the systems monitoring patient vital signs in the ICU. You have to look at the whole picture.”

Many public health agencies and smaller businesses are under-resourced or under-skilled in cyber defense, leaving them reliant on products and service solutions they may not fully understand themselves.

NH-ISAC members have access to support and services such as Cyber-Fit, a non-profit suite of offerings ranging from simple information services to benchmarking assessments of organizations’ cyber health and security posture; shared risk assessments; and cyber services including penetration testing, vulnerability management and incident response.


Maggie Amato, deputy director of security, design and innovation at the Department of Health and Human Services (HHS), believes increased sharing is at least part of the answer.

“We have to build alliances of threat-sharing capabilities,” Amato says. “The speed, ferocity and depth of attack cannot be dealt with by individual agencies alone.”

Indeed, improved information sharing of threats, weaknesses and mitigations is one of the key recommendations of the June 2017 Health Care Industry Cybersecurity Task Force.

But getting companies to share threat data is a challenge. Built-in financial incentives drive some firms to minimize publicity and the potential risk it might pose to their businesses. But Anderson says she can see progress.

“I think the public and private sector came together well during the WannaCry incident,” Anderson says. Though gaps clearly still exist, the swift response was encouraging.

Anderson’s NH-ISAC could play a key role in improving that response further and narrowing the gaps. NH-ISAC is a non-profit, member-driven organization linking private and public hospitals, providers, health insurance firms, pharmaceutical and biotech manufacturers, laboratories, medical device manufacturers, medical schools and others.

The group is one of 21 non-profit information sharing centers designed to help protect specific industries against cyber threats.

“I think within the NH-ISAC the membership did a phenomenal job of sharing indicators, snort signatures, hashes, mitigation strategies, malware analysis, patching issues and other best practice information. We tried as well to get the information out broadly beyond our membership,” she says. “NH-ISAC is a stellar example of how a community can pull together during an incident to help each other out.”
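The indicator sharing Anderson describes can be sketched in a few lines: members publish hashes of known-malicious files, and each organization checks suspect samples against the shared set. The hash value and structure below are illustrative only, not real NH-ISAC data.

```python
import hashlib

# Hypothetical set of SHA-256 hashes shared by an ISAC community
# (this value is illustrative, not a real indicator).
shared_iocs = {
    "ed01ebfbc9eb5bbea545af4d01bf5f1071661840480439c6e5babe8e080e41aa",
}

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

def matches_ioc(data: bytes, iocs: set) -> bool:
    """Check a sample's hash against the shared indicator set."""
    return sha256_of(data) in iocs
```

Real-world sharing adds context around each indicator (confidence, first-seen dates, suggested mitigations), typically in structured formats such as STIX, but the core matching step looks much like this.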

What HIPAA’s Security Rule Requires

The Office of the National Coordinator for Health Information Technology, which is responsible for overseeing the standards and rules applying to electronic health records, writes in its Guide to Security of Electronic Health Information that the HIPAA Security Rule requires:

  • Administrative actions, policies and procedures to prevent, detect, contain and correct security violations and ensure development, implementation and maintenance of security measures to protect electronic personal health information (ePHI).
  • Physical measures, policies and procedures to protect electronic information systems and related buildings and equipment from natural and environmental hazards and unauthorized intrusion to protect and control access to ePHI.
  • Reasonable and appropriate policies and procedures to comply with government requirements, including requirements for contracting with IT services providers, for maintaining data over time and for periodically reviewing policies and procedures.

The NH-ISAC has a long way to go, however. While health care represents one of the largest sectors, the organization has garnered only about 200 members since its founding in 2010. By contrast, the financial services ISAC has more than 6,000 members.

Anderson joined the health ISAC from the finance sector ISAC in part to help drum up participation.

“One of the greatest challenges for the NH-ISAC and all ISACs is the lack of awareness amongst the critical infrastructure owners and operators – particularly the smaller owners and operators – that the ISACs exist and are a valuable tool,” Anderson told the House Energy and Commerce subcommittee on oversight and investigations in April. “Numerous incidents have shown that effective information sharing amongst robust trusted networks of members works in combatting cyber threats.” She suggests tax breaks for new members might help encourage wider participation.

“Protecting highly sensitive information – whether it’s patient records, financial data or sensitive government information – is something that has to be baked into every information system,” said GDIT’s Sweeney. “Too often, we have a health care IT system where security is an afterthought – and trying to bolt on the kinds of protections we need becomes painful and expensive.” Sweeney, whose background includes securing large-scale health care information databases and systems for government clients, concluded: “Health care systems should be no less secure than financial systems in banks.”

Another new tool for promoting intelligence and threat sharing among health providers is the Healthcare Cybersecurity and Communications Integration Center (HCCIC), launched by HHS in May.

Modeled after the Department of Homeland Security’s National Cybersecurity and Communications Integration Center (NCCIC), the new HCCIC (pronounced “Aych-Kick”) has been criticized as potentially duplicating the NCCIC and other organizations. But Anderson defends the new center as a valuable community tool for funneling information from the many fragmented parts of HHS into a central health care information security clearinghouse.

She concedes, however, that HCCIC will have to prove itself.

“One potential downside of pulling together HHS components into one floor could be a slowdown of sharing from the private sector as ‘government’ is involved,” she wrote in a written follow-up to questions posed by Rep. Tim Murphy (R-PA). “Another downside could be that even though all of the components are brought together, sharing could still take place in a fragmented, unproductive manner. There could be risk of inadvertent disclosure or risk of post-hoc regulatory penalties for a reported breach. Finally, if efforts are not effectively differentiated from the NCCIC environment, duplication of effort and additional costs for staffing and resources can result.”

HCCIC, in fact, played a key role in the government’s response to May’s WannaCry ransomware attacks. “HCCIC analysts provided early warning of the potential impact of the attack and HHS responded by putting the secretary’s operations center on alert,” testified Leo Scanlon, deputy chief information security officer at HHS, before a House Energy and Commerce subcommittee June 8. “This was the first time that a cyber-attack was the focus of such a mobilization,” he said. HCCIC was able to provide “real-time cyber situation awareness, best practices guidance and coordination” with the NCCIC.

Anderson sees further upside potential. Based on her prior experience with the financial services ISAC, “the HCCIC should be successful if carried out as envisioned and if it is voluntary and non-regulatory in nature,” she told GovTechWorks. “This will result in improved dissemination within the sector. In addition, by bringing all of the components of HHS under one roof, increased situational awareness and cyber security efficiencies will result.”


Wanted: Metrics for Measuring Cyber Performance and Effectiveness

Chief information security officers (CISOs) face a dizzying array of cybersecurity tools to choose from, each loaded with features and promised capabilities that are hard to measure or judge.

That leaves CISOs trying to balance unknown risks against growing costs, without a clear ability to justify the return on their cybersecurity investment. Not surprisingly, today’s high-threat environment makes it preferable to choose safe over sorry – regardless of cost. But is there a better way?

Some cyber insiders believe there is.


Acting U.S. Federal Chief Information Officer (CIO) Margie Graves acknowledges the problem.

“Defining the measure of success is hard sometimes, because it’s hard to measure things that don’t happen,” Graves said. President Trump’s Executive Order on Cybersecurity asks each agency to develop its own risk management plan, she noted. “It should be articulated on that plan how every dollar will be applied to buying down that risk.”

There is a difference though, between a plan and an actual measure. A plan can justify an investment intended to reduce risk. But judgment, rather than hard knowledge, will determine how much risk is mitigated by any given tool.

The Defense Information Systems Agency (DISA) and the National Security Agency (NSA) have been trying to develop a methodology for measuring the actual value of a given cyber tool’s performance. Their NIPRNet/SIPRNet Cyber Security Architecture Review (NSCSAR – pronounced “NASCAR”) is a classified effort to define a framework for measuring cybersecurity performance, said DISA CIO and Risk Management Executive John Hickey.

“We just went through a drill of ‘what are those metrics that are actually going to show us the effectiveness of those tools,’ because a lot of times we make an investment, people want a return on that investment,” he told GovTechWorks in June. “Security is a poor example of what you are going after. It is really the effectiveness of the security tools or compliance capabilities.”

The NSCSAR review, conducted in partnership with NSA and the Defense Department, may point to a future means of measuring cyber defense capability. “It is a framework that actually looks at the kill chain, how the enemy will move through that kill chain and what defenses we have in place,” Hickey said, adding that NSA is working with DISA on an unclassified version of the framework that could be shared with other agencies or the private sector to measure cyber performance.

“It is a methodology,” Hickey explained. “We look at the sensors we have today and measure what functionality they perform against the threat.… We are tracking the effectiveness of the tools and capabilities to get after that threat, and then making our decisions on what priorities to fund.”
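The approach Hickey describes – mapping deployed sensors onto kill-chain phases to see where defenses are thin – can be sketched roughly as follows. The phase names follow the commonly cited Lockheed Martin kill-chain model; the sensors and their coverage sets are hypothetical, since the NSCSAR framework itself is classified.

```python
# Kill-chain phases (simplified from the Lockheed Martin model)
PHASES = ["recon", "delivery", "exploitation", "installation", "c2", "actions"]

# Hypothetical mapping of deployed sensors to the phases they help cover
SENSORS = {
    "email_gateway": {"delivery"},
    "ids": {"exploitation", "c2"},
    "endpoint_agent": {"installation", "actions"},
}

def coverage_gaps(sensors=SENSORS, phases=PHASES):
    """Return kill-chain phases with no deployed sensor covering them."""
    covered = set().union(*sensors.values())
    return [p for p in phases if p not in covered]
```

Run against the sample mapping above, the function would report that nothing covers reconnaissance – exactly the kind of "gaps and seams" picture Hickey says the review is meant to produce.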

Measuring Security
NSS Labs Inc. independently tests the cybersecurity performance of firewalls and other cyber defenses, annually scoring products’ performances. The Austin, Texas, company evaluated 11 next-generation firewall (NGFW) products from 10 vendors in June 2017, comparing the effectiveness of their security performance, as well as the firewalls’ stability, reliability and total cost of ownership.

In the test, products were presumed to be able to provide basic packet filtering, stateful multi-layer inspection, network address translation, virtual private network capability, application awareness controls, user/group controls, integrated intrusion prevention, reputation services, anti-malware capabilities and SSL inspection. Among the findings:

  • Eight of 11 products tested scored “above average” in terms of both performance and cost-effectiveness; three scored below average
  • Overall security effectiveness ranged from as low as 25.8 percent up to 99.9 percent; average security effectiveness was 67.3 percent
  • Four products scored below 78.5 percent
  • Total cost of ownership ranged from $5 per protected megabit per second to $105, with an average of $22
  • Nine products failed to detect at least one evasion, while only two detected all evasion attempts
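The cost-per-protected-Mbps figures above can be approximated by dividing total cost of ownership by throughput scaled by security effectiveness. NSS’s published formula is more involved, so treat this as a rough sketch with invented figures:

```python
def cost_per_protected_mbps(tco: float, throughput_mbps: float,
                            effectiveness: float) -> float:
    """Rough TCO-per-protected-Mbps metric: total cost of ownership
    divided by tested throughput scaled by security effectiveness."""
    return tco / (throughput_mbps * effectiveness)

# Illustrative figures only: a firewall with a $150,000 total cost of
# ownership, rated at 10,000 Mbps, with 67.3 percent effectiveness
metric = cost_per_protected_mbps(150_000, 10_000, 0.673)
```

The sketch makes the tradeoff visible: a cheap firewall with poor effectiveness can cost more per protected megabit than an expensive one that actually stops attacks.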

NSS conducted similar tests of advanced endpoint protection tools, data center firewalls, and web application firewalls earlier this year.

But point-in-time performance tests don’t provide a reliable measure of ongoing performance. And measuring the effectiveness of a single tool does not necessarily indicate how well it performs its particular duties as part of a suite of tools, notes Robert J. Carey, vice president within the Global Solutions division at General Dynamics Information Technology (GDIT). The former U.S. Navy CIO and Defense Department principal deputy CIO says that though these tests are valuable, they still make it hard to quantify and compare the performance of different products in an organization’s security stack.

The evolution and blurring of the lines between different cybersecurity tools – from firewalls to intrusion detection and prevention, gateways, traffic analysis tools, threat intelligence and anomaly detection – mean it’s easy to add another tool to one’s stack. But as with any multivariate function, it is hard to isolate each tool’s individual contribution to threat protection – or to know which tools you could do without.

“We don’t know what an adequate cyber security stack looks like. What part of the threat does the firewall protect against, the intrusion detection tool, and so on?” Carey says. “We perceive that the tools are part of the solution. But it’s difficult to quantify the benefit. There’s too much marketing fluff about features and not enough facts.”

Mike Spanbauer, vice president of research strategy at NSS, says this is a common concern, especially in large, managed environments — as is the case in many government instances. One way to address it is to replicate the security stack in a test environment and experiment to see how tools perform against a range of known, current threats while under different configurations and settings.

Another solution is to add one more tool to monitor and measure performance. NSS’ Cyber Advanced Warning System (CAWS) provides continuous security validation monitoring by capturing live threats and then injecting them into a test environment mirroring customers’ actual security stacks. New threats are identified and tested non-stop. If they succeed in penetrating the stack, system owners are notified so they can update their policies to stop that threat in the future.

“We harvest the live threats and capture those in a very careful manner and preserve the complete properties,” Spanbauer said. “Then we bring those back into our virtual environment and run them across the [cyber stack] and determine whether it is detected.”
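That replay loop can be sketched simply: captured samples are run through a mirror of the deployed detection stack, and anything that evades every layer triggers a notification. The detectors below are toy stand-ins, not CAWS internals.

```python
# Toy detectors standing in for layers of a mirrored security stack.
# Real layers would be firewall rules, IPS signatures, sandboxing, etc.

def signature_detector(sample: str) -> bool:
    """Matches a known-bad string, like a signature engine."""
    return "trojan" in sample

def heuristic_detector(sample: str) -> bool:
    """Flags suspiciously nested eval() calls, like a heuristic engine."""
    return sample.count("eval(") > 2

STACK = [signature_detector, heuristic_detector]

def validate(samples, stack=STACK):
    """Return samples that evaded every detector in the mirrored stack."""
    return [s for s in samples if not any(det(s) for det in stack)]

missed = validate(["trojan.exe payload", "eval(eval(eval(x)))", "novel-dropper"])
```

In this sketch, "novel-dropper" slips past both layers, so the system owner would be notified to update policies before the same threat arrives on the production network.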

Adding more tools and solutions isn’t necessarily what Carey had in mind. While that monitoring may reduce risk, it also adds another expense.

And measuring value in terms of return on investment is a challenge when every new tool adds real cost and results are so difficult to define. In cybersecurity, though managing risk has become the name of the game, actually calculating risk is hard.

The National Institute of Standards and Technology (NIST) created the 800-53 security controls and the cybersecurity risk management framework that encompass today’s best practices. Carey worries that risk management delivers an illusion of security by accepting some level of vulnerability depending on level of investment. The trouble with that is that it drives a compliance culture in which security departments focus on following the framework more than defending the network and securing its applications and data.

“I’m in favor of moving away from risk management,” GDIT’s Carey says. “It’s what we’ve been doing for the past 25 years. It’s produced a lot of spend, but no measurable results. We should move to effects-based cyber. Instead of 60 shades of gray, maybe we should have just five well defined capability bands.”

The ultimate goal: Bring compliance into line with security so that doing the former delivers the latter. But the evolving nature of cyber threats suggests that may never be possible.

Automated tools will only be as good as the data and intelligence built into them. True, automation improves speed and efficiency, Carey says. “But it doesn’t necessarily make me better.”

System owners should be able to look at their cyber stack and determine exactly how much better security performance would be if they added another tool or upgraded an existing one. If that were the case, they could spend most of their time focused on stopping the most dangerous threats – zero-day vulnerabilities that no tool can identify because they’ve never been seen before – rather than ensuring all processes and controls are in place to minimize risk in the event of a breach.

Point-in-time measures based on known vulnerabilities and available threats help, but may be blind to new or emerging threats of the sort that the NSA identifies and often keeps secret.

The NSCSAR tests DISA and NSA perform include that kind of advanced threat. Rather than trying to measure overall security, they’ve determined that breaking it down into the different levels of security makes sense. Says DISA’s Hickey: “You’ve got to tackle ‘what are we doing at the perimeter, what are we doing at the region and what are we doing at the endpoint.’” A single overall picture isn’t really possible, he says. Rather, one has to ask: “What is that situational awareness? What are those gaps and seams? What do we stop [doing now] in order to do something else? Those are the types of measurements we are looking at.”


Why a Long-Term Data Strategy is Essential to Stopping Insider Threats

Harold T. Martin III was arrested in August 2016 for allegedly stealing and hoarding terabytes of highly classified documents taken from multiple intelligence agencies, where he’d worked as a contractor for more than 20 years. Around the same time, the Shadow Brokers hacker group exposed a number of sophisticated cyber-hacking tools developed by the National Security Agency (NSA), now believed to have been downloaded by an agency insider. Those tools were later tied to a pair of major international cyberattacks in spring 2017. A third incident dating to the same time frame saw a retired DuPont engineer accused by the company of stealing proprietary trade secrets and then leveraging them for his independent consulting business.

In all three cases, trusted individuals with direct access to highly sensitive information violated that trust and evaded internal safeguards before making off with data that was never supposed to leave the premises.

Today, all government organizations and most large private sector firms operate insider threat programs to detect when insiders begin going rogue. The systems collect vast volumes of user data, sifting through it looking for anomalies that indicate a change in behavior patterns. Such changes include sudden copying, downloading or printing of unusual numbers or types of files. What began as a simple quest to track and correlate information is becoming a costly challenge. Tracking each user’s digital behavior means capturing everything from when and where they log on and off, to what applications they use, which data they touch and what and when they download, print or copy. Saving all that data – which can amount to terabytes daily and even petabytes over time – isn’t free.

“You need to have a long-term data strategy that optimizes that balance between access and cost,” says David Sarmanian, an enterprise architect with systems integrator General Dynamics Information Technology (GDIT), which builds and manages IT systems for a range of government customers in both the classified and unclassified worlds. “Then you need to revisit that strategy annually to make sure it’s current and up-to-date with advancing technology.”

As network speeds and encryption use have grown, insider threat protection has focused on monitoring user activity at the endpoint, said Mike Crouse, director of data and insider threat business solutions for ForcePoint in Austin, Texas.

For agencies with little sensitive data, the collection logs are relatively small. But where national security systems are involved – or data is highly sensitive because it contains personal or perhaps proprietary industry information – the volumes needing collection ramp up quickly.

Minimum standards developed for the National Insider Threat Policy call for “monitoring of user activity on U.S. government networks” to detect, monitor and analyze anomalous user behavior. How much monitoring is necessary – including which data to collect – is left to the discretion of individual communities and agencies.

The Intelligence Community Standard for collecting audit data is the most extensive, calling for the recording and collection of almost every user action: logging on and off; creating, accessing, deleting or modifying files; uploads, downloads and printing or copying of files; changes to access or privilege levels; and application use. Even here, decisions must be made. Should the agency collect actual files and the specific changes made in every instance, or just a record that the files were accessed?

The latest best practices from the CERT division of Carnegie Mellon University’s Software Engineering Institute begin in the hiring process and continue throughout an employee’s tenure. Necessary steps include:

  • Monitoring and responding to suspicious or disruptive behavior
  • Monitoring and controlling remote access from all endpoints, including mobile devices
  • Establishing a baseline of normal network device behavior
  • Employing a log correlation engine or security information and event management (SIEM) system to log, monitor, and audit employee actions

Agencies must decide for themselves how long a period to maintain and review, said Michael Albrethsen, information system security analyst for CERT.

A baseline profile for each user tracks log-ins and log-outs to IT systems and applications, locations and devices used and specific data accessed or manipulated. Combined with job titles and functions, network and application permissions and physical access to facilities, this provides a portrait against which anomalous behavior can be detected.
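A minimal version of this baseline-and-flag approach is a z-score test on a single behavioral metric, such as daily file accesses. Real insider threat platforms correlate many such signals with job role and physical access data, so this is only a sketch:

```python
import statistics

def is_anomalous(history, today, threshold=3.0):
    """Flag today's count if it exceeds the user's historical mean by
    more than `threshold` standard deviations (a simple z-score test)."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return today > mean  # any increase over a perfectly flat baseline
    return (today - mean) / stdev > threshold

# A user who normally touches about 20 files a day suddenly copies 400
baseline = [18, 22, 19, 21, 20, 23, 17]
```

A single anomalous day usually triggers review rather than action; the value of the long baseline is that slow changes in behavior, of the kind insider cases often show, also become visible.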

But that’s really the easy part, says ForcePoint’s Crouse. “We have visibility into anything the user does on the endpoint or in the cloud,” he says. “But the ability to collect doesn’t mean we should collect it. It isn’t an issue of Big Data or small data, it’s the right data.”

The right data depends on the organization’s risk tolerance, mission, size and budget. An online app for reserving a National Park camping spot does not require the same level of scrutiny as a classified database.

In the least critical applications, log data may be retained for as little as a few months. In typical government applications, it may be a year or more, and for classified environments, audit data for forensic investigations might be kept for the user’s entire career (and even beyond). The DuPont engineer’s theft wasn’t apparent until after he left the company. That’s when data really stacks up and portability issues come into play. Data formats and media used today may be obsolete in a decade or two. Information security officers can’t afford to simply stack up disks or tapes for future use.

Even when an organization is selective about the information gathered, it can quickly accumulate to the petabyte range (1 petabyte equals 1 million gigabytes) for the law enforcement, defense and intelligence communities, Albrethsen said. “That is where the state of the art is moving.”

Containing Data Volumes
Managing this takes strategy. By strategically identifying which pieces of data must be saved from each event and how such data is stored, the volumes can be made more manageable.

“You’re not going to collect every bit and byte that comes from the endpoint. You don’t want to pay for data that you never use,” says Sarmanian. “And not all data being collected needs to be available for immediate access.” Records from different sources will come in different formats, and these logs and other data must be normalized before they can be stored and used. Typically, this is done at the point of ingestion, often by a SIEM solution designed for this purpose.
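That ingestion-time normalization can be sketched as a mapping from each source’s native record layout onto one common event schema. The source formats below are invented for illustration; a production SIEM ships parsers for hundreds of real formats.

```python
# Hypothetical source formats, normalized into one common event schema
# (user, action, timestamp) at the point of ingestion.

def normalize(record: dict, source: str) -> dict:
    """Map a source-specific log record onto a common event schema."""
    if source == "windows":
        return {
            "user": record["AccountName"],
            "action": record["EventType"],
            "timestamp": record["TimeGenerated"],
        }
    if source == "syslog":
        # Invented pipe-delimited payload: "user|action|timestamp"
        user, action, ts = record["msg"].split("|")
        return {"user": user, "action": action, "timestamp": ts}
    raise ValueError(f"unknown source: {source}")
```

Once every record shares a schema, correlation across sources (the heart of insider threat detection) becomes a simple query rather than a parsing exercise.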

The next step is storage. Storage requirements determine the long-term cost of a program. While storage costs are perpetually declining, those savings are eaten up by ever-increasing volumes of saved data.

“Storage is cheap, but you still have to buy a lot of it,” said James Cemelli, a GDIT project manager. “Most agencies now are storing data in the low petabyte range, but that will grow exponentially as long-term data collection continues.”

Fortunately, not all such data need be treated the same. Security officials should check network logs regularly for unusual activity and examine files for malware and malicious IP addresses. Network Operations teams will want to have 12 to 13 months of log data available. Tiered storage models provide lower-cost options for data only rarely needed, while still maintaining instant access to high-value current data.

Seldom-used data can be maintained in a storage-area network, while rarely-used data needed only for forensic investigations following an event could be kept on less expensive media such as hard disk drives, compact discs or tape drives.
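The tiered model described above can be sketched as a simple routing rule keyed on record age. The thresholds (roughly 13 months of hot data, then a storage-area network, then archival media) echo the figures in this article but are otherwise illustrative:

```python
def storage_tier(age_days: int) -> str:
    """Pick a storage tier for a log record based on its age.
    Thresholds are illustrative, not a recommended policy."""
    if age_days <= 395:       # ~13 months: hot, instantly accessible
        return "primary"
    if age_days <= 3 * 365:   # seldom used: storage-area network
        return "san"
    return "archive"          # long-term forensic retention: disk or tape
```

A real lifecycle policy would also consider data sensitivity and legal retention requirements, not just age, but the cost structure is the same: pay for instant access only where the mission demands it.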

There is a tradeoff between cost and availability. Data collection and storage policies for an insider threat program should be based on cost-benefit-risk analysis. “Creating a long-term data storage strategy helps agencies maximize the benefits of advancing technology while still being able to support their mission of defeating insider threat,” said Cemelli.
