CDM Program Starts to Tackle Complexities of Cloud
The Trump administration’s twin priorities for federal information technology – improved cybersecurity and modernized federal systems – are in natural tension: How do you protect a federal architecture that is rapidly changing as agencies push more and more systems into the cloud?
The Department of Homeland Security’s (DHS) Continuous Diagnostics and Mitigation (CDM) program’s early phases focus on understanding what systems are connected to federal networks and who has access to those systems. The next phases – understanding network activity and protecting federal data itself – will pose stiffer challenges for program managers, chief information security officers and systems integrators developing CDM solutions.
Figuring out how to monitor systems in the cloud – and how to examine and protect data there – is a major challenge that is still being worked out, even as more and more federal systems head that way.
“Getting that visibility into the cloud is critical,” says DHS’s CDM Program Manager Kevin Cox. Establishing a Master Device Record, which recognizes all network systems, and establishing a Master User Record, which identifies all network users, were essentially first steps, he told a gathering of government security experts at the ATARC Chief Information Security Officer Summit Jan. 25. “Where we’re headed is to expand out of the on-premise network and go out to the boundary.”
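Conceptually, the Master Device Record works as an authoritative inventory that observed network traffic can be reconciled against. The sketch below illustrates that idea in Python; the field names and the record shape are invented for illustration and are not the CDM program's actual schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DeviceRecord:
    """One illustrative Master Device Record entry (fields are hypothetical)."""
    mac: str           # hardware address observed on the network
    hostname: str
    owner_agency: str
    authorized: bool   # was this device approved to connect?

def unauthorized_devices(observed_macs, master_record):
    """Return observed hardware addresses with no authorized record entry."""
    known = {d.mac for d in master_record if d.authorized}
    return sorted(m for m in observed_macs if m not in known)

master = [
    DeviceRecord("aa:bb:cc:00:00:01", "hq-laptop-01", "DHS", True),
    DeviceRecord("aa:bb:cc:00:00:02", "hq-printer-01", "DHS", True),
]
scan = ["aa:bb:cc:00:00:01", "aa:bb:cc:00:00:99"]
print(unauthorized_devices(scan, master))  # ['aa:bb:cc:00:00:99']
```

A Master User Record plays the same role for identities: anyone logging in who is not in the authorized set is flagged for review.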
As federal systems move into the cloud, DHS wants CDM to follow – and to have just as much visibility and understanding of that part of the federal information technology ecosystem as it has for systems in government data centers. “We need to make sure we know where that data is, and understand how it is protected,” Cox says.
Eric White, cybersecurity program director at General Dynamics Information Technology (GDIT) Health and Civilian Solutions Division, has been involved with CDM almost from its inception. “As agencies move their data and infrastructures from on premise into these virtualized cloud environments, frequently what we see is the complexity of managing IT services and capabilities increasing between on-premise legacy systems and the new cloud solutions. It creates additional challenges for cybersecurity writ large, but also specifically, CDM.”
Combining virtualized and conventional legacy systems is an integration challenge, “not just to get the two to interact effectively, but also to achieve the situational awareness you want in both environments,” White says. “That complexity is something that can impact an organization.”
The next phase of CDM starts with a network of sensors to identify “what is happening on the network,” including monitoring for deviations between the “desired state” and the “actual state” of the device configurations that govern network health and security. In a closed, on-premises environment, it’s relatively easy to monitor all those activities, because a network manager controls all the settings.
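The desired-state-versus-actual-state comparison described above can be sketched as a simple diff: for each setting in a policy baseline, report any device whose observed value differs. The settings below are illustrative examples, not actual CDM checks.

```python
def find_defects(desired, actual):
    """Compare desired-state settings to a device's actual state; report mismatches."""
    defects = {}
    for setting, want in desired.items():
        have = actual.get(setting)  # None if the setting is missing entirely
        if have != want:
            defects[setting] = {"desired": want, "actual": have}
    return defects

# Hypothetical policy baseline and observed device state.
desired = {"ssh_root_login": "disabled", "tls_min_version": "1.2", "patch_level": 42}
actual  = {"ssh_root_login": "enabled",  "tls_min_version": "1.2"}

print(find_defects(desired, actual))
# {'ssh_root_login': {'desired': 'disabled', 'actual': 'enabled'},
#  'patch_level': {'desired': 42, 'actual': None}}
```

In a closed environment, `actual` comes from sensors the agency controls end to end; the cloud complications discussed next arise when that observed state is gathered from infrastructure the agency does not own.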
But as agencies incorporate virtualized services, such as cloud-based email or office productivity software, new complexities are introduced. Those services can incorporate their own set of security and communications standards and protocols. They may be housed in multi-tenant environments and implemented with proprietary security capabilities and tools. In some cases, these implementations may not be readily compatible with federal continuous monitoring solutions.
The Report to the President on Federal IT Modernization describes the challenges faced in trying to combine existing cyber defenses with new cloud and mobile architectures. DHS’s National Cybersecurity Protection System (NCPS), which includes both the EINSTEIN cyber sensors and a range of cyber analytic tools and protection technologies, provides value, the report said, “but are not enough to combat the full spectrum of advanced persistent threats that rapidly change the attack vectors, tactics, techniques and procedures.”
DHS began a cybersecurity architectural review of federal systems last year, building on a similar Defense Department effort by the Defense Information Systems Agency, which conducted the NIPRNET/SIPRNET Cybersecurity Architecture Review (NSCSAR) in 2016 and 2017. Like NSCSAR, the new .Gov Cybersecurity Architecture Review (.GovCAR) intends to take an adversary’s-eye-view of federal networks in order to identify and fix exploitable weaknesses in the overall architecture. In a massively federated arrangement like the federal government’s IT system, that will be a monumental effort.
Cox says the .GovCAR review will also “layer in threat intelligence, so we can evaluate the techniques and technologies we use to see how those technologies are helping us respond to the threat.”
“Ultimately, if the analysis shows our current approach is not optimal, they will look at proposing more optimal approaches,” he says. “We’re looking to be nimble with the CDM program to support that effort.”
The rush to implement CDM as a centrally funded but locally deployed system of systems means the technology varies from agency to agency and implementation to implementation. Meanwhile, agencies have also proceeded with their own modernization and consolidation efforts. So among the pressing challenges is figuring out how to get those sensors and protection technologies to look at federal networks holistically. The government’s network perimeter is no longer a contiguous line. Cloud-based systems are still part of the network, but the security architecture may be completely different, with complex encryption that presents challenges to CDM monitoring technologies almost as effectively as it blocks adversaries.
“Some of these sensors on the network don’t operate too well when they see data in the wrong format,” White explains. “If you’re encrypting data and the sensors aren’t able to decipher it, those sensors won’t return value.”
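One reason encrypted payloads defeat content-inspecting sensors is statistical: well-encrypted data is nearly indistinguishable from random noise, so there is no readable structure for a sensor to parse. A common heuristic for spotting this – not a CDM mechanism, just an illustration of the problem – is Shannon entropy, sketched here with `os.urandom` standing in for ciphertext.

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte; values near 8.0 mean the payload looks like noise."""
    if not data:
        return 0.0
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

plaintext = b"GET /index.html HTTP/1.1\r\nHost: example.gov\r\n" * 50
ciphertext_like = os.urandom(4096)  # stand-in for an encrypted payload

print(round(shannon_entropy(plaintext), 2))        # low: readable structure
print(round(shannon_entropy(ciphertext_like), 2))  # near 8.0: opaque to inspection
```

A sensor facing the high-entropy payload can still see packet sizes, timing and endpoints, but the content itself returns no value – which is exactly White’s point.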
There won’t be a single answer to solving that riddle. “What you’re trying to do is gather visibility in the cloud, and this requires that you be proactive in working with your cloud service providers,” White says. “You have to understand what they provide, what you are responsible for, what you will have a view of and what you might not be able to see. You’re going to have to negotiate to be compliant with federal FISMA requirements and local security risk thresholds and governance.”
Indeed, Cox points out, “There’s a big push to move more federal data out to the cloud; we need to make sure we know where that data is, and understand how it is protected.” Lapses do occur.
“There have been cases where users have moved data out to the cloud, and there was uncertainty as to who was configuring the protections on that data – the cloud service provider or the user – and because of that uncertainty, the data was left open for others, or adversaries, to view it,” Cox says.
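The exposure Cox describes – neither party sure who set the access controls – is often caught by a simple policy audit over the storage inventory. The sketch below uses invented field names; real providers expose equivalent settings, such as account-level public-access blocks, through their own APIs.

```python
def publicly_readable(buckets):
    """Flag storage containers left world-readable.

    A container is exposed when its ACL grants public read AND no
    provider-side 'block public access' guardrail overrides it.
    (Field names here are hypothetical, for illustration only.)
    """
    return [
        b["name"]
        for b in buckets
        if b.get("acl") == "public-read" and not b.get("block_public_access")
    ]

inventory = [
    {"name": "agency-backups", "acl": "private",     "block_public_access": True},
    {"name": "shared-reports", "acl": "public-read", "block_public_access": True},
    {"name": "case-files",     "acl": "public-read", "block_public_access": False},
]
print(publicly_readable(inventory))  # ['case-files']
```

Run continuously, a check like this turns the “who configured it?” ambiguity into a concrete finding, whichever party caused it.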
Addressing that issue will be a critical piece of CDM’s Phase 3, and Phase 4 will go further into data protection, Cox says: “It gets into technologies like digital rights management, data loss prevention, architecturally looking at things like microsegmentation, to ensure that – if there is a compromise – we can keep it isolated.”
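Microsegmentation, which Cox mentions, amounts to a default-deny policy between small network segments: traffic flows only where a rule explicitly allows it, so a compromised segment cannot roam freely. A minimal sketch, with segment names and ports invented for illustration:

```python
# Default-deny east-west policy: a flow is allowed only if an explicit rule
# permits this (source segment, destination segment, port) triple.
POLICY = {
    "web": {"app": {8443}},   # web tier may reach the app tier on 8443 only
    "app": {"db":  {5432}},   # app tier may reach the database on 5432 only
}

def is_allowed(src: str, dst: str, port: int, policy=POLICY) -> bool:
    return port in policy.get(src, {}).get(dst, set())

print(is_allowed("web", "app", 8443))  # True: explicitly permitted
print(is_allowed("web", "db", 5432))   # False: a compromised web tier
                                       # cannot reach the database directly
```

The isolation Cox describes falls out of the default: anything not named in the policy – including an attacker’s lateral movement – is denied.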
Critics have questioned the federal government’s approach, focusing on the network first rather than the data. But Cox defends the strategy: “There was such a need to get some of these foundational capabilities in place – to get the basic visibility – that we had to start with Phase 1 and Phase 2. We had to understand what the landscape looked like, what the user base looked like, so we would then know how to protect the data wherever it was.”
“Now we’re really working to get additional protections to make sure that we will have better understanding if there is an incident and we need to respond, and better yet, keep the adversary off the network completely.”
The CDM program changed its approach last year, rolling out a new acquisition vehicle dubbed CDM DEFEND, which leverages task orders under the Alliant government-wide acquisition contract (GWAC), rather than the original “peanut butter spread” concept. “Before, we had to do the full scope of all the deployments everywhere in a short window,” he says, adding that now, “We can turn new capabilities much more quickly.”
Integrators are an essential partner in all of this, White says, because they have experience with the tools, experience with multiple agencies and the technical experience, skills and knowledge to help ensure a successful deployment. “The central tenet of CDM is to standardize how vulnerabilities are managed across the federal government, how they’re prioritized and remediated, how we manage the configuration of an enterprise,” he says. “It’s important to not only have a strategy at the enterprise level, but also at the government level, and to have an understanding of the complexity beyond your local situation.”
Ultimately, a point solution is always easier than an enterprise solution, and an enterprise solution is always easier than a multi-enterprise solution. Installing cyber defense tools for an installation of 5,000 people is relatively easy – until you have to make that work with a government-wide system that aims to collect and share threat data in a standardized way, as CDM aims to do.
“You have to take a wider, broader view,” says Stan Tyliszczak, chief engineer at GDIT. “You can’t ignore the complex interfaces with other government entities because when you do, you risk opening up a whole lot of back doors into sensitive networks. It’s not that hard to protect the core of the network – the challenge is in making sure the seams are sewn shut. It’s the interfaces between the disparate systems that pose great risk. Agencies have been trying to solve this thing piece by piece, but when you do that you’re going to have cracks and gaps. And cracks and gaps lead to vulnerabilities. You need to take a holistic approach.”
Agency cyber defenders are all in. Mittal Desai, CISO at the Federal Energy Regulatory Commission (FERC), says his agency is in the process of implementing CDM Phase 2, and looks forward to the results. “We’re confident that once we implement those dashboards,” he says, “it’s going to help us reduce our mean time to detect and our mean time to respond to threats.”