Federal Civilian Agencies Edge Toward Single, Shared Network

Momentum is picking up for establishing a single, secure network to protect federal civilian agencies.

The idea was advanced in the Commission on Enhancing National Cybersecurity report late last year and in recommendations by House Homeland Security Committee Chairman Rep. Mike McCaul in January. It also has the backing of acting Federal Chief Information Security Officer Grant Schneider, former White House Cybersecurity Coordinator Michael Daniel and former National Security Agency Chief Keith Alexander.

A unified network would theoretically be easier to secure and monitor and less expensive because it could leverage the full scale of the government. Schneider estimates it would be comparable in size to the Department of Defense’s (DoD) networks already managed as a central service by the Defense Information Systems Agency (DISA).

Grant Schneider, Acting Federal Chief Information Security Officer

While data traffic moving from one defense agency to another never leaves the DoD environment, that’s not true in the civilian portion of the government. “Every time someone sends something from the White House to the Department of Commerce, it goes out in the wild,” Schneider says. “That is a risk.”

Centralizing networks and their management would solve that problem. “If we could move to some sort of a .gov environment, it would provide us opportunities we don’t have today,” he said. That includes improved situational awareness about the health of the overall network and common standards and architectures that would be easier to defend.

Some critics say that means they’d also be easier to attack, but Schneider says the balance of advantages is squarely on the side of centralized control. “A common architecture would enable us to apply more seamlessly some of the tools that are used across the .mil side [to defend] the federal civilian side,” he adds. “Today, that often proves to be more of a challenge than we would like.”

Daniel, who served five years managing White House cyber policy for the National Security Council, agrees, saying the idea of a unified network gained traction in the past year or so as competition for cyber talent and other resources increased and agencies struggled to keep up.

Michael Daniel, Former White House Cybersecurity Coordinator

“We’ve taken the approach on the federal civilian side that it’s every agency for itself, every bureau for itself,” says Daniel, who now heads the non-profit Cyber Threat Alliance, a clearinghouse for cyber threat data. “The result is it’s very difficult for those agencies to find the resources to manage and support their IT and their cybersecurity. We should start thinking about this differently: Agencies should retain accountability for their information assets, the information they need to pursue their mission. But that accountability does not mean they need to have their agency perform all of the different tasks required to do IT and cybersecurity from the top to the bottom of the stack.”

Instead, he argues, the network and transport layer of the IT stack ought to be managed centrally, providing a common architecture and security standard.

“Then the agencies would ride on top of that, and they would still be responsible for managing the application layer and the cybersecurity of the applications,” Daniel says. “This would enable us to dramatically improve IT network management and cybersecurity in the federal civilian sector.”

Daniel envisions the General Services Administration (GSA) taking on the network management role, possibly through a series of competitive managed services contracts, and the Department of Homeland Security (DHS) managing cybersecurity across the entire federal civilian enterprise.

“For the agencies, this means they can focus resources on the stuff they really care about, which is the mission applications their user base operates every day,” Daniel says. “You can get out of the business of having to manage the commodity network and transport layer.”

Though some might worry that in this setup, GSA would become a monopoly supplier, charging fees to agencies without competition, Daniel doesn’t buy it. “The only way for the federal government to get the kind of economies of scale that we know we could get would be to do a lot of that purchasing centrally.”

GSA would have to be transparent in how that was managed and charged. Agencies would need the ability to make tradeoffs based on specific mission requirements, such as system availability, guaranteed uptime or high-speed recovery in case of a failure.

Rep. Mike McCaul (R-Texas), chairman of the Homeland Security Committee, plans legislation this year that would create a cyber agency within DHS. At least one draft of the Trump administration’s much-anticipated executive order on cybersecurity also calls for wider use of shared services across federal civilian agencies. In January, McCaul co-authored “From Awareness to Action: A Cybersecurity Agenda for the 45th President,” which said “cybersecurity at DHS needs to be an operational component agency like the Coast Guard or U.S. Customs and Border Protection,” suggested there may be no greater homeland security mission than securing cyberspace, and recommended creating a National Cybersecurity Agency inside DHS.

Similarly, the Commission on Enhancing National Cybersecurity, led by former National Security Advisor Tom Donilon and completed in December 2016, also recommended a single federal civilian network. “The Administration should establish a program to consolidate all civilian agencies’ network connections (as well as those of appropriate government contractors) into a single consolidated network,” it recommends. “The new agency should develop and implement a program to provide secure, reliable network services to all civilian government agencies, thereby providing a consolidated network for all .gov entities.”

Defining what that network encompasses will be challenging. With data centers closing and many agencies moving data rapidly into commercial cloud infrastructure, defining a network perimeter is no longer cut and dried.

But limiting the points of exposure would help reduce risk, Schneider argues. “We have 56 Trusted Internet Connections [across the civilian sector] that we’re trying to protect and that we’re spending a lot of money on,” he says. Might those be reduced? “The Department of Defense has 11 somewhat equivalent connections,” he adds.

Stan Tyliszczak, Vice President for Technology Integration and Chief Engineer, GDIT

Systems integration contractors that provide the services to manage and secure government networks agree. “Having so many different networks and network owners absolutely adds to the security challenges,” says Stan Tyliszczak, vice president for technology integration and chief engineer at General Dynamics Information Technology. “As providers of managed services for these networks, we’re constantly juggling multiple different technologies, products, policies and compliance. Instead of automating as much as we can and building deep analytical capability, we end up spread a mile wide and an inch deep, trying to protect against an increasingly sophisticated threat.”

Tyliszczak says it’s easy to get overly focused on the idea of a physical perimeter when the real focus should be on how each layer in the system stack is secured, from the network and operating system through the application layer. The Defense Department and experienced system integrators already have proven this can be done, he and others say.

Consolidating the myriad federal civilian agency networks into a single network architecture will take years to execute. Funding alone will be a challenge, and as Daniel says, it is funding – not technology – that drives policy implementation. Case in point: While the Defense Department long ago established centralized oversight of its networks, efforts to drive toward a Joint Information Environment (JIE) with standardized technologies across all the military services have moved slowly without centralized funding. JIE is not a program of record but a concept spanning numerous programs, which is one reason JIE standards remain a work in progress.

Likewise, the mammoth Intelligence Community Information Technology Enterprise (IC ITE) effort to standardize systems across the IC has also struggled. Two years into a billion-dollar contract, the massive rollout of its second-generation virtual Desktop Environment (DTE) computer systems, meant to standardize IT services across the IC, continues to slip further behind schedule. Deployment of its Phase 2, originally set for last year, won’t start until this summer at the earliest.

“The problem with large-scale initiatives is underestimating their complexity,” says GDIT’s Tyliszczak. “At the scale of a JIE or IC-ITE – or a new .gov network – the complexity is in the size. So you have to keep the architecture as simple as possible. The more complex the system, the greater the risks in scaling it up,” he adds. “You want to stick with proven capability so you can focus on the scaling issues.”

This is true with networks, but also other technologies. The advantage of FedRAMP – the Federal Risk and Authorization Management Program created to accelerate cloud adoption across the federal government – is that it provides some assurance that the initial security requirements are set. That lets agency managers focus on their specific applications, saving time and money.

“When you start with a FedRAMP-certified cloud, you know the security basics have already been taken care of,” Tyliszczak says. “You can focus on scaling the solution instead of certifying the infrastructure.”

Federal civilian networks differ from defense networks in that they are not riding on private, dedicated fiber. In defense networks, DISA owns the fibers, Tyliszczak says. But civilian agencies’ networks ride on the same fibers as commercial traffic, using privately owned lines belonging to Verizon, AT&T and just a few others to move information from place to place. So the controls are all built on top of that, in the software and hardware that runs on the network, as opposed to the network infrastructure itself.

“There’s already quite a bit of government data in the cloud,” Schneider says. “Working out [where the perimeter might lie] would take time.” But he said it makes sense to establish security guidelines and to put one agency in charge of that for the whole civilian sector of the government.

“The old notion of a perimeter is not really applicable anymore,” Tyliszczak says. “You’re defending network access points and defending your data inside the network. It’s a lot easier to defend fewer access points if you can get to that point.”

Agencies have shown an increasing willingness to outsource services they don’t see as essential to their mission. More and more chief information officers see advantages in getting out of the data center business, getting out of managing networks which they see as commodity services, and focusing on the unique systems and technology that drive mission effectiveness.

That approach seems to square with the new administration’s focus on efficiency. “Certainly the administration believes hey, if the DoD can do this, why don’t we do more things similarly?” Schneider says. “That was my take coming out of DoD.”
