Why Modernization Is Key to National Cyber Strategy

President Trump’s cybersecurity strategy hinges on modernizing legacy computer systems that sap resources and hold back agencies from updating security policies. The approach views cloud-based services not only as more flexible and less costly, but also as inherently more secure.

Jeanette Manfra
National Protection and Programs Directorate,
Department of Homeland Security

“It’s not always a fact that IT modernization and cybersecurity have to go hand in hand,” said Jeanette Manfra, acting deputy undersecretary for Cybersecurity and Communications for the National Protection and Programs Directorate in the Department of Homeland Security (DHS). “But that is very important and something this administration recognized immediately: that the lack of modernization is itself a big vulnerability for the government.”

Combining hundreds of federal networks into one or more virtualized networks is one part of that strategy. But just as essential is replacing aging infrastructure that’s both expensive to operate and difficult to protect. Manfra said the government must graduate from today’s perimeter-focused security approach, which concentrates on defending the network itself, to a data-centered approach designed to protect the information assets residing on it.

That’s hardly a new concept to security experts, but it reflects a massive cultural shift for government. Retired Air Force Maj. Gen. Dale Meyerrose, a former chief information officer for the director of national intelligence and now an information security professor and consultant, said protecting the network is a lost cause; it’s best to assume the enemy is already inside your network.

“Every industry sector has a time lag between the infiltration of the evil-doers into your enterprise and their discovery,” Meyerrose said in March at the Cyber Resilience Summit in Reston, Va. The video game industry is the most skilled at rooting out infiltrators, he said, finding intruders in less than a week, on average. The worst? “The United States Government,” he continued. “The average time between infiltration and discovery is almost two years.”

The security paradigm is broken because the focus is in the wrong place, Meyerrose said. “The evil-doers don’t want your network. They want the stuff that’s in your network.”

By definition, cloud-based services blur the lines of conventional perimeter security, forcing CIOs and chief information security officers (CISOs) to focus on securing organizational data, as opposed to protecting the system itself. Legacy systems weren’t necessarily built with the Internet, remote access and external links in mind. Cloud, on the other hand, exists solely because of that global connectivity.
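
What does data-centered protection look like in practice? One common building block is encrypting records themselves rather than trusting the network boundary. The sketch below is a minimal illustration in Python, not any agency’s actual tooling; the sample record and the key handling are invented for the example.

# Minimal sketch of data-centric protection: encrypt the record itself,
# so the data stays protected even if the perimeter is breached.
# Requires the third-party 'cryptography' package. Illustrative only.
from cryptography.fernet import Fernet

# In practice the key would live in a key-management service, not in code.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"SSN=123-45-6789;status=pending"   # hypothetical sensitive record
token = cipher.encrypt(record)               # what actually sits on the network

# Only a holder of the key can recover the plaintext.
assert cipher.decrypt(token) == record
print("stored ciphertext:", token[:32], "...")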

This is the opportunity at hand, DHS’ Manfra said at the Institute for Critical Infrastructure Forum June 7: “The promise of modernization is that it also allows us to modernize our security processes.”

Michael Hermus, DHS chief technology officer, agrees. “Software-defined infrastructure really helps us in this modernization journey towards a better security posture, towards being able to adapt to changing needs,” he said. Legacy systems lack the flexibility to respond to rapidly changing threats or situations, but virtualized networks and architectures are infinitely – and almost instantly – reconfigurable. “If the infrastructure is flexible enough to meet those changing needs, you are going to be in a much better security posture.”

Jim Routh
Chief Security Officer, Aetna

Legacy IT problems aren’t unique to government. Large institutions like banks, power companies, airlines, insurers and the like also operate using what some call “Frankenstein networks,” amalgamations of legacy systems, often with modern frontends, that make change challenging. Take insurance giant Aetna, for example, where Jim Routh is the chief security officer. Although the private sector has different financial incentives and opportunities, many of the issues are similar.

“The reality is that fragile systems are the most expensive to maintain,” Routh said. “And often fragile systems aren’t even core business systems, because they don’t get a lot of attention. So a new capability that actually has security designed into it is actually much more cost-effective from an economic standpoint than keeping these legacy systems around.”

His recommendation: “Take legacy systems that are the most expensive to operate and divide them into two categories: the ones that get a lot of attention and support core business needs, and those that aren’t part of the core business. Then take the ones that aren’t part of the core business and decommission them.”

Hermus said DHS applies a decision framework across the organization to make a similar evaluation. “We’re creating a framework that can identify systems that need to be updated or modernized” based on existing best practices, such as Gartner’s TIME model, which stands for Tolerate, Invest, Migrate and Eliminate. “Having a consistent framework for evaluating your assets, that’s something we all need to consider, particularly at the large enterprise level.”
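
To make the idea concrete, here is a minimal sketch of how a TIME-style evaluation might be scripted. The quadrant names follow Gartner’s published model; the scoring fields, thresholds and sample systems are invented for illustration and are not DHS’s actual framework.

# Hedged sketch: bucket an IT portfolio into Gartner-style TIME quadrants.
# Thresholds and the sample inventory are illustrative, not real agency data.
from dataclasses import dataclass

@dataclass
class System:
    name: str
    business_value: float     # 0..1, how central to the mission
    technical_fitness: float  # 0..1, cost, fragility, security posture

def time_quadrant(s: System) -> str:
    if s.business_value >= 0.5:
        return "Invest" if s.technical_fitness >= 0.5 else "Migrate"
    return "Tolerate" if s.technical_fitness >= 0.5 else "Eliminate"

portfolio = [
    System("claims-frontend", 0.9, 0.8),
    System("legacy-billing", 0.9, 0.2),
    System("old-reporting", 0.2, 0.1),
]
for s in portfolio:
    print(f"{s.name}: {time_quadrant(s)}")
# legacy-billing -> Migrate; old-reporting -> Eliminate, echoing Routh's advice.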

Christopher Wlaschin, CISO at the Department of Health and Human Services (HHS), says his agency also applies an organizational framework to prioritize modernization needs. The agency spent $12 billion on IT last year across 11 divisions that include the Food and Drug Administration, the Centers for Medicare and Medicaid Services and the National Institutes of Health, among others.

“If the definition of a Frankenstein network is mismatched parts cobbled together over a central nervous system, I think, yeah, HHS has that,” he said. The 11 operating divisions all have unique threat and risk profiles and substantially different users and needs. The objective is to move toward shared services wherever that makes sense and, where it doesn’t, to replace proprietary systems with cloud solutions.

“The challenge is the immensity of the problem,” said Mark Sikorski, vice president of Homeland Security Solutions at systems integrator General Dynamics Information Technology. “Everything is interconnected. Modernizing one system affects all the others. Determining where your greatest vulnerabilities are – and then prioritizing them – is just the first step. The next is to carefully assess each decision so you understand the downstream impact on the rest of the enterprise ecosystem.”
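
One simple way to assess that downstream impact is to treat the enterprise as a dependency graph and enumerate everything reachable from the system being changed. The sketch below illustrates the idea with invented systems and edges; it is not GDIT’s methodology.

# Hedged sketch: estimate the downstream blast radius of modernizing one
# system by traversing a dependency graph. The systems and edges are invented.
from collections import deque

# system -> systems that depend on it (consume its data or interfaces)
dependents = {
    "mainframe-ledger": ["payroll", "grants-portal"],
    "payroll": ["hr-dashboard"],
    "grants-portal": [],
    "hr-dashboard": [],
}

def downstream_impact(start: str) -> list[str]:
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for dep in dependents.get(node, []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return sorted(seen)

print(downstream_impact("mainframe-ledger"))
# ['grants-portal', 'hr-dashboard', 'payroll'] -- everything touched by the change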

Get that wrong and the complexity can quickly snowball.

“You have to decide if you want to take the ‘Big Bang’ approach and modernize everything at once, or go very slowly, carefully untangling the knot of legacy systems without disrupting anything,” Sikorski added. “If you do that, you’ll pay a premium for maintaining an increasingly outdated environment. Like the old TV commercial says, you can pay now, or pay later. But you can’t avoid paying forever.”

Citizens Expect a Consumer-Quality Customer Experience

Citizens expect government to provide the same quality of customer experience as they receive from the best consumer businesses. The only difference: Unlike the commercial world, in many cases, they don’t have another option to turn to.

“We are influenced by the experiences we have every day,” said Michele Bartram, chief customer service officer at the U.S. Census Bureau at a recent discussion on government service. “So, we think, Google: Go one place, get the answer really quick, in and out. Security: Influenced by online banking, and the convenience of that – I can scan a check at my house and make a deposit. Amazon: I can find everything in one place and have it delivered to my home.”

So when citizens turn to government websites or agencies for help, the bar is already set – high. Government employees live in that same world, so they too know what those consumer experiences are like. But translating a consumer customer experience to a government context isn’t easy.

Technology, security, the complexity of government programs, legal language and, of course, established culture can all get in the way.

Take language, for instance. Government has a way of looking in at itself, developing its own terminology and relying on acronyms and jargon that are well understood on the inside but may not be the language of a given customer. When Matthew T. Harmon, director of Web Communications at the Department of Homeland Security, worked with the Department of Technical Communication in Mercer University’s School of Engineering on a website usability study not long ago, students there identified excessive use of unexplained acronyms as a hurdle for target visitors.

Harmon’s advice: “Implement the Plain Language Act.” The Plain Writing Act of 2010 aimed “to improve the effectiveness and accountability of Federal agencies to the public by promoting clear government communication that the public can understand and use.” It’s a simple concept, but hard to execute. By eliminating jargon, acronyms and complicated language, the act aims to make government communication more accessible to the public.

The Center for Plain Language, a non-profit dedicated to those principles in government at all levels, measures participating agencies’ performance and scores them annually. In the most recent results, from 2016, DHS received a B for plain language use – up from Ds when the annual report cards began. Overall, improvement has been dramatic: In 2013, of 20 agencies participating, one failed and nine received Ds for plain writing; only the Social Security Administration managed an A grade. But by 2016, among 18 participating agencies, eight scored an A- or better and only one scored as low as a C.

But with only 18 agencies participating, it’s possible some laggards would rather pass than risk getting a low score.

Clear language has other benefits beyond just making it easier for customers to understand. As it turns out, clarity also helps on the Internet. “It’s a by-product, but plain language also helps your search-engine optimization,” Harmon said. That means when customers search the agency’s website, they’re more likely to get the answers they’re looking for.
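
That connection between plain language and findability can even be checked automatically. As a small illustration, the sketch below scores two invented sentences with the third-party textstat package; a low Flesch Reading Ease score flags copy that needs a plain-language rewrite.

# Hedged sketch: score draft web copy for readability before publishing.
# Requires the third-party 'textstat' package; the sample text is invented.
import textstat

bureaucratic = ("Pursuant to the aforementioned NPRM, stakeholders shall "
                "effectuate submission of documentation via the portal.")
plain = "Send us your documents through the website."

for label, text in [("bureaucratic", bureaucratic), ("plain", plain)]:
    # Flesch Reading Ease: higher is easier (60+ is roughly plain English).
    score = textstat.flesch_reading_ease(text)
    print(f"{label}: {score:.0f}")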

Measure, measure

Hala Maktabi is director of measurement and performance improvement in the Veterans Experience Office within the Department of Veterans Affairs, which has struggled to improve its ability to respond to veterans’ needs. The agency collapsed hundreds of different websites into a unified vets.gov web service that “aspires to be honest, transparent, respectful and accessible to all visitors.”

VA’s customer experience efforts have extended across the enterprise, looking at form letters and web pages. In one project last year, former Military Times newspaper writers were brought in to help rewrite bureaucratic language into more understandable, veteran-friendly terms.

Maktabi said the agency is now turning a corner, developing better ways to query and measure veterans’ responses to their VA experience. “We did not have a system to understand and listen to our customers,” she said. Her enterprise measurement office was established specifically to gather that data and use it to improve. “It really measures what matters to veterans, to catch early signals and strategic insight,” Maktabi said. In the future, VA will be able to be more proactive in anticipating veterans’ needs.

David Meyer, vice president of Military Health and Veterans Affairs with General Dynamics Health Solutions, agrees. “Veteran insights and associated data are critical to delivering responsive services that can continuously adapt to the needs of veterans, families, survivors and caregivers,” he said. “Organizations are often rich in data, but find it a challenge to use that data in truly meaningful ways. The key is understanding what data to extract and what action can be taken to actually improve customers’ experiences. Business intelligence tools are available, but are only part of the solution. It really comes down to ensuring the technology is aligned to the mission, with the customer at the forefront of the discussion.”

Empathy is critical

At U.S. Citizenship and Immigration Services (USCIS), listening to customer feedback helped officials optimize the agency’s website for mobile users and provide better service in response to questions. “You’re not doing this just to be more efficient and effective,” said Mariela Melero, associate director of the Customer Service and Public Engagement Directorate at USCIS. “You have to be empathetic.”

USCIS serves a vast and varied constituency, including many who are not native English speakers. Some are already deep into the immigration or citizenship process; others are just trying to get the lay of the land. Both are important customers, she said, and each has different needs. Understanding the different customers means anticipating those different needs.

“This is where personas and customer journey mapping come in,” said Tish Falco, senior director of customer experience at General Dynamics Information Technology. “These two tools help an organization better understand the customer by seeing things from the customer’s perspective and understanding key ‘Moments that Matter’ along the different touch points – both across each business silo interaction and across the organization. Together, personas and journey maps – based on real customer insights – help build empathy and provide focus around key strategic initiatives.”

Falco said it is important to understand and define the different customer personas coming to one’s agency, and then to map these individual customer experience journeys as the customer moves from initial engagement to actually completing their objective. “Understanding the distinct needs of segments within your customer base and dependencies across the journey will help you create a more personalized, engaging and easy-to-do-business-with experience.”
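
As a toy illustration of how personas and journey maps can be captured as working artifacts, the sketch below models one invented persona and flags its “Moments that Matter.” The persona, stages and channels are hypothetical, not USCIS data.

# Hedged sketch: represent a persona and its journey map as data, then
# surface the "Moments that Matter." All values here are invented.
from dataclasses import dataclass, field

@dataclass
class Touchpoint:
    stage: str
    channel: str
    moment_that_matters: bool = False

@dataclass
class Persona:
    name: str
    need: str
    journey: list[Touchpoint] = field(default_factory=list)

applicant = Persona(
    name="First-time applicant",
    need="understand eligibility in plain language",
    journey=[
        Touchpoint("discover", "web search"),
        Touchpoint("apply", "online form", moment_that_matters=True),
        Touchpoint("wait", "status check call", moment_that_matters=True),
    ],
)

key_moments = [t.stage for t in applicant.journey if t.moment_that_matters]
print(f"{applicant.name}: focus on {key_moments}")  # ['apply', 'wait']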

At USCIS, for example, the agency found that applicants in a waiting process often want the reassurance that comes from speaking to a person. And when they check on status, they want answers that reflect their particular circumstances. “They want an answer that’s for someone like them,” Melero said, because they know situations vary depending on many issues, such as country of origin, family status, prior history and other issues. “They tell us: Please personalize this experience for me.”

Personalizing service is something commercial industry has gotten better and better at. Online vendors remember your preferences and serve up content related to the things that interested you in the past.

Government agencies aren’t there yet. But they are making progress – by focusing on data to drive decisions. “I will back up anyone on my team who makes a change based on data – and a little common sense,” said Mark Weber, deputy assistant secretary for public affairs for Human Services at the Department of Health and Human Services. He advocates an agile approach to improvement: Make a change, measure its effect, then see what can be done to improve further.
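
A bare-bones version of that change-measure-improve loop might look like the sketch below. The metric (task completion rate), the counts and the decision threshold are all invented for illustration.

# Hedged sketch of Weber's change-measure-improve loop: compare task
# completion rates before and after a site change. Numbers are invented.

def completion_rate(completed: int, attempts: int) -> float:
    return completed / attempts

before = completion_rate(412, 1000)   # baseline week
after = completion_rate(468, 1000)    # week after the change

lift = after - before
print(f"before={before:.1%} after={after:.1%} lift={lift:+.1%}")

# Keep the change only if the measured lift clears a practical threshold;
# a real evaluation would also test for statistical significance.
MIN_LIFT = 0.02
print("keep change" if lift >= MIN_LIFT else "roll back and retry")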

To draw in more perspectives, he established an Engagement Team that meets regularly to talk about these issues and to build wider understanding across functional lines within the agency. He said he’s constantly inviting new people to join. “This is where we talk about everything,” Weber explained. “It’s not a decision point. But it’s a connecting point. And that’s important, too.”

How New Cyber Executive Order Could Change Federal IT

President Donald Trump (File Photo)

President Trump’s long-awaited cyber executive order defines cybersecurity as “an executive branch enterprise” and seeks a single consolidated federal network architecture with centralized security and control. The order also advocates for adopting other government-wide shared services, such as email and other cloud-based services.

A single network infrastructure would simplify security, standardizing defenses and minimizing the number of access points to the open internet.

The Defense Department and Intelligence Community, which already have their own enterprise-wide networks, would either become self-contained components within a federated structure or remain as stand-alone networks. The order gives Secretary of Defense James Mattis and Director of National Intelligence (DNI) Dan Coats until Oct. 8 to justify “any deviation from the requirements.”

Tom Bossert, White House Homeland Security Advisor

Either way, says White House Homeland Security Advisor Tom Bossert, “we view our federal IT as one enterprise network.”

Tying overall network security to system modernization and the adoption of more shared services opens the door to a far less federated, more centralized approach to federal information technology. It follows similar efforts in the national security space, such as the Pentagon’s pursuit of a Joint Information Environment (JIE) and the DNI’s Intelligence Community Information Technology Enterprise (IC-ITE).

Bossert said such a move is imperative. “If we don’t move to commonality and shared services, we have 190 agencies that are all trying to develop their own defenses against advanced protection and collection efforts,” he said at a press briefing on the executive order. “I don’t think that’s a wise approach.”

Responsibility for overseeing that move rests with the American Technology Council, a new entity chaired by the president himself that includes the vice president; the secretaries of Commerce, Defense and Homeland Security; the DNI; the director of the Office of Management and Budget (OMB); the director of the Office of Science and Technology Policy; the U.S. Chief Technology Officer; the heads of the General Services Administration and the U.S. Digital Service; and a few others.

The idea of a unified federal civilian network gained momentum over the past year and was a central conclusion of the December 2016 report of President Obama’s Commission on Enhancing National Cybersecurity. House Homeland Security Committee Chairman Rep. Mike McCaul also favored the idea during the presidential transition.

Acting Federal Chief Information Security Officer Grant Schneider, who has served in OMB under both administrations, told GovTechWorks in April he backed the idea, noting: “A ‘dot-gov’ environment would provide us opportunities we don’t have today,” to achieve better situational awareness, cross-agency efficiencies, common standards and technologies.

Michael Daniel, cybersecurity coordinator under President Obama and now president of the Cyber Threat Alliance, also backs the concept. “I strongly support the approach taken to Federal networks, holding agencies accountable while also encouraging the move to shared services,” he said in a statement following release of the order.

Sallie Sweeney, Principal Cyber Solutions Architect at GDIT

“Increasing the use of shared services and enterprise licenses could help wring cost savings out of agency IT budgets by shaking loose duplicative services and licenses. More importantly, it would also significantly enhance cybersecurity,” said Sallie Sweeney, principal cyber solutions architect with General Dynamics Information Technology (GDIT). “Centralizing control will help ensure upgrades and security patches are implemented immediately and that outdated technology is quickly phased out, which is not the case today.”

Case in point: The WannaCry ransomware attacks that spread around the globe May 12 knocked out hospitals, government agencies and even an automobile plant. “The WannaCry malware exploited a vulnerability in Windows XP – an outdated, unsupported operating system,” Sweeney said. “System owners chose the convenience of delaying the upgrade over the risks posed by maintaining an insecure system. Now they will pay the price, either with infected systems or by having to rush to make fixes that should not have been put off in the first place.”
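
The hygiene Sweeney describes lends itself to automation. As a hedged sketch, the script below flags hosts still running unsupported operating systems; the end-of-life list and inventory are illustrative samples, and a real program would pull from an asset-management system.

# Hedged sketch: flag assets running end-of-life operating systems.
# The EOL list and inventory are illustrative, not real agency data.
EOL_OS = {"Windows XP", "Windows Server 2003", "Windows Vista"}

inventory = [
    {"host": "clinic-ws-014", "os": "Windows XP"},
    {"host": "hq-ws-201", "os": "Windows 10"},
    {"host": "lab-srv-03", "os": "Windows Server 2003"},
]

at_risk = [a["host"] for a in inventory if a["os"] in EOL_OS]
print("unsupported, prioritize for replacement:", at_risk)
# -> ['clinic-ws-014', 'lab-srv-03']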

Aged tech was also partially responsible for the 2015 security breaches at the Office of Personnel Management, Bossert noted. “We spend a lot of time and inordinate money protecting antiquated and outdated systems,” he said. “The president has issued a preference from today forward in federal procurement of federal IT for shared services: Got to move to the cloud and try to protect ourselves, instead of fracturing our security posture.”

Emphasizing cloud as a modernization solution is not new. The federal government has nominally espoused a “cloud-first” policy for IT modernization since 2010. But in practice, cloud adoption has been relatively slow, partly because of agency caution and partly because of delays getting cloud offerings authorized through FedRAMP, the Federal Risk and Authorization Management Program.

But by emphasizing shared services as a means to help secure government networks, the order suggests that whatever reservations still remain about the security of cloud solutions are now seen as more manageable than the unseen risks of maintaining outdated technology platforms for the long term.

Just how far federal agencies will go in terms of shared services is hard to say.

Standardizing such applications across the entire government will be a monumental undertaking. Doing so even within a single agency – let alone across entire departments or even the federal enterprise – exposes all kinds of internal conflicts over budget, control and choice. Agency and department heads inevitably fear the loss of control that comes with surrendering information technology decisions to higher-level decision makers.

But the arguments in favor of enterprise contracts and expanded shared services are compelling. The Air Force cut its IT spending by 17 percent during the past two years alone, just by getting a better grasp on what it’s buying and how, according to Butch Luckie, Air Force chief of IT business analytics. Luckie told Federal News Radio that consolidating 2,200 contract actions into a single, service-wide maintenance agreement with Cisco will save the service $109 million over three years.

“Multiply that kind of savings over hundreds of categories and dozens of agencies and the potential easily stretches into the billions,” said GDIT’s Sweeney.

20 Minutes to Impress: Inside DIA’s Innovation Hub

The makeshift command center is large and bright, tiered so that those in the back can see more clearly. Overhead projectors light up the long wall in front, and additional flat-screen monitors flank the projected images. The room is large enough to accommodate as many as 50 people, but on this day only about a dozen are present, scattered about the room, looking for innovation, hoping the two presenters waiting nervously to get started will spark some new idea.

This is the Defense Intelligence Agency’s (DIA) Innovation Hub, the center of an effort intended to accelerate the pace of change and innovation in the agency. The two presenters will get just 20 minutes to show what their product can do and to earn a chance to return for a longer demo and test and – they hope – a development contract. The presentation will take less time than it takes to get through security.

DIA first launched its Innovation Office three years ago and has refined its process over time. Needs are posted on the agency’s NeedipeDIA web page, and interested parties are invited to submit brief white papers describing their potential solutions. The most promising ones earn an invitation to present at an industry day like this one.

DIA held a one-day event with seven demonstrations in December, all focused on a single need – a “User-Defined Intelligence Picture” intended to enable intelligence analysts to individually configure monitors to view multiple intelligence sources simultaneously; two of the seven have moved on to contract talks. Then, in April, the agency held its first two-day innovation event, receiving 32 white papers, of which 20 were deemed worth a further look. Three were dropped after pre-briefs, leaving 17 vendors for the event. Their technologies addressed advanced analytic support, tech identification, intelligence on weapons of mass destruction, threat tracking, discrete communications, electronic signatures, bulk translation, surveillance and counter surveillance, cyber behavior analytics and human persona understanding.

Agency partners dialed in from out of town to participate via video teleconferencing, and DIA says future demos may be opened to even more partners.

For the demonstrators, it’s nerve-racking. The brief time limit means they have no time to waste, and the pressure is on from the moment they start. The rules are strict: Visitors may not exchange business cards or talk to DIA participants except as part of their demonstration; PowerPoint presentations are forbidden; questions from the floor are expected and encouraged.

For this one instance, a reporter is allowed in to watch one of the demonstrations, a learning analytics platform that uses neural networks to break problems down into pieces, then compute answers with mind-boggling speed. But within minutes of its start, it is clear the demo is in trouble. After a brief introduction, the demonstration times out.

“This might take a few seconds,” one of the demonstrators says, laughing nervously. The room goes silent, waiting for the hang-up to resolve itself. It doesn’t. Seconds become minutes. “It worked in the parking lot,” he says. More nervous laughter, more silence. It’s painful to watch. The DIA people seem understanding and the demonstrators try to go with the flow, but it’s hard. Seeing is believing and there’s nothing to see. They fall back on describing how it works and then the questions start.

“It’s not parallel computing, is it?” asks a DIA staffer in back of the room.

“Yes it is,” he’s told. The system is designed to break down problems into discrete pieces to accelerate the number crunching.
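
That is the textbook pattern of parallel computing: decompose the problem, compute the pieces independently, recombine the results. The generic Python sketch below illustrates the pattern with the standard library; it is not the vendor’s platform, and the workload is a stand-in.

# Hedged sketch of the general idea the vendor described: split a problem
# into independent pieces and compute them in parallel. Generic example only.
from concurrent.futures import ProcessPoolExecutor

def crunch(chunk: range) -> int:
    # Stand-in for an expensive sub-problem: sum of squares over the chunk.
    return sum(i * i for i in chunk)

def main() -> None:
    pieces = [range(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]
    with ProcessPoolExecutor() as pool:
        partials = list(pool.map(crunch, pieces))  # pieces run in parallel
    print("total:", sum(partials))

if __name__ == "__main__":  # guard required for process pools on some platforms
    main()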

“I can appreciate the distributed nature,” says a second DIA observer, a woman in the front. “But for the use case of technical identification, how will this platform help me identify unknown unknowns?”

The platform supports machine learning by running algorithms better and faster than conventional computing technology, but it’s not magic. It’s only as good as the algorithm it supports. The discussion shifts to machine learning and what it takes to train the system, and that prompts another question: “How would it prevent confirmation bias?”

“That depends on the questions you’re feeding into the algorithm.” Interest in using machine learning to tackle the vexing question of unknown unknowns is growing, he says. Algorithms can be chained together to take on increasingly complex concepts. But there is potential to introduce “machine bias” – that is, the sense that the computer must be right, even though the answers are based on human inputs.
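
Chaining algorithms in this sense is routine in machine-learning toolkits. The generic scikit-learn sketch below composes a preprocessing step with a small neural network into one pipeline; the toy dataset is generated for illustration and has nothing to do with the demonstrated system.

# Hedged sketch of chaining algorithms: a scikit-learn Pipeline composes
# a scaling step with a small neural network. Toy data, not DIA's system.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

model = Pipeline([
    ("scale", StandardScaler()),                 # step 1: normalize inputs
    ("net", MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                          random_state=0)),      # step 2: learn the mapping
])
model.fit(X, y)
print(f"training accuracy: {model.score(X, y):.2f}")
# The quality of the answers still depends on the questions -- that is, on
# the labels and features a human chose, which is where bias can creep in.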

When the 20 minutes are up, the session ends. The presenters say thank you and depart for the coffee room, disappointed but confident that it was worth the effort. “These things are forcing functions. Without a reason to present, we might keep researching and developing forever. We learned something here.”

DIA officials say they’ll try to arrange another demonstration, perhaps at the vendor’s offices instead. But it’s hard to escape the disappointment over the demo.

‘Changing Culture’

Innovation is hard. Even showing off a technology can be hard. In a classified environment, insulated from the outside world, it’s even harder. There can be a disconnect between the real-world, hands-on problems of the analysts and the commercial technology providers who may not have a full grasp of what’s needed inside the building. At the same time, the experts inside agencies do not necessarily know what’s technologically possible on the outside.

Robert Dixon, Jr.
Special Advisor for Innovation at DIA

The Innovation Office was created to help bridge that gap, and three years into its development, Robert Dixon, Jr., special advisor for programs and transition in DIA’s Innovation Office, says the agency has matured and improved its formula. “You have to transform the culture and build partnerships across the enterprise to make this work,” Dixon says. “And you have to pay attention to what works and what doesn’t and keep evolving, too.”

Connecting with commercial industry is a particular challenge for experts in the sheltered intelligence world. The “NeedipeDIA” requirements page aims to help, providing an open invitation to vendors to share their ideas, Dixon says. Once shared, the Innovation Office can help vet those concepts with internal customers and facilitate interaction with industry days and technology demonstrations. The most promising ideas can be acquired for testing on an isolated replica network – a lower bar than going through the testing and review necessary to take a new piece of gear and hook it up to the classified intelligence network.

In an organization where procurement is often measured in years, this process is designed to go from concept to pilot in just months, Dixon says.

DIA is also trying to share with its partners, working closely with U.S. Central Command, U.S. Pacific Command and the Pentagon’s Defense Innovation Unit Experimental (DIUx), and in the future with other intelligence agencies and possibly select foreign partners.

“The beauty is it’s unclassified,” Dixon says. “We can share with our partners and let them pilot solutions with us.”

Showing the Money

One big change is financial. DIA’s early innovation efforts ran into funding difficulties because mission partners were unable to come up with the needed research and development funding, Dixon says. The Innovation Office identified worthy ideas, but then mission owners couldn’t deliver funding to support them. General Dynamics Information Technology (GDIT) was among those early winners, presenting a concept tying facial recognition to social media for intelligence operators. DIA operators liked it, but no funding ever emerged.

The problem: DIA needed a mission owner to step up with research and development funding to advance such projects, but the mission owners didn’t have the funds or flexibility to make that happen. The Innovation Office didn’t either.

Now the Innovation Office is looking at a different funding framework, one that can leverage operations and maintenance (O&M) funds, as well as R&D. O&M funds are easier to come by. “That will give us more flexibility,” he says.

Each demo and each pilot is an opportunity to learn as much as possible as quickly as possible, Dixon says. “We want to learn in one pilot what we can, then leverage that knowledge with a second pilot,” Dixon says. The idea is to prove the concept fast and with a small investment; if it fails, that’s OK, because little time and money were lost. But if it shows promise, then the process can continue. When a mission partner sees promise and champions the solution, the concept can evolve into a program of record.

The Innovation Office, meanwhile, will measure its success one effort at a time, measuring its effectiveness each step of the way. “How relevant was the pilot to the need? What came out of this that we maybe didn’t anticipate from the beginning?” Dixon says, posing the questions that emerge after each pilot. “What type of efficiencies could we achieve, in terms of cost savings or time?”

The office has the full backing of Marine Lt. Gen. Vincent Stewart, DIA’s director, Dixon says. “He’s very committed. He told us he wanted the IHub set up within 30 days and he’s given us the resources to build a better one. This is a priority.” By June, he said, a new innovation hub will be in place.

Dixon says NeedipeDIA has helped DIA engage over 120 private sector innovators, with more than 20 percent of those earning a chance to present their ideas in person. The exchanges can yield follow-up engagements and broader discussions, which in turn can develop into pilots or prototype programs. In each case, Dixon says, the agency tries to answer fundamental questions: “Can we scale this across the enterprise? Can we sustain it?”

Although most participants have been small companies, large prime contractors have also participated. Indeed, Dixon acknowledges that traditional contractors’ shared history with the intelligence and defense communities gives them an edge in some cases, because they have an innate understanding of existing capabilities and potential needs. Small startups may have valuable capabilities, but may have neither an understanding of how their products would fit into intelligence requirements nor the patience for the pace of government contracting – even when it’s accelerated, as the Innovation Office intends.

Says Dixon: “It’s not the size of the business that’s important. It’s the idea. We’re casting the net wide, to anyone who has the solutions that can help solve our problems. It could be someone working out of a garage someplace or it could be a large industry partner.”

Federal Civilian Agencies Edge Toward Single, Shared Network

Momentum is picking up for establishing a single, secure network to protect federal civilian agencies.

The idea was advanced in the Commission on Enhancing National Cybersecurity report late last year and in January recommendations by House Homeland Security Committee Chairman Rep. Mike McCaul. It also has the backing of acting Federal Chief Information Security Officer Grant Schneider, former White House Cybersecurity Coordinator Michael Daniel and former National Security Agency Chief Keith Alexander.

A unified network would theoretically be easier to secure and monitor and less expensive because it could leverage the full scale of the government. Schneider estimates it would be comparable in size to the Department of Defense’s (DoD) networks already managed as a central service by the Defense Information Systems Agency (DISA).

Grant Schneider
Acting Federal Chief Information Security Officer

While data traffic moving from one defense agency to another never leaves the DoD environment, that’s not true in the civilian portion of the government. “Every time someone sends something from the White House to the Department of Commerce, it goes out in the wild,” Schneider says. “That is a risk.”

Centralizing networks and their management would solve that problem. “If we could move to some sort of a .gov environment, it would provide us opportunities we don’t have today,” he said. That includes improved situational awareness about the health of the overall network and common standards and architectures that would be easier to defend.

Some critics say that means they’d also be easier to attack, but Schneider says the balance of advantages is squarely on the side of centralized control. “A common architecture would enable us to apply more seamlessly some of the tools that are used across the .mil side [to defend] the federal civilian side,” he adds. “Today, that often proves to be more of a challenge than we would like.”

Daniel, who served five years managing White House cyber policy for the National Security Council, agrees, saying the idea of a unified network gained traction in the past year or so as competition for cyber talent and other resources increased and agencies struggled to keep up.

Michael Daniel
Former White House Cybersecurity Coordinator

“We’ve taken the approach on the federal civilian side that it’s every agency for itself, every bureau for itself,” says Daniel, who now heads the non-profit Cyber Threat Alliance, a clearinghouse for cyber threat data. “The result is it’s very difficult for those agencies to find the resources to manage and support their IT and their cybersecurity. We should start thinking about this differently: Agencies should retain accountability for their information assets, the information they need to pursue their mission. But that accountability does not mean they need to have their agency perform all of the different tasks required to do IT and cybersecurity from the top to the bottom of the stack.”

Instead, he argues, the network and transport layer of the IT stack ought to be managed centrally, providing a common architecture and security standard.

“Then the agencies would ride on top of that, and they would still be responsible for managing the application layer and the cybersecurity of the applications,” Daniel says. “This would enable us to dramatically improve IT network management and cybersecurity in the federal civilian sector.”

Daniel envisions the General Services Administration (GSA) taking on the network management role, possibly through a series of competitive managed services contracts, and the Department of Homeland Security (DHS) managing cybersecurity across the entire federal civilian enterprise.

“For the agencies, this means they can focus resources on the stuff they really care about, which is the mission applications their user base operates every day,” Daniel says. “You can get out of the business of having to manage the commodity network and transport layer.”

Though some might worry that in this setup, GSA would become a monopoly supplier, charging fees to agencies without competition, Daniel doesn’t buy it. “The only way for the federal government to get the kind of economies of scale that we know that we could get, would be to do a lot of that purchasing centrally.”

GSA would have to be transparent in how that was managed and charged. Agencies would need the ability to make tradeoffs based on specific mission requirements, such as system availability, guaranteed up-time or high-speed recovery in case of a failure.

Rep. Mike McCaul (R-Texas), chairman of the Homeland Security Committee, plans legislation this year that would create a cyber agency within DHS. And at least one draft of the Trump administration’s much anticipated executive order on cybersecurity calls for wider use of shared services across federal civilian agencies. McCaul co-authored “From Awareness to Action: A Cybersecurity Agenda for the 45th President” in January. That report said “cybersecurity at DHS needs to be an operational component agency like the Coast Guard or U.S. Customs and Border Protection” and suggested there may be no greater homeland security mission than securing cyberspace. It too recommended creating a National Cybersecurity Agency inside DHS.

Similarly, the Commission on Enhancing National Cybersecurity, led by former National Security Advisor Tom Donilon and completed in December 2016, also recommended a single federal civilian network. “The Administration should establish a program to consolidate all civilian agencies’ network connections (as well as those of appropriate government contractors) into a single consolidated network,” it recommends. “The new agency should develop and implement a program to provide secure, reliable network services to all civilian government agencies, thereby providing a consolidated network for all .gov entities.”

Defining what that network encompasses will be challenging. With data centers closing and many agencies moving data rapidly into commercial cloud infrastructure, defining a network perimeter is no longer cut and dried.

But limiting the points of exposure would help reduce risk, Schneider argues. “We have 56 Trusted Internet Connections [across the civilian sector] that we’re trying to protect and that we’re spending a lot of money on,” he says. Might those be reduced? “The Department of Defense has 11 somewhat equivalent connections,” he adds.

Stan Tyliszczak
Vice President for Technology Integration and Chief Engineer, GDIT

Systems integration contractors that provide the services to manage and secure government networks agree. “Having so many different networks and network owners absolutely adds to the security challenges,” says Stan Tyliszczak, vice president for technology integration and chief engineer at General Dynamics Information Technology. “As providers of managed services for these networks, we’re constantly juggling multiple different technologies, products, policies and compliance. Instead of automating as much as we can and building deep analytical capability, we end up spread a mile wide and an inch deep, trying to protect against an increasingly sophisticated threat.”

Tyliszczak says it’s easy to get overly focused on the idea of a physical perimeter when the real focus should be on how each layer in the system stack is secured, from the network and operating system through the application layer. The Defense Department and experienced system integrators already have proven this can be done, he and others say.

Consolidating the myriad federal civilian agency networks into a single network architecture will take years to execute. Funding alone will be a challenge, and as Daniel says, it is funding – not technology – that drives policy implementation. Case in point: While the Defense Department long ago established centralized oversight of networks, efforts to drive toward a Joint Information Environment (JIE) with standardized technologies across all the military services have moved slowly without centralized funding. JIE is not a program of record, but a concept over numerous programs, which is one reason JIE standards remain a work in progress.

Likewise, the mammoth Intelligence Community Information Technology Enterprise (IC-ITE) effort to standardize systems across the IC has also struggled. Two years into a billion-dollar contract, the massive rollout of its second-generation virtual Desktop Enterprise (DTE) computer systems – meant to standardize IT services across the IC – continues to slip further behind schedule. Deployment of its Phase 2, originally set for last year, won’t start until this summer at the earliest.

“The problem with large-scale initiatives is underestimating their complexity,” says GDIT’s Tyliszczak. “At the scale of a JIE or IC-ITE – or a new .gov network – the complexity is in the size. So you have to keep the architecture as simple as possible. The more complex the system, the greater the risks in scaling it up,” he adds. “You want to stick with proven capability so you can focus on the scaling issues.”

This is true with networks, but also other technologies. The advantage of FedRAMP – the Federal Risk and Authorization Management Program created to accelerate cloud adoption across the federal government – is that it provides some assurance that the initial security requirements are set. That lets agency managers focus on their specific applications, saving time and money.

“When you start with a FedRAMP-certified cloud, you know the security basics have already been taken care of,” Tyliszczak says. “You can focus on scaling the solution instead of certifying the infrastructure.”

Federal civilian networks differ from defense networks in that they are not riding on private, dedicated fiber. In defense networks, DISA owns the fibers, Tyliszczak says. But civilian agencies’ networks ride on the same fibers as commercial traffic, using privately owned lines belonging to Verizon, AT&T and just a few others to move information from place to place. So the controls are all built on top of that, in the software and hardware that runs on the network, as opposed to the network infrastructure itself.
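
Building controls in software on top of shared transport most often means encryption at the session layer. The sketch below uses Python’s standard ssl module to wrap a connection in TLS, so the commercial fiber underneath carries only ciphertext; the host name and request are placeholders for illustration.

# Hedged sketch: controls built in software atop shared transport.
# A TLS-wrapped socket keeps traffic confidential on commercial fiber.
# The host and request below are placeholders, not a real endpoint.
import socket
import ssl

context = ssl.create_default_context()  # verifies the server certificate

with socket.create_connection(("example.gov", 443), timeout=10) as raw:
    with context.wrap_socket(raw, server_hostname="example.gov") as tls:
        print("negotiated:", tls.version())  # e.g. 'TLSv1.3'
        tls.sendall(b"GET / HTTP/1.1\r\nHost: example.gov\r\n\r\n")
        print(tls.recv(200))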

“There’s already quite a bit of government data in the cloud,” Schneider says. “Working out [where the perimeter might lie] would take time.” But he said it makes sense to establish security guidelines and to put one agency in charge of that for the whole civilian sector of the government.

“The old notion of a perimeter is not really applicable anymore,” Tyliszczak says. “You’re defending network access points and defending your data inside the network. It’s a lot easier to defend fewer access points if you can get to that point.”

Agencies have shown an increasing willingness to outsource services they don’t see as essential to their mission. More and more chief information officers see advantages in getting out of the data center business and out of managing networks, which they see as commodity services, to focus on the unique systems and technology that drive mission effectiveness.

That approach seems to square with the new administration’s focus on efficiency. “Certainly the administration believes hey, if the DoD can do this, why don’t we do more things similarly?” Schneider says. “That was my take coming out of DoD.”
