The ABCs of 2018 Federal IT Modernization: A to H

The White House issued its IT modernization plan last December and followed it with an ambitious program that could become a proving ground for rapidly overhauling IT infrastructure, data access and customer service. After years of talking about IT modernization, cybersecurity and migration to the cloud, federal agencies are now poised to ramp up the action.

Here, in alphabetical form, is what you need to know, from A to H:

A is for Agriculture
The U.S. Department of Agriculture (USDA) will be a sort of proving ground for implementing the Trump administration’s vision for the future high-tech, high-performance, customer-satisfying government. USDA announced in December 2017 it will collapse 39 data centers into one (plus a backup), and consolidate 22 independent chief information officers under a single CIO with seven deputies. The aim: reinvent the agency as a modern, customer-centered organization and provide its leaders with instant access to a wealth of agency data.

B is for Better Citizen Services
“It is imperative for the federal government to leverage … innovations to provide better service for its citizens in the most cost-effective and secure manner,” the report states – in just its third sentence. Yes, modernization should ultimately save money by reducing the billions spent to keep aging systems operational. And yes, it should help overcome the patchwork of cybersecurity point solutions now used to protect federal networks, systems and data.

USDA Secretary Sonny Perdue’s experience modernizing government IT during two terms as governor of Georgia, from 2003 to 2011, convinced him he could achieve similar results on the federal level. “He really saw, in reinventing Georgia government, how IT modernization and delivering better customer service benefitted not only employees, but the people of the state,” Deputy Secretary of Agriculture Steve Censky said in a TV interview.

Among the agency’s goals: Increase access to information throughout the agency by means of online service portals and advanced application program interfaces.

C is for Centers of Excellence
USDA won’t be going it alone. Under the direction of the Office of Science and Technology Policy, the agency will be the first to engage with a new set of experts at the General Services Administration (GSA). GSA is on an accelerated course to create five Centers of Excellence, leveraging both public and private sector expertise to develop best practices and standards that agencies can use for:

  • Cloud adoption
  • IT infrastructure optimization
  • Customer experience
  • Service delivery analytics
  • Contact centers

Jack Wilmer, White House senior advisor for Cybersecurity and IT Modernization, says the idea is to provide each agency’s modernization effort with the same core concepts and approach – and the best available experts. “We’re trying to leverage private sector expertise, bringing them in a centralized fashion, making them available to government agencies as they modernize,” he told Government Matters.

GSA had planned to award contracts to industry partners by the end of January – just 45 days after its initial solicitation – but as of March 5, no contracts had been awarded. Phase 1 contracts for assessment, planning and some initial activities should be finalized soon. Phase 2 awards for cloud migration, infrastructure optimization and customer experience are expected by the end of the year, Joanne Collins Smee, acting director of GSA’s Technology Transformation Service and deputy commissioner of the Federal Acquisition Service, said at a March 1 AFCEA event in Washington, D.C.

D is for Data Centers
Not every data center will close, but many more will soon disappear. Modernization is about getting the government out of the business of managing big infrastructure investments and instead leveraging commercial cloud infrastructure and technology wherever possible. But don’t think your agency’s data won’t be in a data center somewhere.

“What is the cloud, anyway? Isn’t it really someone else’s data center, available on demand?” says Stan Tyliszczak, chief engineer at systems integrator General Dynamics Information Technology (GDIT). “Moving to the cloud means getting out of the business of running that data center yourself.”

The White House splits its cloud strategy into two buckets:

  • “Bring the government to the cloud.” Put government data and applications in privately-owned and operated infrastructure, where it is protected through encryption and other security technologies. This is public cloud, where government data sits side by side with private data in third-party data centers.
  • “Bring the cloud to the government.” Put government data and applications on vendor-owned infrastructure, but located in government-owned facilities, as the Intelligence Community Information Technology Enterprise (IC ITE) does with the IC’s Commercial Cloud Services (C2S) contract with Amazon Web Services.

Figuring out what makes sense, and when, depends on your use case; for most agencies, it will mean a combination of on-premises solutions, shared government services and commercial services in public clouds. “That’s the hybrid cloud model everyone’s talking about. But it’s not a trivial exercise. Melding those together is the challenge,” Tyliszczak says. “That’s what integrators are for.”

E is for Encryption
Government cybersecurity efforts have historically focused on defending the network and its perimeter, rather than the data that travels on that network. As cloud services are integrated into conventional on-premises IT solutions, securing the data itself has become essential. At least 47 percent of federal network traffic is encrypted today – frustrating agency efforts to monitor what’s crossing network perimeters.

“Rather than treating Federal networks as trusted entities to be defended at the perimeter,” the modernization report advised, “agencies should shift their focus to placing security protections closer to data.”

To do that, the government must improve the way it authenticates devices and users on its networks, securing who has access and how, and encrypting data both at rest and in transit.
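
As a concrete illustration of what that can mean in practice, here is a minimal sketch – not any agency’s actual implementation – showing symmetric encryption of data at rest with Python’s cryptography package and a hardened TLS context for data in transit. Key handling and file names are placeholders only.

```python
# Illustrative sketch only: protecting data at rest and in transit.
# Real systems would pull keys from an HSM or managed key service,
# never store them alongside the data.
import ssl
from cryptography.fernet import Fernet  # pip install cryptography

# --- Data at rest: encrypt before persisting to disk or object storage ---
key = Fernet.generate_key()              # placeholder; normally fetched from a key vault
cipher = Fernet(key)
record = b"example sensitive record"
with open("record.enc", "wb") as fh:
    fh.write(cipher.encrypt(record))     # only ciphertext is written

plaintext = cipher.decrypt(open("record.enc", "rb").read())  # authorized read path

# --- Data in transit: require certificate validation and modern TLS ---
ctx = ssl.create_default_context()               # verifies server certificates by default
ctx.minimum_version = ssl.TLSVersion.TLSv1_2     # refuse legacy protocol versions
# ctx.wrap_socket(...) would then protect any socket the application opens
```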

“Now you’re starting to obfuscate whether your sensors can actually inspect the content of that data,” notes Eric White, Cybersecurity program director at GDIT’s Health and Civilian Solutions Division. “Because it’s now encrypted, you add another layer of complexity to know for sure whether it’s the good guys or the bad guys moving data in and out of your network.”

White notes that the Department of Homeland Security (DHS) is charged with solving this encryption dilemma, balancing the millions of dollars in investment in high-end network-monitoring sensors, such as those associated with the Einstein program, against protecting individual privacy. Enabling those sensors to see through or decipher encrypted data without undermining the security of the data – or the privacy of individuals – is a critical priority. DHS has commissioned research to develop potential solutions, including virtualizing sensors for cloud environments; relocating sensors to the endpoints of encrypted tunnels; creating man-in-the-middle solutions that intercept data in motion; or providing the sensors with decryption keys.

F is for FedRAMP

The Federal Risk and Authorization Management Program (FedRAMP) remains the critical process for ensuring private-sector cloud offerings meet government security requirements. Look for updates to FedRAMP baselines that could allow tailoring of security controls for low-risk systems, address new approaches to integrating cloud services with federal Trusted Internet Connection (TIC) requirements and consider common features or capabilities that could be incorporated into FedRAMP “high” baselines for higher-risk systems.

Importantly, the report directs GSA, which manages FedRAMP, to come up with new solutions that make it easier for software-as-a-service (SaaS) products already authorized for use in one agency to be accepted for use in another. Making the process for issuing an authority to operate (ATO) faster and easier to reuse has long been a goal of both cloud providers and government customers. This is particularly critical for shared services, in which one agency provides its approved commercial solution to another agency.

G is for GSA
Already powerfully influential as a buyer and developer for other agencies, GSA stands to become even more influential as the government moves to consolidate networks and other IT services into fewer contracts and licensing agreements, and to increase the commonality of solutions across the government.

This is especially true among smaller agencies that lack the resources, scale and expertise to effectively procure and manage their own IT services.

H is for Homeland Security
DHS is responsible for the overall cybersecurity of federal civilian government systems. The only federal entity mentioned more frequently in the White House modernization report is the Office of Management and Budget, which is the White House agency responsible for implementing the report’s guidance.

DHS was mandated to issue a report by Feb. 15 identifying the common weaknesses of the government’s highest-value IT assets and recommending solutions for reducing risk and vulnerability government-wide. By May 15, the agency must produce a prioritized list of systems “for government-wide intervention” and will provide a host of advisory and support services to help secure government systems. DHS also owns and manages the National Cybersecurity Protection System (NCPS) and the EINSTEIN sensor suites that capture and analyze network flow, detect intruders and scan the data coming in and out of government systems to identify potentially malicious activity and, in the case of email, block and filter threatening content.

Look for next week’s edition of GovTechWorks for Part 2: Modernization from I to Z. In Part 2, we outline how infrastructure among government agencies will be impacted and streamlined by modernization, as well as discuss the fate of legacy systems and their maintenance budgets, and the major role the Office of Management and Budget will play in overall implementation.

Is Identity the New Perimeter? In a Zero-Trust World, More CISOs Think So

As the network perimeter morphs from physical to virtual, the old Tootsie Pop security model – hard shell on the outside with a soft and chewy center – no longer works. The new mantra, as Mittal Desai, chief information security officer (CISO) at the Federal Energy Regulatory Commission, said at the ATARC CISO Summit: “Never trust, double verify.”

The zero-trust model modernizes conventional network-based security for a hybrid cloud environment. As agencies move systems and storage into the cloud, networks are virtualized and security naturally shifts to users and data. That’s easy enough to do in small organizations, but rapidly grows harder with the scale and complexity of an enterprise.

The notion of zero-trust security first surfaced five years ago in a Forrester Research report prepared for the National Institute of Standards and Technology (NIST). “The zero-trust model is simple,” Forrester posited then. “Cybersecurity professionals must stop trusting packets as if they were people. Instead, they must eliminate the idea of a trusted network (usually the internal network) and an untrusted network (external networks). In zero-trust, all network traffic is untrusted.”

Cloud adoption by its nature is forcing the issue, said Department of Homeland Security Chief Technology Officer Mike Hermus, speaking at a recent Tech + Tequila event: “It extends the data center,” he explained. “The traditional perimeter security model is not working well for us anymore. We have to work toward a model where we don’t trust something just because it’s within our boundary. We have to have strong authentication, strong access control – and strong encryption of data across the entire application life cycle.”

Indeed, as other network security features mature, identity – and the access that goes with it – is now the most common cybersecurity attack vector. Hackers favor phishing and spear-phishing attacks because they’re inexpensive and effective – and the passwords they yield are like the digital keys to an enterprise.

About 65 percent of breaches cited in Verizon’s 2017 Data Breach Investigations Report made use of stolen credentials.

Interestingly, however, identity and access management represent only a small fraction of cybersecurity investment – less than 5 percent – according to Gartner’s market analysts. Network security equipment, by contrast, constitutes more than 12 percent. Enterprises continue to invest in the Tootsie Pop model even as its weaknesses become more evident.

“The future state of commercial cloud computing makes identity and role-based access paramount,” said Rob Carey, vice president for cybersecurity and cloud solutions within the Global Solutions division at General Dynamics Information Technology (GDIT). Carey recommends creating both a framework for better understanding the value of identity management tools, and metrics to measure that impact. “Knowing who is on the network with a high degree of certainty has tremendous value.”

Tom Kemp, chief executive officer at Centrify, which provides cloud-based identity services, has a vested interest in changing that mix. Centrify, based in Sunnyvale, Calif., combines identity data with location and other information to help ensure only authorized, verified users access sensitive information.

“At the heart of zero-trust is the realization that an internal user should be treated just like an external user, because your internal network is just as polluted as your outside network,” Kemp said at the Feb. 7 Institute for Critical Infrastructure Technology (ICIT) Winter Summit. “You need to move to constant verification.” Reprising former President Ronald Reagan’s “trust but verify” mantra, he adds: “Now it’s no trust and always verify. That’s the heart of zero-trust.”

The Google Experience
When Google found itself hacked in 2009, the company launched an internal project to find a better way to keep hackers out of its systems. Instead of beefing up firewalls and tightening virtual private network settings, Google’s BeyondCorp architecture dispensed with the Tootsie Pop model in which users logged in and then gained access to all manner of systems and services.

In its place, Google chose to implement a zero-trust model that challenges every user and every device on every data call – regardless of how that user accessed the internet in the first place.

While that flies in the face of conventional wisdom, Google reasoned that by tightly controlling the device and user permissions to access data, it had found a safer path.

Here’s an example of how that works when an engineer with a corporate-issued laptop wants to access an application from a public Wi-Fi connection:

  1. The laptop provides its device certificate to an access proxy.
  2. The access proxy confirms the device, then redirects to a single-sign-on (SSO) system to verify the user.
  3. The engineer provides primary and second-factor authentication credentials and, once authenticated by the SSO system, is issued a token.
  4. Now, with the device certificate to identify the device and the SSO token to identify the user, an Access Control Engine can perform a specific authorization check for every data access. The user must be confirmed to be in the engineering group; to possess a sufficient trust level; and to be using a managed device in good standing with a sufficient trust level.
  5. If all checks pass, the request is passed to an appropriate back-end system and the data access is allowed. If any of the checks fail however, the request is denied. This is repeated every time the engineer tries to access a data item.
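
Here is a minimal sketch of how the five checks above might be expressed in code. The class and field names are hypothetical, and certificate and token validation are stubbed to booleans; a real access proxy would verify signatures, expiry and revocation on every call.

```python
# Hypothetical condensation of a BeyondCorp-style decision: every data call is
# authorized from scratch, with no notion of a trusted internal network.
from dataclasses import dataclass

@dataclass
class Device:
    cert_valid: bool      # device certificate checked by the access proxy
    managed: bool         # corporate-managed and in good standing
    trust_level: int      # e.g. 0-4, derived from patch state, disk encryption, etc.

@dataclass
class User:
    sso_token_valid: bool # issued after primary + second-factor authentication
    groups: set
    trust_level: int

def authorize(device: Device, user: User, required_group: str, min_trust: int) -> bool:
    """Re-evaluated on every request; any failed check denies access."""
    return (device.cert_valid and device.managed
            and user.sso_token_valid
            and required_group in user.groups
            and device.trust_level >= min_trust
            and user.trust_level >= min_trust)

# Example: an engineer on public Wi-Fi requesting an engineering application
laptop = Device(cert_valid=True, managed=True, trust_level=3)
engineer = User(sso_token_valid=True, groups={"engineering"}, trust_level=3)
print(authorize(laptop, engineer, required_group="engineering", min_trust=2))  # True
```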

“That’s easy enough when those attributes are simple and clear cut, as with the notional Google engineer,” said GDIT’s Carey, who spent three decades managing defense information systems. “But it gets complicated in a hurry if you’re talking about an enterprise on the scale of the Defense Department or Intelligence community.”

Segmenting the Sprawling Enterprise
A takeaway from 9/11 was that intelligence agencies needed to be better and faster at sharing threat data across agency boundaries. Opening databases across agency divisions, however, had consequences: Chelsea Manning, at the time Pfc. Bradley Manning, delivered a treasure trove of stolen files to WikiLeaks and then a few years later, Edward Snowden stole countless intelligence documents, exposing a program designed to collect metadata from domestic phone and email records.

“The more you want to be sure each user is authorized to see and access only the specific data they have a ‘need-to-know,’ the more granular the identity and access management schema need to be,” Carey said. “Implementing role-based access is complicated because you’ve got to develop ways to both tag data and code users based on their authorized need. Absent a management schema, that can quickly become difficult to manage for all but the smallest applications.”

Consider a scenario of a deployed military command working in a multinational coalition with multiple intelligence agencies represented in the command’s intelligence cell. The unit commands air and ground units from all military services, as well as civilians from defense, intelligence and possibly other agencies. Factors determining individual access to data might include the person’s job, rank, nationality, location and security clearance. Some missions might include geographic location, but others can’t rely on that factor because some members of the task force are located thousands of miles away, or operating from covert locations.
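
In code, that kind of attribute-based decision might look like the sketch below, which matches hypothetical data tags against hypothetical user attributes. The fields and values are invented for illustration; an operational schema would be far larger and centrally governed.

```python
# Illustrative attribute-based access check for a coalition intelligence cell.
CLEARANCE_ORDER = ["UNCLASSIFIED", "SECRET", "TOP SECRET"]

def can_view(user: dict, data_tags: dict) -> bool:
    """Grant access only if clearance, releasability and mission all line up."""
    return (CLEARANCE_ORDER.index(user["clearance"])
                >= CLEARANCE_ORDER.index(data_tags["classification"])
            and user["nationality"] in data_tags["releasable_to"]
            and data_tags["mission"] in user["missions"])

analyst = {"clearance": "SECRET", "nationality": "GBR", "missions": {"TF-NORTH"}}
report = {"classification": "SECRET",
          "releasable_to": {"USA", "GBR", "AUS"},
          "mission": "TF-NORTH"}
print(can_view(analyst, report))  # True; change the nationality and it flips to False
```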

That scenario gets even more complicated in a hybrid cloud environment where some systems are located on premises and others are far away. Managing identity-based access gets harder anyplace where distance or bandwidth limitations cause delays. Other integration challenges include implementing a single-sign-on solution across multiple clouds, or sharing data by means of an API.

Roles and Attributes
To organize access across an enterprise – whether in a small agency or a vast multi-agency system such as the Intelligence Community Information Technology Enterprise (IC ITE) – information managers must make choices. Access controls can be based on individual roles – such as job level, function and organization – or data attributes – such as type, source, classification level and so on.

“Ultimately, these are two sides of the same coin,” Carey said. “The real challenge is the mechanics of developing the necessary schema to a level of granularity that you can manage, and then building the appropriate tools to implement it.”

For example, the Defense Department intends to use role-based access controls for its Joint Information Environment (JIE), using the central Defense Manpower Data Center (DMDC) personnel database to connect names with jobs. The available fields in that database are, in effect, the limiting factors on just how granular role-based access controls will be under JIE.

Access controls will only be one piece of JIE’s enterprise security architecture. Other features, ranging from encryption to procedural controls that touch everything from the supply chain to system security settings, will also contribute to overall security.

Skeptical of Everything
Trust – or the lack of it – plays out in each of these areas, and requires healthy skepticism at every step. Rod Turk, CISO at the Department of Commerce, said CISOs need to be skeptical of everything. “I’m talking about personnel, I’m talking about relationships with your services providers,” he told the ATARC CISO Summit.  “We look at the companies we do business with and we look at devices, and we run them through the supply chain.  And I will tell you, we have found things that made my hair curl.”

Commerce’s big push right now is the Decennial Census, which will collect volumes of personal information (PI) and personally identifiable information (PII) on almost every living person in the United States. Conducting a census every decade is like doing a major system reset each time. The next census will be no different, employing mobile devices for census takers and for the first time, allowing individuals to fill out census surveys online. Skepticism is essential because the accuracy of the data depends on the public’s trust in the census.

In a sense, that’s the riddle of the whole zero-trust concept: In order to achieve a highly trusted outcome, CISOs have to start with no trust at all.

Yet trust also cuts in the other direction. Today’s emphasis on modernization and migration to the cloud means agencies face tough choices. “Do we in the federal government trust industry to have our best interests in mind to keep our data in the cloud secure?” Turk asked rhetorically.

In theory, the Federal Risk and Authorization Management Program (FedRAMP) establishes baseline requirements for trust, but doubts persist. What satisfies one agency’s requirements may not satisfy another. Compliance with FedRAMP or NIST controls equates to risk management rather than actual security, GDIT’s Carey points out. They’re not the same thing.

Identity and Security
Beau Houser, CISO at the Small Business Administration, is more optimistic, encouraged by improvements he’s seen as compartmentalized legacy IT systems are replaced with centralized, enterprise solutions in a Microsoft cloud.

“As we move to cloud, as we roll out Windows 10, Office 365 and Azure, we’re getting all this rich visibility of everything that’s happening in the environment,” he said. “We can now see all logins on every web app, whether that’s email or OneDrive or what have you, right on the dashboard. And part of that view is what’s happening over that session: What are they doing with email, where are they moving files.… That’s visibility we didn’t have before.”

Leveraging that visibility effectively extends that notion of zero-trust one step further, or at least shifts it into the realm of a watchful parent rather than one who blindly trusts his teenage children. The watchful parent believes trust is not a right, but an earned privilege.

“Increased visibility means agencies can add behavioral models to their security controls,” Carey said. “Behavioral analysis tools that can match behavior to what people’s roles are supposed to be and trigger warnings if people deviate from expected norms, is the next big hurdle in security.”
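
A toy version of that idea is sketched below: compare a user’s activity today against their own historical baseline and flag large deviations. Real behavioral analytics weigh many signals (time of day, geography, data volume, peer group), not a single count; the threshold and data here are invented.

```python
# Simplified behavioral-baseline check: flag activity far outside a user's norm.
from statistics import mean, stdev

def is_anomalous(history: list, today: int, threshold: float = 3.0) -> bool:
    """True if today's count exceeds the baseline by more than `threshold` std devs."""
    if len(history) < 2:
        return False                      # not enough history to establish a norm
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return (today - mu) / sigma > threshold

files_downloaded_per_day = [4, 6, 5, 7, 3, 5, 6, 4, 5, 6]   # typical behavior
print(is_anomalous(files_downloaded_per_day, today=5))      # False: within the norm
print(is_anomalous(files_downloaded_per_day, today=250))    # True: trigger a warning
```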

As Christopher Wlaschin, CISO at the Department of Health and Human Services, says: “A healthy distrust is a good thing.”

Unpleasant Design Could Encourage Better Cyber Hygiene

Recent revelations that service members and intelligence professionals are inadvertently giving up their locations and fitness patterns via mobile apps caught federal agencies by surprise.

The surprise wasn’t that Fitbits, smartphones or workout apps try to collect information, nor that some users ignore policies reminding them to watch their privacy and location settings. The real surprise is that many IT policies aren’t doing more to help stop such inadvertent fitness data leaks.

If even fitness-conscious military and intelligence personnel are unknowingly trading security and privacy for convenience, how can IT security managers increase security awareness and compliance?

One answer: Unpleasant design.

Unpleasant design is a proven technique for using design to discourage unwanted behavior. Ever get stuck in an airport and long for a place to lie down — only to find every bench or row of seats is fitted with armrests? That’s no accident. Airports and train terminals don’t want people sleeping across benches. Or consider the decorative metalwork sometimes placed on urban windowsills or planter walls — designed expressly to keep loiterers from sitting down. It’s the same with harsh lights in suburban parking lots, which discourage people from hanging out and make it harder for criminals to lurk in the shadows.

As federal agencies and their IT security leaders investigate these inadvertent disclosures, can they employ those same concepts to encourage better cyber behavior?

Here’s how unpleasant design might apply to federally furnished Wi-Fi networks: Rather than allow access with only a password, users instead might be required to have their Internet of Things (IoT) devices pass a security screening that requires certain security settings. That screening could include ensuring location services are disabled while such devices are connected to government-provided networks.

Employees would then have to choose between the convenience of free Wi-Fi for personal devices and the risks of inadequate operations security (OPSEC) via insecure device settings.

This of course, only works where users have access to such networks. At facilities where personal devices must be deposited in lockers or left in cars, it won’t make a difference. But for users working (and living) on installations where personnel routinely access Wi-Fi networks, this could be highly effective.

Screening – and even blocking – certain apps or domains could be managed through a cloud access security broker, network security management software that can enforce locally set rules governing apps actively using location data or posing other security risks. Network managers could whitelist acceptable apps and settings, while blocking those deemed unacceptable. If agencies already do that for their wired networks, why not for wireless?
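
The rule logic such a broker might enforce can be quite simple, as in the hypothetical sketch below; the app names and behavior labels are invented, and a real CASB policy engine would draw on vendor catalogs rather than hard-coded sets.

```python
# Hypothetical Wi-Fi admission policy: whitelist approved apps and block any
# app that declares a behavior the agency has deemed unacceptable.
ALLOWED_APPS = {"mail", "calendar", "secure-chat"}
BLOCKED_BEHAVIORS = {"location_tracking", "public_activity_feed"}

def admit_to_network(app_name: str, declared_behaviors: set) -> bool:
    """Deny access for unapproved apps or blocked behaviors; allow everything else."""
    if app_name not in ALLOWED_APPS:
        return False
    return not (declared_behaviors & BLOCKED_BEHAVIORS)

print(admit_to_network("fitness-tracker", {"location_tracking"}))  # False: not approved
print(admit_to_network("mail", {"push_notifications"}))            # True
```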

Inconvenient? Absolutely. That’s the point.

IT security staffs are constantly navigating the optimal balance between security and convenience. Perfect security is achievable only when nothing is connected to anything else. Each new connection and additional convenience introduces another dent in the network’s armor.

Employing cloud-access security as a condition of Wi-Fi network access will impinge on some conveniences. In most cases, truly determined users can work around those rules by using local cellular data access instead. In many parts of the world, however – often the places where the need for OPSEC is greatest – that access comes with a direct cash cost. When users pay for data by the megabyte, they’re much more likely to give up some convenience, check security and privacy settings, and limit their data consumption.

This too, is unpleasant design at work. Cellular network owners must balance network capacity with use. Lower-capacity networks control demand by raising prices, knowing that higher priced data discourages unbridled consumption.

Training and awareness will always be the most important factors in securing privacy and location data, because few users are willing to wade through pages-long user agreements to discover what’s hidden in the fine print and legalese they contain. More plain language and simpler settings for opting in or out of certain kinds of data sharing are needed – and app makers must recognize that failing to heed such requirements only increases the risk that government steps in with new regulations.

But training and awareness only go so far. People still click on bad links, which is why some federal agencies automatically disable them. It makes users take a closer, harder look and think twice before clicking. That too, is unpleasant design.

So is requiring users to wear a badge that doubles as a computer access card (as is the case with the Pentagon’s Common Access Card and most Personal Identity Verification cards). Yet, knowing that some will inevitably leave the cards in their computers, such systems automatically log off after only a few minutes of inactivity. It’s inconvenient, but more secure.

We know this much: Human nature is such that people will take the path of least resistance. If that means accepting security settings that aren’t safe, that’s what’s going to happen. Interrupting that convenience and turning it on its head by means of Wi-Fi security won’t stop everyone. But it might have prevented Australian undergrad Nathan Ruser – and who knows who else – from identifying the regular jogging routes of military members (among other examples) in Strava’s published heat map, built from the 13 trillion GPS points the company collected from users.

“If soldiers use the app like normal people do,” Ruser tweeted Jan. 27, “it could be especially dangerous … I shouldn’t be able to establish any pattern of life info from this far away.”

Exactly.

How AI Is Transforming Defense and Intelligence Technologies

A Harvard Belfer Center study commissioned by the Intelligence Advanced Research Projects Agency (IARPA), Artificial Intelligence and National Security, predicted last year that AI will be as transformative to national defense as nuclear weapons, aircraft, computers and biotech.

Advances in AI will enable new capabilities and make others far more affordable – not only to the U.S., but to adversaries as well, raising the stakes as the United States seeks to preserve its hard-won strategic overmatch in the air, land, sea, space and cyberspace domains.

The Pentagon’s Third Offset Strategy seeks to leverage AI and related technologies in a variety of ways, according to Robert Work, former deputy secretary of defense and one of the strategy’s architects. In a foreword to a new report from the market analytics firm Govini, Work says the strategy “seeks to exploit advances in AI and autonomous systems to improve the performance of Joint Force guided munitions battle networks” through:

  • Deep learning machines, powered by artificial neural networks and trained with big data sets
  • Advanced human-machine collaboration in which AI-enabled learning machines help humans make more timely and relevant combat decisions
  • AI devices that allow operators of all types to “plug into and call upon the power of the entire Joint Force battle network to accomplish assigned missions and tasks”
  • Human-machine combat teaming of manned and unmanned systems
  • Cyber- and electronic warfare-hardened, network-enabled, autonomous and high-speed weapons capable of collaborative attacks

“By exploiting advances in AI and autonomous systems to improve the warfighting potential and performance of the U.S. military,” Work says, “the strategy aims to restore the Joint Force’s eroding conventional overmatch versus any potential adversary, thereby strengthening conventional deterrence.”

Spending is growing, Govini reports, with AI and related defense program spending increasing at a compound annual rate of 14.5 percent from 2012 to 2017, and poised to grow substantially faster in coming years as advanced computing technologies come on line, driving down computational costs.

But in practical terms, what does that mean? How will AI change the way defense technology is managed, the way we gather and analyze intelligence or protect our computer systems?

Charlie Greenbacker, vice president of analytics at In-Q-Tel in Arlington, Va., the intelligence community’s strategic investment arm, sees dramatic changes ahead.

“The incredible ability of technology to automate parts of the intelligence cycle is a huge opportunity,” he said at an AI summit produced by the Advanced Technology Academic Research Center and Intel in November. “I want humans to focus on more challenging, high-order problems and not the mundane problems of the world.”

The opportunities are possible because of the advent of new, more powerful processing techniques, whether by distributing those loads across a cloud infrastructure, or using specialty processors purpose-built to do this kind of math. “Under the hood, deep learning is really just algebra,” he says. “Specialized processing lets us do this a lot faster.”
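
That “just algebra” point can be shown in a few lines. The sketch below, with arbitrary shapes, expresses one dense layer of a neural network as a matrix multiply, a bias add and a nonlinearity – exactly the kind of arithmetic specialized processors accelerate.

```python
# One neural-network layer, reduced to its linear algebra.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 128))   # a batch of 32 inputs, 128 features each
W = rng.standard_normal((128, 64))   # learned weights
b = np.zeros(64)                     # learned biases

h = np.maximum(x @ W + b, 0)         # ReLU(xW + b): the core deep-learning operation
print(h.shape)                       # (32, 64)
```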

Computer vision is one focus of interest – learning to identify faces in crowds or objects in satellite or other surveillance images – as is identifying anomalies in cyber security or text-heavy data searches. “A lot of folks spend massive amounts of time sifting through text looking for needles in the haystack,” Greenbacker continued.

The Air Force is looking at AI to help more quickly identify potential cyber attacks, said Frank Konieczny, chief technology officer in the office of the Air Force chief information officer, speaking at CyberCon 2017 in November. “We’re looking at various ways of adjusting the network or adjusting the topology based upon threats, like software-defined network capabilities as well as AI-based analysis,” he said.

Marty Trevino Jr. is a former technical director and strategist for the National Security Agency, now chief data/analytics officer at Red Alpha, an intelligence-focused tech firm based in Annapolis Junction, Md. “We are all familiar with computers beating humans in complex games – chess, Go, and so on,” Trevino says. “But experiments are showing that when humans are mated with those same computers, they beat the computer every time. It’s this unique combination of man and machine – each doing what its brain does best – that will constitute the active cyber defense (ACD) systems of tomorrow.”

Machines best humans when the task is highly defined at speed and scale. “With all the hype around artificial intelligence, it is important to understand that AI is only fantastic at performing the specific tasks to which it is intended,” Trevino says. “Otherwise AI can be very dumb.”

Humans on the other hand, are better than machines when it comes to putting information in context. “While the human brain cannot match AI in specific realms,” he adds, “it is unmatched in its ability to process complex contextual information in dynamic environments. In cyber, context is everything. Context enables data-informed strategic decisions to be made.”

Artificial Intelligence and National Security
To prepare for a future in which artificial intelligence plays a heavy or even dominant role in warfare and military strategy, IARPA commissioned the Harvard Belfer Center to study the issue. The center’s August 2017 report, “Artificial Intelligence and National Security,” offers a series of recommendations, including:

  • Wargames and strategy – The Defense Department should conduct AI-focused wargames to identify potentially disruptive military innovations. It should also fund diverse, long-term strategic analyses to better understand the impact and implications of advanced AI technologies
  • Prioritize investment – Building on strategic analysis, defense and intelligence agencies should prioritize AI research and development investment on technologies and applications that will either provide sustainable strategic advantages or mitigate key risks
  • Counter threats – Because others will also have access to AI technology, investing in “counter-AI” capabilities for both offense and defense is critical to long-term security. This includes developing technological solutions for countering AI-enabled forgery, such as faked audio or video evidence
  • Basic research – The speed of AI development in commercial industry does not preclude specific security requirements in which strategic investment can yield substantial returns. Increased investment in AI-related basic research through DARPA, IARPA, the Office of Naval Research and the National Science Foundation, are critical to achieving long-term strategic advantage
  • Commercial development – Although DoD cannot expect to be a dominant investor in AI technology, increased investment through In-Q-Tel and other means can be critical in attaining startup firms’ interest in national security applications

Building Resiliency
Looking at cybersecurity another way, AI can also be used to rapidly identify and repair software vulnerabilities, said Brian Pierce, director of the Information Innovation Office at the Defense Advanced Research Projects Agency (DARPA).

“We are using automation to engage cyber attackers in machine time, rather than human time,” he said. Using automation developed under DARPA funding, he said machine-driven defenses have demonstrated AI-based discovery and patching of software vulnerabilities. “Software flaws can last for minutes, instead of as long as years,” he said. “I can’t emphasize enough how much this automation is a game changer in strengthening cyber resiliency.”

Such advanced, cognitive ACD systems employ the gamut of detection tools and techniques, from heuristics to characteristic and signature-based identification, says Red Alpha’s Trevino. “These systems will be self-learning and self-healing, and if compromised, will be able to terminate and reconstitute themselves in an alternative virtual environment, having already learned the lessons of the previous engagement, and incorporated the required capabilities to survive. All of this will be done in real time.”

Seen in that context, AI is just the latest in a series of technologies the U.S. has used as a strategic force multiplier. Just as precision weapons enabled the U.S. Air Force to inflict greater damage with fewer bombs – and with less risk – AI can be used to solve problems that might otherwise take hundreds or even thousands of people. The promise is that instead of eyeballing thousands of images a day or scanning millions of network actions, computers can do the first screening, freeing up analysts for the harder task of interpreting results, says Dennis Gibbs, technical strategist, Intelligence and Security programs at General Dynamics Information Technology. “But just because the technology can do that, doesn’t mean it’s easy. Integrating that technology into existing systems and networks and processes is as much art as science. Success depends on how well you understand your customer. You have to understand how these things fit together.”

In a separate project, DARPA collaborated with a Fortune 100 company that was moving more than a terabyte of data per day across its virtual private network, and generating 12 million network events per day – far beyond the human ability to track or analyze. Using automated tools, however, the project team was able to identify a single unauthorized IP address that successfully logged into 3,336 VPN accounts over seven days.

Mathematically speaking, Pierce said, “The activity associated with this address was close to 9 billion network events with about a 1 in 10 chance of discovery.” The tipoff was a flaw in the botnet that attacked the network: Attacks were staged at exactly 57-minute intervals. Not all botnets, of course, will make that mistake, but even pseudo-random timing can be detected. He added: “Using advanced signal processing methods applied to billions of network events over weeks and months-long timelines, we have been successful at finding pseudo random botnets.”
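
A toy version of that interval analysis appears below: given a list of event timestamps, it estimates the dominant spacing and how regular it is, which would expose a beat like the 57-minute interval in the example above. The data is synthetic and the method deliberately simple; production detection applies far more sophisticated signal processing to billions of events.

```python
# Simplified periodicity check over event timestamps (in seconds).
import numpy as np

def dominant_interval(timestamps, tolerance=1.0):
    """Return (typical gap in seconds, fraction of gaps within tolerance of it)."""
    gaps = np.diff(np.sort(np.asarray(timestamps, dtype=float)))
    if gaps.size == 0:
        return None
    typical = np.median(gaps)
    regularity = float(np.mean(np.abs(gaps - typical) <= tolerance))
    return typical, regularity

# Synthetic botnet logins: one every 57 minutes, with a little jitter
rng = np.random.default_rng(1)
beat = 57 * 60
events = [i * beat + rng.uniform(-0.5, 0.5) for i in range(200)]
interval, regularity = dominant_interval(events)
print(round(interval / 60), regularity)   # ~57 (minutes), regularity close to 1.0
```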

On the flip side, however, is the recognition that AI superiority will not be a given in cyberspace. Unlike air, land, sea or space, cyber is a man-made warfare domain. So it’s fitting that the fight there could end up being machine vs. machine.

The Harvard Artificial Intelligence and National Security study notes emphatically that while AI will make it easier to sort through ever greater volumes of intelligence, “it will also be much easier to lie persuasively.” The use of Photoshop and other image editors is well understood and has been for years. But recent advances in video editing have made it reasonably easy to forge audio and video files.

A trio of University of Washington researchers announced in a research paper published in July that they had used AI to synthesize a photorealistic, lip-synced video of former President Barack Obama. While the researchers used real audio, it’s easy to see the dangers posed if audio is also manipulated.

While the authors describe potential positive uses of the technology – such as “the ability to generate high-quality video from audio [which] could significantly reduce the amount of bandwidth needed in video coding/transmission” – potential nefarious uses are just as clear.

OneNet vs. IC ITE vs. JIE: 3 Ways to Consolidate Fed Networks

President Trump’s executive order on cybersecurity aims to collapse federal networks into “one or more consolidated network architectures,” a move that would both centralize cyber responsibility in fewer agencies and radically alter the way agencies manage and operate their information technology systems.

Now government IT leaders are trying to figure out how best to achieve that vision.

Options include: a highly centralized system managed by a single federal entity; a shared arrangement built around multiple communities of interest, such as law enforcement or benefits delivery; or a commercially centered model in which agencies could choose from among a limited menu of approved commercial alternatives.

The Federal Chief Information Officer Council, led by Acting Federal CIO Margie Graves, gets the first crack at setting the path forward. Whatever solution it comes up with must then survive scrutiny from a host of federal agencies – as well as Congress.

Among the options are three models already in place:

  • The Department of Homeland Security’s OneNet, a centralized virtual network architecture that links the department’s seven major components and 15 subcomponents
  • The Intelligence Community Information Technology Enterprise (IC ITE), a multibillion-dollar effort to harmonize and standardize IT services throughout the intelligence community, including establishing common transport and security layers
  • The Department of Defense’s Joint Information Environment (JIE), which like IC ITE, seeks to establish standard services and controls to keep more data inside a protected environment and limit contact with the public internet to 49 joint regional security stacks (JRSS), which act as gateways between the defense network and the global internet

DHS’ OneNet, though the smallest of the three, is the most mature. Completed in 2014, it successfully drew together nearly two dozen agencies inside a single network in the decade after the department was created. In the OneNet model, each component operates its own local area networks (LANs) within a common set of standards and guidelines, using OneNet’s multiprotocol label switching (MPLS) backbone to connect with other DHS entities, with shared commercial cloud services and to reach the global internet.

OneNet’s backbone sits on top of commercial networks operated by AT&T and Verizon, passing data from agency to agency through internal gateways called Policy Enforcement Points (PEPs). Contact with the world at large is routed through two Trusted Internet Connections (TICs), which also manage access to cloud services that host apps, databases and public-facing DHS web sites.

Richard Spires, DHS’ CIO from 2009 to 2013, oversaw much of the transition to OneNet. The challenges to consolidation are less about technology, he says, than culture and control.

“It’s hard to argue the point theoretically,” Spires told GovTechWorks. “It’s like a lot of shared services ideas in government: It sounds good, but getting everyone on board is hard. We were struggling to get OneNet to run well with just the different components of DHS. A model where you have one monolithic network? It would be very hard to manage at that scale.”

Agencies, individual divisions and programs all have unique requirements. Coming up with a single solution that solves everyone’s needs demands a lot of compromise. The more compromises necessary, the more time it will take to overcome objections.

This is why Spires favors an enterprise services model, in which government negotiates umbrella services centrally, but agencies are free to acquire those services on an as-needed basis. “That model is pretty close to right,” Spires says. “That way agencies can leverage the buying power of government, but you can still allow agencies to configure their network architectures to suit their needs.”

To effect that change, Spires favors shifting most infrastructure services, including network security, to a select group of cloud providers using enterprise agreements negotiated and managed by the General Services Administration, just as telecommunications services are managed now. “You can bake in the security controls so the agency doesn’t have to worry about it at that level,” he explains. “The agencies will still need to worry about security for their apps, of course, but not the infrastructure.”

Central Control
Canada centralized its government IT infrastructure under a single agency, Shared Services Canada (SSC), in 2011. While the new organization is still in the process of collapsing hundreds of wide area networks into one, its charter goes much further – centralizing email and other services, consolidating data centers, collapsing network services and improving service results.

As with OneNet, SSC’s consolidation process proved slower than anticipated. Despite additional funding, the agency reported in its 2017-2018 plan that “current funding is insufficient to meet the higher-than-anticipated growth in government-wide demand for IT services and to refresh the older IT systems and enterprise environment.” That report puts funding and talent shortages at the top of its list of organizational risks – higher than cybersecurity or aging legacy infrastructure.

Still, it also notes progress: The number of critical incidents affecting government agencies has declined and cost growth has been arrested.

Of course, Canada’s government IT budget is a fraction – about one-tenth the size – of its southern neighbor’s. But there is another model closer to home and closer to scale: The Defense Information Systems Agency (DISA), which manages DOD’s networks and shared services. DISA’s enterprise is roughly comparable in size and scale to what the federal civilian sector might need.

In addition to managing network infrastructure, DISA is central to the evolving Joint Information Environment (JIE) and its critical security component, the Joint Regional Security Stacks (JRSS), the network gateways intended to reduce DOD’s attack surface from more than 1,000 live Internet connections to just 49 once the program is complete.

“JIE is a framework for the Department of Defense (DOD) to consolidate and bring efficiencies into our networks, particularly NIPRNet and SIPRNet,” said Thomas P. Micelli, acting principal deputy to the DOD CIO, at a recent Armed Forces Communications and Electronics Association (AFCEA) event. That’s a lot like what the president called for in his cyber EO, he said, adding: “So it looks like the rest of the federal government is going to be following DOD and the JIE environment, and collapsing its networks.”

Not that Micelli sees the Pentagon playing a central role in that effort, however. Having previously held CIO positions at two DHS components – Immigration and Customs Enforcement (ICE), which uses OneNet, and at the Coast Guard, which uses DOD networks – he is familiar with both approaches and the unique demands of both sectors.

Will DOD take a lead role? No more than any other agency, Micelli told GovTechWorks: The whole CIO council will contribute ideas and expertise, and each of the agencies will do its part, he said.

While the executive order leaves open the possibility that defense and intelligence networks could also be folded into a consolidated architecture, it’s unlikely that will be the case. The different mission and security requirements argue for keeping some things separate, which may be why, when asked about DISA as a model service provider for the whole government, Micelli responded by citing DHS: “They have a pretty good network too.”

That “pretty good” network didn’t come together overnight. It took close to a decade to complete DHS’s migration to OneNet. While today’s technology might suggest it now could be done faster, it’s the people part that will be the most challenging, Spires notes. “It’s not that it’s technically impossible,” he explains. “It’s the politics, the way the money is appropriated.”

Making this happen demands forceful leadership and management focus from the top – specifically, from the White House, he says. “If you have true leadership, driving this hard from the top, it’s possible,” Spires says. “Without that, it’s too easy for resistance to rise up and simply slow everything down.”

That’s why Spires favors starting by consolidating networks around communities of interest.

“I’d like to see agencies that have functional compatibility try to work this out together first,” he says. “The intelligence community is doing this with IC ITE.” So why couldn’t agencies that pay for health care or that deliver other citizen services do the same?

Stan Tyliszczak, chief engineer and vice president of technology integration with General Dynamics Information Technology, agrees.

“That’s a solid approach,” he says.  “The Intelligence Community and the DOD take advantage of the fact that their missions are isolated from the public, so they can use tightly controlled network access rules to protect against unauthorized persons trying to hack into their data.

“Creating a solution for other federal agencies based on commonality of their needs would allow for similarly tight control among like-minded agencies. There could be one community for health care; another for law enforcement and so forth. There could even be one community that’s wide open to the public for access to citizen services.

“This way, cybersecurity requirements and solutions could be tailored for each agency based on mission needs,” Tyliszczak concluded. “Every agency could pick the combination of networks that best met their unique mission and applications needs.”
