Recognizing the Need for Innovation in Acquisition

The President’s Management Agenda lays out ambitious plans for the federal government to modernize information technology, prepare its future workforce and improve the way it manages major acquisitions.

These are among 14 cross-agency priority goals on which the administration is focused as it seeks to jettison outdated legacy systems and embrace less cumbersome ways of doing business.

Increasingly, federal IT managers are recognizing the need for innovation in acquisition, not just technology modernization. What exactly will it take to modernize an acquisition system bound by the 1,917-page Federal Acquisition Regulation? Federal acquisition experts say the challenges have less to do with changing those rules than with human behavior – the incentives, motivations and fears of everyone who touches federal acquisition, from the acquisition professionals themselves to mission owners, government executives and overseers.

“If you want a world-class acquisition system that is responsive to customer needs, you have to be able to use the right tool at the right time,” says Mathew Blum, associate administrator in the Office of Federal Procurement Policy at the Office of Management and Budget. The trouble isn’t a lack of options, he said at the American Council for Technology and Industry Advisory Council (ACT-IAC) Acquisition Excellence conference March 27. Rather, he said, it is a lack of bandwidth and a fear of failure that conspire to keep acquisition pros from trying different acquisition strategies.

Risk aversion is a critical issue, agreed Greg Capella, deputy director of the National Technical Information Service at the Department of Commerce. “If you look at what contracting officers get evaluated on, it’s the number of protests, or the number of small business awards [they make],” he said. “It’s not how many successful procurements they’ve managed or what were the results for individual customers.”

Yet there are ways to break through the fear of failure, protests and blame that can paralyze acquisition shops and at the same time save time, save money and improve mission outcomes. Here are four:

  1. Outside Help

The General Services Administration’s (GSA) 18F digital services organization focuses on improving public-facing services and internal systems using commercial-style development approaches. Its agile software development program employs a multidisciplinary team incentivized to work together and produce results quickly, said Alla Goldman Seifert, acting director of GSA’s Office of Acquisition in the Technology Transformation Service.

Her team helps other federal agencies tackle problems quickly and incrementally using an agile development approach. “We bring in a cross-functional team of human-centered design and technical experts, as well as acquisition professionals — all of whom work together to draft a statement of work and do the performance-based contracting for agile software acquisition,” she said.

Acquisition planning may be the most important part of that process. Seifert said 18F has learned a lot since launching its Agile Blanket Purchase Agreement, which drew seven protests in three venues. “But since then, every time we iterate, we make sure we right-size the scope and risk we are taking.” She added that by approaching projects in a modular way, risks are diminished and outcomes improved. That’s a best practice that can be replicated throughout government.

“We’re really looking at software and legacy IT modernization: How do you get a mission critical program off of a mainframe? How do you take what is probably going to be a five-year modernization effort and program for it, plan for it and budget for it?” Seifert asked.

GSA experiments in other ways, as well. For example, 18F helped agencies leverage the government’s Challenge.gov platform, publishing needs and offering prizes to the best solutions. The Defense Advanced Research Projects Agency (DARPA) currently seeks ideas for more efficient use of the radio frequency spectrum in its Spectrum Collaboration Challenge. DARPA will award up to $3.5 million to the best ideas. “Even [intelligence community components] have really enjoyed this,” Seifert said. “It really is a good way to increase competition and lower barriers to entry.”

  2. Coaching and Assistance

Many program acquisition officers cite time pressure and lack of bandwidth to learn new tools as barriers to innovation. It’s a classic chicken-and-egg problem: How do you find the time to learn and try something new?

The Department of Homeland Security’s Procurement Innovation Lab (PIL) was created to help program offices do just that – and then capture and share their experience so others in DHS can leverage the results. The PIL provides coaching and advice, asking only that the accumulated knowledge be shared through webinars and other internal means.

“How do people find time to do innovative stuff?” asked Eric Cho, project lead for PIL. “Either one: find ways to do less, or two: borrow from someone else’s work.” Having a coach to help is also critical, and that’s where his organization comes in.

In less than 100 days, the PIL recently helped a Customs and Border Protection team acquire a system, essentially a high-end stud finder, to locate contraband such as drugs hidden in walls, Cho said. The effort was completed in less than half the time of an earlier, unsuccessful effort.

Acquisition cycle time can be saved in many ways, he said: capturing evaluators’ impressions immediately through group evaluations after oral presentations, or narrowing the competitive field with a down-select before running trade-off analyses on qualified finalists. Reusing language from similar solicitations can also save time. “This is not an English class.”

Even so, the successful PIL program still left middle managers in program offices a little uncomfortable, DHS officials acknowledged – the natural result of trying something new. Key to success is having high-level commitment and support for such experiments. DHS’s Chief Procurement Officer Soraya Correa has been an outspoken advocate of experimentation and the PIL. That makes a difference.

“It all comes back to the culture of rewarding compliance, rather than creativity,” said OMB’s Blum. “We need to figure out how we build incentives to encourage the workforce to test and adopt new and better ways to do business.”

  3. Outsourcing for Innovation

Another approach is to outsource the heavy lifting to a better-skilled or more experienced government entity that can execute on a specialized need, such as hiring GSA’s 18F to manage agile software development.

Similarly, outsourcing to GSA’s FEDSIM is a proven strategy for efficiently managing and executing complex, enterprise-scale programs with price tags approaching or exceeding $1 billion. FEDSIM combines both acquisition and technical expertise to manage such large-scale projects, and executes quickly by leveraging government-wide acquisition vehicles such as Alliant or OASIS, which have already narrowed the field of viable competitors.

“The advantage of FEDSIM is that they have experience executing these large-scale complex IT programs — projects that they’ve done dozens of times — but that others may only face once in a decade,” says Michael McHugh, staff vice president within General Dynamics IT’s Government Wide Acquisition Contract (GWAC) Center. The company supports Alliant and OASIS among other GWACs. “They understand that these programs shouldn’t be just about price, but about identifying the superior technical solution within a predetermined reasonable price range. There’s a difference.”

For program offices looking for guidance rather than to outsource procurement, FEDSIM is developing an “Express Platform” with pre-defined acquisition paths tailored to the need, along with acquisition templates designed to streamline and accelerate processes, reduce costs and enable innovation. It’s another example of sharing best practices across government agencies.

  4. Minimizing Risk

OMB’s Blum said he doesn’t blame program managers for feeling anxious. He gets that while they like the concept of innovation, they’d rather someone else take the risk. He also believes the risks are lower than they think.

“If you’re talking about testing something new, the downside risk is much less than the upside gain,” Blum said. “Testing shouldn’t entail any more risk than a normal acquisition if you’re applying good acquisition practices — if you’re scoping it carefully, sharing information readily with potential sources so they understand your goals, and by giving participants a robust debrief,” he added. Risks can be managed.

Properly defining the scope, sounding out experts, defining goals and sharing information cannot happen in a vacuum, of course. Richard Spires, former chief information officer at DHS, and now president of Learning Tree International, said he could tell early if projects were likely to succeed or fail based on the level of teamwork exhibited by stakeholders.

“If we had a solid programmatic team that worked well with the procurement organization and you could ask those probing questions, I’ll tell you what: That’s how you spawn innovation,” Spires said. “I think we need to focus more on how to build the right team with all the right stakeholders: legal, security, the programmatic folks, the IT people running the operations.”

Tony Cothron, vice president with General Dynamics IT’s Intelligence portfolio, agreed, saying it takes a combination of teamwork and experience to produce results.

“Contracting and mission need to go hand-in-hand,” Cothron said. “But in this community, mission is paramount. The things everyone should be asking are what other ways are there to get the job done? How do you create more capacity? Deliver analytics to help the mission? Improve continuity of operations? Get more for each dollar? These are hard questions, and they require imaginative solutions.”

For example, Cothron said, bundling services may help reduce costs. Likewise, contractors might accept lower prices in exchange for a longer term. “You need to develop a strategy going in that’s focused on the mission, and then set specific goals for what you want to accomplish,” he added. “There are ways to improve quality. How you contract is one of them.”

Risk of failure doesn’t have to be a disincentive to innovation. Like any risk, it can be managed – and savvy government professionals are discovering they can mitigate risks by leveraging experienced teams, sharing best practices and building on lessons learned. When they do those things, risk decreases – and the odds of success improve.

Design Thinking and DevOps Combine for Better Customer Experience

How citizens interact with government websites tells you much about how to improve – as long as you’re paying attention, said Aaron Wieczorek, digital services expert with the U.S. Digital Service team at the Department of Veterans Affairs.

“At VA we will literally sit down with veterans, watch them work with the website and apply for benefits,” he said. The aim is to make sure the experience is what users want and expect, he said, not “what we think they want.”

Taking copious notes on their observations, the team then sets to work on programming improvements that can be quickly put to the test. “Maybe some of the buttons were confusing or some of the way things work is confusing – so we immediately start reworking,” Wieczorek explained.

Applying a modern agile development approach means digital services can immediately put those tweaks to the test in their development environment. “If it works there, good. Then it moves to staging. If that’s acceptable, it deploys into production,” Wieczorek said.
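
To make that flow concrete, here is a minimal sketch of a build promotion gate in Python. The environment names and placeholder checks are invented for illustration; this is the pattern Wieczorek describes, not VA’s or the U.S. Digital Service’s actual pipeline.

```python
# Illustrative sketch of a dev -> staging -> production promotion flow.
# Environment names and checks are assumptions, not VA's real tooling.

ENVIRONMENTS = ["development", "staging", "production"]

def checks_pass(build: str, env: str) -> bool:
    """Placeholder for automated tests and user feedback gathered in an environment."""
    print(f"running checks for {build} in {env}")
    return True  # assume success for the sketch

def promote(build: str) -> None:
    for env in ENVIRONMENTS:
        if not checks_pass(build, env):
            print(f"{build} stopped in {env}; rework and redeploy")
            return
        print(f"{build} accepted in {env}")
    print(f"{build} is live in production")

promote("example-release-042")
```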

That process can happen in days. Vets.gov deploys software updates into production 40 times per month, Wieczorek said, and agency-wide, to all kinds of environments, 600 times per month.

Case in point: Vets.gov’s digital Form 1010 EZ, which allows users to apply for VA healthcare online.

“We spent hundreds of hours watching veterans, and in the end we were able to totally revamp everything,” Wieczorek said. “It’s actually so easy now, you can do it all on your phone.” More than 330,000 veterans have applied that way since the digital form was introduced. “I think that’s how you scale things.”

Of course, one problem remains: Vets.gov is essentially a veteran-friendly alternative site to VA.gov, which may not be obvious to search engines or veterans looking for the best way in the door. Search Google for “VA 1010ez” and the old, mobile-unfriendly PDF form still shows as the top result. The new mobile-friendly application? It’s the third choice.

At the National Geospatial-Intelligence Agency, developers take a similar approach, but focus hard on balancing speed, quality and design for maximum results. “We believe that requirements and needs should be seen like a carton of milk: The longer they sit around, the worse they get,” said Corry Robb, product design lead in the agency’s Office of GEOINT Services. “We try to handle that need as quickly as we can and deliver that minimally viable product to the user’s hands as fast as we can.”

DevOps techniques, where development and production processes take place simultaneously, increase speed. But speed alone is not the measure of success, Robb said. “Our agency needs to focus on delivering the right thing, not just the wrong thing faster.” So in addition to development sprints, his team has added “design sprints to quickly figure out the problem-solution fit.”

Combining design thinking, which focuses on using design to solve specific user problems, with DevOps is critical to the methodology, he said. “Being hand in hand with the customer – that’s one of the core values our group has.”

“Iterative development is a proven approach,” said Dennis Gibbs, who established the agile development practice in General Dynamics Information Technology’s Intelligence Solutions Division. “Agile and DevOps techniques accelerate the speed of convergence on a better solution.  We continually incorporate feedback from the user into the solution, resulting in a better capability delivered faster to the user.”

Is Identity the New Perimeter? In a Zero-Trust World, More CISOs Think So

As the network perimeter morphs from physical to virtual, the old Tootsie Pop security model – hard shell on the outside with a soft and chewy center – no longer works. The new mantra, as Mittal Desai, chief information security officer (CISO) at the Federal Energy Regulatory Commission, said at the ATARC CISO Summit: “Never trust, double verify.”

The zero-trust model modernizes conventional network-based security for a hybrid cloud environment. As agencies move systems and storage into the cloud, networks are virtualized and security naturally shifts to users and data. That’s easy enough to do in small organizations, but rapidly grows harder with the scale and complexity of an enterprise.

The notion of zero-trust security first surfaced five years ago in a Forrester Research report prepared for the National Institute of Standards and Technology (NIST). “The zero-trust model is simple,” Forrester posited then. “Cybersecurity professionals must stop trusting packets as if they were people. Instead, they must eliminate the idea of a trusted network (usually the internal network) and an untrusted network (external networks). In zero-trust, all network traffic is untrusted.”

Cloud adoption by its nature is forcing the issue, said Department of Homeland Security Chief Technology Officer Mike Hermus, speaking at a recent Tech + Tequila event: “It extends the data center,” he explained. “The traditional perimeter security model is not working well for us anymore. We have to work toward a model where we don’t trust something just because it’s within our boundary. We have to have strong authentication, strong access control – and strong encryption of data across the entire application life cycle.”

Indeed, as other network security features mature, identity – and the access that goes with it – is now the most common cybersecurity attack vector. Hackers favor phishing and spear-phishing attacks because they’re inexpensive and effective – and the passwords they yield are like the digital keys to an enterprise.

About 65 percent of breaches cited in Verizon’s 2017 Data Breach Investigations Report made use of stolen credentials.

Interestingly, however, identity and access management represent only a small fraction of cybersecurity investment – less than 5 percent – according to Gartner’s market analysts. Network security equipment, by contrast, constitutes more than 12 percent. Enterprises continue to invest in the Tootsie Pop model even as its weaknesses become more evident.

“The future state of commercial cloud computing makes identity and role-based access paramount,” said Rob Carey, vice president for cybersecurity and cloud solutions within the Global Solutions division at General Dynamics Information Technology (GDIT). Carey recommends creating both a framework for better understanding the value of identity management tools, and metrics to measure that impact. “Knowing who is on the network with a high degree of certainty has tremendous value.”

Tom Kemp, chief executive officer at Centrify, which provides cloud-based identity services, has a vested interest in changing that mix. Centrify, based in Sunnyvale, Calif., combines identity data with location and other information to help ensure only authorized, verified users access sensitive information.

“At the heart of zero-trust is the realization that an internal user should be treated just like an external user, because your internal network is just as polluted as your outside network,” Kemp said at the Feb. 7 Institute for Critical Infrastructure Technology (ICIT) Winter Summit. “You need to move to constant verification.” Reprising former President Ronald Reagan’s “trust but verify” mantra, he adds: “Now it’s no trust and always verify. That’s the heart of zero-trust.”

The Google Experience
When Google found itself hacked in 2009, the company launched an internal project to find a better way to keep hackers out of its systems. Instead of beefing up firewalls and tightening virtual private network settings, Google’s BeyondCorp architecture dispensed with the Tootsie Pop model in which users logged in and then gained access to all manner of systems and services.

In its place, Google chose to implement a zero-trust model that challenges every user and every device on every data call – regardless of how that user accessed the internet in the first place.

While that flies in the face of conventional wisdom, Google reasoned that by tightly controlling the device and user permissions to access data, it had found a safer path.

Here’s an example of how that works when an engineer with a corporate-issued laptop wants to access an application from a public Wi-Fi connection (a simplified sketch of the decision logic follows the list):

  1. The laptop provides its device certificate to an access proxy.
  2. The access proxy confirms the device, then redirects to a single-sign-on (SSO) system to verify the user.
  3. The engineer provides primary and second-factor authentication credentials and, once authenticated by the SSO system, is issued a token.
  4. Now, with the device certificate to identify the device and the SSO token to identify the user, an Access Control Engine can perform a specific authorization check for every data access. The user must be confirmed to be in the engineering group; to possess a sufficient trust level; and to be using a managed device in good standing with a sufficient trust level.
  5. If all checks pass, the request is passed to an appropriate back-end system and the data access is allowed. If any of the checks fail however, the request is denied. This is repeated every time the engineer tries to access a data item.
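
Below is a simplified sketch of that per-request decision logic in Python. The class names, trust levels and thresholds are illustrative assumptions; Google has published the BeyondCorp architecture, but this code is not its implementation.

```python
# Simplified sketch of the zero-trust access check described in steps 1-5.
# Class names and trust thresholds are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class DeviceCert:
    device_id: str
    managed: bool
    trust_level: int  # e.g., 1 (low) to 4 (high)

@dataclass
class SsoToken:
    user: str
    groups: set
    trust_level: int

def authorize(cert: DeviceCert, token: SsoToken, required_group: str,
              min_trust: int = 3) -> bool:
    """Access Control Engine check performed on every data access."""
    if not cert.managed or cert.trust_level < min_trust:
        return False                      # device not in good standing
    if required_group not in token.groups:
        return False                      # user not in the required group
    if token.trust_level < min_trust:
        return False                      # user trust level too low
    return True                           # pass request to the back-end system

laptop = DeviceCert("laptop-042", managed=True, trust_level=4)
engineer = SsoToken("jdoe", groups={"engineering"}, trust_level=4)
print(authorize(laptop, engineer, required_group="engineering"))  # True
```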

“That’s easy enough when those attributes are simple and clear cut, as with the notional Google engineer,” said GDIT’s Carey, who spent three decades managing defense information systems. “But it gets complicated in a hurry if you’re talking about an enterprise on the scale of the Defense Department or Intelligence community.”

Segmenting the Sprawling Enterprise
A takeaway from 9/11 was that intelligence agencies needed to be better and faster at sharing threat data across agency boundaries. Opening databases across agency divisions, however, had consequences: Chelsea Manning, at the time Pfc. Bradley Manning, delivered a treasure trove of stolen files to WikiLeaks and then a few years later, Edward Snowden stole countless intelligence documents, exposing a program designed to collect metadata from domestic phone and email records.

“The more you want to be sure each user is authorized to see and access only the specific data they have a ‘need-to-know,’ the more granular the identity and access management schema need to be,” Carey said. “Implementing role-based access is complicated because you’ve got to develop ways to both tag data and code users based on their authorized need. Absent a management schema, that can quickly become difficult to manage for all but the smallest applications.”

Consider a scenario of a deployed military command working in a multinational coalition with multiple intelligence agencies represented in the command’s intelligence cell. The unit commands air and ground units from all military services, as well as civilians from defense, intelligence and possibly other agencies. Factors determining individual access to data might include the person’s job, rank, nationality, location and security clearance. Some missions might include geographic location, but others can’t rely on that factor because some members of the task force are located thousands of miles away, or operating from covert locations.
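
To make the granularity problem concrete, here is a minimal attribute-based sketch of that coalition scenario, using the factors listed above. The attribute names, releasability markings and policy values are invented for illustration; a real schema would be far larger and centrally managed.

```python
# Illustrative attribute-based access check for the coalition scenario.
# Attribute names and policy values are assumptions for this sketch.

CLEARANCE_ORDER = ["C", "S", "TS"]  # low to high

user_attrs = {
    "job": "intel_analyst",
    "nationality": "USA",
    "clearance": "TS",
    "location": "task_force_hq",
}

data_policy = {
    "allowed_jobs": {"intel_analyst", "targeting_officer"},
    "releasable_to": {"USA", "GBR", "AUS"},   # e.g., a releasability marking
    "min_clearance": "TS",
    "location_required": False,               # some users operate remotely
}

def access_allowed(user: dict, policy: dict) -> bool:
    if user["job"] not in policy["allowed_jobs"]:
        return False
    if user["nationality"] not in policy["releasable_to"]:
        return False
    if CLEARANCE_ORDER.index(user["clearance"]) < CLEARANCE_ORDER.index(policy["min_clearance"]):
        return False
    if policy["location_required"] and user["location"] != "task_force_hq":
        return False
    return True

print(access_allowed(user_attrs, data_policy))  # True
```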

That scenario gets even more complicated in a hybrid cloud environment where some systems are located on premises, and others are far away. Managing identity-based access gets harder anyplace where distance or bandwidth limitations cause delays. Other integration challenges include implementing a single-sign-on solution across multiple clouds, or sharing data by means of an API.

Roles and Attributes
To organize access across an enterprise – whether in a small agency or a vast multi-agency system such as the Intelligence Community Information Technology Enterprise (IC ITE) – information managers must make choices. Access controls can be based on individual roles – such as job level, function and organization – or data attributes – such as type, source, classification level and so on.

“Ultimately, these are two sides of the same coin,” Carey said. “The real challenge is the mechanics of developing the necessary schema to a level of granularity that you can manage, and then building the appropriate tools to implement it.”

For example, the Defense Department intends to use role-based access controls for its Joint Information Enterprise (JIE), using the central Defense Manpower Data Center (DMDC) personnel database to connect names with jobs. The available fields in that database are, in effect, the limiting factors on just how granular role-based access controls will be under JIE.

Access controls will only be one piece of JIE’s enterprise security architecture. Other features, ranging from encryption to procedural controls that touch everything from the supply chain to system security settings, will also contribute to overall security.

Skeptical of Everything
Trust – or the lack of it – plays out in each of these areas, and requires healthy skepticism at every step. Rod Turk, CISO at the Department of Commerce, said CISOs need to be skeptical of everything. “I’m talking about personnel, I’m talking about relationships with your services providers,” he told the ATARC CISO Summit.  “We look at the companies we do business with and we look at devices, and we run them through the supply chain.  And I will tell you, we have found things that made my hair curl.”

Commerce’s big push right now is the Decennial Census, which will collect volumes of personal information (PI) and personally identifiable information (PII) on almost every living person in the United States. Conducting a census every decade is like doing a major system reset each time. The next census will be no different, employing mobile devices for census takers and for the first time, allowing individuals to fill out census surveys online. Skepticism is essential because the accuracy of the data depends on the public’s trust in the census.

In a sense, that’s the riddle of the whole zero-trust concept: In order to achieve a highly trusted outcome, CISOs have to start with no trust at all.

Yet trust also cuts in the other direction. Today’s emphasis on modernization and migration to the cloud means agencies face tough choices. “Do we in the federal government trust industry to have our best interests in mind to keep our data in the cloud secure?” Turk asked rhetorically.

In theory, the Federal Risk and Authorization Management Program (FedRAMP) sets baseline requirements for establishing trust, but doubts persist. What satisfies one agency’s requirements may not satisfy another. Compliance with FedRAMP or NIST controls equates to risk management rather than actual security, GDIT’s Carey points out. They’re not the same thing.

Identity and Security
Beau Houser, CISO at the Small Business Administration, is more optimistic, encouraged by improvements he’s seen as compartmentalized legacy IT systems are replaced with centralized, enterprise solutions in a Microsoft cloud.

“As we move to cloud, as we roll out Windows 10, Office 365 and Azure, we’re getting all this rich visibility of everything that’s happening in the environment,” he said. “We can now see all logins on every web app, whether that’s email or OneDrive or what have you, right on the dashboard. And part of that view is what’s happening over that session: What are they doing with email, where are they moving files.… That’s visibility we didn’t have before.”

Leveraging that visibility effectively extends that notion of zero-trust one step further, or at least shifts it into the realm of a watchful parent rather than one who blindly trusts his teenage children. The watchful parent believes trust is not a right, but an earned privilege.

“Increased visibility means agencies can add behavioral models to their security controls,” Carey said. “Behavioral analysis tools that can match behavior to what people’s roles are supposed to be, and trigger warnings if people deviate from expected norms, are the next big hurdle in security.”

As Christopher Wlaschin, CISO at the Department of Health and Human Services, says: “A healthy distrust is a good thing.”

Unpleasant Design Could Encourage Better Cyber Hygiene

Recent revelations that service members and intelligence professionals are inadvertently giving up their locations and fitness patterns via mobile apps caught federal agencies by surprise.

The surprise wasn’t that Fitbits, smartphones or workout apps try to collect information, nor that some users ignore policies reminding them to watch their privacy and location settings. The real surprise is that many IT policies aren’t doing more to help stop such inadvertent fitness data leaks.

If even fitness-conscious military and intelligence personnel are unknowingly trading security and privacy for convenience, how can IT security managers increase security awareness and compliance?

One answer: Unpleasant design.

Unpleasant design is a proven technique for using design to discourage unwanted behavior. Ever get stuck in an airport and long for a place to lie down — only to find every bench or row of seats is fitted with armrests? That’s no accident. Airports and train terminals don’t want people sleeping across benches. Or consider the decorative metalwork sometimes placed on urban windowsills or planter walls — designed expressly to keep loiterers from sitting down. It’s the same with harsh lights in suburban parking lots, which discourage people from hanging out and make it harder for criminals to lurk in the shadows.

As federal and agency IT security leaders investigate these inadvertent disclosures, can they employ those same concepts to encourage better cyber behavior?

Here’s how unpleasant design might apply to federally furnished Wi-Fi networks: Rather than allow access with only a password, users instead might be required to have their Internet of Things (IoT) devices pass a security screening that requires certain security settings. That screening could include ensuring location services are disabled while such devices are connected to government-provided networks.

Employees would then have to choose between the convenience of free Wi-Fi for personal devices and the risks of inadequate operations security (OPSEC) via insecure device settings.

This, of course, only works where users have access to such networks. At facilities where personal devices must be deposited in lockers or left in cars, it won’t make a difference. But for users working (and living) on installations where personnel routinely access Wi-Fi networks, this could be highly effective.

Screening – and even blocking – certain apps or domains could be managed through a cloud access security broker, network security management software that can enforce locally set rules governing apps actively using location data or posing other security risks. Network managers could whitelist acceptable apps and settings, while blocking those deemed unacceptable. If agencies already do that for their wired networks, why not for wireless?
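
Here is a minimal sketch of how such locally set rules might be evaluated before a personal device is granted Wi-Fi access. The attribute names and whitelist are assumptions for illustration, not any particular cloud access security broker’s API.

```python
# Hypothetical screening check for a personal device requesting Wi-Fi access.
# Device attributes and the app whitelist are illustrative assumptions only.

APPROVED_APPS = {"mail", "calendar", "browser"}   # locally set whitelist

def grant_wifi_access(device: dict) -> bool:
    # OPSEC rule: location services must be disabled while connected
    if device.get("location_services_enabled", True):
        return False
    # Block devices running apps outside the approved list
    unapproved = set(device.get("apps", [])) - APPROVED_APPS
    if unapproved:
        return False
    return True

device = {"location_services_enabled": False, "apps": ["mail", "fitness_tracker"]}
print(grant_wifi_access(device))  # False: fitness_tracker is not whitelisted
```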

Inconvenient? Absolutely. That’s the point.

IT security staffs are constantly navigating the optimal balance between security and convenience. Perfect security is achievable only when nothing is connected to anything else. Each new connection and additional convenience introduces another dent in the network’s armor.

Employing cloud-access security as a condition of Wi-Fi network access will impinge on some conveniences. In most cases, truly determined users can work around those rules by using local cellular data access instead. In much of the world, however – and especially in those places where the need for OPSEC is greatest – that access comes with a direct cash cost. When users pay for data by the megabyte, they’re much more likely to give up some convenience, check security and privacy settings, and limit their data consumption.

This too, is unpleasant design at work. Cellular network owners must balance network capacity with use. Lower-capacity networks control demand by raising prices, knowing that higher priced data discourages unbridled consumption.

Training and awareness will always be the most important factors in securing privacy and location data, because few users are willing to wade through pages-long user agreements to discover what’s hidden in the fine print and legalese they contain. More plain language and simpler settings for opting in or out of certain kinds of data sharing are needed – and app makers must recognize that failing to heed such requirements only increases the risk that government steps in with new regulations.

But training and awareness only go so far. People still click on bad links, which is why some federal agencies automatically disable them. It makes users take a closer, harder look and think twice before clicking. That too, is unpleasant design.

So is requiring users to wear a badge that doubles as a computer access card (as is the case with the Pentagon’s Common Access Card and most Personal Identity Verification cards). Yet, knowing that some will inevitably leave the cards in their computers, such systems automatically log off after only a few minutes of inactivity. It’s inconvenient, but more secure.

We know this much: Human nature is such that people will take the path of least resistance. If that means accepting security settings that aren’t safe, that’s what’s going to happen. Interrupting that convenience and turning it on its head by means of Wi-Fi security won’t stop everyone. But it might have prevented Australian undergrad Nathan Ruser – and who knows who else – from identifying the regular jogging routes of military members (among other examples) from Strava’s published heat map, built from the 13 trillion GPS points collected from users.

“If soldiers use the app like normal people do,” Ruser tweeted Jan. 27, “it could be especially dangerous … I shouldn’t be able to establish any pattern of life info from this far away.”

Exactly.

How AI Is Transforming Defense and Intelligence Technologies

A Harvard Belfer Center study commissioned by the Intelligence Advanced Research Projects Agency (IARPA), Artificial Intelligence and National Security, predicted last May that AI will be as transformative to national defense as nuclear weapons, aircraft, computers and biotech.

Advances in AI will enable new capabilities and make others far more affordable – not only to the U.S., but to adversaries as well, raising the stakes as the United States seeks to preserve its hard-won strategic overmatch in the air, land, sea, space and cyberspace domains.

The Pentagon’s Third Offset Strategy seeks to leverage AI and related technologies in a variety of ways, according to Robert Work, former deputy secretary of defense and one of the strategy’s architects. In a foreword to a new report from the market analytics firm Govini, Work says the strategy “seeks to exploit advances in AI and autonomous systems to improve the performance of Joint Force guided munitions battle networks” through:

  • Deep learning machines, powered by artificial neural networks and trained with big data sets
  • Advanced human-machine collaboration in which AI-enabled learning machines help humans make more timely and relevant combat decisions
  • AI devices that allow operators of all types to “plug into and call upon the power of the entire Joint Force battle network to accomplish assigned missions and tasks”
  • Human-machine combat teaming of manned and unmanned systems
  • Cyber- and electronic warfare-hardened, network-enabled, autonomous and high-speed weapons capable of collaborative attacks

“By exploiting advances in AI and autonomous systems to improve the warfighting potential and performance of the U.S. military,” Work says, “the strategy aims to restore the Joint Force’s eroding conventional overmatch versus any potential adversary, thereby strengthening conventional deterrence.”

Spending is growing, Govini reports, with AI and related defense program spending increasing at a compound annual rate of 14.5 percent from 2012 to 2017, and poised to grow substantially faster in coming years as advanced computing technologies come on line, driving down computational costs.

But in practical terms, what does that mean? How will AI change the way defense technology is managed, the way we gather and analyze intelligence or protect our computer systems?

Charlie Greenbacker, vice president of analytics at In-Q-Tel in Arlington, Va., the intelligence community’s strategic investment arm, sees dramatic changes ahead.

“The incredible ability of technology to automate parts of the intelligence cycle is a huge opportunity,” he said at an AI summit produced by the Advanced Technology Academic Research Center and Intel in November. “I want humans to focus on more challenging, high-order problems and not the mundane problems of the world.”

The opportunities are possible because of the advent of new, more powerful processing techniques, whether by distributing those loads across a cloud infrastructure, or using specialty processors purpose-built to do this kind of math. “Under the hood, deep learning is really just algebra,” he says. “Specialized processing lets us do this a lot faster.”
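
Greenbacker’s point about algebra can be shown in a few lines: a single neural-network layer reduces to a matrix multiply plus a simple nonlinearity, exactly the kind of arithmetic that specialized processors accelerate. The sizes below are arbitrary and the example is purely illustrative.

```python
# A single dense neural-network layer is matrix algebra: output = f(Wx + b).
# Shapes are arbitrary; this is an illustration, not a production model.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(128)            # input features
W = rng.standard_normal((64, 128))      # learned weights
b = rng.standard_normal(64)             # learned biases

hidden = np.maximum(0, W @ x + b)       # ReLU(Wx + b)
print(hidden.shape)                     # (64,)
```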

Computer vision is one focus of interest – learning to identify faces in crowds or objects in satellite or other surveillance images – as is identifying anomalies in cyber security or text-heavy data searches. “A lot of folks spend massive amounts of time sifting through text looking for needles in the haystack,” Greenbacker continued.

The Air Force is looking at AI to help more quickly identify potential cyber attacks, said Frank Konieczny, chief technology officer in the office of the Air Force chief information officer, speaking at CyberCon 2017 in November. “We’re looking at various ways of adjusting the network or adjusting the topology based upon threats, like software-defined network capabilities as well as AI-based analysis,” he said.

Marty Trevino Jr. is a former technical director and strategist for the National Security Agency, now chief data/analytics officer at Red Alpha, an intelligence-focused tech firm based in Annapolis Junction, Md. “We are all familiar with computers beating humans in complex games – chess, Go, and so on,” Trevino says. “But experiments are showing that when humans are mated with those same computers, they beat the computer every time. It’s this unique combination of man and machine – each doing what its brain does best – that will constitute the active cyber defense (ACD) systems of tomorrow.”

Machines best humans when the task is highly defined and performed at speed and scale. “With all the hype around artificial intelligence, it is important to understand that AI is only fantastic at performing the specific tasks for which it is intended,” Trevino says. “Otherwise AI can be very dumb.”

Humans on the other hand, are better than machines when it comes to putting information in context. “While the human brain cannot match AI in specific realms,” he adds, “it is unmatched in its ability to process complex contextual information in dynamic environments. In cyber, context is everything. Context enables data-informed strategic decisions to be made.”

Artificial Intelligence and National Security
To prepare for a future in which artificial intelligence plays a heavy or dominant role in warfare and military strategy, IARPA commissioned the Harvard Belfer Center to study the issue. The center’s August 2017 report, “Artificial Intelligence and National Security,” offers a series of recommendations, including:

  • Wargames and strategy – The Defense Department should conduct AI-focused wargames to identify potentially disruptive military innovations. It should also fund diverse, long-term strategic analyses to better understand the impact and implications of advanced AI technologies
  • Prioritize investment – Building on strategic analysis, defense and intelligence agencies should prioritize AI research and development investment on technologies and applications that will either provide sustainable strategic advantages or mitigate key risks
  • Counter threats – Because others will also have access to AI technology, investing in “counter-AI” capabilities for both offense and defense is critical to long-term security. This includes developing technological solutions for countering AI-enabled forgery, such as faked audio or video evidence
  • Basic research – The speed of AI development in commercial industry does not preclude specific security requirements in which strategic investment can yield substantial returns. Increased investment in AI-related basic research through DARPA, IARPA, the Office of Naval Research and the National Science Foundation, are critical to achieving long-term strategic advantage
  • Commercial development – Although DoD cannot expect to be a dominant investor in AI technology, increased investment through In-Q-Tel and other means can be critical in attaining startup firms’ interest in national security applications

Building Resiliency
Looking at cybersecurity another way, AI can also be used to rapidly identify and repair software vulnerabilities, said Brian Pierce, director of the Information Innovation Office at the Defense Advanced Research Projects Agency (DARPA).

“We are using automation to engage cyber attackers in machine time, rather than human time,” he said. Using automation developed under DARPA funding, he said machine-driven defenses have demonstrated AI-based discovery and patching of software vulnerabilities. “Software flaws can last for minutes, instead of as long as years,” he said. “I can’t emphasize enough how much this automation is a game changer in strengthening cyber resiliency.”

Such advanced, cognitive ACD systems employ the gamut of detection tools and techniques, from heuristics to characteristic and signature-based identification, says Red Alpha’s Trevino. “These systems will be self-learning and self-healing, and if compromised, will be able to terminate and reconstitute themselves in an alternative virtual environment, having already learned the lessons of the previous engagement, and incorporated the required capabilities to survive. All of this will be done in real time.”

Seen in that context, AI is just the latest in a series of technologies the U.S. has used as a strategic force multiplier. Just as precision weapons enabled the U.S. Air Force to inflict greater damage with fewer bombs – and with less risk – AI can be used to solve problems that might otherwise take hundreds or even thousands of people. The promise is that instead of eyeballing thousands of images a day or scanning millions of network actions, computers can do the first screening, freeing up analysts for the harder task of interpreting results, says Dennis Gibbs, technical strategist, Intelligence and Security programs at General Dynamics Information Technology. “But just because the technology can do that, doesn’t mean it’s easy. Integrating that technology into existing systems and networks and processes is as much art as science. Success depends on how well you understand your customer. You have to understand how these things fit together.”

In a separate project, DARPA collaborated with a Fortune 100 company that was moving more than a terabyte of data per day across its virtual private network, and generating 12 million network events per day – far beyond the human ability to track or analyze. Using automated tools, however, the project team was able to identify a single unauthorized IP address that successfully logged into 3,336 VPN accounts over seven days.

Mathematically speaking, Pierce said, “The activity associated with this address was close to 9 billion network events with about a 1 in 10 chance of discovery.” The tipoff was a flaw in the botnet that attacked the network: Attacks were staged at exactly 57-minute intervals. Not all botnets of course, will make that mistake. But even pseudo random timing can also be detected. He added: “Using advanced signal processing methods applied to billions of network events over weeks and months-long timelines, we have been successful at finding pseudo random botnets.”
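
The 57-minute tell suggests a simple first-pass check any analyst could run: measure the regularity of inter-event intervals per source address and flag sources that look metronomic. The sketch below uses synthetic timestamps and an invented scoring function; DARPA’s actual signal-processing methods are far more sophisticated.

```python
# First-pass check for machine-like regularity in login timestamps.
# Synthetic data; real botnet hunting uses far more robust signal processing.
import numpy as np

def regularity_score(timestamps_sec: np.ndarray) -> float:
    """Coefficient of variation of inter-event intervals: near 0 means
    metronome-like timing, typical of scripted activity."""
    intervals = np.diff(np.sort(timestamps_sec))
    return float(np.std(intervals) / np.mean(intervals))

rng = np.random.default_rng(0)
# Logins roughly every 57 minutes, with a little jitter
bot = np.cumsum(np.full(100, 57 * 60) + rng.normal(0, 5, 100))
# Human logins at random times over the same span
human = np.sort(rng.uniform(0, bot[-1], 100))

print(f"bot score:   {regularity_score(bot):.3f}")    # close to 0
print(f"human score: {regularity_score(human):.3f}")  # close to 1
```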

On the flip side, however, is the recognition that AI superiority will not be a given in cyberspace. Unlike air, land, sea or space, cyber is a man-made warfare domain. So it’s fitting that the fight there could end up being machine vs. machine.

The Harvard Artificial Intelligence and National Security study notes emphatically that while AI will make it easier to sort through ever greater volumes of intelligence, “it will also be much easier to lie persuasively.” The use of Photoshop and other image editors is well understood and has been for years. But recent advances in video editing have made it reasonably easy to forge audio and video files.

A trio of University of Washington researchers announced in a research paper published in July that they had used AI to synthesize a photorealistic, lip-synced video of former President Barack Obama. While the researchers used real audio, it’s easy to see the dangers posed if audio is also manipulated.

While the authors describe potential positive uses of the technology – such as “the ability to generate high-quality video from audio [which] could significantly reduce the amount of bandwidth needed in video coding/transmission” – potential nefarious uses are just as clear.

Five Federal IT Trends to Watch in 2018

Out with the old, in with the new. As the new year turns, it’s worth looking back on where we’ve been to better grasp where we’re headed tomorrow.

Here are five trends that took off in the year past and will shape the year ahead:

1. Modernization
The White House spent most of 2017 building its comprehensive Report to the President on Federal IT Modernization, and it will spend most of 2018 executing the details of that plan. One part assessment, one part roadmap, the report defines the IT challenges agencies face and lays out a future that will radically alter the way feds manage and acquire information technology.

The plan calls for consolidating federal networks and pooling resources and expertise by adopting common, shared services. Those steps will accelerate cloud adoption and centralize control over many commodity IT services. The payoff, officials argue: “A modern Federal IT architecture where agencies are able to maximize secure use of cloud computing, modernize Government-hosted applications and securely maintain legacy systems.”

What that looks like will play out over the coming months as agencies respond to a series of information requests and leaders at the White House, the Office of Management and Budget, the Department of Homeland Security and the National Institute of Standards and Technology respond to 50 task orders by July 4, 2018.

Among them:

  • A dozen recommendations to prioritize modernization of high-risk, high-value assets (HVAs)
  • 11 recommendations to modernize both Trusted Internet Connections (TIC) and the National Cybersecurity Protection System (NCPS) to improve security while enabling those systems to migrate to cloud-based solutions
  • 8 recommendations to support agencies’ adoption of shared services to accelerate adoption of commercial cloud services and infrastructure
  • 10 recommendations designed to accelerate broad adoption of commercial cloud-based email and collaboration services, such as Microsoft Office 365 or Google’s G-Suite services
  • 8 recommendations to improve existing shared services and expand such offerings, especially to smaller agencies

The devil will be in the details. Some dates to keep in mind: Updating the Federal Cloud Computing Strategy (new report due April 30); a plan to standardize cloud contract language (due from OMB by June 30); a plan to improve the speed, reliability and reuse of authority to operate (ATO) approvals for both software-as-a-service (SaaS) and other shared services (April 1).

2. CDM’s Eye on Cyber
The driving force behind the modernization plan is cybersecurity, and a key to the government’s cyber strategy is the Department of Homeland Security’s (DHS) Continuous Diagnostics and Mitigation (CDM) program. DHS will expand CDM to “enable a layered security architecture that facilitates transition to modern computing in the commercial cloud.”

Doing so means changing gears. CDM’s Phase 1, now being deployed, is designed to identify what’s connected to federal networks. Phase 2 will identify the people on the network. Phase 3 will identify activity on the network and include the ability to identify and analyze anomalies for signs of compromise, and Phase 4 will focus on securing government data.

Now CDM will have to adopt a new charge: securing government systems in commercial clouds, something not included in the original CDM plan.

“A challenge in implementing CDM capabilities in a more cloud-friendly architecture is that security teams and security operations centers may not necessarily have the expertise available to defend the updated architecture,” DHS officials write in the modernization report. “The Federal Government is working to develop this expertise and provide it across agencies through CDM.” The department is developing a security-as-a-service model with the intent of expanding CDM’s reach beyond the 68 agencies currently using the program to include all civilian federal agencies, large and small.

3. Protecting Critical Infrastructure
Securing federal networks and data is one thing, but 85 percent of the nation’s critical infrastructure is in private, not public hands. Figuring out how best to protect privately owned critical national infrastructure, such as the electric grid, gas and oil pipelines, dams and rivers and levees, public communications networks and other critical infrastructure, has long been a thorny issue.

The private sector has historically enjoyed the freedom of managing its own security – and privacy. However, growing cyber and terrorist threats and the potential liability that could stem from such attacks mean those businesses also like having the guiding hand and cooperation of federal regulators.

To date, this responsibility has taken root in DHS’s National Protection and Programs Directorate (NPPD), which operates largely beneath the public radar. That soon could change: The House voted in December to elevate NPPD to be the next operational component within DHS, joining the likes of Customs and Border Protection, the Coast Guard and the Secret Service.

NPPD would become the Cybersecurity and Infrastructure Security Agency, and while the new status would not explicitly expand its portfolio, it would pave the way for increased influence within DHS and a greater voice in the national debate.

First, it’s got to clear the Senate. The Cybersecurity and Infrastructure Security Agency Act of 2017 faces an uncertain future in the upper chamber because of complex jurisdictional issues, and a gridlocked legislative process that makes passage of any bill an adventure — even if as in this case, that bill has the active backing of both the White House and DHS leadership.

4. Standards for IoT
The Internet of Things (IoT), the Internet of Everything, the wireless, connected world – call it what you will – is challenging the makers of industrial controls and network-connected technology to rethink security and their entire supply chains.

If a lightbulb, camera, motion detector – or any number of other sensors – can be controlled via networks, they can also be co-opted by bad actors in cyberspace. But while manufacturers have been quick to field network-enabled products, most have been slow to ensure those products are safe from hackers and abuse.

Rep. Jim Langevin (D-R.I.) advocates legislation to mandate better security in connected devices. “We need to ensure we approach the security of the Internet of Things with the techniques that have been successful with the smart phone and desktop computers: The policies of automatic patching, authentication and encryption that have worked in those domains need to be extended to all devices that are connected to the Internet,” he said last summer. “I believe the government can act as a convener to work with private industry in this space.”

The first private standard for IoT devices was approved in July when the American National Standards Institute (ANSI) endorsed UL 2900-1, General Requirements for Software Cybersecurity for Network-Connectable Products. Underwriters Laboratories (UL) plans two additional standards to follow: UL 2900-2-1, for network-connectable healthcare systems, and UL 2900-2-2, for industrial controls.

Sens. Mark R. Warner (D-Va.) and Cory Gardner (R-Colo.), co-chairs of the Senate Cybersecurity Caucus, introduced The Internet of Things Cybersecurity Improvement Act of 2017 in August with an eye toward holding suppliers responsible for providing insecure connected products to the federal government.

The bill would require vendors supplying IoT devices to the U.S. government to ensure their devices are patchable, not hard-coded with unchangeable passwords and are free of known security vulnerabilities. It would also require automatic, authenticated security updates from the manufacturer. The measure has been criticized for its vague definitions and language and for limiting its scope to products sold to the federal government.

Yet in a world where cybersecurity is a growing liability concern for businesses of every stripe – and where there is a dearth of industry standards – such a measure could become a benchmark requirement imposed by non-government customers, as well.

5. Artificial Intelligence
2017 was the year when data analytics morphed into artificial intelligence (AI) in the public mindset. Government agencies are only now making the connection that their massive data troves could fuel a revolution in how they manage, fuse and use data to make decisions, deliver services and interact with the public.

According to market researcher IDC, that realization is not limited to government: “By the end of 2018,” the firm predicts, “half of manufacturers will be using analytics, IoT, and social collaboration tools to extend the integrated planning process across the entire enterprise, in real time.”

Gartner goes even further: “The ability to use AI to enhance decision making, reinvent business models and ecosystems, and remake the customer experience will drive the payoff for digital initiatives through 2025,” the company predicts. More than half of businesses and agencies are still searching for strategies, however.

“Enterprises should focus on business results enabled by applications that exploit narrow AI technologies,” says David Cearley, vice president and Gartner Fellow, Gartner Research. “Leave general AI to the researchers and science fiction writers.”

AI and machine learning will not be stand-alone functions, but rather foundational components that underlie the applications and services agencies employ, Cearley says. For example, natural language processing – think of Amazon’s Alexa or Apple’s Siri – can now handle increasingly complicated tasks, promising more sophisticated, faster interactions when the public calls a government 800 number.

Michael G. Rozendaal, vice president for health analytics at General Dynamics Information Technology’s Health and Civilian Solutions Division, says today’s challenge with AI is two-fold: first, finding the right applications that provide a real return on investment, and second, overcoming security and privacy concerns.

“There comes a tipping point where challenges and concerns fade and the floodgates open to take advantage of a new technology,” Rozendaal told GovTechWorks. “Over the coming year, the speed of those successes and lessons learned will push AI to that tipping point.”

What this Means for Federal IT
Federal agencies face tipping points across the technology spectrum. The pace of change quickens as the pressure to modernize increases. While technology is an enabler, new skills will be needed for cloud integration, shared services security and agile development. Similarly, the emergence of new products, services and providers, greatly expand agencies’ choices. But each of those choices has its own downstream implications and risks, from vendor lock-in to bandwidth and run-time challenges. With each new wrinkle, agency environments become more complex, demanding ever more sophisticated expertise from those pulling those hybrid environments together.

“Cybersecurity will be the linchpin in all this,” says Stan Tyliszczak, vice president and chief engineer at GDIT. “Agencies can no longer afford the cyber risks of NOT modernizing their IT. It’s not whether or not to modernize, but how fast can we get there? How much cyber risk do we have in the meantime?”

Cloud is ultimately a massive integration exercise with no one-size-fits-all answers. Agencies will employ multiple systems in multiple clouds for multiple kinds of users. Engineering solutions to make those systems work harmoniously is essential.

“Turning on a cloud service is easy,” Tyliszczak says. “Integrating it with the other things you do – and getting that integration right – is where agencies will need the greatest help going forward.”
