How AI Is Transforming Defense and Intelligence Technologies

A Harvard Belfer Center study commissioned by the Intelligence Advanced Research Projects Agency (IARPA), Artificial Intelligence and National Security, predicts that AI will be as transformative to national defense as nuclear weapons, aircraft, computers and biotech.

Advances in AI will enable new capabilities and make others far more affordable – not only to the U.S., but to adversaries as well, raising the stakes as the United States seeks to preserve its hard-won strategic overmatch in the air, land, sea, space and cyberspace domains.

The Pentagon’s Third Offset Strategy seeks to leverage AI and related technologies in a variety of ways, according to Robert Work, former deputy secretary of defense and one of the strategy’s architects. In a foreword to a new report from the market analytics firm Govini, Work says the strategy “seeks to exploit advances in AI and autonomous systems to improve the performance of Joint Force guided munitions battle networks” through:

  • Deep learning machines, powered by artificial neural networks and trained with big data sets
  • Advanced human-machine collaboration in which AI-enabled learning machines help humans make more timely and relevant combat decisions
  • AI devices that allow operators of all types to “plug into and call upon the power of the entire Joint Force battle network to accomplish assigned missions and tasks”
  • Human-machine combat teaming of manned and unmanned systems
  • Cyber- and electronic warfare-hardened, network-enabled, autonomous and high-speed weapons capable of collaborative attacks

“By exploiting advances in AI and autonomous systems to improve the warfighting potential and performance of the U.S. military,” Work says, “the strategy aims to restore the Joint Force’s eroding conventional overmatch versus any potential adversary, thereby strengthening conventional deterrence.”

Spending is growing, Govini reports: AI and related defense program spending increased at a compound annual rate of 14.5 percent from 2012 to 2017, and is poised to grow substantially faster in coming years as advanced computing technologies come online, driving down computational costs.
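
For a sense of what that rate compounds to, a back-of-the-envelope sketch (index values, not actual budget figures):

```python
# Illustrative arithmetic only: a 14.5 percent compound annual growth
# rate over the five-year window Govini measured.
base = 100.0  # hypothetical 2012 spending, indexed to 100
rate = 0.145  # 14.5 percent CAGR
years = 5     # 2012 through 2017

projected = base * (1 + rate) ** years
print(f"Index after {years} years: {projected:.1f}")  # ~196.8, nearly double
```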

But in practical terms, what does that mean? How will AI change the way defense technology is managed, the way we gather and analyze intelligence or protect our computer systems?

Charlie Greenbacker, vice president of analytics at In-Q-Tel in Arlington, Va., the intelligence community’s strategic investment arm, sees dramatic changes ahead.

“The incredible ability of technology to automate parts of the intelligence cycle is a huge opportunity,” he said at an AI summit produced by the Advanced Technology Academic Research Center and Intel in November. “I want humans to focus on more challenging, high-order problems and not the mundane problems of the world.”

The opportunities are possible because of the advent of new, more powerful processing techniques, whether by distributing workloads across a cloud infrastructure or by using specialty processors purpose-built to do this kind of math. “Under the hood, deep learning is really just algebra,” he says. “Specialized processing lets us do this a lot faster.”
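
To make Greenbacker’s point concrete: a single layer of a deep network reduces to a matrix multiply plus a simple nonlinearity, exactly the kind of math that cloud clusters and purpose-built processors accelerate. A minimal sketch (random stand-in weights, illustrative shapes only):

```python
import numpy as np

# One "layer" of a deep network: a matrix multiply (the algebra)
# followed by a nonlinearity. Weights here are random stand-ins.
rng = np.random.default_rng(0)

x = rng.standard_normal(128)        # input features, e.g. pixel values
W = rng.standard_normal((64, 128))  # learned weights
b = rng.standard_normal(64)         # learned biases

hidden = np.maximum(0, W @ x + b)   # linear algebra plus ReLU activation
print(hidden.shape)                 # (64,) activations feeding the next layer
```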

Computer vision is one focus of interest – learning to identify faces in crowds or objects in satellite or other surveillance images – as is identifying anomalies in cyber security or text-heavy data searches. “A lot of folks spend massive amounts of time sifting through text looking for needles in the haystack,” Greenbacker continued.

The Air Force is looking at AI to help more quickly identify potential cyber attacks, said Frank Konieczny, chief technology officer in the office of the Air Force chief information officer, speaking at CyberCon 2017 in November. “We’re looking at various ways of adjusting the network or adjusting the topology based upon threats, like software-defined network capabilities as well as AI-based analysis,” he said.

Marty Trevino Jr., a former technical director and strategist for the National Security Agency, is now chief data and analytics officer at Red Alpha, a tech firm based in Annapolis Junction, Md. “We are all familiar with computers beating humans in complex games – chess, Go, and so on,” Trevino says. “But experiments are showing that when humans are mated with those same computers, they beat the computer every time. It’s this unique combination of man and machine – each doing what its brain does best – that will constitute the active cyber defense (ACD) systems of tomorrow.”

Machines best humans when a task is highly defined and must be performed at speed and scale. “With all the hype around artificial intelligence, it is important to understand that AI is only fantastic at performing the specific tasks for which it is designed,” Trevino says. “Otherwise AI can be very dumb.”

Humans on the other hand, are better than machines when it comes to putting information in context. “While the human brain cannot match AI in specific realms,” he adds, “it is unmatched in its ability to process complex contextual information in dynamic environments. In cyber, context is everything. Context enables data-informed strategic decisions to be made.”

Artificial Intelligence and National Security
To prepare for a future in which artificial intelligence plays a heavy, even dominant, role in warfare and military strategy, IARPA commissioned the Harvard Belfer Center to study the issue. The center’s August 2017 report, “Artificial Intelligence and National Security,” offers a series of recommendations, including:

  • Wargames and strategy – The Defense Department should conduct AI-focused wargames to identify potentially disruptive military innovations. It should also fund diverse, long-term strategic analyses to better understand the impact and implications of advanced AI technologies
  • Prioritize investment – Building on strategic analysis, defense and intelligence agencies should prioritize AI research and development investment on technologies and applications that will either provide sustainable strategic advantages or mitigate key risks
  • Counter threats – Because others will also have access to AI technology, investing in “counter-AI” capabilities for both offense and defense is critical to long-term security. This includes developing technological solutions for countering AI-enabled forgery, such as faked audio or video evidence
  • Basic research – The speed of AI development in commercial industry does not preclude specific security requirements in which strategic investment can yield substantial returns. Increased investment in AI-related basic research through DARPA, IARPA, the Office of Naval Research and the National Science Foundation, are critical to achieving long-term strategic advantage
  • Commercial development – Although DoD cannot expect to be a dominant investor in AI technology, increased investment through In-Q-Tel and other means can be critical in attaining startup firms’ interest in national security applications

Building Resiliency
Looking at cybersecurity another way, AI can also be used to rapidly identify and repair software vulnerabilities, said Brian Pierce, director of the Information Innovation Office at the Defense Advanced Research Projects Agency (DARPA).

“We are using automation to engage cyber attackers in machine time, rather than human time,” he said. Using automation developed under DARPA funding, he said machine-driven defenses have demonstrated AI-based discovery and patching of software vulnerabilities. “Software flaws can last for minutes, instead of as long as years,” he said. “I can’t emphasize enough how much this automation is a game changer in strengthening cyber resiliency.”

Such advanced, cognitive ACD systems employ the gamut of detection tools and techniques, from heuristics to characteristic and signature-based identification, says Red Alpha’s Trevino. “These systems will be self-learning and self-healing, and if compromised, will be able to terminate and reconstitute themselves in an alternative virtual environment, having already learned the lessons of the previous engagement, and incorporated the required capabilities to survive. All of this will be done in real time.”

Seen in that context, AI is just the latest in a series of technologies the U.S. has used as a strategic force multiplier. Just as precision weapons enabled the U.S. Air Force to inflict greater damage with fewer bombs – and with less risk – AI can be used to solve problems that might otherwise take hundreds or even thousands of people. The promise is that instead of eyeballing thousands of images a day or scanning millions of network actions, computers can do the first screening, freeing up analysts for the harder task of interpreting results, says Dennis Gibbs, technical strategist, Intelligence and Security programs at General Dynamics Information Technology. “But just because the technology can do that, doesn’t mean it’s easy. Integrating that technology into existing systems and networks and processes is as much art as science. Success depends on how well you understand your customer. You have to understand how these things fit together.”

In a separate project, DARPA collaborated with a Fortune 100 company that was moving more than a terabyte of data per day across its virtual private network, and generating 12 million network events per day – far beyond the human ability to track or analyze. Using automated tools, however, the project team was able to identify a single unauthorized IP address that successfully logged into 3,336 VPN accounts over seven days.

Mathematically speaking, Pierce said, “The activity associated with this address was close to 9 billion network events with about a 1 in 10 chance of discovery.” The tipoff was a flaw in the botnet that attacked the network: Attacks were staged at exactly 57-minute intervals. Not all botnets, of course, will make that mistake, but even pseudo-random timing can be detected. He added: “Using advanced signal processing methods applied to billions of network events over weeks- and months-long timelines, we have been successful at finding pseudo-random botnets.”
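
The underlying intuition is easy to demonstrate. The toy regularity test below – emphatically not DARPA’s actual signal-processing pipeline – shows why clockwork timing stands out against human activity:

```python
import numpy as np

def interval_regularity(timestamps):
    """Coefficient of variation of inter-arrival gaps; near 0 = clockwork."""
    gaps = np.diff(np.sort(timestamps))
    return gaps.std() / gaps.mean()

rng = np.random.default_rng(0)
# Human-driven logins: irregular, roughly hourly (exponential gaps).
human = np.cumsum(rng.exponential(3600, 200))
# Botnet beacons: every 57 minutes, with a few seconds of jitter.
botnet = np.arange(200) * 57 * 60 + rng.normal(0, 5, 200)

print(f"human:  {interval_regularity(human):.2f}")   # ~1.0, noisy
print(f"botnet: {interval_regularity(botnet):.4f}")  # ~0.002, periodic
```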

On the flip side, however, is the recognition that AI superiority will not be a given in cyberspace. Unlike air, land, sea or space, cyber is a man-made warfare domain. So it’s fitting that the fight there could end up being machine vs. machine.

The Harvard Artificial Intelligence and National Security study notes emphatically that while AI will make it easier to sort through ever greater volumes of intelligence, “it will also be much easier to lie persuasively.” The use of Photoshop and other image editors is well understood and has been for years. But recent advances in video editing have made it reasonably easy to forge audio and video files.

A trio of University of Washington researchers announced in a research paper published in July that they had used AI to synthesize a photorealistic, lip-synced video of former President Barack Obama. While the researchers used real audio, it’s easy to see the dangers posed if audio is also manipulated.

While the authors describe potential positive uses of the technology – such as “the ability to generate high-quality video from audio [which] could significantly reduce the amount of bandwidth needed in video coding/transmission” – potential nefarious uses are just as clear.

Automation Critical to Securing Code in an Agile, DevOps World

The world’s biggest hack might have happened to anyone. The same software flaw hackers exploited to expose 145 million identities in the Equifax database – most likely yours included – was also embedded in thousands of other computer systems belonging to all manner of businesses and government agencies.

The software in question was a commonly used open-source piece of Java code known as Apache Struts. The Department of Homeland Security’s U.S. Computer Emergency Readiness Team (US-CERT) discovered a flaw in that code and issued a warning March 8, detailing the risk posed by the flaw. Like many others, Equifax reviewed the warning and searched its systems for the affected code. Unfortunately, the Atlanta-based credit bureau failed to find it among the millions of lines of code in its systems. Hackers exploited the flaw three days later.

Open source and third-party software components like Apache Struts now make up between 80 and 90 percent of software produced today, says Derek Weeks, vice president and DevOps advocate at Sonatype. The company provides security tools and manages the world’s largest collection of open source components, The Central Repository, from which programmers completed nearly 60 billion downloads in 2017 alone.

“If you are a software developer in any federal agency today, you are very aware that you are using open-source and third-party [software] components in development today,” Weeks says. “The average organization is using 125,000 Java open source components – just Java alone. But organizations aren’t just developing in Java, they’re also developing in JavaScript, .Net, Python and other languages. So that number goes up by double or triple.”

Reusing software saves time and money. It’s also critical to supporting the rapid cycles favored by today’s Agile and DevOps methodologies. Yet while reuse promises time-tested code, it is not without risk: Weeks estimates one in 18 downloads from The Central Repository – 5.5 percent – contains a known vulnerability. Because it never deletes anything, the repository is a user-beware system. It’s up to software developers themselves – not the repository – to determine whether or not the software components they download are safe.

Manual Review or Automation?

Performing a manual, detailed security analysis of each open-source software component, to ensure it is safe and free of vulnerabilities, takes hours. That, in turn, eats into precious development time, undermining the intended efficiency of reusing code in the first place.

Tools from Sonatype, Black Duck of Burlington, Mass., and others automate most of that work. Sonatype’s Nexus Firewall, for example, scans components as they come into the development environment and stops them if they contain known flaws. It also suggests safe alternatives, such as newer versions of the same components. Development teams can employ a host of automated tools to simplify or speed other parts of the build, test and secure processes.
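
The gatekeeping logic is simple to sketch, even if the data behind it is not. A minimal illustration of such a policy check (the component names, versions and vulnerability list are hypothetical, not Sonatype’s actual feed):

```python
# Hypothetical vulnerability data for illustration only.
KNOWN_VULNERABLE = {
    ("org.apache.struts:struts2-core", "2.3.31"),
}
LATEST_SAFE = {"org.apache.struts:struts2-core": "2.3.35"}

def admit(component: str, version: str) -> bool:
    """Block known-vulnerable components at the door; suggest a fix."""
    if (component, version) in KNOWN_VULNERABLE:
        fix = LATEST_SAFE.get(component, "no safe version listed")
        raise ValueError(f"{component} {version} blocked; consider {fix}")
    return True

admit("org.apache.struts:struts2-core", "2.3.35")  # passes the gate
```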

Some of these are commercial products; others, like the software they scan, are themselves open source. For example, Jenkins is a popular open-source DevOps tool that helps developers quickly find and solve defects in their codebase. These tools focus on the reused code in a system, while static analysis tools, like those from Veracode, focus on the critical custom code that glues that open-source software together into a working system.

“Automation is key to agile development,” says Matthew Zach, director of software engineering at General Dynamics Information Technology’s (GDIT) Health Solutions. “The tools now exist to automate everything: the builds, unit tests, functional testing, performance testing, penetration testing and more. Ensuring the code behind new functionality not only works, but is also secure, is critical. We need to know that the stuff we’re producing is of high quality and meets our standards, and we try to automate as much of these reviews as possible.”

But automated screening and testing is still far from universal. Some use it, others don’t. Weeks describes one large financial services firm that prided itself on its software team’s rigorous governance process. Developers were required to ask permission from a security group before using open source components. The security team’s thorough reviews took about 12 weeks for new components and six to seven weeks for new versions of components already in use. Even so, officials estimated some 800 open source components had made it through those reviews and were in use in their 2,000-plus deployed applications.

Then, Sonatype was invited to scan the firm’s deployed software. “We found more than 13,000 open source components were running in those 2,000 applications,” Weeks recalls. “It’s not hard to see what happened. You’ve got developers working on two-week sprints, so what do you think they’re going to do? The natural behavior is, ‘I’ve got a deadline, I have to meet it, I have to be productive.’ They can’t wait 12 weeks for another group to respond.”

Automation, he said, is the answer.

Integration and the Supply Chain

Building software today is a lot like building a car: Rather than manufacture every component, from the screws to the tires to the seat covers, manufacturers focus their efforts on the pieces that differentiate products and outsource the commodity pieces to suppliers.

Chris Wysopal, chief technology officer at Veracode, said the average software application today uses 46 ready-made components. Like Sonatype, Veracode offers a testing tool that scans components for known vulnerabilities; its test suite also includes a static analysis tool to spot problems in custom code and a dynamic analysis tool that tests software in real time.

As development cycles get shorter, the demand for automation is increasing, Wysopal says. The five-year shift from waterfall to Agile shortened typical development cycles from months to weeks. The advent of DevOps and continuous development accelerates that further, from weeks to days or even hours.

“We’re going through this transition ourselves. When we started Veracode 11 years ago, we were a waterfall company. We did four to 10 releases a year,” Wysopal says. “Then we went to Agile and did 12 releases a year and now we’re making the transition to DevOps, so we can deploy on a daily basis if we need or want to. What we see in most of our customers is fragmented methodologies: It might be 50 percent waterfall, 40 percent agile and 10 percent DevOps. So they want tools that can fit into that DevOps pipeline.”

A tool built for speed can support slower development cycles; the opposite, however, is not the case.

One way to enhance testing is to let developers know sooner that they may have a problem. Veracode is developing a product that scans code as it’s written, running a scan every few seconds and alerting the developer as soon as a problem is spotted. This has two effects: first, problems get cleaned up more quickly; second, developers learn to avoid those problems in the first place. In that sense, it’s like spell check in a word processing program.
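
A toy sketch of that spell-check model: a loop that rescans source files every few seconds and flags risky patterns the moment they appear (the patterns and cadence are illustrative, not Veracode’s rules):

```python
import time
from pathlib import Path

RISKY = ("eval(", "pickle.loads(", "shell=True")  # illustrative rules only

def scan(path: Path):
    """Flag lines containing risky patterns, spell-check style."""
    for lineno, line in enumerate(path.read_text().splitlines(), 1):
        for pattern in RISKY:
            if pattern in line:
                print(f"{path}:{lineno}: possible issue near '{pattern}'")

while True:
    for source_file in Path("src").glob("**/*.py"):
        scan(source_file)
    time.sleep(5)  # rescan cadence: every few seconds
```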

“It’s fundamentally changing security testing for a just-in-time programming environment,” Wysopal says.

Yet as powerful and valuable as automation is, these tools alone will not make you secure.

“Automation is extremely important,” he says. “Everyone who’s doing software should be doing automation. And then manual testing on top of that is needed for anyone who has higher security needs.” He puts the financial industry and government users into that category.

For government agencies that contract for most of their software, understanding what kinds of tools and processes their suppliers have in place to ensure software quality is critical. That could mean hiring a third party to do security testing on software when it’s delivered, or it could mean requiring systems integrators and development firms to demonstrate their security processes and procedures before software is accepted.

“In today’s Agile-driven environment, software vulnerability can be a major source of potential compromise to sprint cadences for some teams,” says GDIT’s Zach. “We can’t build a weeks-long manual test and evaluation cycle into Agile sprints. Automated testing is the only way we can validate the security of our code while still achieving consistent, frequent software delivery.”

According to Veracode’s State of Software Security 2017, 36 percent of the survey’s respondents do not run (or were unaware of) automated static analysis on their internally developed software. Nearly half never conduct dynamic testing in a runtime environment. Worst of all, 83 percent acknowledge releasing software before testing or resolving security issues.

“The bottom line is all software needs to be tested. The real question for teams is what ratio and types of testing will be automated and which will be manual,” Zach says. “By exploiting automation tools and practices in the right ways, we can deliver the best possible software, as rapidly and securely as possible, without compromising the overall mission of government agencies.”

Five Federal IT Trends to Watch in 2018

Out with the old, in with the new. As the new year turns, it’s worth looking back on where we’ve been to better grasp where we’re headed tomorrow.

Here are five trends that took off in the year past and will shape the year ahead:

1. Modernization
The White House spent most of 2017 building its comprehensive Report to the President on Federal IT Modernization, and it will spend most of 2018 executing the details of that plan. One part assessment, one part roadmap, the report defines the IT challenges agencies face and lays out a future that will radically alter the way feds manage and acquire information technology.

The plan calls for consolidating federal networks and pooling resources and expertise by adopting common, shared services. Those steps will accelerate cloud adoption and centralize control over many commodity IT services. The payoff, officials argue: “A modern Federal IT architecture where agencies are able to maximize secure use of cloud computing, modernize Government-hosted applications and securely maintain legacy systems.”

What that looks like will play out over the coming months as agencies respond to a series of information requests and leaders at the White House, the Office of Management and Budget, the Department of Homeland Security and the National Institute of Standards and Technology respond to roughly 50 tasks by July 4, 2018.

Among them:

  • 12 recommendations to prioritize modernization of high-risk, high-value assets (HVAs)
  • 11 recommendations to modernize both Trusted Internet Connections (TIC) and the National Cybersecurity Protection System (NCPS) to improve security while enabling those systems to migrate to cloud-based solutions
  • Eight recommendations to support agencies’ adoption of shared services to accelerate adoption of commercial cloud services and infrastructure
  • 10 recommendations designed to accelerate broad adoption of commercial cloud-based email and collaboration services, such as Microsoft Office 365 or Google’s G-Suite services
  • Eight recommendations to improve existing shared services and expand such offerings, especially to smaller agencies

The devil will be in the details. Some dates to keep in mind: Updating the Federal Cloud Computing Strategy (new report due April 30); a plan to standardize cloud contract language (due from OMB by June 30); a plan to improve the speed, reliability and reuse of authority to operate (ATO) approvals for both software-as-a-service (SaaS) and other shared services (April 1).

2. CDM’s Eye on Cyber
The driving force behind the modernization plan is cybersecurity, and a key to the government’s cyber strategy is the Department of Homeland Security’s (DHS) Continuous Diagnostics and Mitigation (CDM) program. DHS will expand CDM to “enable a layered security architecture that facilitates transition to modern computing in the commercial cloud.”

Doing so means changing gears. CDM’s Phase 1, now being deployed, is designed to identify what’s connected to federal networks. Phase 2 will identify the people on the network. Phase 3 will identify activity on the network and include the ability to identify and analyze anomalies for signs of compromise, and Phase 4 will focus on securing government data.

Now CDM will have to adopt a new charge: securing government systems in commercial clouds, something not included in the original CDM plan.

“A challenge in implementing CDM capabilities in a more cloud-friendly architecture is that security teams and security operations centers may not necessarily have the expertise available to defend the updated architecture,” DHS officials write in the modernization report. “The Federal Government is working to develop this expertise and provide it across agencies through CDM.” The department is developing a security-as-a-service model with the intent of expanding CDM’s reach beyond the 68 agencies currently using the program to include all civilian federal agencies, large and small.

3. Protecting Critical Infrastructure
Securing federal networks and data is one thing, but 85 percent of the nation’s critical infrastructure is in private, not public, hands. Figuring out how best to protect privately owned critical infrastructure – the electric grid, gas and oil pipelines, dams and levees, public communications networks and more – has long been a thorny issue.

The private sector has historically enjoyed the freedom of managing its own security – and privacy. However, growing cyber and terrorist threats, and the potential liability that could stem from such attacks, mean those businesses also like having the guiding hand and cooperation of federal regulators.

To date, this responsibility has taken root in DHS’s National Protection and Programs Directorate (NPPD), which operates largely beneath the public radar. That soon could change: The House voted in December to elevate NPPD to be the next operational component within DHS, joining the likes of Customs and Border Protection, the Coast Guard and the Secret Service.

NPPD would become the Cybersecurity and Infrastructure Security Agency, and while the new status would not explicitly expand its portfolio, it would pave the way for increased influence within the department and a greater voice in the national debate.

First, it’s got to clear the Senate. The Cybersecurity and Infrastructure Security Agency Act of 2017 faces an uncertain future in the upper chamber because of complex jurisdictional issues, and a gridlocked legislative process that makes passage of any bill an adventure — even if as in this case, that bill has the active backing of both the White House and DHS leadership.

4. Standards for IoT
The Internet of Things (IoT), the Internet of Everything, the wireless, connected world – call it what you will – is challenging the makers of industrial controls and network-connected technology to rethink security and their entire supply chains.

If a lightbulb, camera, motion detector – or any number of other sensors – can be controlled via a network, it can also be co-opted by bad actors in cyberspace. But while manufacturers have been quick to field network-enabled products, most have been slow to ensure those products are safe from hackers and abuse.

Rep. Jim Langevin (D-R.I.) advocates legislation to mandate better security in connected devices. “We need to ensure we approach the security of the Internet of Things with the techniques that have been successful with the smart phone and desktop computers: The policies of automatic patching, authentication and encryption that have worked in those domains need to be extended to all devices that are connected to the Internet,” he said last summer. “I believe the government can act as a convener to work with private industry in this space.”

The first private standard for IoT devices was approved in July when the American National Standards Institute (ANSI) endorsed UL 2900-1, General Requirements for Software Cybersecurity for Network-Connectable Products. Underwriters Laboratories (UL) plans two additional standards to follow: UL 2900-2-1 for network-connectable healthcare systems and UL 2900-2-2 for industrial control systems.

Sens. Mark R. Warner (D-Va.) and Cory Gardner (R-Colo.), co-chairs of the Senate Cybersecurity Caucus, introduced the Internet of Things Cybersecurity Improvement Act of 2017 in August with an eye toward holding suppliers responsible for providing insecure connected products to the federal government.

The bill would require vendors supplying IoT devices to the U.S. government to ensure their devices are patchable, not hard-coded with unchangeable passwords and are free of known security vulnerabilities. It would also require automatic, authenticated security updates from the manufacturer. The measure has been criticized for its vague definitions and language and for limiting its scope to products sold to the federal government.

Yet in a world where cybersecurity is a growing liability concern for businesses of every stripe – and where there is a dearth of industry standards – such a measure could become a benchmark requirement imposed by non-government customers, as well.

5. Artificial Intelligence
2017 was the year when data analytics morphed into artificial intelligence (AI) in the public mindset. Government agencies are only now making the connection that their massive data troves could fuel a revolution in how they manage, fuse and use data to make decisions, deliver services and interact with the public.

According to market researcher IDC, that realization is not limited to government: “By the end of 2018,” the firm predicts, “half of manufacturers will be using analytics, IoT, and social collaboration tools to extend the integrated planning process across the entire enterprise, in real time.”

Gartner goes even further: “The ability to use AI to enhance decision making, reinvent business models and ecosystems, and remake the customer experience will drive the payoff for digital initiatives through 2025,” the company predicts. More than half of businesses and agencies are still searching for strategies, however.

“Enterprises should focus on business results enabled by applications that exploit narrow AI technologies,” says David Cearley, vice president and Gartner Fellow, Gartner Research. “Leave general AI to the researchers and science fiction writers.”

AI and machine learning will not be stand-alone functions, but rather foundational components that underlie the applications and services agencies employ, Cearley says. For example, natural language processing – think of Amazon’s Alexa or Apple’s Siri – can now handle increasingly complicated tasks, promising more sophisticated, faster interactions when the public calls a government 800 number.

Michael G. Rozendaal, vice president for health analytics at General Dynamics Information Technology’s Health and Civilian Solutions Division, says today’s challenge with AI is twofold: first, finding the right applications that provide a real return on investment; and second, overcoming security and privacy concerns.

“There comes a tipping point where challenges and concerns fade and the floodgates open to take advantage of a new technology,” Rozendaal told GovTechWorks. “Over the coming year, the speed of those successes and lessons learned will push AI to that tipping point.”

What this Means for Federal IT
Federal agencies face tipping points across the technology spectrum. The pace of change quickens as the pressure to modernize increases. While technology is an enabler, new skills will be needed for cloud integration, shared-services security and agile development. Similarly, the emergence of new products, services and providers greatly expands agencies’ choices. But each of those choices has its own downstream implications and risks, from vendor lock-in to bandwidth and run-time challenges. With each new wrinkle, agency environments become more complex, demanding ever more sophisticated expertise from those pulling those hybrid environments together.

“Cybersecurity will be the linchpin in all this,” says Stan Tyliszczak, vice president and chief engineer at GDIT. “Agencies can no longer afford the cyber risks of NOT modernizing their IT. It’s not whether or not to modernize, but how fast can we get there? How much cyber risk do we have in the meantime?”

Cloud is ultimately a massive integration exercise with no one-size-fits-all answers. Agencies will employ multiple systems in multiple clouds for multiple kinds of users. Engineering solutions to make those systems work harmoniously is essential.

“Turning on a cloud service is easy,” Tyliszczak says. “Integrating it with the other things you do – and getting that integration right – is where agencies will need the greatest help going forward.”

Related Articles

AFDC Cyber Summit18 300×600
WEST Conference
NPR Morning Edition 250×250

Upcoming Events

AFDC Cyber Summit18 250×250
GDIT Recruitment 250×250
USNI News: 250×250
Nextgov Newsletter 250×250
GDIT HCSD SCM 1 250×250 Train
In the Age of Agile, Feds Rally

In the Age of Agile, Feds Rally

Agile software development is now the dominant approach to software engineering, with adoption rates reaching 55 percent in late 2016, according to research from Computer Economics of Irvine, Calif.

Today, the incremental development trend is everywhere – even in places you wouldn’t expect. “I was surprised at the degree to which agile software development and agile DevOps are being used in programs,” said Maj. Gen. Sarah Zabel, who recently became the Air Force’s director of IT acquisition process development to help the Air Force acquire systems faster. “F-35 is doing agile. F-22 is doing agile. There are so many projects doing some type of agile development. … I’ve seen between 25 and 30 in the past couple of months.”

And she wants to see more.

Zabel’s job was created for her with a specific mission in mind: “It was an expression of the frustration from our secretary and chief of staff,” she said at the Defense Systems Summit in November. “Why does it take us eight to 10 years to develop systems that will be wickedly expensive and which we know won’t be what we need when it’s finally delivered?”

There are a host of reasons, of course. Risk-averse procurement officers, arcane procurement rules and grindingly slow requirements processes are prime causes, but so are old-fashioned waterfall development processes that separate users from developers and assume that nothing will change between the time the requirements are set and the system is delivered.

Agile development, by contrast, breaks down those requirements into smaller, more manageable pieces that can be completed in short “sprints” of one to four weeks. Daily scrums bring all interested parties together to share current task information and discuss any impediments. At the conclusion of a sprint, customers get an early look at functionality and the opportunity not only to approve, but to see possibilities they hadn’t imagined before. Other times, that early access affords the opportunity to “de-scope” requirements and eliminate waste; everyone becomes more vested – and accountable – in the development process.

“I’d have a very hard time going back to waterfall,” says Matthew Zach, director of software engineering at General Dynamics Information Technology’s Health Solutions, who has been working in an agile environment since 2009. “Once you transition teams to this, they like it. The customer likes it, too.”

The 2001 Agile Manifesto launched a revolution in customer-centered software development. In the process, its authors laid out 12 underlying principles that can be applied to software development and, indeed, to many other kinds of projects, as well:

  1. Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
  2. Welcome changing requirements, even late in development. Agile processes harness change for the customer’s competitive advantage.
  3. Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.
  4. Business people and developers must work together daily throughout the project.
  5. Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.
  6. The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.
  7. Working software is the primary measure of progress.
  8. Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.
  9. Continuous attention to technical excellence and good design enhances agility.
  10. Simplicity – the art of maximizing the amount of work not done – is essential.
  11. The best architectures, requirements, and designs emerge from self-organizing teams.
  12. At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.

Teams, in fact, are a key to the agile philosophy. Close-knit teams work together like well-oiled machines. “If one person’s tasks for a sprint are done,” Zach said, “we want them to look at the board and see how they can help others on the team. It’s a high-trust, high-competence situation. We trust the team will produce. It all revolves around the team: the overall team succeeds or stumbles. Individual heroics are not as prevalent or necessary.”

The board is a means of communicating status and progress to the group. Agile practitioners may follow one of several methodologies to structure their projects. The most common of these, Scrum, divides project tickets into three categories – Ready, Doing and Done. Tickets move across the board showing progress from conception through testing and completion.

A second agile methodology is Kanban, which is derived from Toyota’s just-in-time delivery process of the same name. In Kanban, work flows through four stages: In progress, testing, ready for release and released. Only a certain number of items can be in each state at any one time. By limiting work in progress (WIP), the team can see when bottlenecks occur and rally to solve those problems, while otherwise managing a steady workflow.
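
The WIP-limit rule at the heart of Kanban is easy to sketch. A hypothetical board that refuses to let a ticket enter a stage already at its cap:

```python
# Stage caps are illustrative; "released" is deliberately uncapped.
WIP_LIMITS = {"in_progress": 3, "testing": 2, "ready_for_release": 4, "released": None}

class KanbanBoard:
    def __init__(self):
        self.stages = {stage: [] for stage in WIP_LIMITS}

    def pull(self, ticket: str, stage: str):
        """Move a ticket into a stage, enforcing that stage's WIP limit."""
        limit = WIP_LIMITS[stage]
        if limit is not None and len(self.stages[stage]) >= limit:
            raise RuntimeError(f"'{stage}' is at its WIP limit; clear the bottleneck first")
        for tickets in self.stages.values():  # move the ticket, never copy it
            if ticket in tickets:
                tickets.remove(ticket)
        self.stages[stage].append(ticket)

board = KanbanBoard()
board.pull("TICKET-101", "in_progress")
```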

“Teams using Kanban are typically mature teams. They can produce in such a predictable fashion that they abandon the sprint concept and simply pull work off a prioritized list. These teams produce a small but continual stream of business value. Teams operating at this level have a high amount of automation built into the process to maximize efficiency and quality,” Zach said. “Engaged customers groom the backlog and ensure the most needed functionality is at the top of the list. They can then trust those items will be completed first.”

Customers on Board
That may be the biggest difference of all: Instead of ordering a completed product and then going away until it’s done, the product is in a continuous state of development and the customer is likewise continuously returning to the table, reviewing results, and offering feedback. That means fewer surprises and disappointments and more opportunities to refine ideas and clarify intent.

For the customer, the big change is near constant involvement. Customers are involved at the start so developers can better understand requirements, which are contained in user stories that describe the necessary task, and they are there, as well, at the end of each sprint to see, approve and comment on the work thus far.

It can be a hard adjustment for mission-focused customer agencies. “Officials from the Office of the CIO at three agencies (Environmental Protection Agency, General Services Administration and the Department of Labor) independently reported challenges related to organizational changes, such as staff adapting to the culture shift from being business customers to taking on a more active role as product owners and project managers in the software development process,” noted the Government Accountability Office (GAO) in a November report on incremental development.

Formal Processes
The agile movement started in 2001 with the creation of the Agile Manifesto, in which a group of software visionaries sought to change the way developers and their customers approached product development by placing people and interactions ahead of processes and tools; valuing working software over comprehensive documentation; encouraging customer collaboration over contract negotiation; and embracing the ability to respond to change over slavish adherence to plans. The group laid out 12 principles for developing better software (see box).

But agile is not all touchy-feely soft skills. At its root is order and organization. The up-front planning, daily scrum sessions, progress boards, one- or two- or three-week-long sprints – are all structures intended to enforce discipline and make teams work more efficiently.

“Think of it this way: If I want to make a low-cal birthday cake, I can’t bake the cake, frost it and then say, ‘Oh, I want to make it low-calorie.’ It doesn’t work that way,” says Paul Black, computer scientist at the National Institute of Standards and Technology. “I have to decide to replace oil with apple sauce before I bake the cake for it to turn out right. It’s the same with software. The planning up front is crucial.”

Yet that should not be construed as trying to plan everything all at once, notes GDIT’s Zach. The critical difference is that agile development teams break down those plans into small, manageable pieces.

Robert Binder, senior engineer in the Software Solutions Division at Carnegie-Mellon University’s Software Engineering Institute (SEI), says agile is here to stay. “You might say it’s all over but the shouting,” he told GovTechWorks. “Agile development is now pretty much the way most software gets done. There are some parts of the software development universe where it’s taken a while for that change to take root: Program offices working on systems that are very long-lived, [which] tend to be on the trailing end of that.”

Now comes the hard part, at least for those arriving late to the party: actually implementing agile processes and managing the changes they entail.

“The cultural piece does represent significant barriers, just because of the way we have traditionally executed software and system acquisitions,” says Eileen Wrubel, technical lead for agile in government at SEI’s Software Solutions Division. “In the interest of being good stewards of taxpayer dollars, [government program managers] have historically tried to nail down all the requirements up front, when, in reality, we operate in a world where the ground changes under us rather frequently. However, those behaviors are still engrained in the culture: The idea that we need to lock down as much as possible because all variability may be interpreted as bad is a cultural mindset that is changing over time.”

Every small program that adopts an agile process is a step in that direction, she says. “There’s been a lot of work to look at smaller programs and engaging testing and cyber security organizations earlier and more often. There’s been a lot of guidance to look at prototyping and innovative means of acquisition. But it has been a shift due to the inertia of how the acquisition system has operated in the past.”

Building a Cutting-Edge Supply Chain

In a nondescript office park in northern Virginia, several unmarked warehouses are filled with prepositioned wooden crates waiting to be dispatched to destinations all over the world. Office personnel are busy processing orders, while warehouse technicians hustle to fill them. Their work, invisible by design, ensures the secure and reliable door-to-door delivery of mission-critical supplies to locations everywhere – often in unfriendly environments. This supply chain process is similar to commercial processes, but more specialized due to unique government requirements.

Supply chain management is defined by what happens behind the scenes, where strict processes and procedures are implemented to successfully fulfill order requirements in a timely manner. At both the Virginia warehouse and commercial operations, advanced software makes the system more efficient, and handheld computers speed processing – but it takes talented personnel to use them successfully.

The difference between commercial supply chain management and government supply chain management is the need to meet stringent government compliance mandates in addition to customers’ requirements. Commercial vendors focus on low prices and fast delivery. Those who have the government as a customer fulfill requirements ordinary consumers would never think of: opening their books to government auditors, complying with an array of Federal Acquisition Regulations that restrict how, when and where orders can be shipped, and delivering specialty items rarely available to mainstream consumers.

“While the commercial business model is not what we strive to replicate, we can leverage many of the same technologies and approaches they use and apply those to our operations,” says Phil Jones, vice president at General Dynamics Information Technology.

Warehouse shelves are labeled with barcodes defining their locations and products are shelved in bins the warehouse management software determines are most sensible. Handheld scanners instantly update inventory as items are placed on shelves or removed for order fulfillment. The same software calculates the best routes for warehouse technicians moving through aisles to fulfill an order. These are the tools for building a world-class supply chain operation.
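
The route calculation can be illustrated with a toy version of the idea: greedily visiting whichever bin on the pick list is nearest next. The bin layout here is hypothetical, and real warehouse systems use far more sophisticated optimization:

```python
import math

BINS = {"A1": (0, 0), "B4": (3, 1), "C2": (1, 4), "D7": (5, 5)}  # aisle grid coordinates

def pick_route(order, start=(0, 0)):
    """Order the bins on a pick list by greedy nearest-neighbor."""
    remaining, route, here = set(order), [], start
    while remaining:
        nearest = min(remaining, key=lambda b: math.dist(here, BINS[b]))
        route.append(nearest)
        here = BINS[nearest]
        remaining.remove(nearest)
    return route

print(pick_route(["C2", "D7", "B4"]))  # ['B4', 'C2', 'D7']: shortest next hop each time
```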

But they’re only tools, notes Steve Tracey, executive director of Penn State University’s Smeal College of Business’ Center for Supply Chain Research™. He dismisses the “world class” label as simplistic.

“‘World class logistics’ is a misnomer, in my professional opinion,” Tracey told GovTechWorks. “There are common best practices in supply chains. There are firms that use those practices, but there will never be any one firm that will be best at all of them.”

Expertise is what makes any organization excel or fail. As with information technology or virtually any other service management business, supply chain management boils down to three essential components: people, process and technology.

“Anybody can go out and buy SAP or Oracle software,” Tracey continues. “But buying it and using it well are not the same thing. If you don’t have the people and processes to use that software, if you don’t understand the specific areas of expertise in your operation, then it’s like giving a five-year-old a Ferrari: He can sit in it, but he can’t drive it.”

As soon as they were introduced, computers revolutionized the supply chain process. In the 1970s, manufacturing operations used statistical process controls to monitor quality and increase yields. In the 1980s, manufacturers introduced just-in-time delivery to squeeze out warehouses and middlemen, forging closer relationships with a small number of suppliers who repaid the favor with better service and more precise delivery.

“The same factors are in play today as warehouse managers seek efficiency by reducing the time any product sits on shelves – even to the point of having suppliers direct-ship products by bypassing the warehouse altogether,” says Jeff Waller, president of Atlanta-based Waller & Associates LLC, a supply chain consultancy. For some supply chains, especially those directly supporting the federal government, that may not be an option, but the concept holds: The closer the logisticians are with their suppliers, the more efficient their operations and cash flows will be. Similarly, analytics can be used to better understand and anticipate customer needs.

Government is Different
To Penn State’s Tracey, differences between government and non-government supply chains start with the mission and extend all the way to foiling those who might seek to disrupt it.

“For both military and non-military government agencies operating overseas, the risks posed to supply lines by nation-state and terrorist actors are real and persistent,” he says.

Because enemies may have interests in penetrating and disrupting government supply chains, both physical and digital security is essential.

Other differences are no less challenging, Tracey says. Disruptions caused by the political process – from government shutdowns in extreme cases to the long-term effects of spending caps through sequestration – are unique to government.

“Financially, business has a continuity to it,” says Tracey. “We close the books periodically, maybe at year’s end, but it doesn’t typically affect the operations of the business. That’s not true for government agencies, which operate under significant constraints. Even though the fiscal year starts Oct. 1, some entities might not know how much they have to spend until spring.”

A second problem is sequestration, Tracey says, because “it arbitrarily limits what can be spent in certain categories and so, misaligns resources, overspending in some areas and underspending in others.”

“Those differences limit the ability to mimic what private industry does, because they make you make different choices,” he says.

Size and Scale
Though government agencies can be big customers, “big” is a relative term.

Commercial vendors process billions of orders a year, and annual online orders surpass hundreds of billions of dollars. Compared to that, a typical government supply chain contract averages about 10,000 to 20,000 orders per year – a tiny fraction of commercial order volumes.

Scale matters because in business, scale translates to influence. For many commercial firms, the challenges of meeting unique government requirements simply aren’t worth the cost.

Other differences: Commercial prices are dynamic, changing constantly throughout the day in response to market movements, while government institutions prefer firm price schedules that support advance planning. Commercial vendors offer their own wares and those of others in their online marketplaces, and if customers can’t find what they’re looking for in one vendor’s marketplace, they are free to shop someplace else. By contrast, government supply chain contracts require contractors to locate and deliver, as quickly as possible and at the lowest possible price, anything the customer might ask for.

Best of Both Worlds
While there are clear differences between a commercial and government supply chain, government agencies can gain some of the advantages pioneered in the commercial sector.

Waller believes government-focused operations can leverage the same technologies that the big guys use. “It’s a hybrid model,” Waller says. “Take the best of commercial practices – cost effectiveness and speed – and bring it into the government context.”

To do that, he says, organizations should not look at the entire supply chain and try to change everything at once. “You have to take the individual chunks of the supply chain and look at each piece individually,” he says. How do you manage inventory? Process orders? Pack and ship? Track performance?

“Start small,” he says. “Tackle your upstream logistics. Then tackle your procurement. Take it one bite at a time. Don’t wait for perfection to implement something. Make the transformation and then tweak it to get where you want to be.”

By breaking it down into pieces, you can see where problems crop up and look for ways to eliminate them – whether adding or upgrading technology, changing business processes or adding, training or replacing people.

“You want to make proactive use of big data,” Waller said. “Predict what you need, where you need it. That’s rapid fulfillment.”

Starting small doesn’t mean thinking small. Waller urges supply chain managers to look beyond their own warehouses to their suppliers. How does the material arrive? From where does it come? Can savings in time or cost be achieved by changing any of that? The bottom line: Challenge everything, and opportunities are sure to emerge.

Feds Look to AI Solutions to Solve Problems from Customer Service to Cyber Defense

From Amazon’s Alexa speech recognition technology to Facebook’s uncanny ability to recognize our faces in photos and the coming wave of self-driving cars, artificial intelligence (AI) and machine learning (ML) are changing the way we look at the world – and how it looks at us.

Nascent efforts to embrace natural language processing to power AI chatbots on government websites and call centers are among the leading short-term AI applications in the government space. But AI also has potential application in virtually every government sector, from health and safety research to transportation safety, agriculture and weather prediction and cyber defense.

The ideas behind artificial intelligence are not new. Indeed, the U.S. Postal Service has used machine vision to automatically read and route hand-written envelopes for nearly 20 years. What’s different today is that the plunging price of data storage and the increasing speed and scalability of computing power using cloud services from Amazon Web Services (AWS) and Microsoft Azure, among others, are converging with new software to make AI solutions easier and less costly to execute than ever before.

Justin Herman, emerging citizen technology lead at the General Services Administration’s Technology Transformation Service, is a sort of AI evangelist for the agency. His job, he says, is to help other agencies “prove AI is real.”

That means talking to feds, lawmakers and vendors to spread an understanding of how AI and machine learning can transform at least some parts of government.

“What are agencies actually doing and thinking about?” he asked at the recent Advanced Technology Academic Research Center’s Artificial Intelligence Applications for Government Summit. “You’ve got to ignore the hype and bring it down to a level that’s actionable…. We want to talk about the use cases, the problems, where we think the data sets are. But we’re not prescribing the solutions.”

GSA set up an “Emerging Citizen Technology Atlas” this fall, essentially an online portal for AI government applications, and established an AI user group that holds its first meeting Dec. 13. Its AI Assistant Pilot program so far lists more than two dozen instances where agencies hope to employ AI, including a range of aspirational projects:

  • Department of Health and Human Services: Develop responses for Amazon’s Alexa platform to help users quit smoking and answer questions about food safety
  • Department of Housing and Urban Development: Automate or assist with customer service using existing site content
  • National Forest Service: Provide alerts, notices and information about campgrounds, trails and recreation areas
  • Federal Student Aid: Automate responses to queries on social media about applying for and receiving aid
  • Defense Logistics Agency: Help businesses answer frequently asked questions, access requests for quotes and identify commercial and government entity (CAGE) codes

Separately, NASA used the Amazon Lex platform to train its “Rov-E” robotic ambassador to follow voice commands and answer students’ questions about Mars, a novel AI application for outreach. And chatbots – rare just two years ago – now are ubiquitous on websites, Facebook and other social media.

In all, there are now more than 100,000 chatbots on Facebook Messenger. Yet while chatbots have become a common feature, customer service bots are the most basic of AI applications.
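
At their most basic, such bots simply map keywords to canned answers. A toy sketch (hypothetical intents, not any agency’s production system):

```python
import re

# Hypothetical intents and responses for illustration only.
INTENTS = {
    ("hours", "open"): "Our offices are open 8 a.m. to 5 p.m. Eastern.",
    ("status", "application"): "You can check application status on our portal.",
    ("password", "reset"): "Use the 'Forgot password' link on the sign-in page.",
}

def reply(message: str) -> str:
    """Return the canned answer for the first intent whose keywords match."""
    words = set(re.findall(r"[a-z]+", message.lower()))
    for keywords, answer in INTENTS.items():
        if words & set(keywords):
            return answer
    return "Let me connect you with a human agent."

print(reply("What are your hours?"))  # matches the ("hours", "open") intent
```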

“The challenge for government, as is always the case with new technology, is finding the right applications for use and breaking down the walls of security or privacy concerns that might block the way forward,” says Michael G. Rozendaal, vice president for health analytics at General Dynamics Information Technology Health and Civilian Solutions Division. “For now, figuring out how to really make AI practical for enhanced customer experience and enriched data, and with a clear return on investment, is going to take thoughtful consideration and a certain amount of trial and error.”

But as with cloud in years past, progress can be rapid. “There comes a tipping point where challenges and concerns fade and the floodgates open to take advantage of a new technology,” Rozendaal says. AI can follow the same path. “Over the coming year, the speed of those successes and lessons learned will push AI to that tipping point.”

That view is shared by Hila Mehr, a fellow at the Ash Center for Democratic Governance and Innovation at Harvard University’s Kennedy School of Government and a member of IBM’s Market Development and Insight strategy team. “AI becomes powerful with machine learning, where the computer learns from supervised training and inputs over time to improve responses,” she wrote in Artificial Intelligence for Citizen Services and Government, an Ash Center white paper published in August.

In addition to chatbots, she sees translation services and facial recognition and other kinds of image identification as perfectly suited applications where “AI can reduce administrative burdens, help resolve resource allocation problems and take on significantly complex tasks.”

Open government – the act of making government data broadly available for new and innovative uses – is another promise. As Herman notes, challenging his fellow feds: “Your agencies are collecting voluminous amounts of data that are just sitting there, collecting dust. How can we make that actionable?”

Emerging Technology
Historically, most of that data wasn’t actionable. Paper forms and digital scans lack the structure and metadata to lend themselves to big data applications. But those days are rapidly fading. Electronic health records are turning the tide with medical data; website traffic data is helping government understand what citizens want when visiting, providing insights and feedback that can be used to improve the customer experience.

And that’s just the beginning. According to Fiaz Mohamed, head of solutions enablement for Intel’s AI Products Group, data volumes are growing exponentially. “By 2020, the average internet user will generate 1.5 GB of traffic per day; each self-driving car will generate 4,000 GB/day; connected planes will generate 40,000 GB/day,” he says.

At the same time, advances in hardware will enable faster and faster processing of that data, driving down the compute-intensive costs associated with AI number crunching. Facial recognition historically required extensive human training simply to teach the system the critical factors to look for, such as the distance between the eyes and the nose. “But now neural networks can take multiple samples of a photo of [an individual], and automatically detect what features are important,” he says. “The system actually learns what the key features are. Training yields the ability to infer.”

Intel, long known for its microprocessor technologies, is investing heavily in AI through internal development and external acquisitions. Intel bought machine-learning specialist Nervana in 2016 and programmable chip specialist Altera the year before. The combination is key to the company’s integrated AI strategy, Mohamed says. “What we are doing is building a full-stack solution for deploying AI at scale,” Mohamed says. “Building a proof-of-concept is one thing. But actually taking this technology and deploying it at the scale that a federal agency would want is a whole different thing.”

Many potential AI applications pose similar challenges.

FINRA, the Financial Industry Regulatory Authority, is among the government’s biggest users of AWS cloud services. Its market surveillance system captures and stores 75 billion financial records every day, then analyzes that data to detect fraud. “We process every day what Visa and Mastercard process in six months,” said Steve Randich, FINRA’s chief information officer, in a presentation captured on video. “We stitch all this data together and run complex sophisticated surveillance queries against that data to look for suspicious activity.” The payoff: a 400 percent increase in performance.

Other uses include predictive fleet maintenance. IBM put its Watson AI engine to work last year in a proof-of-concept test of Watson’s ability to perform predictive maintenance on a fleet of 350 U.S. Army Stryker armored vehicles. In September, the Army’s Logistics Support Activity (LOGSA) signed a contract adding Watson’s cognitive services to other cloud services it gets from IBM.

“We’re moving beyond infrastructure as-a-service and embracing both platform and software as-a service,” said LOGSA Commander Col. John D. Kuenzli. He said Watson holds the potential to “truly enable LOGSA to deliver cutting-edge business intelligence and tools to give the Army unprecedented logistics support.”

AI applications share a few things in common. They use large data sets to gain an understanding of a problem and advanced computing to learn through experience. Many applications share a basic construct even if the objectives are different. Identifying military vehicles in satellite images is not unlike identifying tumors in mammograms or finding illegal contraband in x-ray images of carry-on baggage. The specifics of the challenge are different, but the fundamentals are the same. Ultimately, machines will be able to do that more accurately – and faster – than people, freeing humans to do higher-level work.

“The same type of neural network can be applied to different domains so long as the function is similar,” Mohamed says. So a system built to detect tumors for medical purposes could be adapted and trained instead to detect pedestrians in a self-driving automotive application.
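
That reuse can be shown schematically: keep the trained feature extractor and fit only a small classifier “head” for the new task. The weights below are random stand-ins, purely to show the structure, not a real trained model:

```python
import numpy as np

rng = np.random.default_rng(1)
W_features = rng.standard_normal((256, 1024))  # frozen extractor, trained elsewhere

def extract(image_vec):
    """Shared representation reused across domains (tumors, pedestrians...)."""
    return np.maximum(0, W_features @ image_vec)

# New domain: fit only a linear head on the extracted features.
X = rng.standard_normal((100, 1024))        # stand-in images from the new task
y = rng.integers(0, 2, 100).astype(float)   # labels: pedestrian / not pedestrian
features = np.array([extract(x) for x in X])
W_head, *_ = np.linalg.lstsq(features, y, rcond=None)  # least-squares fit

print((features @ W_head > 0.5).mean())  # fraction the new head flags positive
```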

Neural net processors will help because they are simply more efficient at this kind of computation than conventional central processing units. Initially these processors will reside in data centers or the cloud, but Intel already has plans to scale the technology to meet the low-power requirements of edge applications that might support remote, mobile users, such as in military or border patrol applications.
