Recognizing the Need for Innovation in Acquisition

The President’s Management Agenda lays out ambitious plans for the federal government to modernize information technology, prepare its future workforce and improve the way it manages major acquisitions.

These are among 14 cross-agency priority goals on which the administration is focused as it seeks to jettison outdated legacy systems and embrace less cumbersome ways of doing business.

Increasingly, federal IT managers are recognizing the need for innovation in acquisition, not just technology modernization. What exactly will it take to modernize an acquisition system bound by the 1,917-page Federal Acquisition Regulation? Federal acquisition experts say the challenges have less to do with changing those rules than with human behavior – the incentives, motivations and fears of people who touch federal acquisition – from the acquisition professionals themselves to mission owners and government executives and overseers.

“If you want a world-class acquisition system that is responsive to customer needs, you have to be able to use the right tool at the right time,” says Mathew Blum, associate administrator in the Office of Federal Procurement Policy at the Office of Management and Budget. The trouble isn’t a lack of options, he said at the American Council for Technology’s ACT/IAC Acquisition Excellence conference March 27. Rather, he said, it is a lack of bandwidth and a fear of failure that conspire to keep acquisition pros from trying different acquisition strategies.

Risk aversion is a critical issue, agreed Greg Capella, deputy director of the National Technical Information Service at the Department of Commerce. “If you look at what contracting officers get evaluated on, it’s the number of protests, or the number of small business awards [they make],” he said. “It’s not how many successful procurements they’ve managed or what were the results for individual customers.”

Yet there are ways to break through the fear of failure, protests and blame that can paralyze acquisition shops and at the same time save time, save money and improve mission outcomes. Here are four:

  1. Outside Help

The General Services Administration’s (GSA) 18F digital services organization focuses on improving public-facing services and internal systems using commercial-style development approaches. Its agile software development program employs a multidisciplinary team incentivized to work together and produce results quickly, said Alla Goldman Seifert, acting director of GSA’s Office of Acquisition in the Technology Transformation Service.

Her team helps other federal agencies tackle problems quickly and incrementally using an agile development approach. “We bring in a cross-functional team of human-centered design and technical experts, as well as acquisition professionals — all of whom work together to draft a statement of work and do the performance-based contracting for agile software acquisition,” she said.

Acquisition planning may be the most important part of that process. Seifert said 18F has learned a lot since launching its Agile Blanket Purchase Agreement, which drew seven protests in three venues. “But since then, every time we iterate, we make sure we right-size the scope and risk we are taking.” She added that approaching projects in a modular way diminishes risk and improves outcomes – a best practice that can be replicated throughout government.

“We’re really looking at software and legacy IT modernization: How do you get a mission critical program off of a mainframe? How do you take what is probably going to be a five-year modernization effort and program for it, plan for it and budget for it?” Seifert asked.

GSA experiments in other ways, as well. For example, 18F helped agencies leverage the government’s platform, publishing needs and offering prizes to the best solutions. The Defense Advanced Research Projects Agency (DARPA) currently seeks ideas for more efficient use of the radio frequency spectrum in its Spectrum Collaboration Challenge. DARPA will award up to $3.5 million to the best ideas. “Even [intelligence community components] have really enjoyed this,” Seifert said. “It really is a good way to increase competition and lower barriers to entry.”

  2. Coaching and Assistance

Many program acquisition officers cite time pressure and lack of bandwidth to learn new tools as barriers to innovation. It’s a classic chicken-and-egg problem: How do you find the time to learn and try something new?

The Department of Homeland Security’s Procurement Innovation Lab (PIL) was created to help program offices do just that – and then capture and share their experience so others in DHS can leverage the results. The PIL provides coaching and advice, asking only that the accumulated knowledge be shared through webinars and other internal means.

“How do people find time to do innovative stuff?” asked Eric Cho, project lead for PIL. “Either one: find ways to do less, or two: borrow from someone else’s work.” Having a coach to help is also critical, and that’s where his organization comes in.

In less than 100 days, the PIL recently helped a Customs and Border Protection team acquire a system to locate contraband such as drugs hidden in walls, by using a high-end stud finder, Cho said. The effort was completed in less than half the time of an earlier, unsuccessful effort.

Acquisition cycle time can be saved in many ways: capturing impressions immediately through group evaluations after oral presentations, for example, or narrowing the competitive field with a down-select before performing trade-off analyses on qualified finalists. Reusing language from similar solicitations can also save time, he said. “This is not an English class.”

Even so, the successful PIL program still left middle managers in program offices a little uncomfortable, DHS officials acknowledged – the natural result of trying something new. Key to success is having high-level commitment and support for such experiments. DHS’s Chief Procurement Officer Soraya Correa has been an outspoken advocate of experimentation and the PIL. That makes a difference.

“It all comes back to the culture of rewarding compliance, rather than creativity,” said OMB’s Blum. “We need to figure out how we build incentives to encourage the workforce to test and adopt new and better ways to do business.”

  3. Outsourcing for Innovation

Another approach is to outsource the heavy lifting to a better-skilled or more experienced government entity that can execute on a specialized need, such as hiring GSA’s 18F to manage agile software development.

Similarly, outsourcing to GSA’s FEDSIM is a proven strategy for efficiently managing and executing complex, enterprise-scale programs with price tags approaching or exceeding $1 billion. FEDSIM combines acquisition and technical expertise to manage such large-scale projects and execute quickly by leveraging government-wide acquisition vehicles such as Alliant or OASIS, which have already narrowed the field of viable competitors.

“The advantage of FEDSIM is that they have experience executing these large-scale complex IT programs — projects that they’ve done dozens of times — but that others may only face once in a decade,” says Michael McHugh, staff vice president within General Dynamics IT’s Government Wide Acquisition Contract (GWAC) Center. The company supports Alliant and OASIS among other GWACs. “They understand that these programs shouldn’t be just about price, but in identifying the superior technical solution within a predetermined reasonable price range. There’s a difference.”

For program offices looking for guidance rather than to outsource procurement, FEDSIM is developing an “Express Platform” with pre-defined acquisition paths that depend on the need, along with acquisition templates designed to streamline and accelerate processes, reduce costs and enable innovation. It’s another example of sharing best practices across government agencies.

  4. Minimizing Risk

OMB’s Blum said he doesn’t blame program managers for feeling anxious. He gets that while they like the concept of innovation, they’d rather someone else take the risk. He also believes the risks are lower than they think.

“If you’re talking about testing something new, the downside risk is much less than the upside gain,” Blum said. “Testing shouldn’t entail any more risk than a normal acquisition if you’re applying good acquisition practices — if you’re scoping it carefully, sharing information readily with potential sources so they understand your goals, and giving participants a robust debrief,” he added. Risks can be managed.

Properly defining the scope, sounding out experts, defining goals and sharing information cannot happen in a vacuum, of course. Richard Spires, former chief information officer at DHS, and now president of Learning Tree International, said he could tell early if projects were likely to succeed or fail based on the level of teamwork exhibited by stakeholders.

“If we had a solid programmatic team that worked well with the procurement organization and you could ask those probing questions, I’ll tell you what: That’s how you spawn innovation,” Spires said. “I think we need to focus more on how to build the right team with all the right stakeholders: legal, security, the programmatic folks, the IT people running the operations.”

Tony Cothron, vice president with General Dynamics IT’s Intelligence portfolio, agreed, saying it takes a combination of teamwork and experience to produce results.

“Contracting and mission need to go hand-in-hand,” Cothron said. “But in this community, mission is paramount. The things everyone should be asking are what other ways are there to get the job done? How do you create more capacity? Deliver analytics to help the mission? Improve continuity of operations? Get more for each dollar? These are hard questions, and they require imaginative solutions.”

For example, Cothron said, bundling services may help reduce costs. Likewise, contractors might accept lower prices in exchange for a longer term. “You need to develop a strategy going in that’s focused on the mission, and then set specific goals for what you want to accomplish,” he added. “There are ways to improve quality. How you contract is one of them.”

Risk of failure doesn’t have to be a disincentive to innovation. Like any risk, it can be managed – and savvy government professionals are discovering they can mitigate risks by leveraging experienced teams, sharing best practices and building on lessons learned. When they do those things, risk decreases – and the odds of success improve.

How AI Is Transforming Defense and Intelligence Technologies

A Harvard Belfer Center study commissioned by the Intelligence Advanced Research Projects Agency (IARPA), Artificial Intelligence and National Security, predicted that AI will be as transformative to national defense as nuclear weapons, aircraft, computers and biotech.

Advances in AI will enable new capabilities and make others far more affordable – not only to the U.S., but to adversaries as well, raising the stakes as the United States seeks to preserve its hard-won strategic overmatch in the air, land, sea, space and cyberspace domains.

The Pentagon’s Third Offset Strategy seeks to leverage AI and related technologies in a variety of ways, according to Robert Work, former deputy secretary of defense and one of the strategy’s architects. In a foreword to a new report from the market analytics firm Govini, Work says the strategy “seeks to exploit advances in AI and autonomous systems to improve the performance of Joint Force guided munitions battle networks” through:

  • Deep learning machines, powered by artificial neural networks and trained with big data sets
  • Advanced human-machine collaboration in which AI-enabled learning machines help humans make more timely and relevant combat decisions
  • AI devices that allow operators of all types to “plug into and call upon the power of the entire Joint Force battle network to accomplish assigned missions and tasks”
  • Human-machine combat teaming of manned and unmanned systems
  • Cyber- and electronic warfare-hardened, network-enabled, autonomous and high-speed weapons capable of collaborative attacks

“By exploiting advances in AI and autonomous systems to improve the warfighting potential and performance of the U.S. military,” Work says, “the strategy aims to restore the Joint Force’s eroding conventional overmatch versus any potential adversary, thereby strengthening conventional deterrence.”

Spending is growing, Govini reports, with AI and related defense program spending increasing at a compound annual rate of 14.5 percent from 2012 to 2017, and poised to grow substantially faster in coming years as advanced computing technologies come on line, driving down computational costs.

But in practical terms, what does that mean? How will AI change the way defense technology is managed, the way we gather and analyze intelligence or protect our computer systems?

Charlie Greenbacker, vice president of analytics at In-Q-Tel in Arlington, Va., the intelligence community’s strategic investment arm, sees dramatic changes ahead.

“The incredible ability of technology to automate parts of the intelligence cycle is a huge opportunity,” he said at an AI summit produced by the Advanced Technology Academic Research Center and Intel in November. “I want humans to focus on more challenging, high-order problems and not the mundane problems of the world.”

The opportunities are possible because of the advent of new, more powerful processing techniques, whether by distributing those loads across a cloud infrastructure, or using specialty processors purpose-built to do this kind of math. “Under the hood, deep learning is really just algebra,” he says. “Specialized processing lets us do this a lot faster.”
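
Greenbacker’s point that deep learning is “really just algebra” can be seen in a toy forward pass: a dense layer is a matrix multiply, a bias add, and a nonlinearity. This is only an illustrative sketch with made-up weights; it does not reflect any particular production system.

```python
import numpy as np

# A single dense (fully connected) layer: a matrix multiply plus a bias,
# followed by a nonlinearity. "Under the hood, it's really just algebra."
def dense_layer(x, weights, bias):
    return np.maximum(0, x @ weights + bias)  # ReLU activation

# Toy example: a batch of 2 inputs with 4 features, projected to 3 outputs.
x = np.array([[1.0, 2.0, 3.0, 4.0],
              [0.5, 0.0, -1.0, 2.0]])
w = np.ones((4, 3)) * 0.1   # illustrative weights; real ones are learned
b = np.zeros(3)

out = dense_layer(x, w, b)
print(out.shape)  # (2, 3)
```

Specialized processors accelerate exactly this operation: large batched matrix multiplications.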

Computer vision is one focus of interest – learning to identify faces in crowds or objects in satellite or other surveillance images – as is identifying anomalies in cyber security or text-heavy data searches. “A lot of folks spend massive amounts of time sifting through text looking for needles in the haystack,” Greenbacker continued.

The Air Force is looking at AI to help more quickly identify potential cyber attacks, said Frank Konieczny, chief technology officer in the office of the Air Force chief information officer, speaking at CyberCon 2017 in November. “We’re looking at various ways of adjusting the network or adjusting the topology based upon threats, like software-defined network capabilities as well as AI-based analysis,” he said.

Marty Trevino Jr., a former technical director and strategist for the National Security Agency, is now chief data/analytics officer at Red Alpha, an intelligence-focused tech firm based in Annapolis Junction, Md. “We are all familiar with computers beating humans in complex games – chess, Go, and so on,” Trevino says. “But experiments are showing that when humans are mated with those same computers, they beat the computer every time. It’s this unique combination of man and machine – each doing what its brain does best – that will constitute the active cyber defense (ACD) systems of tomorrow.”

Machines best humans when the task is highly defined and performed at speed and scale. “With all the hype around artificial intelligence, it is important to understand that AI is only fantastic at performing the specific tasks for which it is intended,” Trevino says. “Otherwise AI can be very dumb.”

Humans on the other hand, are better than machines when it comes to putting information in context. “While the human brain cannot match AI in specific realms,” he adds, “it is unmatched in its ability to process complex contextual information in dynamic environments. In cyber, context is everything. Context enables data-informed strategic decisions to be made.”

Artificial Intelligence and National Security
To prepare for a future in which artificial intelligence plays a heavy or dominant role in warfare and military strategy, IARPA commissioned the Harvard Belfer Center to study the issue. The center’s August 2017 report, “Artificial Intelligence and National Security,” offers a series of recommendations, including:

  • Wargames and strategy – The Defense Department should conduct AI-focused wargames to identify potentially disruptive military innovations. It should also fund diverse, long-term strategic analyses to better understand the impact and implications of advanced AI technologies
  • Prioritize investment – Building on strategic analysis, defense and intelligence agencies should prioritize AI research and development investment on technologies and applications that will either provide sustainable strategic advantages or mitigate key risks
  • Counter threats – Because others will also have access to AI technology, investing in “counter-AI” capabilities for both offense and defense is critical to long-term security. This includes developing technological solutions for countering AI-enabled forgery, such as faked audio or video evidence
  • Basic research – The speed of AI development in commercial industry does not preclude specific security requirements in which strategic investment can yield substantial returns. Increased investment in AI-related basic research through DARPA, IARPA, the Office of Naval Research and the National Science Foundation, are critical to achieving long-term strategic advantage
  • Commercial development – Although DoD cannot expect to be a dominant investor in AI technology, increased investment through In-Q-Tel and other means can be critical in attaining startup firms’ interest in national security applications

Building Resiliency
Looking at cybersecurity another way, AI can also be used to rapidly identify and repair software vulnerabilities, said Brian Pierce, director of the Information Innovation Office at the Defense Advanced Research Projects Agency (DARPA).

“We are using automation to engage cyber attackers in machine time, rather than human time,” he said. Using automation developed under DARPA funding, he said machine-driven defenses have demonstrated AI-based discovery and patching of software vulnerabilities. “Software flaws can last for minutes, instead of as long as years,” he said. “I can’t emphasize enough how much this automation is a game changer in strengthening cyber resiliency.”

Such advanced, cognitive ACD systems employ the gamut of detection tools and techniques, from heuristics to characteristic and signature-based identification, says Red Alpha’s Trevino. “These systems will be self-learning and self-healing, and if compromised, will be able to terminate and reconstitute themselves in an alternative virtual environment, having already learned the lessons of the previous engagement, and incorporated the required capabilities to survive. All of this will be done in real time.”

Seen in that context, AI is just the latest in a series of technologies the U.S. has used as a strategic force multiplier. Just as precision weapons enabled the U.S. Air Force to inflict greater damage with fewer bombs – and with less risk – AI can be used to solve problems that might otherwise take hundreds or even thousands of people. The promise is that instead of eyeballing thousands of images a day or scanning millions of network actions, computers can do the first screening, freeing up analysts for the harder task of interpreting results, says Dennis Gibbs, technical strategist, Intelligence and Security programs at General Dynamics Information Technology. “But just because the technology can do that, doesn’t mean it’s easy. Integrating that technology into existing systems and networks and processes is as much art as science. Success depends on how well you understand your customer. You have to understand how these things fit together.”

In a separate project, DARPA collaborated with a Fortune 100 company that was moving more than a terabyte of data per day across its virtual private network, and generating 12 million network events per day – far beyond the human ability to track or analyze. Using automated tools, however, the project team was able to identify a single unauthorized IP address that successfully logged into 3,336 VPN accounts over seven days.
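
The kind of aggregation that surfaces one address logging into thousands of accounts can be sketched simply. The record format, names and threshold below are invented for illustration; the article does not describe DARPA’s actual tooling or data format.

```python
from collections import defaultdict

# Each event is (source_ip, account, timestamp) -- an illustrative schema.
def accounts_per_ip(events):
    seen = defaultdict(set)
    for ip, account, _ts in events:
        seen[ip].add(account)
    return seen

def flag_suspicious(events, threshold=100):
    """Flag any source IP that logged into an unusual number of accounts."""
    return {ip: len(accts)
            for ip, accts in accounts_per_ip(events).items()
            if len(accts) >= threshold}

# Tiny synthetic example: one IP touches many accounts, the rest look normal.
events = [("10.0.0.5", f"user{i}", i) for i in range(150)]
events += [("192.168.1.9", "alice", 1), ("192.168.1.7", "bob", 2)]
print(flag_suspicious(events))  # {'10.0.0.5': 150}
```

At 12 million events per day the same aggregation would run over a streaming or distributed pipeline, but the core idea is unchanged.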

Mathematically speaking, Pierce said, “The activity associated with this address was close to 9 billion network events with about a 1 in 10 chance of discovery.” The tipoff was a flaw in the botnet that attacked the network: Attacks were staged at exactly 57-minute intervals. Not all botnets, of course, will make that mistake, but even pseudo-random timing can be detected. He added: “Using advanced signal processing methods applied to billions of network events over weeks and months-long timelines, we have been successful at finding pseudo random botnets.”
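
A minimal sketch of the interval-regularity idea: perfectly clocked attacks leave near-constant gaps between events. Real detection of pseudo-random botnets uses signal processing over billions of events; the standard-deviation test here is only meant to convey the concept.

```python
import statistics

# Given timestamps (in seconds) of attempts from one source, check whether
# they arrive at suspiciously regular intervals -- like the 57-minute botnet.
def looks_periodic(timestamps, tolerance=1.0):
    if len(timestamps) < 3:
        return False
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    # A real hunter would use autocorrelation or spectral analysis; a
    # near-zero spread in the gaps captures the "exact interval" tipoff.
    return statistics.pstdev(gaps) <= tolerance

botnet = [i * 57 * 60 for i in range(10)]   # exactly 57 minutes apart
human = [0, 130, 900, 4000, 4100, 9000]     # irregular activity
print(looks_periodic(botnet), looks_periodic(human))  # True False
```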

On the flip side, however, is the recognition that AI superiority will not be a given in cyberspace. Unlike air, land, sea or space, cyber is a man-made warfare domain, so it is fitting that the fight there could end up being machine vs. machine.

The Harvard Artificial Intelligence and National Security study notes emphatically that while AI will make it easier to sort through ever greater volumes of intelligence, “it will also be much easier to lie persuasively.” The use of Photoshop and other image editors is well understood and has been for years. But recent advances in video editing have made it reasonably easy to forge audio and video files.

A trio of University of Washington researchers announced in a research paper published in July that they had used AI to synthesize a photorealistic, lip-synced video of former President Barack Obama. While the researchers used real audio, it’s easy to see the dangers posed if audio is also manipulated.

While the authors describe potential positive uses of the technology – such as “the ability to generate high-quality video from audio [which] could significantly reduce the amount of bandwidth needed in video coding/transmission” – potential nefarious uses are just as clear.

Automation Critical to Securing Code in an Agile, DevOps World

The world’s biggest hack might have happened to anyone. The same software flaw hackers exploited to expose 145 million identities in the Equifax database – most likely yours included – was also embedded in thousands of other computer systems belonging to all manner of businesses and government agencies.

The software in question was a commonly used open-source piece of Java code known as Apache Struts. The Department of Homeland Security’s U.S. Computer Emergency Readiness Team (US-CERT) discovered a flaw in that code and issued a warning March 8, detailing the risk posed by the flaw. Like many others, Equifax reviewed the warning and searched its systems for the affected code. Unfortunately, the Atlanta-based credit bureau failed to find it among the millions of lines of code in its systems. Hackers exploited the flaw three days later.

Open source and third-party software components like Apache Struts now make up between 80 and 90 percent of software produced today, says Derek Weeks, vice president and DevOps advocate at Sonatype. The company provides security tools and manages The Central Repository, the world’s largest collection of open source software components. Programmers completed nearly 60 billion software downloads from the repository in 2017 alone.

“If you are a software developer in any federal agency today, you are very aware that you are using open-source and third-party [software] components in development today,” Weeks says. “The average organization is using 125,000 Java open source components – just Java alone. But organizations aren’t just developing in Java, they’re also developing in JavaScript, .Net, Python and other languages. So that number goes up by double or triple.”

Reusing software saves time and money. It’s also critical to supporting the rapid cycles favored by today’s Agile and DevOps methodologies. Yet while reuse promises time-tested code, it is not without risk: Weeks estimates one in 18 downloads from The Central Repository – 5.5 percent – contains a known vulnerability. Because it never deletes anything, the repository is a user-beware system. It’s up to software developers themselves – not the repository – to determine whether or not the software components they download are safe.
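
The user-beware model means a build pipeline has to do its own screening. Below is a minimal sketch using an invented in-memory vulnerability table; the Struts entry shown is the real CVE implicated in the Equifax breach, but real checks query a live service such as the National Vulnerability Database rather than a hard-coded dictionary.

```python
# Hypothetical dependency audit: screen (component, version) pairs against
# a known-vulnerability table before they enter the build.
KNOWN_VULNERABLE = {
    ("org.apache.struts:struts2-core", "2.3.31"): "CVE-2017-5638",
}

def audit(dependencies):
    """Return the subset of dependencies with known vulnerabilities."""
    findings = []
    for name, version in dependencies:
        cve = KNOWN_VULNERABLE.get((name, version))
        if cve:
            findings.append((name, version, cve))
    return findings

deps = [("org.apache.struts:struts2-core", "2.3.31"),
        ("com.google.guava:guava", "23.0")]
for name, version, cve in audit(deps):
    print(f"BLOCK {name} {version}: {cve}")
```

Automating this check at download time, rather than trusting the repository, is exactly the gap the Equifax incident exposed.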

Manual Review or Automation?

Performing a detailed, manual security analysis of each open-source software component, to ensure it is safe and free of vulnerabilities, takes hours. That, in turn, eats into precious development time, undermining the intended efficiency of reusing code in the first place.

Tools from Sonatype, Black Duck of Burlington, Mass., and others automate most of that work. Sonatype’s Nexus Firewall for example, scans modules as they come into the development environment and stops them if they contain flaws. It also suggests alternative solutions, such as newer versions of the same components, that are safe. Development teams can employ a host of automated tools to simplify or speed other parts of the build, test and secure processes.

Some of these are commercial products, and others like the software itself, are open-source tools. For example, Jenkins is a popular open-source DevOps tool that helps developers quickly find and solve defects in their codebase. These tools focus on the reused code in a system; static analysis tools, like those from Veracode, focus on the critical custom code that glues that open-source software together into a working system.

“Automation is key to agile development,” says Matthew Zach, director of software engineering at General Dynamics Information Technology’s (GDIT) Health Solutions. “The tools now exist to automate everything: the builds, unit tests, functional testing, performance testing, penetration testing and more. Ensuring the code behind new functionality not only works, but is also secure, is critical. We need to know that the stuff we’re producing is of high quality and meets our standards, and we try to automate as much of these reviews as possible.”
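
The “automate everything” pattern Zach describes can be reduced to a gate of independent checks, any one of which can fail the build. This is a toy sketch with stand-in checks; real pipelines run the equivalent steps (unit tests, scans, penetration tests) inside a CI server such as Jenkins.

```python
# Minimal quality-gate sketch: each check returns (passed, message), and
# the build fails if any check fails. The checks below are placeholders.
def run_gate(checks):
    failures = [msg for check in checks for ok, msg in [check()] if not ok]
    return (len(failures) == 0, failures)

def unit_tests():
    return (1 + 1 == 2, "unit tests failed")        # stand-in for a suite

def dependency_audit():
    vulnerable = []                                 # stand-in for a scan
    return (not vulnerable, "vulnerable components found")

ok, failures = run_gate([unit_tests, dependency_audit])
print("build passed" if ok else failures)  # build passed
```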

But automated screening and testing is still far from universal. Some use it; others don’t. Weeks describes one large financial services firm that prided itself on its software team’s rigorous governance process. Developers were required to ask permission from a security group before using open source components. The security team’s thorough reviews took about 12 weeks for new components and six to seven weeks for new versions of components already in use. Even so, officials estimated some 800 open source components had made it through those reviews and were in use in their 2,000-plus deployed applications.

Then, Sonatype was invited to scan the firm’s deployed software. “We found more than 13,000 open source components were running in those 2,000 applications,” Weeks recalls. “It’s not hard to see what happened. You’ve got developers working on two-week sprints, so what do you think they’re going to do? The natural behavior is, ‘I’ve got a deadline, I have to meet it, I have to be productive.’ They can’t wait 12 weeks for another group to respond.”

Automation, he said, is the answer.

Integration and the Supply Chain

Building software today is a lot like building a car: Rather than manufacture every component, from the screws to the tires to the seat covers, manufacturers focus their efforts on the pieces that differentiate products and outsource the commodity pieces to suppliers.

Chris Wysopal, chief technology officer at Veracode, said the average software application today uses 46 ready-made components. Like Sonatype, Veracode offers a testing tool that scans components for known vulnerabilities; its test suite also includes a static analysis tool to spot problems in custom code and a dynamic analysis tool that tests software in real time.

As development cycles get shorter, the demand for automation is increasing, Wysopal says. The five-year shift from waterfall to Agile shortened typical development cycles from months to weeks. The advent of DevOps and continuous development accelerates that further, from weeks to days or even hours.

“We’re going through this transition ourselves. When we started Veracode 11 years ago, we were a waterfall company. We did four to 10 releases a year,” Wysopal says. “Then we went to Agile and did 12 releases a year and now we’re making the transition to DevOps, so we can deploy on a daily basis if we need or want to. What we see in most of our customers is fragmented methodologies: It might be 50 percent waterfall, 40 percent agile and 10 percent DevOps. So they want tools that can fit into that DevOps pipeline.”

A tool built for speed can support slower development cycles; the opposite, however, is not the case.

One way to enhance testing is to let developers know sooner that they may have a problem. Veracode is developing a product that will scan code as it’s written, running a scan every few seconds and alerting the developer as soon as a problem is spotted. This has two effects: It cleans up problems more quickly, and it helps train developers to avoid those problems in the first place. In that sense, it’s like spell check in a word processing program.
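
The spell-check analogy can be illustrated with a toy scanner that flags risky patterns line by line as code is written. The two rules below are invented for illustration; a real product like Veracode’s performs full static analysis, not pattern matching.

```python
import re

# Toy "scan as you type" linter: flag risky calls the moment they appear.
RULES = [
    (re.compile(r"\beval\("), "avoid eval(): possible code injection"),
    (re.compile(r"\bpickle\.loads\("), "untrusted pickle can execute code"),
]

def lint_snippet(source):
    """Return (line_number, message) for every rule hit in the source."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for pattern, message in RULES:
            if pattern.search(line):
                findings.append((lineno, message))
    return findings

snippet = "data = eval(user_input)\nprint(data)\n"
print(lint_snippet(snippet))  # [(1, 'avoid eval(): possible code injection')]
```

Run on every keystroke or save, even a check this simple gives the immediate feedback loop the spell-check model describes.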

“It’s fundamentally changing security testing for a just-in-time programming environment,” Wysopal says.

Yet as powerful and valuable as automation is, these tools alone will not make you secure.

“Automation is extremely important,” he says. “Everyone who’s doing software should be doing automation. And then manual testing on top of that is needed for anyone who has higher security needs.” He puts the financial industry and government users into that category.

For government agencies that contract for most of their software, understanding what kinds of tools and processes their suppliers have in place to ensure software quality is critical. That could mean hiring a third party to do security testing on software when it’s delivered, or requiring systems integrators and development firms to demonstrate their security processes and procedures before software is accepted.

“In today’s Agile-driven environment, software vulnerability can be a major source of potential compromise to sprint cadences for some teams,” says GDIT’s Zach. “We can’t build a weeks-long manual test and evaluation cycle into Agile sprints. Automated testing is the only way we can validate the security of our code while still achieving consistent, frequent software delivery.”

According to Veracode’s State of Software Security 2017, 36 percent of the survey’s respondents do not run (or were unaware of) automated static analysis on their internally developed software. Nearly half never conduct dynamic testing in a runtime environment. Worst of all, 83 percent acknowledge releasing software before testing for or resolving security issues.

“The bottom line is all software needs to be tested. The real question for teams is what ratio and types of testing will be automated and which will be manual,” Zach says. “By exploiting automation tools and practices in the right ways, we can deliver the best possible software, as rapidly and securely as possible, without compromising the overall mission of government agencies.”

DOD’s Shanahan to DCMO: establish a cloud-computing program

DOD’s Shanahan to DCMO: establish a cloud-computing program

Justin Doubleday
January 08, 2018

Deputy Defense Secretary Pat Shanahan has tasked the Pentagon’s deputy chief management officer with leading the military’s adoption of cloud technologies by establishing a new program and budget line for the effort.

Based on feedback from the cloud executive steering group he established in September, Shanahan directs Jay Gibson to implement the initial acquisition strategy for accelerating the adoption of cloud technologies, according to a Jan. 8 memo provided to Inside Defense. Shanahan instructs Gibson to lead the implementation alongside the Pentagon’s cost assessment and program evaluation office, the chief information officer and the Defense Digital Service.

Gibson is expected to soon be named the Pentagon’s chief management officer, a position created by Congress in the latest defense authorization act.

Inside Defense reported last week that Gibson had replaced Pentagon acquisition chief Ellen Lord as chair of the steering group, while CAPE and the CIO had been added as voting members, according to a Jan. 4 memo penned by Shanahan. That memo has since been “rescinded” and the Jan. 8 memo takes its place, according to a DOD spokesman.

Once elevated to CMO, Gibson will still lead the adoption of cloud technologies, the spokesman said.

The latest directive says phase one of the cloud-adoption effort “will leverage cloud technology to strengthen and streamline commercial operations within the department.” Gibson is instructed to “work with industry to ensure DOD is maximizing security, building clouds that can scale effectively to meet department demand, and developing common standards to minimize switching cost to take full advantage of vendor competition,” the memo continues.

To that end, Shanahan directs Gibson to establish a “Cloud Computing Program Manager” or CCPM. The program manager will report to Gibson, who is also authorized to “establish a budget line item and allocate an appropriate number of civilian and military billets to the CCPM based on mission need,” the memo states.

The memo appears to respond to concerns voiced by industry over the steering group’s plans to downselect to just one cloud services provider for the entire Defense Department. Industry associations argued that strategy was misguided and would lead to a limiting and costly situation of “cloud lock-in.”

Shanahan’s latest memo signals the initiative will begin with a less expansive push into the cloud.

“The initial cloud acquisition strategy will start small and employ an iterative process as the department explores how to leverage cloud technology to lower costs, improve performance and increase lethality,” the document states. “Additionally, this initiative will use existing systems, facilities and services for the DOD and other federal agencies when possible to avoid duplication and achieve maximum efficiency and economy.”

The second phase of the cloud adoption initiative will involve the DCMO and the CIO working with the services, the under secretary of defense for intelligence and Lord’s office “to build cloud strategies for requirements related to military operations and intelligence support,” the memo states.

“The department understands that applying cloud technology within the battlespace has unique challenges and opportunities and may require specialized technology investments,” it adds.

The memo, however, makes no mention of the Joint Enterprise Defense Infrastructure acquisition strategy. A Nov. 6 information paper on JEDI laid out a plan to move DOD to one cloud services provider, with a contract award planned for the fourth quarter of fiscal year 2018. The information paper partly contributed to industry’s concerns over the cloud executive steering group’s approach.

Asked about the JEDI information paper, Pentagon spokesman Cmdr. Patrick Evans told Inside Defense the steering group’s acquisition strategy “is executing and evolving in the midst of the significant reorganization” at DOD, including the disestablishment of Lord’s office, the establishment of a chief management officer “and other defense reform initiatives.” He said Shanahan’s latest memo reflects those organizational changes.

“As the department’s analysis and market research process continues, the JEDI cloud path forward will continue to evolve and mature as appropriate,” Evans wrote in a Jan. 8 email.


To Do Agile Development, Contractors Should Be Agile, Too

Do it faster, make it better, be more efficient.

Every organization wants to improve. But knee deep in the day-to-day business of getting work done, few have the capacity to step back, survey the landscape and take time for more than incremental improvement.

Today, however, more and more government agencies are turning to private sector models to achieve better results.

They’re employing agile development techniques to roll out new systems more quickly; setting up innovation centers to encourage and facilitate new ways of thinking; and seeking ways to change the procurement process to lower barriers to innovation. Each approach has proven its merit in practice.

Going Agile
Agile software development emphasizes quick iterative development cycles, in which developers roll out and then improve one version after another, learning as they go, rather than striving for perfection at the end of a single, long development cycle. Agile has been a mainstay in commercial industry for years, but it’s still a relatively new concept in government. To be successful, agile demands changes on all sides of the government acquisition process.

The American Council for Technology & Industry Advisory Council (ACT-IAC) hosted an annual Igniting Innovation competition last spring, in which 140 public-private entries vied for recognition. Among the eight finalists: InCadence Strategic Solutions, which employed agile methodologies to develop a mobile fingerprint ID system for the FBI, allowing field agents to capture fingerprints on Android smartphones and tablets and then “receive near-real time identity information on a suspect, wherever they have cellular service or WiFi access to the Internet, worldwide.”

“Agile brings us closer to the end user,” says Anthony Iasso, InCadence president. “That’s really the key: It’s about users. Oftentimes, we find there’s too many people between developers and end users. Adhering to agile allows us to quickly get to the functionality that the end user needs. It reduces the risk that a long-running program misses the mark.”

Adding engineers to the mix is also important, Iasso notes. “You have to pair agile with V1 engineers. They can go to an empty compiler and make version 1 of an application. If you let them learn as they code, then you get great capabilities,” he said.

Now the system is being marketed to state and local law enforcement, along with the military.

When the EPA decided it was finally time to replace paper forms with a digital system for evaluating firms seeking approval to remediate lead-based paint from aging buildings, developers at contractor CGI shaved months off the project by employing agile development. The whole thing was done in six months, a 20 to 30 percent savings versus a conventional waterfall approach.

That meant EPA needed to be actively involved, observes Linda F. Odorisio, a vice president at CGI. “If you want to do it and you want to do it right, you have to be right in there at the same table with your sleeves rolled up.”

Center for Agile Innovation
The state of North Carolina’s Innovation Center is essentially a laboratory for agile development. “Before we had the center, practically all projects appeared to be waterfall in nature,” says Eric Ellis, head of the Innovation Center. “We maybe had one or two trying to do agile methodology.”

But one goal for the new center was to conduct proof-of-concept studies to test out new systems as they were being developed.

For example, a new application and renewal system for commercial fishing licenses was developed with agile techniques, saving the state $5 million in development costs.

“We would have gotten there [without agile], but it would have taken us longer and cost us more money,” says state Chief Information Officer Keith Werner. “I had reservations that they wouldn’t have gotten the functionality they were looking for.”

Innovation centers are not without risk. Set apart from the rest of an organization, they can be seen as elitist: creative experts focused on innovating but disconnected from the real business of government.

“If you create an innovation group then they’re seen as the innovation group,” says Ellis. “The rest of the people, who aren’t in the innovation group, don’t feel compelled to innovate.”

To guard against that, the North Carolina Innovation Center, located on the first floor of the state’s Department of Environment and Natural Resources HQ, has no full-time resources of its own. The idea is to create an open environment that can change as needs change. Even its office space is flexible, easily reconfigured to encourage open-space interactions, so ideas can be demonstrated with little fuss.

Agile Contracting
Changing the software development process alone is not enough, says Michael Howell, senior director of the Institute for Innovation and Special Projects at ACT-IAC. The contracting piece also has to change.

“You can’t say I want to be agile, so here’s what I’m going to do: ‘I’m going to put a request in my 2018 budget and wait and see if I get any money,’” Howell says. “It doesn’t work. They have to have flexibility to come up with the money. Then they have to have flexibility … to actually spend the money.”

Bob Gleason, director of the Division of Purchases and Supplies in the Virginia Department of General Services, says conventional procurement practices focus on known solutions and avoid unknowns, which add risk and uncertainty to programs.

Traditional requests for proposals define in specific detail exactly what is wanted, and suppliers respond in kind. “It gives you what it is you’re looking for,” Gleason says. “But there’s no incentive for any added value.”

It’s better, he said, to focus on the desired outcome, rather than on the detailed requirements intended to produce that same result, and to invite industry to offer innovative solutions the government customer may not have imagined on its own.

Contracts also must be flexible so vendors can improve their products or services over time, as they learn. Otherwise, vendors can be contractually locked into inefficient systems and approaches.

“You need to have a contract that’s not structured in fixed points in time, but is structured in a way that enables change over the life of the agreement,” Gleason says.

Managing Risk
“Part of the challenge we have as integrators is not just coming up with that new capability,” but also making sure that contracting officers’ technical advisors are well informed so they have the ability to compare very different proposals, says David Gagliano, chief technology officer for global solutions at General Dynamics Information Technology. Innovation inevitably involves risk, and contracting officials are trained to be risk-averse. Selection based on price is clear and straightforward in a way that value comparisons are not. So acquisition officers need skills to evaluate the benefits of different technical proposals and the confidence to take on reasonable amounts of risk.

“Two years ago, the government published the ‘TechFAR Handbook for Procuring Digital Services Using Agile Processes,’” Gagliano says. “It’s a pretty good starting point for contracting officers and their technical representatives who want to learn more about best practices in agile procurement.”

“People don’t want government to fail at all,” says Darrell West, director of the Center for Technology Innovation at the Brookings Institution. “When government fails, it often ends up on the front page. The private sector model of failing nine times to have that initial success has been difficult to incorporate in the public sector.”

For failure to be acceptable, the stakes must be low enough that risk can be tolerated. Pilot programs and related short-term, proof-of-concept contracts can lower risk by reducing the amount of money at stake. West contends they can “encourage innovation while protecting against large-scale failures.”

The Defense Department’s DIUX initiative, which brings together venture capital firms, small technology businesses and Pentagon technologists to accelerate the injection of new technologies into the department, exemplifies the approach. New concepts can be conceived and proven in a low-risk, small contract environment, independent of conventional contracting rules and schedules. Then, once the technology has matured to the point of a wider roll-out, bigger firms can compete for the right to manage that implementation.

In this case, government gets the best of both worlds: rapid-fire innovation from small firms unfettered by cumbersome acquisition rules followed by a managed implementation by experienced contractors steeped in the intricacies of doing business with large-scale government organizations.


Annual Report Shows Progress in Acquisition

Data-driven acquisition and procurement policies are creating savings in production and development programs, lowering large-program costs and moving contract costs and management in positive directions, the Defense Department’s acquisition chief said this morning.

Frank Kendall, undersecretary of defense for acquisition, technology and logistics, discussed his third annual Performance of the Defense Acquisition System report during a DoD acquisition event hosted here by Defense One.

“I believe in using data to inform policy, … so what I’ve been trying to do for the last several years is have a more data-driven set of acquisition policies,” Kendall said in his keynote remarks.

In 2010, Kendall, working with then-acquisition undersecretary Ash Carter, created the first version of Better Buying Power, the implementation of best practices to strengthen the department’s buying power and improve industry productivity.

Better Buying Power

From 2009 to 2011, Carter, now defense secretary, was undersecretary of defense for acquisition, technology and logistics with responsibility for DoD’s procurement reform and innovation agenda.

“Together we had put together a cell … [whose work] was responsible for a lot of the data” represented in the report, Kendall said.

“We’ve grown that body of work every year, and I think we’ve gotten to the point after several years of putting policies in place [and] trying to learn from what we’ve done that some results are starting to … suggest we’re headed in the right direction,” he added.

The acquisition chief also gave credit to the thousands of people in government and industry who are making possible the progress documented in the report.

Acquisition Trenches

Also this year, Kendall’s office is publishing a Compendium of Program Manager Assessments — individual reports of program managers from major programs — “to give the community, in Washington in particular, a chance to look at what real life in the trenches of acquisition is like,” Kendall said.

During his remarks, Kendall went through several charts from the report and commented on the department’s progress in several areas. Referring to a chart on production, Kendall called the positive trend a “pretty good estimator of what’s actually happening in unit costs in our programs,” and added that the department is better at predicting how production will go than it is at predicting how development will go.

“There’s less uncertainty, and you see that reflected in results, which are much better [for production] than they are for development,” he said.

Kendall said the focus on development distracts from other things that matter in acquisition. “Production and sustainment are where almost all the money is in defense acquisition,” he added.

‘Should-Cost’ Management

In large part because of a Better Buying Power initiative called “should-cost management,” cost growth on major programs generally is at or better than historical levels, Kendall said.

For example, he added, more programs are showing savings relative to initial baselines than in the past, major program cost growth in two-year increments has a downward trend, and contractors for major programs are doing a better job of meeting their contract cost targets.

Should-cost management is separate from budgeting numbers, Kendall noted.

“The idea is to do better than the budget, and we’re seeing some significant improvements here. This is statistically significant data that shows that the fraction of our programs in development showing savings has gone up pretty significantly,” he said, adding that the report shows similar results for production.

Encouraging Trends

“I think that can be attributed, at least in large part, to our change in emphasis throughout the workforce on should-cost and the idea that it’s our job to get costs down, not just to stay under our budget or even just to spend the budget,” Kendall said.

Addressing competition, or the percentage of DoD awards that are competitive, Kendall said he’s encouraged by the trend in the last year but “worried about what the budget climate in fiscal year 2016 is going to do to us.”

When budgets get tight and the department is spending less money, there are fewer opportunities to compete contracts, Kendall said. He noted that the Obama administration has emphasized small business, and added that he would like to see more Defense Department contracts go to small businesses.

“In general, I think we’re at least holding the line there, and more recently moving in the right direction,” he said.

Cheryl Pellerin is a reporter and science writer for DoD News, which provides news and feature articles for the U.S. Department of Defense website.
