
Design Thinking and DevOps Combine for Better Customer Experience

How citizens interact with government websites reveals much about how to improve them – as long as you’re paying attention, said Aaron Wieczorek, a digital services expert with the U.S. Digital Service team at the Department of Veterans Affairs.

“At VA we will literally sit down with veterans, watch them work with the website and apply for benefits,” he said. The aim is to make sure the experience is what users want and expect, he said, not “what we think they want.”

Taking copious notes on their observations, the team then sets to work on programming improvements that can be quickly put to the test. “Maybe some of the buttons were confusing or some of the way things work is confusing – so we immediately start reworking,” Wieczorek explained.

Applying a modern agile development approach means digital services can immediately put those tweaks to the test in their development environment. “If it works there, good. Then it moves to staging. If that’s acceptable, it deploys into production,” Wieczorek said.
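The article doesn’t describe the team’s actual tooling, but the promotion flow Wieczorek describes can be sketched roughly in code. Below is a minimal, hypothetical Python sketch in which the environment names, deploy step and checks are placeholders, not anything from VA’s pipeline:

```python
# A minimal, hypothetical sketch of the promotion flow described above:
# a change is deployed and verified in development, then staging, and only
# then in production. Environment names and the deploy/check steps are
# placeholders, not VA's actual tooling.

ENVIRONMENTS = ["development", "staging", "production"]

def deploy(change_id: str, environment: str) -> None:
    print(f"Deploying {change_id} to {environment}...")  # stand-in for the real deploy step

def checks_pass(change_id: str, environment: str) -> bool:
    # Stand-in for automated tests and user-feedback review in that environment.
    print(f"Running checks for {change_id} in {environment}...")
    return True

def promote(change_id: str) -> None:
    """Walk a change through each environment, stopping if checks fail."""
    for env in ENVIRONMENTS:
        deploy(change_id, env)
        if not checks_pass(change_id, env):
            print(f"Checks failed in {env}; promotion stopped.")
            return
    print(f"{change_id} is live in production.")

promote("button-label-fix")
```

The point of the gate is simply that a change never reaches production without first passing checks in development and staging.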

That process can happen in days. Vets.gov deploys software updates into production 40 times per month, Wieczorek said, and agency-wide, across all kinds of environments, some 600 times per month.

Case in point: Vets.gov’s digital Form 1010 EZ, which allows users to apply for VA healthcare online.

“We spent hundreds of hours watching veterans, and in the end we were able to totally revamp everything,” Wieczorek said. “It’s actually so easy now, you can do it all on your phone.” More than 330,000 veterans have applied that way since the digital form was introduced. “I think that’s how you scale things.”

Of course, one problem remains: Vets.gov is essentially a veteran-friendly alternative site to VA.gov, which may not be obvious to search engines or veterans looking for the best way in the door. Search Google for “VA 1010ez” and the old, mobile-unfriendly PDF form still shows as the top result. The new mobile-friendly application? It’s the third choice.

At the National Geospatial-Intelligence Agency, developers take a similar approach, but focus hard on balancing speed, quality and design for maximum results. “We believe that requirements and needs should be seen like a carton of milk: The longer they sit around, the worse they get,” said Corry Robb, product design lead in the agency’s Office of GEOINT Services. “We try to handle that need as quickly as we can and deliver that minimally viable product to the user’s hands as fast as we can.”

DevOps techniques, where development and production processes take place simultaneously, increase speed. But speed alone is not the measure of success, Robb said. “Our agency needs to focus on delivering the right thing, not just the wrong thing faster.” So in addition to development sprints, his team has added “design sprints to quickly figure out the problem-solution fit.”

Combining design thinking – which focuses on using design to solve specific user problems – with DevOps is critical to the methodology, he said. “Being hand in hand with the customer – that’s one of the core values our group has.”

“Iterative development is a proven approach,” said Dennis Gibbs, who established the agile development practice in General Dynamics Information Technology’s Intelligence Solutions Division. “Agile and DevOps techniques accelerate the speed of convergence on a better solution. We continually incorporate feedback from the user into the solution, resulting in a better capability delivered faster to the user.”

One Big Risk With Big Data: Format Lock-In

Insider threat programs and other long-term Big Data projects demand users take a longer view than is necessary with most technologies.

If the rapid development of new technologies over the past three decades has taught us anything, it’s that each successive new technology will undoubtedly be replaced by another. Vinyl records gave way to cassettes and then compact discs and MP3 files; VHS tapes gave way to DVD and video streaming.

Saving and using large databases presents similar challenges. As agencies retain security data to track behavior patterns over years and even decades, ensuring the information remains accessible for future audit and forensic investigations is critical. Today, agency requirements call for saving system logs for a minimum of five years. But there’s no magic to that timeframe, which is arguably not long enough.

The records of many notorious trusted insiders who later went rogue – from Aldrich Ames at the CIA to Robert Hanssen at the FBI to Harold T. Martin III at the NSA – suggest the first indications of trouble began a decade or longer before they were caught. It stands to reason, then, that longer-term tracking should make it harder for moles to remain undetected.

But how can agencies ensure data saved today will still be readable in 20 or 30 years? The answer is in normalizing data and standardizing the way data is saved.

“This is actually going on now where you have to convert your ArcSight databases into Elastic,” says David Sarmanian, an enterprise architect with General Dynamics Information Technology (GDIT). The company helps manage a variety of programs involving large, longitudinal databases for government customers. “There is a process here of taking all the old data – where we have data that is seven years old – and converting that into a new format for Elastic.”

JavaScript Object Notation (JSON) is an open standard for data interchange favored by many integrators and vendors. As a lightweight data-interchange format, it is easy for humans to read and write and easy for machines to parse and generate. Non-proprietary and widely used, it is common in web application development, Java programming and the popular Elasticsearch search engine.

To convert data to JSON for one customer, GDIT’s Sarmanian says, “We had to write a special script that did that conversion.” Converting to a common, widely used standard helps ensure data will be accessible in the future, but history suggests that any format used today is likely to change in the future – as will file storage. Whether in an on-premises data center or in the cloud, agencies need to be concerned about how best to ensure long-term access to the data years or decades from now.
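The article doesn’t show Sarmanian’s script, but the general shape of such a conversion is straightforward. Below is a minimal, hypothetical Python sketch that turns rows from a legacy CSV export into newline-delimited JSON ready for Elasticsearch’s bulk API; the file names and field names are invented for illustration:

```python
import csv
import json

# Hypothetical example: convert a legacy CSV export of security events
# into newline-delimited JSON for Elasticsearch's _bulk API.
# File paths, field names and index name are illustrative only.

INPUT_CSV = "legacy_events.csv"        # e.g., columns: timestamp, host, user, action
OUTPUT_NDJSON = "events_bulk.ndjson"   # alternating action/document lines
INDEX_NAME = "security-events"

with open(INPUT_CSV, newline="") as src, open(OUTPUT_NDJSON, "w") as dst:
    for row in csv.DictReader(src):
        # Normalize the legacy record into a flat JSON document.
        doc = {
            "@timestamp": row["timestamp"],
            "host": row["host"],
            "user": row["user"],
            "action": row["action"],
        }
        # Each document is preceded by a bulk "index" action line.
        dst.write(json.dumps({"index": {"_index": INDEX_NAME}}) + "\n")
        dst.write(json.dumps(doc) + "\n")
```

Because the output is plain JSON text, the same records remain readable by people and by whatever tools eventually replace today’s search engines.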

“If you put it in the cloud now, what happens in the future if you want to change? How do you get it out if you want to go from Amazon to Microsoft Azure – or the other way – 15 years from now? There could be something better than Hadoop or Google, but the cost could be prohibitive,” says Sarmanian.

JSON has emerged as a favored standard, supported by a diverse range of vendors, from Amazon Web Services and Elastic to IBM and Oracle. In a world where technologies and businesses can come and go rapidly, its wide use is reassuring to government customers with a long-range outlook.

“Elasticsearch is open source,” says Michael Paquette, director of products for security markets with Elastic, developer of the Elasticsearch distributed search and analytical engine. “Therefore, you can have it forever. If Elasticsearch ever stopped being used, you can keep an old copy of it and access data forever. If you choose to use encryption, then you take on the obligation of managing the keys that go with that encryption and decryption.”

In time, some new format may be favored, necessitating a conversion similar to what Sarmanian is doing today to help their customer convert to JSON. Conversion itself will have a cost, of course. But by using an open source standard today, it’s far less likely that you’ll need custom code to make that conversion tomorrow.

To Do Agile Development, Contractors Should Be Agile, Too

Do it faster, make it better, be more efficient.

Every organization wants to improve. But knee deep in the day-to-day business of getting work done, few have the capacity to step back, survey the landscape and take time for more than incremental improvement.

Today, however, more and more government agencies are turning to private sector models to achieve better results.

They’re employing agile development techniques to roll out new systems more quickly; setting up innovation centers to encourage and facilitate new ways of thinking; and seeking ways to change the procurement process to lower barriers to innovation. Each approach has proven its merit in practice.

Going Agile
Agile software development emphasizes quick iterative development cycles, in which developers roll out and then improve one version after another, learning as they go, rather than striving for perfection at the end of a single, long development cycle. Agile has been a mainstay in commercial industry for years, but it’s still a relatively new concept in government. To be successful, agile demands changes on all sides of the government acquisition process.

The American Council for Technology & Industry Advisory Council (ACT-IAC) hosted an annual Igniting Innovation competition last spring, in which 140 public-private entries vied for recognition. Among the eight finalists: InCadence Strategic Solutions, which employed agile methodologies to develop a mobile fingerprint ID system for the FBI, allowing field agents to capture fingerprints on Android smartphones and tablets and then “receive near-real time identity information on a suspect, wherever they have cellular service or WiFi access to the Internet, worldwide.”

“Agile brings us closer to the end user,” says Anthony Iasso, InCadence president. “That’s really the key: It’s about users. Oftentimes, we find there’s too many people between developers and end users. Adhering to agile allows us to quickly get to the functionality that the end user needs. It reduces the risk that a long-running program misses the mark.”

Adding engineers to the mix is also important, Iasso notes. “You have to pair agile with V1 engineers. They can go to an empty compiler and make version 1 of an application. If you let them learn as they code, then you get great capabilities,” he said.

Now the system is being marketed to state and local law enforcement, along with the military.

When the EPA decided it was finally time to replace paper forms with a digital system for evaluating firms seeking approval to remediate lead-based paint from aging buildings, developers at contractor CGI shaved months off the project by employing agile development. The whole thing was done in six months, a 20 to 30 percent time savings compared with a conventional waterfall approach.

That meant EPA needed to be actively involved, observes Linda F. Odorisio, a vice president at CGI. “If you want to do it and you want to do it right, you have to be right in there at the same table with your sleeves rolled up.”

Center for Agile Innovation
The state of North Carolina’s Innovation Center is essentially a laboratory for agile development. “Before we had the center, practically all projects appeared to be waterfall in nature,” says Eric Ellis, head of the Innovation Center. “We maybe had one or two trying to do agile methodology.”

But one goal for the new center was to conduct proof-of-concept studies to test out new systems as they were being developed.

For example, a new application and renewal system for commercial fishing licenses was developed with agile techniques, saving the state $5 million in development costs.

“We would have gotten there [without agile], but it would have taken us longer and cost us more money,” says state Chief Information Officer Keith Werner. “I had reservations that they wouldn’t have gotten the functionality they were looking for.”

Innovation centers are not without risk. Separate from the rest of an organization, they can be seen as disconnected or elitist, creative experts focused on innovating but disconnected from the real business of government.

“If you create an innovation group then they’re seen as the innovation group,” says Ellis. “The rest of the people, who aren’t in the innovation group, don’t feel compelled to innovate.”

To guard against that, the North Carolina Innovation Center, located on the first floor of the state’s Department of Environment and Natural Resources HQ, has no full-time resources of its own. The idea is to create an open environment that can change as needs change. Even its office space is flexible, easily reconfigured to encourage open-space interactions, so ideas can be demonstrated with little fuss.

Agile Contracting
Changing the software development process alone is not enough, says Michael Howell, senior director of the Institute for Innovation and Special Projects at ACT-IAC. The contracting piece also has to change.

“You can’t say I want to be agile, so here’s what I’m going to do: ‘I’m going to put a request in my 2018 budget and wait and see if I get any money,’” Howell says. “It doesn’t work. They have to have flexibility to come up with the money. Then they have to have flexibility … to actually spend the money.”

Bob Gleason, director of the Division of Purchases and Supplies in the Virginia Department of General Services, says conventional procurement practices focus on known solutions and avoid unknowns, which add risk and uncertainty to programs.

Traditional requests for proposals define in specific detail exactly what is wanted, and suppliers respond in kind. “It gives you what it is you’re looking for,” Gleason says. “But there’s no incentive for any added value.”

It’s better, he said, to focus on the desired outcome, rather than on the detailed requirements intended to produce that same result, and to invite industry to offer innovative solutions the government customer may not have imagined on its own.

Contracts also must be flexible so vendors can improve their products or services over time, as they learn. Otherwise, vendors can be contractually locked into inefficient systems and approaches.

“You need to have a contract that’s not structured in fixed points in time, but is structured in a way that enables change over the life of the agreement,” Gleason says.

Managing Risk
“Part of the challenge we have as integrators is not just coming up with that new capability,” but also making sure that contracting officers’ technical advisors are well informed so they have the ability to compare very different proposals, says David Gagliano, chief technology officer for global solutions at General Dynamics Information Technology. Innovation inevitably involves risk, and contracting officials are trained to be risk-averse. Selection based on price is clear and straightforward in a way that value comparisons are not. So acquisition officers need skills to evaluate the benefits of different technical proposals and the confidence to take on reasonable amounts of risk.

“Two years ago, the government published the ‘TechFAR Handbook for Procuring Digital Services Using Agile Processes,’” Gagliano says. “It’s a pretty good starting point for contracting officers and their technical representatives who want to learn more about best practices in agile procurement.”

“People don’t want government to fail at all,” says Darrell West, director of the Center for Technology Innovation at the Brookings Institution. “When government fails, it often ends up on the front page. The private sector model of failing nine times to have that initial success has been difficult to incorporate in the public sector.”

For failure to be acceptable, then, the stakes must be low enough that the risk can be tolerated. Pilot programs and related short-term, proof-of-concept contracts can lower risk by reducing the amount of money at stake. West contends they can “encourage innovation while protecting against large-scale failures.”

The Defense Department’s DIUx initiative, which brings together venture capital firms, small technology businesses and Pentagon technologists to accelerate the injection of new technologies into the department, exemplifies the approach. New concepts can be conceived and proven in a low-risk, small-contract environment, independent of conventional contracting rules and schedules. Then, once the technology has matured to the point of a wider roll-out, bigger firms can compete for the right to manage that implementation.

In this case, government gets the best of both worlds: rapid-fire innovation from small firms unfettered by cumbersome acquisition rules followed by a managed implementation by experienced contractors steeped in the intricacies of doing business with large-scale government organizations.

‘IT is Mission’: How Data Is Revolutionizing Intelligence

Intelligence agencies must stop viewing information technology as a back-office support service and instead elevate it to its rightful place as a mission-critical capability, argues Sean Roche, associate deputy director of digital innovation for the CIA.

“Stop treating IT like a service. Stop treating IT with the word ‘customer.’ Stop treating IT like it’s part of the admin portion of your organization,” he told the Department of Defense Intelligence Information Systems (DoDIIS) worldwide conference Aug. 2. “IT is mission.”

The distinction “changes the way we go about funding and prioritizing programs,” he said. It also has significant implications for the kinds of skills and talent intelligence agencies will need in the future, and for how systems are built, managed and designed.

For much of the past two decades, networks were the critical assets in intelligence – the ability to interact and quickly communicate across secure networks helped deliver data to the tactical edge more and more quickly. The time lag between intelligence collection and delivery shrank from days to hours, minutes, even seconds. But in a data-centric world, intelligence is crossing into new territory. It is increasingly possible to predict likely outcomes, allowing national security leaders to make better decisions more quickly.

Open-source data analysis is one of the valuable data streams helping in that process, flooding the IC with new insights that must be tagged, collated, correlated, verified and analyzed. The volumes exceed human capacity, requiring intelligent systems, informed by data and able to learn as they go, to sort through all the signals to identify patterns that can point to truths.

Streaming social media analytics has become as valuable today as “the information we get clandestinely,” Roche said. The old notion that only information stamped “top secret” is of real value in intelligence has long since expired, he said. “Mining that rich vein of open source data that’s increasing rapidly in real time, doing sentiment analysis on it, is going to be more and more important,” he said.

IT Grows in Stature
Time and again, DoDIIS speakers returned to this theme: IT as not just an enabler, but critical to mission success. The IT enterprise is not just a network, but the underpinning of the IC’s future contributions to national security.

Under Secretary of Defense for Intelligence Marcel Lettre acknowledged that the Pentagon “no longer controls the pace of change, especially in IT,” but emphasized that “technology is the secret sauce in the Third Offset strategy.” The Third Offset is Defense Secretary Ashton Carter’s long-term strategic vision to ensure the United States maintains technological superiority through sustained investment in emerging technologies like data science and machine intelligence.

Providing the underlying framework to enable that technological superiority is the primary reason Director of National Intelligence James Clapper is still on the job, he told a packed DoDIIS auditorium. Ensuring that IC ITE, the Intelligence Community Information Technology Enterprise, was sufficiently mature that it could not be reversed is not so much about providing infrastructure as it is about enhancing mission effectiveness across the IC.

“Data is a community property,” he said. “Integration simply means bringing the best and most appropriate resources from across the community together and applying them to our most challenging intelligence problems.”

And Marine Lt. Gen. Vince Stewart, director of the Defense Intelligence Agency (DIA), said the “decisive advantage over our adversaries in the future” will not be kinetic weapons or ground maneuver skills, but rather “this cognitive fight that matters most.”

Cross-Disciplinary Teams
Data and the ability to rapidly extract meaning from it are the means by which leaders expect to gain that decisive advantage, but achieving that nirvana will take more than technical acumen. Innovators across the IC are also looking at new organizational models and the kinds of integrated skill sets analysts and technologists will need to bring to the IC in the future.

“We need people who actually understand the math and can help validate and do things,” said Michael McCabe, chief of applied research in DIA’s Chief Technology Office. “We need programmers and data experts who can move data, groom it, secure it. And you need users who actually understand what [the technology is] doing. And it’s more than analysis. It’s decision making, too.”

Cross-disciplinary teams are looking at what future skill sets DIA will need, what a data science career field and career path will look like, and how the technical skills of IT specialists and the analytic skills of analysts will begin to merge over time, he said.

“The technological skill set new analysts will need is significantly more than in the 1970s,” McCabe continued. “We’re not really at the full answer yet, but where we’re trending is this: IT services will be IT services and analysts will be analysts, but somewhere in the middle, you’re going to see IT reach out to the analyst and the analyst reach out to IT. You’ll have analysts who can code and coders who know how to think like an analyst. And I think we’re going to hit probably four different user groups on that spectrum.”

Already, he said, this is happening. Where the breakthroughs are taking place, it’s because the walls between analysts and coders are coming down, and the collaboration is increasing.

“Analyst cells, divisions and branches will include teams with that full spectrum of skills,” McCabe said. DIA has assembled multidisciplinary innovation pods – analysts, programmers and human resources specialists – to help conceive of the skill sets and training and career paths future analysts will need. The agency is also “trying to do experiments to inform what new data sets should look like in the future,” he said, a process that likewise demands both “technological and also analysis skills.”

Across the agency, he continued, “we’ve got a bunch of users who want to get better at analytics, want to get better at decision making, tracking and monitoring, statistics and metrics.”

Roche cited the same phenomenon at CIA: “What we’re finding is most successful is the data analyst sitting with a programmer, sitting with a mission specialist, and often with an operator.”

Indeed, this pattern is repeating itself wherever large data sets and the desire to unlock their secrets have emerged, said Stan Tyliszczak, chief engineer at General Dynamics Information Technology. “We’ve seen it in healthcare, medical research and public safety, not just Intelligence or DoD: Data analytics is a mission function, not an IT support function,” he says. “Sure, IT managers can help set up a support function or acquire a software tool. But big data solutions come out of the mission side of an organization. Increasingly, we’re seeing the mission specialists becoming IT savvy, and the IT staff bringing a new perspective to mission analysis.”

For example, the National Institutes of Health (NIH) found it’s easier to teach medical researchers how to program than it is to teach programmers the necessary medical knowledge to extract information from a data pool. “If you look at NIH’s Health IT model, they have these cells and the researchers become IT specialists, and the people who have IT knowledge gradually learn more about health.”

So just as agencies are discovering they can extract new knowledge by aligning disparate data sources, they’re also discovering the best way to do it is by assembling people with disparate skill sets into teams and then setting them loose on these complex data problems.

The long-term implications are profound. Just a few years ago, IT was part of the CIA’s administrative directorate, Roche said. IT was a support service, just like human resources and accounting departments. “Today, we realize it is inextricably linked to our ability to respond.”

Congress Wants Details on Pentagon’s LPTA Contracts

Congress wants the Defense Department to shed more light on how it’s using lowest-price, technically acceptable (LPTA) contracts – and to report back by March 1, 2017, with the results. The measure is included in both the House and Senate versions of the National Defense Authorization bill.

The House Armed Services Committee’s report on the bill says “anecdotal examples suggest … widespread over-use of LPTA processes and contracts,” with potentially “substantial unintended consequences.”

The measure, which was cheered by the Professional Services Council (PSC) and other industry groups, would require DoD to survey contracting officers to determine how well they understand official policy guidance and to compile data on how often and for what kinds of products and services LPTA was used in fiscal years 2015 and 2016.

Pulling such data will be difficult; budget analysts say it’s almost impossible to get an accurate count without actually reading the acquisition documents. But the focus that it could bring will be helpful, said Alan Chvotkin, executive vice president and counsel at PSC.

“What we lack today is data about how extensive LPTA is actually being used in the federal government or by the Department of Defense,” Chvotkin said. “This is an attempt to try to document how extensive it’s being used, by whom and for what procurements, in order to help inform future policy.” The trouble is, government reporting systems do not capture all uses of LPTA. So he’s not expecting the data to be perfect, but it’s a start.

The legislative push follows an April 1 policy memo from Claire Grady, director of defense procurement and acquisition policy, which aims to limit LPTA to situations where price is truly the only differentiating factor. The memo offers detailed formal guidance that follows a 2015 directive from Pentagon acquisition chief Frank Kendall, in which he urged more responsible use of LPTA as a source-selection method.

“The new procedures will have a significant impact on proposal evaluations,” says Leonardo Manning, director of the Center for Contracting at Defense Acquisition University. “It is revamping the DoD source-selection process, specifically on how the department handles best value tradeoffs and lowest-price, technically acceptable procurements.”

Grady’s memo introduced a new source selection option called Value Adjusted Total Evaluated Price (VATEP), which allows acquisition officials to spell out how much they value performance that exceeds minimum thresholds on a program’s most important requirements.

Manning offers a hypothetical aircraft program as an example. “If the minimum speed is, say, 500 knots, and you can do 600, then I can say I’m willing to pay X percent or X dollars more to achieve that higher performance,” he explained. “The idea is to encourage people to go for the objective, the maximum, knowing that the government is willing to pay extra to achieve that level of performance.”
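The memo’s actual procedures are more detailed, but the arithmetic Manning describes can be illustrated with a hedged, hypothetical example in which the government states it will credit 5 percent of the evaluated price for meeting the 600-knot objective rather than just the 500-knot threshold (all figures invented):

```python
# Hypothetical VATEP-style adjustment: the solicitation says the government
# will credit a bidder 5 percent of its proposed price, for evaluation
# purposes only, if the offer meets the objective speed instead of the
# minimum threshold. All figures are illustrative.

def adjusted_evaluated_price(proposed_price: float,
                             meets_objective: bool,
                             credit_rate: float = 0.05) -> float:
    """Reduce the evaluated (not paid) price by the stated credit
    when the offer meets the objective performance level."""
    credit = proposed_price * credit_rate if meets_objective else 0.0
    return proposed_price - credit

# Offeror A: $100M, threshold speed only -> evaluated at $100M
# Offeror B: $103M, objective speed      -> evaluated at $103M - 5% = $97.85M
print(adjusted_evaluated_price(100_000_000, meets_objective=False))
print(adjusted_evaluated_price(103_000_000, meets_objective=True))
```

In this sketch, an offeror proposing $103 million at the objective speed is evaluated at about $97.9 million and beats a $100 million threshold-only offer – the incentive to go for the objective that Manning describes.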

Manning expressed confidence that the new approach could work, but acknowledged it will push acquisition officers into uncharted territory. “We’ve got instructions, we will develop examples,” he said. “The government will have to clearly identify the minimum thresholds and the maximum performance objectives and spell out what they’re willing to pay for in the RFP.”

Whether the concept can be applied to technical and professional services contracts remains to be seen. Defining outcomes and requirements there can be a challenge, as can identifying objective means of comparison. Hardware comes with specifications, but the closest you can get with people are certifications and educational credentials, which are far less precise. Plus, Manning noted, in many cases, native ability or technical savvy can trump paper credentials.

“It’s very difficult,” he said. “There are levels of IT expertise. That’s what we go by. But that new person that may be on the job for less than two years may be a brainiac when it comes to this work.”

Past performance and experience in similar kinds of work should count heavily in the source-selection process for such contracts, Manning said. A vendor’s ability to hire, train and retain capable personnel should count.

“It’s tricky in the service area,” he said. “You have to be mindful of that.”

Leveraging government-wide acquisition contracts (GWACs) or other contract vehicles, such as General Services Administration (GSA) schedules, can be a route around that, he said. But such contracts are no guarantee against the inappropriate use of LPTA as a final source-selection criterion.

Manning acknowledged that lowest-price contracts put pressure on vendors to find ways to cut costs, including squeezing labor rates and potentially compromising service. But if the requirements are written properly and the program managers are working with the contractor closely, he said, problems should be identified quickly and worked out through open dialog. “Once we sign the contract, we’re partners,” he said.

Including performance incentives in the deal can help, especially when it comes to handling unusual circumstances or even crises. “Think about shoveling snow,” he said. “I need to provide incentives so you surge when I need you to surge.” The same holds true for many IT staffing or support scenarios, he said.

Getting Around the System
For complex or unusual technology contracts, one tactic is to employ DoD’s Other Transaction Authority (OTA), which effectively lets acquisition officers bypass most legal restrictions when seeking innovations, such as leading-edge technology or prototypes from commercial sources. “It gives you the flexibility necessary to develop agreements tailored to a particular transaction,” Manning said.

But OTA is effectively a back door around competition. Set up in the 1950s when NASA was launching the space program, OTA isn’t subject to the Competition in Contracting Act, the Contract Disputes Act and the Procurement Integrity Act or to Pentagon audits, Manning said. OTA is best suited to developmental programs with companies that lack the experience and structures demanded in most other government contracts.

But while OTAs can be useful under the guise of “innovation” to experiment with new technology, Chvotkin said the authority effectively sets up a parallel procurement system for a separate class of contractors. That ultimately runs counter to fair and open competition, and is not guaranteed to produce maximum value for the government.

“We do not support undercutting or circumventing the system for some category of companies,” Chvotkin said. “If we can streamline the procurement system for some, we ought to be able to streamline it for everybody.

“I challenge the premise that current contractors cannot be innovative. I challenge the premise that current contractors can’t be agile. I believe, on the other hand, that the current system, as rules-bound and compliance-oriented as it is today, inhibits the ability to be agile or innovative.”

Indeed, “working end-arounds” to solve acquisition challenges should not be the norm, said Lt. Gen. Bill Bender, chief information officer for the Air Force, during an AFCEA event in April. “We need a process to actually do things, act different and change our culture,” he said.

Lt. Gen. Susan Lawrence, a former Army CIO now with Booz Allen Hamilton and a member of that same panel, said LPTA contracts make sense for commodities and can make sense for services contracts, as well. But comparing offers where price is just one of several variables makes the process harder.

Rather than add complexity with VATEP, she argues for reducing complexity by setting a budget and sticking to it. “Take price off the table,” she said. By bidding to budget, she argued, vendors can compete for a known price point and acquisition officers can evaluate proposals on the value they offer for the pre-set price.

Chvotkin agreed this system can be effective, citing the Coast Guard’s National Security Cutter program as an example.

Predicting Elections, Crop Yields and Cancer: Feds Think Big with Data

Intelligence agencies want to predict flu outbreaks and the potential for protests and political unrest. The Department of Agriculture wants more accurate projections for crop yields. Doctors wonder if studying X-rays and ultrasounds could lead them to identify early indicators of cancer down to the cellular level.

Data scientists may not be able to do all these things yet, but the potential for taking current data and extrapolating its portent for the future is growing daily, limited only by the imagination – and the ability to amass and process data sets measured in petabytes.

Philippe Lostaunau, technical advisor to Jason Matheny, director of the Intelligence Advanced Research Projects Agency (IARPA) in McLean, Va., is using commercial data to try to help analysts forecast events around the world relevant to intelligence and national security.

For example, a commercial service analyzes satellite images to project retail sales based on the number of cars parked outside. By repurposing that same data source, intelligence analysts have found they can monitor parking volumes at hospitals and clinics to predict flu outbreaks around the world.

What’s more, they can improve the accuracy of their forecasts by merging multiple data streams. “OpenTable has an API that enables you to see the volume of reservations and cancellations,” he explained at the Advanced Technology Research Council Big Data Forum June 30. “It turns out that it has a large signal for predicting flu.”

In addition to parking volumes, they can also monitor Internet searches on flu symptoms, web traffic to medical or health sites, cancelled restaurant reservations and changes to the relative mobility of smart phones. Taken together, he said, researchers have shown they can predict flu outbreaks 26 days in advance with 70 percent accuracy, civil unrest eight days in advance with 75 percent accuracy and election results 14 days in advance with 85 percent accuracy.
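IARPA has not published the models behind those figures; as a purely illustrative sketch of how several such open-source signals might be merged into one risk score, consider the following Python fragment, in which the feature names and weights are invented:

```python
import math

# Purely illustrative: combine several open-source signals into a single
# flu-outbreak risk score for a city. Weights and bias are invented for
# demonstration; real forecasting models are trained on historical data.

def flu_risk(clinic_parking_ratio: float,     # parking volume vs. seasonal baseline
             symptom_search_ratio: float,     # flu-symptom searches vs. baseline
             reservation_cancel_ratio: float  # restaurant cancellations vs. baseline
             ) -> float:
    """Return a 0-1 risk score from a simple logistic combination of signals."""
    weights = {"parking": 1.2, "search": 1.5, "cancel": 0.8}
    bias = -3.0
    score = (weights["parking"] * clinic_parking_ratio
             + weights["search"] * symptom_search_ratio
             + weights["cancel"] * reservation_cancel_ratio
             + bias)
    return 1.0 / (1.0 + math.exp(-score))

# A week where all three signals run well above baseline yields a high score.
print(round(flu_risk(1.4, 1.8, 1.1), 2))
```

A real system would learn those weights from historical data and validate forecasts against events that have not yet occurred, as Lostaunau describes below.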

IARPA continues to mine other types of commercial data that could help forecast developments in a range of fields. Researchers look only at foreign locations and use current data to ensure known outcomes don’t skew algorithms used to make predictions.

“All the technologies we are investigating are tested in real time. We have the system send us forecasts today for events that haven’t occurred yet to ensure there is no bias,” Lostaunau said.

Can Data Boost Crop Yields?

Domestically, Michael Valivullah, chief technology officer at the National Agricultural Statistics Service, part of the U.S. Department of Agriculture (USDA), is looking to solve a different set of problems. His department still relies on written surveys to understand which crops farmers are planting and harvesting – and in what volumes. The farmers, meanwhile, are increasingly turning to sophisticated tools to manage that kind of information. The question is how can the government tap into that resource?

Farmers already have Internet-connected combine harvesters that report moisture and fertilizer levels and can access satellite imaging data to gain insights about their acreage. But drones could provide the most data yet – and change the way farmers watch their fields. “Satellites can look at one pixel to 100 square feet,” Valivullah said. “But if I use a drone, which is pretty cheap, I can put a camera on it, and it can look at a resolution of one pixel per square inch. That’s 145,000 times more resolution!

“So I can spot an insect, type of infestation, the dryness, the yellowing on a plant and the type – whether it’s due to nitrogen deficiency or water or something else.”

For USDA, the payoff would be a greater understanding of the productivity of every acre of American farmland. “Imagine if USDA, instead of asking farmers [survey questions], gets permission from farmers to go directly to consolidators and get data there,” Valivullah said, articulating an idea he’s trying to sell to USDA management. In time, he said, farmers could add multispectral sensors to their drones to monitor moisture levels and intercede more quickly than is possible today by simply walking fields and inspecting plants by hand.

Such changes point to a very different future for USDA, Valivullah said. Instead of hiring survey creators, he said, USDA will need “model creators” to produce forecasting models from data collected from external sources. “We need to get out of the business of collecting and managing the data,” Valivullah said. “We should be tool builders, not collectors. That’s a better use of taxpayer money.”

Can Data Predict Disease?

Col. Albert Bonnema, chief of information delivery at the Defense Health Agency, is helping Defense Department doctors make sense of the vast stores of medical data at their disposal.

“With medical data, to really get to the good stuff, it’s unstructured data,” Bonnema said. Doctors taking notes in a medical file don’t all record their observations in identical formats that would create easily searchable data fields. “It’s a different use case, different analytical problem,” Bonnema said. “It’s not complicated data, but you have to have a certain amount of medical expertise to know what you’re looking for.”

Doctors are now breaking down medical notes into symptoms and are making progress researching traumatic brain injury and post-traumatic stress, he said, but the challenges are enormous because of the volume of data and the computing horsepower needed to sort through it all.
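Bonnema doesn’t describe the tooling, but the first step – turning free-text notes into searchable symptom fields – can be sketched very simply. The lexicon below is hypothetical and far smaller than any real clinical vocabulary:

```python
import re

# Hypothetical first pass at structuring free-text clinical notes:
# tag each note with terms from a small symptom lexicon. Real systems
# use far richer clinical vocabularies and NLP, not simple matching.

SYMPTOM_LEXICON = ["headache", "dizziness", "memory loss",
                   "insomnia", "anxiety", "irritability"]

def extract_symptoms(note: str) -> list[str]:
    """Return the lexicon terms that appear in a clinician's note."""
    text = note.lower()
    return [term for term in SYMPTOM_LEXICON
            if re.search(r"\b" + re.escape(term) + r"\b", text)]

note = "Patient reports persistent headache and insomnia since deployment."
print(extract_symptoms(note))   # ['headache', 'insomnia']
```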

The “ultimate use case” for large data sets in medical research is precision medicine, Bonnema said. His office is in the midst of helping ingest and analyze five petabytes of data representing the full genome sequences of 700 people. Making the project still more challenging: Researchers want to connect that database to the patients’ medical data.

“Now you get situational awareness around people,” he said. “It’s pretty amazing what you start to find out.”

The challenges aren’t highly complex data science issues, though. For much of this research, it’s a combination of maintaining ready access to great masses of data storage; painstaking organization of unstructured data; and security for personal information (PI) and personally identifiable information (PII). Those are basic information technology challenges, but on a scale over and above what most organizations have to deal with.
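The article doesn’t say how the program protects PI and PII while linking genomic and clinical records; one common pattern, sketched here with invented identifiers and a placeholder key, is to join the data sets on keyed pseudonyms rather than raw patient IDs:

```python
import hmac
import hashlib

# Illustrative only: replace a direct patient identifier with a keyed
# pseudonym so genomic and clinical records can be joined without
# exposing the underlying ID. Key management is the hard part in practice.

SECRET_KEY = b"replace-with-a-managed-secret"   # placeholder; never hard-code in real systems

def pseudonymize(patient_id: str) -> str:
    """Return a stable, non-reversible token for a patient identifier."""
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

genome_record = {"patient": pseudonymize("12345"), "genome_file": "sample_12345.vcf"}
clinical_record = {"patient": pseudonymize("12345"), "diagnosis": "hypertension"}

# The two records now share a join key that is not the raw identifier.
print(genome_record["patient"] == clinical_record["patient"])   # True
```

Managing that key then becomes a long-term obligation, much as Paquette notes for encryption keys.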

And those challenges will only grow.

Looking still further out, Bonnema said the latest challenge to come his way is an effort to take digitized images from X-rays, mammograms and the like and analyze them “at the pixel level to see if they can identify things sooner.” By reviewing past mammograms of cancer patients, for example, might they be able to identify tell-tale signs that might have predicted the disease? And could they learn to identify those signs sooner, before cancer takes root?

A lot of very basic research needs to be done before anyone can prove pixel-sized anomalies can predict cancer, he said. But to do that research, scientists will need to be able to process some 55 petabytes of data. “That’s the kind of compute and store that’s necessary,” Bonnema said. Finding cost-effective ways to manage such volumes of data will be a major focus for defense health information technology experts for a long time to come.
