Evidence-Based Policy Act Could Change How Feds Use, Share Data

As government CIOs try to get their arms around how the Modernizing Government Technology (MGT) Act will affect their lives and programs, the next big IT measure to hit Congress is coming into focus: House Speaker Paul Ryan’s (R-Wis.) “Foundations for Evidence-Based Policymaking Act of 2017.”

A bipartisan measure now pending in both the House and Senate, the bill has profound implications for how federal agencies manage and organize data – the key to putting the data that informs policy decisions into the public domain. Sponsored by Ryan in the House and by Sen. Patty Murray (D-Wash.) in the Senate, the measure would:

  • Make data open by default. That means government data must be accessible for research and statistical purposes – while still protecting privacy, intellectual property, proprietary business information and the like. Exceptions are allowed for national security.
  • Appoint a chief data officer responsible for ensuring data quality, governance and availability
  • Appoint a chief evaluation officer to “continually assess the coverage, quality, methods, consistency, effectiveness, independence and balance of the portfolio of evaluations, policy research, and ongoing evaluation activities of the agency”

“We’ve got to get off of this input, effort-based system [of measuring government performance], this 20th century relic, and onto clearly identifiable, evidence-based terms, conditions, data, results and outcomes,” Ryan said on the House floor Nov. 15. “It’s going to mean a real sea change in how government solves problems and how government actually works.”

Measuring program performance in government is an old challenge. The Bush and Obama administrations each struggled to implement viable performance measurement systems. But the advent of Big Data, advanced analytics and automation technologies holds promise for quickly understanding how programs perform and whether results match expectations. It also holds promise for both agencies and private sector players to devise new ways to use and share government data.

“Data is the lifeblood of the modern enterprise,” said Stan Tyliszczak, vice president of technology integration with systems integrator General Dynamics Information Technology. “Everything an organization does is captured: client data; sensor and monitoring data; regulatory data, or even internal IT operating data. That data can be analyzed and processed by increasingly sophisticated tools to find previously unseen correlations and conclusions. And with open and accessible data, we’re no longer restricted to a small community of insiders.

“The real challenge, though, will be in the execution,” Tyliszczak says. “You have to make the data accessible. You have to establish policies for both open sharing and security. And you have to organize for change – because new methods and techniques are sure to emerge once people start looking at that data.”

Indeed, the bill would direct the Office of Management and Budget, the Office of Government Information Services and the General Services Administration to develop and maintain an online repository of tools, best practices and schema standards to facilitate open data practices across the federal government. Individual agency chief data officers (CDOs) would be responsible for applying those standards to ensure data assets are properly inventoried, tagged and cataloged, complete with metadata descriptions that enable users to find and consume the data for any number of purposes.
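
What that cataloging discipline might look like in practice can be sketched in code. Below is a minimal, hypothetical inventory entry in Python, loosely inspired by the DCAT-style metadata schemas already used on Data.gov; the field names, agency and endpoint URL are illustrative assumptions, not anything the bill itself prescribes.

```python
# Minimal sketch of a machine-readable data-inventory entry (hypothetical schema).
# Field names are assumptions loosely modeled on DCAT-style catalog metadata,
# not an official government standard.

REQUIRED_FIELDS = {"title", "description", "keyword", "publisher",
                   "accessLevel", "distribution"}

catalog_entry = {
    "title": "Quarterly Program Outcomes",               # human-readable name
    "description": "Outcome metrics reported by grantees, by quarter.",
    "keyword": ["outcomes", "grants", "performance"],     # tags for discovery
    "publisher": "Example Agency",                        # hypothetical agency
    "accessLevel": "public",                              # public / restricted / non-public
    "distribution": [{
        "mediaType": "application/json",
        "accessURL": "https://api.example.gov/v1/program-outcomes",  # hypothetical endpoint
    }],
}

def missing_metadata(entry: dict) -> set:
    """Return any required descriptive fields the entry still lacks."""
    return REQUIRED_FIELDS - entry.keys()

if __name__ == "__main__":
    gaps = missing_metadata(catalog_entry)
    print("Catalog entry is complete." if not gaps else f"Missing fields: {gaps}")
```

The point of a check like this is less the code than the habit it enforces: a dataset that fails it is, by definition, one an outside user cannot reliably find or understand.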

Making data usable by outsiders is key. “Look at what happened when weather data was opened up,” says Tyliszczak. “A whole new class of web-based apps for weather forecasting emerged. Now, anyone with a smartphone can access up-to-the-minute weather forecasts from anywhere on the planet. That’s the power of open data.”

Choosing Standards
Most of the government’s data today is not open. Some is locked up in legacy systems that were never intended to be shared with the public. Some lacks the metadata and organization that would make it truly useful by helping users understand what individual fields represent. And most is pre-digested – that is, the information is bound up in PDF reports and documents rather than in formats consumable by analytics tools.

Overcoming all that will require discipline in technology, organization and execution.

“Simply publishing data in a well-known format is open, but it is not empowering,” says Mike Pizzo, co-chair of the OASIS Open Data Protocol (OData) Technical Committee and a software architect at Microsoft. “Data published as files is hard to find, hard to understand and tends to get stale as soon as it’s published.… To be useful, data must be accurate, consumable and interoperable.”

Some federal agencies are already embracing OData for externally facing APIs. The Department of Labor, for example, built a public-facing API portal providing access to 175 data tables within 32 datasets, with more planned. Pizzo says other agencies, both inside and outside the U.S., have used the standard to share, or “expose,” labor, city, health, revenue, planning and election information.

Some agencies are already driving in this direction by creating a data ecosystem built around data and application programming interfaces (APIs). The Department of Veterans Affairs disclosed in October it is developing plans to build a management platform called Lighthouse, intended to “create APIs that are managed as products to be consumed by developers within and outside of VA.”

The VA described the project as “a deliberate shift” to becoming an “API-driven digital enterprise,” according to a request for information published on FedBizOps.gov. Throughout VA, APIs will be the vehicles through which different VA systems communicate and share data, underpinning both research and the delivery of benefits to veterans and allowing a more rapid migration from legacy systems to commercial off-the-shelf and Software-as-a-Service (SaaS) solutions. “It will enable creation of new, high-value experiences for our Veterans [and] VA’s provider partners, and allow VA’s employees to provide better service to Veterans,” the RFI states.

Standardizing the approach to building those APIs will be critical.

Modern APIs are based on REST (Representational State Transfer), a “style” for interacting with web resources, and on JSON (JavaScript Object Notation), a popular format for data interchange that is more efficient than XML (eXtensible Markup Language). These standards by themselves do not solve the interoperability problem, however, because they offer no standard way of identifying metadata, Pizzo says. This is what OData provides: a metadata description language that establishes common conventions and best practices for metadata, applied on top of REST and JSON. Once applied, OData provides interoperable open data access, where records are searchable, recognizable, accessible and protectable – largely because of the metadata.
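
To make that layering concrete, the sketch below shows how a generic client could discover what a hypothetical OData v4 service offers. The service document is ordinary JSON listing the available entity sets, and the $metadata document describes their structure; the endpoint URL here is an assumption for illustration, while the service-document and $metadata conventions are standard OData behavior.

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical OData v4 service root; a real agency endpoint would differ.
SERVICE_ROOT = "https://api.example.gov/odata/v4/"

# The service document is plain JSON listing the entity sets the service exposes.
service_doc = requests.get(SERVICE_ROOT, headers={"Accept": "application/json"}).json()
for entity_set in service_doc.get("value", []):
    print(entity_set["name"], "->", SERVICE_ROOT + entity_set["url"])

# The $metadata document describes each entity set's fields, types and keys.
# That machine-readable description is what lets a client make sense of data
# it has never seen before.
metadata_doc = requests.get(SERVICE_ROOT + "$metadata").text
print(metadata_doc[:300])
```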

OData is an OASIS and ISO standard and is widely supported across the industry, including by Microsoft, IBM, Oracle and Salesforce, among many others.

“There are something like 5,000 or 6,000 APIs published on the programmable web, but you couldn’t write a single application that would interact with two of them,” he says. “What we did with OData was to look across those APIs, take the best practices and define those as common conventions.” In effect, OData sets a standard for implementing REST with a JSON payload. Adopting the standard provides a shortcut for settling on request and response headers, status codes, HTTP methods, URL conventions, media types, payload formats and query options, so development effort can focus on the more important work of actually using the data.
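
Those shared conventions are easiest to see in a query. The sketch below uses standard OData query options ($select, $filter, $orderby, $top) against a hypothetical entity set; the URL and field names are illustrative assumptions, but the options themselves, and the JSON “value” array in the response, are defined by the OData standard.

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical OData v4 entity set; only the query-option conventions are standard.
BASE_URL = "https://api.example.gov/odata/v4/ProgramOutcomes"

params = {
    "$select": "ProgramId,Quarter,OutcomeScore",               # just the columns we need
    "$filter": "Quarter eq '2017-Q4' and OutcomeScore gt 75",  # server-side filtering
    "$orderby": "OutcomeScore desc",
    "$top": "10",                                              # first ten matching records
}

response = requests.get(BASE_URL, params=params, headers={"Accept": "application/json"})
response.raise_for_status()

# OData v4 responses carry the matching records in a JSON array named "value".
for record in response.json()["value"]:
    print(record["ProgramId"], record["OutcomeScore"])
```

Because the same handful of options works against any conforming service, a single client could, in principle, query datasets from different agencies without custom integration code for each one.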

This has value whether or not a data owner plans to share data openly.

It doesn’t matter whether APIs will be accessed by an agency’s own systems – as when one VA system taps into the database of another agency’s system – or by consumers – as when a veteran logs into a user portal. In the Pentagon, one question never goes away: “How do we improve data interoperability from a joint force perspective?” said Robert Vietmeyer, associate director for Cloud Computing and Agile Development in the Enterprise Services and Integration Directorate of the Office of the Defense Department CIO, at a recent Defense Systems forum.

“I talk to people all the time, and they say, ‘I’m just going to put this into this cloud, or that cloud, and they have a data integration engine or big data capability, or machine learning, and once it’s all there all my problems will be solved.’ No, it’s not. So when the next person comes in and says, ‘You have data I really need, open up your API, expose your data so I can access it, and support that function over time,’ they’re not prepared. The program manager says: ‘I don’t have money to support that.’”

Vietmeyer acknowledges the Pentagon is behind in establishing best practices and standards. “The standards program has been in flux,” he said. “We haven’t set a lot, but it’s one of those areas we’re trying to fix right now. I’m looking for all ideas to see what we can do.” Still, he sees a silver lining in the growing openness to cloud solutions. “The cloud makes it much easier to look at new models which can enable that data to become consumable by others,” he said.

Standards – whether by happenstance or by design – are particularly valuable in fulfilling unforeseen needs, Pizzo says. “Even if your service never ends up needing to be interoperable, it still has those good bones so that you know it can grow, it can evolve, and when you start scratching your head about a problem there are standards in place for how to answer that need,” he says.

By using established discipline at the start, data owners are better prepared to manage changing needs and requirements later, and to capitalize on new and innovative ways to use their data in the future.

“Ultimately, we want to automate as much of this activity as possible, and standards help make that possible,” says GDIT’s Tyliszczak. “Automation and machine learning will open up entirely new areas of exploration, insight, modernization and efficiency. We’ll never be able to achieve really large-scale integration if we rely on human-centered analysis.

“It makes more sense – and opens up a whole world of new opportunities – to leverage commercial standards and best practices to link back-end operations with front-end cloud solutions,” he adds. “That’s when we can start to truly take advantage of the power of the Cloud.”

Tobias Naegele has covered defense, military, and technology issues as an editor and reporter for more than 25 years, most of that time as editor-in-chief at Defense News and Military Times.
