
How Feds Are Trying to Bring Order to Blockchain Mania


Blockchain hype is at a fever pitch. The distributed ledger technology is hailed as a cure for everything from identity management to electronic health records and securing the Internet of Things. Blockchain provides a secure, reliable record of transactions between independent parties, entities or companies. There are industry trade groups, a Congressional Blockchain Caucus and frequent panel discussions to raise awareness.

Federal agencies are plunging ahead, both on their own and in concert with the General Services Administration’s Emerging Citizen Technology Office (GSA ECTO). The office groups blockchain with artificial intelligence and robotic automation, social and collaborative technologies, and virtual and augmented reality as its four most critical technologies. Its goal: Develop use cases and roadmaps to hasten government adoption and success with these new technologies.

“There’s a number of people who assume that fed agencies aren’t looking at things like blockchain,” Justin Herman, emerging technology lead and evangelist at GSA ECTO, told a gathering at the State of the Net Conference held Jan. 29 in Washington, D.C. “We got involved in blockchain because there were so many federal agencies coming to the table demanding government wide programs to explore the technology. People had already done analysis on what specific use cases they thought they had and wanted to be able to invest in it.”

Now his office is working with more than 320 federal, state and local agencies interested in one or more of its four emerging tech categories. “A lot of that is blockchain,” Herman said. “Some have already done successful pilots. We hear identity management, supply chain management…. We should be exploring those things together, not in little silos, not in walled gardens, but in public.”

Among those interested:

  • The Joint Staff’s J4 Logistics Directorate and the Deputy Assistant Secretary of Defense for Maintenance, Policy and Programs are collaborating on a project to create a digital supply chain, enabled by additive manufacturing (also known as 3-D printing). Blockchain’s role would be to secure the integrity of 3-D printing files, seen as “especially vulnerable to cyber threats and intrusions.” The Navy is looking at the same concept. “The ability to secure and securely share data throughout the manufacturing process (from design, prototyping, testing, production and ultimately disposal) is critical to Additive Manufacturing and will form the foundation for future advanced manufacturing initiatives,” writes Lt. Cmdr. Jon McCarter, a member of the Fiscal 2017 Secretary of the Navy Naval Innovation Advisory Council (NIAC).
  • The Office of the Undersecretary of Defense for Acquisition, Technology and Logistics (OUSD (AT&L)) Rapid Reaction Technology Office (RRTO) has similar designs on blockchain, seeing it as a potential solution for ensuring data provenance, according to a solicitation published Jan. 29.
  • The Centers for Disease Control’s Center for Surveillance, Epidemiology and Laboratory Services is interested in using blockchain for public health tracking, such as maintaining a large, reliable, current and shared database of opioid abuse or managing health data during crises. Blockchain’s distributed ledger system ensures that when one user updates the chain, everyone sees the same data, solving a major shortfall today, when researchers are often working with different versions of the same or similar data sets, rather than the same, unified data.
  • The U.S. Food and Drug Administration has similar interests in sharing health data for large-scale clinical trials.
  • The Office of Personnel Management last fall sought ideas for how to create a new consolidated Employee Digital Record that would track an employee’s skills, performance and experience over the course of an entire career, using blockchain as a means to ensure records are up to date and to speed the process of transfers from one agency to another.
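The tamper-evidence these use cases rely on comes from chaining records by hash: each entry incorporates the hash of the entry before it, so altering any earlier record invalidates everything after it. The sketch below is illustrative only (the file names and records are hypothetical, and a real distributed ledger adds consensus and replication across parties), but it shows the core mechanism in Python:

```python
import hashlib
import json

def block_hash(record, prev_hash):
    """Hash a record together with the previous block's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

class Ledger:
    """Append-only chain of records; editing any entry breaks every later hash."""

    def __init__(self):
        self.blocks = []  # list of (record, digest) pairs

    def append(self, record):
        prev = self.blocks[-1][1] if self.blocks else "0" * 64
        self.blocks.append((record, block_hash(record, prev)))

    def verify(self):
        prev = "0" * 64
        for record, digest in self.blocks:
            if block_hash(record, prev) != digest:
                return False
            prev = digest
        return True

ledger = Ledger()
ledger.append({"file": "bracket-v2.stl", "action": "design approved"})
ledger.append({"file": "bracket-v2.stl", "action": "sent to printer"})
assert ledger.verify()

# Tampering with an earlier record (keeping its old hash) invalidates the chain.
ledger.blocks[0] = ({"file": "bracket-EVIL.stl", "action": "design approved"},
                    ledger.blocks[0][1])
assert not ledger.verify()
```

Because every participant can recompute the hashes independently, all parties can agree on whether the shared record is intact without trusting any single host.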

Herman sees his mission as bringing agencies together so they can combine expertise and resources and more quickly make progress. “There are multiple government agencies right now exploring electronic health records with blockchain,” he said. “But we can already see the hurdles with this because they are separate efforts, so we’re adding barriers. We’ve got to design new and better ways to move across agencies, across bureaucracies and silos, to test, evaluate and adopt this technology. It should be eight agencies working together on one pilot, not eight separate pilots on one particular thing.”

The Global Blockchain Business Council (GBBC) is an industry group advocating for blockchain technology and trying to take a similar approach in the commercial sector to what GSA is doing in the federal government. “We try to break down these traditionally siloed communities,” said Mercina Tilleman-Dick, chief operating officer for the GBBC.

These days, that means trying to get people together to talk about standards and regulation and connecting those who are having success with others just beginning to think about such issues. “Blockchain is not going to solve every problem,” Tilleman-Dick said. It could prove effective in a range of use cases where secure, up-to-date, public records are essential.

Take property records, for example. The Republic of Georgia moved all its land titles onto a blockchain-based system in 2017, Sweden is exploring the idea and the city of South Burlington, Vt., is working on a blockchain pilot for local real estate transactions. Patrick Byrne, founder of Overstock.com and its subsidiary Medici Ventures, announced in December he’s funding a startup expressly to develop a global property registry system using blockchain technology.

“I think over the next decade it will fundamentally alter many of the systems that power everyday life,” GBBC’s Tilleman-Dick said.

“Blockchain has the potential to revolutionize all of our supply chains. From machine parts to food safety,” said Adi Gadwale, chief enterprise architect for systems integrator General Dynamics Information Technology. “We will be able to look up the provenance and history of an item, ensuring it is genuine and tracing the life of its creation along the supply chain.

“Secure entries, immutable and created throughout the life of an object, allow for secure sourcing, eliminate fraud, forgeries and ensure food safety,” Gadwale said. “Wal-Mart has already begun trials of blockchain with food safety in mind.”
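The provenance lookup Gadwale describes is, at its simplest, a query over a shared, append-only event log. In this toy Python sketch the items, actors and events are invented for illustration; it shows how such a log yields an ordered chain of custody for any item:

```python
from datetime import date

# Hypothetical shared provenance log: each entry records one custody event.
events = [
    {"item": "LOT-4471", "date": date(2018, 1, 3), "actor": "Farm Co.",   "event": "harvested"},
    {"item": "LOT-4471", "date": date(2018, 1, 5), "actor": "PackCorp",   "event": "packed"},
    {"item": "LOT-9002", "date": date(2018, 1, 6), "actor": "Farm Co.",   "event": "harvested"},
    {"item": "LOT-4471", "date": date(2018, 1, 9), "actor": "RetailMart", "event": "received"},
]

def provenance(item_id):
    """Return the full custody history for one item, in date order."""
    return sorted((e for e in events if e["item"] == item_id),
                  key=lambda e: e["date"])

for e in provenance("LOT-4471"):
    print(e["date"], e["actor"], e["event"])
```

On a blockchain, the log entries would additionally be hash-linked and signed, so no single participant could quietly rewrite an item’s history after the fact.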

Hilary Swab Gawrilow, legislative director and counsel in the office of Rep. Jared Polis (D-Colo.), who is among the Congressional Blockchain Caucus leaders, said the government needs to do more to facilitate understanding of the technology. The rapid rise in the value of bitcoin, and the wild fluctuations and speculation in digital cryptocurrencies generally, have done much to raise awareness. Yet they do not necessarily instill confidence in the concepts behind blockchain and distributed ledger technology.

“There are potential government applications or programs that deserve notice and study,” Swab Gawrilow said.

Identity management is a major challenge for agencies today. Citizens may have accounts with multiple agencies, and finding a way to verify status without having to build complicated links between disparate systems to enable benefits or confirm program eligibility would be valuable. The same is true for program accountability. “Being able to verify transactions would be another great way to use blockchain technology.”

That’s where the caucus is coming from: A lot of this is around education. Lawmakers have all heard of bitcoin, whether in a positive or negative way. “They understand what it is,” Gawrilow said. “But they don’t necessarily understand the underlying technology.” The caucus’ mission is to help inform the community.

Like GSA’s Herman, Gawrilow favors agency collaboration on new technology projects and pilots. “HHS did a hackathon on blockchain. The Postal Service put out a paper, and State is doing something. DHS is doing something. It’s every agency almost,” she said. “We’ve kicked around the idea of asking the administration to start a commission around blockchain.”

That, in turn, might surface issues requiring legislative action – “tweaks to the law” that underlie programs, such as specifications on information access, or a prescribed means of sharing or verifying data. That’s where lawmakers could be most helpful.

Herman, for his part, sees GSA as trying to fill that role, and to fill it in such a way that his agency can tie together blockchain and other emerging and maturing technologies. “It’s not the technology, it’s the culture,” he said. “So much in federal tech is approached as some zero-sum game, that if an agency is dedicating time to focus and investigate a technology like blockchain, people freak out because they’re not paying attention to cloud or something else.”

Agencies need to pool resources and intelligence, think in terms of shared services and shared approaches, break down walls and look holistically at their challenges to find common ground.

That’s where the payoff will come. Otherwise, Herman asks, “What does it matter if the knowledge developed isn’t shared?”

Relocatable Video Surveillance Systems Give CBP Flexibility on Border


Illegal border crossings fell to their lowest level in at least five years in 2017, but after plunging through April, the numbers have risen each of the past eight months, according to U.S. Customs and Border Protection (CBP).

Meanwhile, the debate continues: Build a physical wall spanning from the Gulf of Mexico to the Pacific Ocean, add more Border Patrol agents or combine better physical barriers with technology to stop drug trafficking, smuggling and illegal immigration?

Increasingly, however, it’s clear no one solution is right for everyplace. Ron Vitiello, acting deputy commissioner at CBP, said the agency intends to expand on the existing 652 miles of walls and fencing now in place – but not necessarily extend the wall the entire length of the border.

“We’re going to add to fill some of the gaps we didn’t get in the [previous] laydown, and then we’re going to prioritize some new wall [construction] across the border in places where we need it the most,” he said in a Jan. 12 TV interview.

Walls and barriers are a priority, Vitiello said in December at a CBP press conference. “In this society and all over our lives, we use walls and fences to protect things,” he said. “It shouldn’t be any different on the border. … But we’re still challenged with access, we’re still challenged with situational awareness and we’re still challenged with security on that border. We’re still arresting nearly 1,000 people a day.

“So we want to have more capability: We want more agents, we want more technology and we want that barrier to have a safer and more secure environment.”

Among the needs: Relocatable Remote Video Surveillance Systems (R-RVSS) that can be picked up and moved to where they’re needed most as border activity ebbs and flows in response to CBP’s border actions.

CBP mapped its fencing against its 2017 apprehension record in December (see map), finding that areas with physical fencing, such as near the metropolitan centers of San Diego/Tijuana, Tucson/Nogales and El Paso/Juarez, are just as likely to see illegal migration activity as unfenced areas in the Laredo/Nuevo Laredo area.


Source: U.S. Customs and Border Protection

Rep. Will Hurd (R-Tex.), vice chairman of the House Homeland Security subcommittee on Border and Maritime Security, is an advocate for technology as both a complement to and an alternative to physical walls and fences. “A wall from sea to shining sea is the least effective and most expensive solution for border security,” he argued Jan. 16. “This is especially true in areas like Big Bend National Park, where rough terrain, natural barriers and the remoteness of a location render a wall or other structure impractical and ineffective.”

CBP has successfully tested and deployed video surveillance systems to enhance situational awareness on the border and help Border Patrol agents track and respond to incursions. These RVSS systems use multiple day and night sensors mounted on poles to create an advance warning and tracking system identifying potential border-crossing activity. Officers can monitor those sensor feeds remotely and dispatch agents as needed.

Savvy smugglers are quick to adjust when CBP installs new technologies, shifting their routes to less-monitored areas. The new, relocatable RVSS systems (R-RVSS) make it easy for CBP to respond in kind, forcing smugglers and traffickers to constantly adapt.

Robert Gilbert, a former Border Patrol sector chief at CBP and now a senior program director for RVSS at systems integrator General Dynamics Information Technology (GDIT), says relocatable systems will empower CBP with new tools and tactics. “Over the past 20 or 30 years, DOJ then CBP has always deployed technology into the busiest areas along the border, the places with the most traffic. In reality, because of the long procurement process, we usually deployed too late as the traffic had shifted to other locations on the border. The big difference with this capability is you can pick it up and move it to meet the evolving threat. The technology can be relocated within days.”

GDIT fielded a three-tower system in CBP’s Laredo (Texas) West area last summer and a similar setup in McAllen, Texas, in December. The towers – set two to five miles apart – were so effective, CBP is now preparing to buy up to 50 more units to deploy in the Rio Grande sector, where the border follows the river through rugged terrain. There, a physical wall may not be viable, while a technology-based virtual wall could prove highly effective.

Each tower includes an 80-foot-tall collapsible pole that can support a sensor and communications payload weighing up to 2,000 pounds. That capacity far exceeds current needs, but it provides a growth path for hanging additional sensors or communications gear if requirements change later on.

When CBP wants to move a unit, the pole is collapsed, the sensors are packed away and a standard 3/4- or 1-ton pickup truck hauls the system to its next location.

Roughly two-thirds of the U.S.-Mexico border runs through land not currently owned by the federal government, a major hurdle when it comes to building permanent infrastructure like walls or even fixed-site towers. Land acquisition would add billions to the cost even if owners agree to the sale. Where owners decline, the government might still be able to seize the land under the legal procedure known as eminent domain, but such cases can take years to resolve.

By contrast, R-RVSS requires only a temporary easement from the land owner. Site work is bare bones: no concrete pad, just a cleared area measuring roughly 40 feet by 40 feet. It need not be level – the R-RVSS system is designed to accommodate slopes up to 10 degrees. Where grid power is unavailable – likely in remote areas – a generator or even a hydrogen fuel cell can produce needed power.

What’s coming next
CBP seeks concepts for a Modular Mobile Surveillance System (M2S2) similar to RVSS that would give the Border Patrol an even more rapidly deployable system for detecting, identifying, classifying and tracking “vehicles, people and animals suspected of unlawful border crossing activities.”

More ambitiously, CBP also wants such systems to incorporate data science and artificial intelligence to add a predictive capability. The system would “detect, identify, classify, and track equipment, vehicles, people, and animals used in or suspected of unlawful border crossing activities,” and employ AI to help agents anticipate their direction so they can quickly respond and resolve each situation.

At the same time, CBP is investigating RVSS-like systems for coastal areas. Pole-mounted systems would train their sensors on coastal waters, where smugglers in small boats seek to exploit the shallows by operating close to shore, rather than in the deeper waters patrolled by Coast Guard and Navy ships.

In a market research request CBP floated last June, the agency described a Remote Surveillance System Maritime (RSS-M) as “a subsystem in an overall California Coastal Surveillance demonstration.” The intent: to detect, track, identify, and classify surface targets of interest, so the Border Patrol and partner law enforcement agencies can interdict such threats.

Legislating Tech
Rep. Hurd, Rep. Peter Aguilar (D-Calif.) and a bipartisan group of 49 other members of Congress support the ‘‘Uniting and Securing America Act of 2017,’’ or “USA Act.” The measure includes a plan to evaluate every mile of the U.S.-Mexico border to determine the best security solution for each. After weeks of Senate wrangling over immigration matters, Sens. John McCain (R-Ariz.) and Chris Coons (D-Del.) offered a companion bill in the Senate on Feb. 5.

With 820 miles of border in his district, Hurd says, few in Congress understand the border issue better than he – or feel it more keenly.

“I’m on the border almost every weekend,” he said when unveiling the proposal Jan. 16. The aim: “Full operational control of our border by the year 2020,” Hurd told reporters. “We should be able to know who’s going back and forth across our border. The only way we’re going to do that is by border technologies.” And in an NPR interview that day, he added: “We should be focused on outcomes. How do we get operational control of that border?”

The USA Act would require the Department of Homeland Security to “deploy the most practical and effective technology available along the United States border for achieving situational awareness and operational control of the border” by Inauguration Day 2021, including radar surveillance systems; Vehicle and Dismount Exploitation Radars (VADER); three-dimensional, seismic acoustic detection and ranging border tunneling detection technology; sensors; unmanned cameras; drone aircraft and anything else that proves more effective or advanced. The technology is seen as complementing and supporting hard infrastructure.

The ABCs of 2018 Federal IT Modernization: I to Z


In part two of GovTechWorks’ analysis of the Trump Administration’s federal IT modernization plan, we examine the likely guiding impact of the Office of Management and Budget, the manner in which agencies’ infrastructures might change, and the fate of expensive legacy systems.

The White House IT modernization plan released in December seeks a rapid overhaul of IT infrastructure across federal civilian agencies, with an emphasis on redefining the government’s approach to managing its networks and securing its data. Here, in this second part of our two-part analysis, is what you need to know from I to Z (for A-H, click here):

I is for Infrastructure
Modernization boils down to three things: Infrastructure, applications and security. Imagine if every government agency managed its own telephone network or international logistics office, rather than outsourcing such services. IT services are essentially the same. Agencies still need expertise to connect to those services – they still have telecom experts and mail room staff – but they don’t have to manage the entire process.

Special exceptions will always exist for certain military, intelligence (or other specialized) requirements. Increasingly, IT services are becoming commodity services purchased on the open market. Rather than having to own, manage and maintain all that infrastructure, agencies will increasingly buy infrastructure as a service (IaaS) in the cloud — netting faster, perpetually maintained and updated equipment at a lower cost. To bring maximum value – and savings – out of those services, they’ll have to invest in integration and support services to ensure their systems are not only cost effective, but also secure.

J is for JAB, the Joint Authorization Board
The JAB combines expertise from the General Services Administration (GSA), the Department of Homeland Security (DHS) and the Department of Defense (DOD). It issues provisional authorizations to operate (P-ATOs) for widely used cloud services. The JAB will have a definitive role in prioritizing and approving commercial cloud offerings for the highest-risk federal systems.

K is for Keys
The ultimate solution for scanning encrypted data for potential malicious activity is to decrypt that data for a thorough examination. This involves first having access to the encryption keys for federal data and then securing those keys to ensure they don’t fall into the wrong hands. In short, these keys are key to the federal strategy of securing both government data and government networks.
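The trade-off here can be illustrated with a deliberately toy cipher (XOR against a key-derived stream; this is not real cryptography and not any federal system’s design): whoever holds the key can turn opaque ciphertext back into inspectable plaintext, which is exactly why key custody matters.

```python
import hashlib
from itertools import cycle

def toy_cipher(data, key):
    """Toy XOR stream 'cipher' for illustration only; NOT real cryptography.
    XOR is its own inverse, so the same call encrypts and decrypts."""
    stream = hashlib.sha256(key).digest()
    return bytes(b ^ k for b, k in zip(data, cycle(stream)))

key = b"agency-escrowed-key"  # hypothetical escrowed key
ciphertext = toy_cipher(b"agency traffic: login from 10.1.2.3", key)

# Without the key, a scanner sees only opaque bytes.
# With the escrowed key, it can recover and inspect the plaintext.
plaintext = toy_cipher(ciphertext, key)
assert b"login" in plaintext
```

In practice agencies would rely on standard ciphers such as AES behind vetted libraries; the sketch only shows why holding (and protecting) the key is the crux of the scanning strategy.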

L is for Legacy
The government still spends 70 percent of its IT budget managing legacy systems. That’s down from as much as 85 percent a few years ago, but still too much. In a world where volumes of data continue to expand exponentially and the cost of computer processing power continues to plunge, how long can the government afford to overspend on last year’s (or last decade’s) aging, less secure technology?

M is for Monopsony
A monopoly occurs when one source controls the supply of a given product, service or commodity. A monopsony occurs when a single customer controls the consumption of products, services or commodities. In a classical monopsony, the sole customer dictates terms to all sellers.

Despite its size, the federal government cannot dictate terms to information technology vendors. It can, however, consolidate its purchasing power to increase leverage, and that’s exactly what the government will do in coming years. The process begins with networking services, as agencies transition from the old Networx contract to the new Enterprise Infrastructure Solutions (EIS) vehicle.

Look for it to continue as agencies consolidate purchasing power for commodity software services, such as email, continuous monitoring and collaboration software.

The government may not ultimately wield the full market power of a monopsony, but it can leverage greater negotiating power by centralizing decision making and consolidating purchase and licensing agreements. Look for that to increase significantly in the years ahead.

N is for Networks
Networks used to be the crown jewels of the government’s information enterprise, providing the glue that held systems together and enabling the government to operate. But if the past few years proved anything, it’s that you can’t keep the bad guys out. They’re already in, looking around, waiting for an opportunity.

Networks are essential infrastructure, but they will increasingly be virtualized, existing in software and protecting encrypted data that travels on commercial fiber and resides, much of the time, in commercial data centers (generically referred to as the cloud). You may not keep the bad guys out, but you can control what they get access to.

O is for OMB
The Office of Management and Budget has oversight over much of the modernization plan. The agency is mentioned 127 times in the White House plan, including 47 times in its 50 recommendations. On 34 of those 50 recommendations, OMB will be either the responsible party or the recipient of work done by others.

P is for Prioritization
Given the vast number of technical, manpower and security challenges that weigh down modernization efforts, prioritizing programs that can deliver the greatest payoff is essential. In addition, agencies are expected to prioritize and focus their modernization efforts on high-value assets that pose the greatest vulnerabilities and risks. From those lists, by June 30, DHS must identify six to receive centralized interventions that include staffing and technical support.

The aim is to prioritize where new investment, talent infusions and security policies will make the greatest difference. To maximize that effort, DHS may choose projects that can expand to include other systems and agencies.

OMB must also review and prioritize any impediments to modernization and cloud adoption.

Q is for Quick Start
Technology is often not the most complicated part of modernization efforts. Finding a viable acquisition strategy that won’t put yesterday’s technology in the government’s hands tomorrow is often harder. That’s why the report directs OMB to assemble an Acquisition Tiger Team to develop a “quick start” acquisition package to help agencies license technology and migrate to the cloud more quickly.

The aim: combine market research, acquisition plans, readily identified sources and templates for both requests for quotes (RFQs) and Independent Government Cost Estimate (IGCE) calculations — which would be based on completed acquisitions. The tiger team will also help identify qualified small and disadvantaged businesses to help agencies meet set-aside requirements.

R is for Recommendations
There are 50 recommendations in the White House IT modernization report with deadlines ranging from February to August, making the year ahead a busy one for OMB, DHS and GSA, the three agencies responsible for most of the work. A complete list of the recommendations is available here.

T is for the TIC
The federal government developed the Trusted Internet Connection as a means of controlling the number of on and off ramps between government networks and the largely unregulated internet. But in a world now dominated by cloud-based software applications, remote cloud data centers, mobile computing platforms and web-based interfaces that may access multiple different systems to deliver information in context, the TIC needs to be rethought.

“The piece that we struggled with is the Trusted Internet Connections (TIC) initiative – that is a model that has to mature and get solved,” former Federal CIO Tony Scott told Federal News Radio. “It’s an old construct that is applied to modern-day cloud that doesn’t work. It causes performance, cost and latency issues. So the call to double down and sort that out is important. There has been a lot of good work that has happened, but the definitive solution has not been figured out yet.”

The TIC policy is the heart and soul of the government’s perimeter-based security model. Already, some agencies have chosen to bypass the TIC for certain cloud-based services, such as Office 365, trusting Microsoft’s security and recognizing that if all that data had to pass through an agency’s TIC, performance would suffer.

To modernize TIC capabilities, policies, reference architectures and associated cloud security authorization baselines, OMB must update TIC policies so agencies have a clear path forward to build out data-level protections and more quickly migrate to commercial cloud solutions. A 90-day sprint is to begin in mid-February, during which projects approved by OMB will pilot proposed changes in TIC requirements.

OMB must determine whether all data traveling to and from agency information systems hosted by commercial cloud providers warrants scanning by DHS, or whether only some information needs to be scanned. Other considerations under review: Expanding the number of TIC access points in each agency and a model for determining how best to implement intrusion detection and prevention capabilities into cloud services.

U is for Updating the Federal Cloud Computing Strategy
The government’s “Cloud First” policy is now seven years old. Updates are in order. By April 15, OMB must provide additional guidance on both appropriate use cases and operational security for cloud environments. All relevant policies on cloud migration, infrastructure consolidation and shared services will be reviewed.

In addition, OMB has until June to develop standardized contract language for cloud acquisition, including clauses that define consistent requirements for security, privacy and access to data. Establishing uniform contract language will make it easier to compare and broker cloud offerings and ensure government requirements are met.

V is for Verification
Verification or authentication of users’ identities is at the heart of protecting government information. Are you who you say you are? Key to securing information systems is ensuring that access is granted only to users who can be identified and verified as deserving it.

OMB has until March 1 to issue new identity policy guidance for public comment and to recommend identity service areas suitable for shared services. GSA must provide a business case for consolidating existing identity services to improve usability, drive secure access and enable cloud-based collaboration services that make it easier to share and work across agencies than it is today.
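One common building block for cross-agency verification is a signed claim: an issuing agency signs a set of attributes, and a receiving agency checks the signature rather than querying the issuer’s systems directly. This minimal Python sketch uses an HMAC with a shared secret purely for illustration (a hypothetical setup; production systems would use public-key certificates, not shared secrets):

```python
import hashlib
import hmac
import json

SHARED_KEY = b"demo-only-shared-secret"  # hypothetical; real deployments use PKI

def issue(claims):
    """Issuing agency serializes and signs a set of identity claims."""
    blob = json.dumps(claims, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, blob, hashlib.sha256).hexdigest()
    return blob, tag

def verify(blob, tag):
    """Receiving agency recomputes the signature instead of calling the issuer."""
    expected = hmac.new(SHARED_KEY, blob, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

blob, tag = issue({"user": "jdoe", "program": "benefits-x", "eligible": True})
assert verify(blob, tag)             # intact claim is accepted
assert not verify(blob + b"!", tag)  # any alteration is rejected
```

The point is architectural: verification becomes a local cryptographic check, so agencies need not build point-to-point links between disparate systems to confirm eligibility.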

W, X, Y, Z is for Wrapping it All Up
The Federal Government is shifting to a consolidated IT model that will change the nature of IT departments and the services they buy. Centralized offerings for commodity IT – whether email, office tools and other common software-as-a-service offerings or virtual desktops and web hosting – will be the norm. As much as possible, the objective is to get agencies on the same page, using the same security services, the same collaboration services, the same data services and make those common (or in some cases shared) across multiple agencies.

Doing so promises to reduce manpower needs and licensing costs by eliminating duplication of effort and increasing market leverage to drive down prices. But getting there will not be easy. Integration and security pose unique challenges in a government context, requiring skill, experience and specific expertise. On the government side, policy updates will only solve some of the challenges. Acquisition regulations must also be updated to support wider adoption of commercial cloud products.

Some agencies will need more help than others. Cultural barriers will continue to be major hurdles. Inevitably, staff will have to develop new skills as old ones disappear. Yet even in the midst of all that upheaval, some things don’t change. “In the end, IT modernization is really all about supporting the mission,” says Stan Tyliszczak, chief engineer at systems integrator General Dynamics Information Technology. “It’s about helping government employees complete their work, protecting the privacy of our citizens and ensuring both have timely access to the information and services they need. IT has always made those things better and easier, and modernization is only necessary to continue that process. That much never changes.”

 

The ABCs of 2018 Federal IT Modernization: A to H


The White House issued its IT modernization plan last December and followed it with an ambitious program that could become a proving ground for rapidly overhauling IT infrastructure, data access and customer service. After years of talking about IT modernization, cybersecurity and migration to the cloud, federal agencies are now poised to ramp up the action.

Here, in A-B-C form, is what you need to know from A to H:

A is for Agriculture
The U.S. Department of Agriculture (USDA) will be a sort of proving ground for implementing the Trump administration’s vision for the future high-tech, high-performance, customer-satisfying government. USDA announced in December 2017 it will collapse 39 data centers into one (plus a backup), and consolidate 22 independent chief information officers under a single CIO with seven deputies. The aim: reinvent the agency as a modern, customer-centered organization and provide its leaders with instant access to a wealth of agency data.

B is for Better Citizen Services
“It is imperative for the federal government to leverage … innovations to provide better service for its citizens in the most cost-effective and secure manner,” the report states – in just its third sentence. Yes, modernization should ultimately save money by reducing the billions spent to keep aging systems operational. And yes, it should help overcome the patchwork of cybersecurity point solutions now used to protect federal networks, systems and data.

USDA Secretary Sonny Perdue’s experience modernizing government IT during two terms as governor of Georgia (2003-2011) convinced him he could achieve similar results at the federal level. “He really saw, in reinventing Georgia government, how IT modernization and delivering better customer service benefitted not only employees, but the people of the state,” Deputy Secretary of Agriculture Steve Censky said in a TV interview.

Among the agency’s goals: Increase access to information throughout the agency by means of online service portals and advanced application program interfaces.

C is for Centers of Excellence
USDA won’t be going it alone. Under the direction of the Office of Science and Technology Policy, the agency will be the first to engage with a new set of experts at the General Services Administration (GSA). GSA is on an accelerated course to create five Centers of Excellence, leveraging both public and private sector expertise to develop best practices and standards that agencies can use for:

  • Cloud adoption
  • IT infrastructure optimization
  • Customer experience
  • Service delivery analytics
  • Contact centers

Jack Wilmer, White House senior advisor for Cybersecurity and IT Modernization, says the idea is to provide each agency’s modernization effort with the same core concepts and approach – and the best available experts. “We’re trying to leverage private sector expertise, bringing them in a centralized fashion, making them available to government agencies as they modernize,” he told Government Matters.

GSA planned to award contracts to industry partners by the end of January – just 45 days after its initial solicitation – but as of March 5, no contracts had been awarded. Phase 1 contracts for assessment, planning and some initial activities should be finalized soon. Phase 2 awards for cloud migration, infrastructure optimization and customer experience are expected by the end of the year, Joanne Collins Smee, acting director of GSA’s Technology Transformation Service and deputy commissioner of the Federal Acquisition Service, said at a March 1 AFCEA event in Washington, D.C.

D is for Data Centers
Not all data centers will close, but many more will soon disappear. Modernization is about getting the government out of the business of managing big infrastructure investments and instead leveraging commercial cloud infrastructure and technology wherever possible. But don’t assume your agency’s data won’t be in a data center somewhere.

“What is the cloud, anyway? Isn’t it really someone else’s data center, available on demand?” says Stan Tyliszczak, chief engineer at systems integrator General Dynamics Information Technology (GDIT). “Moving to the cloud means getting out of the business of running that data center yourself.”

The White House splits its cloud strategy into two buckets:

  • “Bring the government to the cloud.” Put government data and applications in privately-owned and operated infrastructure, where it is protected through encryption and other security technologies. This is public cloud, where government data sits side by side with private data in third-party data centers.
  • “Bring the cloud to the government.” Put government data and applications on vendor-owned infrastructure located in government-owned facilities, as the Intelligence Community Information Technology Enterprise (IC ITE) does with the IC’s Commercial Cloud Services (C2S) contract with Amazon Web Services.

Figuring out what makes sense, and when, depends on the use case; for most agencies, it will mean a combination of on-premise solutions, shared government services and commercial services in public clouds. “That’s the hybrid cloud model everyone’s talking about. But it’s not a trivial exercise. Melding those together is the challenge,” Tyliszczak says. “That’s what integrators are for.”

E is for Encryption
Government cybersecurity efforts have historically focused on defending the network and its perimeter, rather than the data that travels on that network. As cloud services are integrated into conventional on premise IT solutions, securing the data has become essential. At least 47 percent of federal network traffic is encrypted today – frustrating agency efforts to monitor what’s crossing network perimeters.

“Rather than treating Federal networks as trusted entities to be defended at the perimeter,” the modernization report advised, “agencies should shift their focus to placing security protections closer to data.”

To do that, the government must improve the way it authenticates devices and users on its networks, securing who has access and how, and encrypting data both at rest and in transit.
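As a minimal illustration of the authentication piece – hypothetical code, not anything prescribed by the report – an HMAC-signed token lets a service verify that a request really came from a provisioned device and wasn’t altered in transit:

```python
import hashlib
import hmac
import time

# Hypothetical pre-provisioned shared secret; a real deployment would use
# per-device keys held in hardware or a key-management service.
SECRET = b"per-device-provisioned-key"

def issue_token(device_id, now=None):
    """Sign the device ID plus a timestamp so the receiver can verify both."""
    ts = str(int(now if now is not None else time.time()))
    msg = f"{device_id}|{ts}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"{device_id}|{ts}|{sig}"

def verify_token(token, max_age=300, now=None):
    """Accept the token only if the signature matches and it is still fresh."""
    try:
        device_id, ts, sig = token.rsplit("|", 2)
        age = (now if now is not None else time.time()) - int(ts)
    except ValueError:
        return False
    expected = hmac.new(SECRET, f"{device_id}|{ts}".encode(),
                        hashlib.sha256).hexdigest()
    return age <= max_age and hmac.compare_digest(sig, expected)
```

A tampered device ID or a stale timestamp both fail verification, and `hmac.compare_digest` resists timing side channels – security attached to the data itself rather than to the network perimeter.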

“Now you’re starting to obfuscate whether your sensors can actually inspect the content of that data,” notes Eric White, Cybersecurity program director at GDIT’s Health and Civilian Solutions Division. “Because it’s now encrypted, you add another layer of complexity to know for sure whether it’s the good guys or the bad guys moving data in and out of your network.”

White notes that the Department of Homeland Security (DHS) is charged with solving this encryption dilemma, balancing the millions of dollars in investment in high-end network-monitoring sensors, such as those associated with the Einstein program, against protecting individual privacy. Enabling those sensors to see through or decipher encrypted data without undermining the security of the data – or the privacy of individuals – is a critical priority. DHS has commissioned research to develop potential solutions, including virtualizing sensors for cloud environments; relocating sensors to the endpoints of encrypted tunnels; creating man-in-the-middle solutions that intercept data in motion; or providing the sensors with decryption keys.

F is for FedRAMP
The Federal Risk and Authorization Management Program (FedRAMP) remains the critical process for ensuring private-sector cloud offerings meet government security requirements. Look for updates to FedRAMP baselines that could allow tailoring of security controls for low-risk systems, address new approaches to integrating cloud services with federal Trusted Internet Connection (TIC) services and consider common features or capabilities that could be incorporated into higher-risk systems with FedRAMP “high” baselines.

Importantly, the report directs the General Services Administration, which manages FedRAMP, to come up with new solutions that make it easier for software-as-a-service (SaaS) products already authorized for use in one agency to be accepted for use in another. Making the process for issuing an authority to operate (ATO) faster and easier to reuse has long been a goal of both cloud providers and government customers. This is particularly critical for shared services, in which one agency provides its approved commercial solution to another agency.

G is for GSA
Already powerfully influential as a buyer and developer for other agencies, GSA stands to become even more influential as the government moves to consolidate networks and other IT services into fewer contracts and licensing agreements, and to increase the commonality of solutions across the government.

This is especially true among smaller agencies that lack the resources, scale and expertise to effectively procure and manage their own IT services.

H is for Homeland Security
DHS is responsible for the overall cybersecurity of federal civilian government systems. The only federal entity mentioned more frequently in the White House modernization report is the Office of Management and Budget, the White House agency responsible for implementing the report’s guidance.

DHS was mandated to issue a report by Feb. 15 identifying the common weaknesses of the government’s highest-value IT assets and recommending solutions for reducing risk and vulnerability government-wide. By May 15, the agency must produce a prioritized list of systems “for government-wide intervention” and provide a host of advisory and support services to help secure government systems. DHS also owns and manages the National Cybersecurity Protection System (NCPS) and the EINSTEIN sensor suites that capture and analyze network flow, detect intruders and scan the data coming in and out of government systems to identify potentially malicious activity and, in the case of email, block and filter threatening content.

Look for next week’s edition of GovTechWorks for Part 2: Modernization from I to Z. In Part 2, we outline how infrastructure among government agencies will be impacted and streamlined by modernization, as well as discuss the fate of legacy systems and their maintenance budgets, and the major role the Office of Management and Budget will play in overall implementation.


 

How the Air Force Changed Tune on Cybersecurity

Peter Kim, chief information security officer (CISO) for the U.S. Air Force, calls himself Dr. Doom. Lauren Knausenberger, director of cyberspace innovation for the Air Force, is his opposite. Where he sees trouble, she sees opportunity. Where he sees reasons to say no, she seeks ways to change the question.

For Kim, the dialogue they’ve shared since Knausenberger left her job atop a private-sector tech consultancy to join the Air Force has been transformational.

“I have gone into a kind of rehab for cybersecurity pros,” he says. “I’ve had to admit I have a problem: I can’t lock everything down.” He knows. He’s tried.

The two engage constantly, debating and questioning whether decisions and steps designed to protect Air Force systems and data are having their intended effect, they said, sharing a dais during a recent AFCEA cybersecurity event in Crystal City. “Are the things we’re doing actually making us more secure or just generating a lot of paperwork?” asks Knausenberger. “We are trying to turn everything on its head.”

As for Kim, she added, “Pete’s doing really well on his rehab program.”

One way Knausenberger has turned Kim’s head has been her approach to security certification packages for new software. Instead of developing massive cert packages for every program – documentation that’s hundreds of pages thick and unlikely to ever be read – she wants the Air Force to certify the processes used to develop software, rather than the programs.

“Why don’t we think about software like meat at the grocery?” she asked. “USDA doesn’t look at every individual piece of meat… Our goal is to certify the factory, not the program.”

Similarly, Knausenberger says the Air Force is trying now to apply similar requirements to acquisition contracts, accepting the idea that since finding software vulnerabilities is inevitable, it’s best to have a plan for fixing them rather than hoping to regulate them out of existence. “So you might start seeing language that says, ‘You need to fix vulnerabilities within 10 days.’ Or perhaps we may have to pay bug bounties,” she says. “We know nothing is going to be perfect and we need to accept that. But we also need to start putting a level of commercial expectation into our programs.”

Combining development, security and operations into an integrated process – DevSecOps, in industry parlance – is the new name of the game, they argue together. The aim: Build security in during development, rather than bolting it on at the end.

The takeaway from the “Hack-the-Air-Force” bug bounty programs run so far is that every such effort yields new vulnerabilities – and that thousands of pages of certification didn’t prevent them. As computing power becomes less costly and automation gets easier, hackers can be expected to use artificial intelligence to break through security barriers.

Continuous automated testing is the only way to combat their persistent threat, Kim said.

Michael Baker, CISO at systems integrator General Dynamics Information Technology, agrees. “The best way to find the vulnerabilities is to continuously monitor your environment and challenge your assumptions,” he says. “Hackers already use automated tools and the latest vulnerabilities to exploit systems. We have to beat them to it – finding and patching those vulnerabilities before they can exploit them. Robust and assured endpoint protection, combined with continuous, automated testing to find vulnerabilities and exploits, is the only way to do that.”

“I think we ought to get moving on automated security testing and penetration,” Kim added. “The days of RMF [risk management framework] packages are past. They’re dinosaurs. We’ve got to get to a different way of addressing security controls and the RMF process.”
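What continuous, automated testing looks like in miniature – hypothetical code, not any Air Force tool – is a recurring job that compares what a scan actually finds exposed against an approved baseline and flags any drift:

```python
# Hypothetical approved baseline: which ports each host is allowed to expose.
APPROVED_BASELINE = {
    "web-01": {443},   # HTTPS only
    "db-01": {5432},   # database, internal network only
}

def audit(observed):
    """Flag every open port that the baseline does not authorize.

    `observed` maps hostnames to the ports a scan actually found open;
    in practice it would come from an automated scanner run on a schedule.
    """
    findings = []
    for host, ports in sorted(observed.items()):
        unexpected = ports - APPROVED_BASELINE.get(host, set())
        findings.extend(f"{host}: unexpected open port {p}"
                        for p in sorted(unexpected))
    return findings
```

A scan that finds `web-01` also listening on port 8080 yields one finding; an empty list means no drift since the last cycle. The point is the loop, not the paperwork: the check runs every cycle, automatically, instead of once per certification package.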

JOMIS Will Take E-Health Records to the Frontlines

The Defense Department Military Health System Genesis electronic health records (EHR) system went live last October at Madigan Army Medical Center (Wash.), the biggest step so far in modernizing DOD’s vast MHS with a proven commercial solution. Now comes the hard part: Tying that system in with operational medicine for deployed troops around the globe.

War zones, ships at sea and aeromedical evacuations each present a new set of challenges for digital health records. Front-line units lack the bandwidth and digital infrastructure to enable cloud-based health systems like MHS Genesis. Indeed, when bandwidth is constrained, health data ranks last on the priority list, falling below command and control, intelligence and other mission data.

The Joint Operational Medicine Information Systems (JOMIS) program office oversees DOD’s operational medicine initiatives, including the legacy Theater Medical Information Program – Joint system used in today’s operational theaters of Iraq and Afghanistan, as well as aboard ships and in other remote locales.

“One of the biggest pain points we have right now is the issue of moving data from the various roles of care, from the first responder [in the war zone] to the First Aid station to something like Landstuhl (Germany) Regional Medical Center, to something in the U.S.,” Navy Capt. Dr. James Andrew Ellzy told GovTechWorks. He is deputy program executive officer (functional) for JOMIS, under the Program Executive Office, Defense Healthcare Management Systems (PEO DHMS).

PEO DHMS defines four stages, or “roles,” of care once a patient begins to receive it: Role One, first responders; Role Two, forward resuscitative care; Role Three, theater hospitals; and Role Four, service-based medical facilities.

“Most of those early roles right now, are still using paper records,” Ellzy said. Electronic documentation begins once medical operators are in an established location. “Good records usually start the first place that has a concrete slab.”

Among the changes MHS Genesis will bring is consolidation. The legacy AHLTA (Armed Forces Health Longitudinal Technology Application) solution and its heavily modified theater-level variant, AHLTA-T, incorporate separate systems for inpatient and outpatient support.

MHS Genesis, however, will provide a single record regardless of patient status.

For deployed medical units, that’s important. Setup and maintenance for AHLTA’s outpatient records and the Joint Composite Health Care System have always been challenging.

“In order to set up the system, you have to have the technical skillset to initialize and sustain these systems,” said Ryan Loving, director of Health IT Solutions for military health services and the VA at General Dynamics Information Technology’s (GDIT) Health and Civilian Solutions Division. “This is a bigger problem for the Army than the other services, because the system is neither operated nor maintained until they go downrange. As a result, they lack the experience to be experts in setup and sustainment.”

JOMIS’ ultimate goal, according to Stacy A. Cummings, who heads PEO DHMS, is to provide a virtually seamless representation of MHS Genesis at deployed locations.

“For the first time, we’re bringing together inpatient and outpatient, medical and dental records, so we’re going to have a single integrated record for the military health system,” Cummings said at the HIMSS 2018 health IT conference in March. Last year, she told Government CIO magazine, “We are configuring the same exact tool for low-and no-communications environments.”

Therein lies the challenge, said GDIT’s Loving. “Genesis wasn’t designed for this kind of austere environment. Adapting to the unique demands of operational medicine will require a lot of collaboration with military health, with service-specific tactical networks, and an intimate understanding of those network environments today and where they’re headed in the future.”

Operating on the tactical edge – whether doing command and control or sharing medical data – is probably the hardest problem to solve, said Tom Sasala, director of the Army Architecture Integration Center and the service’s Chief Data Officer. “The difference between the enterprise environment and the tactical environment, when it comes to some of the more modern technologies like cloud, is that most modern technologies rely on an always-on, low-latency network connection. That simply doesn’t exist in a large portion of the world – and it certainly doesn’t exist in a large portion of the Army’s enterprise.”

Military units deploy into war zones and disaster zones where commercial connectivity is either highly compromised or non-existent. Satellite connectivity is limited at best. “Our challenge is how do we find commercial solutions that we cannot just adopt, but [can] adapt for our special purposes,” Sasala said.

MHS Genesis is like any modern cloud solution in that regard. In fact, it’s based on Cerner Millennium, a popular commercial EHR platform. So while it may be perfect for garrison hospitals and clinics – and ideal for sharing medical records with other agencies, civilian hospitals and health providers – the military’s operational requirements present unique circumstances unimagined by the original system’s architects.

Ellzy acknowledges the concern. “There’s only so much bandwidth,” he said. “So if medical is taking some of it, that means the operators don’t have as much. So how do we work with the operators to get that bandwidth to move the data back and forth?”

Indeed, satellite links weren’t designed for such systems, and their bandwidth and latency can’t accommodate the systems’ requirements. More important, when bandwidth is constrained, military systems must line up for access, and health data is literally last on the priority list. Even ideas like using telemedicine in forward locations aren’t viable. “That works well in a hospital where you have all the connectivity you need,” Sasala said. “But it won’t work so well in an austere environment with limited connectivity.”

The legacy AHLTA-T system has a store-and-forward capability that allows local storage while connectivity is constrained or unavailable, with data forwarded to a central database once it’s back online. Delays mean documentation may not be available at subsequent locations when patients are moved from one level of care to the next.

The challenge for JOMIS will be to find a way to work in theater and then connect and share saved data while overcoming the basic functional challenges that threaten to undermine the system in forward locations.

“I’ll want the ability to go off the network for a period of time,” Ellzy said, “for whatever reason, whether I’m in a place where there isn’t a network, or my network goes down or I’m on a submarine and can’t actually send information out.”

AHLTA-T manages the constrained or disconnected network situation by allowing the system to operate on a stand-alone computer (or network configuration) at field locations, relying on built-in store-and-forward functionality to save medical data locally until it can be forwarded to the Theater Medical Data Store and Clinical Data Repository. There, it can be accessed by authorized medical personnel worldwide.
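The pattern itself is simple to sketch. This hypothetical fragment (not AHLTA-T code) captures the core behavior: every record is written locally first, and forwarding is retried whenever a link is available, preserving order:

```python
from collections import deque

class StoreAndForward:
    """Minimal store-and-forward queue: capture locally, forward when able."""

    def __init__(self, transmit):
        # `transmit` sends one record upstream and raises ConnectionError
        # when the link is down (satellite drop, disconnected network).
        self._transmit = transmit
        self._queue = deque()

    def record(self, entry):
        """Always write the record locally first, regardless of connectivity."""
        self._queue.append(entry)

    def flush(self):
        """Forward queued records in order; stop at the first link failure."""
        sent = 0
        while self._queue:
            try:
                self._transmit(self._queue[0])
            except ConnectionError:
                break  # link still down; records stay queued for next attempt
            self._queue.popleft()
            sent += 1
        return sent

    @property
    def pending(self):
        return len(self._queue)
```

Records accumulate while disconnected and drain in order once `flush` succeeds – the same guarantee AHLTA-T provides when forwarding to its central repositories, though the real system must also handle persistence across restarts and conflicting updates.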

Engineering a comparable JOMIS solution will be complex and involve working around and within the MHS Genesis architecture, leveraging innovative warfighter IT infrastructure wherever possible. “We have to adapt Genesis to the store-and-forward architecture without compromising the basic functionality it provides,” said GDIT’s Loving.

Ellzy acknowledges that the compromises necessary to make AHLTA-T work led to unintended consequences.

“When you look at the legacy AHLTA versus the AHLTA-T, there are some significant differences,” he said. Extra training is necessary to use the combat theater version. That shouldn’t be the case with JOMIS. “The desire with Genesis,” Ellzy said, “is that medical personnel will need significantly less training – if any – as they move from the garrison to the deployed setting.”

Reporter Jon Anderson contributed to this report.
