How Feds Are Trying to Bring Order to Blockchain Mania

Blockchain hype is at a fever pitch. The distributed ledger technology is hailed as a cure for everything from identity management to electronic health records and securing the Internet of Things. Blockchain provides a secure, reliable record of transactions between independent parties, entities or companies. There are industry trade groups, a Congressional Blockchain Caucus and frequent panel discussions to raise awareness.
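
The core idea is easy to sketch in code. The toy ledger below is a simplified illustration, not any agency’s implementation: records are chained together by cryptographic hashes, so rewriting an earlier entry is immediately evident. Real blockchains add digital signatures, consensus protocols and peer-to-peer replication on top of this.

```python
# Toy hash-chained ledger: each block's hash covers the previous
# block's hash, so altering any past record invalidates the chain.
import hashlib
import json

def block_hash(block: dict) -> str:
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, record: str) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "record": record, "prev_hash": prev}
    block["hash"] = block_hash(block)
    chain.append(block)

def verify_chain(chain: list) -> bool:
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False  # block contents were altered after the fact
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False  # the linkage between blocks was broken
    return True

chain = []
append_block(chain, "title transferred: parcel 42 -> J. Smith")
append_block(chain, "maintenance record: part 7731 printed")
print(verify_chain(chain))        # True
chain[0]["record"] = "tampered"
print(verify_chain(chain))        # False -- tampering is evident
```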

Federal agencies are plunging ahead, both on their own and in concert with the General Services Administration’s Emerging Citizen Technology Office (GSA ECTO). The office groups blockchain with artificial intelligence and robotic automation, social and collaborative technologies, and virtual and augmented reality as its four most critical technologies. Its goal: Develop use cases and roadmaps to hasten government adoption and success with these new technologies.

“There’s a number of people who assume that fed agencies aren’t looking at things like blockchain,” Justin Herman, emerging technology lead and evangelist at GSA ECTO, told a gathering at the State of the Net Conference held Jan. 29 in Washington, D.C. “We got involved in blockchain because there were so many federal agencies coming to the table demanding government wide programs to explore the technology. People had already done analysis on what specific use cases they thought they had and wanted to be able to invest in it.”

Now his office is working with more than 320 federal, state and local agencies interested in one or more of its four emerging tech categories. “A lot of that is blockchain,” Herman said. “Some have already done successful pilots. We hear identity management, supply chain management…. We should be exploring those things together, not in little silos, not in walled gardens, but in public.”

Among those interested:

  • The Joint Staff’s J4 Logistics Directorate and the Deputy Assistant Secretary of Defense for Maintenance, Policy and Programs are collaborating on a project to create a digital supply chain, enabled by additive manufacturing (also known as 3-D printing). Blockchain’s role would be to secure the integrity of 3-D printing files, seen as “especially vulnerable to cyber threats and intrusions.” The Navy is looking at the same concept. “The ability to secure and securely share data throughout the manufacturing process (from design, prototyping, testing, production and ultimately disposal) is critical to Additive Manufacturing and will form the foundation for future advanced manufacturing initiatives,” writes Lt. Cmdr. Jon McCarter, a member of the Fiscal 2017 Secretary of the Navy Naval Innovation Advisory Council (NIAC).
  • The Office of the Undersecretary of Defense for Acquisition, Technology and Logistics (OUSD (AT&L)) Rapid Reaction Technology Office (RRTO) has similar designs on blockchain, seeing it as a potential solution for ensuring data provenance, according to a solicitation published Jan. 29.
  • The Centers for Disease Control’s Center for Surveillance, Epidemiology and Laboratory Services is interested in using blockchain for public health tracking, such as maintaining a large, reliable, current and shared database of opioid abuse or managing health data during crises. Blockchain’s distributed ledger system ensures that when one user updates the chain, everyone sees the same data, solving a major shortfall today, when researchers are often working with different versions of the same or similar data sets, rather than the same, unified data.
  • The U.S. Food and Drug Administration has similar interests in sharing health data for large-scale clinical trials.
  • The Office of Personnel Management last fall sought ideas for creating a new consolidated Employee Digital Record that would track an employee’s skills, performance and experience over the course of an entire career, using blockchain to keep records up to date and to speed transfers from one agency to another.

Herman sees his mission as bringing agencies together so they can combine expertise and resources and more quickly make progress. “There are multiple government agencies right now exploring electronic health records with blockchain,” he said. “But we can already see the hurdles with this because they are separate efforts, so we’re adding barriers. We’ve got to design new and better ways to move across agencies, across bureaucracies and silos, to test, evaluate and adopt this technology. It should be eight agencies working together on one pilot, not eight separate pilots on one particular thing.”

The Global Blockchain Business Council (GBBC) is an industry group advocating for blockchain technology and trying to take a similar approach in the commercial sector to what GSA is doing in the federal government. “We try to break down these traditionally siloed communities,” said Mercina Tilleman-Dick, chief operating officer for the GBBC.

These days, that means trying to get people together to talk about standards and regulation and connecting those who are having success with others just beginning to think about such issues. “Blockchain is not going to solve every problem,” Tilleman-Dick said. It could prove effective in a range of use cases where secure, up-to-date, public records are essential.

Take property records, for example. The Republic of Georgia moved all its land titles onto a blockchain-based system in 2017, Sweden is exploring the idea and the city of South Burlington, Vt., is working on a blockchain pilot for local real estate transactions. Patrick Byrne, founder of Overstock.com and its subsidiary Medici Ventures, announced in December he’s funding a startup expressly to develop a global property registry system using blockchain technology.

“I think over the next decade it will fundamentally alter many of the systems that power everyday life,” GBBC’s Tilleman-Dick said.

“Blockchain has the potential to revolutionize all of our supply chains. From machine parts to food safety,” said Adi Gadwale, chief enterprise architect for systems integrator General Dynamics Information Technology. “We will be able to look up the provenance and history of an item, ensuring it is genuine and tracing the life of its creation along the supply chain.

“Secure entries, immutable and created throughout the life of an object, allow for secure sourcing, eliminate fraud, forgeries and ensure food safety,” Gadwale said. “Wal-Mart has already begun trials of blockchain with food safety in mind.”

Hilary Swab Gawrilow, legislative director and counsel in the office of Rep. Jared Polis (D-Colo.), who is among the Congressional Blockchain Caucus leaders, said the government needs to do more to facilitate understanding of the technology. The rapid rise in the value of bitcoin, along with the wild fluctuations and speculation in digital cryptocurrencies, has done much to raise awareness. Yet it does not necessarily instill confidence in the concepts behind blockchain and distributed ledger technology.

“There are potential government applications or programs that deserve notice and study,” Swab Gawrilow said.

Identity management is a major challenge for agencies today. Citizens may have accounts with multiple agencies. Finding a way to verify status without building complicated links between disparate systems to enable benefits or confirm program eligibility would be valuable. The same is true for program accountability. “Being able to verify transactions would be another great way to use blockchain technology,” she said.

That’s where the caucus is coming from: A lot of its work is around education. Lawmakers have all heard of bitcoin, whether in a positive or negative way. “They understand what it is,” Gawrilow said. “But they don’t necessarily understand the underlying technology.” The caucus’ mission is to help inform the community.

Like GSA’s Herman, Gawrilow favors agency collaboration on new technology projects and pilots. “HHS did a hackathon on blockchain. The Postal Service put out a paper, and State is doing something. DHS is doing something. It’s every agency almost,” she said. “We’ve kicked around the idea of asking the administration to start a commission around blockchain.”

That, in turn, might surface issues requiring legislative action – “tweaks to the law” that underlie programs, such as specifications on information access, or a prescribed means of sharing or verifying data. That’s where lawmakers could be most helpful.

Herman, for his part, sees GSA as trying to fill that role, and to fill it in such a way that his agency can tie together blockchain and other emerging and maturing technologies. “It’s not the technology, it’s the culture,” he said. “So much in federal tech is approached as some zero-sum game, that if an agency is dedicating time to focus and investigate a technology like blockchain, people freak out because they’re not paying attention to cloud or something else.”

Agencies need to pool resources and intelligence, think in terms of shared services and shared approaches, break down walls and look holistically at their challenges to find common ground.

That’s where the payoff will come. Otherwise, Herman asks, “What does it matter if the knowledge developed isn’t shared?”

Relocatable Video Surveillance Systems Give CBP Flexibility on Border

Illegal border crossings fell to their lowest level in at least five years in 2017, but after plunging through April, the numbers have risen each of the past eight months, according to U.S. Customs and Border Protection (CBP).

Meanwhile, the debate continues: Build a physical wall spanning from the Gulf of Mexico to the Pacific Ocean, add more Border Patrol agents or combine better physical barriers with technology to stop drug trafficking, smuggling and illegal immigration?

Increasingly, however, it’s clear no single solution is right for every place. Ron Vitiello, acting deputy commissioner at CBP, said the agency intends to expand on the existing 652 miles of walls and fencing now in place – but not necessarily extend the wall the entire length of the border.

“We’re going to add to fill some of the gaps we didn’t get in the [previous] laydown, and then we’re going to prioritize some new wall [construction] across the border in places where we need it the most,” he said in a Jan. 12 TV interview.

Walls and barriers are a priority, Vitiello said in December at a CBP press conference. “In this society and all over our lives, we use walls and fences to protect things,” he said. “It shouldn’t be any different on the border. … But we’re still challenged with access, we’re still challenged with situational awareness and we’re still challenged with security on that border. We’re still arresting nearly 1,000 people a day.

“So we want to have more capability: We want more agents, we want more technology and we want that barrier to have a safer and more secure environment.”

Among the needs: Relocatable Remote Video Surveillance Systems (R-RVSS) that can be picked up and moved to where they’re needed most as border activity ebbs and flows in response to CBP’s border actions.

CBP mapped its fencing against its 2017 apprehension record in December (see map below), finding that areas with physical fencing, such as near the metropolitan centers of San Diego/Tijuana, Tucson/Nogales and El Paso/Juarez, are just as likely to see illegal migration activity as unfenced areas such as Laredo/Nuevo Laredo.

[Map: U.S.-Mexico border fencing mapped against 2017 apprehensions. Source: U.S. Customs and Border Protection]

Rep. Will Hurd (R-Tex.), vice chairman of the House Homeland Security subcommittee on Border and Maritime Security, is an advocate for technology as both a complement to and an alternative to physical walls and fences. “A wall from sea to shining sea is the least effective and most expensive solution for border security,” he argued Jan. 16. “This is especially true in areas like Big Bend National Park, where rough terrain, natural barriers and the remoteness of a location render a wall or other structure impractical and ineffective.”

CBP has successfully tested and deployed video surveillance systems to enhance situational awareness on the border and help Border Patrol agents track and respond to incursions. These RVSS installations use multiple day and night sensors mounted on poles to create an advance warning and tracking system that identifies potential border-crossing activity. Officers can monitor the sensor feeds remotely and dispatch agents as needed.

Savvy smugglers are quick to adjust when CBP installs new technologies, shifting their routes to less-monitored areas. The new, relocatable RVSS systems (R-RVSS) make it easy for CBP to respond in kind, forcing smugglers and traffickers to constantly adapt.

Robert Gilbert, a former Border Patrol sector chief at CBP and now a senior program director for RVSS at systems integrator General Dynamics Information Technology (GDIT), says relocatable systems will empower CBP with new tools and tactics. “Over the past 20 or 30 years, DOJ then CBP has always deployed technology into the busiest areas along the border, the places with the most traffic. In reality, because of the long procurement process, we usually deployed too late as the traffic had shifted to other locations on the border. The big difference with this capability is you can pick it up and move it to meet the evolving threat. The technology can be relocated within days.”

GDIT fielded a three-tower system in CBP’s Laredo (Texas) West area last summer and a similar setup in McAllen, Texas, in December. The towers – set two to five miles apart – were so effective, CBP is now preparing to buy up to 50 more units to deploy in the Rio Grande sector, where the border follows the river through rugged terrain. There, a physical wall may not be viable, while a technology-based virtual wall could prove highly effective.

Each tower includes an 80-foot-tall collapsible pole that can support a sensor and communications payload weighing up to 2,000 pounds. While that capacity far exceeds current needs, it provides a growth path for adding sensors or communications gear if requirements change later on.

When CBP wants to move the units, the poles are collapsed, the sensors packed away, and a standard 3/4- or 1-ton pickup truck can haul each system to its next location.

Roughly two-thirds of the U.S.-Mexico border runs through land not currently owned by the federal government, a major hurdle when it comes to building permanent infrastructure like walls or even fixed-site towers. Land acquisition would add billions to the cost even if owners agree to the sale. Where owners decline, the government might still be able to seize the land under the legal procedure known as eminent domain, but such cases can take years to resolve.

By contrast, R-RVSS requires only a temporary easement from the land owner. Site work is bare bones: no concrete pad, just a cleared area measuring roughly 40 feet by 40 feet. It need not be level – the R-RVSS system is designed to accommodate slopes up to 10 degrees. Where grid power is unavailable – likely in remote areas – a generator or even a hydrogen fuel cell can produce needed power.

What’s coming next
CBP seeks concepts for a Modular Mobile Surveillance System (M2S2) similar to RVSS, which would provide the Border Patrol with an even more rapidly deployable system for detecting, identifying, classifying and tracking “vehicles, people and animals suspected of unlawful border crossing activities.”

More ambitiously, CBP also wants such systems to incorporate data science and artificial intelligence to add a predictive capability. The system would “detect, identify, classify, and track equipment, vehicles, people, and animals used in or suspected of unlawful border crossing activities,” and employ AI to help agents anticipate direction of travel so they can quickly respond to and resolve each situation.

At the same time, CBP is investigating RVSS-like systems for coastal areas. Pole-mounted systems would train their sensors on coastal waters, where smugglers in small boats seek to exploit the shallows by operating close to shore, rather than the deeper waters patrolled by Coast Guard and Navy ships.

In a market research request CBP floated last June, the agency described a Remote Surveillance System Maritime (RSS-M) as “a subsystem in an overall California Coastal Surveillance demonstration.” The intent: to detect, track, identify, and classify surface targets of interest, so the Border Patrol and partner law enforcement agencies can interdict such threats.

Legislating Tech
Rep. Hurd, Rep. Peter Aguilar (D-Calif.) and a bipartisan group of 49 other members of Congress support the ‘‘Uniting and Securing America Act of 2017,’’ or “USA Act.” The measure includes a plan to evaluate every mile of the U.S.-Mexico border to determine the best security solution for each. After weeks of Senate wrangling over immigration matters, Sens. John McCain (R-Ariz.) and Chris Coons (D-Del.) offered a companion bill in the Senate on Feb. 5.

With 820 miles of border in his district, Hurd says, few in Congress understand the border issue better than he – or feel it more keenly.

“I’m on the border almost every weekend,” he said when unveiling the proposal Jan. 16. The aim: “Full operational control of our border by the year 2020,” Hurd told reporters. “We should be able to know who’s going back and forth across our border. The only way we’re going to do that is by border technologies.” And in an NPR interview that day, he added: “We should be focused on outcomes. How do we get operational control of that border?”

The USA Act would require the Department of Homeland Security to “deploy the most practical and effective technology available along the United States border for achieving situational awareness and operational control of the border” by Inauguration Day 2021. That includes radar surveillance systems; Vehicle and Dismount Exploitation Radars (VADER); three-dimensional, seismic acoustic detection and ranging border tunneling detection technology; sensors; unmanned cameras; drone aircraft and anything else that proves more effective or advanced. The technology is seen as complementing and supporting hard infrastructure.

The ABCs of 2018 Federal IT Modernization: I to Z

In part two of GovTechWorks’ analysis of the Trump Administration’s federal IT modernization plan, we examine the likely guiding impact of the Office of Management and Budget, the manner in which agencies’ infrastructures might change, and the fate of expensive legacy systems.

The White House IT modernization plan released in December seeks a rapid overhaul of IT infrastructure across federal civilian agencies, with an emphasis on redefining the government’s approach to managing its networks and securing its data. Here, in this second part of our two-part analysis, is what you need to know from I to Z (for A-H, click here):

I is for Infrastructure
Modernization boils down to three things: Infrastructure, applications and security. Imagine if every government agency managed its own telephone network or international logistics office, rather than outsourcing such services. IT services are essentially the same. Agencies still need expertise to connect to those services – they still have telecom experts and mail room staff – but they don’t have to manage the entire process.

Special exceptions will always exist for certain military, intelligence (or other specialized) requirements. Increasingly, IT services are becoming commodity services purchased on the open market. Rather than having to own, manage and maintain all that infrastructure, agencies will increasingly buy infrastructure as a service (IaaS) in the cloud — netting faster, perpetually maintained and updated equipment at a lower cost. To bring maximum value – and savings – out of those services, they’ll have to invest in integration and support services to ensure their systems are not only cost effective, but also secure.

J is for JAB, the Joint Authorization Board
The JAB combines expertise at the General Services Administration (GSA), Department of Homeland Security (DHS) and the Department of Defense (DOD). It issues provisional authority to operate (ATO) for widely used cloud services. The JAB will have a definitive role in prioritizing and approving commercial cloud offerings for the highest-risk federal systems.

K is for Keys
The ultimate solution for scanning encrypted data for potential malicious activity is to decrypt that data for thorough examination. This involves first having access to the encryption keys for federal data and then securing those keys to ensure they don’t fall into the wrong hands. In short, these keys are key to the federal strategy of securing both government data and government networks.
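
The tradeoff is easy to see in miniature. The hedged sketch below uses the open-source Python cryptography package: a sensor can inspect traffic only if it has been escrowed the key. The scanner logic and names are illustrative assumptions, not the federal architecture itself.

```python
# Sketch of key escrow for traffic inspection, using the open-source
# "cryptography" package (pip install cryptography). Illustrative only.
from cryptography.fernet import Fernet

escrowed_key = Fernet.generate_key()    # the key itself must be protected
channel = Fernet(escrowed_key)

ciphertext = channel.encrypt(b"routine agency payload")  # what crosses the wire

def looks_malicious(plaintext: bytes) -> bool:
    # Stand-in for a real sensor suite's content inspection
    return b"malicious" in plaintext

# Without the escrowed key the sensor sees only ciphertext; with it,
# the data can be decrypted, scanned and passed along.
print(looks_malicious(channel.decrypt(ciphertext)))   # False
```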

L is for Legacy
The government still spends 70 percent of its IT budget managing legacy systems. That’s down from as much as 85 percent a few years ago, but still too much. In a world where volumes of data continue to expand exponentially and the cost of computer processing power continues to plunge, how long can the government afford to overspend on last year’s (or last decade’s) aging and less secure technology?

M is for Monopsony
A monopoly occurs when one source controls the supply of a given product, service or commodity. A monopsony occurs when a single customer controls the consumption of products, services or commodities. In a classical monopsony, the sole customer dictates terms to all sellers.

Despite its size, the federal government cannot dictate terms to information technology vendors. It can consolidate its purchasing power to increase leverage, and that’s exactly what the government will do in coming years. The process begins with networking services as agencies transition from the old Networx contract to the new Enterprise Information Services vehicle.

Look for it to continue as agencies consolidate purchasing power for commodity software services, such as email, continuous monitoring and collaboration software.

The government may not ultimately wield the full market power of a monopsony, but it can leverage greater negotiating power by centralizing decision making and consolidating purchase and licensing agreements. Look for that to increase significantly in the years ahead.

N is for Networks
Networks used to be the crown jewels of the government’s information enterprise, providing the glue that held systems together and enabling the government to operate. But if the past few years proved anything, it’s that you can’t keep the bad guys out. They’re already in, looking around, waiting for an opportunity.

Networks remain essential infrastructure, but they will increasingly be virtualized, existing in software, with encrypted data traveling on commercial fiber and stored, much of the time, in commercial data centers (generically referred to as the cloud). You may not keep the bad guys out, but you can control what they get access to.

O is for OMB
The Office of Management and Budget has oversight over much of the modernization plan. The agency is mentioned 127 times in the White House plan, including 47 times in its 50 recommendations. OMB will be either the responsible party or the receiving party for work done by others on 34 of those 50 recommendations.

P is for Prioritization
Given the vast number of technical, manpower and security challenges that weigh down modernization efforts, prioritizing programs that can deliver the greatest payoff is essential. Agencies are expected to prioritize and focus their modernization efforts on high-value assets that pose the greatest vulnerabilities and risks. From those lists, DHS must identify six by June 30 to receive centralized interventions that include staffing and technical support.

The aim is to prioritize where new investment, talent infusions and security policies will make the greatest difference. To maximize that effort, DHS may choose projects that can expand to include other systems and agencies.

OMB must also review and prioritize any impediments to modernization and cloud adoption.

Q is for Quick Start
Technology is often not the most complicated part of modernization efforts. Finding a viable acquisition strategy that won’t put yesterday’s technology in the government’s hands tomorrow is often harder. That’s why the report directs OMB to assemble an Acquisition Tiger Team to develop a “quick start” acquisition package to help agencies more quickly license technology and migrate to the cloud.

The aim: combine market research, acquisition plans, readily identified sources and templates for both requests for quotes (RFQs) and Independent Government Cost Estimate (IGCE) calculations — which would be based on completed acquisitions. The tiger team will also help identify qualified small and disadvantaged businesses to help agencies meet set-aside requirements.

R is for Recommendations
There are 50 recommendations in the White House IT modernization report with deadlines ranging from February to August, making the year ahead a busy one for OMB, DHS and GSA, the three agencies responsible for most of the work. A complete list of the recommendations is available here.

T is for the TIC
The federal government developed the Trusted Internet Connection as a means of controlling the number of on- and off-ramps between government networks and the largely unregulated internet. But in a world now dominated by cloud-based software applications, remote cloud data centers, mobile computing platforms and web-based interfaces that may access multiple different systems to deliver information in context, the TIC needs to be rethought.

“The piece that we struggled with is the Trusted Internet Connections (TIC) initiative – that is a model that has to mature and get solved,” former Federal CIO Tony Scott told Federal News Radio. “It’s an old construct that is applied to modern-day cloud that doesn’t work. It causes performance, cost and latency issues. So the call to double down and sort that out is important. There has been a lot of good work that has happened, but the definitive solution has not been figured out yet.”

The TIC policy is the heart and soul of the government’s perimeter-based security model. Already, some agencies have chosen to bypass the TIC for certain cloud-based services, such as Office 365, trusting Microsoft’s security and recognizing that if all that data had to go through an agency’s TIC, performance would suffer.

To modernize TIC capabilities, policies, reference architectures and associated cloud security authorization baselines, OMB must update TIC policies so agencies have a clear path forward to build out data-level protections and more quickly migrate to commercial cloud solutions. A 90-day sprint is to begin in mid-February, during which projects approved by OMB will pilot proposed changes in TIC requirements.

OMB must determine whether all data traveling to and from agency information systems hosted by commercial cloud providers warrants scanning by DHS, or whether only some information needs to be scanned. Other considerations under review: Expanding the number of TIC access points in each agency and a model for determining how best to implement intrusion detection and prevention capabilities into cloud services.

U is for Updating the Federal Cloud Computing Strategy
The government’s “Cloud First” policy is now seven years old. Updates are in order. By April 15, OMB must provide additional guidance on both appropriate use cases and operational security for cloud environments. All relevant policies on cloud migration, infrastructure consolidation and shared services will be reviewed.

In addition, OMB has until June to develop standardized contract language for cloud acquisition, including clauses that define consistent requirements for security, privacy and access to data. Establishing uniform contract language will make it easier to compare and broker cloud offerings and ensure government requirements are met.

V is for Verification
Verification or authentication of users’ identities is at the heart of protecting government information. Are you who you say you are? Key to securing information systems is ensuring that access is granted only to users who can be identified and verified as deserving it.

OMB has until March 1 to issue for public comment new identity policy guidance and to recommend identity service areas suitable for shared services. GSA must provide a business case for consolidating existing identity services to improve usability, drive secure access and enable cloud-based collaboration services that make it easier to share and collaborate across agencies – something that can be cumbersome today.

W, X, Y, Z is for Wrapping it All Up
The Federal Government is shifting to a consolidated IT model that will change the nature of IT departments and the services they buy. Centralized offerings for commodity IT – whether email, office tools and other common software-as-a-service offerings or virtual desktops and web hosting – will be the norm. As much as possible, the objective is to get agencies on the same page, using the same security, collaboration and data services, and to make those common (or in some cases shared) across multiple agencies.

Doing so promises to reduce manpower needs and licensing costs by eliminating duplication of effort and increasing market leverage to drive down prices. But getting there will not be easy. Integration and security pose unique challenges in a government context, requiring skill, experience and specific expertise. On the government side, policy updates will only solve some of the challenges. Acquisition regulations must also be updated to support wider adoption of commercial cloud products.

Some agencies will need more help than others. Cultural barriers will continue to be major hurdles. Inevitably, staff will have to develop new skills as old ones disappear. Yet even in the midst of all that upheaval, some things don’t change. “In the end, IT modernization is really all about supporting the mission,” says Stan Tyliszczak, chief engineer at systems integrator General Dynamics Information Technology. “It’s about helping government employees complete their work, protecting the privacy of our citizens and ensuring both have timely access to the information and services they need. IT has always made those things better and easier, and modernization is only necessary to continue that process. That much never changes.”

 

The ABCs of 2018 Federal IT Modernization: A to H

The White House issued its IT modernization plan last December and followed it with an ambitious program that could become a proving ground for rapidly overhauling IT infrastructure, data access and customer service. After years of talking about IT modernization, cybersecurity and migration to the cloud, federal agencies are now poised to ramp up the action.

Here, in A-B-C form, is what you need to know from A to H:

A is for Agriculture
The U.S. Department of Agriculture (USDA) will be a sort of proving ground for implementing the Trump administration’s vision for the future high-tech, high-performance, customer-satisfying government. USDA announced in December 2017 it will collapse 39 data centers into one (plus a backup), and consolidate 22 independent chief information officers under a single CIO with seven deputies. The aim: reinvent the agency as a modern, customer-centered organization and provide its leaders with instant access to a wealth of agency data.

B is for Better Citizen Services
“It is imperative for the federal government to leverage … innovations to provide better service for its citizens in the most cost-effective and secure manner,” the report states – in just its third sentence. Yes, modernization should ultimately save money by reducing the billions spent to keep aging systems operational. And yes, it should help overcome the patchwork of cybersecurity point solutions now used to protect federal networks, systems and data.

USDA Secretary Sonny Perdue’s experience modernizing government IT during two terms as governor of Georgia from 2003 to 2011 convinced him he could achieve similar results at the federal level. “He really saw, in reinventing Georgia government, how IT modernization and delivering better customer service benefitted not only employees, but the people of the state,” Deputy Secretary of Agriculture Steve Censky said in a TV interview.

Among the agency’s goals: Increase access to information throughout the agency by means of online service portals and advanced application program interfaces.

C is for Centers of Excellence
USDA won’t be going it alone. Under the direction of the Office of Science and Technology Policy, the agency will be the first to engage with a new set of experts at the General Services Administration (GSA). GSA is on an accelerated course to create five Centers of Excellence, leveraging both public and private sector expertise to develop best practices and standards that agencies can use for:

  • Cloud adoption
  • IT infrastructure optimization
  • Customer experience
  • Service delivery analytics
  • Contact centers

Jack Wilmer, White House senior advisor for Cybersecurity and IT Modernization, says the idea is to provide each agency’s modernization effort with the same core concepts and approach – and the best available experts. “We’re trying to leverage private sector expertise, bringing them in a centralized fashion, making them available to government agencies as they modernize,” he told Government Matters.

GSA planned to award contracts to industry partners by the end of January – just 45 days after its initial solicitation – but as of March 5, no contracts had been awarded. Phase 1 contracts for assessment, planning and some initial activities should be finalized soon. Phase 2 awards for cloud migration, infrastructure optimization and customer experience are expected by the end of the year, Joanne Collins Smee, acting director of GSA’s Technology Transformation Service and deputy commissioner of the Federal Acquisition Service, said at a March 1 AFCEA event in Washington, D.C.

D is for Data Centers
While not all data centers will close, many more will soon disappear. Modernization is about getting the government out of the business of managing big infrastructure investments and instead leveraging commercial cloud infrastructure and technology wherever possible. But don’t think your agency’s data won’t be in a data center somewhere.

“What is the cloud, anyway? Isn’t it really someone else’s data center, available on demand?” says Stan Tyliszczak, chief engineer at systems integrator General Dynamics Information Technology (GDIT). “Moving to the cloud means getting out of the business of running that data center yourself.”

The White House splits its cloud strategy into two buckets:

  • “Bring the government to the cloud.” Put government data and applications in privately owned and operated infrastructure, where they are protected through encryption and other security technologies. This is the public cloud, where government data sits side by side with private data in third-party data centers.
  • “Bring the cloud to the government.” Put government data and applications on vendor-owned infrastructure located in government-owned facilities, as the Intelligence Community Information Technology Enterprise (IC ITE) does with the IC’s Commercial Cloud Services (C2S) contract with Amazon Web Services.

Figuring out what makes sense when depends on the use case and, for most agencies, will mean a combination of on-premise solutions, shared government services and commercial services in public clouds. “That’s the hybrid cloud model everyone’s talking about. But it’s not a trivial exercise. Melding those together is the challenge,” Tyliszczak says. “That’s what integrators are for.”

E is for Encryption
Government cybersecurity efforts have historically focused on defending the network and its perimeter, rather than the data that travels on that network. As cloud services are integrated with conventional on-premise IT solutions, securing the data itself has become essential. At least 47 percent of federal network traffic is encrypted today – frustrating agency efforts to monitor what’s crossing network perimeters.

“Rather than treating Federal networks as trusted entities to be defended at the perimeter,” the modernization report advised, “agencies should shift their focus to placing security protections closer to data.”

To do that, the government must improve the way it authenticates devices and users on its networks, securing who has access and how, and encrypting data both at rest and in transit.

“Now you’re starting to obfuscate whether your sensors can actually inspect the content of that data,” notes Eric White, cybersecurity program director at GDIT’s Health and Civilian Solutions Division. “Because it’s now encrypted, you add another layer of complexity to know for sure whether it’s the good guys or the bad guys moving data in and out of your network.”

White notes that the Department of Homeland Security (DHS) is charged with solving this encryption dilemma, balancing the millions of dollars in investment in high-end network-monitoring sensors, such as those associated with the Einstein program, against protecting individual privacy. Enabling those sensors to see through or decipher encrypted data without undermining the security of the data – or the privacy of individuals – is a critical priority. DHS has commissioned research to develop potential solutions, including virtualizing sensors for cloud environments; relocating sensors to the endpoints of encrypted tunnels; creating man-in-the-middle solutions that intercept data in motion; or providing the sensors with decryption keys.

F is for FedRAMP

The Federal Risk and Authorization Management Program (FedRAMP) remains the critical process for ensuring private-sector cloud offerings meet government security requirements. Look for updates to FedRAMP baselines that could allow tailoring of security controls for low-risk systems, address new approaches to integrating cloud services with federal Trusted Internet Connection (TIC) services and consider common features or capabilities that could be incorporated into higher-risk systems with FedRAMP “high” baselines.

Importantly, the report directs the General Services Administration (GSA), which manages FedRAMP, to come up with new solutions that make it easier for a software-as-a-service (SaaS) product already authorized for use in one agency to be accepted for use in another. Making the process for issuing an authority to operate (ATO) faster and easier to reuse has long been a goal of both cloud providers and government customers. This is particularly critical for shared services, in which one agency provides its approved commercial solution to another agency.

G is for GSA
Already powerfully influential as a buyer and developer for other agencies, GSA stands to become even more influential as the government moves to consolidate networks and other IT services into fewer contracts and licensing agreements, and to increase the commonality of solutions across the government.

This is especially true among smaller agencies that lack the resources, scale and expertise to effectively procure and manage their own IT services.

H is for Homeland Security
DHS is responsible for the overall cybersecurity of federal civilian systems. The only federal entity mentioned more frequently in the White House modernization report is the Office of Management and Budget, which is the White House agency responsible for implementing the report’s guidance.

DHS was mandated to issue a report by Feb. 15 identifying the common weaknesses of the government’s highest-value IT assets and recommending solutions for reducing risk and vulnerability government-wide. By May 15, the agency must produce a prioritized list of systems “for government-wide intervention” and will provide a host of advisory and support services to help secure government systems. DHS also owns and manages the National Cybersecurity Protection System (NCPS) and the EINSTEIN sensor suites, which capture and analyze network flow, detect intruders and scan the data coming in and out of government systems to identify potentially malicious activity and, in the case of email, block and filter threatening content.

Look for next week’s edition of GovTechWorks for Part 2: Modernization from I to Z. In Part 2, we outline how infrastructure among government agencies will be impacted and streamlined by modernization, as well as discuss the fate of legacy systems and their maintenance budgets, and the major role the Office of Management and Budget will play in overall implementation.


 

Design Thinking and DevOps Combine for Better Customer Experience

How citizens interact with government websites tells you much about how to improve – as long as you’re paying attention, said Aaron Wieczorek, digital services expert with the U.S. Digital Service team at the Department of Veterans Affairs.

“At VA we will literally sit down with veterans, watch them work with the website and apply for benefits,” he said. The aim is to make sure the experience is what users want and expect, he said, not “what we think they want.”

Taking copious notes on their observations, the team then sets to work on programming improvements that can be quickly put to the test. “Maybe some of the buttons were confusing or some of the way things work is confusing – so we immediately start reworking,” Wieczorek explained.

Applying a modern agile development approach means digital services can immediately put those tweaks to the test in their development environment. “If it works there, good. Then it moves to staging. If that’s acceptable, it deploys into production,” Wieczorek said.

That process can happen in days. Vets.gov deploys software updates into production 40 times per month, Wieczorek said, and agency-wide, to all kinds of environments, 600 times per month.
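
In outline, that promotion flow looks something like the sketch below – a simplified illustration of a dev-staging-production pipeline, with environment names and gate checks assumed for the example rather than drawn from VA’s actual tooling.

```python
# Minimal sketch of a promotion pipeline: a build advances only when
# the checks for the current environment pass. Illustrative only.
ENVIRONMENTS = ["development", "staging", "production"]

def checks_pass(env: str) -> bool:
    """Stand-in for the real gate at each stage: automated tests in
    development, acceptance review in staging, monitoring in production."""
    return True  # replace with real checks

def promote(build: str) -> None:
    for env in ENVIRONMENTS:
        print(f"deploying {build} to {env}")
        # ... push the build to this environment here ...
        if not checks_pass(env):
            print(f"{env} checks failed; halting promotion of {build}")
            return
    print(f"{build} is live in production")

promote("vets-gov-2018-02-07.1")   # hypothetical build name
```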

Case in point: Vets.gov’s digital Form 1010 EZ, which allows users to apply for VA healthcare online.

“We spent hundreds of hours watching veterans, and in the end we were able to totally revamp everything,” Wieczorek said. “It’s actually so easy now, you can do it all on your phone.” More than 330,000 veterans have applied that way since the digital form was introduced. “I think that’s how you scale things.”

Of course, one problem remains: Vets.gov is essentially a veteran-friendly alternative site to VA.gov, which may not be obvious to search engines or veterans looking for the best way in the door. Search Google for “VA 1010ez” and the old, mobile-unfriendly PDF form still shows as the top result. The new mobile-friendly application? It’s the third choice.

At the National Geospatial-Intelligence Agency, developers take a similar approach, but focus hard on balancing speed, quality and design for maximum results. “We believe that requirements and needs should be seen like a carton of milk: The longer they sit around, the worse they get,” said Corry Robb, product design lead in the agency’s Office of GEOINT Services. “We try to handle that need as quickly as we can and deliver that minimally viable product to the user’s hands as fast as we can.”

DevOps techniques, where development and production processes take place simultaneously, increase speed. But speed alone is not the measure of success, Robb said. “Our agency needs to focus on delivering the right thing, not just the wrong thing faster.” So in addition to development sprints, his team has added “design sprints to quickly figure out the problem-solution fit.”

Combining design thinking, which focuses on using design to solve specific user problems, is critical to the methodology, he said. “Being hand in hand with the customer – that’s one of the core values our group has.”

“Iterative development is a proven approach,” said Dennis Gibbs, who established the agile development practice in General Dynamics Information Technology’s Intelligence Solutions Division. “Agile and DevOps techniques accelerate the speed of convergence on a better solution.  We continually incorporate feedback from the user into the solution, resulting in a better capability delivered faster to the user.”

Is Identity the New Perimeter? In a Zero-Trust World, More CISOs Think So

As the network perimeter morphs from physical to virtual, the old Tootsie Pop security model – hard shell on the outside with a soft and chewy center – no longer works. The new mantra, as Mittal Desai, chief information security officer (CISO) at the Federal Energy Regulatory Commission, said at the ATARC CISO Summit: “Never trust, double verify.”

The zero-trust model modernizes conventional network-based security for a hybrid cloud environment. As agencies move systems and storage into the cloud, networks are virtualized and security naturally shifts to users and data. That’s easy enough to do in small organizations, but rapidly grows harder with the scale and complexity of an enterprise.

The notion of zero-trust security first surfaced five years ago in a Forrester Research report prepared for the National Institute of Standards and Technology (NIST). “The zero-trust model is simple,” Forrester posited then. “Cybersecurity professionals must stop trusting packets as if they were people. Instead, they must eliminate the idea of a trusted network (usually the internal network) and an untrusted network (external networks). In zero-trust, all network traffic is untrusted.”

Cloud adoption by its nature is forcing the issue, said Department of Homeland Security Chief Technology Officer Mike Hermus, speaking at a recent Tech + Tequila event: “It extends the data center,” he explained. “The traditional perimeter security model is not working well for us anymore. We have to work toward a model where we don’t trust something just because it’s within our boundary. We have to have strong authentication, strong access control – and strong encryption of data across the entire application life cycle.”

Indeed, as other network security features mature, identity – and the access that goes with it – is now the most common cybersecurity attack vector. Hackers favor phishing and spear-phishing attacks because they’re inexpensive and effective – and the passwords they yield are like the digital keys to an enterprise.

About 65 percent of breaches cited in Verizon’s 2017 Data Breach Investigations Report made use of stolen credentials.

Interestingly, however, identity and access management represents only a small fraction of cybersecurity investment – less than 5 percent – according to Gartner’s market analysts. Network security equipment, by contrast, constitutes more than 12 percent. Enterprises continue to invest in the Tootsie Pop model even as its weaknesses become more evident.

“The future state of commercial cloud computing makes identity and role-based access paramount,” said Rob Carey, vice president for cybersecurity and cloud solutions within the Global Solutions division at General Dynamics Information Technology (GDIT). Carey recommends creating both a framework for better understanding the value of identity management tools, and metrics to measure that impact. “Knowing who is on the network with a high degree of certainty has tremendous value.”

Tom Kemp, chief executive officer at Centrify, which provides cloud-based identity services, has a vested interest in changing that mix. Centrify, based in Sunnyvale, Calif., combines identity data with location and other information to help ensure only authorized, verified users access sensitive information.

“At the heart of zero-trust is the realization that an internal user should be treated just like an external user, because your internal network is just as polluted as your outside network,” Kemp said at the Feb. 7 Institute for Critical Infrastructure (ICIT) Winter Summit. “You need to move to constant verification.” Reprising former President Ronald Reagan’s “trust but verify” mantra, he adds: “Now it’s no trust and always verify. That’s the heart of zero-trust.”

The Google Experience
When Google found itself hacked in 2009, the company launched an internal project to find a better way to keep hackers out of its systems. Instead of beefing up firewalls and tightening virtual private network settings, Google’s BeyondCorp architecture dispensed with the Tootsie Pop model in which users logged in and then gained access to all manner of systems and services.

In its place, Google chose to implement a zero-trust model that challenges every user and every device on every data call – regardless of how that user accessed the internet in the first place.

While that flies in the face of conventional wisdom, Google reasoned that by tightly controlling the device and user permissions to access data, it had found a safer path.

Here’s an example of how that works when an engineer with a corporate-issued laptop wants to access an application from a public Wi-Fi connection:

  1. The laptop provides its device certificate to an access proxy.
  2. The access proxy confirms the device, then redirects to a single-sign-on (SSO) system to verify the user.
  3. The engineer provides primary and second-factor authentication credentials and, once authenticated by the SSO system, is issued a token.
  4. Now, with the device certificate to identify the device and the SSO token to identify the user, an Access Control Engine can perform a specific authorization check for every data access. The user must be confirmed to be in the engineering group; to possess a sufficient trust level; and to be using a managed device in good standing with a sufficient trust level.
  5. If all checks pass, the request is passed to an appropriate back-end system and the data access is allowed. If any of the checks fail however, the request is denied. This is repeated every time the engineer tries to access a data item.
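
In code, that per-request decision reduces to a policy check along the lines of the sketch below – a simplified illustration based on Google’s published BeyondCorp description, with the data structures and trust levels assumed for the example rather than taken from Google’s actual code.

```python
# Simplified sketch of a BeyondCorp-style access decision.
from dataclasses import dataclass

@dataclass
class Device:
    certificate_valid: bool   # step 1: device certificate checked
    managed: bool             # device must be in good standing
    trust_level: int          # derived from device inventory state

@dataclass
class User:
    sso_token_valid: bool     # steps 2-3: SSO plus second factor
    groups: set
    trust_level: int

def authorize(user: User, device: Device,
              resource_group: str, required_trust: int) -> bool:
    """Step 4: repeated for every data access, regardless of network."""
    if not (device.certificate_valid and device.managed):
        return False
    if not user.sso_token_valid or resource_group not in user.groups:
        return False
    # both the user and the device must meet the resource's trust threshold
    return min(user.trust_level, device.trust_level) >= required_trust

laptop = Device(certificate_valid=True, managed=True, trust_level=3)
engineer = User(sso_token_valid=True, groups={"engineering"}, trust_level=3)
print(authorize(engineer, laptop, "engineering", required_trust=2))  # True
```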

“That’s easy enough when those attributes are simple and clear cut, as with the notional Google engineer,” said GDIT’s Carey, who spent three decades managing defense information systems. “But it gets complicated in a hurry if you’re talking about an enterprise on the scale of the Defense Department or Intelligence community.”

Segmenting the Sprawling Enterprise
A takeaway from 9/11 was that intelligence agencies needed to be better and faster at sharing threat data across agency boundaries. Opening databases across agency divisions, however, had consequences: Chelsea Manning, at the time Pfc. Bradley Manning, delivered a treasure trove of stolen files to WikiLeaks and then a few years later, Edward Snowden stole countless intelligence documents, exposing a program designed to collect metadata from domestic phone and email records.

“The more you want to be sure each user is authorized to see and access only the specific data they have a ‘need-to-know,’ the more granular the identity and access management schema need to be,” Carey said. “Implementing role-based access is complicated because you’ve got to develop ways to both tag data and code users based on their authorized need. Absent a management schema, that can quickly become difficult to manage for all but the smallest applications.”

Consider a scenario of a deployed military command working in a multinational coalition with multiple intelligence agencies represented in the command’s intelligence cell. The unit commands air and ground units from all military services, as well as civilians from defense, intelligence and possibly other agencies. Factors determining individual access to data might include the person’s job, rank, nationality, location and security clearance. Some missions might include geographic location, but others can’t rely on that factor because some members of the task force are located thousands of miles away, or operating from covert locations.

That scenario gets even more complicated in a hybrid cloud environment where some systems are located on premise, and others are far away. Managing identity-based access gets harder anyplace where distance or bandwidth limitations cause delays. Other integration challenges include implementing a single-sign-on solution across multiple clouds, or sharing data by means of an API.

Roles and Attributes
To organize access across an enterprise – whether in a small agency or a vast multi-agency system such as the Intelligence Community Information Technology Enterprise (IC ITE) – information managers must make choices. Access controls can be based on individual roles – such as job level, function and organization – or data attributes – such as type, source, classification level and so on.

“Ultimately, these are two sides of the same coin,” Carey said. “The real challenge is the mechanics of developing the necessary schema to a level of granularity that you can manage, and then building the appropriate tools to implement it.”
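
A hedged sketch shows what such a schema looks like at its simplest: access decisions turn on user attributes matched against a policy attached to the data. The attribute names below are illustrative assumptions, not an actual DOD or Intelligence Community schema.

```python
# Toy attribute-based access check for the coalition scenario above.
def can_access(user: dict, data_policy: dict) -> bool:
    clearances = ["UNCLASSIFIED", "SECRET", "TOP SECRET"]
    # user clearance must meet or exceed the data's classification
    if clearances.index(user["clearance"]) < clearances.index(
            data_policy["classification"]):
        return False
    # data may be releasable only to certain nationalities
    if user["nationality"] not in data_policy["releasable_to"]:
        return False
    # optionally restrict by job role as well
    if data_policy.get("roles") and user["role"] not in data_policy["roles"]:
        return False
    return True

analyst = {"clearance": "SECRET", "nationality": "USA", "role": "intel"}
policy = {"classification": "SECRET",
          "releasable_to": {"USA", "GBR", "AUS"},
          "roles": {"intel", "targeting"}}
print(can_access(analyst, policy))  # True
```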

For example, the Defense Department intends to use role-based access controls for its Joint Information Enterprise (JIE), using the central Defense Manpower Data Center (DMDC) personnel database to connect names with jobs. The available fields in that database are, in effect, the limiting factors on just how granular role-based access controls will be under JIE.

Access controls will only be one piece of JIE’s enterprise security architecture. Other features, ranging from encryption to procedural controls that touch everything from the supply chain to system security settings, will also contribute to overall security.

Skeptical of Everything
Trust – or the lack of it – plays out in each of these areas, and requires healthy skepticism at every step. Rod Turk, CISO at the Department of Commerce, said CISOs need to be skeptical of everything. “I’m talking about personnel, I’m talking about relationships with your services providers,” he told the ATARC CISO Summit.  “We look at the companies we do business with and we look at devices, and we run them through the supply chain.  And I will tell you, we have found things that made my hair curl.”

Commerce’s big push right now is the Decennial Census, which will collect volumes of personal information (PI) and personally identifiable information (PII) on almost every living person in the United States. Conducting a census every decade is like doing a major system reset each time. The next census will be no different, employing mobile devices for census takers and for the first time, allowing individuals to fill out census surveys online. Skepticism is essential because the accuracy of the data depends on the public’s trust in the census.

In a sense, that’s the riddle of the whole zero-trust concept: In order to achieve a highly trusted outcome, CISOs have to start with no trust at all.

Yet trust also cuts in the other direction. Today’s emphasis on modernization and migration to the cloud means agencies face tough choices. “Do we in the federal government trust industry to have our best interests in mind to keep our data in the cloud secure?” Turk asked rhetorically.

In theory, the Federal Risk and Authorization Management Program (FedRAMP) sets baseline requirements for establishing trust, but doubts persist. What satisfies one agency’s requirements may not satisfy another. Compliance with FedRAMP or NIST controls equates to risk management rather than actual security, GDIT’s Carey points out. They’re not the same thing.

Identity and Security
Beau Houser, CISO at the Small Business Administration, is more optimistic, encouraged by improvements he’s seen as compartmentalized legacy IT systems are replaced with centralized, enterprise solutions in a Microsoft cloud.

“As we move to cloud, as we roll out Windows 10, Office 365 and Azure, we’re getting all this rich visibility of everything that’s happening in the environment,” he said. “We can now see all logins on every web app, whether that’s email or OneDrive or what have you, right on the dashboard. And part of that view is what’s happening over that session: What are they doing with email, where are they moving files.… That’s visibility we didn’t have before.”

Leveraging that visibility effectively extends that notion of zero-trust one step further, or at least shifts it into the realm of a watchful parent rather than one who blindly trusts his teenage children. The watchful parent believes trust is not a right, but an earned privilege.

“Increased visibility means agencies can add behavioral models to their security controls,” Carey said. “Behavioral analysis tools that can match behavior to what people’s roles are supposed to be and trigger warnings if people deviate from expected norms, is the next big hurdle in security.”

As Christopher Wlaschin, CISO at the Department of Health and Human Services, says: “A healthy distrust is a good thing.”
