Close the Data Center, Skip the TIC – How One Agency Bought Big Into Cloud

It’s no longer a question of whether the federal government is going to fully embrace cloud computing. It’s how fast.

With the White House pushing for cloud services as part of its broader cybersecurity strategy and budgets getting squeezed by the administration and Congress, chief information officers (CIOs) are coming around to the idea that the faster they can modernize their systems, the faster they’ll be able to meet security requirements. And that once in the cloud, market dynamics will help them drive down costs.

“The reality is, our data-center-centric model of computing in the federal government no longer works,” says Chad Sheridan, CIO at the Department of Agriculture’s Risk Management Agency. “The evidence is that there is no way we can run a federal data center at the moderate level or below better than what industry can do for us. We don’t have the resources, we don’t have the energy and we are going to be mired with this millstone around our neck of modernization for ever and ever.”

Budget pressure, demand for modernization and concern about security all combine as a forcing function that should be pushing most agencies rapidly toward broad cloud adoption.

Joe Paiva, CIO at the International Trade Administration (ITA), agrees. He used an expiring lease as leverage to force his agency into the cloud soon after he joined ITA three years ago. Time and again the lease was presented to him for a signature and time and again, he says, he tore it up and threw it away.

Finally, with the clock ticking on his data center, Paiva’s staff had to perform a massive “lift and shift” operation to keep services running, moving systems to the Amazon cloud. It wasn’t a pretty transition, he admits, but it got the job done without incident.

“Sometimes lift and shift actually makes sense,” Paiva told federal IT specialists at the Advanced Technology Academic Research Center’s (ATARC) Cloud and Data Center Summit. “Lift and shift actually gets you there, and for me that was the key – we had to get there.”

At first, he said, “we were no worse off or no better off.” With systems and processes that hadn’t been designed for cloud, however, costs were high. “But then we started doing the rationalization and we dropped our bill 40 percent. We were able to rationalize the way we used the service, we were able to start using more reserve things instead of ad hoc.”
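
Paiva’s point about using “more reserve things instead of ad hoc” is, at bottom, simple arithmetic. The sketch below uses purely hypothetical rates and fleet sizes (nothing drawn from ITA’s actual bill) to show why reserving capacity for steady workloads produces the kind of reduction he describes.

```python
# Illustrative only: hypothetical rates showing why shifting steady workloads
# from on-demand to reserved capacity can cut a cloud bill substantially.
ON_DEMAND_HOURLY = 0.20      # hypothetical on-demand rate, $/instance-hour
RESERVED_HOURLY = 0.12       # hypothetical effective rate with a 1-year reservation
HOURS_PER_YEAR = 24 * 365

def annual_cost(instances: int, hourly_rate: float) -> float:
    """Annual compute cost for a steady fleet of always-on instances."""
    return instances * hourly_rate * HOURS_PER_YEAR

fleet = 50  # steady-state servers that run around the clock
on_demand = annual_cost(fleet, ON_DEMAND_HOURLY)
reserved = annual_cost(fleet, RESERVED_HOURLY)
savings_pct = 100 * (on_demand - reserved) / on_demand

print(f"On-demand: ${on_demand:,.0f}  Reserved: ${reserved:,.0f}  Savings: {savings_pct:.0f}%")
```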

That rationalization included cutting out software and service licenses that duplicated other enterprise solutions. Microsoft Office 365, for example, provided every user with a OneDrive account in the cloud. Getting users to save their work there meant his team no longer had to support local storage and backup, and the move to shared virtual drives instead of local ones improved worker productivity.

With 226 offices around the world, offloading all that backup was significant. To date, all but a few remote locations have made the switch. Among the surprise benefits: happier users. Once they saw how much easier things were with shared drives that were accessible from anywhere, he says, “they didn’t even care how much money we were saving or how much more secure they were – they cared about how much more functional they suddenly became.”

Likewise, Office 365 provided Skype for Business, which meant the agency could eliminate expensive stand-alone conferencing services for additional savings.

Cost savings matter. Operating in the cloud, ITA’s annual IT costs per user are about $15,000 – less than half the average for the Commerce Department as a whole ($38,000/user/year), or the federal government writ large ($39,000/user/year), he said.

“Those are crazy high numbers,” Paiva says. “That is why I believe we all have to go to the cloud.”

In addition to Office 365, ITA uses Amazon Web Services (AWS) for infrastructure and Salesforce to manage the businesses it supports, along with several other cloud services.

“Government IT spending is out of freaking control,” Paiva says, noting that budget cuts provide incentive for driving change that might not come otherwise. “No one will make the big decisions if they’re not forced to make them.”

Architecture and Planning
If getting to the cloud is now a common objective, figuring out how best to make the move is unique to every user.

“When most organizations consider a move to the cloud, they focus on the ‘front-end’ of the cloud experience – whether or not they should move to the cloud, and if so, how will they get there,” says Srini Singaraju, chief cloud architect at General Dynamics Information Technology, a systems integrator. “However, organizations commonly don’t give as much thought to the ‘back-end’ of their cloud journey: the new operational dynamics that need to be considered in a cloud environment or how operations can be optimized for the cloud, or what cloud capabilities they can leverage once they are there.”

Rather than lift and shift and then start looking for savings, Singaraju advocates planning carefully what to move and what to leave behind. Designing systems and processes to take advantage of the cloud’s speed, while avoiding its potential pitfalls, not only makes the transition go more smoothly, it saves money over time.

“Sometimes it just makes more sense to retire and replace an application instead of trying to lift and shift,” Singaraju says. “How long can government maintain and support legacy applications that can pose security and functionality related challenges?”

The challenge is getting there. The number of cloud providers that have won provisional authority to operate under the 5-year-old Federal Risk and Authorization Management Program (FedRAMP) is still relatively small: just 86 with another 75 still in the pipeline. FedRAMP’s efforts to speed up the process are supposed to cut the time it takes to earn a provisional authority to operate (P-ATO) from as much as two years to as little as four months. But so far only three cloud providers have managed to get a product through FedRAMP Accelerated – the new, faster process, according to FedRAMP Director Matt Goodrich. Three more are in the pipeline with a few others lined up behind those, he said.

Once an agency or the FedRAMP Joint Authorization Board has authorized a cloud solution, other agencies can leverage their work with relatively little effort. But even then, moving an application from its current environment is an engineering challenge. Determining how to manage workflow and the infrastructure needed to make a massive move to the cloud work is complicated.

At ITA, for example, Paiva determined that cloud providers like AWS, Microsoft Office 365 and Salesforce had sufficient security controls in place that they could be treated as a part of his internal network. That meant user traffic could be routed directly to them, rather than through his agency’s Trusted Internet Connection (TIC). That provided a huge infrastructure savings because he didn’t have to widen that TIC gateway to accommodate all that routine work traffic, all of which in the past would have stayed inside his agency’s network.

Rather than a conventional “castle-and-moat” architecture, Paiva said he had to interpret the mandate to use the TIC “in a way that made sense for a borderless network.”

“I am not violating the mandate,” he said. “All my traffic that goes to the wild goes through the TIC. I want to be very clear about that. If you want to go to www-dot-name-my-whatever-dot-com, you’re going through the TIC. Office 365? Salesforce? Service Now? Those FedRAMP-approved, fully ATO’d applications that I run in my environment? They’re not external. My Amazon cloud is not external. It is my data center. It is my network. I am fulfilling the intent and letter of the mandate – it’s just that the definition of what is my network has changed.”
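
A minimal sketch of the routing decision Paiva describes, using invented host names rather than ITA’s actual configuration: traffic bound for FedRAMP-authorized services treated as part of the agency network goes direct, and everything else egresses through the TIC.

```python
# Hypothetical illustration of the routing decision described above:
# FedRAMP-authorized, ATO'd services are treated as part of the agency network
# and reached directly; everything else egresses through the TIC gateway.
TRUSTED_CLOUD_DOMAINS = {
    "outlook.office365.com",     # example SaaS endpoints; the real list would be
    "agency.my.salesforce.com",  # maintained by the agency's security team
    "agency.service-now.com",
}

def next_hop(destination_host: str) -> str:
    """Return 'direct' for authorized cloud services, 'tic' for the open internet."""
    if destination_host in TRUSTED_CLOUD_DOMAINS:
        return "direct"          # inside the (redefined) agency network boundary
    return "tic"                 # traffic "to the wild" goes through the TIC

assert next_hop("agency.my.salesforce.com") == "direct"
assert next_hop("www.example.com") == "tic"
```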

Todd Gagorik, senior manager for federal services at AWS, said this approach is starting to take root across the federal government. “People are beginning to understand this clear reality: If FedRAMP has any teeth, if any of this has any meaning, then let’s embrace it and actually use it as it’s intended to be used most efficiently and most securely. If you extend your data center into AWS or Azure, those cloud environments already have these certifications. They’re no different than your data center in terms of the certifications that they run under. What’s important is to separate that traffic from the wild.”

ATARC has organized a working group of government technology leaders to study the network boundary issue and recommend possible changes to the policy, said Tom Suder, ATARC president. “When we started the TIC, that was really kind of pre-cloud, or at least the early stages of cloud,” he said. “It was before FedRAMP. So like any policy, we need to look at that again.” Acting Federal CIO Margie Graves is a reasonable player, he said, and will be open to changes that make sense, given how much has changed since then.

Indeed, the whole concept of a network’s perimeter has been changed by the introduction of cloud services, Office of Management and Budget’s Grant Schneider, the acting federal chief information security officer (CISO), told GovTechWorks earlier this year.

Limiting what needs to go through the TIC and what does not could have significant implications for cost savings, Paiva said. “It’s not chump change,” he said. “That little architectural detail right there could be billions across the government that could be avoided.”

But changing the network perimeter isn’t trivial. “Agency CIOs and CISOs must take into account the risks and sensitivities of their particular environment and then ensure their security architecture addresses all of those risks,” says GDIT’s Singaraju. “A FedRAMP-certified cloud is a part of the solution, but it’s only that – a part of the solution. You still need to have a complete security architecture built around it. You can’t just go to a cloud service provider without thinking all that through first.”

Sheridan and others involved in the nascent Cloud Center of Excellence see the continued drive to the cloud as inevitable. “The world has changed,” he says. “It’s been 11 years since these things first appeared on the landscape. We are in exponential growth of technology, and if we hang on to our old ideas we will not continue. We will fail.”

His ad-hoc, unfunded group includes some 130 federal employees from 48 agencies and sub-agencies that operate independent of vendors, think tanks, lobbyists or others with a political or financial interest in the group’s output. “We are a group of people who are struggling to drive our mission forward and coming together to share ideas and experience to solve our common problems and help others to adopt the cloud,” Sheridan says. “It’s about changing the culture.”

How Feds Are Trying to Bring Order to Blockchain Mania

Blockchain hype is at a fever pitch. The distributed ledger technology is hailed as a cure for everything from identity management to electronic health records and securing the Internet of Things. Blockchain provides a secure, reliable record of transactions between independent parties, entities or companies. There are industry trade groups, a Congressional Blockchain Caucus and frequent panel discussions to raise awareness.
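
Behind the hype, the “reliable record” idea rests on a simple mechanism: each new entry commits to a hash of everything before it, so any tampering is detectable by every party holding a copy. The following is a generic, minimal sketch of that hash-chain idea, not any agency’s implementation; the sample entries are invented.

```python
import hashlib
import json

def block_hash(index: int, prev_hash: str, payload: dict) -> str:
    """Hash a block's contents together with the previous block's hash."""
    record = json.dumps({"i": index, "prev": prev_hash, "data": payload}, sort_keys=True)
    return hashlib.sha256(record.encode()).hexdigest()

def append(chain: list, payload: dict) -> None:
    """Add a new entry whose hash commits to the entire prior history."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"i": len(chain), "prev": prev, "data": payload,
                  "hash": block_hash(len(chain), prev, payload)})

def verify(chain: list) -> bool:
    """Every replica can re-check the links; one altered entry fails everywhere."""
    for i, blk in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else "0" * 64
        if blk["prev"] != prev or blk["hash"] != block_hash(i, prev, blk["data"]):
            return False
    return True

ledger = []
append(ledger, {"event": "design file released to depot"})   # invented example entries
append(ledger, {"event": "file received and queued to print"})
assert verify(ledger)
ledger[0]["data"]["event"] = "tampered"   # any edit breaks the chain
assert not verify(ledger)
```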

Federal agencies are plunging ahead, both on their own and in concert with the General Services Administration’s Emerging Citizen Technology Office (GSA ECTO). The office groups blockchain with artificial intelligence and robotic automation, social and collaborative technologies, and virtual and augmented reality as its four most critical technologies. Its goal: Develop use cases and roadmaps to hasten government adoption and success with these new technologies.

“There’s a number of people who assume that fed agencies aren’t looking at things like blockchain,” Justin Herman, emerging technology lead and evangelist at GSA ECTO, told a gathering at the State of the Net Conference held Jan. 29 in Washington, D.C. “We got involved in blockchain because there were so many federal agencies coming to the table demanding government wide programs to explore the technology. People had already done analysis on what specific use cases they thought they had and wanted to be able to invest in it.”

Now his office is working with more than 320 federal, state and local agencies interested in one or more of its four emerging tech categories. “A lot of that is blockchain,” Herman said. “Some have already done successful pilots. We hear identity management, supply chain management…. We should be exploring those things together, not in little silos, not in walled gardens, but in public.”

Among those interested:

  • The Joint Staff’s J4 Logistics Directorate and the Deputy Assistant Secretary of Defense for Maintenance, Policy and Programs are collaborating on a project to create a digital supply chain, enabled by additive manufacturing (also known as 3-D printing). Blockchain’s role would be to secure the integrity of 3-D printing files, seen as “especially vulnerable to cyber threats and intrusions.” The Navy is looking at the same concept. “The ability to secure and securely share data throughout the manufacturing process (from design, prototyping, testing, production and ultimately disposal) is critical to Additive Manufacturing and will form the foundation for future advanced manufacturing initiatives,” writes Lt. Cmdr. Jon McCarter, a member of the Fiscal 2017 Secretary of the Navy Naval Innovation Advisory Council (NIAC).
  • The Office of the Undersecretary of Defense for Acquisition, Technology and Logistics (OUSD (AT&L)) Rapid Reaction Technology Office (RRTO) has similar designs on blockchain, seeing it as a potential solution for ensuring data provenance, according to a solicitation published Jan. 29.
  • The Centers for Disease Control’s Center for Surveillance, Epidemiology and Laboratory Services is interested in using blockchain for public health tracking, such as maintaining a large, reliable, current and shared database of opioid abuse or managing health data during crises. Blockchain’s distributed ledger system ensures that when one user updates the chain, everyone sees the same data, solving a major shortfall today, when researchers are often working with different versions of the same or similar data sets, rather than the same, unified data.
  • The U.S. Food and Drug Administration has similar interests in sharing health data for large-scale clinical trials.
  • The Office of Personnel Management last fall sought ideas for how to create a new consolidated Employee Digital Record that would track an employee’s skills, performance and experience over the course of an entire career, using blockchain as a means to ensure records are up to date and to speed the process of transfers from one agency to another.

Herman sees his mission as bringing agencies together so they can combine expertise and resources and more quickly make progress. “There are multiple government agencies right now exploring electronic health records with blockchain,” he said. “But we can already see the hurdles with this because they are separate efforts, so we’re adding barriers. We’ve got to design new and better ways to move across agencies, across bureaucracies and silos, to test, evaluate and adopt this technology. It should be eight agencies working together on one pilot, not eight separate pilots on one particular thing.”

The Global Blockchain Business Council (GBBC) is an industry group advocating for blockchain technology and trying to take a similar approach in the commercial sector to what GSA is doing in the federal government. “We try to break down these traditionally siloed communities,” said Mercina Tilleman-Dick, chief operating officer for the GBBC.

These days, that means trying to get people together to talk about standards and regulation and connecting those who are having success with others just beginning to think about such issues. “Blockchain is not going to solve every problem,” Tilleman-Dick said. It could prove effective in a range of use cases where secure, up-to-date, public records are essential.

Take property records, for example. The Republic of Georgia moved all its land titles onto a blockchain-based system in 2017, Sweden is exploring the idea and the city of South Burlington, Vt., is working on a blockchain pilot for local real estate transactions. Patrick Byrne, founder of Overstock.com and its subsidiary Medici Ventures, announced in December he’s funding a startup expressly to develop a global property registry system using blockchain technology.

“I think over the next decade it will fundamentally alter many of the systems that power everyday life,” GBBC’s Tilleman-Dick said.

“Blockchain has the potential to revolutionize all of our supply chains. From machine parts to food safety,” said Adi Gadwale, chief enterprise architect for systems integrator General Dynamics Information Technology. “We will be able to look up the provenance and history of an item, ensuring it is genuine and tracing the life of its creation along the supply chain.

“Secure entries, immutable and created throughout the life of an object, allow for secure sourcing, eliminate fraud, forgeries and ensure food safety,” Gadwale said. “Wal-Mart has already begun trials of blockchain with food safety in mind.”

Hilary Swab Gawrilow, legislative director and counsel in the office of Rep. Jared Polis (D-Colo.), who is among the Congressional Blockchain Caucus leaders, said the government needs to do more to facilitate understanding of the technology. The rapid rise in the value of bitcoin, along with wild fluctuations and speculation in digital cryptocurrencies, has done much to raise awareness. Yet it does not necessarily instill confidence in the concepts behind blockchain and distributed ledger technology.

“There are potential government applications or programs that deserve notice and study,” Swab Gawrilow said.

Identity management is a major challenge for agencies today. Citizens may have accounts with multiple agencies, and finding a way to verify status without building complicated links between disparate systems to enable benefits or confirm program eligibility would be valuable. The same is true for program accountability: “Being able to verify transactions would be another great way to use blockchain technology.”

That’s where the caucus is coming from: A lot of this is around education. Lawmakers have all heard of bitcoin, whether in a positive or negative way. “They understand what it is,” Gawrilow said. “But they don’t necessarily understand the underlying technology.” The caucus’ mission is to help inform the community.

Like GSA’s Herman, Gawrilow favors agency collaboration on new technology projects and pilots. “HHS did a hackathon on blockchain. The Postal Service put out a paper, and State is doing something. DHS is doing something. It’s every agency almost,” she said. “We’ve kicked around the idea of asking the administration to start a commission around blockchain.”

That, in turn, might surface issues requiring legislative action – “tweaks to the law” that underlie programs, such as specifications on information access, or a prescribed means of sharing or verifying data. That’s where lawmakers could be most helpful.

Herman, for his part, sees GSA as trying to fill that role, and to fill it in such a way that his agency can tie together blockchain and other emerging and maturing technologies. “It’s not the technology, it’s the culture,” he said. “So much in federal tech is approached as some zero-sum game, that if an agency is dedicating time to focus and investigate a technology like blockchain, people freak out because they’re not paying attention to cloud or something else.”

Agencies need to pool resources and intelligence, think in terms of shared services and shared approaches, break down walls and look holistically at their challenges to find common ground.

That’s where the payoff will come. Otherwise, Herman asks, “What does it matter if the knowledge developed isn’t shared?”

Relocatable Video Surveillance Systems Give CBP Flexibility on Border

Illegal border crossings fell to their lowest level in at least five years in 2017, but after plunging through April, the numbers have risen each of the past eight months, according to U.S. Customs and Border Protection (CBP).

Meanwhile, the debate continues: Build a physical wall spanning from the Gulf of Mexico to the Pacific Ocean, add more Border Patrol agents or combine better physical barriers with technology to stop drug trafficking, smuggling and illegal immigration?

Increasingly, however, it’s clear no one solution is right for everyplace. Ron Vitiello, acting deputy commissioner at CBP, said the agency intends to expand on the existing 652 miles of walls and fencing now in place – but not necessarily extend the wall the entire length of the border.

“We’re going to add to fill some of the gaps we didn’t get in the [previous] laydown, and then we’re going to prioritize some new wall [construction] across the border in places where we need it the most,” he said in a Jan. 12 TV interview.

Walls and barriers are a priority, Vitiello said in December at a CBP press conference. “In this society and all over our lives, we use walls and fences to protect things,” he said. “It shouldn’t be any different on the border.…  But we’re still challenged with access, we’re still challenged with situational awareness and we’re still challenged with security on that border. We’re still arresting nearly 1,000 people a day.

“So we want to have more capability: We want more agents, we want more technology and we want that barrier to have a safer and more secure environment.”

Among the needs: Relocatable Remote Video Surveillance Systems (R-RVSS) that can be picked up and moved to where they’re needed most as border activity ebbs and flows in response to CBP’s border actions.

CBP mapped its fencing against its 2017 apprehension record in December (see map), finding that areas with physical fencing, such as near the metropolitan centers of San Diego/Tijuana, Tucson/Nogales and El Paso/Juarez are just as likely to see illegal migration activity as unfenced areas in the Laredo/Nueva Laredo area.

Source: U.S. Customs and Border Protection

Rep. Will Hurd (R-Tex.), vice chairman of the House Homeland Security subcommittee on Border and Maritime Security, is an advocate for technology as both a complement to and an alternative to physical walls and fences. “A wall from sea to shining sea is the least effective and most expensive solution for border security,” he argued Jan. 16. “This is especially true in areas like Big Bend National Park, where rough terrain, natural barriers and the remoteness of a location render a wall or other structure impractical and ineffective.”

CBP has successfully tested and deployed video surveillance systems to enhance situational awareness on the border and help Border Patrol agents track and respond to incursions. These RVSS systems use multiple day and night sensors mounted on poles to create an advance warning and tracking system that identifies potential border-crossing activity. Officers can monitor those sensor feeds remotely and dispatch agents as needed.

Savvy smugglers are quick to adjust when CBP installs new technologies, shifting their routes to less-monitored areas. The new, relocatable RVSS systems (R-RVSS) make it easy for CBP to respond in kind, forcing smugglers and traffickers to constantly adapt.

Robert Gilbert, a former Border Patrol sector chief at CBP and now a senior program director for RVSS at systems integrator General Dynamics Information Technology (GDIT), says relocatable systems will empower CBP with new tools and tactics. “Over the past 20 or 30 years, DOJ then CBP has always deployed technology into the busiest areas along the border, the places with the most traffic. In reality, because of the long procurement process, we usually deployed too late as the traffic had shifted to other locations on the border. The big difference with this capability is you can pick it up and move it to meet the evolving threat. The technology can be relocated within days.”

GDIT fielded a three-tower system in CBP’s Laredo (Texas) West area last summer and a similar setup in McAllen, Texas, in December. The towers – set two to five miles apart – were so effective, CBP is now preparing to buy up to 50 more units to deploy in the Rio Grande sector, where the border follows the river through rugged terrain. There, a physical wall may not be viable, while a technology-based virtual wall could prove highly effective.

Each tower includes an 80-foot-tall collapsible pole that can support a sensor and communications payload weighing up to 2,000 pounds. While far in excess of current needs, it provides a growth path to hanging additional sensors or communications gear if requirements change later on.

When CBP wants to move the units, the poles collapse, the sensors pack away and a standard 3/4- or 1-ton pickup truck can haul the system to its next location.

Roughly two-thirds of the U.S.-Mexico border runs through land not currently owned by the federal government, a major hurdle when it comes to building permanent infrastructure like walls or even fixed-site towers. Land acquisition would add billions to the cost even if owners agree to the sale. Where owners decline, the government might still be able to seize the land under the legal procedure known as eminent domain, but such cases can take years to resolve.

By contrast, R-RVSS requires only a temporary easement from the land owner. Site work is bare bones: no concrete pad, just a cleared area measuring roughly 40 feet by 40 feet. It need not be level – the R-RVSS system is designed to accommodate slopes up to 10 degrees. Where grid power is unavailable – likely in remote areas – a generator or even a hydrogen fuel cell can produce needed power.

What’s coming next
CBP seeks concepts for a Modular Mobile Surveillance System (M2S2) similar to RVSS, which would provide the Border Patrol with an even more rapidly deployable system for detecting, identifying, classifying and tracking “vehicles, people and animals suspected of unlawful border crossing activities.”

More ambitiously, CBP also wants such systems to incorporate data science and artificial intelligence to add a predictive capability. The system would “detect, identify, classify, and track equipment, vehicles, people, and animals used in or suspected of unlawful border crossing activities,” and employ AI to help agents anticipate their direction so they can quickly respond to and resolve each situation.
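
At its simplest, “anticipating direction” is a track-extrapolation problem. The deliberately minimal, notional sketch below extrapolates a contact’s recent position fixes at constant velocity; an operational system would rely on far richer models and data than this.

```python
# Notional illustration only: predict where a tracked contact will be a few
# minutes out by extrapolating its recent positions (constant-velocity model).
from typing import List, Tuple

def predict(track: List[Tuple[float, float, float]], lead_seconds: float) -> Tuple[float, float]:
    """track: (timestamp_s, x_m, y_m) fixes, oldest first. Returns predicted (x, y)."""
    (t0, x0, y0), (t1, x1, y1) = track[-2], track[-1]
    dt = t1 - t0
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return x1 + vx * lead_seconds, y1 + vy * lead_seconds

fixes = [(0.0, 0.0, 0.0), (60.0, 80.0, 15.0)]   # two fixes, one minute apart
print(predict(fixes, 300.0))                    # estimated position five minutes ahead
```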

At the same time, CBP is investigating RVSS-like systems for coastal areas. Deploying pole-mounted systems would train their sensors to monitor coastal waters, where smugglers in small boats seek to exploit the shallows by operating close to shore, rather than the deeper waters patrolled by Coast Guard and Navy ships.

In a market research request CBP floated last June, the agency described a Remote Surveillance System Maritime (RSS-M) as “a subsystem in an overall California Coastal Surveillance demonstration.” The intent: to detect, track, identify, and classify surface targets of interest, so the Border Patrol and partner law enforcement agencies can interdict such threats.

Legislating Tech
Rep. Hurd, Rep. Peter Aguilar (D-Calif.) and a bipartisan group of 49 other members of Congress support the “Uniting and Securing America Act of 2017,” or “USA Act.” The measure includes a plan to evaluate every mile of the U.S.-Mexico border to determine the best security solution for each segment. After weeks of Senate wrangling over immigration matters, Sens. John McCain (R-Ariz.) and Chris Coons (D-Del.) offered a companion bill in the Senate on Feb. 5.

With 820 miles of border in his district, Hurd says, few in Congress understand the border issue better than he – or feel it more keenly.

“I’m on the border almost every weekend,” he said when unveiling the proposal Jan. 16. The aim: “Full operational control of our border by the year 2020,” Hurd told reporters. “We should be able to know who’s going back and forth across our border. The only way we’re going to do that is by border technologies.” And in an NPR interview that day, he added: “We should be focused on outcomes. How do we get operational control of that border?”

The USA Act would require the Department of Homeland Security to “deploy the most practical and effective technology available along the United States border for achieving situational awareness and operational control of the border” by Inauguration Day 2021, including radar surveillance systems; Vehicle and Dismount Exploitation Radars (VADER); three-dimensional, seismic acoustic detection and ranging border tunneling detection technology; sensors, unmanned cameras, drone aircraft and anything else that proves more effective or advanced. The technology is seen as complementing and supporting hard infrastructure.

Is Identity the New Perimeter? In a Zero-Trust World, More CISOs Think So

As the network perimeter morphs from physical to virtual, the old Tootsie Pop security model – hard shell on the outside with a soft and chewy center – no longer works. The new mantra, as Mittal Desai, chief information security officer (CISO) at the Federal Energy Regulatory Commission, said at the ATARC CISO Summit: “Never trust, double verify.”

The zero-trust model modernizes conventional network-based security for a hybrid cloud environment. As agencies move systems and storage into the cloud, networks are virtualized and security naturally shifts to users and data. That’s easy enough to do in small organizations, but rapidly grows harder with the scale and complexity of an enterprise.

The notion of zero-trust security first surfaced five years ago in a Forrester Research report prepared for the National Institute of Standards and Technology (NIST). “The zero-trust model is simple,” Forrester posited then. “Cybersecurity professionals must stop trusting packets as if they were people. Instead, they must eliminate the idea of a trusted network (usually the internal network) and an untrusted network (external networks). In zero-trust, all network traffic is untrusted.”

Cloud adoption by its nature is forcing the issue, said Department of Homeland Security Chief Technology Officer Mike Hermus, speaking at a recent Tech + Tequila event: “It extends the data center,” he explained. “The traditional perimeter security model is not working well for us anymore. We have to work toward a model where we don’t trust something just because it’s within our boundary. We have to have strong authentication, strong access control – and strong encryption of data across the entire application life cycle.”

Indeed, as other network security features mature, identity – and the access that goes with it – is now the most common cybersecurity attack vector. Hackers favor phishing and spear-phishing attacks because they’re inexpensive and effective – and the passwords they yield are like the digital keys to an enterprise.

About 65 percent of breaches cited in Verizon’s 2017 Data Breach Investigations Report made use of stolen credentials.

Interestingly however, identity and access management represent only a small fraction of cybersecurity investment – less than 5 percent – according to Gartner’s market analysts. Network security equipment by contrast, constitutes more than 12 percent. Enterprises continue to invest in the Tootsie Pop model even as its weaknesses become more evident.

“The future state of commercial cloud computing makes identity and role-based access paramount,” said Rob Carey, vice president for cybersecurity and cloud solutions within the Global Solutions division at General Dynamics Information Technology (GDIT). Carey recommends creating both a framework for better understanding the value of identity management tools, and metrics to measure that impact. “Knowing who is on the network with a high degree of certainty has tremendous value.”

Tom Kemp, chief executive officer at Centrify, which provides cloud-based identity services, has a vested interest in changing that mix. Centrify, based in Sunnyvale, Calif., combines identity data with location and other information to help ensure only authorized, verified users access sensitive information.

“At the heart of zero-trust is the realization that an internal user should be treated just like an external user, because your internal network is just as polluted as your outside network,” Kemp said at the Feb. 7 Institute for Critical Infrastructure (ICIT) Winter Summit. “You need to move to constant verification.” Reprising former President Ronald Reagan’s “trust but verify” mantra, he adds: “Now it’s no trust and always verify. That’s the heart of zero-trust.”

The Google Experience
When Google found itself hacked in 2009, the company launched an internal project to find a better way to keep hackers out of its systems. Instead of beefing up firewalls and tightening virtual private network settings, Google’s BeyondCorp architecture dispensed with the Tootsie Pop model in which users logged in and then gained access to all manner of systems and services.

In its place, Google chose to implement a zero-trust model that challenges every user and every device on every data call – regardless of how that user accessed the internet in the first place.

While that flies in the face of conventional wisdom, Google reasoned that by tightly controlling the device and user permissions to access data, it had found a safer path.

Here’s an example of how that works when an engineer with a corporate-issued laptop wants to access an application from a public Wi-Fi connection (a simplified code sketch follows the steps):

  1. The laptop provides its device certificate to an access proxy.
  2. The access proxy confirms the device, then redirects to a single-sign-on (SSO) system to verify the user.
  3. The engineer provides primary and second-factor authentication credentials and, once authenticated by the SSO system, is issued a token.
  4. Now, with the device certificate to identify the device and the SSO token to identify the user, an Access Control Engine can perform a specific authorization check for every data access. The user must be confirmed to be in the engineering group; to possess a sufficient trust level; and to be using a managed device in good standing with a sufficient trust level.
  5. If all checks pass, the request is passed to an appropriate back-end system and the data access is allowed. If any of the checks fail however, the request is denied. This is repeated every time the engineer tries to access a data item.
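
Below is a simplified sketch of the per-request check in steps 4 and 5, using invented field names and thresholds rather than Google’s actual Access Control Engine. The point is the shape of the logic: every data call re-verifies device, user, group membership and trust level, with no implicit trust.

```python
# Hypothetical, simplified version of the per-request authorization check in
# steps 4 and 5: every data call re-verifies device, user, group and trust level.
from dataclasses import dataclass

@dataclass
class Device:
    cert_valid: bool
    managed: bool
    trust_level: int   # e.g., 0 (unknown) .. 3 (fully managed, healthy)

@dataclass
class User:
    sso_token_valid: bool
    groups: frozenset
    trust_level: int

def authorize(device: Device, user: User, required_group: str, required_trust: int) -> bool:
    """Run on every request; any failed check denies access (no implicit trust)."""
    return (device.cert_valid
            and device.managed
            and device.trust_level >= required_trust
            and user.sso_token_valid
            and required_group in user.groups
            and user.trust_level >= required_trust)

engineer = User(sso_token_valid=True, groups=frozenset({"engineering"}), trust_level=3)
laptop = Device(cert_valid=True, managed=True, trust_level=3)
assert authorize(laptop, engineer, "engineering", required_trust=2)
```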

“That’s easy enough when those attributes are simple and clear cut, as with the notional Google engineer,” said GDIT’s Carey, who spent three decades managing defense information systems. “But it gets complicated in a hurry if you’re talking about an enterprise on the scale of the Defense Department or Intelligence community.”

Segmenting the Sprawling Enterprise
A takeaway from 9/11 was that intelligence agencies needed to be better and faster at sharing threat data across agency boundaries. Opening databases across agency divisions, however, had consequences: Chelsea Manning, at the time Pfc. Bradley Manning, delivered a treasure trove of stolen files to WikiLeaks and then a few years later, Edward Snowden stole countless intelligence documents, exposing a program designed to collect metadata from domestic phone and email records.

“The more you want to be sure each user is authorized to see and access only the specific data they have a ‘need-to-know,’ the more granular the identity and access management schema need to be,” Carey said. “Implementing role-based access is complicated because you’ve got to develop ways to both tag data and code users based on their authorized need. Absent a management schema, that can quickly become difficult to manage for all but the smallest applications.”

Consider a scenario of a deployed military command working in a multinational coalition with multiple intelligence agencies represented in the command’s intelligence cell. The unit commands air and ground units from all military services, as well as civilians from defense, intelligence and possibly other agencies. Factors determining individual access to data might include the person’s job, rank, nationality, location and security clearance. Some missions might include geographic location, but others can’t rely on that factor because some members of the task force are located thousands of miles away, or operating from covert locations.

That scenario gets even more complicated in a hybrid cloud environment where some systems are located on premise, and others are far away. Managing identity-based access gets harder anyplace where distance or bandwidth limitations cause delays. Other integration challenges include implementing a single-sign-on solution across multiple clouds, or sharing data by means of an API.

Roles and Attributes
To organize access across an enterprise – whether in a small agency or a vast multi-agency system such as the Intelligence Community Information Technology Enterprise (IC ITE) – information managers must make choices. Access controls can be based on individual roles – such as job level, function and organization – or data attributes – such as type, source, classification level and so on.

“Ultimately, these are two sides of the same coin,” Carey said. “The real challenge is the mechanics of developing the necessary schema to a level of granularity that you can manage, and then building the appropriate tools to implement it.”
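
To make that concrete, here is a minimal, hypothetical sketch of such a schema: data objects carry tags, users carry attributes, and a policy function grants or denies each access. The attribute names and values are invented for illustration; real schemas for something like a coalition intelligence cell would carry many more dimensions and be enforced by dedicated policy engines.

```python
# Illustrative only: a toy attribute-based access policy of the kind described
# above. Field names and values are invented; real deployments use far richer
# schemas and dedicated policy-enforcement tools.
from dataclasses import dataclass

CLEARANCE_ORDER = {"UNCLASSIFIED": 0, "SECRET": 1, "TOP SECRET": 2}

@dataclass
class DataObject:
    classification: str          # tag applied when the data is created
    releasable_to: frozenset     # e.g., {"USA", "GBR"}
    topic: str                   # e.g., "logistics"

@dataclass
class Subject:
    clearance: str
    nationality: str
    roles: frozenset             # e.g., {"intel-analyst"}

def can_access(subject: Subject, obj: DataObject, required_role: str) -> bool:
    """Grant access only if clearance, releasability and role all line up."""
    return (CLEARANCE_ORDER[subject.clearance] >= CLEARANCE_ORDER[obj.classification]
            and subject.nationality in obj.releasable_to
            and required_role in subject.roles)

report = DataObject("SECRET", frozenset({"USA", "GBR", "AUS"}), "logistics")
analyst = Subject("TOP SECRET", "GBR", frozenset({"intel-analyst"}))
assert can_access(analyst, report, "intel-analyst")
```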

For example, the Defense Department intends to use role-based access controls for its Joint Information Enterprise (JIE), using the central Defense Manpower Data Center (DMDC) personnel database to connect names with jobs. The available fields in that database are, in effect, the limiting factors on just how granular role-based access controls can be under JIE.

Access controls will only be one piece of JIE’s enterprise security architecture. Other features, ranging from encryption to procedural controls that touch everything from the supply chain to system security settings, will also contribute to overall security.

Skeptical of Everything
Trust – or the lack of it – plays out in each of these areas, and requires healthy skepticism at every step. Rod Turk, CISO at the Department of Commerce, said CISOs need to be skeptical of everything. “I’m talking about personnel, I’m talking about relationships with your services providers,” he told the ATARC CISO Summit.  “We look at the companies we do business with and we look at devices, and we run them through the supply chain.  And I will tell you, we have found things that made my hair curl.”

Commerce’s big push right now is the Decennial Census, which will collect volumes of personal information (PI) and personally identifiable information (PII) on almost every living person in the United States. Conducting a census every decade is like doing a major system reset each time. The next census will be no different, employing mobile devices for census takers and for the first time, allowing individuals to fill out census surveys online. Skepticism is essential because the accuracy of the data depends on the public’s trust in the census.

In a sense, that’s the riddle of the whole zero-trust concept: In order to achieve a highly trusted outcome, CISOs have to start with no trust at all.

Yet trust also cuts in the other direction. Today’s emphasis on modernization and migration to the cloud means agencies face tough choices. “Do we in the federal government trust industry to have our best interests in mind to keep our data in the cloud secure?” Turk asked rhetorically.

In theory, the Federal Risk and Authorization Management Program (FedRAMP) establishes baseline requirements for trust, but doubts persist. What satisfies one agency’s requirements may not satisfy another’s. Compliance with FedRAMP or NIST controls equates to risk management rather than actual security, GDIT’s Carey points out. They’re not the same thing.

Identity and Security
Beau Houser, CISO at the Small Business Administration, is encouraged by the improvements he’s seen as compartmentalized legacy IT systems are replaced with centralized, enterprise solutions in a Microsoft cloud.

“As we move to cloud, as we roll out Windows 10, Office 365 and Azure, we’re getting all this rich visibility of everything that’s happening in the environment,” he said. “We can now see all logins on every web app, whether that’s email or OneDrive or what have you, right on the dashboard. And part of that view is what’s happening over that session: What are they doing with email, where are they moving files.… That’s visibility we didn’t have before.”

Leveraging that visibility effectively extends that notion of zero-trust one step further, or at least shifts it into the realm of a watchful parent rather than one who blindly trusts his teenage children. The watchful parent believes trust is not a right, but an earned privilege.

“Increased visibility means agencies can add behavioral models to their security controls,” Carey said. “Behavioral analysis tools that can match behavior to what people’s roles are supposed to be and trigger warnings if people deviate from expected norms, is the next big hurdle in security.”
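
A deliberately crude sketch of that idea follows, using a user’s own history as the baseline and flagging large deviations. Commercial behavioral-analytics tools are far more sophisticated; the metric and thresholds here are invented for illustration.

```python
# Illustrative only: flag activity that deviates sharply from a user's own
# historical baseline -- a crude stand-in for commercial behavioral analytics.
from statistics import mean, pstdev

def is_anomalous(history: list, todays_count: int, sigmas: float = 3.0) -> bool:
    """Return True if today's activity is far outside the user's normal range."""
    mu = mean(history)
    sd = pstdev(history) or 1.0          # avoid a zero spread on flat history
    return abs(todays_count - mu) > sigmas * sd

downloads_last_30_days = [4, 6, 5, 7, 3, 5, 6, 4, 5, 6] * 3   # typical: ~5 per day
assert not is_anomalous(downloads_last_30_days, 7)             # within normal range
assert is_anomalous(downloads_last_30_days, 250)               # would trigger a warning
```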

As Christopher Wlaschin, CISO at the Department of Health and Human Services, says: “A healthy distrust is a good thing.”

New Framework Defines Cyber Security Workforce Needs

Both the federal government and its contractors are locked in a battle for talent with commercial providers, each vying for the best personnel in critical areas of cybersecurity, and each dealing with a shortage of available talent.

Both would benefit from targeted investment in education and increased standardization to define the skills and knowledge required for different kinds of jobs – and now the National Institute of Standards and Technology (NIST) has taken a big step to help make that happen.

NIST published a framework for the future cybersecurity workforce this week, Special Publication 800-181, the culmination of years of effort under the National Initiative for Cybersecurity Education (NICE).

The framework defines “a common, consistent lexicon to describe cybersecurity work by category, specialty area, and work role,” and details the necessary knowledge, skills and abilities (KSAs) and tasks performed by individuals in each kind of job. The framework defines cyber operations jobs in seven operational categories and 32 job specialties.
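
One practical payoff of a common lexicon is that work roles become structured data that job postings, training plans and HR systems can share. The sketch below illustrates the idea only; the class design is invented and the field values are paraphrased examples, not text quoted from SP 800-181.

```python
# Illustrative only: representing a NICE-style work role as structured data so
# employers and educators can share one vocabulary. Field values are paraphrased
# examples, not text lifted from SP 800-181.
from dataclasses import dataclass, field
from typing import List

@dataclass
class WorkRole:
    category: str                 # one of the framework's high-level categories
    specialty_area: str
    role_name: str
    tasks: List[str] = field(default_factory=list)
    ksas: List[str] = field(default_factory=list)   # knowledge, skills, abilities

role = WorkRole(
    category="Protect and Defend",
    specialty_area="Incident Response",
    role_name="Cyber Defense Incident Responder",
    tasks=["Triage and escalate suspected intrusions"],
    ksas=["Knowledge of incident-handling methodologies"],
)
print(f"{role.role_name}: {len(role.ksas)} KSAs, {len(role.tasks)} tasks")
```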

The aim is that everyone – employers, educators, trainers and cyber professionals – will be able to leverage that common language into a better understanding of the existing workforce and the knowledge gaps that need to be filled.

“Building the future workforce is a priority for all of us,” said Stan Tyliszczak, vice president and chief engineer at General Dynamics Information Technology, a systems integrator in Fairfax, Va. “Government, industry and academia all share in this problem. Having a common language we can use to understand each other will help employers explain their requirements and help educators deliver on those needs.”

It’s been a long time coming. A 2015 report on the cyber workforce – Increasing the Effectiveness of the Federal Role in Cybersecurity Education – concluded the government needed to make a host of changes to assure access to a skilled cyber workforce, said David Wennergren, until recently senior vice president for technology at the Professional Services Council, former assistant deputy chief management officer at the Defense Department and a one-time chief information officer for the Navy. Wennergren led the investigation.

The report examined two government-funded programs – the National Centers of Academic Excellence in Information Assurance/Cyber Defense (CAEs), funded by the National Security Agency (NSA) and the Department of Homeland Security (DHS); and the CyberCorps Scholarship for Service (SFS) program managed by the National Science Foundation (NSF) – and concluded they each needed:

  • More hands-on education. “We have to get people in the labs actually using tools and demonstrating proficiencies, not just doing text-book type work,” said Wennergren.
  • Assurance that “we are delivering students who are competent and can do the jobs without additional training to organizations,” Wennergren said.
  • Focus on the entire public sector – federal, state, local, tribal and territorial governments.
  • Expand programs to include qualified two-year degrees at community colleges. Not all cybersecurity jobs require a four-year degree and military members who have both technical training and practical experience may already have the skills needed to perform critical cyber functions in non-military settings.
  • The entire federal sector needs cyber skills, not just defense and intelligence agencies. The CAE program should embrace the entire federal sector.

Two bills now working their way through Congress build on some of those concepts, particularly the potential for two-year degrees as a means of lowering barriers to entry to this critical part of the workforce.

The Department of Defense Cyber Scholarship Program Act of 2017, a bipartisan bill co-sponsored by Sen. Mike Rounds (R-S.D.), chairman of the Senate Armed Services Committee’s Subcommittee on Cybersecurity, and Sen. Tim Kaine (D-Va.), seeks to provide $10 million in scholarship funds, at least $500,000 of that to fund two-year degree-level programs.

A second bipartisan measure, the Cyber Scholarship Opportunities Act of 2017, co-sponsored by Kaine, Sen. Roger Wicker (R-Miss.), Sen. Patty Murray (D-Wash.), and Sen. David Perdue (R-Ga.), would amend the Cybersecurity Enhancement Act of 2014 by setting aside at least 5 percent of federal cyber scholarship-for-service funds for two-year degree programs, either for military veterans, students pursuing careers in cybersecurity via associates’ degrees in that discipline or students who already have bachelors’ degrees.

Although the Wennergren report’s recommendations focused on federal programs, the concepts apply equally to federal contractors, Wennergren said.

“Clearly both industry and government would benefit from improvements in how cyber is taught in academic institutions [and] how we measure the successful development and placement of students,” he said, adding both will also benefit from the wide adoption of the NICE workforce standards in which government, academia, and the private sector collaborated.

Workforce Shortfall By the Numbers
According to Cyberseek.org, a joint project of NICE, Burning Glass Technologies and CompTIA, there are more than 299,000 cybersecurity job vacancies in the United States today, representing about 28 percent of all U.S. cyber jobs. For some of those jobs, there are as many openings – or more – as there are certified, qualified candidates to fill them – even though such people are almost all employed. For example, there are 69,549 individuals who have earned Certified Information Systems Security Professionals (CISSP) status. But there are 76,336 openings for people with CISSPs.

The most common cyber certification is CompTIA Security+, with more than 167,000 people holding that certification. But there are still more than 33,000 openings for such people, meaning a significant shortage remains.
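
The arithmetic behind those gaps is straightforward; the short sketch below works through it using the figures quoted in this article (current numbers will differ).

```python
# The shortfall arithmetic behind the figures quoted above (Cyberseek.org data
# as cited in this article; current numbers will differ).
cissp_holders, cissp_openings = 69_549, 76_336
total_openings, vacancy_share = 299_000, 0.28

cissp_gap = cissp_openings - cissp_holders
total_cyber_jobs = round(total_openings / vacancy_share)

print(f"CISSP shortfall even if every holder filled an opening: {cissp_gap:,}")
print(f"Implied total U.S. cyber jobs (filled plus open): ~{total_cyber_jobs:,}")
```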

Rodney Peterson, NIST’s director for NICE, called the cyber workforce the “key enabler” of the future of the nation’s cyber security in a recent interview with Federal News Radio’s Tom Temin.

“We’re clearly building momentum to promote and energize a robust and integrated ecosystem of cybersecurity education, training and workforce development,” he said on Temin’s Federal Drive program. “I think it’s that momentum that both allows us to create a community across both the public and private sector.  That NICE workforce framework really provides a common way to think about cybersecurity work, a taxonomy, a reference tool that can really help align our diverse and complex community together toward a common vision.”

New Cyber Standards for IoT Ease – But Won’t Solve – Security Challenge

The first independent standard for cybersecurity for the Internet of Things (IoT) was approved earlier this month, following two years of debate and discussion over how to measure and secure such devices.

The American National Standards Institute (ANSI) approved UL 2900-1, General Requirements for Software Cybersecurity for Network-Connectable Products, as a standard on July 5. The effort was spearheaded by Underwriters Laboratories (UL), which is preparing two more standards to follow: UL 2900-2-1, which defines requirements for network-connectable components of healthcare systems, and UL 2900-2-2, which does the same for industrial control systems.

The three establish the first standard security protocols for software-controlled IoT devices, such as access controls, industrial controls for lighting and mechanical systems, internet-connected medical devices and more. They also offer potential answers to major worries about the lack of security built into such devices thus far.

The Internet of Things promises unparalleled opportunities to track, control and manage everything from lights and security cameras to pacemakers and medication delivery systems. But concerns about security, driven by real-world events in which unsecured IoT devices were co-opted in coordinated botnet attacks, have raised anxiety levels about the risks posed by connecting so many devices to the internet.

Those concerns have prompted government leaders from the Pentagon to Congress to call on industry to embrace security standards as a mark of quality and establish voluntary independent testing programs to assure customers that products are safe. The underlying warning: Either industry figures out how to police itself or government regulators will step in to fill the void.

Whether that is enough to inspire more companies to step up to the standards challenge remains unclear. “The market – that is, individual, corporate and government customers – has yet to put a price on IoT security in the same way that other markets have to determine the relative value of energy-efficient appliances or crash-worthy automobiles,” said Chris Turner, solutions architect with systems integrator General Dynamics Information Technology. “The market would benefit from standards. They’d help vendors back up product claims and integrators speed up adoption and implementation, which in turn would increase security and probably drive down prices, as well.”

Steven Walker, acting director of DARPA

A standards regimen could change that equation, suggests Steven Walker, acting director of the Defense Advanced Research Projects Agency (DARPA).

“What if customers were made aware of unsecure products and the companies that made them?” he asked at the AFCEA Defensive Cyber Symposium in June. “I’m pretty sure customers would buy the more secure products.”

As recently as Oct. 21, 2016, the Mirai botnet attack crippled Internet services provider Dyn via an international network of security cameras that launched an onslaught of bogus data requests on Dyn servers, peaking at about 1.2 terabits per second. The attack brought down many of the most popular sites on the Internet.

Kevin Fu, director of the Archimedes Center for Medical Device Security and the Security and Privacy Research Group at the University of Michigan and the co-founder and chief scientist at Virta Labs, a startup medical device security firm, told the House Energy and Commerce Committee that the underlying problem is one of market failure.

“We are in this sorry and deteriorating state because there is almost no cost to a manufacturer for deploying [IoT] products with poor security to consumers,” he said at a November hearing. “Has a consensus body or federal agency issued a meaningful IoT security standard? Not yet. Is there a national testing lab to verify and assess the pre-market security of IoT devices? No. Is there a tangible cost to any company that puts an insecure IoT device into the market? I don’t think so.”

Could UL 2900 answer that need? Though Fu isn’t quite ready to endorse it, he did suggest the concept is sound.

“We know from the mathematician Gödel that it’s impossible to have both a sound and complete set of standards for any non-trivial problem,” Fu told GovTechWorks. “However, standards are important to improve security and simplify the problem to make it more tractable. No approach will completely solve security, but standards, sound engineering principles and experience gained through failure are necessary ingredients for reasonable defense.”

Developing the Standard
UL 2900 provides guidelines for how to evaluate and test connected products, including a standard approach to software analysis, efforts to root out embedded malware and process and control requirements for establishing IoT security risk controls in the architecture, design and long-term risk management of the product.
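
Among the techniques applied in such evaluations is fuzz testing, mentioned later in this article: feeding malformed, randomized input to software and watching for unexpected failures. The toy harness below illustrates the idea only; the parser is invented and this is not UL’s actual test methodology.

```python
# Generic illustration of fuzz testing: hammer a parser with malformed random
# input and record anything that raises an unexpected exception. This is a toy
# harness with an invented parser, not UL's methodology.
import random

def parse_length_prefixed(packet: bytes) -> bytes:
    """Toy device-protocol parser: first byte is the payload length."""
    if not packet:
        raise ValueError("empty packet")
    length = packet[0]
    payload = packet[1:1 + length]
    if len(payload) != length:
        raise ValueError("truncated payload")
    return payload

def fuzz(iterations: int = 10_000, seed: int = 1) -> list:
    """Feed random blobs to the parser; collect anything other than handled errors."""
    rng = random.Random(seed)
    crashes = []
    for _ in range(iterations):
        blob = bytes(rng.randrange(256) for _ in range(rng.randrange(0, 64)))
        try:
            parse_length_prefixed(blob)
        except ValueError:
            pass                       # expected, handled error
        except Exception as exc:       # anything else is a finding
            crashes.append((blob, repr(exc)))
    return crashes

print(f"unexpected failures: {len(fuzz())}")
```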

Rather than focus on hardware devices first, UL focused on software after initial conversations with the Department of Homeland Security (DHS), said Ken Modeste, leader of cybersecurity services at UL. “One of DHS’s biggest challenges was their software supply chain,” he said. DHS was concerned about commercial software products running on computer systems, as well as industrial control software running the agency’s operational technology, such as air conditioning, lighting and building or campus security systems.

Examining the problem, UL officials found clear similarities between the systems and sensors used in factory automation, enterprise building automation and security technology. “The majority of these cyber concerns – 90 percent – were in software,” Modeste told GovTechWorks. “So we realized, if we can create a standard for software, we can apply that to many, many products.”

UL invited representatives from industry, government and academia to participate in developing the standard. “We started looking at industry standards that make software better,” Modeste said. “A firmware file has a multitude of components. How can those be broken down and understood? How can they be protected?”

Participants studied every imaginable attack vector that threat actors could use to compromise a product, and then incorporated each into the testing process. Recognizing that new threats and vulnerabilities arise all the time, the testing and process was designed to be fluid and to incorporate follow-up testing after initial approval.

At first, industry was slow to respond. “I thought we’d have more support early on,” Modeste said. “But there was an initial reluctance. It took a while for us to engage and get them to see the advantages.”

Now it seems interest is on the rise. Among the first movers with the standard: Electric Imp, an IoT software firm based in Los Altos, Calif., and Cambridge, U.K., which provides a cloud-based industrial IoT platform for fully integrating hardware, operating system, APIs, cloud services and security in a single flexible, scalable package. The Electric Imp platform is the first IoT platform to be independently certified to UL 2900-2-2.

Hugo Fiennes is co-founder and CEO at Electric Imp and formerly led Apple’s iPhone hardware development efforts (generations one through four).

“For security, UL has come at it at the right angle, because they’re not prescriptive,” Fiennes told GovTechWorks. “There are many ways to get security, depending on the application’s demands, latency requirements, data throughput requirements and everything like that. [But] the big problem has been that there has been no stake in the ground so far, nothing that says, ‘this is a reasonable level of security that shows a reasonable level of due diligence has been performed by the vendor.’”

What UL did was to study the problems of industrial control systems, look at the art of the possible, and then codify that in a standard established by a recognizable, independent third-party organization.

“It can’t be overstated how important that is,” Fiennes said. UL derives its trust from the fact that it is independent of other market players and forces.

Although UL 2900 “is not the be all and end all last word on cybersecurity for IoT,” Fiennes said, “it provides a good initial step for vendors.”

“They haven’t said this is one standard forever, because that’s not how security works,” he said. “They’ve said IoT security is a moving target, here is the current standard. We will test to it, we’ll give you a certificate and then you will retest and maintain compliance after.” The certification lasts a year, after which new and emerging threats must be considered in addition to those tested previously.

“This doesn’t absolve the people selling security products, platforms and security stacks from due diligence,” Fiennes warned. Firms must be vigilant and remain ready and able to react quickly to threats. “But it’s better than nothing. And we were in a state before where there was nothing.”  He noted that his product’s UL certification expires after a year, at which point some requirements are likely to change and the certification will have to be renewed.

Still, for customers seeking proof that a product has met a minimum baseline, this is the only option short of devoting extensive in-house resources to thoroughly test products on their own. Few have such resources.

“Auto makers and other large-scale manufacturers can afford that kind of testing because they can spread the cost out across unit sales in the hundreds of thousands,” says GDIT’s Turner. “But for government integration projects, individually testing every possible IoT product is cost-prohibitive. It’s just not practical. So reputable third-party testing could really help speed up adoption of these new technologies and the benefits they bring.”

Standards have value because they provide a baseline measure of confidence.

For Electric Imp, being able to tell customers that UL examined its source code, ran static analysis, performed fuzz testing and penetration testing and examined all of its quality and design controls, has made a difference.

For UL and Modeste, the recognition that no single standard will solve the IoT security problem proved something of an “aha moment.”

“Within cybersecurity, you have to recognize you can’t do everything at once,” he said. “You need a foundation, and then you can go in and take it step-by-step. Nothing anyone comes up with in one step will make you 100 percent cyber secure. It might take 10 years to come up with something perfect and then soon after, it will be obsolete. So it’s better to go in steps,” Modeste added. “That will make us increasingly secure over time.”
