
How the Air Force Changed Tune on Cybersecurity

Peter Kim, chief information security officer (CISO) for the U.S. Air Force, calls himself Dr. Doom. Lauren Knausenberger, director of cyberspace innovation for the Air Force, is his opposite. Where he sees trouble, she sees opportunity. Where he sees reasons to say no, she seeks ways to change the question.

For Kim, the dialogue the two have shared since Knausenberger left her job atop a private-sector tech consultancy to join the Air Force has been transformational.

“I have gone into a kind of rehab for cybersecurity pros,” he says. “I’ve had to admit I have a problem: I can’t lock everything down.” He knows. He’s tried.

The two engage constantly, they said while sharing a dais at a recent AFCEA cybersecurity event in Crystal City, debating and questioning whether the decisions and steps designed to protect Air Force systems and data are having their intended effect. “Are the things we’re doing actually making us more secure or just generating a lot of paperwork?” asks Knausenberger. “We are trying to turn everything on its head.”

As for Kim, she added, “Pete’s doing really well on his rehab program.”

One way Knausenberger has turned Kim’s head has been her approach to security certification packages for new software. Instead of developing massive cert packages for every program – documentation that’s hundreds of pages thick and unlikely to ever be read – she wants the Air Force to certify the processes used to develop software, rather than the programs themselves.

“Why don’t we think about software like meat at the grocery?” she asked. “USDA doesn’t look at every individual piece of meat… Our goal is to certify the factory, not the program.”

Knausenberger says the Air Force is now trying to apply similar requirements to acquisition contracts, accepting the idea that since finding software vulnerabilities is inevitable, it’s best to have a plan for fixing them rather than hoping to regulate them out of existence. “So you might start seeing language that says, ‘You need to fix vulnerabilities within 10 days.’ Or perhaps we may have to pay bug bounties,” she says. “We know nothing is going to be perfect and we need to accept that. But we also need to start putting a level of commercial expectation into our programs.”

Combining development, security and operations into an integrated process – DevSecOps, in industry parlance – is the new name of the game, they argue together. The aim: Build security in during development, rather than bolting it on at the end.

The takeaway from the “Hack-the-Air-Force” bug bounty programs run so far is that every such effort yields new vulnerabilities – and that thousands of pages of certification didn’t prevent them. As computer power becomes less costly and automation gets easier, hackers can be expected to use artificial intelligence to break through security barriers.

Continuous automated testing is the only way to combat their persistent threat, Kim said.

Michael Baker, CISO at systems integrator General Dynamics Information Technology, agrees. “The best way to find the vulnerabilities is to continuously monitor your environment and challenge your assumptions,” he says. “Hackers already use automated tools and the latest vulnerabilities to exploit systems. We have to beat them to it – finding and patching those vulnerabilities before they can exploit them. Robust and assured endpoint protection, combined with continuous, automated testing to find vulnerabilities and exploits, is the only way to do that.”
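What “continuous, automated testing” can look like in practice is sketched below in Python. The scanner command, target list and severity threshold are hypothetical placeholders, not details of any Air Force or GDIT program; the point is simply that scans run on a fixed cadence and every significant finding is routed to remediation.

```python
import json
import subprocess
import time

# Hypothetical inventory of systems to test; a real pipeline would pull this
# from an asset database rather than hard-coding it.
TARGETS = ["app1.example.mil", "app2.example.mil"]
SCAN_INTERVAL_SECONDS = 6 * 60 * 60  # rescan every six hours


def run_scan(target: str) -> list[dict]:
    """Invoke a (hypothetical) command-line vulnerability scanner and
    return its JSON findings for one target."""
    result = subprocess.run(
        ["vuln-scanner", "--target", target, "--format", "json"],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)


def triage(target: str, findings: list[dict]) -> None:
    """Flag high-severity findings so fixes can be tracked against a
    remediation deadline (e.g., the 10-day contract language above)."""
    for finding in findings:
        if finding.get("severity", 0) >= 7:
            print(f"[{target}] HIGH: {finding.get('title')} -> open remediation ticket")


if __name__ == "__main__":
    while True:  # continuous: re-run the whole suite on a fixed cadence
        for target in TARGETS:
            triage(target, run_scan(target))
        time.sleep(SCAN_INTERVAL_SECONDS)
```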

“I think we ought to get moving on automated security testing and penetration,” Kim added. “The days of RMF [risk management framework] packages are past. They’re dinosaurs. We’ve got to get to a different way of addressing security controls and the RMF process.”

Design Thinking and DevOps Combine for Better Customer Experience

How citizens interact with government websites tells you much about how to improve – as long as you’re paying attention, said Aaron Wieczorek, digital services expert with the U.S. Digital Service team at the Department of Veterans Affairs.

“At VA we will literally sit down with veterans, watch them work with the website and apply for benefits,” he said. The aim is to make sure the experience is what users want and expect, he said, not “what we think they want.”

Taking copious notes on their observations, the team then sets to work on programming improvements that can be quickly put to the test. “Maybe some of the buttons were confusing or some of the way things work is confusing – so we immediately start reworking,” Wieczorek explained.

Applying a modern agile development approach means digital services can immediately put those tweaks to the test in their development environment. “If it works there, good. Then it moves to staging. If that’s acceptable, it deploys into production,” Wieczorek said.
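As a rough illustration of that gated flow – and only an illustration; the command and function names below are placeholders, not the VA’s actual tooling – a promotion script might look like this in Python:

```python
import subprocess

ENVIRONMENTS = ["development", "staging", "production"]


def tests_pass(environment: str) -> bool:
    """Run automated checks against one environment.
    'run-smoke-tests' is a hypothetical command standing in for whatever
    suite a real pipeline would execute."""
    result = subprocess.run(["run-smoke-tests", environment])
    return result.returncode == 0


def deploy(environment: str) -> None:
    """Placeholder for the real deployment step (container push, etc.)."""
    print(f"deploying to {environment}")


def promote() -> None:
    """Move a change through dev, staging and production, advancing only
    when the previous stage's checks pass -- the gate Wieczorek describes."""
    for env in ENVIRONMENTS:
        deploy(env)
        if not tests_pass(env):
            print(f"stopping: checks failed in {env}")
            return
    print("change is live in production")


if __name__ == "__main__":
    promote()
```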

That process can happen in days. Vets.gov deploys software updates into production 40 times per month, Wieczorek said, and the agency deploys to all kinds of environments 600 times per month.

Case in point: Vets.gov’s digital Form 1010 EZ, which allows users to apply for VA healthcare online.

“We spent hundreds of hours watching veterans, and in the end we were able to totally revamp everything,” Wieczorek said. “It’s actually so easy now, you can do it all on your phone.” More than 330,000 veterans have applied that way since the digital form was introduced. “I think that’s how you scale things.”

Of course, one problem remains: Vets.gov is essentially a veteran-friendly alternative site to VA.gov, which may not be obvious to search engines or veterans looking for the best way in the door. Search Google for “VA 1010ez” and the old, mobile-unfriendly PDF form still shows as the top result. The new mobile-friendly application? It’s the third choice.

At the National Geospatial-Intelligence Agency, developers take a similar approach, but focus hard on balancing speed, quality and design for maximum results. “We believe that requirements and needs should be seen like a carton of milk: The longer they sit around, the worse they get,” said Corry Robb, product design lead in the agency’s Office of GEOINT Services. “We try to handle that need as quickly as we can and deliver that minimally viable product to the user’s hands as fast as we can.”

DevOps techniques, where development and production processes take place simultaneously, increase speed. But speed alone is not the measure of success, Robb said. “Our agency needs to focus on delivering the right thing, not just the wrong thing faster.” So in addition to development sprints, his team has added “design sprints to quickly figure out the problem-solution fit.”

Combining design thinking – which focuses on using design to solve specific user problems – with DevOps is critical to the methodology, he said. “Being hand in hand with the customer – that’s one of the core values our group has.”

“Iterative development is a proven approach,” said Dennis Gibbs, who established the agile development practice in General Dynamics Information Technology’s Intelligence Solutions Division. “Agile and DevOps techniques accelerate the speed of convergence on a better solution.  We continually incorporate feedback from the user into the solution, resulting in a better capability delivered faster to the user.”

The ABCs of 2018 Federal IT Modernization: I to Z

In part two of GovTechWorks’ analysis of the Trump Administration’s federal IT modernization plan, we examine the likely guiding impact of the Office of Management and Budget, the manner in which agencies’ infrastructures might change, and the fate of expensive legacy systems.

The White House IT modernization plan released in December seeks a rapid overhaul of IT infrastructure across federal civilian agencies, with an emphasis on redefining the government’s approach to managing its networks and securing its data. Here, in this second part of our two-part analysis, is what you need to know from I to Z (for A-H, click here):

I is for Infrastructure
Modernization boils down to three things: Infrastructure, applications and security. Imagine if every government agency managed its own telephone network or international logistics office, rather than outsourcing such services. IT services are essentially the same. Agencies still need expertise to connect to those services – they still have telecom experts and mail room staff – but they don’t have to manage the entire process.

Special exceptions will always exist for certain military, intelligence (or other specialized) requirements. Increasingly, IT services are becoming commodity services purchased on the open market. Rather than having to own, manage and maintain all that infrastructure, agencies will increasingly buy infrastructure as a service (IaaS) in the cloud — netting faster, perpetually maintained and updated equipment at a lower cost. To bring maximum value – and savings – out of those services, they’ll have to invest in integration and support services to ensure their systems are not only cost effective, but also secure.

J is for JAB, the Joint Authorization Board
The JAB combines expertise at General Services Administration (GSA), Department of Homeland Security (DHS) and the Department of Defense (DOD). It issues preliminary authority to operate (ATO) for widely used cloud services. The JAB will have a definitive role in prioritizing and approving commercial cloud offerings for the highest-risk federal systems.

K is for Keys
The ultimate solution for scanning encrypted data for potential malicious activity is to decrypt that data for a thorough examination. This involves first having access to the encryption keys for federal data and then securing those keys to ensure they don’t fall into the wrong hands. In short, these keys are key to the federal strategy of securing both government data and government networks.
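A minimal sketch of the idea, using the Python cryptography package: data is encrypted under a key the agency controls, and a monitoring sensor that has been granted that key can inspect the plaintext. Key storage, rotation and access policy – the hard parts – are deliberately left out, and nothing here describes how DHS actually handles keys.

```python
from cryptography.fernet import Fernet

# In practice the key would live in a hardware security module or managed
# key service, never in source code.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"citizen-facing transaction payload"
ciphertext = cipher.encrypt(record)  # what travels over the wire or sits in the cloud

# A sensor that has been granted the key can examine the content...
plaintext = Fernet(key).decrypt(ciphertext)
assert plaintext == record

# ...while anyone without the key sees only ciphertext -- which is why
# protecting the keys themselves becomes the central problem.
```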

L is for Legacy
The government still spends 70 percent of its IT budget managing legacy systems. That’s down from as much as 85 percent a few years ago, but still too much. In a world where volumes of data continue to expand exponentially and the cost of computer processing power continues to plunge, how long can we afford to overspend on last year’s (or last decade’s) aging (and less secure) technology?

M is for Monopsony
A monopoly occurs when one source controls the supply of a given product, service or commodity. A monopsony occurs when a single customer controls the consumption of products, services or commodities. In a classical monopsony, the sole customer dictates terms to all sellers.

Despite its size, the federal government cannot dictate terms to information technology vendors. It can consolidate its purchasing power to increase leverage, and that’s exactly what the government will do in coming years. The process begins with networking services as agencies transition from the old Networx contract to the new Enterprise Information Services vehicle.

Look for it to continue as agencies consolidate purchasing power for commodity software services, such as email, continuous monitoring and collaboration software.

The government may not ultimately wield the full market power of a monopsony, but it can leverage greater negotiating power by centralizing decision making and consolidating purchase and licensing agreements. Look for that to increase significantly in the years ahead.

N is for Networks
Networks used to be the crown jewels of the government’s information enterprise, providing the glue that held systems together and enabling the government to operate. But if the past few years proved anything, it’s that you can’t keep the bad guys out. They’re already in, looking around, waiting for an opportunity.

Networks remain essential infrastructure, but they will increasingly be virtualized – existing in software and protecting encrypted data that travels on commercial fiber and is stored, much of the time, in commercial data centers (generically referred to as the cloud). You may not keep the bad guys out, but you can control what they get access to.

O is for OMB
The Office of Management and Budget has oversight over much of the modernization plan. The agency is mentioned 127 times in the White House plan, including 47 times in its 50 recommendations. OMB will be either the responsible party or the receiving party for work done by others on 34 of those 50 recommendations.

P is for Prioritization
Given the vast number of technical, manpower and security challenges that weigh down modernization efforts, prioritizing programs that can deliver the greatest payoff is essential. In addition, agencies are expected to prioritize and focus their modernization efforts on high-value assets that pose the greatest vulnerabilities and risks. From those lists, DHS must identify six systems by June 30 to receive centralized interventions that include staffing and technical support.

The aim is to prioritize where new investment, talent infusions and security policies will make the greatest difference. To maximize that effort, DHS may choose projects that can expand to include other systems and agencies.

OMB must also review and prioritize any impediments to modernization and cloud adoption.

Q is for Quick Start
Technology is often not the most complicated part of modernization efforts. Finding a viable acquisition strategy that won’t put yesterday’s technology in the government’s hands tomorrow is often harder. That’s why the report directs OMB to assemble an Acquisition Tiger Team to develop a “quick start” acquisition package to help agencies more quickly license technology and migrate to the cloud.

The aim: combine market research, acquisition plans, readily identified sources and templates for both requests for quotes (RFQs) and Independent Government Cost Estimate (IGCE) calculations — which would be based on completed acquisitions. The tiger team will also help identify qualified small and disadvantaged businesses to help agencies meet set-aside requirements.

R is for Recommendations
There are 50 recommendations in the White House IT modernization report with deadlines ranging from February to August, making the year ahead a busy one for OMB, DHS and GSA, the three agencies responsible for most of the work. A complete list of the recommendations is available here.

T is for the TIC
The federal government developed the Trusted Internet Connection as a means of controlling the number of on and off ramps between government networks and the largely unregulated internet. But in a world now dominated by cloud-based software applications, remote cloud data centers, mobile computing platforms and web-based interfaces that may access multiple different systems to deliver information in context, the TIC needs to be rethought.

“The piece that we struggled with is the Trusted Internet Connections (TIC) initiative – that is a model that has to mature and get solved,” former Federal CIO Tony Scott told Federal News Radio. “It’s an old construct that is applied to modern-day cloud that doesn’t work. It causes performance, cost and latency issues. So the call to double down and sort that out is important. There has been a lot of good work that has happened, but the definitive solution has not been figured out yet.”

The TIC policy is the heart and soul of the government’s perimeter-based security model. Already, some agencies have chosen to bypass the TIC for certain cloud-based services, such as Office 365, trusting Microsoft’s security and recognizing that if all that data had to go through an agency’s TIC, performance would suffer.

To modernize TIC capabilities, policies, reference architectures and associated cloud security authorization baselines, OMB must update TIC policies so agencies have a clear path forward to build out data-level protections and more quickly migrate to commercial cloud solutions. A 90-day sprint is to begin in mid-February, during which projects approved by OMB will pilot proposed changes in TIC requirements.

OMB must determine whether all data traveling to and from agency information systems hosted by commercial cloud providers warrants scanning by DHS, or whether only some information needs to be scanned. Other considerations under review: Expanding the number of TIC access points in each agency and a model for determining how best to implement intrusion detection and prevention capabilities into cloud services.

U is for Updating the Federal Cloud Computing Strategy
The government’s “Cloud First” policy is now seven years old. Updates are in order. By April 15, OMB must provide additional guidance on both appropriate use cases and operational security for cloud environments. All relevant policies on cloud migration, infrastructure consolidation and shared services will be reviewed.

In addition, OMB has until June to develop standardized contract language for cloud acquisition, including clauses that define consistent requirements for security, privacy and access to data. Establishing uniform contract language will make it easier to compare and broker cloud offerings and ensure government requirements are met.

V is for Verification
Verification or authentication of users’ identities is at the heart of protecting government information. Are you who you say you are? Key to securing information systems is ensuring that access is granted only to users who can be identified and verified as deserving that access.

OMB has until March 1 to issue new identity policy guidance for public comment and to recommend identity service areas suitable for shared services. GSA must provide a business case for consolidating existing identity services to improve usability, drive secure access and enable cloud-based collaboration services that make it easier to share and collaborate across agencies – something that remains cumbersome today.

W, X, Y, Z is for Wrapping it All Up
The Federal Government is shifting to a consolidated IT model that will change the nature of IT departments and the services they buy. Centralized offerings for commodity IT – whether email, office tools and other common software-as-a-service offerings or virtual desktops and web hosting – will be the norm. As much as possible, the objective is to get agencies on the same page, using the same security services, the same collaboration services, the same data services and make those common (or in some cases shared) across multiple agencies.

Doing so promises to reduce manpower needs and licensing costs by eliminating duplicated effort and increasing market leverage to drive down prices. But getting there will not be easy. Integration and security pose unique challenges in a government context, requiring skill, experience and specific expertise. On the government side, policy updates will only solve some of the challenges. Acquisition regulations must also be updated to support wider adoption of commercial cloud products.

Some agencies will need more help than others. Cultural barriers will continue to be major hurdles. Inevitably, staff will have to develop new skills as old ones disappear. Yet even in the midst of all that upheaval, some things don’t change. “In the end, IT modernization is really all about supporting the mission,” says Stan Tyliszczak, chief engineer at systems integrator General Dynamics Information Technology. “It’s about helping government employees complete their work, protecting the privacy of our citizens and ensuring both have timely access to the information and services they need. IT has always made those things better and easier, and modernization is only necessary to continue that process. That much never changes.”

 

The ABCs of 2018 Federal IT Modernization: A to H

The White House issued its IT modernization plan last December and followed it with an ambitious program that could become a proving ground for rapidly overhauling IT infrastructure, data access and customer service. After years of talking about IT modernization, cybersecurity and migration to the cloud, federal agencies are now poised to ramp up the action.

Here, in A-B-C form, is what you need to know from A to H:

A is for Agriculture
The U.S. Department of Agriculture (USDA) will be a sort of proving ground for implementing the Trump administration’s vision for the future high-tech, high-performance, customer-satisfying government. USDA announced in December 2017 it will collapse 39 data centers into one (plus a backup), and consolidate 22 independent chief information officers under a single CIO with seven deputies. The aim: reinvent the agency as a modern, customer-centered organization and provide its leaders with instant access to a wealth of agency data.

B is for Better Citizen Services
“It is imperative for the federal government to leverage … innovations to provide better service for its citizens in the most cost-effective and secure manner,” the report states – in just its third sentence. Yes, modernization should ultimately save money by reducing the billions spent to keep aging systems operational. And yes, it should help overcome the patchwork of cybersecurity point solutions now used to protect federal networks, systems and data.

USDA Secretary Sonny Perdue’s experience modernizing government IT during two terms as governor of Georgia from 2003 to 2011 convinced him he could achieve similar results on the federal level. “He really saw, in reinventing Georgia government, how IT modernization and delivering better customer service benefitted not only employees, but the people of the state,” Deputy Secretary of Agriculture Steve Censky said in a TV interview.

Among the agency’s goals: Increase access to information throughout the agency by means of online service portals and advanced application program interfaces.

C is for Centers of Excellence
USDA won’t be going it alone. Under the direction of the Office of Science and Technology Policy, the agency will be the first to engage with a new set of experts at the General Services Administration (GSA). GSA is on an accelerated course to create five Centers of Excellence, leveraging both public and private sector expertise to develop best practices and standards that agencies can use for:

  • Cloud adoption
  • IT infrastructure optimization
  • Customer experience
  • Service delivery analytics
  • Contact centers

Jack Wilmer, White House senior advisor for Cybersecurity and IT Modernization, says the idea is to provide each agency’s modernization effort with the same core concepts and approach – and the best available experts. “We’re trying to leverage private sector expertise, bringing them in a centralized fashion, making them available to government agencies as they modernize,” he told Government Matters.

GSA planned to award contracts to industry partners by the end of January – just 45 days after its initial solicitation – but as of March 5, no contracts had been awarded. Phase 1 contracts for assessment, planning and some initial activities should be finalized soon. Phase 2 awards for cloud migration, infrastructure optimization and customer experience are expected by the end of the year, Joanne Collins Smee, acting director of GSA’s Technology Transformation Service and deputy commissioner of the Federal Acquisition Service, said at a March 1 AFCEA event in Washington, D.C.

D is for Data Centers
While all data centers won’t close down, many more will soon disappear. Modernization is about getting the government out of the business of managing big infrastructure investments and instead leveraging commercial cloud infrastructure and technology wherever possible. But don’t think your agency’s data won’t be in a data center somewhere.

“What is the cloud, anyway? Isn’t it really someone else’s data center, available on demand?” says Stan Tyliszczak, chief engineer at systems integrator General Dynamics Information Technology (GDIT). “Moving to the cloud means getting out of the business of running that data center yourself.”

The White House splits its cloud strategy into two buckets:

  • “Bring the government to the cloud.” Put government data and applications in privately owned and operated infrastructure, where they are protected through encryption and other security technologies. This is the public cloud, where government data sits side by side with private data in third-party data centers.
  • “Bring the cloud to the government.” Put government data and applications on vendor-owned infrastructure located in government-owned facilities, as the Intelligence Community Information Technology Enterprise (IC ITE) does with the IC’s Commercial Cloud Services (C2S) contract with Amazon Web Services.

Figuring out what makes sense, and when, depends on the use case; for most agencies, it will mean a combination of on-premise solutions, shared government services and commercial services in public clouds. “That’s the hybrid cloud model everyone’s talking about. But it’s not a trivial exercise. Melding those together is the challenge,” Tyliszczak says. “That’s what integrators are for.”

E is for Encryption
Government cybersecurity efforts have historically focused on defending the network and its perimeter, rather than the data that travels on that network. As cloud services are integrated into conventional on premise IT solutions, securing the data has become essential. At least 47 percent of federal network traffic is encrypted today – frustrating agency efforts to monitor what’s crossing network perimeters.

“Rather than treating Federal networks as trusted entities to be defended at the perimeter,” the modernization report advised, “agencies should shift their focus to placing security protections closer to data.”

To do that, the government must improve the way it authenticates devices and users on its networks, securing who has access and how, and encrypting data both at rest and in transit.

“Now you’re starting to obfuscate whether your sensors can actually inspect the content of that data,” notes Eric White, Cybersecurity program director at GDIT’s Health and Civilian Solutions Division. “Because it’s now encrypted, you add another layer of complexity to know for sure whether it’s the good guys or the bad guys moving data in and out of your network.”

White notes that the Department of Homeland Security (DHS) is charged with solving this encryption dilemma, balancing the millions of dollars in investment in high-end network-monitoring sensors, such as those associated with the Einstein program, against protecting individual privacy. Enabling those sensors to see through or decipher encrypted data without undermining the security of the data – or the privacy of individuals – is a critical priority. DHS has commissioned research to develop potential solutions, including virtualizing sensors for cloud environments; relocating sensors to the endpoints of encrypted tunnels; creating man-in-the-middle solutions that intercept data in motion; or providing the sensors with decryption keys.

F is for FedRAMP
The Federal Risk and Authorization Management Program (FedRAMP) remains the critical process for ensuring private-sector cloud offerings meet government security requirements. Look for updates to FedRAMP baselines that could allow tailoring of security controls for low-risk systems, address new approaches to integrating cloud services with federal Trusted Internet Connection (TIC) services and consider common features or capabilities that could be incorporated into higher-risk systems with FedRAMP “high” baselines.

Importantly, the report directs the General Services Administration (GSA), which manages FedRAMP, to come up with new solutions that make it easier for software-as-a-service (SaaS) products already authorized for use in one agency to be accepted for use in another. Making the process for issuing an authority to operate (ATO) faster and easier to reuse has long been a goal of both cloud providers and government customers. This is particularly critical for shared services, in which one agency provides its approved commercial solution to another agency.

G is for GSA
Already powerfully influential as a buyer and developer for other agencies, GSA stands to become even more influential as the government moves to consolidate networks and other IT services into fewer contracts and licensing agreements, and to increase the commonality of solutions across the government.

This is especially true among smaller agencies that lack the resources, scale and expertise to effectively procure and manage their own IT services.

H is for Homeland Security
DHS is responsible for the overall cybersecurity of all federal government systems. The only federal entity mentioned more frequently in the White House modernization report is the Office of Management and Budget, which is the White House agency responsible for implementing the report’s guidance.

DHS was mandated to issue a report by Feb. 15 identifying the common weaknesses of the government’s highest-value IT assets and recommending solutions for reducing risk and vulnerability government-wide. By May 15, the agency must produce a prioritized list of systems “for government-wide intervention” and will provide a host of advisory and support services to help secure government systems. DHS also owns and manages the National Cybersecurity Protection System (NCPS) and the EINSTEIN sensor suites that capture and analyze network flow, detect intruders and scan the data coming in and out of government systems to identify potentially malicious activity and, in the case of email, to block and filter threatening content.

Look for next week’s edition of GovTechWorks for Part 2: Modernization from I to Z. In Part 2, we outline how infrastructure among government agencies will be impacted and streamlined by modernization, as well as discuss the fate of legacy systems and their maintenance budgets, and the major role the Office of Management and Budget will play in overall implementation.


 

Is Identity the New Perimeter? In a Zero-Trust World, More CISOs Think So

As the network perimeter morphs from physical to virtual, the old Tootsie Pop security model – hard shell on the outside with a soft and chewy center – no longer works. The new mantra, as Mittal Desai, chief information security officer (CISO) at the Federal Energy Regulatory Commission, said at the ATARC CISO Summit: “Never trust, double verify.”

The zero-trust model modernizes conventional network-based security for a hybrid cloud environment. As agencies move systems and storage into the cloud, networks are virtualized and security naturally shifts to users and data. That’s easy enough to do in small organizations, but rapidly grows harder with the scale and complexity of an enterprise.

The notion of zero-trust security first surfaced five years ago in a Forrester Research report prepared for the National Institute of Standards and Technology (NIST). “The zero-trust model is simple,” Forrester posited then. “Cybersecurity professionals must stop trusting packets as if they were people. Instead, they must eliminate the idea of a trusted network (usually the internal network) and an untrusted network (external networks). In zero-trust, all network traffic is untrusted.”

Cloud adoption by its nature is forcing the issue, said Department of Homeland Security Chief Technology Officer Mike Hermus, speaking at a recent Tech + Tequila event: “It extends the data center,” he explained. “The traditional perimeter security model is not working well for us anymore. We have to work toward a model where we don’t trust something just because it’s within our boundary. We have to have strong authentication, strong access control – and strong encryption of data across the entire application life cycle.”

Indeed, as other network security features mature, identity – and the access that goes with it – is now the most common cybersecurity attack vector. Hackers favor phishing and spear-phishing attacks because they’re inexpensive and effective – and the passwords they yield are like the digital keys to an enterprise.

About 65 percent of breaches cited in Verizon’s 2017 Data Breach Investigations Report made use of stolen credentials.

Interestingly, however, identity and access management represent only a small fraction of cybersecurity investment – less than 5 percent – according to Gartner’s market analysts. Network security equipment, by contrast, constitutes more than 12 percent. Enterprises continue to invest in the Tootsie Pop model even as its weaknesses become more evident.

“The future state of commercial cloud computing makes identity and role-based access paramount,” said Rob Carey, vice president for cybersecurity and cloud solutions within the Global Solutions division at General Dynamics Information Technology (GDIT). Carey recommends creating both a framework for better understanding the value of identity management tools, and metrics to measure that impact. “Knowing who is on the network with a high degree of certainty has tremendous value.”

Tom Kemp, chief executive officer at Centrify, which provides cloud-based identity services, has a vested interest in changing that mix. Centrify, based in Sunnyvale, Calif., combines identity data with location and other information to help ensure only authorized, verified users access sensitive information.

“At the heart of zero-trust is the realization that an internal user should be treated just like an external user, because your internal network is just as polluted as your outside network,” Kemp said at the Feb. 7 Institute for Critical Infrastructure (ICIT) Winter Summit. “You need to move to constant verification.” Reprising former President Ronald Reagan’s “trust but verify” mantra, he adds: “Now it’s no trust and always verify. That’s the heart of zero-trust.”

The Google Experience
When Google found itself hacked in 2009, the company launched an internal project to find a better way to keep hackers out of its systems. Instead of beefing up firewalls and tightening virtual private network settings, Google’s BeyondCorp architecture dispensed with the Tootsie Pop model in which users logged in and then gained access to all manner of systems and services.

In its place, Google chose to implement a zero-trust model that challenges every user and every device on every data call – regardless of how that user accessed the internet in the first place.

While that flies in the face of conventional wisdom, Google reasoned that by tightly controlling the device and user permissions to access data, it had found a safer path.

Here’s an example of how that works when an engineer with a corporate-issued laptop wants to access an application from a public Wi-Fi connection:

  1. The laptop provides its device certificate to an access proxy.
  2. The access proxy confirms the device, then redirects to a single-sign-on (SSO) system to verify the user.
  3. The engineer provides primary and second-factor authentication credentials and, once authenticated by the SSO system, is issued a token.
  4. Now, with the device certificate to identify the device and the SSO token to identify the user, an Access Control Engine can perform a specific authorization check for every data access. The user must be confirmed to be in the engineering group; to possess a sufficient trust level; and to be using a managed device in good standing with a sufficient trust level.
  5. If all checks pass, the request is passed to an appropriate back-end system and the data access is allowed. If any of the checks fail however, the request is denied. This is repeated every time the engineer tries to access a data item.
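The per-request check in steps 1 through 5 can be compressed into a few lines of code. The sketch below is a simplification drawn from that description, not Google’s implementation; the trust-level scale, group names and data structures are assumptions made for illustration.

```python
from dataclasses import dataclass


@dataclass
class Device:
    certificate_valid: bool
    managed: bool
    trust_level: int        # assumed scale: 0 (unknown) to 4 (fully managed)


@dataclass
class User:
    sso_token_valid: bool
    groups: set
    trust_level: int


def authorize(user: User, device: Device, resource_group: str,
              required_trust: int) -> bool:
    """Evaluate a single data request, regardless of where the user is
    connecting from -- the core of the zero-trust check described above."""
    if not (device.certificate_valid and device.managed):
        return False        # steps 1-2: the device must be known and managed
    if not user.sso_token_valid:
        return False        # step 3: the user must be authenticated via SSO
    if resource_group not in user.groups:
        return False        # step 4: group membership check
    if min(user.trust_level, device.trust_level) < required_trust:
        return False        # step 4: sufficient trust level for this resource
    return True             # step 5: allow this request, and only this request


# Example: an engineer on a managed corporate laptop over public Wi-Fi
laptop = Device(certificate_valid=True, managed=True, trust_level=3)
engineer = User(sso_token_valid=True, groups={"engineering"}, trust_level=3)
print(authorize(engineer, laptop, "engineering", required_trust=3))  # True
```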

“That’s easy enough when those attributes are simple and clear cut, as with the notional Google engineer,” said GDIT’s Carey, who spent three decades managing defense information systems. “But it gets complicated in a hurry if you’re talking about an enterprise on the scale of the Defense Department or Intelligence community.”

Segmenting the Sprawling Enterprise
A takeaway from 9/11 was that intelligence agencies needed to be better and faster at sharing threat data across agency boundaries. Opening databases across agency divisions, however, had consequences: Chelsea Manning, at the time Pfc. Bradley Manning, delivered a treasure trove of stolen files to WikiLeaks and then a few years later, Edward Snowden stole countless intelligence documents, exposing a program designed to collect metadata from domestic phone and email records.

“The more you want to be sure each user is authorized to see and access only the specific data they have a ‘need-to-know,’ the more granular the identity and access management schema need to be,” Carey said. “Implementing role-based access is complicated because you’ve got to develop ways to both tag data and code users based on their authorized need. Absent a management schema, that can quickly become difficult to manage for all but the smallest applications.”

Consider a scenario of a deployed military command working in a multinational coalition with multiple intelligence agencies represented in the command’s intelligence cell. The unit commands air and ground units from all military services, as well as civilians from defense, intelligence and possibly other agencies. Factors determining individual access to data might include the person’s job, rank, nationality, location and security clearance. Some missions might include geographic location, but others can’t rely on that factor because some members of the task force are located thousands of miles away, or operating from covert locations.

That scenario gets even more complicated in a hybrid cloud environment where some systems are located on premise, and others are far away. Managing identity-based access gets harder anyplace where distance or bandwidth limitations cause delays. Other integration challenges include implementing a single-sign-on solution across multiple clouds, or sharing data by means of an API.

Roles and Attributes
To organize access across an enterprise – whether in a small agency or a vast multi-agency system such as the Intelligence Community Information Technology Enterprise (IC ITE) – information managers must make choices. Access controls can be based on individual roles – such as job level, function and organization – or data attributes – such as type, source, classification level and so on.

“Ultimately, these are two sides of the same coin,” Carey said. “The real challenge is the mechanics of developing the necessary schema to a level of granularity that you can manage, and then building the appropriate tools to implement it.”
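One way to picture those “two sides of the same coin” is a check that matches a user’s attributes against the tags on a piece of data. The schema below is purely illustrative – invented names and a three-factor rule – while real schemas of the kind Carey describes run to many more dimensions.

```python
# Illustrative only: a tiny attribute-matching check.
DATA_TAGS = {
    "mission-report-07": {
        "classification": "SECRET",
        "releasable_to": {"USA", "GBR"},
        "authorized_roles": {"analyst"},
    },
}

USER_ATTRIBUTES = {
    "maj.smith": {"clearance": "SECRET", "nationality": "USA", "role": "analyst"},
}

CLEARANCE_ORDER = ["UNCLASSIFIED", "CONFIDENTIAL", "SECRET", "TOP SECRET"]


def can_access(user_id: str, data_id: str) -> bool:
    """Allow access only when clearance, nationality and role all line up
    with the data item's tags."""
    user = USER_ATTRIBUTES[user_id]
    data = DATA_TAGS[data_id]
    cleared = (CLEARANCE_ORDER.index(user["clearance"])
               >= CLEARANCE_ORDER.index(data["classification"]))
    releasable = user["nationality"] in data["releasable_to"]
    has_role = user["role"] in data["authorized_roles"]
    return cleared and releasable and has_role


print(can_access("maj.smith", "mission-report-07"))  # True
```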

For example, the Defense Department intends to use role-based access controls for its Joint Information Environment (JIE), using the central Defense Manpower Data Center (DMDC) personnel database to connect names with jobs. The available fields in that database are, in effect, the limiting factors on just how granular role-based access controls will be under JIE.

Access controls will only be one piece of JIE’s enterprise security architecture. Other features, ranging from encryption to procedural controls that touch everything from the supply chain to system security settings, will also contribute to overall security.

Skeptical of Everything
Trust – or the lack of it – plays out in each of these areas, and requires healthy skepticism at every step. Rod Turk, CISO at the Department of Commerce, said CISOs need to be skeptical of everything. “I’m talking about personnel, I’m talking about relationships with your services providers,” he told the ATARC CISO Summit.  “We look at the companies we do business with and we look at devices, and we run them through the supply chain.  And I will tell you, we have found things that made my hair curl.”

Commerce’s big push right now is the Decennial Census, which will collect volumes of personal information (PI) and personally identifiable information (PII) on almost every living person in the United States. Conducting a census every decade is like doing a major system reset each time. The next census will be no different, employing mobile devices for census takers and, for the first time, allowing individuals to fill out census surveys online. Skepticism is essential because the accuracy of the data depends on the public’s trust in the census.

In a sense, that’s the riddle of the whole zero-trust concept: In order to achieve a highly trusted outcome, CISOs have to start with no trust at all.

Yet trust also cuts in the other direction. Today’s emphasis on modernization and migration to the cloud means agencies face tough choices. “Do we in the federal government trust industry to have our best interests in mind to keep our data in the cloud secure?” Turk asked rhetorically.

In theory, the Federal Risk and Authorization Management Program (FedRAMP) establishes baseline requirements for establishing trust, but doubts persist. What satisfies one agency’s requirements may not satisfy another’s. Compliance with FedRAMP or NIST controls equates to risk management rather than actual security, GDIT’s Carey points out. They’re not the same thing.

Identity and Security
Beau Houser, CISO at the Small Business Administration, is more optimistic, encouraged by the improvements he’s seen as compartmentalized legacy IT systems are replaced with centralized, enterprise solutions in a Microsoft cloud.

“As we move to cloud, as we roll out Windows 10, Office 365 and Azure, we’re getting all this rich visibility of everything that’s happening in the environment,” he said. “We can now see all logins on every web app, whether that’s email or OneDrive or what have you, right on the dashboard. And part of that view is what’s happening over that session: What are they doing with email, where are they moving files.… That’s visibility we didn’t have before.”

Leveraging that visibility effectively extends that notion of zero-trust one step further, or at least shifts it into the realm of a watchful parent rather than one who blindly trusts his teenage children. The watchful parent believes trust is not a right, but an earned privilege.

“Increased visibility means agencies can add behavioral models to their security controls,” Carey said. “Behavioral analysis tools that can match behavior to what people’s roles are supposed to be, and trigger warnings if people deviate from expected norms, are the next big hurdle in security.”

As Christopher Wlaschin, CISO at the Department of Health and Human Services, says: “A healthy distrust is a good thing.”

Unpleasant Design Could Encourage Better Cyber Hygiene

Recent revelations that service members and intelligence professionals are inadvertently giving up their locations and fitness patterns via mobile apps caught federal agencies by surprise.

The surprise wasn’t that Fitbits, smartphones or workout apps try to collect information, nor that some users ignore policies reminding them to watch their privacy and location settings. The real surprise is that many IT policies aren’t doing more to help stop such inadvertent fitness data leaks.

If even fitness-conscious military and intelligence personnel are unknowingly trading security and privacy for convenience, how can IT security managers increase security awareness and compliance?

One answer: Unpleasant design.

Unpleasant design is a proven technique for using design to discourage unwanted behavior. Ever get stuck in an airport and long for a place to lie down — only to find every bench or row of seats is fitted with armrests? That’s no accident. Airports and train terminals don’t want people sleeping across benches. Or consider the decorative metalwork sometimes placed on urban windowsills or planter walls — designed expressly to keep loiterers from sitting down. It’s the same with harsh lights in suburban parking lots, which discourage people from hanging out and make it harder for criminals to lurk in the shadows.

As the federal government and other agency IT security leaders investigate these inadvertent disclosures, can they employ those same concepts to encourage better cyber behavior?

Here’s how unpleasant design might apply to federally furnished Wi-Fi networks: Rather than allow access with only a password, users might be required to have their Internet of Things (IoT) devices pass a screening that enforces certain security settings. That screening could include ensuring location services are disabled while such devices are connected to government-provided networks.

Employees would then have to choose between the convenience of free Wi-Fi for personal devices and the risks of inadequate operations security (OPSEC) via insecure device settings.

This, of course, only works where users have access to such networks. At facilities where personal devices must be deposited in lockers or left in cars, it won’t make a difference. But for users working (and living) on installations where personnel routinely access Wi-Fi networks, this could be highly effective.

Screening – and even blocking – certain apps or domains could be managed through a cloud access security broker, network security management software that can enforce locally set rules governing apps actively using location data or posing other security risks. Network managers could whitelist acceptable apps and settings, while blocking those deemed unacceptable. If agencies already do that for their wired networks, why not for wireless?
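Stripped to its essentials, a broker’s rule engine is a per-connection policy check along these lines. The app names, metadata fields and rules below are hypothetical and not drawn from any specific product; they simply show a whitelist plus a locally set location-data rule.

```python
# Hypothetical per-connection check of the kind a cloud access security
# broker could apply before a personal device's app reaches the
# government-furnished Wi-Fi network.
WHITELISTED_APPS = {"mail-client", "secure-messenger"}
BLOCK_LOCATION_REPORTING = True   # locally set OPSEC rule


def allow_connection(app_name: str, uses_location_services: bool) -> bool:
    """Deny unknown apps by default; deny whitelisted apps that still
    report location while the rule is in force."""
    if app_name not in WHITELISTED_APPS:
        return False
    if BLOCK_LOCATION_REPORTING and uses_location_services:
        return False
    return True


print(allow_connection("fitness-tracker", uses_location_services=True))    # False
print(allow_connection("secure-messenger", uses_location_services=False))  # True
```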

Inconvenient? Absolutely. That’s the point.

IT security staffs are constantly navigating the optimal balance between security and convenience. Perfect security is achievable only when nothing is connected to anything else. Each new connection and additional convenience introduces another dent in the network’s armor.

Employing cloud-access security as a condition of Wi-Fi network access will impinge on some conveniences. In most cases, truly determined users can work around those rules by using local cellular data access instead. In many parts of the world, however – often the places where the need for OPSEC is greatest – that access comes with a direct cash cost. When users pay for data by the megabyte, they’re much more likely to give up some convenience, check security and privacy settings, and limit their data consumption.

This too, is unpleasant design at work. Cellular network owners must balance network capacity with use. Lower-capacity networks control demand by raising prices, knowing that higher priced data discourages unbridled consumption.

Training and awareness will always be the most important factors in securing privacy and location data, because few users are willing to wade through pages-long user agreements to discover what’s hidden in the fine print and legalese they contain. More plain language and simpler settings for opting in or out of certain kinds of data sharing are needed – and app makers must recognize that failing to heed such requirements only increases the risk that government steps in with new regulations.

But training and awareness only go so far. People still click on bad links, which is why some federal agencies automatically disable them. It makes users take a closer, harder look and think twice before clicking. That too, is unpleasant design.

So is requiring users to wear a badge that doubles as a computer access card (as is the case with the Pentagon’s Common Access Card and most Personal Identity Verification cards). Yet, knowing that some will inevitably leave the cards in their computers, such systems automatically log off after only a few minutes of inactivity. It’s inconvenient, but more secure.
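The lockout itself is nothing more than an idle timer. A minimal sketch, with an assumed timeout value and invented names:

```python
import time

IDLE_TIMEOUT_SECONDS = 5 * 60   # assumed: lock after five idle minutes


class WorkstationSession:
    def __init__(self) -> None:
        self.last_activity = time.monotonic()
        self.active = True

    def record_activity(self) -> None:
        """Called on every keystroke or mouse event."""
        self.last_activity = time.monotonic()

    def enforce_timeout(self) -> None:
        """Called periodically; logs the user off once the session -- and
        the access card left in the reader -- has sat idle too long."""
        if self.active and time.monotonic() - self.last_activity > IDLE_TIMEOUT_SECONDS:
            self.active = False
            print("Session locked: no activity detected")
```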

We know this much: Human nature is such that people will take the path of least resistance. If that means accepting security settings that aren’t safe, that’s what’s going to happen. Interrupting that convenience and turning it on its head by means of Wi-Fi security controls won’t stop everyone. But it might have prevented Australian undergrad Nathan Ruser – and who knows who else – from identifying the regular jogging routes of military members (among other examples) in Strava’s house-built heat map and the 13 trillion GPS points it collected from users.

“If soldiers use the app like normal people do,” Ruser tweeted Jan. 27, “it could be especially dangerous … I shouldn’t be able to establish any pattern of life info from this far away.”

Exactly.
