Is Identity the New Perimeter? In a Zero-Trust World, More CISOs Think So

As the network perimeter morphs from physical to virtual, the old Tootsie Pop security model – hard shell on the outside with a soft and chewy center – no longer works. The new mantra, as Mittal Desai, chief information security officer (CISO) at the Federal Energy Regulatory Commission, said at the ATARC CISO Summit: “Never trust, double verify.”

The zero-trust model modernizes conventional network-based security for a hybrid cloud environment. As agencies move systems and storage into the cloud, networks are virtualized and security naturally shifts to users and data. That’s easy enough to do in small organizations, but rapidly grows harder with the scale and complexity of an enterprise.

The notion of zero-trust security first surfaced five years ago in a Forrester Research report prepared for the National Institute of Standards and Technology (NIST). “The zero-trust model is simple,” Forrester posited then. “Cybersecurity professionals must stop trusting packets as if they were people. Instead, they must eliminate the idea of a trusted network (usually the internal network) and an untrusted network (external networks). In zero-trust, all network traffic is untrusted.”

Cloud adoption by its nature is forcing the issue, said Department of Homeland Security Chief Technology Officer Mike Hermus, speaking at a recent Tech + Tequila event: “It extends the data center,” he explained. “The traditional perimeter security model is not working well for us anymore. We have to work toward a model where we don’t trust something just because it’s within our boundary. We have to have strong authentication, strong access control – and strong encryption of data across the entire application life cycle.”

Indeed, as other network security features mature, identity – and the access that goes with it – is now the most common cybersecurity attack vector. Hackers favor phishing and spear-phishing attacks because they’re inexpensive and effective – and the passwords they yield are like the digital keys to an enterprise.

About 65 percent of breaches cited in Verizon’s 2017 Data Breach Investigations Report made use of stolen credentials.

Interestingly, however, identity and access management represents only a small fraction of cybersecurity investment – less than 5 percent – according to Gartner’s market analysts. Network security equipment, by contrast, constitutes more than 12 percent. Enterprises continue to invest in the Tootsie Pop model even as its weaknesses become more evident.

“The future state of commercial cloud computing makes identity and role-based access paramount,” said Rob Carey, vice president for cybersecurity and cloud solutions within the Global Solutions division at General Dynamics Information Technology (GDIT). Carey recommends creating both a framework for better understanding the value of identity management tools, and metrics to measure that impact. “Knowing who is on the network with a high degree of certainty has tremendous value.”

Tom Kemp, chief executive officer at Centrify, which provides cloud-based identity services, has a vested interest in changing that mix. Centrify, based in Sunnyvale, Calif., combines identity data with location and other information to help ensure only authorized, verified users access sensitive information.

“At the heart of zero-trust is the realization that an internal user should be treated just like an external user, because your internal network is just as polluted as your outside network,” Kemp said at the Feb. 7 Institute for Critical Infrastructure Technology (ICIT) Winter Summit. “You need to move to constant verification.” Reprising former President Ronald Reagan’s “trust but verify” mantra, he added: “Now it’s no trust and always verify. That’s the heart of zero-trust.”

The Google Experience
When Google found itself hacked in 2009, the company launched an internal project to find a better way to keep hackers out of its systems. Instead of beefing up firewalls and tightening virtual private network settings, Google’s BeyondCorp architecture dispensed with the Tootsie Pop model in which users logged in and then gained access to all manner of systems and services.

In its place, Google chose to implement a zero-trust model that challenges every user and every device on every data call – regardless of how that user accessed the internet in the first place.

While that flies in the face of conventional wisdom, Google reasoned that by tightly controlling the device and user permissions to access data, it had found a safer path.

Here’s an example of how that works when an engineer with a corporate-issued laptop wants to access an application from a public Wi-Fi connection:

  1. The laptop provides its device certificate to an access proxy.
  2. The access proxy confirms the device, then redirects to a single-sign-on (SSO) system to verify the user.
  3. The engineer provides primary and second-factor authentication credentials and, once authenticated by the SSO system, is issued a token.
  4. Now, with the device certificate to identify the device and the SSO token to identify the user, an Access Control Engine can perform a specific authorization check for every data access. The user must be confirmed to be in the engineering group; to possess a sufficient trust level; and to be using a managed device in good standing with a sufficient trust level.
  5. If all checks pass, the request is passed to an appropriate back-end system and the data access is allowed. If any of the checks fail however, the request is denied. This is repeated every time the engineer tries to access a data item.
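
The logic of that per-request check can be boiled down to a brief sketch. The Python below is illustrative only; the device and user fields, group names and trust levels are assumptions made for this example, not Google’s actual BeyondCorp schema.

```python
from dataclasses import dataclass

# Hypothetical structures; field names are illustrative, not Google's actual schema.
@dataclass
class Device:
    cert_valid: bool     # device presented a valid certificate to the access proxy
    managed: bool        # device is centrally managed and in good standing
    trust_level: int     # e.g., 0 (untrusted) .. 3 (fully trusted)

@dataclass
class User:
    sso_token_valid: bool
    groups: set
    trust_level: int

def authorize(device: Device, user: User, required_group: str, required_trust: int) -> bool:
    """Run on every data access; no check is skipped because a prior request succeeded."""
    checks = [
        device.cert_valid,
        user.sso_token_valid,
        required_group in user.groups,          # e.g., "engineering"
        device.managed,
        device.trust_level >= required_trust,
        user.trust_level >= required_trust,
    ]
    return all(checks)

# Example: an engineer on a managed laptop over public Wi-Fi
laptop = Device(cert_valid=True, managed=True, trust_level=2)
engineer = User(sso_token_valid=True, groups={"engineering"}, trust_level=2)
print(authorize(laptop, engineer, required_group="engineering", required_trust=2))  # True
```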

“That’s easy enough when those attributes are simple and clear cut, as with the notional Google engineer,” said GDIT’s Carey, who spent three decades managing defense information systems. “But it gets complicated in a hurry if you’re talking about an enterprise on the scale of the Defense Department or Intelligence community.”

Segmenting the Sprawling Enterprise
A takeaway from 9/11 was that intelligence agencies needed to be better and faster at sharing threat data across agency boundaries. Opening databases across agency divisions, however, had consequences: Chelsea Manning, at the time Pfc. Bradley Manning, delivered a treasure trove of stolen files to WikiLeaks and then a few years later, Edward Snowden stole countless intelligence documents, exposing a program designed to collect metadata from domestic phone and email records.

“The more you want to be sure each user is authorized to see and access only the specific data they have a ‘need-to-know,’ the more granular the identity and access management schema need to be,” Carey said. “Implementing role-based access is complicated because you’ve got to develop ways to both tag data and code users based on their authorized need. Absent a management schema, that can quickly become difficult to manage for all but the smallest applications.”

Consider a scenario of a deployed military command working in a multinational coalition with multiple intelligence agencies represented in the command’s intelligence cell. The unit commands air and ground units from all military services, as well as civilians from defense, intelligence and possibly other agencies. Factors determining individual access to data might include the person’s job, rank, nationality, location and security clearance. Some missions might add geographic location as a factor, but others can’t rely on it because some members of the task force are located thousands of miles away or operating from covert locations.

That scenario gets even more complicated in a hybrid cloud environment where some systems are located on premises and others are far away. Managing identity-based access gets harder anyplace where distance or bandwidth limitations cause delays. Other integration challenges include implementing a single-sign-on solution across multiple clouds, or sharing data by means of an API.

Roles and Attributes
To organize access across an enterprise – whether in a small agency or a vast multi-agency system such as the Intelligence Community Information Technology Enterprise (IC ITE) – information managers must make choices. Access controls can be based on individual roles – such as job level, function and organization – or data attributes – such as type, source, classification level and so on.

“Ultimately, these are two sides of the same coin,” Carey said. “The real challenge is the mechanics of developing the necessary schema to a level of granularity that you can manage, and then building the appropriate tools to implement it.”

For example, the Defense Department intends to use role-based access controls for its Joint Information Enterprise (JIE), using the central Defense Manpower Data Center (DMDC) personnel database to connect names with jobs. The available fields in that database are, in effect, the limiting factors on just how granular role-based access controls will be under JIE.
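
A simplified sketch shows how role-based access is bounded by the personnel data behind it. The Python below is notional; the record fields, role names and rules are hypothetical, not the actual DMDC schema or the JIE design.

```python
# Roles can only be as granular as the fields available in the authoritative database.
# Field and role names here are invented for illustration.
PERSONNEL_RECORD = {
    "name": "J. Smith",
    "service": "Army",
    "grade": "O-4",
    "occupational_specialty": "35D",   # notional intelligence-officer code
    "organization": "JTF-X J2",
    "clearance": "TS/SCI",
}

ROLE_RULES = {
    "intel_analyst": lambda r: r["occupational_specialty"].startswith("35"),
    "field_grade_officer": lambda r: r["grade"] in {"O-4", "O-5", "O-6"},
    "ts_sci_cleared": lambda r: r["clearance"] == "TS/SCI",
}

def derive_roles(record: dict) -> set:
    """Map a personnel record to the roles its fields can support."""
    return {role for role, rule in ROLE_RULES.items() if rule(record)}

def can_access(record: dict, required_roles: set) -> bool:
    return required_roles.issubset(derive_roles(record))

print(derive_roles(PERSONNEL_RECORD))
print(can_access(PERSONNEL_RECORD, {"intel_analyst", "ts_sci_cleared"}))  # True
```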

Access controls will only be one piece of JIE’s enterprise security architecture. Other features, ranging from encryption to procedural controls that touch everything from the supply chain to system security settings, will also contribute to overall security.

Skeptical of Everything
Trust – or the lack of it – plays out in each of these areas, and requires healthy skepticism at every step. Rod Turk, CISO at the Department of Commerce, said CISOs need to be skeptical of everything. “I’m talking about personnel, I’m talking about relationships with your services providers,” he told the ATARC CISO Summit.  “We look at the companies we do business with and we look at devices, and we run them through the supply chain.  And I will tell you, we have found things that made my hair curl.”

Commerce’s big push right now is the Decennial Census, which will collect volumes of personal information (PI) and personally identifiable information (PII) on almost every living person in the United States. Conducting a census every decade is like doing a major system reset each time. The next census will be no different, employing mobile devices for census takers and, for the first time, allowing individuals to fill out census surveys online. Skepticism is essential because the accuracy of the data depends on the public’s trust in the census.

In a sense, that’s the riddle of the whole zero-trust concept: In order to achieve a highly trusted outcome, CISOs have to start with no trust at all.

Yet trust also cuts in the other direction. Today’s emphasis on modernization and migration to the cloud means agencies face tough choices. “Do we in the federal government trust industry to have our best interests in mind to keep our data in the cloud secure?” Turk asked rhetorically.

In theory, the Federal Risk and Authorization Management Program (FedRAMP) sets baseline requirements for establishing trust, but doubts persist. What satisfies one agency’s requirements may not satisfy another. Compliance with FedRAMP or NIST controls equates to risk management rather than actual security, GDIT’s Carey points out. They’re not the same thing.

Identity and Security
Beau Houser, CISO at the Small Business Administration, is more optimistic, encouraged by improvements he’s seen as compartmentalized legacy IT systems are replaced with centralized, enterprise solutions in a Microsoft cloud.

“As we move to cloud, as we roll out Windows 10, Office 365 and Azure, we’re getting all this rich visibility of everything that’s happening in the environment,” he said. “We can now see all logins on every web app, whether that’s email or OneDrive or what have you, right on the dashboard. And part of that view is what’s happening over that session: What are they doing with email, where are they moving files.… That’s visibility we didn’t have before.”

Leveraging that visibility effectively extends that notion of zero-trust one step further, or at least shifts it into the realm of a watchful parent rather than one who blindly trusts his teenage children. The watchful parent believes trust is not a right, but an earned privilege.

“Increased visibility means agencies can add behavioral models to their security controls,” Carey said. “Behavioral analysis tools that can match behavior to what people’s roles are supposed to be, and trigger warnings if people deviate from expected norms, are the next big hurdle in security.”
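
Conceptually, such tools compare observed activity against a role’s expected baseline and flag deviations. The short Python sketch below illustrates the idea; the baselines, event fields and thresholds are invented for illustration and do not reflect any specific product.

```python
# Notional role baselines; real tools would learn these from historical activity.
ROLE_BASELINES = {
    "hr_specialist": {"allowed_apps": {"email", "hr_portal"}, "max_downloads_per_day": 20},
    "sysadmin": {"allowed_apps": {"email", "admin_console"}, "max_downloads_per_day": 100},
}

def flag_anomalies(role: str, events: list) -> list:
    """Return warnings for activity that deviates from the role's expected norms."""
    baseline = ROLE_BASELINES[role]
    warnings = []
    downloads = sum(1 for e in events if e["action"] == "download")
    if downloads > baseline["max_downloads_per_day"]:
        warnings.append(f"download volume {downloads} exceeds baseline")
    for e in events:
        if e["app"] not in baseline["allowed_apps"]:
            warnings.append(f"unexpected app for role: {e['app']}")
    return warnings

events = [{"app": "email", "action": "read"},
          {"app": "source_code_repo", "action": "download"}]
print(flag_anomalies("hr_specialist", events))  # flags the unexpected application
```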

As Christopher Wlaschin, CISO at the Department of Health and Human Services, says: “A healthy distrust is a good thing.”

Unpleasant Design Could Encourage Better Cyber Hygiene

Recent revelations that service members and intelligence professionals are inadvertently giving up their locations and fitness patterns via mobile apps caught federal agencies by surprise.

The surprise wasn’t that Fitbits, smartphones or workout apps try to collect information, nor that some users ignore policies reminding them to watch their privacy and location settings. The real surprise is that many IT policies aren’t doing more to help stop such inadvertent fitness data leaks.

If even fitness-conscious military and intelligence personnel are unknowingly trading security and privacy for convenience, how can IT security managers increase security awareness and compliance?

One answer: Unpleasant design.

Unpleasant design is a proven technique for using design to discourage unwanted behavior. Ever get stuck in an airport and long for a place to lie down — only to find every bench or row of seats is fitted with armrests? That’s no accident. Airports and train terminals don’t want people sleeping across benches. Or consider the decorative metalwork sometimes placed on urban windowsills or planter walls — designed expressly to keep loiterers from sitting down. It’s the same with harsh lights in suburban parking lots, which discourage people from hanging out and make it harder for criminals to lurk in the shadows.

As federal agencies and their IT security leaders investigate these inadvertent disclosures, can they employ those same concepts to encourage better cyber behavior?

Here’s how unpleasant design might apply to federally furnished Wi-Fi networks: Rather than allow access with only a password, users instead might be required to have their Internet of Things (IoT) devices pass a security screening that requires certain security settings. That screening could include ensuring location services are disabled while such devices are connected to government-provided networks.

Employees would then have to choose between the convenience of free Wi-Fi for personal devices and the risks of inadequate operations security (OPSEC) via insecure device settings.

This of course, only works where users have access to such networks. At facilities where personal devices must be deposited in lockers or left in cars, it won’t make a difference. But for users working (and living) on installations where personnel routinely access Wi-Fi networks, this could be highly effective.

Screening – and even blocking – certain apps or domains could be managed through a cloud access security broker, network security management software that can enforce locally set rules governing apps actively using location data or posing other security risks. Network managers could whitelist acceptable apps and settings, while blocking those deemed unacceptable. If agencies already do that for their wired networks, why not for wireless?
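
A simple sketch illustrates how such a screening gate might work. The Python below is notional; the required settings, blocked apps and policy values are assumptions made for this example, not any particular cloud access security broker’s interface.

```python
# Notional Wi-Fi onboarding policy: settings and app names are illustrative.
REQUIRED_SETTINGS = {
    "location_services": False,   # must be disabled while on the government network
    "os_patched": True,
    "screen_lock": True,
}

BLOCKED_APPS = {"fitness_tracker_x", "photo_geo_share"}

def screen_device(reported_settings: dict, installed_apps: set):
    """Return (allowed, reasons) for a device requesting Wi-Fi access."""
    failures = []
    for setting, required in REQUIRED_SETTINGS.items():
        if reported_settings.get(setting) != required:
            failures.append(f"{setting} must be {required}")
    for app in installed_apps & BLOCKED_APPS:
        failures.append(f"blocked app installed: {app}")
    return (len(failures) == 0, failures)

ok, reasons = screen_device(
    {"location_services": True, "os_patched": True, "screen_lock": True},
    {"email", "fitness_tracker_x"},
)
print(ok)       # False: access denied until settings and apps are fixed
print(reasons)
```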

Inconvenient? Absolutely. That’s the point.

IT security staffs are constantly navigating the optimal balance between security and convenience. Perfect security is achievable only when nothing is connected to anything else. Each new connection and additional convenience introduces another chink in the network’s armor.

Employing cloud-access security as a condition of Wi-Fi network access will impinge on some conveniences. In most cases, truly determined users can work around those rules by using local cellular data access instead. In much of the world, however, including the places where the need for OPSEC is greatest, that cellular access comes with a direct cash cost. When users pay for data by the megabyte, they’re much more likely to give up some convenience, check security and privacy settings, and limit their data consumption.

This too, is unpleasant design at work. Cellular network owners must balance network capacity with use. Lower-capacity networks control demand by raising prices, knowing that higher priced data discourages unbridled consumption.

Training and awareness will always be the most important factors in securing privacy and location data, because few users are willing to wade through pages-long user agreements to discover what’s hidden in the fine print and legalese they contain. More plain language and simpler settings for opting in or out of certain kinds of data sharing are needed – and app makers must recognize that failing to heed such requirements only increases the risk that government steps in with new regulations.

But training and awareness only go so far. People still click on bad links, which is why some federal agencies automatically disable them. It makes users take a closer, harder look and think twice before clicking. That too, is unpleasant design.

So is requiring users to wear a badge that doubles as a computer access card (as is the case with the Pentagon’s Common Access Card and most Personal Identity Verification cards). Yet, knowing that some will inevitably leave the cards in their computers, such systems automatically log off after only a few minutes of inactivity. It’s inconvenient, but more secure.

We know this much: Human nature is such that people will take the path of least resistance. If that means accepting security settings that aren’t safe, that’s what’s going to happen. Interrupting that convenience and turning it on its head by means of Wi-Fi security won’t stop everyone. But it might have prevented Australian undergraduate Nathan Ruser – and who knows who else – from identifying the regular jogging routes of military members (among other examples) from Strava’s published heat map and the 13 trillion GPS points collected from its users.

“If soldiers use the app like normal people do,” Ruser tweeted Jan. 27, “it could be especially dangerous … I shouldn’t be able to establish any pattern of life info from this far away.”

Exactly.

Common Desktop, the Foundation for IC ITE, Expands Footprint

The intelligence community’s common desktop system, which now serves more than 50,000 users, will spread to new users this fall and early next year.

The Common Desktop Environment (DTE) covers nearly all of the Defense Intelligence Agency (DIA) and the National Geospatial-Intelligence Agency (NGA). A joint program office run by the two agencies will put a new architecture to the test this fall. Once it is accredited, the joint office will begin rolling out the Phase 2 system in January with a limited number of users, before launching a full-scale rollout next April.

DTE was developed as NGA moved to its new headquarters in 2011 and soon after expanded to DIA. Now a Joint Program Office runs the program, led by Kendrea DeLauter, a former analyst who heads the effort to create a single shared desktop computer system for the entire national intelligence apparatus.

The Face of IC ITE
DTE is one of nine core building blocks making up the Intelligence Community Information Technology Enterprise (IC ITE). It may be the most important – the single shared interface that everyone in the community will experience on a day-to-day basis.

“DTE is the user-facing component of IC ITE,” DeLauter told GovTechWorks in a modern, glassed-in conference room in DIA’s glistening headquarters at Joint Base Anacostia-Bolling, the walled mini-city across the river from Reagan National Airport in Washington, D.C. “Everybody works off a desktop or laptop that allows them to get access to their data. It basically affects everybody who works with a desktop or laptop computer.”

This makes DTE not just user facing, but the literal face of IC ITE to the community. Everything else is accessed through the desktop, effectively making those other eight components invisible to users:

  • Commercial cloud services
  • Government cloud services
  • Enterprise Management (EMT), comprising the help desk and system management functions
  • Identification, authorization and authentication (IAA) for access control
  • Networks
  • Integrated Transport Service
  • Security Coordination
  • Application Mall

IC ITE, like the Defense Department’s Joint Information Environment, is not a program of record by itself. Rather, it’s the umbrella that unifies a series of IC-wide programs into a single vision. “It’s all of the agencies getting together and bringing our tools to a better standard, but doing it within our programs,” DeLauter says. “That will be an incredible test of integration across the intelligence community.”

Much of it rests on her shoulders.

Evolution of DTE
DTE came about almost as an accident. As NGA was moving into its new Springfield, Va., offices in September 2011, agency leaders sought a way to reduce the power consumption needed for every workstation and to accommodate the “green” building design. NGA’s IT support contractor, General Dynamics Information Technology, came up with a state-of-the-art design for a virtual desktop infrastructure (VDI). With approval of this design, NGA could reduce power requirements, network cabling and systems support requirements, and enhance security at the same time. Patching could be done centrally, resources could be shared and users could be more mobile. Instead of being tied to a cubicle, they could easily move from work area to work area as needed, with email, phone number and data available simply by logging into a terminal anywhere in the building.

The effort was a runaway success, and soon DIA joined in, followed by intelligence teams at the joint combatant commands and Coast Guard Intelligence. To date, more than 72,000 users have been credentialed, although the actual number of current users is closer to 50,000, as retirements, transfers and normal attrition account for thousands of departures each year.

Eventually, DTE will boast some 250,000 users and possibly more. It will serve 17 intelligence agencies, plus some organizations “on the fringes,” DeLauter says, adding that the Office of the Secretary of Defense “has some interest in the system.”

Almost all of NGA and DIA are using the system, with a few exceptions. Additional users are added regularly, sometimes hundreds at a time.

The transition was smooth, she said, thanks to extensive advance planning and outreach by the government-contractor team: Emails, town halls, helpful tips and instructions all contributed to making the move easy on users, whose greatest worries revolved around ensuring access to favorite applications.

“You need to make sure you have a good customer experience the first time or you don’t get someone who’s going to be an advocate for you,” DeLauter says. “We put a heavy emphasis on the customer experience.”

Surveys showed 90 percent satisfaction rates for those who took advantage of the advance information, though the rate was lower for the inevitable portion who were too busy, distracted or fearful of change to prepare themselves for the move.

The biggest beef? Some applications weren’t ready in time. But with some 400 applications across the two organizations, that was inevitable. To minimize stress, the system was rolled out gradually, beginning with standard users – those performing basic computing functions and moving gradually up the scale to more sophisticated, high-demand users.

“We did not migrate the analysts first, because they tend to have the more complicated requirements,” DeLauter said.

Analysts are the power users, with more demanding applications and requirements. For some, those applications are accessed via a browser; for others, they are incorporated into the system image. For the most demanding instances, users were put on a thick client rather than a thin client.

Phase Two
Now DTE is entering its next stage. Phase 2 will deploy DTE to at least 10 more agencies over the next few years, using a new architecture and a new contractor. Phase 1’s 50,000-plus users will have to be migrated to the new platform as well, before they are fully compatible with other users. Phase 2 uses an all-Microsoft software stack, including Microsoft Hyper-V in place of VMware’s ESXi for virtualization and Skype for Business in place of the Cisco Unified Communications suite, the voice application used in Phase 1.

But Phase 2 has been slow out of the starting blocks. Starting from scratch, the follow-on program has been held up by negotiations over software licensing and supply chain delays, DeLauter says. While Phase 1 moved from contract to initial deployment in just six months, Phase 2 has taken about three times that long, and is not yet accredited. DeLauter says that should happen this fall, with the first few pilot users getting on the system in December and January, before a broader rollout in April.

To achieve that schedule, the Joint Program Office and Phase 2 contractor BAE want participating agencies to sign task order agreements now, even before the system is proven and accredited. The security accreditation process for such Top Secret systems can take 90 days or longer, depending on whether significant deficiencies are identified.

So the planned January roll-out depends on no major problems emerging this fall.

“We’ve been testing the data centers and getting the data centers ready, but not all the equipment is installed yet,” DeLauter said in June. Setting up an acceptable supply chain “for something that was going to be a community system” took longer than anticipated, as did extensive license negotiations aimed at reducing the number of licenses needed by accounting for users who move from one agency to another.

DeLauter says the testing will prove whether the approach was right or not, but that the technology decisions point to a more efficient and more secure system, including an integrated security enclave to support controlled access for multiple international partners with different levels of access.

The new system uses attribute-based access controls (ABAC), with enforcement extending to email, SharePoint and directory files. “That’s a little new for the community and it will take some getting used to,” DeLauter said.

Security controls can provide short-term access for system administrators limited to a need-to-know requirement, a response to the Edward Snowden leaks in which he abused administrative privileges to download thousands of documents that were later published by journalists. For administrators, it means having limited time and access privileges to do system maintenance, or risk being kicked offline mid-job.
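
Conceptually, that kind of time-boxed, need-to-know administrative access comes down to a small check performed on every action. The Python sketch below is illustrative only; the scope names and time limits are assumptions, not the actual DTE Phase 2 design.

```python
from datetime import datetime, timedelta, timezone

class AdminGrant:
    """A notional, time-boxed grant of administrative privilege for one task."""

    def __init__(self, admin_id: str, scope: str, minutes: int):
        self.admin_id = admin_id
        self.scope = scope                      # e.g., "patch:mail-servers"
        self.expires = datetime.now(timezone.utc) + timedelta(minutes=minutes)

    def permits(self, admin_id: str, action: str) -> bool:
        # Both the scope and the clock are checked on every action; an expired
        # grant forces the administrator to re-request access mid-job.
        return (
            admin_id == self.admin_id
            and action == self.scope
            and datetime.now(timezone.utc) < self.expires
        )

grant = AdminGrant("admin42", "patch:mail-servers", minutes=60)
print(grant.permits("admin42", "patch:mail-servers"))   # True within the window
print(grant.permits("admin42", "export:mailboxes"))     # False: outside need-to-know scope
```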

Other features new to the Phase 2 system include:

  • An integrated NARA-compliant records management capability
  • A desktop-as-a-service board that lets users see what services or software are available and what the impact on their organizations would be if they add or expand services; this feature is to be added later in the Phase 2 schedule
  • Support for non-US partners

Long term, Phase 2 is intended to provide similar services to other security domains, including the Secret Internet Protocol Router Network, or SIPRNet, and its non-secure cousin, NIPRNet. That requirement is built into Phase 2, but won’t be activated until after the system is proven in the closed arena, DeLauter says.

Will Phase 2 go as smoothly as the initial rollout? DeLauter is cautiously optimistic: “We hope it is as exquisite an experience as it was the first time.”

Windows 10
DTE probably won’t meet the Defense Department mandate that systems upgrade to Windows 10 by February 2017. But it could. One of the principal advantages to a virtualized system like this is the ease with which upgrades can be managed globally across the platform. But while the Phase 1 users at DIA and NGA could be upgraded today, DeLauter says her office is deferring to the agencies on when they want to make the switch.

“We’ve gone back to DIA and NGA and said, ‘Do you want to go to Windows 10?’ We are prepared to put Windows 10 in,” she says, “but DIA and NGA need to tell us when they want that. They shouldn’t do it separately; they need to do it together. The Joint Program Management Office is ready to provide those services when the agencies are ready to receive it.”

An upcoming new release for Phase 1, scheduled for the August/September period, will not include Windows 10, but the next release after that will, she said. “We are still working the delivery of Windows 10 in Phase 1,” DeLauter said. NGA and DIA must work through differences on timing before they can schedule the upgrade, she added, and meetings to broker a compromise will be held this fall.

Phase 2 is also not ready to push ahead on Windows 10. The system will launch with Windows 7 and won’t upgrade to Windows 10 until well after the initial release.

Faster, Better Intelligence
While DTE will be common to every agency, it won’t necessarily be identical. Each agency will have the ability to customize access to applications, email limits and more, making the next steps in the process daunting. No doubt, as DeLauter says, “It was easier for two agencies than it will be for 17.”

She has helped establish a cross-IC working group that invites agency leaders in ahead of time so they can examine the DTE baseline and requirements, and determine what additions they may need – and what those will cost. The goal is to demystify the process and system, to give every agency a chance at a little bit of ownership in the whole.

“Transparency has been a big part of how we’ve been running the desktop,” DeLauter says. “You let people come into your design reviews. You have forums to share. Transparency is part of integration across the community.”

As each additional agency joins DTE, it will represent an additional task order, with its own particulars to be worked out and its own cost to be borne by the agency. Changes from the baseline will vary from email and storage limits to the number of applications included in the standard image.

“We think it should be a thin image, but we also have agreed to expand the number as we move along,” DeLauter says, indicating the kind of give and take necessary to get so many independent agencies to agree to give up some autonomy for the benefits of a shared system.

In exchange for giving up some independence, agencies are expected to gain security and cost savings. Gone will be large on-site staffs needed to do manual system patches and other localized support. “We patch the gold image and then everyone is protected,” DeLauter says.

Applications will be common, so everyone will be on the same version of Word, PowerPoint or Excel, eliminating formatting problems that waste time and effort. Mobility – in this context the ability to log into the system anywhere and access your files, rather than from a mobile computing device – will enhance collaboration.

“If we eliminate infrastructure as an obstacle, analysts can now focus on content,” DeLauter says.

More importantly, what DTE allows are new ways of doing business. It’s not just that you’re getting a new computer, DeLauter says. This is an opportunity to develop new workflows, new ways to share and interact, and new ways to locate others working on related issues. The goal is faster, better intelligence, not just safer, less costly computing.

“We’re giving you some better tools,” DeLauter says. “So now: Can you come up with a better way to go through that data?”

‘IT is Mission’: How Data Is Revolutionizing Intelligence

Intelligence agencies must stop viewing information technology as a back-office support service and instead elevate it to its rightful place as a mission-critical capability, argues Sean Roche, associate deputy director of digital innovation for the CIA.

“Stop treating IT like a service. Stop treating IT with the word ‘customer.’ Stop treating IT like it’s part of the admin portion of your organization,” he told the Department of Defense Intelligence Information Systems (DoDIIS) worldwide conference Aug. 2. “IT is mission.”

The distinction “changes the way we go about funding and prioritizing programs,” he said. It also has significant implications for the kinds of skills and talent intelligence agencies will need in the future, and for how systems are built, managed and designed.

For much of the past two decades, networks were the critical assets in intelligence – the ability to interact and quickly communicate across secure networks helped deliver data to the tactical edge more and more quickly. The time lag between intelligence collection and delivery shrank from days to hours, minutes, even seconds. But in a data-centric world, intelligence is crossing into new territory. It is increasingly possible to predict likely outcomes, allowing national security leaders to make better decisions more quickly.

Open-source data analysis is one of the valuable data streams helping in that process, flooding the IC with new insights that must be tagged, collated, correlated, verified and analyzed. The volumes exceed human capacity, requiring intelligent systems, informed by data and able to learn as they go, to sort through all the signals to identify patterns that can point to truths.

Streaming social media analytics has become as valuable today as “the information we get clandestinely,” Roche said. The old notion that only information stamped “top secret” is of real value in intelligence has long since expired, he said. “Mining that rich vein of open source data that’s increasing rapidly in real time, doing sentiment analysis on it, is going to be more and more important,” he said.

IT Grows in Stature
Time and again, DoDIIS speakers returned to this theme: IT as not just an enabler, but critical to mission success. The IT enterprise is not just a network, but the underpinning of the IC’s future contributions to national security.

Under Secretary of Defense for Intelligence Marcel Lettre acknowledged that the Pentagon “no longer controls the pace of change, especially in IT,” but emphasized that “technology is the secret sauce in the Third Offset strategy.” The Third Offset is Defense Secretary Ashton Carter’s long-term strategic vision to ensure the United States maintains technological superiority through sustained investment in emerging technologies like data science and machine intelligence.

Providing the underlying framework to enable that technological superiority is the primary reason Director of National Intelligence James Clapper is still on the job, he told a packed DoDIIS auditorium. Ensuring that IC ITE, the Intelligence Community Information Technology Enterprise, was sufficiently mature that it could not be reversed is not so much about providing infrastructure as it is about enhancing mission effectiveness across the IC.

“Data is a community property,” he said. “Integration simply means bringing the best and most appropriate resources from across the community together and applying them to our most challenging intelligence problems.”

And Marine Lt. Gen. Vince Stewart, director of the Defense Intelligence Agency (DIA), said the “decisive advantage over our adversaries in the future” will not be kinetic weapons or ground maneuver skills, but rather “this cognitive fight that matters most.”

Cross-Disciplinary Teams
Data and the ability to rapidly extract meaning from it is the means by which leaders expect to gain that decisive advantage, but achieving that nirvana will take more than technical acumen. Innovators across the IC are also looking at new organizational models and the kinds of integrated skill sets analysts and technologists will need to bring to the IC in the future.

“We need people who actually understand the math and can help validate and do things,” said Michael McCabe, chief of applied research in DIA’s Chief Technology Office. “We need programmers and data experts who can move data, groom it, secure it. And you need users who actually understand what [the technology is] doing. And it’s more than analysis. It’s decision making, too.”

Cross-disciplinary teams are looking at what future skill sets DIA will need, what a data science career field and career path will look like, and how the technical skills of IT specialists and the analytic skills of analysts will begin to merge over time, he said.

“The technological skill set new analysts will need is significantly more than in the 1970s,” McCabe continued. “We’re not really at the full answer yet, but where we’re trending is this: IT services will be IT services and analysts will be analysts, but somewhere in the middle, you’re going to see IT reach out to the analyst and the analyst reach out to IT. You’ll have analysts who can code and coders who know how to think like an analyst. And I think we’re going to hit probably four different user groups on that spectrum.”

Already, he said, this is happening. Where the breakthroughs are taking place, it’s because the walls between analysts and coders are coming down, and the collaboration is increasing.

“Analyst cells, divisions and branches will include teams with that full spectrum of skills,” McCabe said. DIA has assembled multidisciplinary innovation pods – analysts, programmers and human resources specialists – to help conceive of the skill sets and training and career paths future analysts will need. The agency is also “trying to do experiments to inform what new data sets should look like in the future,” he said, a process that likewise demands both “technological and also analysis skills.”

Across the agency, he continued, “we’ve got a bunch of users who want to get better at analytics, want to get better at decision making, tracking and monitoring, statistics and metrics.”

Roche cited the same phenomenon at CIA: “What we’re finding is most successful is the data analyst sitting with a programmer, sitting with a mission specialist, and often with an operator.”

Indeed, this pattern is repeating itself wherever large data sets and the desire to unlock their secrets have emerged, said Stan Tyliszczak, chief engineer at General Dynamics Information Technology. “We’ve seen it in healthcare, medical research and public safety, not just Intelligence or DoD: Data analytics is a mission function, not an IT support function,” he says. “Sure, IT managers can help set up a support function or acquire a software tool. But big data solutions come out of the mission side of an organization. Increasingly, we’re seeing the mission specialists becoming IT savvy, and the IT staff bringing a new perspective to mission analysis.”

For example, the National Institutes of Health (NIH) found it’s easier to teach medical researchers how to program than it is to teach programmers the necessary medical knowledge to extract information from a data pool. “If you look at NIH’s Health IT model, they have these cells and the researchers become IT specialists, and the people who have IT knowledge gradually learn more about health.”

So just as agencies are discovering they can extract new knowledge by aligning disparate data sources, they’re also discovering the best way to do it is by assembling people with disparate skill sets into teams and then setting them loose on these complex data problems.

The long-term implications are profound. Just a few years ago, IT was part of the CIA’s administrative directorate, Roche said. IT was a support service, just like human resources and accounting departments. “Today, we realize it is inextricably linked to our ability to respond.”

IT Chiefs Worry Aging Data Will Grow ‘Toxic’

Global data storage needs are growing at 20 to 25 percent annually, and Cisco Systems forecasts that by 2019, the world will add another 10.4 trillion gigabytes of new data every year.

For government information technology and security chiefs, that represents a challenge: Rising data storage needs drive up costs and increase risk.

“We have to measure the toxicity of data over time,” said David Tillman, the Navy’s cyber security director, at the FedScoop Security Through Innovation Summit, April 14. “The longer we retain it, the more the potential threat. As storage is so cheap, we save and save and it becomes toxic from a threat perspective.”

Declining storage costs appeal to our worst instincts, Tillman said. “We’re natural hoarders.”

It takes discipline to beat back such instincts. Ken Bible, deputy chief information officer for the Marine Corps, said the issue boils down to organizational discipline.

“I don’t know that the data itself becomes toxic,” Bible said at the AFCEA Defensive Cyber Operations Symposium late last month. “The data is data. It’s how it’s used. It’s how we handle it. If we put the right rules in place for how we’re going to use analytics or handle the records going back however many years, then it can be managed. This requires some systems engineering.”

Not all data is worth saving, he argues. The Freedom of Information Act (FOIA) dictates many documents must be saved, including email. But other data is open to interpretation, either because it is exempt from FOIA, or because it is simply not addressed.

Though the law requires the Marine Corps to archive email traffic from general officers and Senior Executive Service members, for example, it doesn’t address email records for every private or lance corporal. Do those also need to be saved?

Just as important as which data to save – and for how long – is determining the best way to store it, both for cost and security.

“There are technological solutions for which data to save and for how long per governing policy,” said Dan Gahagan, vice president, enterprise capabilities in General Dynamics Information Technology’s Intelligence Solutions Division. “Data tagging, for example, can allow you to target data for either long-term retention or short term destruction. The challenge is in getting the data tagged appropriately, and then implementing the right technologies to handle the tagged data. And that doesn’t even begin to address how we tag all the existing data that’s already out there.”

The rules of the road for what data should be saved are far from final. The Air Force collects so much surveillance video, for example, that it can’t view it all, let alone save it. After three months, most drone surveillance video is destroyed.

Lt. Gen. Bill Bender, the Air Force chief information officer, said the service is 10 years behind on these issues. “We don’t have a detailed data strategy” – yet, he said. But he acknowledges that effort is underway. The service’s Information Dominance Flight Path, its IT policy-guiding document, notes under data management that “the Air Force will develop a roadmap … for implementing DoDI 8320.02 data standards by 4Q FY15.” That work is still in progress, however.

From a security perspective, the more data that’s saved, the greater the risk not only that the data will be compromised, but that, if it is, an enemy can piece together disparate pieces of data to develop a more comprehensive understanding of U.S. personnel, organizations, or even tactics and techniques.

Essye Miller, the Army’s director of cyber security, said users’ natural bent toward keeping information easily accessible fails to take into account how powerful analytics and simple aggregation can be. “We have a tendency to make as much info as we can available with no thought to the aggregation of that information by the enemy and without regard to analytics,” she said. “How long do we keep data? Historically, we have kept information too long.”

Human behavior and a lack of discipline are part of the problem. Constant personnel turnover and copied files misplaced in the wrong folders or drives can take a toll. The Marine Corps’ Cyber Security Division Chief Ray Letteer said he routinely sends cyber white teams out to comb through networks, seeing what’s been left and forgotten. “They find personal data, fitness reports, resumes, et cetera, that goes back in some cases to the 1980s. There’s got to be an expiration date on some of this stuff.”

New Data Sources
Data come in many forms. Surveillance photographs and video, signals intelligence, personnel records, correspondence: Rules must be established for each. Cyber analytic data represent another new trove of mineable information, Bible said. Cyber screening tools capture all the traffic coming in and out of government networks. Does all that data need to be saved permanently? How long is it of value?

Left to decide for themselves, Bible said, intelligence analysts would throw nothing away. “They’re very interested in anything they can get their hands on and would probably say, ‘Save everything,’” Bible said. “But I think we do have to figure out the right answer with respect to how long we keep certain data – and from a practical perspective, where we keep it.

“We’ve got to find less expensive ways to keep things we need to access once every other year, versus the things we need to have at our fingertips all the time,” Bible said. “That requires some systems engineering and some effort to figure that out and get that properly architected.”

Tiered data storage can be automated based on policy, an approach that helps take human behavior out of the equation. For example, files can either be automatically archived after 30 days without being touched, or deleted after a year.
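
That kind of policy can be expressed in a few lines. The Python sketch below is illustrative; the 30-day and one-year thresholds mirror the example above, while the tier names and decision logic are assumptions rather than any specific product’s behavior.

```python
from datetime import datetime, timedelta

# Notional policy thresholds taken from the example above.
ARCHIVE_AFTER = timedelta(days=30)   # idle files move to cheaper storage
DELETE_AFTER = timedelta(days=365)   # old files are destroyed per policy

def lifecycle_action(last_accessed: datetime, created: datetime, now: datetime) -> str:
    """Decide what a policy engine would do with a file, with no human in the loop."""
    if now - created > DELETE_AFTER:
        return "delete"
    if now - last_accessed > ARCHIVE_AFTER:
        return "archive"        # move to a cheaper, slower storage tier
    return "keep"               # stays on the fast, expensive tier

now = datetime(2016, 6, 1)
print(lifecycle_action(last_accessed=datetime(2016, 4, 1), created=datetime(2016, 1, 15), now=now))  # archive
print(lifecycle_action(last_accessed=datetime(2016, 5, 25), created=datetime(2015, 5, 1), now=now))  # delete
```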

Rob Foster, the Navy’s chief information officer, said the challenge is determining the rules of the road for what gets saved and what doesn’t. Rules change from agency to agency and even individual to individual. Setting an institutional approach is difficult, but attractive.

“Everybody’s got different rules and regulations,” Foster said. “I think people’s default is, if you let the individual determine what is an official record, you have a significant challenge when it comes to training. And if you retain everything to mitigate that, you have a significant challenge [with risk].”

Centralizing control can help. Enterprise systems management can reduce – if not erase – policy differences across an organization. Each of the services, as well as the Defense Department overall, are moving in that direction now, reducing the number of application and disparate user policies that have spread across the department over time.

“In the IT space particularly we’re coming to a consensus view of the critical requirement to manage the IT at an enterprise level,” Bender said. “In doing so, similar to what industry learned some time ago, we’re finding that centralized management ends up closing all of the intermediate gaps in terms of setting policies that represent a consensus view throughout the organization.”

Establishing those policies and standards – and communicating them broadly – is critical, Bible said. Service members and civilian employees trust the department to protect their data and to make good decisions about where the data goes and how it’s used. “If we handle data responsibly,” he said, “if we put the right rules in place, that’s the key to keeping data from becoming toxic.”

In Age of Cloud, ITSM Still Matters

Even if everything is up in the air – or the cloud – the business of managing information systems doesn’t change: Routine maintenance and major upgrades must be handled in a disciplined and comprehensive fashion and metrics must be employed to evaluate system performance and customer satisfaction.

The discipline of IT service management (ITSM) applies no less when an organization outsources its information technology than when it manages it on its own. ITSM helps managers better understand user requirements for security, capacity and system availability, and build solutions to satisfy those needs.

The U.S. Air Force is a convert.

Even as it prepares for a major shift to outsource as much of its IT services as possible, the Air Force is rolling forward with a major push to inculcate ITSM across the enterprise. Col. Paul Young, chief of the U.S. Air Force’s Joint Information Environment Integration, said the Air Force’s transition to an IT service model will provide two big benefits:

  • “One is it really opens the aperture on who we can use to provide these services,” Young said. “We don’t have to do it internally. We can explain it in a way that everybody understands and we can broaden their horizons in terms of who you can go to for service provisioning.”
  • The second “is that we get standard service delivery no matter who provides the service.” By establishing metrics and requirements up front, the Air Force can establish its own standard model and make those standards a requirement of any contract.

For the Air Force, the cloud and outsourcing co-exist with ITSM, which in various forms has become a standard for many commercial organizations. One common ITSM approach is the IT Infrastructure Library (ITIL), a registered Axelos trademark, which was developed in the United Kingdom as an outgrowth of government ITSM implementations. Other ITSM frameworks include ISO 20000, FitSM, COBIT and the Microsoft Operations Framework.

ITIL is a framework of processes, approaches and capabilities used by many contractors as standard operating procedures, Young said. Because ITIL allows users to choose which elements to use in their own organizations in a descriptive, rather than prescriptive, way, it helps ensure the Air Force and its service suppliers are speaking the same language.

Standard metrics also help, ensuring provider and customer are on the same page when evaluating service performance.

The Air Force is less than a year into its enterprise-wide rollout of ITSM, with five internal groups substantially involved or touched by the implementation: SAF/CIO A6, Air Force Space Command, Air Force Network Integration Center, Air Force Life Cycle Management Center and the 24th Air Force. Officials said a sixth organization, the Air Force Installation and Mission Support Center, established only a year ago, will also be involved.

The plan is to apply the ITSM framework to new initiatives first, then as the concepts are proven, incorporate them into legacy system management, as well, Young said. As with many challenges today, the implementation is more of a cultural change than a technical one, because it represents a new way to do things.

The approach boils down to splitting responsibility along clear lines: “We want to make the decision about which services are most important and how they underpin the Air Force’s core mission,” Young said. “Let the provider make the technical decisions about how to meet our service requirement.”

Going to School

Ohio State University adopted ITSM and ITIL years ago for a system-wide approach to supporting its 120,000 students, faculty and staff 24 hours a day, seven days a week. The university’s experience was outlined recently in an Axelos case study.

Bob Gribben, director of service operations, said Ohio State does not require vendors to be ITIL certified, as some government agencies might do, but the university does like to see experience with ITSM frameworks.

Certification in any ITSM approach  helps ensure vendors employ recognized best practices and standards, Gribben noted.

“Knowing that this consultant works on the basis of the customer is first and I need to do things efficiently and effectively and economically – the three E’s – is kind of appealing, versus somebody who doesn’t have that,” he said.

Just knowing a vendor has invested in a framework instills confidence, Gribben added. “I think there is something to say about somebody who’s taken the time to get the certification to understand what ITSM involves – it doesn’t have to be ITIL, it could be any of them.”

That is the case whether the service in question is provided locally or in the cloud. Ohio State uses many cloud services, such as Box for storage and Office 365 for productivity. The cloud solutions cut costs, he said.

Using ITSM helps define customer requirements and identify appropriate solutions, and as technology evolves, helps ensure that service provision can evolve with it. An email or streaming service can quickly shift from cutting edge a short time ago to outdated and expensive, lacking in capabilities and performance characteristics that are common in newer offerings.

With a service model, solutions can be continually refreshed and kept up-to-date.

Lessons learned
In Ohio State’s implementation, Gribben said his staff wanted a self-service portal to help solve user support challenges. The more those could be solved by individual users, the more time help desk specialists would have for more serious concerns.

“The Service Desk had for years been a part of the organization that customers did not want to work with,” Gribben told the case study authors. “We started with our immediate pain point, the Service Desk function and the Incident Management process. From there, we were able to see the benefits of adding Request and Change Management.”

They settled on developing a one-stop shop that could differentiate between user types. So users with Administrative Web Interface (AWI) accounts would see options designed for their level of access, while others – students, staff, online account managers and so on – would see options tailored for them.
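
In outline, such a portal simply maps account types to the menu each is allowed to see. The Python sketch below is notional; the account types and option lists are invented for illustration, not Ohio State’s actual service catalog.

```python
# Notional mapping of account types to the self-service options they see.
PORTAL_OPTIONS = {
    "awi_admin": ["reset department passwords", "manage mailing lists", "request server change"],
    "student": ["reset my password", "connect to campus Wi-Fi", "report a problem"],
    "staff": ["reset my password", "request software", "report a problem"],
}

def options_for(account_type: str) -> list:
    # Unknown account types fall back to the most restrictive, generic menu.
    return PORTAL_OPTIONS.get(account_type, ["report a problem"])

print(options_for("awi_admin"))
print(options_for("alumni"))   # falls back to the generic menu
```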

Then, using a combination of in-house and commercial tools, Ohio State built, tested and fielded the system. In nine months, user traffic topped 1 million visits, the system has never failed and feedback is consistently positive. It’s working, Gribben said, “pretty darn well!”

ITSM experience has taught Gribben the importance of ensuring the right people are on each team and that teams are properly sized for each project. In one case at Ohio State, having too large a team slowed down development time; a single tool took six months to develop because there were too many cooks in the kitchen, Gribben said.

But too few people can also be a problem, Gribben said: “Usually, if you get too small a team, the person sitting next to you has the same idea you do.”

A second critical lesson is ITIL’s service-centered mentality. The customer always comes first. IT supports the mission, and needs to flex to mission needs, not the other way around. For some organizations, that may require a cultural shift, Gribben said.

Similarly, ITSM may also require a whole new way to approach a project, says John Gilmore, director, IT services and solutions and ITSM subject matter expert at General Dynamics Information Technology. It’s important to begin with the end goal, such as the service one needs and the way to measure its effectiveness, rather than the conventional starting point, which often focuses on available infrastructure or tools.

By starting with desired outcomes and working backwards, teams can determine how to reach mission objectives based on the current state, Gilmore said. After that, deciding the way forward becomes easier.

The Marine Corps specified ITSM when launching its Marine Corps Enterprise Information Technology Services (MCEITS) environment. The program involves bringing hundreds of applications and processes in house, and developing a new organization and culture to manage it. General Dynamics Information Technology successfully managed the transition.

“MCEITS represents a transition from a contractor owned/contractor operated environment to a government owned/contractor supported environment and, ultimately, to a government owned/government operated environment,” Gilmore said.

“The Marines quickly realized that a critical element of transition success relies on ITSM best practices through the integration and maturation of processes, tools, and personnel,” he said. That, in turn, gave birth to the Enterprise IT Service Management (EITSM) efforts to establish a foundation for global IT management.

“Communications with all the stakeholders is extremely important,” he said. “We first performed a global current-state assessment that involved reviewing strategic plans and end-state objectives to develop an ITSM implementation roadmap.  Our approach also included daily communications with key stakeholders, weekly and monthly status reports and a comprehensive communications and training program in support of customized processes and supporting tools.”

Keeping focused on the end-goals, reviewing progress, making necessary adjustments and communicating progress are all critical to ITSM implementation. Also critical: making sure end users – and not just system owners and managers – are informed along the way.

Users are focused on the end product, not the process used to make it, said the Air Force’s Young. They may not care whether a drill, saw or some other tool is used to cut a hole in a piece of wood.

What’s important is the outcome, Young said, extending the woodworking analogy.  “Always remember that what the customer cares about is getting a hole that’s a half inch across.”
