
How the Air Force Changed Tune on Cybersecurity

Peter Kim, chief information security officer (CISO) for the U.S. Air Force, calls himself Dr. Doom. Lauren Knausenberger, director of cyberspace innovation for the Air Force, is his opposite. Where he sees trouble, she sees opportunity. Where he sees reasons to say no, she seeks ways to change the question.

For Kim, the dialogue they’ve shared since Knausenberger left her job atop a private sector tech consultancy to join the Air Force has been transformational.

“I have gone into a kind of rehab for cybersecurity pros,” he says. “I’ve had to admit I have a problem: I can’t lock everything down.” He knows. He’s tried.

The two engage constantly, debating and questioning whether decisions and steps designed to protect Air Force systems and data are having their intended effect, they said, sharing a dais during a recent AFCEA cybersecurity event in Crystal City. “Are the things we’re doing actually making us more secure or just generating a lot of paperwork?” asks Knausenberger. “We are trying to turn everything on its head.”

As for Kim, she added, “Pete’s doing really well on his rehab program.”

One way Knausenberger has turned Kim’s head has been her approach to security certification packages for new software. Instead of developing massive cert packages for every program – documentation that’s hundreds of pages thick and unlikely to ever be read – she wants the Air Force to certify the processes used to develop software, rather than the programs.

“Why don’t we think about software like meat at the grocery?” she asked. “USDA doesn’t look at every individual piece of meat… Our goal is to certify the factory, not the program.”

Knausenberger says the Air Force is now trying to apply similar requirements to acquisition contracts, accepting the idea that since finding software vulnerabilities is inevitable, it’s best to have a plan for fixing them rather than hoping to regulate them out of existence. “So you might start seeing language that says, ‘You need to fix vulnerabilities within 10 days.’ Or perhaps we may have to pay bug bounties,” she says. “We know nothing is going to be perfect and we need to accept that. But we also need to start putting a level of commercial expectation into our programs.”

Combining development, security and operations into an integrated process – DevSecOps, in industry parlance – is the new name of the game, they argue together. The aim: Build security in during development, rather than bolting it on at the end.

The takeaway from the “Hack-the-Air-Force” bug bounty programs run so far is that every such effort yields new vulnerabilities – and that thousands of pages of certification didn’t prevent them. As computing power becomes less costly and automation gets easier, hackers can be expected to use artificial intelligence to break through security barriers.

Continuous automated testing is the only way to combat their persistent threat, Kim said.

Michael Baker, CISO at systems integrator General Dynamics Information Technology, agrees. “The best way to find the vulnerabilities is to continuously monitor your environment and challenge your assumptions,” he says. “Hackers already use automated tools and the latest vulnerabilities to exploit systems. We have to beat them to it – finding and patching those vulnerabilities before they can exploit them. Robust and assured endpoint protection, combined with continuous, automated testing to find vulnerabilities and exploits, is the only way to do that.”

“I think we ought to get moving on automated security testing and penetration,” Kim added. “The days of RMF [risk management framework] packages are past. They’re dinosaurs. We’ve got to get to a different way of addressing security controls and the RMF process.”
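The continuous, automated testing Kim and Baker describe boils down to a loop: inventory what is deployed, compare it against known vulnerabilities, and flag the matches. A minimal sketch in Python – the package names, versions and CVE mappings below are illustrative, not a real vulnerability feed:

```python
# Hypothetical vulnerability feed: (package, version) -> CVE identifier.
KNOWN_VULNERABLE = {
    ("openssl", "1.0.1"): "CVE-2014-0160",
    ("struts", "2.3.31"): "CVE-2017-5638",
}

def scan(inventory):
    """Return (host, package, version, cve) for every vulnerable install."""
    findings = []
    for host, packages in inventory.items():
        for name, version in packages:
            cve = KNOWN_VULNERABLE.get((name, version))
            if cve:
                findings.append((host, name, version, cve))
    return findings

# One pass of the continuous loop; in practice this would run on a schedule.
inventory = {
    "web01": [("openssl", "1.0.1"), ("nginx", "1.14.0")],
    "app02": [("struts", "2.3.31")],
}
print(scan(inventory))
```

A real pipeline would pull the inventory from endpoint agents and the vulnerability data from a live feed, then open tickets or trigger patching for each finding rather than just printing it.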

JOMIS Will Take E-Health Records to the Frontlines

The Defense Department Military Health System Genesis electronic health records (EHR) system went live last October at Madigan Army Medical Center (Wash.), the biggest step so far in modernizing DOD’s vast MHS with a proven commercial solution. Now comes the hard part: Tying that system in with operational medicine for deployed troops around the globe.

War zones, ships at sea and aeromedical evacuations each present a new set of challenges for digital health records. Front-line units lack the bandwidth and digital infrastructure to enable cloud-based health systems like MHS Genesis. Indeed, when bandwidth is constrained, health data ranks last on the priority list, falling below command and control, intelligence and other mission data.

The Joint Operational Medicine Information Systems (JOMIS) program office oversees DOD’s operational medicine initiatives, including the legacy Theater Medical Information Program – Joint system used in today’s operational theaters of Iraq and Afghanistan, as well as aboard ships and in other remote locales.

“One of the biggest pain points we have right now is the issue of moving data from the various roles of care, from the first responder [in the war zone] to the First Aid station to something like Landstuhl (Germany) Regional Medical Center, to something in the U.S.,” Navy Capt. Dr. James Andrew Ellzy told GovTechWorks. He is deputy program executive officer (functional) for JOMIS, under the Program Executive Office, Defense Healthcare Management Systems (PEO DHMS).

PEO DHMS defines four stages, or “roles,” once a patient begins to receive care: Role One, first responders; Role Two, forward resuscitative care; Role Three, theater hospitals; and Role Four, service-based medical facilities.

“Most of those early roles right now, are still using paper records,” Ellzy said. Electronic documentation begins once medical operators are in an established location. “Good records usually start the first place that has a concrete slab.”

Among the changes MHS Genesis will bring is consolidation. The legacy AHLTA (Armed Forces Health Longitudinal Technology Application) solution and its heavily modified theater-level variant, AHLTA-T, incorporate separate systems for inpatient and outpatient support.

MHS Genesis, however, will provide a single record regardless of patient status.

For deployed medical units, that’s important. Set up and maintenance for AHLTA’s outpatient records and the Joint Composite Health Care System have always been challenging.

“In order to set up the system, you have to have the technical skillset to initialize and sustain these systems,” said Ryan Loving, director of Health IT Solutions for military health services and the VA at General Dynamics Information Technology’s (GDIT) Health and Civilian Solutions Division. “This is a bigger problem for the Army than the other services, because the system is neither operated nor maintained until they go downrange. As a result, they lack the experience to be experts in setup and sustainment.”

JOMIS’ ultimate goal, according to Stacy A. Cummings, who heads PEO DHMS, is to provide a virtually seamless representation of MHS Genesis at deployed locations.

“For the first time, we’re bringing together inpatient and outpatient, medical and dental records, so we’re going to have a single integrated record for the military health system,” Cummings said at the HIMSS 2018 health IT conference in March. Last year, she told Government CIO magazine, “We are configuring the same exact tool for low- and no-communications environments.”

Therein lies the challenge, said GDIT’s Loving. “Genesis wasn’t designed for this kind of austere environment. Adapting to the unique demands of operational medicine will require a lot of collaboration with military health, with service-specific tactical networks, and an intimate understanding of those network environments today and where they’re headed in the future.”

Operating on the tactical edge – whether doing command and control or sharing medical data – is probably the hardest problem to solve, said Tom Sasala, director of the Army Architecture Integration Center and the service’s Chief Data Officer. “The difference between the enterprise environment and the tactical environment, when it comes to some of the more modern technologies like cloud, is that most modern technologies rely on an always-on, low-latency network connection. That simply doesn’t exist in a large portion of the world – and it certainly doesn’t exist in a large portion of the Army’s enterprise.”

Military units deploy into war zones and disaster zones where commercial connectivity is either highly compromised or non-existent. Satellite connectivity is limited at best. “Our challenge is how do we find commercial solutions that we cannot just adopt, but [can] adapt for our special purposes,” Sasala said.

MHS Genesis is like any modern cloud solution in that regard. In fact, it’s based on Cerner Millennium, a popular commercial EHR platform. So while it may be perfect for garrison hospitals and clinics – and ideal for sharing medical records with other agencies, civilian hospitals and health providers – the military’s operational requirements present unique circumstances unimagined by the original system’s architects.

Ellzy acknowledges the concern. “There’s only so much bandwidth,” he said. “So if medical is taking some of it, that means the operators don’t have as much. So how do we work with the operators to get that bandwidth to move the data back and forth?”

Indeed, satellite links weren’t designed for such systems; neither their bandwidth nor their latency can accommodate these requirements. More important, when bandwidth is constrained, military systems must line up for access, and health data is literally last on the priority list. Even ideas like using telemedicine in forward locations aren’t viable. “That works well in a hospital where you have all the connectivity you need,” Sasala said. “But it won’t work so well in an austere environment with limited connectivity.”

The legacy AHLTA-T system has a store-and-forward capability that allows local storage while connectivity is constrained or unavailable, with data forwarded to a central database once it’s back online. Delays mean documentation may not be available at subsequent locations when patients are moved from one level of care to the next.

The challenge for JOMIS will be to find a way to work in theater and then connect and share saved data while overcoming the basic functional challenges that threaten to undermine the system in forward locations.

“I’ll want the ability to go off the network for a period of time,” Ellzy said, “for whatever reason, whether I’m in a place where there isn’t a network, or my network goes down or I’m on a submarine and can’t actually send information out.”

AHLTA-T manages the constrained or disconnected network situation by allowing the system to operate on a stand-alone computer (or network configuration) at field locations, relying on built-in store-and-forward functionality to save medical data locally until it can be forwarded to the Theater Medical Data Store and Clinical Data Repository. There, it can be accessed by authorized medical personnel worldwide.
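The store-and-forward pattern can be sketched in a few lines: write every record to local storage first, then push the backlog whenever a link is available. This is an illustrative stand-in, not the actual AHLTA-T implementation:

```python
import json
import os
import tempfile

class StoreAndForward:
    """Spool records locally while disconnected; flush when a link returns."""

    def __init__(self, spool_dir):
        self.spool_dir = spool_dir
        os.makedirs(spool_dir, exist_ok=True)
        self.counter = 0

    def record(self, patient_id, entry):
        # Always write locally first so nothing is lost if the link drops.
        path = os.path.join(self.spool_dir, f"{self.counter:06d}.json")
        with open(path, "w") as f:
            json.dump({"patient": patient_id, "entry": entry}, f)
        self.counter += 1

    def forward(self, upload):
        """Push spooled records through `upload`; delete each on success."""
        sent = 0
        for name in sorted(os.listdir(self.spool_dir)):
            path = os.path.join(self.spool_dir, name)
            with open(path) as f:
                rec = json.load(f)
            if upload(rec):
                os.remove(path)
                sent += 1
        return sent

spool = tempfile.mkdtemp()
log = StoreAndForward(spool)
log.record("patient-1", "Role One: tourniquet applied")
log.record("patient-2", "Role Two: resuscitative surgery")
# While disconnected, nothing leaves the spool; once a link is up:
sent = log.forward(lambda rec: True)  # lambda stands in for a real uplink
print(sent)  # 2
```

In the real system the `upload` step would be a transfer to the central repository, and failed uploads would simply leave records spooled for the next connectivity window.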

Engineering a comparable JOMIS solution will be complex and involve working around and within the MHS Genesis architecture, leveraging innovative warfighter IT infrastructure wherever possible. “We have to adapt Genesis to the store-and-forward architecture without compromising the basic functionality it provides,” said GDIT’s Loving.

Ellzy acknowledges compromises necessary to make AHLTA-T work, led to unintended consequences.

“When you look at the legacy AHLTA versus the AHLTA-T, there are some significant differences,” he said. Extra training is necessary to use the combat theater version. That shouldn’t be the case with JOMIS. “The desire with Genesis,” Ellzy said, “is that medical personnel will need significantly less training – if any – as they move from the garrison to the deployed setting.”

Reporter Jon Anderson contributed to this report.

Design Thinking and DevOps Combine for Better Customer Experience

How citizens interact with government websites tells you much about how to improve – as long as you’re paying attention, said Aaron Wieczorek, a digital services expert with the U.S. Digital Service team at the Department of Veterans Affairs.

“At VA we will literally sit down with veterans, watch them work with the website and apply for benefits,” he said. The aim is to make sure the experience is what users want and expect, he said, not “what we think they want.”

Taking copious notes on their observations, the team then sets to work on programming improvements that can be quickly put to the test. “Maybe some of the buttons were confusing or some of the way things work is confusing – so we immediately start reworking,” Wieczorek explained.

Applying a modern agile development approach means digital services can immediately put those tweaks to the test in their development environment. “If it works there, good. Then it moves to staging. If that’s acceptable, it deploys into production,” Wieczorek said.
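The promotion flow Wieczorek describes can be modeled as a gate at each stage: a change advances from development to staging to production only while its checks keep passing. A hedged sketch – the checks below are stand-ins, not VA’s actual gates:

```python
# Minimal sketch of promote-on-pass: a change moves through the stages
# in order and stops at the first environment whose checks fail.
STAGES = ["development", "staging", "production"]

def promote(change, checks):
    """Run checks[stage](change) in order; return the last stage reached."""
    reached = None
    for stage in STAGES:
        if not checks[stage](change):
            break
        reached = stage
    return reached

# Illustrative gates: automated tests, then a user-experience sign-off.
checks = {
    "development": lambda c: c["tests_pass"],
    "staging": lambda c: c["tests_pass"] and c["ux_approved"],
    "production": lambda c: True,
}
print(promote({"tests_pass": True, "ux_approved": False}, checks))  # development
```

A change that passes every gate reaches production; one that fails in staging stays visible only to the development team, which matches the “if it works there, good” flow described above.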

That process can happen in days. Vets.gov deploys software updates into production 40 times per month, Wieczorek said, and the agency as a whole deploys to all kinds of environments 600 times per month.

Case in point: Vets.gov’s digital Form 1010 EZ, which allows users to apply for VA healthcare online.

“We spent hundreds of hours watching veterans, and in the end we were able to totally revamp everything,” Wieczorek said. “It’s actually so easy now, you can do it all on your phone.” More than 330,000 veterans have applied that way since the digital form was introduced. “I think that’s how you scale things.”

Of course, one problem remains: Vets.gov is essentially a veteran-friendly alternative site to VA.gov, which may not be obvious to search engines or veterans looking for the best way in the door. Search Google for “VA 1010ez” and the old, mobile-unfriendly PDF form still shows as the top result. The new mobile-friendly application? It’s the third choice.

At the National Geospatial-Intelligence Agency, developers take a similar approach, but focus hard on balancing speed, quality and design for maximum results. “We believe that requirements and needs should be seen like a carton of milk: The longer they sit around, the worse they get,” said Corry Robb, product design lead in the agency’s Office of GEOINT Services. “We try to handle that need as quickly as we can and deliver that minimally viable product to the user’s hands as fast as we can.”

DevOps techniques, where development and production processes take place simultaneously, increase speed. But speed alone is not the measure of success, Robb said. “Our agency needs to focus on delivering the right thing, not just the wrong thing faster.” So in addition to development sprints, his team has added “design sprints to quickly figure out the problem-solution fit.”

Design thinking, which focuses on using design to solve specific user problems, is critical to the methodology, he said. “Being hand in hand with the customer – that’s one of the core values our group has.”

“Iterative development is a proven approach,” said Dennis Gibbs, who established the agile development practice in General Dynamics Information Technology’s Intelligence Solutions Division. “Agile and DevOps techniques accelerate the speed of convergence on a better solution.  We continually incorporate feedback from the user into the solution, resulting in a better capability delivered faster to the user.”

Unpleasant Design Could Encourage Better Cyber Hygiene

Recent revelations that service members and intelligence professionals are inadvertently giving up their locations and fitness patterns via mobile apps caught federal agencies by surprise.

The surprise wasn’t that Fitbits, smartphones or workout apps try to collect information, nor that some users ignore policies reminding them to watch their privacy and location settings. The real surprise is that many IT policies aren’t doing more to help stop such inadvertent fitness data leaks.

If even fitness-conscious military and intelligence personnel are unknowingly trading security and privacy for convenience, how can IT security managers increase security awareness and compliance?

One answer: Unpleasant design.

Unpleasant design is a proven technique for using design to discourage unwanted behavior. Ever get stuck in an airport and long for a place to lie down — only to find every bench or row of seats is fitted with armrests? That’s no accident. Airports and train terminals don’t want people sleeping across benches. Or consider the decorative metalwork sometimes placed on urban windowsills or planter walls — designed expressly to keep loiterers from sitting down. It’s the same with harsh lights in suburban parking lots, which discourage people from hanging out and make it harder for criminals to lurk in the shadows.

As federal government and agency IT security leaders investigate these inadvertent disclosures, can they employ those same concepts to encourage better cyber behavior?

Here’s how unpleasant design might apply to federally furnished Wi-Fi networks: Rather than allow access with only a password, users instead might be required to have their Internet of Things (IoT) devices pass a security screening that requires certain security settings. That screening could include ensuring location services are disabled while such devices are connected to government-provided networks.

Employees would then have to choose between the convenience of free Wi-Fi for personal devices and the risks of inadequate operations security (OPSEC) via insecure device settings.
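A screening gate like that amounts to a device posture check before admission. A minimal sketch, with invented attribute names rather than any real network-access-control product’s API:

```python
# Hypothetical posture rules a device must satisfy before joining the
# government-furnished Wi-Fi network. Attribute names are illustrative.
REQUIRED_POSTURE = {
    "location_services": False,  # must be disabled on this network
    "os_patched": True,
    "screen_lock": True,
}

def admit(device):
    """Return True only if the device meets every posture requirement."""
    return all(device.get(k) == v for k, v in REQUIRED_POSTURE.items())

fitness_watch = {"location_services": True, "os_patched": True, "screen_lock": True}
hardened_phone = {"location_services": False, "os_patched": True, "screen_lock": True}
print(admit(fitness_watch), admit(hardened_phone))  # False True
```

The unpleasantness is deliberate: the watch with location services enabled is simply refused, and its owner must change the setting to get the free Wi-Fi.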

This, of course, only works where users have access to such networks. At facilities where personal devices must be deposited in lockers or left in cars, it won’t make a difference. But for users working (and living) on installations where personnel routinely access Wi-Fi networks, this could be highly effective.

Screening – and even blocking – certain apps or domains could be managed through a cloud access security broker, network security management software that can enforce locally set rules governing apps actively using location data or posing other security risks. Network managers could whitelist acceptable apps and settings, while blocking those deemed unacceptable. If agencies already do that for their wired networks, why not for wireless?

Inconvenient? Absolutely. That’s the point.

IT security staffs are constantly navigating the optimal balance between security and convenience. Perfect security is achievable only when nothing is connected to anything else. Each new connection and additional convenience introduces another dent in the network’s armor.

Employing cloud-access security as a condition of Wi-Fi network access will impinge on some conveniences. In most cases, truly determined users can work around those rules by using cellular data instead. But in most parts of the world where the need for OPSEC is greatest, that access comes with a direct cash cost. When users pay for data by the megabyte, they’re much more likely to give up some convenience, check security and privacy settings, and limit their data consumption.

This too, is unpleasant design at work. Cellular network owners must balance network capacity with use. Lower-capacity networks control demand by raising prices, knowing that higher priced data discourages unbridled consumption.

Training and awareness will always be the most important factors in securing privacy and location data, because few users are willing to wade through pages-long user agreements to discover what’s hidden in the fine print and legalese they contain. More plain language and simpler settings for opting in or out of certain kinds of data sharing are needed – and app makers must recognize that failing to heed such requirements only increases the risk that government steps in with new regulations.

But training and awareness only go so far. People still click on bad links, which is why some federal agencies automatically disable them. It makes users take a closer, harder look and think twice before clicking. That too, is unpleasant design.

So is requiring users to wear a badge that doubles as a computer access card (as is the case with the Pentagon’s Common Access Card and most Personal Identity Verification cards). Yet, knowing that some will inevitably leave the cards in their computers, such systems automatically log off after only a few minutes of inactivity. It’s inconvenient, but more secure.

We know this much: Human nature is such that people will take the path of least resistance. If that means accepting security settings that aren’t safe, that’s what’s going to happen. Interrupting that convenience with Wi-Fi security screening won’t stop everyone. But it might have prevented Australian undergrad Nathan Ruser – and who knows who else – from identifying the regular jogging routes of military members (among other examples) from Strava’s heat map and the 13 trillion GPS points collected from its users.

“If soldiers use the app like normal people do,” Ruser tweeted Jan. 27, “it could be especially dangerous … I shouldn’t be able to establish any pattern of life info from this far away.”

Exactly.

Vendor-Generated STIGs Help DISA Accelerate New Technology Adoption

New technologies are introduced all the time, and every product vendor touts the advantages of its new offerings. But ensuring those new systems are safe to connect to secure military networks can be a dicey proposition.

That’s why the Defense Information Systems Agency issues STIGs – Security Technical Implementation Guides – for high-demand information technology products.

“A Security Technical Implementation Guide is a set of secure operationally configurable settings based on NIST 800-53 controls,” says Roger Greenwell, DISA’s director of Cybersecurity and its authorizing official for systems and applications used within the agency. “If a user has a STIG for a specific product, they have a guide to configure that product in a secure manner.”

STIGs do not represent DISA’s stamp of approval or official endorsement; rather, they are instructions for safe use. Greenwell emphasizes it’s up to individual users to determine whether a product has utility or value in a given application. A STIG guides installers and reviewers on the most secure implementation of a given product, so it might restrict certain services or capabilities, detail authentication requirements, or identify features that must be restricted to administrators. Exactly what’s in the STIG depends on what the product itself actually does.
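In practice, applying a STIG means comparing a product’s actual configuration against the guide’s required settings. A toy compliance check – the setting names below are invented for illustration, not drawn from any published STIG:

```python
# Illustrative baseline a STIG might impose: a value per setting, or a
# numeric minimum. These names are hypothetical.
STIG_BASELINE = {
    "telnet_enabled": False,          # insecure service must be off
    "min_password_length": 15,        # numeric minimum
    "admin_only_config_access": True,
}

def check_compliance(config):
    """Return the list of settings that deviate from the baseline."""
    findings = []
    for setting, required in STIG_BASELINE.items():
        actual = config.get(setting)
        # Treat plain ints as minimums; bools and everything else as exact.
        if isinstance(required, int) and not isinstance(required, bool):
            if actual is None or actual < required:
                findings.append(setting)
        elif actual != required:
            findings.append(setting)
    return findings

config = {"telnet_enabled": True, "min_password_length": 15,
          "admin_only_config_access": True}
print(check_compliance(config))  # ['telnet_enabled']
```

Real STIG checks are distributed as structured checklists and evaluated with automated scanning tools, but the underlying comparison is the same: required setting versus observed setting.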

DISA issues only about 35-40 STIGs each year, far fewer than the number of products introduced to the market. But the agency no longer methodically develops every STIG itself. Instead, there are now three ways to develop a STIG – a major step to speeding up the process of getting approved guides for emerging technology products.

“We use three different methods now,” Greenwell said. “We can internally develop the STIG, do the research ourselves and write the STIG – how we started many years ago. We have a consensus effort [where] DISA partners with other entities, to include the vendor, in terms of working through what those requirements should be. Or we have Vendor-developed STIGs.”

Windows 10 is an example of a consensus effort, where DISA partnered with the Air Force and National Security Agency to develop an optimum safe configuration of the new operating system.

“The big thing we gain there,” Greenwell said, is “we get a product [into use] much faster, ultimately at lower cost to the taxpayer.” Vendors can often work their STIGs in parallel with product development, so a STIG can be ready almost as soon as a new product is available. By contrast, a DISA-generated STIG can follow a product introduction by months or even longer.

VMware recently received a STIG for its NSX software-defined networking (SDN) solution – the first SDN solution to receive a STIG, DISA officials confirmed. Approved in July, the STIG establishes guidelines for implementing SDN on defense networks, a major development for an emerging technology.

NSX enables system managers to set up and tear down networks on the fly, enabling rapidly reconfigurable connections that can increase network security by reducing potential attack vectors. Take a mission planning environment that’s shared by a dozen or more coalition partners for an exercise or real-world operation.

Setting up multiple networks to support various levels of sharing across the group could take days or weeks and generate significant costs, only to be torn down weeks later when the event is over. But with SDN, the customer gets the same “communications path and tools they’re already familiar with from a compute standpoint,” VMware’s Federal VP Bill Rowan says, and a network that exists only in software. System managers, he explains, can “build and tear down on the fly, establishing different security parameters, as you need, for different partners.”
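The build-and-tear-down pattern Rowan describes can be sketched with an in-memory stand-in for an SDN controller – NSX’s real API differs:

```python
# Toy stand-in for an SDN controller: networks exist only as software
# objects, so standing one up or removing it is a single call.
class SdnController:
    def __init__(self):
        self.networks = {}

    def create_network(self, name, members, policy):
        """Stand up an isolated virtual network for a set of partners."""
        self.networks[name] = {"members": set(members), "policy": policy}

    def teardown(self, name):
        """Remove the network entirely once the mission ends."""
        self.networks.pop(name, None)

ctl = SdnController()
ctl.create_network("exercise-alpha", ["us", "uk", "aus"], policy="share-intel")
ctl.create_network("exercise-bravo", ["us", "nzl"], policy="logistics-only")
ctl.teardown("exercise-alpha")        # exercise over: network is gone
print(sorted(ctl.networks))  # ['exercise-bravo']
```

The point of the sketch is the lifecycle, not the API: each coalition community gets its own network with its own security parameters, and when the event ends the network ceases to exist rather than lingering as an attack surface.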

The result is increased security and a smaller attack surface.

“Software-defined networking provides government users with new agility,” said Scott Whitman, a senior principal analyst for information security at General Dynamics IT. “Existing technology enables creating mission-specific virtual communities, but they still depend on integration with networks that are complex and often inflexible. With SDN technology, communities of interest can be set up based on mission needs and fully resourced with network connectivity, then completely removed once that mission is complete. Security is paramount, and SDN unites automation, networking and security into a tighter footprint, enabling robust data governance, which is at the heart of security in cloud-based technology services.”

VMware made the investment to encourage adoption now, rather than waiting for the traditional STIG process to unfold by itself. “Most information assurance professionals in agencies and commands, they like the comfort of knowing a product has a STIG,” Rowan says. “It makes them feel that at least someone else has evaluated a product.”

With so many products hitting the market each year, DISA can’t possibly keep up. “Ideally, we would have a STIG for every product,” Greenwell says. Vendor-generated STIGs open the door to more published solutions by shifting the workload from DISA’s limited personnel to vendors who are motivated to use the STIG to help generate sales.

“Vendors are given a framework and can develop the STIG themselves, with guidance as needed from DISA,” Greenwell says. “This enables us to ensure consistency in aligning to our configuration requirements while leveraging the vendor’s expertise with the product.”

Vendors are also better situated in most cases to do the necessary documentation and decision making quickly. Greenwell acknowledges they know their products best and are happy to work with DISA to provide the necessary documentation. Indeed, VMware is already looking ahead to future versions of NSX. “We’ve already started the next iteration of the STIG – as the product evolves, the STIG has to evolve with it.”

Common Desktop, the Foundation for IC ITE, Expands Footprint

photo credit: DIA

The intelligence community’s common desktop system, which now serves more than 50,000 users, will spread to new users this fall and early next year.

The Common Desktop Environment (DTE) covers nearly all of the Defense Intelligence Agency (DIA) and the National Geospatial-Intelligence Agency (NGA). A joint program office run by the two agencies will put a new architecture to the test this fall. Once it is accredited, the office will begin rolling out the Phase 2 system in January with a limited number of users and launch a full-scale rollout next April.

DTE was developed as NGA moved to its new headquarters in 2011 and soon after expanded to DIA. Now a Joint Program Office runs the program, led by Kendrea DeLauter, a former analyst who heads the effort to create a single shared desktop computer system for the entire national intelligence apparatus.

The Face of IC ITE
DTE is one of nine core building blocks making up the Intelligence Community Information Technology Enterprise (IC ITE). It may be the most important – the single shared interface that everyone in the community will experience on a day-to-day basis.

“DTE is the user-facing component of IC ITE,” DeLauter told GovTechWorks in a modern, glassed-in conference room in DIA’s glistening headquarters at Joint Base Anacostia-Bolling, the walled mini-city across the river from Reagan National Airport in Washington, D.C. “Everybody works off a desktop or laptop that allows them to get access to their data. It basically affects everybody who works with a desktop or laptop computer.”

This makes DTE not just user facing, but the literal face of IC ITE to the community. Everything else is accessed through the desktop, effectively making those other eight components invisible to users:

  • Commercial cloud services
  • Government cloud services
  • Enterprise Management (EMT), comprising the help desk and system management functions
  • Identification, authorization and authentication (IAA) for access control
  • Networks
  • Integrated Transport Service
  • Security Coordination
  • Application Mall

IC ITE, like the Defense Department’s Joint Information Environment, is not a program of record by itself. Rather, it’s the umbrella that unifies a series of IC-wide programs into a single vision. “It’s all of the agencies getting together and bringing our tools to a better standard, but doing it within our programs,” DeLauter says. “That will be an incredible test of integration across the intelligence community.”

Much of it rests on her shoulders.

Evolution of DTE
DTE came about almost as an accident. As NGA was moving into its new Springfield, Va., offices in September 2011, agency leaders sought a way to reduce the power consumption of every workstation and to accommodate the “green” building design. NGA’s IT support contractor, General Dynamics Information Technology, came up with a state-of-the-art design for a virtual desktop infrastructure (VDI). With this design approved, NGA could reduce power, network cabling and systems-support requirements while enhancing security. Patching could be done centrally, resources could be shared and users could be more mobile. Instead of being tied to a cubicle, they could easily move from work area to work area as needed, with email, phone number and data available simply by logging into a terminal anywhere in the building.

The effort was a runaway success, and soon DIA joined in, followed by intelligence teams at the joint combatant commands and Coast Guard Intelligence. To date, more than 72,000 users have been credentialed, although the actual number of current users is closer to 50,000, as retirements, transfers and normal attrition account for thousands of departures each year.

Eventually, DTE will boast some 250,000 users and possibly more. It will serve 17 intelligence agencies, plus some organizations “on the fringes,” DeLauter says, adding that the Office of the Secretary of Defense “has some interest in the system.”

Almost all of NGA and DIA are using the system, with a few exceptions. Additional users are added regularly, sometimes hundreds at a time.

The transition was smooth, she said, thanks to extensive advance planning and outreach by the government-contractor team: emails, town halls, helpful tips and instructions all contributed to making the move easy on users, whose greatest worries revolved around ensuring access to favorite applications.

“You need to make sure you have a good customer experience the first time or you don’t get someone who’s going to be an advocate for you,” DeLauter says. “We put a heavy emphasis on the customer experience.”

Surveys showed 90 percent satisfaction rates for those who took advantage of the advance information, though the rate was lower for the inevitable portion who were too busy, distracted or fearful of change to prepare themselves for the move.

The biggest beef? Some applications weren’t ready in time. But with some 400 applications across the two organizations, that was inevitable. To minimize stress, the system was rolled out gradually, beginning with standard users – those performing basic computing functions – and moving gradually up the scale to more sophisticated, high-demand users.

“We did not migrate the analysts first, because they tend to have the more complicated requirements,” DeLauter said.

Analysts are the power users, with more demanding applications and requirements. For some, those applications are accessed via a browser, while others were incorporated into the system image. For the most demanding instances, users were put on a thick client, rather than a thin client.

Phase Two
Now DTE is entering its next stage. Phase 2 will deploy DTE to at least 10 more agencies over the next few years, using a new architecture and a new contractor. Phase 1’s 50,000-plus users will also have to be migrated to the new platform before they are fully compatible with other users. Phase 2 uses an all-Microsoft software stack, including Microsoft Hyper-V in place of VMware’s ESXi for virtualization and Skype for Business in place of the Cisco Unified Communications suite, the voice application used in Phase 1.

But Phase 2 has been slow out of the starting blocks. Starting from scratch, the follow-on program has been held up by negotiations over software licensing and by supply chain delays, DeLauter says. While Phase 1 moved from contract to initial deployment in just six months, Phase 2 has taken about three times that long, and is not yet accredited. DeLauter says that should happen this fall, with the first few pilot users getting on the system in December and January, before a broader rollout in April.

To achieve that schedule, the Joint Program Office and Phase 2 contractor BAE want participating agencies to sign task order agreements now, even before the system is proven and accredited. The security accreditation process for such Top Secret systems can take 90 days or longer, depending on whether significant deficiencies are identified.

So the planned January roll-out depends on no major problems emerging this fall.

“We’ve been testing the data centers and getting the data centers ready, but not all the equipment is installed yet,” DeLauter said in June. Setting up an acceptable supply chain “for something that was going to be a community system” took longer than anticipated, as did extensive license negotiations aimed at reducing the number of licenses needed by accounting for users who move from one agency to another.

DeLauter says the testing will prove whether the approach was right or not, but that the technology decisions point to a more efficient and more secure system, including an integrated security enclave to support controlled access for multiple international partners with different levels of access.

The new system uses attribute-based access controls (ABAC), with enforcement extending to email, SharePoint and directory files. “That’s a little new for the community and it will take some getting used to,” DeLauter said.
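In broad strokes, ABAC works by comparing attributes of the requesting user against attributes of the resource at the moment of access, rather than relying on static group membership. A minimal sketch of that evaluation logic follows; the attribute names, clearance levels and agency values are illustrative assumptions, not the actual DTE schema.

```python
from dataclasses import dataclass

# Illustrative clearance ordering (assumption, not the IC's real lattice)
LEVELS = {"UNCLASSIFIED": 0, "SECRET": 1, "TOP SECRET": 2}

@dataclass
class Subject:
    clearance: str
    agency: str

@dataclass
class Resource:
    classification: str
    releasable_to: frozenset  # agencies permitted to see this resource

def abac_allow(subject: Subject, resource: Resource) -> bool:
    """Grant access only when every required attribute check passes."""
    if LEVELS[subject.clearance] < LEVELS[resource.classification]:
        return False  # insufficient clearance
    return subject.agency in resource.releasable_to  # releasability check

analyst = Subject(clearance="TOP SECRET", agency="DIA")
doc = Resource(classification="SECRET", releasable_to=frozenset({"DIA", "NGA"}))
print(abac_allow(analyst, doc))  # True: cleared and releasable
```

The same check can be applied uniformly to an email, a SharePoint document or a directory entry, which is what makes attribute-based enforcement attractive for a shared, multi-agency desktop.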

Security controls can grant system administrators short-term access limited to a need-to-know requirement – a response to the Edward Snowden leaks, in which Snowden abused administrative privileges to download thousands of classified documents and subsequently leak them to journalists. For administrators, it means having limited time and access privileges to do system maintenance, or risk being kicked offline mid-job.
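The mechanics of such a time-boxed grant can be sketched simply: a privilege carries an expiration, and every administrative action re-checks it, so a session that outlives its window is cut off mid-job. The class and field names below are hypothetical illustrations, not DTE's actual implementation.

```python
import time

class AdminGrant:
    """A temporary administrative privilege scoped to one task."""

    def __init__(self, task: str, duration_s: float):
        self.task = task
        self.expires = time.time() + duration_s  # hard expiration

    def active(self) -> bool:
        """Re-checked before each privileged action, not just at login."""
        return time.time() < self.expires

# One-hour grant to perform a specific maintenance task
grant = AdminGrant("apply server patches", duration_s=3600)
print(grant.active())  # True while inside the window
```

Because the check runs per action rather than per session, an administrator who lingers past the window loses access immediately, which is the behavior DeLauter describes.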

Other features new to the Phase 2 system include:

  • An integrated NARA-compliant records management capability
  • A desktop-as-a-service board that lets users see what services or software are available and what the impact on their organizations will be if they add or expand services; this feature is to be added later in the Phase 2 schedule
  • Support for non-US partners

Long term, Phase 2 is intended to provide similar services on other security domains, including the Secret Internet Protocol Router Network, or SIPRNet, and its unclassified counterpart, NIPRNet. That requirement is built into Phase 2, but won’t be activated until after the system is proven in the Top Secret arena, DeLauter says.

Will Phase 2 go as smoothly as the initial rollout? DeLauter is cautiously optimistic: “We hope it is as exquisite an experience as it was the first time.”

Windows 10
DTE probably won’t meet the Defense Department mandate that systems upgrade to Windows 10 by February 2017. But it could. One of the principal advantages to a virtualized system like this is the ease with which upgrades can be managed globally across the platform. But while the Phase 1 users at DIA and NGA could be upgraded today, DeLauter says her office is deferring to the agencies on when they want to make the switch.

“We’ve gone back to DIA and NGA and said, ‘Do you want to go to Windows 10?’ We are prepared to put Windows 10 in,” she says, “but DIA and NGA need to tell us when they want that. They shouldn’t do it separately; they need to do it together. The Joint Program Management Office is ready to provide those services when the agencies are ready to receive it.”

An upcoming release for Phase 1, scheduled for the August/September period, will not include Windows 10, but the next release after that will, she said. “We are still working the delivery of Windows 10 in Phase 1,” DeLauter said. NGA and DIA must work through differences on timing before they can schedule the upgrade, she added, and meetings to broker a compromise will be held this fall.

Phase 2 is also not ready to push ahead on Windows 10. The system will launch with Windows 7 and won’t upgrade to Windows 10 until well after the initial release.

Faster, Better Intelligence
While DTE will be common to every agency, it won’t necessarily be identical. Each agency will have the ability to customize access to applications, email limits and more, making the next steps in the process daunting. No doubt, as DeLauter says, “It was easier for two agencies than it will be for 17.”

She has helped establish a cross-IC working group that invites agency leaders in ahead of time so they can examine the DTE baseline and requirements, and determine what additions they may need – and what those will cost. The goal is to demystify the process and system, to give every agency a chance at a little bit of ownership in the whole.

“Transparency has been a big part of how we’ve been running the desktop,” DeLauter says. “You let people come into your design reviews. You have forums to share. Transparency is part of integration across the community.”

As each additional agency joins DTE, it will represent an additional task order, with its own particulars to be worked out and its own cost to be borne by the agency. Changes from the baseline will vary from email and storage limits to the number of applications included in the standard image.

“We think it should be a thin image, but we also have agreed to expand the number as we move along,” DeLauter says, indicating the kind of give and take necessary to get so many independent agencies to agree to give up some autonomy for the benefits of a shared system.

In exchange for giving up some independence, agencies are expected to gain security and cost savings. Gone will be large on-site staffs needed to do manual system patches and other localized support. “We patch the gold image and then everyone is protected,” DeLauter says.
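The economics of that model follow from how VDI clones work: each virtual desktop is derived from a shared base ("gold") image, so patching the base once updates every desktop that reads from it. A toy sketch of that relationship, with hypothetical names:

```python
class GoldImage:
    """Shared base image from which all virtual desktops are derived."""

    def __init__(self, patch_level: int = 1):
        self.patch_level = patch_level

    def patch(self) -> None:
        # One central operation, no per-seat visits required
        self.patch_level += 1

class VirtualDesktop:
    """A linked clone that reads its OS state from the shared base."""

    def __init__(self, base: GoldImage):
        self.base = base

    @property
    def patch_level(self) -> int:
        return self.base.patch_level  # inherited from the gold image

gold = GoldImage()
desktops = [VirtualDesktop(gold) for _ in range(3)]
gold.patch()  # patch once...
print(all(d.patch_level == 2 for d in desktops))  # ...every desktop is current
```

In a real deployment the desktops are typically recomposed or rebooted to pick up the new base, but the central point stands: one patch operation protects the whole fleet.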

Applications will be common, so everyone will be on the same version of Word, PowerPoint or Excel, eliminating formatting problems that waste time and effort. Mobility – in this context the ability to log into the system anywhere and access your files, rather than from a mobile computing device – will enhance collaboration.

“If we eliminate infrastructure as an obstacle, analysts can now focus on content,” DeLauter says.

More importantly, what DTE allows are new ways of doing business. It’s not just that you’re getting a new computer, DeLauter says. This is an opportunity to develop new workflows, new ways to share and interact, and new ways to locate others working on related issues. The goal is faster, better intelligence, not just safer, less costly computing.

“We’re giving you some better tools,” DeLauter says. “So now: Can you come up with a better way to go through that data?”
