Close the Data Center, Skip the TIC – How One Agency Bought Big Into Cloud

It’s no longer a question of whether the federal government is going to fully embrace cloud computing. It’s how fast.

With the White House pushing cloud services as part of its broader cybersecurity strategy and budgets being squeezed by the administration and Congress, chief information officers (CIOs) are coming around to the idea that the faster they can modernize their systems, the faster they’ll be able to meet security requirements – and that once in the cloud, market dynamics will help them drive down costs.

“The reality is, our data-center-centric model of computing in the federal government no longer works,” says Chad Sheridan, CIO at the Department of Agriculture’s Risk Management Agency. “The evidence is that there is no way we can run a federal data center at the moderate level or below better than what industry can do for us. We don’t have the resources, we don’t have the energy and we are going to be mired with this millstone around our neck of modernization for ever and ever.”

Budget pressure, demand for modernization and concern about security all combine as a forcing function that should be pushing most agencies rapidly toward broad cloud adoption.

Joe Paiva, CIO at the International Trade Administration (ITA), agrees. He used an expiring lease as leverage to force his agency into the cloud soon after he joined ITA three years ago. Time and again the lease was presented to him for a signature and time and again, he says, he tore it up and threw it away.

Finally, with the clock ticking on his data center lease, Paiva’s staff had to perform a massive “lift and shift” operation to keep services running. Systems were moved to the Amazon cloud. It was not a pretty transition, he admits, but it was good enough to make the move without incident.

“Sometimes lift and shift actually makes sense,” Paiva told federal IT specialists at the Advanced Technology Academic Research Center’s (ATARC) Cloud and Data Center Summit. “Lift and shift actually gets you there, and for me that was the key – we had to get there.”

At first, he said, “we were no worse off or no better off.” With systems and processes that hadn’t been designed for cloud, however, costs were high. “But then we started doing the rationalization and we dropped our bill 40 percent. We were able to rationalize the way we used the service, we were able to start using more reserve things instead of ad hoc.”
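Paiva’s “reserve things” refers to reserved capacity pricing, where committing to steady usage earns a discount over ad hoc, on-demand rates. A minimal sketch of that rationalization arithmetic in Python – the rates and instance count are hypothetical, not actual AWS prices:

    HOURS_PER_YEAR = 8760

    def annual_cost(hourly_rate: float, instances: int) -> float:
        """Annual cost of running `instances` servers around the clock."""
        return hourly_rate * HOURS_PER_YEAR * instances

    on_demand_rate = 0.20  # $/hour, hypothetical on-demand price
    reserved_rate = 0.12   # $/hour, hypothetical effective reserved price

    on_demand_total = annual_cost(on_demand_rate, instances=50)
    reserved_total = annual_cost(reserved_rate, instances=50)

    savings = 1 - reserved_total / on_demand_total
    print(f"On-demand: ${on_demand_total:,.0f}/year")  # $87,600/year
    print(f"Reserved:  ${reserved_total:,.0f}/year")   # $52,560/year
    print(f"Savings:   {savings:.0%}")                 # 40%

With these illustrative rates the discount alone matches the roughly 40 percent reduction Paiva describes; in practice the savings came from a mix of reserved capacity, license consolidation and right-sizing.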

That rationalization included cutting out software and services licenses that duplicated other enterprise solutions. Microsoft Office 365, for example, provided every user with a OneDrive account in the cloud. Getting users to save their work there meant his team no longer had to support local storage and backup, and the move to shared virtual drives instead of local ones improved worker productivity.

With 226 offices around the world, offloading all that backup was significant. To date, all but a few remote locations have made the switch. Among the surprise benefits: happier users. Once they saw how much easier things were with shared drives that were accessible from anywhere, he says, “they didn’t even care how much money we were saving or how much more secure they were – they cared about how much more functional they suddenly became.”

Likewise, Office 365 included Skype for Business, meaning the agency could eliminate expensive stand-alone conferencing services – another source of savings.

Cost savings matter. Operating in the cloud, ITA’s annual IT costs per user are about $15,000 – less than half the average for the Commerce Department as a whole ($38,000/user/year), or the federal government writ large ($39,000/user/year), he said.

“Those are crazy high numbers,” Paiva says. “That is why I believe we all have to go to the cloud.”

In addition to Office 365, ITA uses Amazon Web Services (AWS) for infrastructure and Salesforce to manage the businesses it supports, along with several other cloud services.

“Government IT spending is out of freaking control,” Paiva says, noting that budget cuts provide incentive for driving change that might not come otherwise. “No one will make the big decisions if they’re not forced to make them.”

Architecture and Planning
If getting to the cloud is now a common objective, figuring out how best to make the move is unique to every user.

“When most organizations consider a move to the cloud, they focus on the ‘front-end’ of the cloud experience – whether or not they should move to the cloud, and if so, how will they get there,” says Srini Singaraju, chief cloud architect at General Dynamics Information Technology, a systems integrator. “However, organizations commonly don’t give as much thought to the ‘back-end’ of their cloud journey: the new operational dynamics that need to be considered in a cloud environment or how operations can be optimized for the cloud, or what cloud capabilities they can leverage once they are there.”

Rather than lift and shift and then start looking for savings, Singaraju advocates planning carefully what to move and what to leave behind. Designing systems and processes to take advantage of the cloud’s speed and to avoid potential pitfalls not only makes the migration go more smoothly, it saves money over time.

“Sometimes it just makes more sense to retire and replace an application instead of trying to lift and shift,” Singaraju says. “How long can government maintain and support legacy applications that can pose security and functionality related challenges?”

The challenge is getting there. The number of cloud providers that have won provisional authority to operate under the 5-year-old Federal Risk and Authorization Management Program (FedRAMP) is still relatively small: just 86 with another 75 still in the pipeline. FedRAMP’s efforts to speed up the process are supposed to cut the time it takes to earn a provisional authority to operate (P-ATO) from as much as two years to as little as four months. But so far only three cloud providers have managed to get a product through FedRAMP Accelerated – the new, faster process, according to FedRAMP Director Matt Goodrich. Three more are in the pipeline with a few others lined up behind those, he said.

Once an agency or the FedRAMP Joint Authorization Board has authorized a cloud solution, other agencies can leverage their work with relatively little effort. But even then, moving an application from its current environment is an engineering challenge. Determining how to manage workflow and the infrastructure needed to make a massive move to the cloud work is complicated.

At ITA, for example, Paiva determined that cloud providers like AWS, Microsoft Office 365 and Salesforce had sufficient security controls in place that they could be treated as part of his internal network. That meant user traffic could be routed directly to them, rather than through his agency’s Trusted Internet Connection (TIC). That provided huge infrastructure savings, because he didn’t have to widen the TIC gateway to accommodate routine work traffic that in the past would have stayed inside his agency’s network.

Rather than a conventional “castle-and-moat” architecture, Paiva said he had to interpret the mandate to use the TIC “in a way that made sense for a borderless network.”

“I am not violating the mandate,” he said. “All my traffic that goes to the wild goes through the TIC. I want to be very clear about that. If you want to go to www-dot-name-my-whatever-dot-com, you’re going through the TIC. Office 365? Salesforce? Service Now? Those FedRAMP-approved, fully ATO’d applications that I run in my environment? They’re not external. My Amazon cloud is not external. It is my data center. It is my network. I am fulfilling the intent and letter of the mandate – it’s just that the definition of what is my network has changed.”
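Conceptually, Paiva’s redefinition of the network boundary is a routing policy: destinations with FedRAMP authorization and an agency ATO are treated as internal and reached directly, while everything else egresses through the TIC. A minimal sketch of that decision logic – the domain names and function are hypothetical, for illustration only:

    # FedRAMP-authorized, agency-ATO'd services treated as part of the
    # internal network (hypothetical tenant addresses).
    TRUSTED_CLOUD_DOMAINS = {
        "agency.sharepoint.com",      # Office 365 tenant
        "agency.my.salesforce.com",   # Salesforce
        "agency.service-now.com",     # ServiceNow
        "apps.agency-vpc.example",    # agency's AWS environment
    }

    def next_hop(destination_host: str) -> str:
        """Return the egress path for a destination host."""
        if destination_host in TRUSTED_CLOUD_DOMAINS:
            return "direct"       # extended data center: no TIC transit
        return "tic-gateway"      # traffic bound for "the wild"

    assert next_hop("agency.my.salesforce.com") == "direct"
    assert next_hop("www.example.com") == "tic-gateway"

A production deployment would enforce this in DNS, routing tables and firewall policy rather than application code, but the classification itself is the architectural detail Paiva credits with the savings.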

Todd Gagorik, senior manager for federal services at AWS, said this approach is starting to take root across the federal government. “People are beginning to understand this clear reality: If FedRAMP has any teeth, if any of this has any meaning, then let’s embrace it and actually use it as it’s intended to be used most efficiently and most securely. If you extend your data center into AWS or Azure, those cloud environments already have these certifications. They’re no different than your data center in terms of the certifications that they run under. What’s important is to separate that traffic from the wild.”

ATARC has organized a working group of government technology leaders to study the network boundary issue and recommend possible changes to the policy, said Tom Suder, ATARC president. “When we started the TIC, that was really kind of pre-cloud, or at least the early stages of cloud,” he said. “It was before FedRAMP. So like any policy, we need to look at that again.” Acting Federal CIO Margie Graves is a reasonable player, he said, and will be open to changes that make sense, given how much has changed since then.

Indeed, the whole concept of a network’s perimeter has been changed by the introduction of cloud services, Office of Management and Budget’s Grant Schneider, the acting federal chief information security officer (CISO), told GovTechWorks earlier this year.

Limiting what needs to go through the TIC and what does not could have significant implications for cost savings, Paiva said. “It’s not chump change,” he said. “That little architectural detail right there could be billions across the government that could be avoided.”

But changing the network perimeter isn’t trivial. “Agency CIOs and CISOs must take into account the risks and sensitivities of their particular environment and then ensure their security architecture addresses all of those risks,” says GDIT’s Singaraju. “A FedRAMP-certified cloud is a part of the solution, but it’s only that – a part of the solution. You still need to have a complete security architecture built around it. You can’t just go to a cloud service provider without thinking all that through first.”

Sheridan and others involved in the nascent Cloud Center of Excellence see the continued drive to the cloud as inevitable. “The world has changed,” he says. “It’s been 11 years since these things first appeared on the landscape. We are in exponential growth of technology, and if we hang on to our old ideas we will not continue. We will fail.”

His ad hoc, unfunded group includes some 130 federal employees from 48 agencies and sub-agencies, operating independently of vendors, think tanks, lobbyists or others with a political or financial interest in the group’s output. “We are a group of people who are struggling to drive our mission forward and coming together to share ideas and experience to solve our common problems and help others to adopt the cloud,” Sheridan says. “It’s about changing the culture.”

New Framework Defines Cyber Security Workforce Needs

Both the federal government and its contractors are locked in a battle for talent with commercial providers, each vying for the best personnel in critical areas of cybersecurity, and each dealing with a shortage of available talent.

Both would benefit from targeted investment in education and increased standardization to define the skills and knowledge required for different kinds of jobs – and now the National Institute for Standards and Technology (NIST) has taken a big step to help make that happen.

NIST published a framework for the future cybersecurity workforce this week, Special Publication 800-181, the culmination of years of effort under the National Initiative for Cybersecurity Education (NICE).

The framework defines “a common, consistent lexicon to describe cybersecurity work by category, specialty area, and work role,” and details the necessary knowledge, skills and abilities (KSAs) and tasks performed by individuals in each kind of job. The framework defines cyber operations jobs in seven operational categories and 32 job specialties.
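The lexicon is essentially hierarchical: categories contain specialty areas, which contain work roles, each carrying KSAs and tasks. A minimal sketch of that structure as Python dataclasses – the sample entries are illustrative, not quoted from SP 800-181:

    from dataclasses import dataclass, field

    @dataclass
    class WorkRole:
        name: str
        ksas: list[str] = field(default_factory=list)   # knowledge, skills, abilities
        tasks: list[str] = field(default_factory=list)

    @dataclass
    class SpecialtyArea:
        name: str
        work_roles: list[WorkRole] = field(default_factory=list)

    @dataclass
    class Category:
        name: str
        specialty_areas: list[SpecialtyArea] = field(default_factory=list)

    protect_and_defend = Category(
        name="Protect and Defend",
        specialty_areas=[
            SpecialtyArea(
                name="Incident Response",
                work_roles=[
                    WorkRole(
                        name="Cyber Defense Incident Responder",
                        ksas=["Knowledge of incident categories and response procedures"],
                        tasks=["Perform initial analysis of reported incidents"],
                    )
                ],
            )
        ],
    )

An employer and a training provider who both describe a job in these terms can compare requirements and curricula field by field, which is the point of the common lexicon.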

The aim is that everyone – employers, educators, trainers and cyber professionals – will be able to leverage that common language into a better understanding of the existing workforce and the knowledge gaps that need to be filled.

“Building the future workforce is a priority for all of us,” said Stan Tyliszczak, vice president and chief engineer at General Dynamics Information Technology, a systems integrator in Fairfax, Va. “Government, industry and academia all share in this problem. Having a common language we can use to understand each other will help employers explain their requirements and help educators deliver on those needs.”

It’s been a long time coming. A 2015 report on the cyber workforce – Increasing the Effectiveness of the Federal Role in Cybersecurity Education – concluded the government needed to make a host of changes to assure access to a skilled cyber workforce, said David Wennergren, until recently senior vice president for technology at the Professional Services Council, former assistant deputy chief management officer at the Defense Department and a one-time chief information officer for the Navy. Wennergren led the investigation.

The report examined two government-funded programs – the National Centers of Academic Excellence in Information Assurance/Cyber Defense (CAEs), funded by the National Security Agency (NSA) and the Department of Homeland Security (DHS); and the CyberCorps Scholarship for Service (SFS) program managed by the National Science Foundation (NSF) – and concluded they each needed:

  • More hands-on education. “We have to get people in the labs actually using tools and demonstrating proficiencies, not just doing text-book type work,” said Wennergren.
  • The government needs to ensure “we are delivering students who are competent and can do the jobs without additional training to organizations,” Wennergren said.
  • Focus on the entire public sector – federal, state, local, tribal and territorial governments.
  • Expand programs to include qualified two-year degrees at community colleges. Not all cybersecurity jobs require a four-year degree and military members who have both technical training and practical experience may already have the skills needed to perform critical cyber functions in non-military settings.
  • The entire federal sector needs cyber skills, not just defense and intelligence agencies. The CAE program should embrace the entire federal sector.

Two bills now working their way through Congress build on some of those concepts, particularly the potential for two-year degrees as a means of lowering barriers to entry to this critical part of the workforce.

The Department of Defense Cyber Scholarship Program Act of 2017, a bipartisan bill co-sponsored by Sen. Mike Rounds (R-S.D.), chairman of the Senate Armed Services Committee’s Subcommittee on Cybersecurity, and Sen. Tim Kaine (D-Va.), seeks to provide $10 million in scholarship funds, at least $500,000 of that to fund two-year degree-level programs.

A second bipartisan measure, the Cyber Scholarship Opportunities Act of 2017, co-sponsored by Kaine, Sen. Roger Wicker (R-Miss.), Sen. Patty Murray (D-Wash.) and Sen. David Perdue (R-Ga.), would amend the Cybersecurity Enhancement Act of 2014 by setting aside at least 5 percent of federal cyber scholarship-for-service funds for two-year degree programs – whether for military veterans, students pursuing cybersecurity careers through associate’s degrees or students who already hold bachelor’s degrees.

Although the Wennergren report’s recommendations focused on federal programs, the concepts apply equally to federal contractors, Wennergren said.

“Clearly both industry and government would benefit from improvements in how cyber is taught in academic institutions [and] how we measure the successful development and placement of students,” he said, adding both will also benefit from the wide adoption of the NICE workforce standards in which government, academia, and the private sector collaborated.

Workforce Shortfall By the Numbers
According to Cyberseek.org, a joint project of NICE, Burning Glass Technologies and CompTIA, there are more than 299,000 cybersecurity job vacancies in the United States today, representing about 28 percent of all U.S. cyber jobs. For some of those jobs, there are as many openings – or more – as there are certified, qualified candidates to fill them – even though such people are almost all employed. For example, there are 69,549 individuals who have earned Certified Information Systems Security Professionals (CISSP) status. But there are 76,336 openings for people with CISSPs.

The most common cyber certification is CompTIA Security+, with more than 167,000 people holding that certification. But there are still more than 33,000 openings for such people, meaning a significant shortage remains.
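The arithmetic behind those gaps is stark even before accounting for the fact that nearly all credential holders are already employed. A quick check using the figures cited above:

    # Figures as reported above (Cyberseek.org data cited in this article).
    cissp_holders, cissp_openings = 69_549, 76_336
    secplus_holders, secplus_openings = 167_000, 33_000  # approximate

    print(f"CISSP openings beyond all certified holders: {cissp_openings - cissp_holders:,}")
    # -> 6,787 openings even if every CISSP holder filled one

    print(f"Security+ surplus on paper: {secplus_holders - secplus_openings:,}")
    # -> 134,000 -- but since most holders are employed, the 33,000+
    #    openings still represent real unmet demand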

Rodney Petersen, NIST’s director for NICE, called the cyber workforce the “key enabler” of the future of the nation’s cyber security in a recent interview with Federal News Radio’s Tom Temin.

“We’re clearly building momentum to promote and energize a robust and integrated ecosystem of cybersecurity education, training and workforce development,” he said on Temin’s Federal Drive program. “I think it’s that momentum that both allows us to create a community across both the public and private sector.  That NICE workforce framework really provides a common way to think about cybersecurity work, a taxonomy, a reference tool that can really help align our diverse and complex community together toward a common vision.”

New Cyber Standards for IoT Ease – But Won’t Solve – Security Challenge

The first independent standard for cybersecurity for the Internet of Things (IoT) was approved earlier this month, following two years of debate and discussion over how to measure and secure such devices.

The American National Standards Institute (ANSI) approved UL 2900-1, General Requirements for Software Cybersecurity for Network-Connectable Products, as a standard on July 5. The effort was spearheaded by Underwriters Laboratories (UL), which is preparing two more standards to follow: UL 2900-2-1, which defines requirements for network-connectable components of healthcare systems, and UL 2900-2-2, which does the same for industrial control systems.

The three establish the first standard security protocols for software-controlled IoT devices, such as access controls, industrial controls for lighting and mechanical systems, internet-connected medical devices and more. They also offer potential answers to major worries about the lack of security built into such devices thus far.

The Internet of Things promises unparalleled opportunities to track, control and manage everything from lights and security cameras to pacemakers and medication delivery systems. But concerns about security, driven by real-world events in which unsecured IoT devices were co-opted in coordinated botnet attacks, have raised anxiety levels about the risks posed by connecting so many devices to the internet.

Those concerns have prompted government leaders from the Pentagon to Congress to call on industry to embrace security standards as a mark of quality and establish voluntary independent testing programs to assure customers that products are safe. The underlying warning: Either industry figures out how to police itself or government regulators will step in to fill the void.

Whether that is enough to inspire more companies to step up to the standards challenge remains unclear. “The market – that is, individual, corporate and government customers – has yet to put a price on IoT security in the same way that other markets have to determine the relative value of energy-efficient appliances or crash-worthy automobiles,” said Chris Turner, solutions architect with systems integrator General Dynamics Information Technology. “The market would benefit from standards. They’d help vendors back up product claims and integrators speed up adoption and implementation, which in turn would increase security and probably drive down prices, as well.”

A standards regimen could change that equation, suggests Steven Walker, acting director of the Defense Advanced Research Projects Agency (DARPA).

“What if customers were made aware of unsecure products and the companies that made them?” he asked at the AFCEA Defensive Cyber Symposium in June. “I’m pretty sure customers would buy the more secure products.”

As recently as Oct. 21, 2016, the Mirai botnet attack crippled Internet services provider Dyn via an international network of compromised security cameras that launched an onslaught of bogus data requests on Dyn servers, peaking at about 1.2 terabits per second. The attack brought down many of the most popular sites on the Internet.

Kevin Fu, director of the Archimedes Center for Medical Device Security and the Security and Privacy Research Group at the University of Michigan and the co-founder and chief scientist at Virta Labs, a startup medical device security firm, told the House Energy and Commerce Committee that the underlying problem is one of market failure.

“We are in this sorry and deteriorating state because there is almost no cost to a manufacturer for deploying [IoT] products with poor security to consumers,” he said at a November hearing. “Has a consensus body or federal agency issued a meaningful IoT security standard? Not yet. Is there a national testing lab to verify and assess the pre-market security of IoT devices? No. Is there a tangible cost to any company that puts an insecure IoT device into the market? I don’t think so.”

Could UL 2900 answer that need? Though Fu isn’t quite ready to endorse it, he did suggest the concept is sound.

“We know from the mathematician Gödel that it’s impossible to have both a sound and complete set of standards for any non-trivial problem,” Fu told GovTechWorks. “However, standards are important to improve security and simplify the problem to make it more tractable. No approach will completely solve security, but standards, sound engineering principles and experience gained through failure are necessary ingredients for reasonable defense.”

Developing the Standard
UL 2900 provides guidelines for how to evaluate and test connected products, including a standard approach to software analysis, efforts to root out embedded malware and process and control requirements for establishing IoT security risk controls in the architecture, design and long-term risk management of the product.

Rather than focus on hardware devices first, UL concentrated on software after initial conversations with the Department of Homeland Security (DHS), said Ken Modeste, leader of cybersecurity services at UL. “One of DHS’s biggest challenges was their software supply chain,” he said. DHS was concerned about commercial software products running on computer systems, as well as industrial control software running the agency’s operational technology, such as air conditioning, lighting and building or campus security systems.

Examining the problem, UL officials found clear similarities between the systems and sensors used in factory automation, enterprise building automation and security technology. “The majority of these cyber concerns – 90 percent – were in software,” Modeste told GovTechWorks. “So we realized, if we can create a standard for software, we can apply that to many, many products.”

UL invited representatives from industry, government and academia to participate in developing the standard. “We started looking at industry standards that make software better,” Modeste said. “A firmware file has a multitude of components. How can those be broken down and understood? How can they be protected?”

Participants studied every imaginable attack vector that threat actors could use to compromise a product, and then incorporated each into the testing process. Recognizing that new threats and vulnerabilities arise all the time, the testing and process was designed to be fluid and to incorporate follow-up testing after initial approval.

At first, industry was slow to respond. “I thought we’d have more support early on,” Modeste said. “But there was an initial reluctance. It took a while for us to engage and get them to see the advantages.”

Now it seems interest is on the rise. Among the first movers with the standard: Electric Imp, an IoT software firm based in Los Altos, Calif., and Cambridge, U.K., which provides a cloud-based industrial IoT platform for fully integrating hardware, operating system, APIs, cloud services and security in a single flexible, scalable package. The Electric Imp platform is the first IoT platform to be independently certified to UL 2900-2-2.

“For security, UL has come at it at the right angle, because they’re not prescriptive,” Hugo Fiennes, co-founder and CEO at Electric Imp and former leader of Apple’s iPhone hardware development efforts (generations one through four), told GovTechWorks. “There are many ways to get security, depending on the application’s demands, latency requirements, data throughput requirements and everything like that. [But] the big problem has been that there has been no stake in the ground so far, nothing that says, ‘this is a reasonable level of security that shows a reasonable level of due diligence has been performed by the vendor.’”

What UL did was to study the problems of industrial control systems, look at the art of the possible, and then codify that in a standard established by a recognizable, independent third-party organization.

“It can’t be overstated how important that is,” Fiennes said. UL derives its trust from the fact that it is independent of other market players and forces.

Although UL 2900 “is not the be all and end all last word on cybersecurity for IoT,” Fiennes said, “it provides a good initial step for vendors.”

“They haven’t said this is one standard forever, because that’s not how security works,” he said. “They’ve said IoT security is a moving target, here is the current standard. We will test to it, we’ll give you a certificate and then you will retest and maintain compliance after.” The certification lasts a year, after which new and emerging threats must be considered in addition to those tested previously.

“This doesn’t absolve the people selling security products, platforms and security stacks from due diligence,” Fiennes warned. Firms must be vigilant and remain ready and able to react quickly to threats. “But it’s better than nothing. And we were in a state before where there was nothing.”  He noted that his product’s UL certification expires after a year, at which point some requirements are likely to change and the certification will have to be renewed.

Still, for customers seeking proof that a product has met a minimum baseline, this is the only option short of devoting extensive in-house resources to thoroughly test products on their own. Few have such resources.

“Auto makers and other large-scale manufacturers can afford that kind of testing because they can spread the cost out across unit sales in the hundreds of thousands,” says GDIT’s Turner. “But for government integration projects, individually testing every possible IoT product is cost-prohibitive. It’s just not practical. So reputable third-party testing could really help speed up adoption of these new technologies and the benefits they bring.”

Standards have value because they provide a baseline measure of confidence.

For Electric Imp, being able to tell customers that UL examined its source code, ran static analysis, performed fuzz and penetration testing, and examined all of its quality and design controls has made a difference.

For UL and Modeste, the recognition that no single standard can solve the IoT security problem proved something of an “aha moment.”

“Within cybersecurity, you have to recognize you can’t do everything at once,” he said. “You need a foundation, and then you can go in and take it step-by-step. Nothing anyone comes up with in one step will make you 100 percent cyber secure. It might take 10 years to come up with something perfect and then soon after, it will be obsolete. So it’s better to go in steps,” Modeste added. “That will make us increasingly secure over time.”

Getting Past Passwords for Mobile Device Security

In the beginning was the password. And the password was good.

Then came the Common Access Card (CAC) and token systems. And they were good.

But it wasn’t enough. And over time there came numerous other forms of identity verification: biometrics and voice recognition and facial recognition.

For deeper security, there were combinations of verifiers: for example, a smart card and fingerprint or a user name and a biometric confirmation.

And then came the smart phone.

Of all the challenges that the digital world has presented to those who need to maintain security, though, the smart phone stands as an opportunity and a threat, a challenge and a puzzle.

“The smart phone is the break point because it started moving more of people’s business lives and social lives from the laptop, and more and more people started using that technology,” says Randy Vanderhoof, executive director of the Secure Technology Alliance (formerly the Smart Card Alliance), a non-profit association of companies in the security market. “Mobile devices have reached the saturation point. [Virtually] every man, woman and child has a mobile device.”

That swift adoption of mobile technology by the world’s population is nothing short of astonishing. By 2015, subscriptions to mobile services had reached 4.7 billion globally, according to the GSM Association (GSMA), a professional organization that includes most carriers, mobile network operators and equipment makers. By 2020, that number is expected to reach 5.6 billion – 70 percent of the world’s population.

The Government Perspective
All those mobile devices make personal identity verification a burgeoning business, and the same demand for mobile verification is extending into the highly secure world of government. The Department of Homeland Security (DHS), in particular, needs to verify government employees who require mobile or remote access to work files and systems.

The April 2017 DHS Study on Mobile Device Security, released under the signature of Robert Griffin Jr., acting DHS under secretary for science and technology, notes that while government use of mobile devices represents “almost an insignificant market share,” the stakes are considerable: “Government mobile devices…represent an avenue to attack back-end systems containing data on millions of Americans in addition to sensitive information relevant to government functions.”

What is more, the vulnerabilities are numerous:

  • The mobile device technology stack, including mobile operating systems and lower-level device components
  • Mobile applications
  • Networks (e.g., cellular, Wi-Fi, Bluetooth) and services provided by network operators
  • Device physical access
  • Enterprise mobile services and infrastructure, including mobile device management, enterprise mobile app stores and mobile application management

While the report found that security is improving both for the devices themselves and among operating system providers, “many communication paths remain unprotected and leave the overall ecosystem vulnerable to attacks.” For government, verification and security is a systematic question of improving the overall mobile ecosystem. To do this, DHS is recommending:

  • Programmatic improvements
  • Increased DHS authorities
  • Adoption of standards and best practices
  • Additional research

Improving programs and adopting new standards and best practices are especially important. The report urges active DHS participation in standard-setting bodies and efforts.

Two legal gaps stand out in particular: DHS has no legal authority to compel mobile carriers to assess risks to their networks that might affect government mobile device use. Also, while DHS can evaluate carrier network vulnerability, it cannot compel carriers to provide the information it needs to make such evaluations. In response, DHS wants to alter Federal Information Security Modernization Act metrics to cover mobile devices and develop a new program of research and development to secure mobile networks and technology.

Overall, the report stated, “Federal departments and agencies should, where needed, develop or strengthen policies and procedures regarding government use of mobile devices overseas based on threat intelligence and emerging attacker tactics, techniques, and procedures.” To do this, DHS requires “proper” resources and legal authorities to assert itself in securing those devices.

The Device Perspective
While government itself tries to secure the overall mobile device ecosystem and its networks, the struggle continues to secure and validate individual users and devices—especially as they’re increasingly used to conduct business from a distance.

“The state of the technology is changing rapidly and it becomes increasingly important to be able to adjust to the demands of our mobile dependence on interactive means,” points out Vanderhoof. “So many of the changes that are happening in identity have to do with non-face-to-face interactions with people through the Internet or through their mobile device or through remote communications. We’re seeing more and more accuracy being developed and groups that are looking to leverage the advances in identity and all kinds of technology that will work in our environment that is becoming increasingly mobile as well as disconnected from any physical interaction.”

A variety of identifiers are being studied as possible forms of identity verification for mobile devices, some of which are already in use in other contexts. These include:

  • Gait: Measuring a person’s stride using embedded smartphone sensors like gyroscopes and accelerometers;
  • Facial recognition: Mobile devices can be equipped with facial recognition applications to verify user identity;
  • Fingerprints: Increasingly, smartphones are equipped with fingerprint scanners of growing accuracy;
  • Video: A user can submit a short video “selfie” for verification against an existing database;
  • Social media: Social network logins and profiles can be used to verify identity;
  • Smartphone identifiers: Serial numbers and device codes.

Any of these – plus, of course, the traditional username and password – can be used in combination to provide a variety of levels of security.
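A common pattern is to weight the factors and map the combined score to an assurance level. A minimal sketch, with hypothetical factor names and weights – a real deployment would follow an assurance standard such as NIST SP 800-63 rather than ad hoc scores:

    FACTOR_WEIGHTS = {
        "password": 1,
        "smartphone_id": 1,        # device serial / hardware code
        "gait": 1,
        "fingerprint": 2,
        "facial_recognition": 2,
        "video_selfie": 2,
    }

    def assurance_level(verified_factors: set[str]) -> str:
        """Map a combination of verified factors to a coarse assurance level."""
        score = sum(FACTOR_WEIGHTS.get(f, 0) for f in verified_factors)
        if score >= 4:
            return "high"    # e.g., password + fingerprint + device ID
        if score >= 2:
            return "medium"
        return "low"

    print(assurance_level({"password", "fingerprint", "smartphone_id"}))  # high
    print(assurance_level({"password"}))                                  # low

The weights encode the intuition in the list above: biometric factors bind more tightly to a person than a password or device code, so they count for more.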

What’s more, verification at a distance – without the need for user input or even awareness – is already on the near horizon. “Their gait or voice patterns can be used forensically to match that individual,” Vanderhoof said. “If you’re watching someone type on a keyboard, the pressure on the keys can be measured. You can identify people from a distance or at an airport, where you may not be able to measure someone by a fingerprint biometric. If you’re actually touching something like a machine or keypad, you can measure the veins in their hands, by blood pressure and other physical characteristics that can be acquired against a known biometric.”

Integration is Critical
Whatever the identity verification mechanism an organization chooses, however, it takes skilled integrators to seamlessly meld the new technology with existing authentication and access control mechanisms.

“There are many different authentication mechanisms one could use,” said Rob Lentini, director for credentialing programs at systems integrator General Dynamics Information Technology (GDIT). “The challenge is making them work with the access control infrastructure that’s already in place. Few agencies can afford to rip and replace their existing access control mechanism. But with careful engineering we can ensure that everything works robustly, reliably and at scale. That’s where many of the real challenges are.”

For all this, experts acknowledge that no method is flawless and any verification regime depends on the level of security and intrusiveness required. While consumer applications strive for ease of use and minimal intrusiveness, highly secure applications can require many layers of verification and deep intrusiveness – that is, highly personal unique information, such as a parent’s middle name, date of birth, home address and so forth.

There is no doubt, however, that the need for verification will continue and the means of providing it will continue to be explored.

As Vanderhoof puts it: “It’s becoming increasingly important that we get identity correct and authentication improved because we’re seeing the results of what happens when the bad guys are able to exploit the weaknesses in our current system: malware that gets spread to business computers and consumers because people can’t identify a hacker’s e-mail from a legitimate e-mail or people being able to hack into business computer systems by injecting malware from third party service provider systems that aren’t even systems managed by their own company. These are examples of why it’s important that we get identity and authentication correct. It’s becoming more difficult to fight the criminal exploitation of our electronic systems without it.”

Can Micro Certifications Match Government’s Real-World Needs?

Certifications are the baseline for assessing proficiency in information technology and cybersecurity skills. But many career cyber professionals say required certs underestimate the value of on-the-job training and experience.

Others complain about the cost of obtaining and maintaining certifications that often don’t match up to the narrow responsibilities of a given job. They see emerging micro certifications – short, online training programs followed by knowledge exams – as more practical in many cases.

At the same time, new national standards are taking root to try to better tie knowledge requirements to specific job categories and certifying organizations are revising their programs.

“Nothing replaces real world expertise,” said Anil Karmel, founder and chief executive at IT startup C2 Labs of Reston, Va., and a former deputy chief technology officer at the Energy Department’s National Nuclear Security Administration (NNSA). “Having on-the-job skills is invaluable, especially in the cyber realm when you are entrusted to protect our nation’s critical IT assets.” The aim, Karmel said, is to strike the right balance between real-world experience, certifications and training.

Certifications were crucial early in his career, Karmel said. As he moved into mid-to-senior level positions however, his needs changed and so did the qualifications required for those jobs. The higher you go, the more experience and past performance defines your capabilities.

Karmel was a solutions architect at the Energy Department’s Los Alamos National Laboratory in Los Alamos, N.M., where he helped develop and launch a private cloud that let researchers automatically request virtual servers on demand.

“As I grew and became a systems administrator, I focused on industry or vendor-specific certifications, such as VMware, which enabled me to build the private cloud at Los Alamos.”

Certifications can be seen as a baseline, onto which you may want to add additional skills. Micro training programs are one way to do that, Karmel noted.

For instance, a security analyst might need to move beyond incident response – reacting to events that could have a negative impact on an organization’s network – to learn about incident management. It focuses on preparing formal policies and procedures, as well as having the necessary tools in place, to thwart cyber threats. An online micro certification class on the Incident Management Lifecycle could meet that need.

Micro Certification: A Growing Trend?
Micro certifications are narrowly focused, non-traditional skills training in which participants can earn a credential within a matter of days, versus months or years for traditional technical certification programs.

Micro certifications are well-liked by workers – and supervisors – according to a January 2017 Linux Academy/Cybrary survey of 6,000 IT pros. They allow employees to rapidly gain specific knowledge sets that answer specific needs.

“The growing micro certification trend is driven predominantly by industries such as IT and cybersecurity that have a workforce skills gap where jobs can’t be filled because of a lack of qualified applicants,” according to Anthony James, CEO and founder of Linux Academy, an online Linux and cloud training firm based in Fort Worth, Texas.

The survey, conducted in partnership with Cybrary, a provider of no-cost, open source cybersecurity Massive Open Online Courses (MOOCs), found IT professionals use micro certification programs to keep up with changing technologies and also learn at their own pace. Some 86 percent of respondents said they prefer learning and testing in small increments to receive IT skill credentials.

Thirty-five percent of respondents said that micro certifications have either helped them get or advance in a job; 70 percent think their company would benefit from partnering with micro certification providers; and 85 percent would most likely pursue micro certifications if employers facilitated the offering.

Opinions on micro certification versus traditional IT training varied. More than half – 58 percent – of respondents said micro certifications convey the same level of technical proficiency as traditional training and more than 94 percent believe that micro certifications give entry-level job candidates an edge in competing for jobs.

In terms of costs, 82 percent of respondents understood that micro certifications are more affordable than traditional IT training. Fifty-eight percent of those surveyed paid $25 or more for their own micro certification courses. Most respondents believe their company spends an average of up to $25,000 annually on IT skills training for employees.

Difference Between Certificates and Certification
Many government and government contractor jobs require certifications from established organizations, such as the Certified Information Systems Security Professional (CISSP) certification conferred by (ISC)2, which offers a portfolio of credentials that are part of a holistic, programmatic approach to security. Candidates must have a minimum of five years of paid full-time work experience in two of the eight domains of the CISSP Common Body of Knowledge (CBK). It covers application development security, cloud computing, communications and network security, identity and access management, mobile security, risk management and more.

CISSP certification is costly, ranging from $2,000 to $4,000, depending on the choice of study – CISSP Boot Camp, regular classroom or online training. The six-hour exam alone costs $599.

But Dan Waddell, managing director for North America with (ISC)2, doesn’t see such certifications going away.

“I don’t believe the certification requirement is overkill and I believe most cybersecurity executives in the government would agree,” Waddell said.

According to the recently released federal results of the 2017 Global Information Security Workforce Study, 73 percent of federal agencies require their IT staff members to hold information security certifications. The survey of over 19,600 InfoSec professionals includes responses from 2,620 U.S. Department of Defense (DOD), federal civilian and federal contractor employees. The study was conducted by The Center for Cyber Safety and Education and sponsored by (ISC)2, Booz Allen Hamilton, and Alta Associates. Findings of the report will be released throughout 2017 in a series of dedicated reports.

To effectively retain existing InfoSec professionals and attract new hires, federal respondents indicated that offering training programs, paying for professional cybersecurity certifications, boosting compensation, and providing more flexible and remote work schedules and opportunities were among the most important initiatives.

Still, Waddell acknowledged that traditional certifications must evolve over time, and that (ISC)2 must develop ways to support government efforts to move toward a more performance-based certification system.

Micro certifications aren’t necessarily a replacement for baseline job requirements, however. Scott Cassity, senior director at the Maryland-based SANS Institute Global Information Assurance Certification (GIAC) center, said there is room for both in the complex and rapidly evolving world of cybersecurity.

“I can appreciate folks saying [they need] more bite-size micro certifications,” Cassity said. “I can appreciate that there might be some particular bite-size training we need on a particular tool, a particular technique.

“But if you back up and say: ‘Hey, I want someone who can be a defender. I want them to have a broad range of skills.’ Then, we don’t think that is something that will be absorbed in bite-size chunks,” Cassity continued. “It is going to be very rigorous and challenging training. It is studying above and beyond that training.”

Take the GIAC program, for example, which offers several dozen certifications for a range of different skill sets. Courses typically run four months and cost $1,249. Most students spend 40 to 50 hours studying outside of the classroom, Cassity said. Like CISSP, GIAC is a DOD-approved credentialing body, and its programs meet requirements laid out in the DOD Directive 8570, setting training, certification and management requirements of government employees involved with Information Assurance and security.

(ISC)2’s Waddell agrees there is a difference between a certificate covering practical cyber security knowledge or a specific skill set and professional certification more rigorously assessing a broader range of knowledge, skills and competencies.

The cybersecurity industry keeps evolving, Cassity said. Fundamental skills for information security will stand the test of time. But with mobile security, forensics and other rapidly growing technologies, job functions and certifications must change, as well.

Building a Skills-based Workforce
Federal agencies are looking to adopt the skills-based workforce definitions developed under the National Initiative for Cybersecurity Education (NICE), a partnership between government, academia, and the private sector that’s managed by the National Institute of Standards and Technology (NIST). NICE aims to standardize expectations for cybersecurity education, training, and workforce development across the industry to level-set expectations for employers and employees alike.

“We are not in favor of check-the-box for knowledge and skills,” said Rodney Petersen, NICE director at NIST.  “We really want a robust process for validating an employee’s knowledge, skills and abilities or a job seeker’s knowledge, skills, and abilities.”

The NICE Cybersecurity Workforce Framework (NCWF) – released by NIST in November 2016 – is the centerpiece, describing seven broad job categories: securely provision; operate and maintain; protect and defend; analyze; collect and operate; oversight and development; and investigate. It also includes 31 specialty areas and 50 work roles, each predicated on specific knowledge and skills, Petersen said.

NICE aims to improve education programs, co-curriculum experiences, training and certification to increase the quality of those credentials, he added.

NICE also impacts certifications. Defense Department Directive 8140, Cyberspace Workforce Management, issued in August 2015, sets the stage for replacing DOD’s certification-based requirements with skill-based assessments rooted in NICE.

According to the 2017 Global Information Security Workforce Study, 30 percent of federal respondents said their organizations have at least partially adopted the NICE Cybersecurity Workforce Framework.

The U.S. Department of Homeland Security (DHS) is using the NICE framework to build up its cybersecurity workforce. As a government-wide workforce framework, NICE “helps us to implement best practices, to identify, find and recruit the really good people,” Phyllis Schneck, DHS deputy undersecretary, told GovTechWorks last year.

Some certifying organizations are starting to develop new “performance-based” certifications that are more in line with the NICE standard:  ISACA unveiled its Cyber Security Nexus Practitioner (CSXP) certification, which tests a candidate’s skills in a live, virtual cyber-lab, and CompTIA’s A+, Network+, Security+ and CompTIA Advanced Security Practitioner (CASP) certifications also include performance-based assessments.

Both ISACA and CompTIA are building their new hands-on programs around the NICE standards and definitions. NICE doesn’t undo the call for certifications, but instead emphasizes functional roles to better align candidates’ skills with specific job functions.

(ISC)2 began mapping CISSP certification requirements to the NICE Cybersecurity Workforce Framework last year, Waddell said.

“Certification is just the beginning,” he added. “You are now required to maintain that certification. You are required to set aside a certain number of hours per year to maintain that certification.” Those Continuing Professional Education (CPE) hours can include hands-on training or even skilled-based micro certs.

“In a perfect world,” Waddell said, “a certification program and certificate program can co-exist in a healthy way.”

Cyber Alert Overload: 3 Steps to Regaining Control

Industry’s response to the proliferation of cyber attacks is a growing array of technologies and services designed to address them. Network owners add these products as new attack vectors emerge. One result: A growing cybersecurity stack with overlapping tools that produce so many alerts it is difficult for analysts to sift the signal from the noise.

“The administrator becomes numb to the alerts,” said Curtis Dukes, executive vice president of the Center for Internet Security (CIS) and the National Security Agency’s former director of information assurance. That means significant threats can go unaddressed.

“It’s an old problem that has been dealt with periodically and that comes back again,” said John Pescatore, director of emerging security trends at the SANS Institute who previously designed secure communications systems for the NSA and the Secret Service.

Standardizing technology and processes, prioritizing risks and automating processes are each critical to developing the right solution for an organization.

“It’s well known that most enterprises use only 15 percent to 20 percent of the technical capability already available within their toolsets,” said Chris Barnett, chief technology officer in General Dynamics Information Technology’s (GDIT) Intelligence Solutions Division. “It takes both time and expertise to implement the more advanced capabilities found in many of today’s tools. Standardizing tools across the enterprise gives security engineers the opportunity to leverage those sophisticated capabilities and provides opportunities for process automation and event correlation.”

The problem is less one of false positives than of repeat offenders: multiple products can flag alerts for the same threat or incident.

Security Information and Event Management (SIEM) tools were created in the 1990s in response to information and alerts being generated by perimeter security products such as antivirus and firewalls. This helped reduce the alert volume to a dull roar, Pescatore said. But products eventually fall behind the flood of alerts produced by new security tools, and administrators are again facing alert overload.

The increasing complexity of the Defense Department’s cybersecurity toolset “is driving inefficiencies,” Col. Brian Lytell, the Defense Information Systems Agency’s (DISA) deputy director of cyber development, said in December. “I’m going to have to eliminate some things within the architecture itself to try to simplify it and reduce it down.”

DISA has been evaluating each component in its security stack to determine which it will keep and which it will phase out. The agency’s problem is not unique: IT security stacks tend to grow ad hoc, so periodically modernizing and streamlining to create a more coherent cybersecurity environment is a good idea. But unwinding a complex security solution is time-consuming and complicated. Few enterprises can match the kind of enterprise-wide reach DISA possesses, and even DISA does not control all DOD IT systems.

Chriss Knisley, executive vice president at the security analytics company Haystax Technologies, said a study for one customer found that its systems generated 35,000 alerts over a three-month period – about 390 per day. The 2016 State of Monitoring survey by BigPanda found that only 17 percent of organizations receiving 100 or more alerts a day were able to address all of them within 24 hours.

Fortunately it is not necessary to address every alert. Many alerts are duplicates resulting from the same incident or activity. Of those that remain, some are low risk and can be assigned a lower priority in an effective risk management program.
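In code, that triage amounts to collapsing alerts that share an incident key and sorting what remains by risk. A minimal sketch with hypothetical alert records and scores:

    # Hypothetical alerts from overlapping tools; several describe the same
    # underlying incident on the same host.
    alerts = [
        {"source": "ids",  "host": "web01", "signature": "CVE-2017-0144", "risk": 9},
        {"source": "av",   "host": "web01", "signature": "CVE-2017-0144", "risk": 8},
        {"source": "siem", "host": "hr-db", "signature": "brute-force",   "risk": 6},
        {"source": "ids",  "host": "kiosk", "signature": "port-scan",     "risk": 2},
    ]

    # 1. Deduplicate: alerts sharing a (host, signature) key are one incident;
    #    keep the highest risk score seen for it.
    incidents = {}
    for a in alerts:
        key = (a["host"], a["signature"])
        if key not in incidents or a["risk"] > incidents[key]["risk"]:
            incidents[key] = a

    # 2. Prioritize: work the queue highest-risk first; low-risk items can be
    #    deferred or handed to automated response.
    for incident in sorted(incidents.values(), key=lambda a: a["risk"], reverse=True):
        print(incident["host"], incident["signature"], incident["risk"])

Four raw alerts become three incidents, and the kiosk port scan drops to the bottom of the queue.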

SIEM tools provide a significant capability for data collection, correlation and risk management, GDIT’s Barnett said. “We’ve built applications using existing SIEM tools to automate, track and report performance-based metrics and dashboards to support risk-based prioritization,” he explained. “We’ve even been able to include logic that automatically changes colors based upon thresholds and service level agreements. Leveraging existing tools this way builds a strategic, scalable capability within the customer space that enables the agency to leverage its existing tool investments to replace time-consuming, manual methods.”

There are several other practical steps for addressing alert overload and improving overall security. “My advice to DISA is to standardize on consensus-based security benchmarks,” Dukes said. “That would go a long way.” This can help prioritize threats and alerts, automate analysis and response, and reduce the burden on personnel.

Pescatore, Dukes and Barnett outline three essential steps to address alert overload:

1 Standardize

The bible for federal cybersecurity is the National Institute of Standards and Technology’s (NIST) Special Publication 800-53, Security and Privacy Controls for Federal Information Systems and Organizations. It contains a 233-page catalog of security controls that agencies can use. But not every agency will need every control; each agency is responsible for selecting the controls that meet its needs.

To jumpstart this task, the NSA in 2008 commissioned a list of controls that would help the DOD address “known bads” – the most pervasive and dangerous threats. The result was the 20 Critical Security Controls, developed through a consensus of industry and government experts and maintained by CIS.

This list is not a complete cybersecurity program; it reflects the 80/20 principle that a small number of actions – if they are the right actions – can address a large percentage of threats. “Organizations that apply just the first five CIS Controls can reduce their risk of cyberattack by around 85 percent,” according to CIS. “Implementing all 20 CIS Controls increases the risk reduction to around 94 percent.” Using a standardized set of controls makes it easier for security teams to focus on alerts that represent the most serious threats.

Standards-based security tools make it easier to implement third-party analytics and automation solutions. The Security Content Automation Protocol (SCAP), developed by NIST, standardizes how security information is generated, allowing automated management of alerts. When security content is standardized, redundant alerts from multiple products can be eliminated, reducing the number of alerts.

2 Prioritize

The total number of alerts and threats you address is less important than their seriousness. “You don’t have to fix everything, but you should do the business-critical things first,” Pescatore said. “Focus on the mission, not the nuisances.”

Prioritization is a force-multiplier, enabling limited manpower to focus on the things that pose the greatest threat to operations. To ensure that you are using the right controls and getting the right alerts, you need to understand your enterprise and its mission. This requires full discovery of the network and attached systems and collaboration with lines-of-business officials. These officials can identify the agency crown jewels in terms of processes and data so that alerts are aligned with high-value and high-impact resources.

When you know what is important, you can configure and tune the tools in your security stack to provide the information you need. You don’t have to ignore lower-priority events, but these can be dealt with on a different schedule or assigned for automated response.

3 Automate

Automation is not a silver bullet. Letting tools automatically respond to security and threat alerts “almost never works” because of the complexity of IT systems, Pescatore said. Security fixes, patches and configurations often must be tested before they are applied. Intrusion prevention systems (IPS) can automatically block suspect activity, but this is impractical in critical environments where false positives cannot be tolerated. IPSs often are used to alert rather than respond, creating another source of alerts.

But automated tools can be effective for sorting and evaluating alerts, eliminating duplicate information and identifying the most serious threats. SIEM tools are helpful here, but they work with proprietary products and protocols, Dukes said. They work through product APIs, and in a multi-vendor environment the number of SIEMs can multiply, adding complexity.

This is where SCAP comes in. Federal agencies are required to use SCAP-compliant security products when they are available. By creating an environment in which security information is standardized for automation, administrators can come closer to the “single pane of glass” that gives full visibility into the status of and activity on the network while reducing the number of alerts.
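The payoff of standardized content is that alerts from different vendors can be normalized into one schema and deduplicated by shared identifiers such as CVE IDs. A minimal sketch – the vendor formats below are invented for illustration:

    def normalize(raw: dict) -> dict | None:
        """Map a vendor-specific alert to a minimal common schema."""
        if raw.get("vendor") == "acme_ids":
            return {"cve": raw["threat_ref"], "asset": raw["dst_ip"]}
        if raw.get("vendor") == "zenith_av":
            return {"cve": raw["cveId"], "asset": raw["endpoint"]}
        return None  # unknown format: route to manual review

    raw_alerts = [
        {"vendor": "acme_ids", "threat_ref": "CVE-2017-5638", "dst_ip": "10.0.0.5"},
        {"vendor": "zenith_av", "cveId": "CVE-2017-5638", "endpoint": "10.0.0.5"},
    ]

    unique = {(n["cve"], n["asset"]) for a in raw_alerts if (n := normalize(a))}
    print(unique)  # one finding, not two

Two vendor alerts describing the same vulnerability on the same asset collapse into a single finding once they speak the same language.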

Each of these activities supports the other two. Together they can reduce and sort through the growing volume of alerts being generated in an increasingly complex threat and security environment. The necessary humans in the loop are better informed so that they can focus on the most important tasks. “If I can do that, I’m ahead of the game,” Pescatore said. “I’m winning the battle.”
