Five Federal IT Trends to Watch in 2018

Out with the old, in with the new. As the new year turns, it’s worth looking back on where we’ve been to better grasp where we’re headed tomorrow.

Here are five trends that took off in the year past and will shape the year ahead:

1. Modernization
The White House spent most of 2017 building its comprehensive Report to the President on Federal IT Modernization, and it will spend most of 2018 executing the details of that plan. One part assessment, one part roadmap, the report defines the IT challenges agencies face and lays out a future that will radically alter the way feds manage and acquire information technology.

The plan calls for consolidating federal networks and pooling resources and expertise by adopting common, shared services. Those steps will accelerate cloud adoption and centralize control over many commodity IT services. The payoff, officials argue: “A modern Federal IT architecture where agencies are able to maximize secure use of cloud computing, modernize Government-hosted applications and securely maintain legacy systems.”

What that looks like will play out over the coming months, as agencies respond to a series of information requests and leaders at the White House, the Office of Management and Budget, the Department of Homeland Security and the National Institute of Standards and Technology work through some 50 tasks due by July 4, 2018.

Among them:

  • 12 recommendations to prioritize modernization of high-risk, high-value assets (HVAs)
  • 11 recommendations to modernize both Trusted Internet Connections (TIC) and the National Cybersecurity Protection System (NCPS) to improve security while enabling those systems to migrate to cloud-based solutions
  • 8 recommendations to support agencies’ adoption of shared services, accelerating the shift to commercial cloud services and infrastructure
  • 10 recommendations designed to accelerate broad adoption of commercial cloud-based email and collaboration services, such as Microsoft Office 365 or Google’s G Suite
  • 8 recommendations to improve existing shared services and expand such offerings, especially to smaller agencies

The devil will be in the details. Some dates to keep in mind: an updated Federal Cloud Computing Strategy (due April 30); a plan to standardize cloud contract language (due from OMB by June 30); and a plan to improve the speed, reliability and reuse of authority to operate (ATO) approvals for software-as-a-service (SaaS) and other shared services (due April 1).

2. CDM’s Eye on Cyber
The driving force behind the modernization plan is cybersecurity, and a key to the government’s cyber strategy is the Department of Homeland Security’s (DHS) Continuous Diagnostics and Mitigation (CDM) program. DHS will expand CDM to “enable a layered security architecture that facilitates transition to modern computing in the commercial cloud.”

Doing so means changing gears. CDM’s Phase 1, now being deployed, is designed to identify what is connected to federal networks. Phase 2 will identify who is on the network. Phase 3 will monitor activity on the network, including the ability to detect and analyze anomalies for signs of compromise, and Phase 4 will focus on securing government data.
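
To give a feel for the kind of anomaly analysis Phase 3 describes, here is a minimal Python sketch. It is illustrative only – the hosts, traffic figures and threshold are invented, and it is not CDM tooling – but it shows the basic pattern of flagging behavior that deviates sharply from a historical baseline:

    # Illustrative sketch: flag hosts whose outbound traffic jumps well beyond
    # their historical baseline. Hosts and figures are invented for the example.
    from statistics import mean, stdev

    # Hypothetical per-host history of daily outbound traffic, in megabytes.
    baseline = {
        "10.1.4.22": [120, 135, 110, 128, 140],
        "10.1.4.57": [300, 280, 310, 295, 305],
    }

    # Today's observed totals for the same hosts.
    observed = {"10.1.4.22": 131, "10.1.4.57": 2400}

    def anomalies(baseline, observed, threshold=3.0):
        """Return hosts whose observed value exceeds mean + threshold * stdev."""
        flagged = []
        for host, history in baseline.items():
            mu, sigma = mean(history), stdev(history)
            if observed.get(host, 0) > mu + threshold * sigma:
                flagged.append(host)
        return flagged

    print(anomalies(baseline, observed))  # ['10.1.4.57']

In practice the signals are far richer – logins, configuration changes, data flows – but the principle of comparing live activity against a known baseline is the same.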

Now CDM will have to adopt a new charge: securing government systems in commercial clouds, something not included in the original CDM plan.

“A challenge in implementing CDM capabilities in a more cloud-friendly architecture is that security teams and security operations centers may not necessarily have the expertise available to defend the updated architecture,” DHS officials write in the modernization report. “The Federal Government is working to develop this expertise and provide it across agencies through CDM.” The department is developing a security-as-a-service model with the intent of expanding CDM’s reach beyond the 68 agencies currently using the program to include all civilian federal agencies, large and small.

3. Protecting Critical Infrastructure
Securing federal networks and data is one thing, but 85 percent of the nation’s critical infrastructure is in private, not public, hands. Figuring out how best to protect that privately owned infrastructure – the electric grid, oil and gas pipelines, dams and levees, public communications networks and more – has long been a thorny issue.

The private sector has historically enjoyed the freedom of managing its own security – and privacy. But growing cyber and terrorist threats, and the potential liability that could stem from such attacks, mean those businesses increasingly welcome the guiding hand and cooperation of federal regulators.

To date, this responsibility has taken root in DHS’s National Protection and Programs Directorate (NPPD), which operates largely beneath the public radar. That could soon change: The House voted in December to elevate NPPD to become the next operational component within DHS, joining the likes of Customs and Border Protection, the Coast Guard and the Secret Service.

NPPD would become the Cybersecurity and Infrastructure Security Agency, and while the new status would not explicitly expand its portfolio, it would pave the way for increased influence within the department and a greater voice in the national debate.

First, it’s got to clear the Senate. The Cybersecurity and Infrastructure Security Agency Act of 2017 faces an uncertain future in the upper chamber because of complex jurisdictional issues and a gridlocked legislative process that makes passage of any bill an adventure – even one that, as in this case, has the active backing of both the White House and DHS leadership.

4. Standards for IoT
The Internet of Things (IoT), the Internet of Everything, the wireless, connected world – call it what you will – is challenging the makers of industrial controls and network-connected technology to rethink security and their entire supply chains.

If a lightbulb, a camera, a motion detector or any other networked sensor can be controlled via a network, it can also be co-opted by bad actors in cyberspace. But while manufacturers have been quick to field network-enabled products, most have been slow to ensure those products are safe from hackers and abuse.

Rep. Jim Langevin (D-R.I.) advocates legislation to mandate better security in connected devices. “We need to ensure we approach the security of the Internet of Things with the techniques that have been successful with the smart phone and desktop computers: The policies of automatic patching, authentication and encryption that have worked in those domains need to be extended to all devices that are connected to the Internet,” he said last summer. “I believe the government can act as a convener to work with private industry in this space.”

The first private standard for IoT devices was approved in July, when the American National Standards Institute (ANSI) endorsed UL 2900-1, General Requirements for Software Cybersecurity for Network-Connectable Products. Underwriters Laboratories (UL) plans two follow-on standards: UL 2900-2-1 for network-connectable healthcare systems and UL 2900-2-2 for industrial control systems.

Sens. Mark R. Warner (D-Va.) and Cory Gardner (R-Colo.), co-chairs of the Senate Cybersecurity Caucus, introduced the Internet of Things Cybersecurity Improvement Act of 2017 in August, with an eye toward holding suppliers accountable for selling insecure connected products to the federal government.

The bill would require vendors supplying IoT devices to the U.S. government to ensure those devices are patchable, are not hard-coded with unchangeable passwords and are free of known security vulnerabilities. It would also require automatic, authenticated security updates from the manufacturer. The measure has been criticized for its vague definitions and language, and for limiting its scope to products sold to the federal government.

Yet in a world where cybersecurity is a growing liability concern for businesses of every stripe – and where there is a dearth of industry standards – such a measure could become a benchmark requirement imposed by non-government customers, as well.
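
As a rough illustration of what the bill’s call for patchable devices and automatic, authenticated updates implies in practice, here is a minimal Python sketch. The keys, file contents and function names are hypothetical and not drawn from the legislation; the point is simply that a device should refuse any update that does not verify against the vendor’s public signing key:

    # Sketch: accept a firmware update only if it carries a valid vendor signature.
    # Uses the third-party 'cryptography' package; keys and data are invented.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey, Ed25519PublicKey)
    from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

    def update_is_authentic(vendor_public_key: bytes, firmware: bytes,
                            signature: bytes) -> bool:
        """Return True only if the firmware image verifies against the vendor's key."""
        key = Ed25519PublicKey.from_public_bytes(vendor_public_key)
        try:
            key.verify(signature, firmware)
            return True
        except InvalidSignature:
            return False

    # Simulate the vendor signing an update and the device checking it.
    vendor_key = Ed25519PrivateKey.generate()
    public_bytes = vendor_key.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
    image = b"example firmware image v1.2.3"
    print(update_is_authentic(public_bytes, image, vendor_key.sign(image)))  # True
    print(update_is_authentic(public_bytes, image, b"\x00" * 64))            # False

A real device would go further – pinning the key in hardware and guarding against rollbacks, for example – but the core rule is the same: no valid signature, no install.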

5. Artificial Intelligence
2017 was the year when data analytics morphed into artificial intelligence (AI) in the public mindset. Government agencies are only now making the connection that their massive data troves could fuel a revolution in how they manage, fuse and use data to make decisions, deliver services and interact with the public.

According to market researcher IDC, that realization is not limited to government: “By the end of 2018,” the firm predicts, “half of manufacturers will be using analytics, IoT, and social collaboration tools to extend the integrated planning process across the entire enterprise, in real time.”

Gartner goes even further: “The ability to use AI to enhance decision making, reinvent business models and ecosystems, and remake the customer experience will drive the payoff for digital initiatives through 2025,” the company predicts. More than half of businesses and agencies are still searching for strategies, however.

“Enterprises should focus on business results enabled by applications that exploit narrow AI technologies,” says David Cearley, vice president and Gartner Fellow, Gartner Research. “Leave general AI to the researchers and science fiction writers.”

AI and machine learning will not be stand-alone functions, but rather foundational components that underlie the applications and services agencies employ, Cearley says. For example, natural language processing – think of Amazon’s Alexa or Apple’s Siri – can now handle increasingly complicated tasks, promising more sophisticated, faster interactions when the public calls a government 800 number.
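
As a toy illustration of the intent recognition behind such interactions – the training phrases and categories below are invented, and no agency’s actual system is implied – a few lines of Python with scikit-learn can route a caller’s request to the right queue:

    # Toy intent classifier: map a caller's phrase to a service category.
    # Training data is invented for illustration; real systems need far more.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    phrases = [
        "I need to check the status of my benefits claim",
        "has my claim been approved yet",
        "I want to renew my passport",
        "how do I apply for a new passport",
        "I have a question about my tax refund",
        "where is my refund",
    ]
    intents = ["claims", "claims", "passport", "passport", "refund", "refund"]

    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(phrases, intents)

    print(model.predict(["when will my claim be processed"])[0])  # expected: claims

Production assistants layer on speech recognition, dialogue management and far larger training sets, but intent classification of this kind sits at the core.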

Michael G. Rozendaal, vice president for health analytics at General Dynamics Information Technology’s Health and Civilian Solutions Division, says today’s challenge with AI is twofold: first, finding the right applications that provide a real return on investment, and second, overcoming security and privacy concerns.

“There comes a tipping point where challenges and concerns fade and the floodgates open to take advantage of a new technology,” Rozendaal told GovTechWorks. “Over the coming year, the speed of those successes and lessons learned will push AI to that tipping point.”

What This Means for Federal IT
Federal agencies face tipping points across the technology spectrum. The pace of change quickens as the pressure to modernize increases. While technology is an enabler, new skills will be needed for cloud integration, shared services security and agile development. Similarly, the emergence of new products, services and providers greatly expands agencies’ choices. But each of those choices has its own downstream implications and risks, from vendor lock-in to bandwidth and run-time challenges. With each new wrinkle, agency environments become more complex, demanding ever more sophisticated expertise from those pulling these hybrid environments together.

“Cybersecurity will be the linchpin in all this,” says Stan Tyliszczak, vice president and chief engineer at GDIT. “Agencies can no longer afford the cyber risks of NOT modernizing their IT. It’s not whether or not to modernize, but how fast can we get there? How much cyber risk do we have in the meantime?”

Cloud is ultimately a massive integration exercise with no one-size-fits-all answers. Agencies will employ multiple systems in multiple clouds for multiple kinds of users. Engineering solutions to make those systems work harmoniously is essential.
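
One way to picture that engineering work – a simplified sketch, not any agency’s or vendor’s actual design – is the thin abstraction layer that lets applications use different clouds through a single interface:

    # Illustrative only: a common interface over two hypothetical object stores,
    # the kind of abstraction multi-cloud integration depends on.
    from abc import ABC, abstractmethod

    class ObjectStore(ABC):
        """Uniform contract every cloud backend must satisfy."""
        @abstractmethod
        def put(self, key: str, data: bytes) -> None: ...

        @abstractmethod
        def get(self, key: str) -> bytes: ...

    class CloudABucket(ObjectStore):
        """Stand-in for provider A's SDK; stores objects in memory here."""
        def __init__(self):
            self._objects = {}

        def put(self, key, data):
            self._objects[key] = data

        def get(self, key):
            return self._objects[key]

    class CloudBBucket(ObjectStore):
        """Stand-in for provider B's SDK; same contract, different backend."""
        def __init__(self):
            self._objects = {}

        def put(self, key, data):
            self._objects[key] = data

        def get(self, key):
            return self._objects[key]

    # Application code targets the interface, not a provider, so workloads
    # can move between clouds without rewrites.
    for store in (CloudABucket(), CloudBBucket()):
        store.put("records/case-001.json", b'{"status": "open"}')
        print(store.get("records/case-001.json"))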

“Turning on a cloud service is easy,” Tyliszczak says. “Integrating it with the other things you do – and getting that integration right – is where agencies will need the greatest help going forward.”
