Virtualization Could Redefine the Way Your Agency Looks at Cloud

Software-defined data centers (SDDCs) offer an alternative to commercial cloud deployments: an architecture that can be more efficient than both conventional data center designs and commercial infrastructure services, and one that eases the path for moving applications into a cloud environment.

Defining the data center in software helps managers escape the tyranny of hardware to become more agile and more efficient, while maintaining complete control over system security.

Market researcher Gartner says SDDCs will be a global priority by 2020 and Allied Market Research projects they will account for a $139 billion global market by 2022. In a survey of 500 business and IT leaders, security solutions provider HyTrust found 65 percent of respondents predict faster SDDC deployments and 51 percent say SDDC will deliver tangible benefits and a quantifiable return on investment.

For government, the potential is far more bang for every information technology buck.

SDDC “allows you to provision applications fast,” said Jacob Jensen, Director of Product Management, Data Center Business Group at Cisco. “People build data centers to run applications, so in government the focus is always on how you provision those applications fast, how you can have less downtime in those applications, how you can have more flexibility.”

By defining everything in software, operators can set common rules and apply them across multiple iterations of an application, speeding performance for end users. Divorcing networking functions from the hardware layer frees IT managers from legacy equipment and gives them a replacement that can be reconfigured and upgraded at will.

IT managers in Durham County, N.C., discovered this as they began transitioning from creaking legacy data center equipment to an SDDC environment.

Managing conventional hardware was burdensome, said Seth M. Price, senior network engineer at Durham County Information Services and Technology. “Networking in general has been very labor intensive,” he added. “You have gear that has to be programmed individually. You have to tell the different pieces of equipment how to talk and what they can talk to each other about. It takes a vast body of skills and a lot of labor to manage that way.”

Switching to SDDC “has given us a more holistic approach,” Price said. “It allows us to build policy in one area and push it out to the entire network. We say: This is what we want to accomplish. The software-defined controller pushes it out and it automatically configures itself. It is much less labor-intensive.”
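In practice, the centralized push Price describes boils down to a single call against a controller API. The Python sketch below is illustrative only: the controller address, endpoint and payload schema are assumptions, not any particular vendor’s product.

    # Minimal sketch: define a network policy once and push it to a (hypothetical)
    # SDDC controller, which then configures every device and host that matches.
    import requests

    CONTROLLER = "https://sddc-controller.example.local/api/v1"   # placeholder address

    policy = {
        "name": "web-tier-baseline",
        "applies_to": {"tag": "web"},            # every workload tagged "web"
        "rules": [
            {"allow": "tcp/443", "from": "any"},
            {"allow": "tcp/8443", "from": "app-tier"},
        ],
        "default": "deny",                       # anything not listed is blocked
    }

    # One call; the controller fans the policy out to every matching device.
    resp = requests.post(f"{CONTROLLER}/policies", json=policy, timeout=30)
    resp.raise_for_status()
    print("Policy pushed:", resp.json().get("id"))

Defined this way, the rule set lives in one place; changing it means editing the policy and pushing again, not reprogramming each switch by hand.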

Overall, a software-defined approach offers the promise of streamlined management.

“If you are doing a lot of things like bringing up servers, patching them, installing operating systems – all those things take a lot of time to do and are prone to error,” said Mat Mathews, co-founder and vice president of product for cloud-based networks specialist Plexxi. “In the SDDC context, you push a button and say: ‘OK, move this virtual machine with all its data over here.’”
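What Mathews describes amounts to one orchestration request followed by a status check. The sketch below assumes a hypothetical orchestration API; the endpoint, VM name, host name and job fields are made up for illustration.

    # Illustrative only: "push a button and move this virtual machine" expressed
    # as a call to a hypothetical orchestration layer, then polling until done.
    import time
    import requests

    ORCHESTRATOR = "https://orchestrator.example.local/api/v1"   # placeholder address

    # Ask the orchestrator to relocate the VM and its storage to another host.
    job = requests.post(
        f"{ORCHESTRATOR}/vms/finance-db-01/migrate",
        json={"target_host": "esx-rack4-07", "move_storage": True},
        timeout=30,
    ).json()

    # Poll the job until the migration finishes; configuration and data move together.
    while True:
        status = requests.get(f"{ORCHESTRATOR}/jobs/{job['id']}", timeout=30).json()
        if status["state"] in ("completed", "failed"):
            print("Migration", status["state"])
            break
        time.sleep(10)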

Architecture vs. Service
It’s easy to confuse SDDC with cloud technology. The two are related but different: cloud typically refers to a service for which customers pay by use, whether for infrastructure, software or a platform. SDDC is an architecture.

“Cloud usually refers to the ability to have self-service, dynamic scaling and metered pay-per-use,” said Miklos Sandorfi, senior vice president for product engineering at Sungard Availability Services. SDDC on the other hand “can support delivering these characteristics, but by itself does not require them.”

“At the core of it, the software defined data center is focused around the virtualized applications,” Jensen said. “When you make your applications virtualized from the get-go, this makes them inherently cloud friendly. Typically, applications are very hard to move and hard to manage in the cloud. With a virtualized application, you can literally zip it up and send it over and it is ready to roll.”
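Jensen’s “zip it up and send it over” can be approximated with ordinary tooling: bundle an exported virtual appliance into a single archive and copy it to cloud object storage. In the sketch below, the directory, archive path and bucket name are placeholders, and the upload assumes cloud credentials are already configured.

    # Package an exported virtual appliance and copy it to object storage.
    import tarfile
    import boto3

    APPLIANCE_DIR = "/exports/payroll-app"        # exported VM disks and descriptor (placeholder)
    ARCHIVE = "/exports/payroll-app.tar.gz"

    # Bundle the exported appliance into one portable archive.
    with tarfile.open(ARCHIVE, "w:gz") as tar:
        tar.add(APPLIANCE_DIR, arcname="payroll-app")

    # Upload the archive to an object store, where it can be imported and run.
    s3 = boto3.client("s3")
    s3.upload_file(ARCHIVE, "agency-cloud-imports", "payroll-app.tar.gz")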

For IT departments considering a move to the cloud, SDDC offers a potential solution that could save money and preserve control, answering two of the biggest anxieties IT leaders face as they contemplate cloud architectures.

“If you are thinking cloud only, you may be going down the wrong path,” Plexxi’s Mathews said. “The economics of cloud make a lot of sense for unpredictable workloads. But the cloud can be really expensive” for predictable work, where there’s no need to suddenly ramp up capacity or scale it back down. In those cases, SDDC may offer a better alternative to a conventional Infrastructure-as-a-Service solution.

“When you are talking about things that are stable, where you understand the requirements well, at that point it might make sense to own rather than rent,” he added.

Indeed, that’s just how the Durham, N.C., team is approaching it. “I don’t think we will ever go completely to the cloud,” Durham County’s Price said. “But the cloud is good for specific purposes, specific services and applications.”

For cases where cloud is appropriate, the virtualized nature of SDDC makes it easier to get there. A software-based architecture “gives us the ability to connect easily from our data center to locations in the cloud,” he said. “The process of bringing those new services into production in the cloud can be automated, just as they can from inside our network. We can manage those cloud services from our data center fabric just like we can our internal assets.”

There are risks, however. More autonomy means IT departments can spin up applications, configure firewalls and deploy networks much more quickly, using a single set of rules that can be deployed against multiple iterations in the network. “But if you don’t configure your role-based access and rules properly, one click can open up the firewall and everyone from the internet can come on in,” said Ed Amiryar, senior systems engineer and operations manager at NetCentrics.

Planning is key. “You have to map out your policies and role-based access right from the beginning,” he said. “You can’t just practice as you go, because the problems will compound very quickly.”
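One way to build that discipline in from the start is a pre-flight audit that refuses to push any rule exposing a service to the entire internet unless it has been explicitly reviewed. The rule format and the “reviewed” flag below are illustrative, not a specific product’s schema.

    # Hedged sketch: check proposed firewall rules before they are pushed.
    from ipaddress import ip_network

    def audit_rules(rules):
        """Return a list of problems found before the rules are applied."""
        problems = []
        for rule in rules:
            src = ip_network(rule["source"])
            # A /0 source means "the whole internet" -- require an explicit review.
            if src.prefixlen == 0 and not rule.get("reviewed", False):
                problems.append(f"Rule '{rule['name']}' exposes port "
                                f"{rule['dest_port']} to the internet without review.")
        return problems

    proposed = [
        {"name": "web-ingress", "source": "0.0.0.0/0", "dest_port": 443, "reviewed": True},
        {"name": "temp-debug", "source": "0.0.0.0/0", "dest_port": 22},
    ]

    issues = audit_rules(proposed)
    if issues:
        raise SystemExit("Refusing to push policy:\n" + "\n".join(issues))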

With SDDC, the software-defined environment is more secure at the infrastructure and platform level, said Stan Tyliszczak, vice president for technology integration at General Dynamics Information Technology. “But that’s only part of the security challenge,” he said. “Virtualized applications can still be insecure if, for example, poor coding practices are followed. What SDDC does is allow you to shift your focus from securing the infrastructure to concentrate instead on making sure the application code you build doesn’t have embedded security vulnerabilities. That’s a huge paradigm shift.”

Joel Bonestell, a network services manager in Durham County, said that shift requires agencies to take a fresh look at their approach to security.

While most networks operate on a black-list model, where anything is OK as long as it’s not on the list, SDDC works off a white-list model. “Nobody can talk to each other unless you tell them they can talk to each other,” he said. “So it’s a complete reverse in terms of how you think about security.”
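The logic behind that model is simple: a flow passes only if it appears on an explicit allow list, and everything else is dropped. The sketch below illustrates the idea with made-up tier names; real SDDC platforms express the same rule through their own policy objects.

    # Minimal sketch of a default-deny (white-list) model: no two workloads may
    # talk unless an explicit allow entry exists for that source, destination and port.
    ALLOWED_FLOWS = {
        ("web-tier", "app-tier", 8443),
        ("app-tier", "db-tier", 5432),
    }

    def is_allowed(src_group: str, dst_group: str, port: int) -> bool:
        # Default deny: only explicitly listed flows pass.
        return (src_group, dst_group, port) in ALLOWED_FLOWS

    assert is_allowed("web-tier", "app-tier", 8443)
    assert not is_allowed("web-tier", "db-tier", 5432)   # blocked unless listed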

This poses a challenge to government users with special security needs.

“In government systems, everything has to be certified, whether it is FedRAMP or DoD certifications or something else,” said Steve Wallo, chief technology officer at Brocade Federal. “The industry has been very good about getting hardware components certified but now that has to evolve. When you have a virtualized piece, what exactly is tested? Where is it tested? Is it the hypervisor? Is it the services?”

Of equal concern is the fact that a different way of architecting a data center will inevitably require a different set of skills.

“In the past you had good software-centric folks, people who could write code and could work in a virtualized environment,” Wallo said. “On the other hand you had routers and firewalls, and that took a dedicated skill set. Now you are blending those into one thing and it is going to take a very specific skill set” to be able to do both.

In Durham, Bonestell said clearing those hurdles is worth the effort because it delivers a more efficient system. “They’ll be able to do their work smarter,” he said. “It’s going to give my people more time to spend focusing on more innovative projects. It will help them to stay with the forward-thinking trends.”
