Virtualization Could Redefine the Way Your Agency Looks at Cloud

Software-defined data centers (SDDCs) offer an alternative to commercial cloud deployments, one that is potentially more efficient than both conventional data center architectures and commercial infrastructure models, and an effective path for moving applications into a cloud environment.

Defining the data center in software helps managers escape the tyranny of hardware to become more agile and more efficient, while maintaining complete control over system security.

Market researcher Gartner says SDDCs will be a global priority by 2020 and Allied Market Research projects they will account for a $139 billion global market by 2022. In a survey of 500 business and IT leaders, security solutions provider HyTrust found 65 percent of respondents predict faster SDDC deployments and 51 percent say SDDC will deliver tangible benefits and a quantifiable return on investment.

For government, the potential is far more bang for every information technology buck.

SDDC “allows you to provision applications fast,” said Jacob Jensen, Director of Product Management, Data Center Business Group at Cisco. “People build data centers to run applications, so in government the focus is always on how you provision those applications fast, how you can have less downtime in those applications, how you can have more flexibility.”

By defining everything in software, operators can set common rules and apply them against multiple iterations of an application, speeding performance for end users. Divorcing networking functions from the hardware layer frees IT managers from legacy equipment and replaces it with an infrastructure that can be reconfigured and upgraded at will.

IT managers in Durham County, N.C., discovered this as they began to transition from their creaking legacy data center equipment into an SDDC environment.

Managing conventional hardware was burdensome, said Seth M. Price, senior network engineer at Durham County Information Services and Technology. “Networking in general has been very labor intensive,” he added. “You have gear that has to be programmed individually. You have to tell the different pieces of equipment how to talk and what they can talk to each other about. It takes a vast body of skills and a lot of labor to manage that way.”

Switching to SDDC “has given us a more holistic approach,” Price said. “It allows us to build policy in one area and push it out to the entire network. We say: This is what we want to accomplish. The software-defined controller pushes it out and it automatically configures itself. It is much less labor-intensive.”
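
In rough terms, the model Price describes looks something like the sketch below. The Policy, Controller and Device classes are invented for illustration and are not drawn from any particular product; a real fabric would use a vendor controller (Cisco ACI or VMware NSX, for example) and its own policy objects. The essential move is the same: define intent once, and let the controller render it onto every device in the fabric.

```python
# Illustrative sketch of "define policy once, push it everywhere."
# All class and device names here are invented for this example.

from dataclasses import dataclass

@dataclass(frozen=True)
class Policy:
    name: str
    source: str        # logical group, e.g. "web-tier"
    destination: str   # logical group, e.g. "app-tier"
    port: int
    action: str        # "allow" or "deny"

class Device:
    """Stands in for a switch or firewall in the fabric."""
    def __init__(self, name):
        self.name = name
        self.rules = []

    def apply(self, policy: Policy):
        self.rules.append(policy)
        print(f"{self.name}: configured {policy.name}")

class Controller:
    """Stands in for the software-defined controller Price describes."""
    def __init__(self, devices):
        self.devices = devices   # every element in the fabric

    def push(self, policy: Policy):
        # One policy definition is rendered onto every device,
        # with no box-by-box programming.
        for device in self.devices:
            device.apply(policy)

fabric = Controller([Device("leaf-1"), Device("leaf-2"), Device("border-fw")])
fabric.push(Policy("web-to-app", "web-tier", "app-tier", 8443, "allow"))
```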

Overall, a software-defined approach offers the promise of streamlined management.

“If you are doing a lot of things like bringing up servers, patching them, installing operating systems – all those things take a lot of time to do and are prone to error,” said Mat Mathews, co-founder and vice president of product for cloud-based networks specialist Plexxi. “In the SDDC context, you push a button and say: ‘OK, move this virtual machine with all its data over here.’”

Architecture vs. Service
It’s easy to confuse SDDC with cloud technology. The two are related but different: Cloud typically refers to a service for which customers pay by use, whether for infrastructure, software or a platform. SDDC is an architecture.

“Cloud usually refers to the ability to have self-service, dynamic scaling and metered pay-per-use,” said Miklos Sandorfi, senior vice president for product engineering at Sungard Availability Services. SDDC on the other hand “can support delivering these characteristics, but by itself does not require them.”

“At the core of it, the software defined data center is focused around the virtualized applications,” Jensen said. “When you make your applications virtualized from the get-go, this makes them inherently cloud friendly. Typically, applications are very hard to move and hard to manage in the cloud. With a virtualized application, you can literally zip it up and send it over and it is ready to roll.”

For IT departments considering a move to the cloud, SDDC offers a potential solution that could save money and preserve control, answering two of the biggest anxieties IT leaders face as they contemplate cloud architectures.

“If you are thinking cloud only, you may be going down the wrong path,” Plexxi’s Mathews said. “The economics of cloud make a lot of sense for unpredictable workloads. But the cloud can be really expensive” for predictable work, where there’s no need to suddenly ramp up capacity or scale it back down. In those cases, SDDC may offer a better alternative to a conventional Infrastructure-as-a-Service solution.

“When you are talking about things that are stable, where you understand the requirements well, at that point it might make sense to own rather than rent,” he added.

Indeed, that’s just how the Durham, N.C., team is approaching it. “I don’t think we will ever go completely to the cloud,” Durham County’s Price said. “But the cloud is good for specific purposes, specific services and applications.”

For cases where cloud is appropriate, the virtualized nature of SDDC makes it easier to get there. A software-based architecture “gives us the ability to connect easily from our data center to locations in the cloud,” he said. “The process of bringing those new services into production in the cloud can be automated, just as they can from inside our network. We can manage those cloud services from our data center fabric just like we can our internal assets.”

There are risks, however. More autonomy means IT departments can spin up applications, configure firewalls and deploy networks much more quickly, using a single set of rules that can be deployed against multiple iterations in the network. “But if you don’t configure your role-based access and rules properly, one click can open up the firewall and everyone from the internet can come on in,” said Ed Amiryar, senior systems engineer and operations manager at NetCentrics.

Planning is key. “You have to map out your policies and role-based access right from the beginning,” he said. “You can’t just practice as you go, because the problems will compound very quickly.”

With SDDC, the software-defined environment is more secure at the infrastructure and platform level, said Stan Tyliszczak, vice president for technology integration at General Dynamics Information Technology. “But that’s only part of the security challenge,” he said. “Virtualized applications can still be insecure if, for example, poor coding practices are followed. What SDDC does is allow you to shift your focus from securing the infrastructure to concentrate instead on making sure the application code you build doesn’t have embedded security vulnerabilities. That’s a huge paradigm shift.”

Joel Bonestell, a network services manager in Durham County, said that shift requires agencies to take a fresh look at their approach to security.

While most networks operate on a black-list model, where anything is OK as long as it’s not on the list, SDDC works off a white-list model. “Nobody can talk to each other unless you tell them they can talk to each other,” he said. “So it’s a complete reverse in terms of how you think about security.”
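
The toy check below makes the contrast concrete. The tier names, ports and allowed flows are invented for illustration; real SDDC fabrics express white-listing through vendor-specific contracts or micro-segmentation policy rather than Python, but the default-deny logic works the same way.

```python
# Toy default-deny ("white-list") check, for illustration only.
# The groups and ports are invented; nothing talks unless a flow
# has been explicitly declared.

ALLOWED_FLOWS = {
    ("web-tier", "app-tier", 8443),
    ("app-tier", "db-tier", 5432),
}

def is_permitted(src_group: str, dst_group: str, port: int) -> bool:
    # Default deny: traffic passes only if the flow was whitelisted.
    return (src_group, dst_group, port) in ALLOWED_FLOWS

print(is_permitted("web-tier", "app-tier", 8443))  # True: explicitly allowed
print(is_permitted("web-tier", "db-tier", 5432))   # False: never declared
```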

This poses a challenge to government users with special security needs.

“In government systems, everything has to be certified, whether it is FedRAMP or DoD certifications or something else,” said Steve Wallo, chief technology officer at Brocade Federal. “The industry has been very good about getting hardware components certified but now that has to evolve. When you have a virtualized piece, what exactly is tested? Where is it tested? Is it the hypervisor? Is it the services?”

Of equal concern is the fact that a different way of architecting a data center will inevitably require a different set of skills.

“In the past you had good software-centric folks, people who could write code and could work in a virtualized environment,” Wallo said. “On the other hand you had routers and firewalls, and that took a dedicated skill set. Now you are blending those into one thing and it is going to take a very specific skill set” to be able to do both.

In Durham, Bonestell said approaching and clearing those hurdles is worth the effort, delivering a more efficient system. “They’ll be able to do their work smarter,” he said. “It’s going to give my people more time to spend focusing on more innovative projects. It will help them to stay with the forward-thinking trends.”

Many Options to Help Clean Up Your Data

For organizations looking to make sense of their data, the first step is to ensure the data are managed well – that they are consistent and accurate, and that rules are in force to keep users from doing anything to undermine data integrity. But whether you’re trying to ensure data quality going forward or clean up a whole history of sloppy data, the choices in the marketplace can be bewildering.

Market research firm Gartner reports businesses and governments invested $1.4 billion in data quality tools in 2014. It projects the market will top $2.1 billion by 2017, and it breaks the tools down into these categories:

  • Generalized cleansing. Modification of data values to meet domain restrictions, integrity constraints or other business rules
  • Data profiling and data quality measurement. Analysis to capture metadata that can help to identify data quality issues
  • Parsing and standardization. Decomposing text fields into component parts and formatting values into consistent layouts
  • Matching. Identifying, linking or merging related entries within or across datasets
  • Monitoring. Imposing controls to ensure data conforms to business rules that define data quality for an organization
  • Issue resolution and workflow. Identifying, quarantining and resolving data quality issues using processes and interfaces that enable collaboration with key roles, such as data stewards
  • Enrichment. Enhancing the value of internally held data by appending related attributes from external sources (such as mapping or location data)
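
To make a couple of these categories concrete, here is a minimal sketch, in Python with pandas, of what parsing/standardization and matching look like on a toy dataset. The records and field names are invented; commercial tools apply the same ideas at scale through configurable rules rather than hand-written code.

```python
# Toy illustration of standardization and matching/de-duplication.
# The records and columns are invented for this example.

import pandas as pd

records = pd.DataFrame({
    "name":  ["ACME CORP.", "Acme Corp", "  Beta LLC", "Beta LLC"],
    "state": ["nc", "NC", "N.C.", "NC"],
})

# Parsing and standardization: trim whitespace, normalize case and formats.
records["name"] = records["name"].str.strip().str.upper().str.rstrip(".")
records["state"] = records["state"].str.replace(".", "", regex=False).str.upper()

# Matching: once values are standardized, duplicate entities become exact
# matches and can be linked or merged.
deduplicated = records.drop_duplicates()
print(deduplicated)
```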

Gartner identifies 18 vendors in its “Magic Quadrant” report for data quality tools, a list that includes major database software leaders such as SAP, Oracle and IBM, as well as specialists. The leading specialists, according to Gartner:

  • Informatica. The company’s products are seen as among the easiest for both technical and non-technical users to navigate
  • Information Builders. The tools locate and rectify bad data – and also proactively stop bad data from entering the environment
  • Talend. Its tools profile, cleanse and mask data, using validation, standardization and enrichment techniques, along with integrated parsing technology to standardize and de-duplicate unstructured data
  • Trillium. These products emphasize data governance structures, strategies, practices and policies and include tools to put those in place, the company says
  • Neopost. Validation and standardization products ensure clean data capture, including de-duplication and ensuring new data is relevant when integrating with existing data sets, the company says
  • Ataccama. The company offers industry-specific solutions, including government products addressing such areas as tax compliance, security and law enforcement, government information sharing requirements and interagency collaboration
  • MIOsoft. Tools embrace master data management, data migration and information governance initiatives using a virtual “lens” over data, so that original data and systems are untouched

Dirty Data Got You Down? Clean it Up

You can’t turn a corner these days without someone touting Big Data, data analytics and the power of data-driven decision making as a solution to some of our biggest challenges.

Big data is critical to cyber defense, threat detection, fraud prevention and future advancements in precision medicine. Yet all of these potential advances hinge on the quality of the incoming data, including historic data compiled over years and even decades. Turning existing troves of data into actionable insights is harder than it looks.

Most raw data is too “dirty” – inconsistent, inaccurate, duplicative, irrelevant or incomplete – to be actionable without significant work to clean it up. So-called dirty data costs U.S. businesses $600 billion a year, according to research firm The Data Warehousing Institute. And the problem is likely worse in government.

Take federal spending data on USASpending.gov, for example. The Government Accountability Office (GAO) reported in 2014 that at least $619 billion in spending, representing 302 federal programs, was missing from the site. Two years later, GAO reported that up to 15 percent of data in the Pentagon’s Real Property Assets Database were inaccurate and incomplete. And in September, GAO found billions of dollars in faulty entries on ForeignAssistance.gov.

At the state level, seven out of 10 officials from 46 states reported that data problems were frequently or often an impediment to effectively doing business, according to a recent Governing survey. Even when bad data gets noticed, it doesn’t always get fixed. When a trade association complained to government agencies about the accuracy of data related to lead poisoning, the agencies failed to make corrections in response to 59 of 87 requests.

Human error
Cleaning up databases so they can be plumbed for insights starts with the basics, said Tyler Kleykamp, Connecticut’s state chief data officer. “The first issue is always inconsistency,” he said. “Maybe there are misspellings or format problems. Sometimes a field is entered in all caps and sometimes it is not. These seem like trivial issues to the person entering the data, but they become a problem when you try to use the data.”

Expediency can also cause problems. What if a regulatory change requires operators to begin identifying individuals’ gender when they hadn’t done so before? How that change is implemented will have long-term implications for how the data can be used. If the gender information is in a new field, it can be added for individual entries over time. But if agency managers take a shortcut, such as repurposing an unused field originally intended for some other purpose, the result could be troublesome.

At the Nuclear Regulatory Commission (NRC), a legacy system built over many years to fulfill multiple purposes contains data describing a few thousand licensees. Now, as part of its Reactor Program System modernization effort, the agency wants to extract, clean and build up the data into a manageable, standards-based and shareable database.

“As we set up the interfaces, we are doing the analysis, looking for duplicates, looking for empty fields,” said Cris Brown, NRC master data management program manager. It’s labor-intensive work. “You have to go back and ask the expert in the office: ‘What should this be, really?’ Then you can write a rule around that.”
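
A first-pass profiling step of the kind Brown describes can be as simple as counting empty fields and duplicate identifiers, so the questions for the office experts are concrete. The sketch below is illustrative only; the licensee records and column names are invented, not drawn from NRC systems.

```python
# Minimal data-profiling pass: count empty fields and flag duplicates.
# All records and fields here are invented for illustration.

import pandas as pd

licensees = pd.DataFrame({
    "license_id": ["L-001", "L-002", "L-002", "L-003"],
    "facility":   ["Plant A", None, None, "Plant C"],
    "region":     ["II", "II", "II", None],
})

# Empty fields per column: candidates to take back to the subject experts.
print(licensees.isna().sum())

# Duplicate license IDs: entries that may need to be merged or corrected.
print(licensees[licensees.duplicated(subset="license_id", keep=False)])
```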

To expedite the effort, Brown said, “We have set up data stewards – business people in various offices who can tell us what the information is supposed to look like.”

The technology market research firm Gartner said organizations increasingly identify such roles within the business sides of their operations in recognition that data quality is less an information technology problem than a business process matter.

“Key roles such as data steward, data quality champion, data quality analyst and data owner are more often either on the business side or a hybrid of business and IT roles,” Gartner analysts Saul Judah and Ted Friedman wrote in a November 2015 report. “This indicates greater information maturity in the market and an increasing recognition that ensuring data quality requires organizational collaboration.”

More broadly, Judah and Friedman see this as an indication that over time, database management work will migrate from IT back offices to “self-service capabilities for data quality.” And some vendors are already developing products with that in mind.

More transparent
Brown and others like her spend much of their time cleaning up data rather than helping analyze it. Sometimes files are corrupted, sometimes metadata is incomplete or locked in a proprietary format.

Outsiders face the same challenges when trying to extract insights from government data.

Data advocacy group Open Knowledge International recently conducted an extensive review of government data related to procurements. “We knew no data set would be perfect, but it is worse than we expected,” said Community Manager and Open Data for Development Program Manager Katelyn Rogers. “You can go through government data sets and names will have different forms within a single file. Nothing matches with anything. We will get data sets where big portions are missing. Instead of covering an entire procurement, it only covers 25 percent of the information about that procurement.”

It’s not that government agencies don’t want clean data – they do. At some, like USAID, clean data is even a critical organizational goal. But data quality problems often don’t surface until someone introduces a need or question not asked before. Elizabeth Roen, senior policy analyst in USAID’s Office of Learning, Evaluation and Research, said USAID is looking to outsiders to help identify those holes. “One of the things we are hoping will happen is that when third parties start using this data, that they will alert us to where there are issues,” she said.

“Finally, users need to be concerned with deliberate efforts to disguise data,” said Jala Attia, senior program director of General Dynamics Information Technology’s Health Care Program Integrity Solutions Group. “In healthcare, for example, fraudulent claims use multiple variations on a person’s name. So Al Capone could be listed as Al Capone, Alphonse Capone, Alphonse G. Capone, Al G. Capone, Al Gabriel Capone, A. Gabriel Capone, or A.G. Capone. With advanced analytics tools, we can identify and reconcile some of these. But there’s still work to be done before the data is completely reliable.”
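
The name-variation problem Attia describes is, at its core, a fuzzy matching problem. The sketch below, using only Python’s standard library, shows the basic idea of scoring claim names against a canonical record; the names and the 0.5 threshold are illustrative, and production entity-resolution tools are far more sophisticated. Flagged matches would still go to reviewers before records are merged.

```python
# Toy name-matching sketch using the standard library. The canonical
# record, claim names and threshold are invented for illustration.

from difflib import SequenceMatcher

canonical = "alphonse gabriel capone"
claims = [
    "Al Capone",
    "Alphonse Capone",
    "Alphonse G. Capone",
    "A. Gabriel Capone",
    "Mary Smith",
]

def similarity(a: str, b: str) -> float:
    # Ratio of matching characters, case-insensitive.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

for name in claims:
    score = similarity(name, canonical)
    flag = "possible match" if score >= 0.5 else "no match"
    print(f"{name:20s} {score:.2f}  {flag}")
```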

Careful How You Do That
Creating clean, structured databases begins with good processes, according to the Center for Open Data Enterprise, a non-profit based in Washington, D.C. In April, the organization co-hosted a roundtable on the quality of government databases with the White House Office of Science and Technology Policy. In a summary of that meeting, the group listed strategies for improving the quality of data in government databases:

  • Address human factors to ensure data is formatted to meet end-user needs
  • Strengthen data governance to ensure integrity in data collection, management and dissemination
  • Establish effective feedback systems so that users can help to identify and eliminate data quality issues
  • Institute improved data policies such as the Information Quality Act and ISO 8000, which set out quality requirements for open government data

For agencies, facing up to dirty data and figuring out where to start can be daunting. Connecticut’s Kleykamp takes a pragmatic approach: Start by putting the data to work and sharing it – both internally and externally. Then wait to see what holes emerge. From there, he said, data managers can prioritize the work that needs to be done.

“You have to start using [data] in bulk,” he said. “You never will figure out what the issues are until you try to answer a question with the data or do something with it that you aren’t currently doing. Nine times out of 10, that’s how you are going to find out where the issues lie.”

Navy to Re-Compete NGEN Piece by Piece

The Navy and Marine Corps are looking to leverage the latest innovations in cloud technology when the multi-billion-dollar Next Generation Enterprise Networks Re-compete (NGEN-R) contract comes up for bid next year.

Instead of one massive program, officials are considering breaking NGEN-R into four to six functional blocks, beginning with end-user hardware and continuing with productivity services, enterprise cloud services, IT and transport service management and overall NGEN services integration. By engaging more support partners, the Navy hopes to expand its options, improve services and cut costs.

The first draft request for proposals, expected to cover end-user hardware, is due by March 2017; the remaining RFPs will follow over the course of 2017. The services intend to award contracts in early 2018, enabling a smooth transition to the new structure by June 2018.

Much remains undecided, but some things are clear: NGEN-R will incorporate Windows 10 across the enterprise and will be tightly integrated with the Joint Regional Security Stacks (JRSS), the security gateways linking the open internet to the Defense Department’s overall Joint Information Environment (JIE). NGEN-R will also deliver faster, more flexible networks to better support delivery of enterprise services, officials promise.

“The Chief of Naval Operations has expressed a clear desire to increase our abilities there, and the things we do in the re-compete will be aligned with that vision, to give access to the right people at the right time, anywhere,” said Naval Enterprise Networks Program Manager Capt. Michael N. Abreu (USN). His office is overseeing the redesign and recompete of Hewlett-Packard’s $3.4 billion NGEN contract, which runs through mid-2018.

For Abreu, success will be measured by how the program drives down costs and increases performance across the government-owned Navy Marine Corps Intranet (NMCI), the services’ shore-based enterprise network in the continental U.S. and Hawaii. He aims to build on what NGEN has already achieved, such as networks that have successfully kept pace with a 30 percent increase in throughput since the contract was put in place in 2013. Those demands will only grow under NGEN-R as the need for higher-bandwidth network services continues to increase.

NGEN-R will be the third generation of the Department of the Navy’s effort to manage a global enterprise network. The Navy was years ahead of its time when it launched the original NMCI program, which sought to privatize the Navy and Marine Corps IT infrastructure. NMCI faced a host of technical, process and service hurdles, and its next-generation replacement, NGEN, demonstrated those challenges could be overcome. That set the stage for the NGEN Re-compete, or NGEN-R.

“Simply put, NGEN is a success story,” said John Zangardi, acting Navy CIO and deputy assistant secretary for C4I & Space, of a system that supports more than 800,000 users, 400,000 workstations and more than 2,500 locations across the continental United States, Hawaii and Japan. “The NGEN contract demonstrates continued innovation and exemplary acquisition practices.”

NGEN-R, Zangardi told the House Armed Services Committee in February, will “drive future innovation and price reduction without sacrificing performance or security of the DON’s network.”

It will do so by leveraging new advances in cloud and virtualization technologies that weren’t mainstream when NGEN was awarded back in 2013, and by making sure that future technologies can be adopted as they become available. In the past, emerging technologies were locked out simply because they were not specifically included in the original contract.

“We need to have language in there that allows us to change, to adopt new technologies and to shift over time,” Abreu said. “The contract language has to include the ability to make those changes. For example, we want to be able to request priced and un-priced options for a given piece of work. That allows me to think ahead. Now, if I decide to move ahead and do this thing, I don’t need to create a whole new statement of work.”

NGEN-R planners are looking to cloud technologies to support virtual machines, enhance productivity and host shared application services. Managers want to have the flexibility to host applications in government data centers, private clouds, secure commercial clouds – in short, wherever it is most cost-effective for the given use.

“We have a limited number of hosted virtual desktop machines on the network today, and we have been doing that for the last couple of years to understand how virtual desktops would work,” Abreu said. “What are the technical conditions for success?”

Those lessons will be incorporated into NGEN-R, he said, with an eye toward extending virtualization more broadly throughout the enterprise. At the same time, NGEN-R also will be responsive to Defense Department initiatives, such as the migration to Windows 10.

The Navy is pushing to ensure all its systems are Windows 10-compatible before NGEN-R comes online. Calling the Windows 10 migration a “significant challenge,” Abreu said, “We have begun the efforts to ensure that applications are compatible with Windows 10 and we intend to get to first test users in the first quarter of FY17.”

NGEN-R also will align with the Joint Information Environment (JIE), DoD’s vision for a unified enterprise information environment for command, control, communications and computing. While not a program of record, JIE does encompass a set of standards and programs that all of the armed services and defense support agencies must support. DoD’s Joint Regional Security Stacks (JRSS) are the first hardware manifestation to emerge from the JIE effort. “The first task is to align with the JRSS approach, and we are doing just that,” Abreu said. “Our intent is to align our boundary security with the JRSS capabilities, starting with JRSS Version 1.5 which is being tested today, and followed by JRSS 2.0 in 2018.”

Practically speaking, that means ensuring that JRSS security protocols do not disrupt activities currently supported by NGEN networks. “We are engaged in making sure the applications we are using that have to transit the network continue to perform as advertised and will not be disrupted by that security technology,” Abreu said, adding that NGEN planners have a positive working relationship with JRSS managers. “We have good communication and participation inside the effort.”

JIE also provides a common approach to Identity and Access Management, providing shared security controls over who has what kinds of rights to data and services throughout the defense enterprise. “We know that we have work to do to continue going down the road toward the next generation of identity access management,” Abreu said. “We are working with groups within the DoD to understand how to better stay ahead of our adversaries on that.”

All this must play out in a dynamic environment in which time never stands still. Missions are ongoing, as are system changes and upgrades. NGEN managers oversee some 700 upgrades and changes to network capabilities at the command level annually, along with more than 100 enterprise-level modernization projects; NGEN-R will be just as demanding.

Multiple Procurements?
Breaking NGEN-R into multiple, specialized contracts should yield greater specialization, more choice and lower prices, the Navy believes. Abreu acknowledges that multiple contracts could potentially create contractual and functional gaps and overlaps between mission areas, as well as more complex service management integration challenges. But he said the Navy anticipated those concerns and is developing contract terms to ensure vendors interoperate seamlessly.

“If you are going to do this you have to have a robust ability for the contractor to do the integration work,” Abreu said. “We also will have rules that we put in place to allow contractors to work together well. This includes associated contractor agreements, operation level agreements and the tried-and-true service level agreements and service level requirements.”

Given all these moving parts, NGEN-R clearly is a complex piece of contracting work that the DoN is working diligently to get right. At the core, though, the intent is direct and straightforward: “We want to balance the need for speed with security, while lowering total cost of ownership, in order to get the best value for the dollar moving forward,” Abreu said.

Army Rethinks Mobile Command Post Architecture

The Army is reengineering its command post infrastructure to be more agile and expeditionary, hoping to leapfrog from its decade-old Command Post of the Future to Command Post 2025, a long-range vision leveraging cloud technologies and the Internet of Things (IoT).

The goal: a single integrated command system that combines operations with logistics and intelligence and can be set up on the fly, providing deployed commanders a complete picture of their situation as soon as they hit the ground.

“It is a different operating environment, with different threats, and so we need to go back to reassess what the requirements are, what the vulnerabilities and the risks are,” said Mike McCarthy, chief of the LandWarNet Division at Training and Doctrine Command’s Army Capabilities and Integration Center (ARCIC). “We have to consider the advances that our potential adversaries have made in countering the capabilities that we designed 10 years ago.”

Command Post 2025 will be compatible with both the Joint Information Environment and the Intelligence Community Information Technology Environment, and tie operators on the front lines to their support services in the rear. The program will develop under the oversight of Army Materiel Command’s Communications-Electronics Research, Development and Engineering Center (CERDEC).

The present-day command post is bulky: a compilation of cables, cases and hardware that can take a day to assemble. That’s no longer good enough.

“Army forces will have to deploy rapidly into unexpected locations and transition quickly into high tempo operations across wide areas,” according to the Army’s Mission Command Network vision statement.

At the heart of Command Post 2025 will be the Command Post Computing Environment (CP CE), which will integrate logistics, intelligence and operations systems. “You are talking common hardware, common software and web-based applications,” said Jeffrey R. Witsken, chief of the Network Integration Branch at the Army’s Mission Command Center of Excellence at Fort Leavenworth, Kansas.

This common environment will unify the user experience, “allowing commanders and staff access to the common oper­ating picture and associated data when and where they need it,” according to the Army.

CP CE is expected to incorporate elements from today’s Command Post of the Future (CPOF), which was developed and fielded for the Army by General Dynamics. CPOF, which provides commanders with a common operating picture at the brigade and battalion level, will become one of several applications within CP CE.

By making systems more interoperable, planners anticipate a more autonomous CP that needs fewer people and can operate further away from base camps. But first they have to overcome issues from latency to basic connectivity.

“The new infrastructure will have to take not just latency into account, but also differences in bandwidth across the length of the network,” said Lisa Heidelberg, Mission Command Capabilities Division chief for CERDEC’s Command, Power & Integration Division. Systems engineers will have to develop protocols and design applications to work in such disconnected, intermittent and latent (DIL) network environments, she said, to ensure that commanders have the fullest and most accurate picture possible of the battlefield.

No one technology will solve all these issues. “Cloud technologies offer many advantages, but also challenges associated with providing critical data and services to the tactical edge in DIL network environments,” Heidelberg said. “It is probable that a mix of cloud technologies and more traditional technology implementations will be the appropriate path for future mission command capabilities.”
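
One common way to cope with such disconnected, intermittent and latent links is a store-and-forward pattern: record updates locally and forward them opportunistically when a link is available. The sketch below is a purely illustrative toy, not an Army design; the class, the queued updates and the commented-out transport call are invented.

```python
# Toy store-and-forward queue for a DIL (disconnected, intermittent,
# latent) environment. Everything here is invented for illustration.

from collections import deque

class DILSyncQueue:
    def __init__(self):
        self.pending = deque()

    def record(self, update: dict):
        # Always accept updates locally, even with no connectivity.
        self.pending.append(update)

    def sync(self, link_up: bool) -> int:
        # Forward whatever we can while the link holds; keep the rest queued.
        sent = 0
        while link_up and self.pending:
            update = self.pending.popleft()
            # send_to_higher_echelon(update)  # hypothetical transport call
            sent += 1
        return sent

cp = DILSyncQueue()
cp.record({"unit": "A Co", "position": "38SMB 123 456"})
cp.record({"unit": "B Co", "position": "38SMB 789 012"})
print(cp.sync(link_up=False))  # 0: disconnected, updates stay queued
print(cp.sync(link_up=True))   # 2: backlog drains when the link returns
```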

System designers must take into account the wide variety of vehicles that must be tied into their networks. For example, Heidelberg said Combined Arms Battalions will employ tracked vehicles, since wheeled vehicles “do not offer the appropriate level of mobility or survivability,” but infantry battalions will need lighter command vehicles that can be deployed during air assault operations.

Command Post 2025 will also have to link back to Home Station Mission Command Centers (HSMCCs), which will incorporate standardized capabilities to leverage advances in network capability, telepresence and remote collaboration at the corps, division and other headquarters levels.

This in turn should provide commanders with “the flexibility to deploy command posts in a scalable, tailorable manner according to operational requirements,” according to Army documents.

On the other end of the spectrum are individual soldiers – the network’s most widely dispersed asset. While the expeditionary command post will push information to troops on the net’s outermost edge, sensor-enabled soldiers will feed information back to the commands, both actively and, through IoT, passively. Each soldier will be included in the overall architecture.

“Soldiers have always been active participants in mission command and will continue to be so in the future,” Heidelberg said. “As improvements to the lower tactical internet are made, IoT and similar technologies become more viable.”

As the Army works to better understand how to reduce system complexity, minimize system signatures and increase network security, she said, “We should expect to see these types of technologies finding their way into the tactical force.”
