Wireless Tech Comes of Age for More Networks
This is the age of wireless. Nearly everyone carries a mobile phone, and laptops take work wherever workers go, untethering them from desks and office cubicles. Yet security concerns have kept many agencies from fully leveraging what wireless can do.
Now that may be changing.
Wi-Fi, microwave and cellular networks keep improving in throughput, reliability and latency. Security is getting better and costs are coming down. For many applications, such as perimeter security, wireless networks of one kind or another are the most reliable, survivable option.
The National Park Service opted for a 1.6 Gbps microwave network to support video surveillance on Liberty Island and Ellis Island in New York Harbor after Hurricane Sandy wiped out its prior solution. The wireless network eliminated hard-wire infrastructure investments and made it faster and easier to deploy and move cameras.
Similar networks can be set up to support perimeter security at government installations, flexible security setups for major sporting or entertainment events – or even applications for border security.
The Park Service built that network from scratch. But it’s also possible to piggyback off of existing wireless infrastructure. Today, that could be a 4G LTE cellular network. But coming 5G (fifth-generation) cellular technologies will support Internet of Things (IoT) applications better than 4G can, with some vendors promising latencies in the single-digit millisecond range, versus 50 ms today. Although standards for 5G are still in development, anticipated inexpensive 5G chipsets will make large-scale IoT deployments economically viable and more secure than is typical today.
Wireless isn’t new, of course. What’s changed is what wireless technology can do. Speed and security have come a long way in recent years.
“Many organizations are surprised by how much capacity can be provided by microwave systems, with connections of 2-5 Gbps now available,” says Greg Friesen, vice president of product management at Ottawa, Canada-based DragonWave, a microwave solutions provider.
Some applications have latency requirements that microwave and cellular historically could meet only with pricey gear. That’s changing, too.
“Latency can be controlled pretty heavily,” says Jaime Fink, co-founder of Santa Clara, Calif.-based wireless provider Mimosa Networks. “Typically we can get down to sub-2 milliseconds. Legacy microwave was more like half a millisecond, but they cost four times as much.”
Design It Right
Wireless has its share of unique considerations. While it’s faster, cheaper and easier to deploy than, say, trenching several thousand feet of fiber to link buildings on a campus, it’s not without its challenges. Finding places to mount antennas sometimes takes more time and money than users expect.
“Customers often underestimate the time to acquire equipment mounting space,” Friesen says. He advocates higher-power radios, which in turn reduce antenna requirements.
Network design is also key for ensuring reliability. Vendors often argue that wireless isn’t vulnerable to cable cuts the way that fiber and copper are. But wireless networks can be vulnerable to other kinds of failures. For security or other mission-critical applications, network designers must employ either a wireless mesh architecture or wired backups.
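The case for a mesh or wired backup can be roughed out with a simple availability estimate: with independent paths, the link fails only when every path fails at once. A minimal sketch, with illustrative availability figures rather than numbers from any vendor above:

```python
def parallel_availability(*path_availabilities: float) -> float:
    """Availability of a link with redundant independent paths:
    the link is down only if every path is down simultaneously."""
    p_all_fail = 1.0
    for a in path_availabilities:
        p_all_fail *= (1.0 - a)
    return 1.0 - p_all_fail

# A 99.9%-available wireless link backed by a 99%-available wired path
# (hypothetical numbers, assuming independent failure modes).
print(f"{parallel_availability(0.999, 0.99):.5%}")
```

The independence assumption matters: a storm that knocks out both the radio link and the backup at once makes the real availability worse than this estimate.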
Network design is the purview of systems integrators, who can develop the requirements for each application and then align network data demands and latency requirements to achieve the best possible performance at the lowest possible cost. “Network design is a core competency,” says Peter Howard, senior director for General Dynamics Information Technology (GDIT). “We’ve invested heavily in design and modeling tools and capabilities to make sure the networks meet the specific system requirements – whether it be for voice, video or data-centric applications.”
Spectrum of Options
One of the biggest decisions is which spectrum band to use, especially for government applications, where security or mission-critical requirements often drive organizations to own, rather than lease, wireless network capacity. Here are a few key considerations:
- Bandwidth. Higher frequencies, such as 60 GHz, are best for bandwidth-intensive applications because bigger swaths of spectrum are available. This means more room for more bits.
- Range. Signals don’t travel as far at higher frequencies, a phenomenon that has advantages and disadvantages. On the plus side, this could mean hackers would have to get physically close to the network in order to eavesdrop or disable it; on the other hand, shorter range jacks up costs if facilities are scattered over a large area. “In 70 GHz, you need to be within a mile, and at 60 GHz you need to be within about 400-500 meters,” Fink says. “So [they’re] great for campuses.”
- Environment. Higher-frequency signals are more susceptible to weather-related interruptions, so in places where rain and fog are common, 60 GHz or 70 GHz might not be good choices. Meanwhile, the 5 GHz band – often too crowded to use in urban or suburban applications — can be just fine in sparsely populated areas, where its long range is a plus.
- Interference. Licensed bands offer more protection against interference—and not just for the obvious reason that licenses mean fewer users competing for the same frequencies. “It’s generally a lot harder for somebody to buy off-the-shelf [equipment] to interfere with you in the licensed bands such as 11 GHz or 23 GHz,” Fink says. But this security by obscurity also has drawbacks: Licensed spectrum limits the pool of potential customers, typically raising the cost of gear at those frequencies versus comparable gear for unlicensed bands.
But not always. Some vendors adapt technologies developed for unlicensed bands for use in licensed spectrum. Some of those technologies improve reliability, such as by mitigating interference. “By taking the same technology that enabled cost savings for 5 GHz unlicensed, we’ve been able to move that into licensed bands to give more protection,” Fink says.
- Evolving technology. Today’s 4G networks are tomorrow’s legacy technologies, Howard says. With 5G, cellular will expand into microwave bands partly because spectrum is so scarce at its traditional lower frequencies. That enables more throughput, potentially making 5G a new option for high-bandwidth fixed applications.
While some vendors say they will launch 5G products in the next year, fully standards-based 5G won’t debut until around 2020, and both cellular providers and vendors will continue to support 4G technologies through the rest of that decade at least. So 4G modems and other gear will be cheaper than 5G versions, making 4G a good fit for price-sensitive applications, such as IoT.
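The range trade-offs above follow largely from free-space path loss, which grows with both distance and frequency: FSPL(dB) = 20·log10(d_km) + 20·log10(f_GHz) + 92.45. (At 60 GHz, oxygen absorption adds several dB per kilometer on top of this, which the formula does not capture.) A rough comparison across the bands mentioned, with the 1 km distance chosen purely for illustration:

```python
import math

def fspl_db(distance_km: float, freq_ghz: float) -> float:
    """Free-space path loss in dB, for distance in km and frequency in GHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

# Same 1 km link, different bands: each doubling of frequency costs ~6 dB.
for f in (5, 11, 23, 60, 70):
    print(f"{f:>3} GHz over 1 km: {fspl_db(1, f):.1f} dB")
```

The extra loss at high frequencies must be bought back with antenna gain or shorter hops, which is why 60–70 GHz links suit campuses rather than wide areas.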
Security is Everything
As government agencies move parts of their enterprises to wireless, those networks will become increasingly attractive to hackers.
“If I were a hacker, I’d spend my time trying to find ways to breach wireless because there’s a lot going on, and people rely on it,” says Robert Winters, head of the TeraVM product line at Hertfordshire, U.K.-based provider Cobham Wireless.
Stan Tyliszczak, vice president of technology integration and chief engineer at GDIT, agrees: “By its very nature, wireless is inherently more vulnerable to hacks than copper or fiber,” he says. “Unlike copper or fiber, you don’t need physical access to snoop a radio link – all you need is an antenna in the line of propagation.” That weakness can be mitigated with encrypted radio links, for example, but organizations’ limited experience with the technology may lead to other cracks in their security.
Commercial wireless operators, on the other hand, know where the vulnerabilities are and focus on mitigating them because running such networks is their primary business.
“The big targets probably are going to be the specialized networks,” Winters says. “Large enterprises are going to be more prone to attack than cellular operators.”
In the case of microwave, most vendors have custom protocols that make it difficult for unauthorized users to eavesdrop on traffic or hijack links.
“You can’t just [use] a sniffer or buy one of the same brand because we typically limit the ability to add other radios,” Fink says.
There’s also a certain amount of “security by obscurity,” because microwave networks are still rare compared to Wi-Fi and cellular.
“Due to the proprietary systems, rare high-frequency components, and narrow beam widths, wireless [microwave] systems are inherently quite secure,” Friesen says. “In addition, many systems are now equipped with integrated, government-approved encryption. By installing systems on high towers in secured areas, with no intermediate access point, agencies can ensure the systems are not disabled or interfered with.”
For additional protection with private networks, consider using emulators and other tools that simulate attacks. Those drills help ferret out vulnerabilities and provide insights for developing responses, such as using software-defined networking (SDN) to isolate nodes that are under attack.
“We have the ability to generate DDoS attacks in a controlled way so our customers can assess their defenses and how it affects users,” Winters says. The simulated attacks let customers see exactly how their networks perform under stress and discover whether they can maintain quality of service if attacked.
Adding security, such as encryption, comes at a cost, and network designers must take that into account. “There’s overhead associated with security, so you have to assess that,” Winters adds. “Encryption absolutely will eat up a lot of overhead – no question about it.”
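One way to size up that overhead is per packet: schemes in the IPsec/AES-GCM mold add a fixed number of header, IV and authentication-tag bytes to every packet, which hits small IoT packets far harder than large ones. A back-of-the-envelope sketch (the 28-byte overhead is a hypothetical figure for illustration, not drawn from any vendor quoted above):

```python
def goodput_fraction(payload_bytes: int, overhead_bytes: int = 28) -> float:
    """Fraction of link capacity carrying payload after fixed per-packet
    crypto overhead (header + IV + auth tag; byte count illustrative)."""
    return payload_bytes / (payload_bytes + overhead_bytes)

# Small sensor packets lose a much larger share than full-size frames.
for size in (64, 256, 1400):
    print(f"{size:>5}-byte payload: {goodput_fraction(size):.1%} goodput")
```

This simple accounting ignores CPU cost and added latency from the crypto itself, but it shows why designers of IoT-heavy networks must budget for encryption rather than bolt it on afterward.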