Automated License Plate Readers on the U.S. Border


When U.S. Border Patrol agents stopped a vehicle at the border checkpoint in Douglas, Ariz., it wasn’t a lucky break. They had been on the lookout for the driver’s vehicle and it had been spotted by an automated license plate reader (ALPR). The driver, attempting to escape into Mexico, was arrested on suspicion of murder.

All along U.S. borders, ALPRs changed the face and pace of security and enforcement – although not in ways most people might expect.

While ALPRs may occasionally catch individuals with a criminal record trying to come into the United States, they play a much greater role in stopping criminals trying to leave. The systems have driven a dramatic drop in vehicle thefts in U.S. border towns. They’ve also been instrumental in finding missing persons and stopping contraband.

“Recognition technology has become very powerful,” says Mark Prestoy, lead systems engineer in General Dynamics Information Technology’s Video Surveillance Lab. “Capturing an image – whether a license plate or something more complex, such as a face – can be successful when you have a well-placed sensor, network connection and video analytics. Once you have the image, you can process it to enhance and extract information. License plate recognition is similar to optical character recognition used in a printed document.”

“It’s an enforcement tool,” says Efrain Perez, acting director of field operations and readiness for Customs and Border Protection (CBP). “They help us identify high-risk vehicles.”

The agency has about 500 ALPR systems deployed at 91 locations to process passenger vehicles coming into the United States. It also has ALPRs on all 110 outbound lanes to Mexico, which were added in 2009 after the U.S. committed to trying to interrupt the flow of cash and weapons from the U.S. into Mexico. CBP is slowly adding the devices to outbound lanes on the Canadian border, as well.

For ALPRs surveilling inbound traffic, the primary purpose is to eliminate the need for border officers to manually enter license plate numbers, allowing them to keep a steady gaze on travelers, spot suspicious behavior and maintain situational awareness. ALPRs trained on outbound traffic are used to identify high-risk travelers, help track the movement of stolen vehicles and support other U.S. law enforcement agencies through the National Law Enforcement Telecommunications System.

Along the southern U.S. border, most ALPRs are fixed units at ports of entry and cover both inbound and outbound vehicles. Along the Canadian border, most ALPRs are handheld units. CBP officials hope to install fixed readers at northern ports of entry in the future. “The hand-held readers are not as robust,” points out Rose Marie Davis, acquisition program manager of the Land Border Integration Program (LBIP).

The first generation of readers was deployed in the 1997-98 timeframe. Today, LBIP incorporates the technology, experience and lessons learned from that initial effort. Another effort, under the Western Hemisphere Travel Initiative in 2008 and 2009, extended those lessons to all other aspects of inspection processing.

The readers serve three purposes. First, information gathered from vehicles transiting checkpoints is checked against a variety of law enforcement databases for outstanding warrants and other alerts. Second, once a vehicle is through, the readers let the CBP officers who conducted the primary inspection keep it under observation after passage. The third – keeping legitimate traffic moving – gets the least public attention.
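The database check works on whatever characters the reader extracts, so a real system has to tolerate OCR ambiguity between similar glyphs. The sketch below is purely illustrative – CBP's actual systems are proprietary, and the hotlist entries and ambiguity table here are assumptions for the example:

```python
# Illustrative sketch only; CBP's actual systems are proprietary.
# Pattern: expand a raw reader output into plausible variants for
# OCR-ambiguous glyphs, then check each against a law-enforcement hotlist.
from itertools import product

HOTLIST = {("AZ", "ABC1234"), ("TX", "XYZ9876")}  # hypothetical (state, plate) pairs

# Glyphs that plate-reader OCR commonly confuses (illustrative subset).
AMBIGUOUS = {"O": "O0", "0": "0O", "I": "I1", "1": "1I"}

def candidate_reads(raw: str) -> set:
    """Expand one OCR read into the set of plausible plate strings."""
    pools = [AMBIGUOUS.get(ch, ch) for ch in raw.upper() if ch.isalnum()]
    return {"".join(combo) for combo in product(*pools)}

def hotlist_hit(state: str, raw_read: str) -> bool:
    """True if any plausible reading of the plate appears on the hotlist."""
    return any((state, plate) in HOTLIST for plate in candidate_reads(raw_read))
```

A read of "ABCI234" on an Arizona plate would still match the wanted plate ABC1234, because the expansion treats I and 1 as interchangeable.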

CBP operates both fixed and mobile border checkpoints in addition to ports of entry.

The ALPRs’ facilitation of legitimate travel and processing is one of the technology’s most telling and least publicly appreciated roles, Davis noted. “That automation facilitates legitimate travel. On our land borders it’s used to keep up the flow.”

With roughly 100 million privately owned vehicles entering through land borders in fiscal 2016 and 24 million processed at inland Border Patrol checkpoints each year, the ALPRs significantly reduce the need to manually enter license plate information – which takes up to 12 seconds per vehicle – on top of entering numerous other data points and documents, according to Davis.

Those extra seconds add up. CBP says it averages 65.5 seconds to process each vehicle entering the country, or 55 vehicles per lane per hour. That number drops to 46.5 vehicles per lane per hour without ALPR.

“For a 12-lane port like Paso Del Norte in El Paso, Texas, the throughput loss without ALPRs [would be] equivalent to closing two lanes,” CBP said in a statement. The technology is even more critical to CBP’s Trusted Traveler Programs (NEXUS and SENTRI), which give participants express border-crossing privileges. Those highly efficient lanes now process vehicles in just 36 seconds, so adding 12 seconds per vehicle would lengthen each cycle by a third and cut throughput from 100 to 75 vehicles per lane per hour.

“At the most congested ports, where wait times exceed 30 minutes daily, even a 5 to 10 second increase in cycle time could result in a doubling of border delays for inbound vehicle travelers,” CBP said.
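CBP's arithmetic can be checked directly from the cycle times cited above:

```python
# Worked check of the throughput figures CBP cites.

def vehicles_per_lane_hour(seconds_per_vehicle: float) -> float:
    """Hourly per-lane throughput at a given per-vehicle cycle time."""
    return 3600 / seconds_per_vehicle

with_alpr = vehicles_per_lane_hour(65.5)          # ~55 vehicles/lane/hour
without_alpr = vehicles_per_lane_hour(65.5 + 12)  # ~46.5 vehicles/lane/hour

# At a 12-lane port, the lost capacity is roughly two lanes' worth:
lanes_equivalent = 12 * (1 - without_alpr / with_alpr)  # ~1.9 lanes

# Trusted Traveler lanes: 36 s today; +12 s is a one-third longer cycle,
# which cuts hourly throughput from 100 to 75 vehicles per lane.
tt_now = vehicles_per_lane_hour(36)   # 100.0
tt_slow = vehicles_per_lane_hour(48)  # 75.0
```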

ALPR data is stored and managed in CBP’s TECS system, which allows users to input, access and maintain records for law enforcement, inspection, intelligence gathering and operations.

Privacy advocates such as the Electronic Frontier Foundation have expressed concern about potential abuse and commercialization of data collected by law enforcement ALPRs around the country. Border ALPR data, however, is held by CBP, treated as law enforcement sensitive and shared only with other federal, state and local law enforcement agencies under the Department of Homeland Security’s strict privacy rules. In practice, most of the sharing flows inward: state and local agencies send information on stolen or missing vehicles and people to CBP and the Border Patrol, rather than CBP sending data out.

The sharing pays off in numerous ways. A young girl kidnapped in Pennsylvania was found in Arizona thanks to border ALPR data. Armed and dangerous individuals from Indio, Calif., to Laredo, Texas, have been apprehended. Missing and abducted children have been recovered, and major drug busts have seized large volumes of illegal drugs, including 2,827 pounds of marijuana in Falfurrias, Texas, and 60 pounds of cocaine in Las Cruces, N.M.

One of the most startling reader successes on the border is the dramatic reduction in vehicle thefts in U.S. border towns. Thieves who steal cars in the United States and attempt to drive them into Mexico now have a much higher chance of being caught.

Laredo, Texas, led American cities in car thefts in 2009; by 2015, it ranked 137th. Similar declines were seen in San Diego (13th to 45th), Phoenix (40th to 80th) and Brownsville, Texas (75th to 217th).

Funding for ALPR purchases comes from the Treasury Executive Office for Asset Forfeiture. While CBP makes an annual request to expand its outbound program, officials are now seeking a complete technology refresh to update second-generation readers installed between 2008 and 2011.

Improvements include higher-resolution day and night cameras; faster processing times; improved data security; lighter, more covert readers; mobile device connectivity; new audio and visual alarms; improved durability; and reduced power consumption.

Officials would also like to expand ALPR use along the northern border to read plates on vehicles leaving the U.S., starting in metro Detroit.

“We’ve requested the funding for the tech refresh,” Davis says. “We have a new contract and it has been priced out, but we’re not funded to do that refresh yet.” Still, officials are hopeful that funding will be found and an even more effective generation of readers can be deployed.

Close the Data Center, Skip the TIC – How One Agency Bought Big Into Cloud


It’s no longer a question of whether the federal government is going to fully embrace cloud computing. It’s how fast.

With the White House pushing for cloud services as part of its broader cybersecurity strategy and budgets getting squeezed by the administration and Congress, chief information officers (CIOs) are coming around to the idea that the faster they can modernize their systems, the faster they’ll be able to meet security requirements. And that once in the cloud, market dynamics will help them drive down costs.

“The reality is, our data-center-centric model of computing in the federal government no longer works,” says Chad Sheridan, CIO at the Department of Agriculture’s Risk Management Agency. “The evidence is that there is no way we can run a federal data center at the moderate level or below better than what industry can do for us. We don’t have the resources, we don’t have the energy and we are going to be mired with this millstone around our neck of modernization for ever and ever.”

Budget pressure, demand for modernization and concern about security all combine as a forcing function that should be pushing most agencies rapidly toward broad cloud adoption.

Joe Paiva, CIO at the International Trade Administration (ITA), agrees. He used an expiring lease as leverage to force his agency into the cloud soon after he joined ITA three years ago. Time and again the lease was presented to him for a signature and time and again, he says, he tore it up and threw it away.

Finally, with the clock ticking on his data center, Paiva’s staff had to perform a massive “lift and shift” operation to keep services running. Systems were moved to the Amazon cloud. Not a pretty transition, he admits, but good enough to make the move without incident.

“Sometimes lift and shift actually makes sense,” Paiva told federal IT specialists at the Advanced Technology Academic Research Center’s (ATARC) Cloud and Data Center Summit. “Lift and shift actually gets you there, and for me that was the key – we had to get there.”

At first, he said, “we were no worse off or no better off.” With systems and processes that hadn’t been designed for cloud, however, costs were high. “But then we started doing the rationalization and we dropped our bill 40 percent. We were able to rationalize the way we used the service, we were able to start using more reserve things instead of ad hoc.”

That rationalization included cutting out software and services licenses that duplicated other enterprise solutions. Microsoft Office 365, for example, provided every user with a OneDrive account in the cloud. Getting users to save their work there meant his team no longer had to support local storage and backup, and the move to shared virtual drives instead of local ones improved worker productivity.

With 226 offices around the world, offloading all that backup was significant. To date, all but a few remote locations have made the switch. Among the surprise benefits: happier users. Once they saw how much easier things were with shared drives that were accessible from anywhere, he says, “they didn’t even care how much money we were saving or how much more secure they were – they cared about how much more functional they suddenly became.”

Likewise, Office 365 provided Skype for Business, meaning the agency could eliminate expensive stand-alone conferencing services – yet another source of savings.

Cost savings matter. Operating in the cloud, ITA’s annual IT costs per user are about $15,000 – less than half the average for the Commerce Department as a whole ($38,000/user/year), or the federal government writ large ($39,000/user/year), he said.

“Those are crazy high numbers,” Paiva says. “That is why I believe we all have to go to the cloud.”

In addition to Office 365, ITA uses Amazon Web Services (AWS) for infrastructure and Salesforce to manage the businesses it supports, along with several other cloud services.

“Government IT spending is out of freaking control,” Paiva says, noting that budget cuts provide incentive for driving change that might not come otherwise. “No one will make the big decisions if they’re not forced to make them.”

Architecture and Planning
If getting to the cloud is now a common objective, figuring out how best to make the move is unique to every user.

“When most organizations consider a move to the cloud, they focus on the ‘front-end’ of the cloud experience – whether or not they should move to the cloud, and if so, how will they get there,” says Srini Singaraju, chief cloud architect at General Dynamics Information Technology, a systems integrator. “However, organizations commonly don’t give as much thought to the ‘back-end’ of their cloud journey: the new operational dynamics that need to be considered in a cloud environment or how operations can be optimized for the cloud, or what cloud capabilities they can leverage once they are there.”

Rather than lift and shift and then start looking for savings, Singaraju advocates planning carefully what to move and what to leave behind. Designing systems and processes to take advantage of the cloud’s speed, and avoiding some of its potential pitfalls, not only makes the transition go more smoothly, it saves money over time.

“Sometimes it just makes more sense to retire and replace an application instead of trying to lift and shift,” Singaraju says. “How long can government maintain and support legacy applications that can pose security and functionality related challenges?”

The challenge is getting there. The number of cloud providers that have won provisional authority to operate under the 5-year-old Federal Risk and Authorization Management Program (FedRAMP) is still relatively small: just 86 with another 75 still in the pipeline. FedRAMP’s efforts to speed up the process are supposed to cut the time it takes to earn a provisional authority to operate (P-ATO) from as much as two years to as little as four months. But so far only three cloud providers have managed to get a product through FedRAMP Accelerated – the new, faster process, according to FedRAMP Director Matt Goodrich. Three more are in the pipeline with a few others lined up behind those, he said.

Once an agency or the FedRAMP Joint Authorization Board has authorized a cloud solution, other agencies can leverage their work with relatively little effort. But even then, moving an application from its current environment is an engineering challenge. Determining how to manage workflow and the infrastructure needed to make a massive move to the cloud work is complicated.

At ITA, for example, Paiva determined that cloud providers like AWS, Microsoft Office 365 and Salesforce had sufficient security controls in place that they could be treated as a part of his internal network. That meant user traffic could be routed directly to them, rather than through his agency’s Trusted Internet Connection (TIC). That provided a huge infrastructure savings because he didn’t have to widen that TIC gateway to accommodate all that routine work traffic, all of which in the past would have stayed inside his agency’s network.

Rather than a conventional “castle-and-moat” architecture, Paiva said he had to interpret the mandate to use the TIC “in a way that made sense for a borderless network.”

“I am not violating the mandate,” he said. “All my traffic that goes to the wild goes through the TIC. I want to be very clear about that. If you want to go to www-dot-name-my-whatever-dot-com, you’re going through the TIC. Office 365? Salesforce? Service Now? Those FedRAMP-approved, fully ATO’d applications that I run in my environment? They’re not external. My Amazon cloud is not external. It is my data center. It is my network. I am fulfilling the intent and letter of the mandate – it’s just that the definition of what is my network has changed.”

Todd Gagorik, senior manager for federal services at AWS, said this approach is starting to take root across the federal government. “People are beginning to understand this clear reality: If FedRAMP has any teeth, if any of this has any meaning, then let’s embrace it and actually use it as it’s intended to be used most efficiently and most securely. If you extend your data center into AWS or Azure, those cloud environments already have these certifications. They’re no different than your data center in terms of the certifications that they run under. What’s important is to separate that traffic from the wild.”
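The rule Paiva and Gagorik describe reduces, in essence, to a per-destination routing decision. A simplified sketch, with illustrative hostnames standing in for an agency's actual list of ATO'd services:

```python
# Sketch of the routing posture described above: traffic bound for
# FedRAMP-authorized, agency-ATO'd cloud services is treated as part of
# the agency network and routed directly; everything else ("the wild")
# goes out through the TIC gateway. Domain names are illustrative
# assumptions, not ITA's actual configuration.

ATOED_CLOUD_DOMAINS = {"office365.com", "salesforce.com", "amazonaws.com"}

def route(host: str) -> str:
    """Return 'direct' for ATO'd cloud destinations, 'tic' otherwise."""
    host = host.lower().rstrip(".")
    for domain in ATOED_CLOUD_DOMAINS:
        if host == domain or host.endswith("." + domain):
            return "direct"
    return "tic"
```

Under this scheme a request to a vetted SaaS tenant bypasses the TIC, while ordinary web traffic still traverses it, which is the infrastructure saving Paiva credits.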

ATARC has organized a working group of government technology leaders to study the network boundary issue and recommend possible changes to the policy, said Tom Suder, ATARC president. “When we started the TIC, that was really kind of pre-cloud, or at least the early stages of cloud,” he said. “It was before FedRAMP. So like any policy, we need to look at that again.” Acting Federal CIO Margie Graves is a reasonable player, he said, and will be open to changes that make sense, given how much has changed since then.

Indeed, the whole concept of a network’s perimeter has been changed by the introduction of cloud services, Office of Management and Budget’s Grant Schneider, the acting federal chief information security officer (CISO), told GovTechWorks earlier this year.

Limiting what needs to go through the TIC and what does not could have significant implications for cost savings, Paiva said. “It’s not chump change,” he said. “That little architectural detail right there could be billions across the government that could be avoided.”

But changing the network perimeter isn’t trivial. “Agency CIOs and CISOs must take into account the risks and sensitivities of their particular environment and then ensure their security architecture addresses all of those risks,” says GDIT’s Singaraju. “A FedRAMP-certified cloud is a part of the solution, but it’s only that – a part of the solution. You still need to have a complete security architecture built around it. You can’t just go to a cloud service provider without thinking all that through first.”

Sheridan and others involved in the nascent Cloud Center of Excellence see the continued drive to the cloud as inevitable. “The world has changed,” he says. “It’s been 11 years since these things first appeared on the landscape. We are in exponential growth of technology, and if we hang on to our old ideas we will not continue. We will fail.”

His ad-hoc, unfunded group includes some 130 federal employees from 48 agencies and sub-agencies that operate independent of vendors, think tanks, lobbyists or others with a political or financial interest in the group’s output. “We are a group of people who are struggling to drive our mission forward and coming together to share ideas and experience to solve our common problems and help others to adopt the cloud,” Sheridan says. “It’s about changing the culture.”

IT Staffs Lag in Job Satisfaction vs. Non-IT Workers


Information Technology staff are more likely than other workers to feel disconnected from the missions of their overall organizations, a principal reason for diminished job satisfaction, according to a new study of 5,000 employees in 500 different technology organizations.

TinyPulse, a Seattle-based specialist in employee morale and company culture, surveyed workers about job satisfaction, happiness at work and company values. It found tech employees less satisfied than others.

“What we found to be the most surprising was technology workers’ misalignment with their organization’s purpose and values,” TinyPulse CEO David Niu told GovTechWorks. “Only 28 percent of them know their company’s mission, vision and values, versus 43 percent for non-IT employees.”

Other significant disparities included the extent to which their personal values matched those of their employers.

“For non-IT employees, 45 percent responded with a [top score of] 9 or 10,” Niu said, versus 34 percent for IT workers. “That’s surprising, given how much we see in the popular press about how technology companies like Google and Facebook preach their culture and work-life balance.”

Among areas of concern:

  • Only 19 percent of IT employees gave a strongly positive answer when asked how happy they were on the job. That compares to 22 percent among non-IT workers “which is a statistically significant difference that makes us worry,” the report stated. Employee engagement is key. “The creativity and passion we need from workers in the tech space can’t thrive without it. So when IT employees, some of our best and brightest, tell us that they’re so much unhappier than people in other industries, we need to pay attention and find out why.”
  • IT employees are less likely to see a clear career path ahead of them. Roughly half of all non-IT employees see clear promotion and career paths, versus slightly more than 1 in 3 IT employees.
  • Only a slim 17 percent of IT employees feel strongly valued at work, compared with 22 percent for non-IT employees. “We asked employees if they would reapply for their current job, then compared those answers to how valued they feel at work,” the report claimed. “The two go hand in hand: Even if they stick around, an unappreciated worker is not a motivated one. Recognition communicates to employees that their work matters, driving them to keep putting in that effort.”
  • Only 47 percent of IT employees say they have strong relationships with their coworkers, versus 56 percent for non-IT employees. “Peers are the number one reason that motivates employees to excel,” reads the report. “It’s not their salary, it’s not their boss — it’s not even their own passion for the field. Tactics like awarding raises and measuring job fit are important, but they can’t substitute for colleagues.”

IT staff working for government contractors and embedded in government offices may face particular challenges. They have to support both the government customer’s mission and their mission as a contractor. While most of the time those two challenges are aligned, sometimes they are not.

“Open and honest communication and trusted relationships are critical,” says Collen Nicoll, director of talent acquisition at systems integrator General Dynamics Information Technology (GDIT). “If the relationship is strong and built on mutual transparency from the beginning, whatever disconnects might arise can be dealt with and eliminated quickly and easily. When it’s not, that’s when problems arise. Onsite managers are there to ensure alignment, make sure they are meeting the customer’s needs and work through problems when and if they occur. For most employees, there should be no question about the alignment between the company’s values, their work and that of the government customer.”

Getting that relationship and tone right is especially important for younger, less experienced employees – the heart of the future workforce. Job satisfaction and career progression are the most critical factors in determining their propensity to stay with the same employer.

“One of the most pressing concerns for employees is to know where they’re going at a company,” the TinyPulse report states. “Our internal research found that among millennials — the largest generation in the workplace — 75 percent would consider looking for a new job if they didn’t have opportunities for professional growth.”

Daniel Todd, CEO and Founder of Affinity Influencing Systems in Kirkland, Wash., said, “Keeping people motivated is often times a mix of giving them clear, detailed direction while simultaneously talking about the big picture and how each element of what they are working on fits into the big picture.”

What can leaders do to improve IT staff morale?

  • Foster professional growth. Make sure employees fit with their jobs and know where they’re going in the organization. Managers should routinely discuss career development with employees.
  • Build the right team. Leaders should understand what kind of culture they want to create, and hire with it in mind. They should understand how a new hire will fit in before they bring them aboard.
  • Prioritize positive feedback. There’s an epidemic of feeling undervalued at work, leading to disengagement and attrition. Acknowledging employees’ accomplishments every day and talking to them when things go right, as well as wrong, builds confidence and trust.
  • Align employees with the company mission. If the mission isn’t clear to the team, the team won’t pull in the same direction. Clearly communicating core values and hiring the people who fit them helps ensure everyone is on the same page.

Unhappy employees “directly impact others with their work, so disengagement and unhappiness has ripple effects throughout [an organization],” the report concludes. Helping unhappy employees improve their situation – and solving the underlying causes – are among the most important things leaders do.

But that doesn’t mean leaders need to do it all by themselves. Admir Hadziabulic, knowledge supervisor at Heavy Construction System Specialist (HCSS), which creates system software in Sugar Land, Texas, leans on employees to spread the company culture to new hires.

“HCSS evolved over time to develop its culture,” says Hadziabulic, “and we try to ensure that everyone who works here has a hand in that culture.”

Each new employee receives a “Culture Book,” a document written by employees and designed to help new hires integrate “into our established culture,” Hadziabulic says. New employees also go through a culture overview class that “explains why we do things the way we do.”

By helping employees understand and buy into that culture, he says, they’re more likely to stick around.

“Job satisfaction starts with the hiring process and then the employees’ start in the workplace,” says Nicoll of GDIT. “Cultures are hard to change, but morale is fluid and always a function of leadership. Hiring the right people is the first, best step. Next comes aligning them with our company values, which includes giving them the tools, training and support they need to succeed. And finally, celebrating their successes – and helping them to learn from their failures – is also important. Morale is just higher when leaders follow that approach.”

Employees Wanting Mobile Access May Get it —As 5G Services Come Into Play


Just about every federal employee has a mobile device; many carry two – one for work and one for personal use. Yet by official policy, most federal workers cannot access work email or files from a personal phone or tablet. Those with government-owned devices are usually limited to using them for email, calendar or Internet searches.

Meanwhile, many professionals use a work or personal phone to do a myriad of tasks. In a world where more than 70 percent of Internet traffic includes a mobile device, government workers are frequently taking matters into their own hands.

According to a recent FedScoop study of 168 federal employees and others in the federal sector, only 35 percent said their managers supported the use of personal mobile devices for official business. Yet 74 percent said they regularly use personally-owned tablets to get their work done. Another 49 percent said they regularly used personal smartphones.

In other words, employees routinely flout the rules – either knowingly or otherwise – to make themselves more productive.

“They’re used to having all this power in their hand, being able to upgrade and download apps, do all kinds of things instantaneously, no matter where they are,” says Michael Wilkerson, senior director for end-user computing and mobility at VMWare Federal, the underwriter for the research study conducted by FedScoop. “The workforce is getting younger and employees are coming in with certain expectations.”

Those expectations include mobile. At the General Services Administration (GSA), where more than 90 percent of jobs are approved for telework and where most staff do not have permanent desks or offices, each employee is issued a mobile device and a laptop. “There’s a philosophy of anytime, anywhere, any device,” says Rick Jones, Federal Mobility 2.0 Manager at GSA. Employees can log into virtual desktop infrastructure to access any of their work files from any device. “Telework is actually a requirement at GSA. You are expected to work remotely one or two days a week,” he says, so the agency is really serious about making employees entirely independent of conventional infrastructure. “We don’t even have desks,” he says. “You need to register for a cube in advance.”

That kind of mobility is likely to increase in the future, especially as fifth-generation (5G) mobile services come into play. With more wireless connections installed more densely, 5G promises data speeds that could replace conventional wired infrastructure, save wiring costs and increase network flexibility – all while significantly increasing the number of mobile-enabled workers.

Shadow IT
When Information Technology (IT) departments don’t give employees the tools and applications they need or want to get work done, employees are likely to find their own, using cloud-based apps they can download to their phones, tablets and laptops.

Rajiv Gupta, president of Skyhigh Networks of Campbell, Calif., which provides a cloud-access security service, says his company found that users in any typical organization – federal, military or commercial – access more than 1,400 cloud-based services, often invisibly to IT managers. Such uses may be business or personal, but either can have an impact on security if devices are being used for both. Staff may be posting on Facebook, Twitter and LinkedIn, any of which could be personal but could also be official or in support of professional aims. Collaboration tools like Basecamp, Box, DropBox or Slack are often easy means of setting up unofficial work groups to share files when solutions like SharePoint come up short. Because such uses are typically invisible to the organization, he says, they create a “more insidious situation” – the potential for accidental information leaks or purposeful data exfiltrations by bad actors inside the organization.

“If you’re using a collaboration service like Microsoft 365 or Box, and I share a file with you, what I’m doing is sharing a link – there’s nothing on the network that I can observe to see the files moving,” he says. “More than 50 percent of all data security leaks in a service like 365 is through these side doors.”

The organization may offer users the ability to use OneDrive or Slack, but if users perceive those as difficult or the access controls as unwieldy (user authentication is among mobile government users’ biggest frustrations, according to the VMWare/FedScoop study), they will opt for their own solutions, using email to move data out of the network and then collaborating beyond the reach of the IT and security staff.

While some such instances may be nefarious – as in the case of a disgruntled employee for example – most are simply manifestations of well-meaning employees trying to get their work done as efficiently as possible.

“So employees are using services that you and I have never even heard of,” Gupta says, services like Zippyshare, Footlocker and Findspace. Since most of these are simply classified as “Internet services,” standard controls may not be effective in blocking them, because shutting down the whole category is not an option, Gupta says. “If you did you would have mutiny on your hands.” So network access controls need to be narrowly defined and operationalized through whitelisting or blacklisting of sites and apps.
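Gupta's point about narrowly-defined controls can be sketched as a simple policy lookup rather than a category-wide block. The service names and the default action below are illustrative assumptions, not a vetted policy:

```python
# Minimal sketch of narrowly-scoped access control: sanctioned
# collaboration services are whitelisted, known high-risk file-sharing
# sites are blacklisted, and everything else falls to a default action
# instead of a blanket "Internet services" category block.

WHITELIST = {"onedrive.live.com", "slack.com"}   # sanctioned services (assumed)
BLACKLIST = {"zippyshare.com"}                   # known high-risk sites (assumed)

def access_decision(host: str, default: str = "allow-and-log") -> str:
    """Return the policy action for one destination host."""
    host = host.lower()
    if host in WHITELIST:
        return "allow"
    if host in BLACKLIST:
        return "block"
    return default  # unknown services are permitted but logged for review
```

The "allow-and-log" default captures the middle ground Gupta describes: shutting down the whole category would cause a mutiny, but unknown services still deserve scrutiny.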

Free services are a particular problem because employees don’t see the risk, says Sean Kelley, chief information security officer at the Environmental Protection Agency (EPA). At an Institute for Critical Infrastructure conference in May, he said the problem traces back to the notion that free or subscription services aren’t the same as information technology. “A lot of folks said, well, it’s cloud, so it’s not IT,” he said. “But as we move from network-based security to data security, we need to know where our data is going.”

The Federal Information Technology Acquisition Reform Act was supposed to empower chief information officers (CIOs) by giving them more control over such purchases. But regulating free services and understanding the extent to which users may be using them is extremely difficult, whether in government or the private sector. David Summitt, chief information security officer (CISO) at the Moffitt Cancer Center in Tampa, Fla., described an email he received from a salesman representing a cloud service provider. The email contained a list of more than 100 Moffitt researchers who were using his company’s technology – all unbeknownst to the CISO. His immediate reply: “I said thank you very much – they won’t be using your service tomorrow.” Then he shut down access to that domain.

Controlling Mobile Use
Jon Johnson, program manager for enterprise mobility at GSA, acknowledges that even providing access to email opens the door to much wider use of mobile technology. “I too download and open documents to read on the Metro,” he said. “The mobile devices themselves do make it more efficient to run a business. The question is, how can a CIO create tools and structures so their employees are more empowered to execute their mission effectively, and in a way that takes advantage not only of the mobile devices themselves, but also helps achieve a more efficient way of operating the business?”

Whether agencies choose to whitelist approved apps or blacklist high-risk ones, Johnson said, every agency needs to nail down the solution that best applies to its needs. “Whether they have the tools that can continually monitor those applications on the end point, whether they use vetting tools,” he said, each agency must make its own case. “Many agencies, depending on their security posture, are going to have those applications vetted before they even deploy their Enterprise Mobility Management (EMM) onto that device. There is no standard for this because the security posture for the Defense Information Systems Agency (DISA) and the FBI are different from GSA and the Department of Education.

“There’s always going to be a tradeoff between the risk of allowing your users to use something in a way that you may not necessarily predict versus locking everything down,” says Johnson.

Johnson and GSA have worked with a cross-agency mobile technology tiger team for years to try to nail down standards and policies that can make rolling out a broader mobile strategy easier on agency leaders. “Mobility is more than carrier services and devices,” he says. “We’ve looked at application vetting, endpoint protection, telecommunication expense management and emerging tools like virtual mobile interfaces.” He adds they’ve also examined the evolution of mobile device management solutions to more modern enterprise mobility management systems that take a wider view of the mobile world.

Today, agencies are trying to catch up to the private sector and overcome the government’s traditionally limited approach to mobility. At the United States Agency for International Development (USAID), Lon Gowan, chief technologist and special advisor to the CIO, notes that half the agency’s staff work in far-flung remote locations, many of them austere. “We generally treat everyone as a mobile worker,” Gowan says.

Federal agencies remain leery of adopting bring-your-own-device policies, just as many federal employees are leery of giving their agencies access to their personal information. Older mobile device management software gave organizations the ability to monitor activity and wipe entire devices; today’s enterprise management solutions let a single device be split, containing both personal and business data. And never the twain shall meet.

“We can either allow a fully managed device or one that’s self-owned, where IT manages a certain portion of it,” says VMware’s Wilkerson. “You can have a folder that has a secure browser, secure mail, secure apps and all of that only works within that container. You can set up secure tunneling so each app can do its own VPN tunnel back to the corporate enterprise. Then, if the tunnel gets shut down or compromised, it shuts off the application, browser – or whatever – is leveraging that tunnel.”

Another option is to use mobile-enabled virtual desktops where applications and data reside in a protected cloud environment, according to Chris Barnett, chief technology officer for GDIT’s Intelligence Solutions Division. “With virtual desktops, only a screen image needs to be encrypted and communicated to the mobile device. All the sensitive information remains back in the highly-secure portion of the Enterprise. That maintains the necessary levels of protection while at the same time enabling user access anywhere, anytime.”

When it comes to classified systems, of course, the bar moves higher as the risks associated with a compromise increase. Neil Mazuranic, DISA’s Mobility Capabilities branch chief in the DoD Mobility Portfolio Management Office, says his team can hardly keep up with demand. “Our biggest problem at DISA at the secret level and top secret level, is that we don’t have enough devices to go around,” he says. “Demand is much greater than the supply. We’re taking actions to push more phones and tablets out there.” But capacity will likely be a problem for a while.

The value is huge however, because the devices allow senior leaders “to make critical, real-world, real-time decisions without having to be tied to a specific place,” he says. “We want to stop tying people to their desks and allow them to work wherever they need to work, whether it’s classified work or unclassified.”

DISA is working on increasing the number of classified phones by using Windows devices, which provide greater ability to lock down security than is possible with iOS or Android devices. By using products not in the mainstream, the software can be better controlled. In the unclassified realm, DISA secures both iOS and Android devices using managed solutions allowing dual office and personal use. For iOS, a managed device solution establishes a virtual wall in which some apps and data are managed and controlled by DISA, while others are not.

“All applications that go on the managed side of the devices, we evaluate and make sure they’re approved to use,” DISA’s Mazuranic told GovTechWorks. “There’s a certain segment that passes with flying colors and that we approve, and then there are some questionable ones that we send to the authorizing official to accept the risk. And there are others that we just reject outright. They’re just crazy ones.”

Segmenting the devices, however, gives users freedom to download apps for their personal use with a high level of assurance that those apps cannot access the controlled side of the device. “On the iOS device, all of the ‘for official use only’ (FOUO) data is on the managed side of the device,” he said. “All your contacts, your email, your downloaded documents, they’re all on the managed side. So when you go to the Apple App Store and download an app, that’s on the unmanaged side. There’s a wall between the two. So if something is trying to get at your contacts or your data, it can’t, because of that wall. On the Android device, it’s similar: There’s a container on the device, and all the FOUO data on the device is in that container.”
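The “wall” Mazuranic describes can be modeled as a simple access check: resources on the managed side are readable only by managed apps, while personal-side resources remain open. This is an illustrative model, not DISA’s actual implementation; the app and resource names are invented.

```python
# Illustrative model of the managed/unmanaged "wall" on a dual-use device:
# apps outside the managed container cannot reach FOUO data stores.

MANAGED_APPS = {"gov-mail", "gov-contacts"}          # vetted, DISA-managed apps
MANAGED_DATA = {"contacts", "email", "fouo-documents"}  # FOUO-side resources

def can_access(app: str, resource: str) -> bool:
    """An app may read a managed resource only if it is itself managed."""
    if resource in MANAGED_DATA:
        return app in MANAGED_APPS
    # Personal-side resources (photos, games, etc.) are open to any app.
    return True
```

So an app downloaded from the public App Store onto the unmanaged side is simply never in the managed set, and every request it makes for contacts or email fails the check.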


How Employers Try to Retain Tech Talent

As soon as Scott Algeier hires a freshly minted IT specialist out of college, a little clock starts ticking inside his head.

It’s not that he doesn’t have plenty to offer new hires in his role as director of the non-profit Information Technology-Information Sharing and Analysis Center (IT-ISAC) in Manassas, Va., nor that IT-ISAC cannot pay a fair wage. The issue is that Algeier is in an all-out war for talent – and experience counts. Contractors, government agencies – indeed virtually every other employer across the nation – value experience almost as much as education and certifications.

As employees gain that experience, they see their value grow. “If I can get them to stay for at least three years, I consider that a win,” says Algeier. “We have one job where it never lasts more than two years. The best I can do is hire quality people right out of college, train them and hope they stick around for three years.”

The Military Context
An October 2016 white paper from the Air Force University’s Research Institute says churn is even more dire among those in the military, particularly in the Air Force, which is undergoing a massive expansion of its cyber operations units.

“The present demand for cybersecurity specialists in both the public and private sectors could undoubtedly lead the Air Force to be significantly challenged in retaining its most developed and experienced cyber Airmen in the years ahead,” writes Air Force Major William Parker IV, author of the study.

“In the current environment, shortages in all flavors of cyber experts will increase, at least in the foreseeable future. Demand for all varieties of cybersecurity-skilled experts in both the private and public sectors is only rising.”

Meanwhile, an estimated 30,000 or more cybersecurity jobs stand unfilled across the federal government, writes Parker. According to the International Information System Security Certification Consortium (ISC)², demand for cyber-certified professionals will continue to increase at 11 percent per year for the foreseeable future. Some estimates place the global cyber workforce shortage at close to a million.

The military – both a primary trainer and employer in cyber – offers some interesting insight. A recent survey of Air Force cyber specialists choosing between re-enlistment and pursuit of opportunities in the civilian world indicates those who chose to reenlist were primarily influenced by job security and benefits, including health, retirement, and education and training.

“For those Airmen who intended to separate, civilian job opportunities, pay and allowances, bonuses and special pays, promotion opportunities and the evaluation system contributed most heavily to their decisions [to leave the military],” Parker’s paper concluded.

Indeed, several airmen who expressed deep pride and love of serving in the Air Force stated they chose to separate because they felt their skills were not being fully utilized.

“Also, they were aware they had the ability to earn more income for their families in the private sector,” adds Parker. The re-enlistment bonuses the Air Force offered were not enough to make up the pay differences these airmen saw.

“It is also interesting that many of those who say that they will reenlist, included optimistic comments that they hope ‘someday’ they may be able to apply the cyber skills they have attained in the service of the nation.”

Tech companies present a different set of competitive stresses, competing with high pay, industrial glamor and attractive perks. Apple’s new Cupertino, Calif., headquarters epitomizes the age: an airy glass donut that looks like it just touched down from a galaxy far, far away, filled with cafés, restaurants, a wellness center, a child care facility and even an Eden-like garden inside the donut hole. Amazon’s $4 billion urban campus is anchored by the improbable “spheres”: three interlocking, multistory glass structures housing treehouse meeting rooms, offices and collaborative spaces filled with trees, rare plants, waterfalls and a river that runs through it all.

While Washington, D.C., contractors and non-profits do not have campus rivers or stock option packages, they do have other ways to compete. At the forefront are the high-end missions in which both they and their customers perform. They also offer professional development, certifications, job flexibility and sometimes, the ability to work from home.

“We work with the intelligence community and the DoD,” says Chris Hiltbrand, vice president of Human Resources for General Dynamics Information Technology’s Intelligence Solutions Division. “Our employees have the opportunity to apply cutting-edge technologies to interesting and important missions that truly make a difference to our nation. It’s rewarding work.”

While sometimes people leave for pay packages from Silicon Valley, he admits, pay is rarely the only issue employees consider. Work location, comfort and familiarity, quality of work, colleagues, career opportunities and the impact of working on a worthwhile mission, all play a role.

“It’s not all about maximizing earning potential,” Hiltbrand says. “In terms of money, people want to be compensated fairly – relative to the market – for the work they do. We also look at other aspects of what we can offer, and that is largely around the customer missions we support and our reputation with customers and the industry.”

Especially for veterans, mission, purpose and service to the nation are real motivators. GDIT then goes a step further, supporting staff who are members of the National Guard or military reservists with extra benefits, such as paying the difference in salary when staff go on active duty.

Mission also factors into the equation at IT-ISAC, Algeier says. “Our employees get to work with some of the big hitters in the industry and that experience definitely keeps them here longer than they might otherwise. But over time, that also has an inevitable effect.

“I get them here by saying: ‘Hey, look who you get to work with,’” he says. “And then within a few years, it’s ‘hey, look who they’re going to go work with.’”

Perks and Benefits
Though automation may seem like a way to replace people rather than entice them to stay, it can be a valuable, if unlikely, retention tool.

Automated tools spare staff from the tedious work some find demoralizing (or boring), and save hours or even days for higher-level work, Algeier says. “That means they can now go do far more interesting work instead.” More time doing interesting work leads to happier employees, which in turn makes staff more likely to stay put.

Fitness and wellness programs are two other creative ways employers invest in keeping the talent they have. Gyms, wellness centers, in-house yoga studios, exercise classes and even CrossFit boxes are some components. Since exercise helps relieve stress, and stress can trigger employees to start looking elsewhere for work, it stands to reason that reducing stress can ease the strains of work and boost productivity. Keeping people motivated helps keep them from negative feelings that might lead them to seek satisfaction elsewhere.

Providing certified life coaches is another popular way employers can help staff, focusing on both personal and professional development. Indeed, Microsoft deployed life coaches at its Redmond headquarters more than a decade ago. They specialize in working with adults with Attention Deficit Hyperactivity Disorder (ADHD), and can help professionals overcome weaknesses and increase performance.

Such benefits used to be the domain of Silicon Valley alone, but not anymore. Fairfax, Va.-based boutique security company MKACyber was launched by Mischel Kwon after posts as director of the Department of Homeland Security’s U.S. Computer Emergency Readiness Team (US-CERT) and as vice president of public sector security solutions for Bedford, Mass.-based RSA. Kwon built her company with what she calls “a West Coast environment.”

The company provides breakfast, lunch and snack foods, private “chill” rooms, and operates a family-first environment, according to a job posting. It also highlights the company’s strong commitment to diversity and helps employees remain “life-long learners.”

Kwon says diversity is about more than just hiring the right mix of people. How you treat them is the key to how long they stay.

“There are a lot of things that go on after the hire that we have to concern ourselves with,” she said at a recent RSA conference.

Retention is a challenging problem for everyone in IT, Kwon says, but managers can do more to think differently about how to hire and keep new talent, beginning by focusing not just on raw technical knowledge, but also on soft skills that make a real difference when working on projects and with teams.

“We’re very ready to have people take tests, have certifications, and look at the onesy-twosy things that they know,” says Kwon. “What we’re finding though, is just as important as the actual content that they know, is their actual work ethic, their personalities. Do they fit in with other people? Do they work well in groups? Are they life-long learners? These types of personal skills are as important as technical skills,” Kwon says. “We can teach the technical skills. It’s hard to teach the work ethic.”

Flexible Work Schedules
Two stereotypes define the modern tech age. One is the all-night coder working in a perk-laden office, fueled by free food, lattes and energy drinks. The other is the virtual meeting populated by individuals spread across the nation or the globe, sitting in home offices or bedrooms, working on their laptops. For many, working from home is no longer a privilege. It’s a right – or at least an opportunity to make work and life balance out. Have to wait for a plumber to fix the leaky sink? No problem: Dial in remotely. In the District of Columbia, the government and many employers encourage regular telework as a means to reduce traffic and congestion – as well as for convenience.

Still, working from home inevitably draws questions. IBM, for years one of the staunchest supporters of telework, has backtracked on the culture it built, telling workers they need to be in the office regularly if they want to stay employed. The policy shift follows similar moves by Yahoo!, among others.

GDIT’s Hiltbrand says because its staff works at company locations as well as on government sites, remote work is common.

“We have a large population of people who have full or part-time teleworking,” he says. “We are not backing away from that model. If anything, we’re trying to expand on that culture of being able to work from anywhere, anytime and on any device.”

Of course, that’s not possible for everyone. Staff working at military and intelligence agencies don’t typically have that flexibility. “But aside from that,” adds Hiltbrand, “we’re putting a priority on the most flexible work arrangements possible to satisfy employee needs.”


Automation Critical to Securing Code in an Agile, DevOps World

The world’s biggest hack might have happened to anyone. The same software flaw hackers exploited to expose 145 million identities in the Equifax database – most likely yours included – was also embedded in thousands of other computer systems belonging to all manner of businesses and government agencies.

The software in question was a commonly used open-source piece of Java code known as Apache Struts. The Department of Homeland Security’s U.S. Computer Emergency Readiness Team (US-CERT) issued a warning March 8 detailing the risk posed by a flaw in that code. Like many others, Equifax reviewed the warning and searched its systems for the affected code. Unfortunately, the Atlanta-based credit bureau failed to find it among the millions of lines of code in its systems. Hackers exploited the flaw three days later.

Open source and third-party software components like Apache Struts now make up between 80 and 90 percent of software produced today, says Derek Weeks, vice president and DevOps advocate at Sonatype, a provider of security tools and manager of The Central Repository, the world’s largest collection of open source software components. Programmers completed nearly 60 billion downloads from the repository in 2017 alone.

“If you are a software developer in any federal agency today, you are very aware that you are using open-source and third-party [software] components in development today,” Weeks says. “The average organization is using 125,000 Java open source components – just Java alone. But organizations aren’t just developing in Java, they’re also developing in JavaScript, .Net, Python and other languages. So that number goes up by double or triple.”

Reusing software saves time and money. It’s also critical to supporting the rapid cycles favored by today’s Agile and DevOps methodologies. Yet while reuse promises time-tested code, it is not without risk: Weeks estimates one in 18 downloads from The Central Repository – 5.5 percent – contains a known vulnerability. Because it never deletes anything, the repository is a user-beware system. It’s up to software developers themselves – not the repository – to determine whether or not the software components they download are safe.
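A minimal sketch of what that user-beware model demands of the build process: every downloaded component is checked against a feed of known-vulnerable versions. The advisory table below holds a single real example – the Struts flaw exploited at Equifax – but a production audit would query a live vulnerability database, not a hard-coded table.

```python
# Hedged sketch of a component audit: the repository itself never warns
# you, so the build must check each download against known advisories.

KNOWN_VULNS = {
    # (component coordinates, version) -> advisory.
    # Struts 2.3.31 was among the versions affected by the flaw
    # exploited in the Equifax breach.
    ("org.apache.struts:struts2-core", "2.3.31"): "CVE-2017-5638",
}

def audit(components):
    """Return (component, version, advisory) for each flagged download."""
    findings = []
    for name, version in components:
        advisory = KNOWN_VULNS.get((name, version))
        if advisory:
            findings.append((name, version, advisory))
    return findings
```

Run against a project’s dependency list, an empty result means no *known* vulnerabilities – which, as the Equifax case shows, is only as good as the feed and the completeness of the search.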

Manual Review or Automation?

Performing a manual, detailed security analysis of each open-source software component, to ensure it is safe and free of vulnerabilities, takes hours. That, in turn, eats into precious development time, undermining the intended efficiency of reusing code in the first place.

Tools from Sonatype, Black Duck of Burlington, Mass., and others automate most of that work. Sonatype’s Nexus Firewall, for example, scans modules as they come into the development environment and stops them if they contain flaws. It also suggests safe alternatives, such as newer versions of the same components. Development teams can employ a host of automated tools to simplify or speed other parts of the build, test and secure processes.

Some of these are commercial products; others, like the software itself, are open source. For example, Jenkins is a popular open-source DevOps tool that helps developers quickly find and fix defects in their codebase. These tools focus on the reused code in a system; static analysis tools, like those from Veracode, focus on the critical custom code that glues that open-source software together into a working system.

“Automation is key to agile development,” says Matthew Zach, director of software engineering at General Dynamics Information Technology’s (GDIT) Health Solutions. “The tools now exist to automate everything: the builds, unit tests, functional testing, performance testing, penetration testing and more. Ensuring the code behind new functionality not only works, but is also secure, is critical. We need to know that the stuff we’re producing is of high quality and meets our standards, and we try to automate as much of these reviews as possible.”
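The fail-fast chaining Zach describes – build, test, scan, stop at the first failure – can be sketched in a few lines. The stage names are illustrative assumptions; real teams would wire these gates into a CI tool such as Jenkins rather than a standalone script.

```python
# Minimal sketch of a fail-fast automated pipeline: each stage is a
# (name, check) pair, and the run stops at the first failing gate so
# later stages never execute against broken code.

def run_pipeline(stages):
    """Run (name, check) pairs in order; return (passed, log of results)."""
    log = []
    for name, check in stages:
        ok = check()
        log.append((name, ok))
        if not ok:
            break  # fail fast
    passed = len(log) == len(stages) and all(ok for _, ok in log)
    return passed, log
```

For example, `run_pipeline([("build", do_build), ("unit-tests", do_tests), ("security-scan", do_scan)])` would skip the security scan entirely if the unit tests fail, surfacing the cheapest failure first.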

But automated screening and testing is still far from universal. Some use it, others don’t. Weeks describes one large financial services firm that prided itself on its software team’s rigorous governance process. Developers were required to ask permission from a security group before using open source components. The security team’s thorough reviews took about 12 weeks for new components and six to seven weeks for new versions of components already in use. Even so, officials estimated some 800 open source components had made it through those reviews and were in use in their 2,000-plus deployed applications.

Then, Sonatype was invited to scan the firm’s deployed software. “We found more than 13,000 open source components were running in those 2,000 applications,” Weeks recalls. “It’s not hard to see what happened. You’ve got developers working on two-week sprints, so what do you think they’re going to do? The natural behavior is, ‘I’ve got a deadline, I have to meet it, I have to be productive.’ They can’t wait 12 weeks for another group to respond.”

Automation, he said, is the answer.

Integration and the Supply Chain

Building software today is a lot like building a car: Rather than manufacture every component, from the screws to the tires to the seat covers, manufacturers focus their efforts on the pieces that differentiate products and outsource the commodity pieces to suppliers.

Chris Wysopal, chief technology officer at Veracode, said the average software application today uses 46 ready-made components. Like Sonatype, Veracode offers a testing tool that scans components for known vulnerabilities; its test suite also includes a static analysis tool to spot problems in custom code and a dynamic analysis tool that tests software in real time.

As development cycles get shorter, the demand for automating features is increasing, Wysopal says. The five-year shift from waterfall to Agile shortened typical development cycles from months to weeks. The advent of DevOps and continuous development accelerates that further, from weeks to days or even hours.

“We’re going through this transition ourselves. When we started Veracode 11 years ago, we were a waterfall company. We did four to 10 releases a year,” Wysopal says. “Then we went to Agile and did 12 releases a year and now we’re making the transition to DevOps, so we can deploy on a daily basis if we need or want to. What we see in most of our customers is fragmented methodologies: It might be 50 percent waterfall, 40 percent agile and 10 percent DevOps. So they want tools that can fit into that DevOps pipeline.”

A tool built for speed can support slower development cycles; the opposite, however, is not the case.

One way to enhance testing is to let developers know sooner that they may have a problem. Veracode is developing a product that will scan code as it’s written, running a scan every few seconds and alerting the developer as soon as a problem is spotted. This has two effects: first, to clean up problems more quickly, and second, to help train developers to avoid those problems in the first place. In that sense, it’s like spell check in a word processing program.
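In spirit, such a scanner works like this toy sketch, which flags risky patterns each time a source file changes. The two patterns are invented examples for illustration; a real engine like Veracode’s applies far deeper static and data-flow analysis, not regular expressions.

```python
# Spell-check-style source scan sketch: flag risky patterns the moment
# code changes, so the developer sees the problem while still typing.
import re

RISKY = {
    r"\beval\(": "dynamic code execution",          # running arbitrary strings
    r"password\s*=\s*[\"']": "hard-coded credential",  # secrets in source
}

def scan_source(text):
    """Return a sorted list of findings for risky patterns in the text."""
    findings = []
    for pattern, why in RISKY.items():
        if re.search(pattern, text):
            findings.append(why)
    return sorted(findings)
```

Hooked to a file watcher or editor plugin, an empty result each time the developer pauses is the equivalent of a document with no red squiggles.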

“It’s fundamentally changing security testing for a just-in-time programming environment,” Wysopal says.

Yet as powerful and valuable as automation is, these tools alone will not make you secure.

“Automation is extremely important,” he says. “Everyone who’s doing software should be doing automation. And then manual testing on top of that is needed for anyone who has higher security needs.” He puts the financial industry and government users into that category.

For government agencies that contract for most of their software, understanding what kinds of tools and processes their suppliers use to ensure software quality is critical. That could mean hiring a third party to do security testing on software when it’s delivered, or it could mean requiring systems integrators and development firms to demonstrate their security processes and procedures before software is accepted.

“In today’s Agile-driven environment, software vulnerability can be a major source of potential compromise to sprint cadences for some teams,” says GDIT’s Zach. “We can’t build a weeks-long manual test and evaluation cycle into Agile sprints. Automated testing is the only way we can validate the security of our code while still achieving consistent, frequent software delivery.”

According to Veracode’s State of Software Security 2017, 36 percent of the survey’s respondents do not run (or were unaware of) automated static analysis on their internally developed software. Nearly half never conduct dynamic testing in a runtime environment. Worst of all, 83 percent acknowledge releasing software before resolving security issues.

“The bottom line is all software needs to be tested. The real question for teams is what ratio and types of testing will be automated and which will be manual,” Zach says. “By exploiting automation tools and practices in the right ways, we can deliver the best possible software, as rapidly and securely as possible, without compromising the overall mission of government agencies.”
