A fast lane to the cloud

As cloud traffic grows, wide-area network optimization could help keep users happily cruising along.

Everything old is new again, as the saying goes. Take wide-area network (WAN) optimization. Products designed to boost data traffic over long distances date back about 10 years. Agencies have used the technology to speed the delivery of applications from a centralized data center to branch offices or help back up data to an off-site facility.

Today, cloud computing can create bandwidth and latency issues reminiscent of the situations that prompted WAN optimization in the past. Now agencies are tapping public cloud providers or recasting their own data centers as private clouds that offer IT resources to remote shared-services customers. Both scenarios potentially involve long network trips.

So should cloud adopters incorporate WAN optimization into their plans?

Despite the slew of vendor marketing materials and white papers that make the case for doing so, not many government agencies are buying into WAN optimization. The likely reason, experts say, is that cloud computing has yet to tax bandwidth to the point where the technology seems important.

“There’s not a lot of bandwidth consumption at this point,” said Michael Sorenson, director of cloud services at QinetiQ North America.

Nevertheless, interest in WAN optimization will likely grow when cloud-deploying organizations begin encountering bandwidth and latency obstacles. Agencies that already use WAN optimization — for instance, in the branch-office example cited above — could potentially redeploy the technology in a cloud setting.

But agencies will need to proceed carefully. WAN optimization products use proprietary compression algorithms, so a cloud service provider’s WAN optimization appliance might not be compatible with an agency’s technology. Interoperability standards don’t yet exist, so agencies must be prepared to compromise.

Why it matters

Some agencies have begun looking into WAN optimization, particularly those pursuing more ambitious cloud strategies. For instance, Michigan launched MiCloud, a cloud-based program that offers data storage and automated hosting, in 2010. A year later, bandwidth became a bigger consideration as the state sought to bring cloud services to local governments under its shared-services initiative.

Bandwidth is a key enabler of the state’s next generation of shared services, so WAN optimization might be among the future options, said Bob McDonough, Michigan’s lead cloud architect.

“If we come up with a very innovative shared solution and nobody can consume it — because key stakeholders are bandwidth-constrained — that’s a failure,” he added.

Tom Houston, chief technologist at Hewlett-Packard Enterprise Services’ U.S. public sector, said latency concerns mean WAN optimization “will remain important with the cloud.”

But before taking the leap, organizations should determine whether their style of cloud computing could benefit from a WAN optimization product.

For example, an agency that uses software as a service might be able to take advantage of optimization techniques with existing gear, such as bandwidth shaping, said John Burke, a principal research analyst at Nemertes Research. That approach makes sure traffic from an organization’s SaaS provider has precedence over less critical network traffic. Routers and server operating systems can be configured to perform bandwidth shaping.
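For agencies that go the do-it-yourself route, the sketch below shows roughly what bandwidth shaping looks like at the operating-system level. It assumes a Linux host and drives the kernel's traffic-control utility from Python; the interface name, rates and SaaS address are illustrative assumptions, and routers would accomplish the same thing with vendor-specific commands.

```python
import subprocess

# Illustrative values only: interface, rates and SaaS address are assumptions.
INTERFACE = "eth0"                 # WAN-facing interface on a Linux host
SAAS_ADDRESS = "203.0.113.10"      # documentation-range IP standing in for a SaaS provider
WAN_RATE = "100mbit"               # total link rate
SAAS_GUARANTEE = "60mbit"          # bandwidth reserved for SaaS traffic
OTHER_GUARANTEE = "40mbit"         # bandwidth left for everything else

def tc(*args):
    """Apply one tc command, raising an error if it fails."""
    subprocess.run(["tc", *args], check=True)

# Root hierarchical token bucket; unclassified traffic falls into class 1:20.
tc("qdisc", "add", "dev", INTERFACE, "root", "handle", "1:", "htb", "default", "20")
tc("class", "add", "dev", INTERFACE, "parent", "1:", "classid", "1:1", "htb", "rate", WAN_RATE)

# Class 1:10 guarantees bandwidth for SaaS traffic; 1:20 carries less critical traffic.
tc("class", "add", "dev", INTERFACE, "parent", "1:1", "classid", "1:10",
   "htb", "rate", SAAS_GUARANTEE, "ceil", WAN_RATE, "prio", "0")
tc("class", "add", "dev", INTERFACE, "parent", "1:1", "classid", "1:20",
   "htb", "rate", OTHER_GUARANTEE, "ceil", WAN_RATE, "prio", "1")

# Steer traffic destined for the SaaS provider into the guaranteed class.
tc("filter", "add", "dev", INTERFACE, "parent", "1:", "protocol", "ip", "prio", "1",
   "u32", "match", "ip", "dst", f"{SAAS_ADDRESS}/32", "flowid", "1:10")
```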

Indeed, some clouds require just a bit of tweaking to boost wide-area connections.

Ray O’Brien, chief technology officer for IT at NASA’s Ames Research Center, said organizations using a WAN to move data into or out of the cloud might find that performance doesn’t meet their expectations. In those cases, host tuning could get things flowing faster, he said. That approach could involve adjusting the Transmission Control Protocol (TCP) buffer size to get the best throughput.
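As a rough sketch of the kind of host tuning O'Brien describes, the Python snippet below sizes a socket's TCP buffers to match the link's bandwidth-delay product. The link speed and round-trip time are illustrative assumptions, and in practice the operating system's global buffer limits often need raising as well.

```python
import socket

# Illustrative assumptions: a 1 Gbit/s WAN link with an 80 ms round-trip time.
LINK_BPS = 1_000_000_000      # link speed in bits per second
RTT_SECONDS = 0.080           # round-trip time

# Bandwidth-delay product: the amount of data that can be "in flight" on the link.
# Buffers smaller than this cap throughput no matter how much bandwidth is available.
bdp_bytes = int(LINK_BPS * RTT_SECONDS / 8)   # 10,000,000 bytes, roughly 10 MB

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)

# Request send and receive buffers at least as large as the bandwidth-delay product.
# The OS may clamp these values to its configured maximums.
sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, bdp_bytes)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, bdp_bytes)

print("Requested buffer size:", bdp_bytes, "bytes")
print("Granted send buffer:", sock.getsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF))
```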

In the case of its Nebula cloud computing platform, NASA draws on networking specialists at Ames and Goddard Space Flight Center to troubleshoot bandwidth issues. “When there are large datasets or some time-sensitivity, that is when they tend to get involved,” O’Brien said.

The fundamentals

Some agencies might discover that bandwidth constraints require WAN optimization in the form of additional technology. If that’s the case, they can choose from a bevy of products available from vendors such as Blue Coat Systems, Cisco Systems, F5 Networks, Juniper Networks and Riverbed Technology, among others.

Such products use data compression and de-duplication to cut back on the amount of data that traverses a WAN. They also incorporate techniques such as traffic prioritization and protocol optimization.
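The compression and de-duplication piece can be sketched in a few lines of Python. Real WAN optimization products rely on proprietary and far more sophisticated algorithms; the fixed chunk size and sample data below are purely illustrative.

```python
import hashlib
import zlib

CHUNK_SIZE = 4096          # illustrative fixed chunk size; real products vary
seen_chunks = set()        # fingerprints of chunks already sent across the WAN

def prepare_for_wan(data: bytes):
    """Split data into chunks, skip chunks the far end already has, compress the rest."""
    payload = []
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        fingerprint = hashlib.sha256(chunk).hexdigest()
        if fingerprint in seen_chunks:
            # De-duplication: send only a short reference to a chunk the peer has cached.
            payload.append(("ref", fingerprint))
        else:
            seen_chunks.add(fingerprint)
            # Compression: shrink the chunk before it crosses the WAN.
            payload.append(("data", fingerprint, zlib.compress(chunk)))
    return payload

# Sending the same block twice transmits full data once, then only short references.
block = b"replicated virtual machine image segment " * 200
first = prepare_for_wan(block)
second = prepare_for_wan(block)
print(sum(1 for item in first if item[0] == "data"), "chunks sent in full on the first pass")
print(sum(1 for item in second if item[0] == "ref"), "chunks sent as references on the second pass")
```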

Traditionally, WAN optimization products have taken the form of hardware. An appliance is installed on both ends of a link — for example, at the data center and at the branch office. Vendors refer to the two-appliance approach as symmetrical.

In the past couple of years, companies have started offering virtual appliances for cloud computing and virtualized IT. Vendors such as Certeon and NetEx offer the software alternative.

Alternatively, customers can choose WAN optimization as a managed service. In that scenario, a service provider manages the appliance under a subscription plan.

In yet another variation, agencies can opt for an asymmetrical approach. Last year, Blue Coat debuted its CloudCaching Engine, which uses an appliance housed at the customer’s location to accelerate applications hosted in a public cloud. The idea behind caching is to localize a good chunk of an application and thereby reduce the number of network trips.
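That principle can be illustrated with a short Python sketch of a read-through cache, which answers repeat requests locally instead of crossing the WAN. The fetch function below merely stands in for a round trip to a cloud-hosted application.

```python
import time

cache = {}                  # locally stored responses, keyed by request
CACHE_TTL_SECONDS = 300     # illustrative freshness window

def fetch_from_cloud(request_key: str) -> str:
    """Placeholder for a round trip to a cloud-hosted application (assumption)."""
    return f"response for {request_key}"

def cached_fetch(request_key: str) -> str:
    """Serve repeat requests from the local cache, avoiding a WAN round trip."""
    entry = cache.get(request_key)
    if entry is not None:
        value, stored_at = entry
        if time.time() - stored_at < CACHE_TTL_SECONDS:
            return value                      # cache hit: no network trip needed
    value = fetch_from_cloud(request_key)     # cache miss: one trip across the WAN
    cache[request_key] = (value, time.time())
    return value

# The first call crosses the WAN; repeat calls within the TTL are served locally.
print(cached_fetch("/app/report/42"))
print(cached_fetch("/app/report/42"))
```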

Some agencies have built their own optimization devices. The Energy Department’s Energy Sciences Network, for instance, deploys data transfer nodes at three sites to boost traffic. Eli Dart, ESnet network engineer at DOE, said each node operates in a “Science DMZ” portion of a DOE site or campus network. The Science DMZ typically resides near the site network’s perimeter so local resources have ready access to science-oriented WANs such as ESnet.

The hurdles

The good news for agencies already equipped with WAN optimization products is that those products can be repurposed for the cloud. The underlying communication protocols — TCP and User Datagram Protocol — apply to WAN and cloud alike, so products can be redeployed, industry executives say.

And now for the bad news: Agencies dealing with external cloud providers might run into compatibility problems. WAN optimization products run proprietary algorithms to handle tasks such as compression and decompression, and agencies must match appliances at each end of the link to make symmetrical acceleration work.

To succeed, agencies might have to adopt their cloud provider's choice of product.

One Texas agency is facing such a possibility. Right now, the Houston Police Officers' Pension System uses NetEx’s HyperIP product to speed the replication of virtual machines to a co-location facility that is outside the local hurricane zone, said Brian Poer, the pension system’s IT manager.

Poer said the agency considered going to the cloud for disaster recovery and business continuity but decided the do-it-yourself approach was more cost-effective. He noted that the pension system might adopt the cloud model in the future — at which point it might decide to implement the cloud vendor’s optimization technology if a return-on-investment analysis determines that’s the way to go.

Industry executives say agencies shouldn't count on cloud vendors meeting their WAN optimization preferences. The cost of evaluating, certifying and accrediting technology makes cloud vendors reluctant to introduce new products into their environments, they add.

Virtual appliances are somewhat more flexible. A cloud vendor might agree to deploy the agency’s choice of virtualized optimizer as part of the virtual infrastructure it’s running for the agency, Burke said. That presumes that all the vendor and agency gear runs on the same virtualization platform. If not, workarounds would be needed, he added.

Time for action? Questions to ask cloud providers

Agencies turning to an outside cloud provider should assess candidates’ capabilities carefully before pursuing a wide-area network optimization strategy.

1. Can the cloud provider deal with growing data requirements?

Fred Whiteside, project manager for cloud computing at the National Institute of Standards and Technology, said he thinks the larger cloud providers already have a good grasp on handling large amounts of data. He said smaller providers could face some challenges, however.

2. How does the cloud provider connect to the WAN?

John Burke, a principal research analyst at Nemertes Research, said the way a cloud service provider engineers network access is one factor that can reduce the need for optimization. Some providers, such as Salesforce.com, deal with carriers directly, he said. That kind of reach gives customers the same response time regardless of their location. A customer working with such a vendor might be able to do without bandwidth shaping but still need traffic prioritization, Burke added.

On the other hand, a customer dealing with a smaller provider that lacks the reach of a larger company might need an optimizer to get consistent response times, Burke said.