By Brian Robinson
Oct 21, 2002
As one of the first federal installations to use 10 Gigabit Ethernet, the Walter Reed Army Medical Center is unique among agencies in some ways, but typical in others.
On the unique side, it's a leading clinical research and education center that provides referral care for other centers in the United States and around the world. As such, its goal "is to take a visionary approach to [information technology], to cater to the needs of health care both today and for the future," according to Lt. Col. Vaseal Lewis, the center's chief information officer.
Part of that vision is to enable doctors to practice telemedicine, which provides real-time consultation services to far-flung centers using a seamless combination of voice, video and data communication services. However, the point at which vision intersected with reality was the center's existing data network, which was not the best candidate for delivering the needed bandwidth for new applications.
Although Walter Reed's IT requirements may be more demanding than some agencies', its budgetary constraints are similar. Therefore, any network upgrade would have to start earning its keep right away.
One way to do that is through a new voice-over-IP telephone system, which will be installed shortly after a 10 Gigabit Ethernet network is deployed. The voice-over-IP system is expected to cut $2 million a year in phone costs.
"We used the same Centrex phone system that a lot of other people use, which led right back to Verizon as our service provider and for which we paid a lot of money," said Roger Miller, the center's chief technology officer. "So we also wanted a voice-over-IP system that would work straight out of the box [over the new Ethernet network] so we wouldn't have to pay those phone bills anymore."
Disarming its ATM
For Walter Reed officials, the ultimate goal for the new network was to build a "stable, reliable, redundant platform with no single point of failure," Lewis said. The upgrade would focus on the part of the network that carries data traffic among buildings and major servers and networking equipment.
Center officials examined the possibility of improving and stabilizing the existing Asynchronous Transfer Mode (ATM) network, but they quickly decided against that option. It would cost about $4 million and would result in a system that could meet the center's needs for only a few years.
Meanwhile, Lewis and her staff were considering other choices, including Gigabit Ethernet and, to a lesser extent, 10 Gigabit Ethernet, which was just emerging as a viable option. Gigabit-speed Ethernet solutions were attractive because they could meet the center's future speed requirements and because the center's IT staff was already familiar with the technology, which could keep management costs down.
"We looked first at putting in 1 Gigabit Ethernet, but as we were going through the assessment process for that, we also looked at the possibilities of 10 Gigabit Ethernet over a single fiber," said Earl Kimberlake, chief of Walter Reed's networking branch.
The center already had multimode fiber in the ground for the ATM network, he said, but deploying new 10 Gigabit Ethernet rather than 1 Gigabit Ethernet meant using one-tenth the amount of that fiber to provide the desired 32 gigabits/sec data rate for the network's core.
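The fiber savings follow from simple arithmetic: each 10 Gigabit link carries the traffic of ten 1 Gigabit links, so hitting the same aggregate core rate takes a tenth as many fiber runs. A rough sketch of that trade-off, assuming (purely for illustration) that each Ethernet link of either speed consumes one fiber pair:

```python
# Illustrative link-count comparison for the 32 gigabits/sec core target.
# Assumption (not from the article): one fiber pair per Ethernet link.
import math

CORE_TARGET_GBPS = 32  # desired aggregate data rate for the network core

links_1g = math.ceil(CORE_TARGET_GBPS / 1)    # separate 1 Gigabit links needed
links_10g = math.ceil(CORE_TARGET_GBPS / 10)  # 10 Gigabit links needed

print(f"1 GbE: {links_1g} fiber pairs, 10 GbE: {links_10g} fiber pairs")
```

Per gigabit of capacity, the 10 Gigabit option consumes one-tenth the fiber, which is what made the pre-existing multimode plant stretch so much further.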
The desire to conserve fiber encouraged Walter Reed administrators to go with 10 Gigabit Ethernet, said Ken Cheng, vice president of marketing at Foundry Networks Inc., which provided the networking equipment and support for the installation. Although it's more expensive initially than using 1 Gigabit Ethernet, 10 Gigabit Ethernet minimizes the need to lay new fiber for future expansions.
"The amount of money needed to get digging started to lay fiber is huge," said Gary Hubbard, a Foundry engineer working on the Walter Reed project.
And using just one 10 Gigabit Ethernet card in the network switch for each fiber is a much easier setup to manage, Kimberlake said.
Using 10 Gigabit Ethernet also made the initial installation easier than using multiple 1 Gigabit ports, said Trevia Martin, vice president of operations for Force 3, the integrator for the Walter Reed project. Aside from the smaller number of ports, however, there was "nothing specific" that distinguished the 10 Gigabit installation from 1 Gigabit projects the company has completed in the past couple of years, she said.
That doesn't mean the Walter Reed project was easy. For one thing, keeping the old ATM network in place meant Force 3 couldn't simply install the new network in one attempt. Instead, with Ethernet running over ATM, the company had to take the Ethernet side of that configuration and "mesh it" into the separate 10 Gigabit network through a router.
Also, because the ATM network had never been upgraded since its installation in 1996, it was not operating efficiently. To help migrate the system to 10 Gigabit Ethernet, the project team broke the ATM network into segments so that the traffic could be transferred to the new network bit by bit, saving the most problematic areas for last.
The new networking equipment was one part of the project that didn't create problems, Martin said, despite the fact that Foundry was chosen as the supplier well before the Institute of Electrical and Electronics Engineers Inc. published the final 802.3ae 10 Gigabit standard in June. The Foundry implementation of the standard was "rock solid," she said.
Assessing vendors took about a year, Kimberlake said, and then various 10 Gigabit cards were tested before the Foundry card was chosen. Testing cards under various conditions was important to make sure they could deliver as claimed, he said.
"We looked for cards that were capable of wire speed performance," Kimberlake said. "We wanted to know that whatever we threw into the card chassis would come down the [fiber] at 10 gigabits."
The project, which could have taken as long as two years, was completed in six months, ending in June. The result is an end-to-end Ethernet network that the existing IT staff can operate and manage and that provides substantially higher performance and capability than the old system.
And, by using the fiber that's still left in the ground and adding more 10 Gigabit cards to the network switch chassis, the network's capacity could be increased to several hundred gigabits for little extra cost.
No formal cost/benefit analysis was done, but the estimated savings alone seem to justify the project. The new network cost about $2 million more than the ATM upgrade would have, plus $200,000 a year in ongoing maintenance. In return, Walter Reed will eventually save $2 million a year in telephone charges, along with the salaries of the three people needed to run the old Centrex phone system and the money spent training staff on both ATM and Ethernet. There will also be savings on the clinical side, Lewis said. The ability to provide regional centers with real-time consultations using specialists at Walter Reed will save those centers the cost of hiring their own local specialists.
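Using only the figures quoted in the article (staff and training savings are not quantified, so they are left out), a back-of-the-envelope payback calculation looks like this:

```python
# Rough payback sketch from the article's figures.
# Counts only the quantified phone savings; staff and training
# savings would shorten the payback further.

extra_capital_cost = 2_000_000   # cost above the ATM upgrade option, in dollars
annual_maintenance = 200_000     # ongoing maintenance per year
annual_phone_savings = 2_000_000 # Centrex charges eliminated by voice over IP

net_annual_savings = annual_phone_savings - annual_maintenance
payback_years = extra_capital_cost / net_annual_savings

print(f"Net savings: ${net_annual_savings:,}/year")
print(f"Payback: about {payback_years:.1f} years")
```

On those numbers alone, the extra spending over the ATM option pays for itself in a little over a year.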
And, in the final analysis, that's what this effort aimed to do, she said. In putting the 10 Gigabit Ethernet network in place, the goal was to "see where the medical side was going in the future and build for that."
Robinson is a freelance journalist based in Portland, Ore. He can be reached at [email protected]
Built for speed
Agency: Walter Reed Army Medical Center
Challenge: Upgrade the center's network so that medical staff can conduct research and provide real-time clinical consultations to remote centers around the world using voice, video and data telemedicine.
Solution: Foundry Networks Inc.'s IronWare switch/router using 10 Gigabit Ethernet modules.
Cost: About $5.5 million to $6 million.
Agency benefit: An immediate, substantial increase in network performance that provides the center's medical staff with the capability to use telemedicine, the potential for major future capacity increases for only incremental increases in cost, and once the voice-over-IP system is deployed, elimination of large payments to the phone service provider.