Time to raise networking expectations in government
- By Frank Konkel
- Feb 20, 2014
Greg Bell, division director of the Energy Sciences Network (ESnet) at the Energy Department, has a message for federal agencies: Networking matters.
ESnet, the first 100 gigabit-per-second network at a continental scale, connects 40 labs and facilities and more than 100 total networks. In science terms, Bell said, “faster data leads to faster discovery.” In federal agencies, faster data translates to “better fulfillment of mission” and a more positive experience for end users across myriad applications.
Bell suggested agencies step away from the old method of data transport, which he described as “a hard drive system plus FedEx.” That physical method of transfer – still common in government – almost always means a loss of efficiency.
“If you have a workflow that involves sharing data, combining or allowing others to access data, you don’t want FedEx,” said Bell, speaking at a briefing in Washington, D.C., on Feb. 20.
Bell then suggested a data transfer rate agencies should be able to meet.
“Can your users move one terabyte in an hour?” Bell said. “If not, please raise their expectations.”
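Bell's benchmark translates into a concrete sustained line rate. As a rough back-of-the-envelope sketch (the arithmetic here is illustrative, not from the briefing):

```python
# What sustained throughput does "one terabyte in an hour" require?
# Illustrative arithmetic only, using decimal units (1 TB = 10^12 bytes).

TERABYTE_BITS = 1e12 * 8      # 1 TB expressed in bits
SECONDS_PER_HOUR = 3600

required_gbps = TERABYTE_BITS / SECONDS_PER_HOUR / 1e9
print(f"Sustained rate needed: {required_gbps:.2f} Gbit/s")
```

That works out to roughly 2.2 gigabits per second of sustained, loss-free throughput – well within the nominal capacity of a 10 gigabit link, which is why configuration problems rather than raw capacity are often the real bottleneck.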
Common problems such as “naively configured firewalls, lossy networks” – caused by poor cabling, for example – and poorly tuned end systems hinder network speed. These problems are much simpler to fix than alternatives such as costly upgrades to the physical network.
ESnet is a unique use case among networks. It has transferred tens of petabytes of scientific information from behemoth science facilities like the Large Hadron Collider in Geneva, Switzerland, since 1990 using increasingly fast physical networks optimized for “massive science data flows,” Bell said.
ESnet is exploring a prototype 400 gigabit-per-second network and believes the scientific side of government is likely to see networks faster than 100 gigabits per second within the next 18 months.
Most civilian agencies do not yet require the network speeds that move science data to and from disparate supercomputing facilities, but data is growing at an exponential rate. That flood of data is going to require better networks in government, the largest data producer in the world.
“In the old world, we built business on what your network could do,” said Stephen Alexander, senior vice president and CTO of Ciena. “What we want to get to now is that whatever business you’re running, the network is responsive to it. The network is far more dynamic, far more real-time. It is time for the network to step up with compute and storage to build a better machine. And the machine needs to be infrastructure.”
Such a fundamental architectural change is “going to take some years to happen,” Alexander said. But it will happen, and the government will have to adapt or be left behind in a bottleneck.
Frank Konkel is a staff writer covering big data, mobile, open government and a range of science/technology issues. Connect with him on Twitter at @Frank_Konkel.