Girding for the grid

Grid computing is finally out of the lab. A product of six years of largely academic research, grid technology aims to coordinate large-scale resource sharing, making it possible to set up virtual organizations in which users in one group can tap the computers, networks and applications managed by other far-off groups.

Grids rely on IP and a common middleware layer. That layer — embodied by the open-source Globus Toolkit — provides protocols for enabling shared access to various computing resources.

Communication protocols, for example, permit data exchange among computer clusters, storage devices or scientific instruments. Resource protocols, meanwhile, manage grid transactions, determining the current load of a given resource and its usage policy.
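The resource-protocol idea — ask a resource what its current load and usage policy are, then decide where to send work — can be sketched in a few lines. This is an illustration only, with invented names (`ResourceStatus`, `query_resource`, `pick_least_loaded`) and a mock in-memory registry standing in for a grid directory service; the actual Globus protocols are network services and considerably more involved.

```python
from dataclasses import dataclass

@dataclass
class ResourceStatus:
    name: str
    load: float   # fraction of capacity in use, 0.0-1.0
    policy: str   # human-readable usage policy

def query_resource(registry: dict, name: str) -> ResourceStatus:
    """Look up a resource's advertised status in a local registry
    (a stand-in for what a grid directory service would report)."""
    return registry[name]

def pick_least_loaded(registry: dict) -> ResourceStatus:
    """A toy scheduling decision: choose the resource with the lowest load."""
    return min(registry.values(), key=lambda r: r.load)

# Mock registry of two clusters advertising load and usage policy.
registry = {
    "cluster-a": ResourceStatus("cluster-a", 0.85, "open to partner agencies"),
    "cluster-b": ResourceStatus("cluster-b", 0.30, "business hours only"),
}

print(pick_least_loaded(registry).name)  # cluster-b
```

The point of the split is that the communication protocols move the data, while resource protocols like this carry the metadata — load and policy — that makes cross-organization scheduling decisions possible.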

The Globus Project (www.globus.org) is the keeper of the code. This Linux-like approach to collaborative computing has taken off in the scientific and technical computing world. Ian Foster, head of the Distributed Systems Laboratory at Argonne National Laboratory's Mathematics and Computer Science Division and a leading figure in grid development, said that about 100 projects worldwide have embraced grid computing. Examples include the Network for Earthquake Engineering Simulation, sponsored by the National Science Foundation.


Earthquake engineering is one thing, but is the grid coming to a computer near you? Experts say this collaborative computing model does indeed have implications for mainstream federal information technology. They forecast wider deployment in 2003.

Foster sees "clear application" to federal IT. "In the context of current concerns, combining resources and expertise to address emergencies is a grid problem," he said. "So is sharing resources — data or otherwise — across agencies."

Mike Nelson, director of Internet technology and strategy at IBM Corp., said grids can help resource-strapped agencies obtain the computing power and applications they need. "I know from 10 years in government that there's never enough money to hire enough people to manage the systems and get the most up-to-date software," he said. A grid, however, offers resources that agencies don't have to manage on their own.

However, most observers agree that the grid needs work before it becomes a commonplace computing utility.

"Continued standardization and interoperability are prerequisites for broader adoption," said Dan Reed, director of the National Center for Supercomputing Applications (NCSA). "The technical and scientific computing domains serve as early test beds for grid technology, just as ARPAnet, NSFNet and other research networks served as test beds that later evolved into the Internet. We expect the same to happen with grid technology."

NCSA is one of four sites building NSF's TeraGrid (www.teragrid.org), which aims to link scientific computing centers via a 40 gigabits/sec pipe.

Wider use of grid technology depends on how quickly it can be "hardened" into a commercial platform. Nelson said grid computing is following a path similar to Linux in that regard. "The grid is probably two years behind where Linux is" in its evolution, he said.

Brand-name vendors are now backing the technology. For example, IBM and the Globus Project last month unveiled an initiative to integrate grid computing with Web services technologies such as the Extensible Markup Language (XML). That effort, called the Open Grid Services Architecture, will help push the grid beyond its scientific and technical origins, according to IBM. Microsoft Corp. and Platform Computing Inc., a distributed computing software vendor, are on board with the Open Grid Services Architecture as well.

"We are identifying problems and filling in the holes," Nelson said of efforts to bolster grid computing.

***

Select components of the Globus Toolkit for grid computing

* The Globus Resource Allocation Manager provides services for creating, monitoring and managing the resource allocation process.

* The Grid Security Infrastructure provides an authentication service that allows single sign-on and remote access.

* The Metacomputing Directory Service provides a uniform framework for accessing data on the system’s configuration and status.

* Global Access to Secondary Storage enables programs running at remote locations to read and write files on a local system.

* The Heartbeat Monitor allows systems administrators or users to detect the failure of system components or application processes.
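The failure-detection idea behind the last component can be illustrated with a short sketch: each component periodically reports a heartbeat, and anything silent for longer than a timeout is flagged as failed. The `HeartbeatMonitor` class and its methods are invented for this example and are not the toolkit's actual interface.

```python
class HeartbeatMonitor:
    """Toy failure detector: tracks the last heartbeat from each component."""

    def __init__(self, timeout: float):
        self.timeout = timeout   # seconds of silence before a component is presumed failed
        self.last_seen = {}

    def beat(self, component: str, now: float) -> None:
        """Record a heartbeat from a component at time `now`."""
        self.last_seen[component] = now

    def failed(self, now: float) -> list:
        """Return components whose last heartbeat is older than the timeout."""
        return [c for c, t in self.last_seen.items() if now - t > self.timeout]

monitor = HeartbeatMonitor(timeout=5.0)
monitor.beat("scheduler", now=0.0)
monitor.beat("storage", now=0.0)
monitor.beat("scheduler", now=4.0)   # scheduler checks in again; storage stays silent
print(monitor.failed(now=7.0))       # ['storage']
```

The real service must also cope with clock skew and lost messages, but the core contract — silence past a deadline means presumed failure — is the same.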

Moore is a freelance writer based in Chantilly, Va.

IN THIS SERIES

Introduction: "Emerging technologies"

Search technology: "The search continues"

VoiceXML: "A voice from the near future"

Handheld computers: "Handhelds in a new world order"

Wireless: "Breaking the tether"

Visualization: "Data analysis: Picture this"
