Grid accelerates research

Tools help deliver CERN data to labs worldwide

Whatever its fortunes in the marketplace, grid computing has been a great success in the research community. Fermi National Accelerator Laboratory, in Batavia, Ill., has been testing a grid network that will eventually distribute experimental data from the European laboratory for particle physics (CERN) in Geneva, Switzerland, to multiple research laboratories around the globe. Thanks to grid tools, data created at CERN can be distributed to and then analyzed at other facilities around the world.

While electronically shepherding large amounts of information from one location to another is a difficult problem in itself, the task grows even more complex with multiple recipients, said Ian Fisk, Fermi associate scientist.

In 2007, CERN will crank up the Large Hadron Collider, which will be the world’s largest particle accelerator. The physics community around this project wants to channel the collision results to labs worldwide, which could test out advanced physics hypotheses concerning supersymmetry, string theory and other theories.

This approach could tap the potential power of distributed computing. CERN itself has the most computing capacity of any laboratory involved in the project, yet that amounts to only 20 percent of the total computing capability involved globally. The remaining 80 percent is split across the other participating partners, Fisk said.

Grid tools are essential for the job, Fisk said, because they provide the storage interfaces. Fermilab uses Storage Resource Manager, grid middleware developed in part by Lawrence Berkeley National Laboratory. “The SRM interface allows us to describe that large group of servers as an interface,” Fisk said. CERN sends data out from multiple servers, and it is received by another large batch of servers at Fermilab. SRM also lends a hand in load balancing, traffic shaping, performance monitoring, authentication and resource usage accountability.
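To illustrate, an SRM transfer is typically requested against the logical endpoint rather than any particular server. The sketch below is hypothetical: the hostname, port and paths are illustrative, not Fermilab's actual configuration, and assume the `srmcp` client that ships with SRM-based middleware.

```shell
# Hypothetical srmcp invocation -- hostnames and paths are illustrative.
# The client asks the SRM endpoint for the file; the SRM layer selects
# the actual transfer server and protocol behind that single interface.
srmcp srm://srm.cern.example.org:8443/data/run01.root \
      file:////local/scratch/run01.root
```

Because the request names the SRM endpoint rather than an individual disk server, the middleware is free to balance the load across whichever servers in the pool are available.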

Grid software also presents uniform interfaces for local computing resources, Fisk said. Someone could submit a job-processing request using the Condor workload management system, developed by the University of Wisconsin. “The grid interface provides a consistent view of the batch system,” Fisk said. Fermilab also uses grid tools for resource monitoring and accounting. “Components of the Globus Toolkit itself provide us with the gatekeepers we use for the processing submission,” he said. “We have thousands of jobs a day through the Globus Toolkit.”
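A Condor job of this kind is described in a plain-text submit file and handed to the scheduler with `condor_submit`. The fragment below is a minimal sketch, assuming Condor-G's grid universe routing through a Globus gatekeeper; the gatekeeper hostname and file names are illustrative, not taken from Fermilab's setup.

```
# Hypothetical Condor submit description file (config fragment).
# The grid universe hands the job to a remote Globus gatekeeper,
# which queues it on that site's local batch system.
universe      = grid
grid_resource = gt2 gatekeeper.example.org/jobmanager-condor
executable    = analyze_events
arguments     = run01.root
output        = analyze.out
error         = analyze.err
log           = analyze.log
queue
```

From the physicist's point of view, the submit file looks the same whichever site ultimately runs the job; that is the “consistent view of the batch system” Fisk describes.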
