DOE raises the bar on supercomputing

Computationally intensive research creates insatiable demand for faster supercomputers

The Energy Department’s Oak Ridge National Laboratory and Cray announced a $200 million deal in June to complete the world’s most powerful supercomputer in 2008.

The supercomputer, which Cray nicknamed Baker, will use optimized Advanced Micro Devices dual-core Opteron processors to reach a peak speed of one petaflop: 1,000 teraflops, or 1,000 trillion floating-point operations/sec. In comparison, the average PC reaches speeds of about 0.0001 teraflops.
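The scale gap the article describes can be worked out directly from its own figures; a minimal sketch, using the article's 0.0001-teraflop estimate for an average PC:

```python
# Rough scale comparison based on the figures quoted in the article.
PETAFLOP_IN_TFLOPS = 1_000   # 1 petaflop = 1,000 teraflops
pc_tflops = 0.0001           # the article's estimate for an average PC

# How many times faster the petaflop machine is than a single PC.
ratio = PETAFLOP_IN_TFLOPS / pc_tflops
print(f"A petaflop system is roughly {ratio:,.0f}x faster than an average PC")
# Roughly 10,000,000x, by these figures.
```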

Later in June, DOE’s Lawrence Livermore National Laboratory and IBM announced they had deployed the highest-performing scientific application code on the world’s current most powerful supercomputer, Blue Gene. The code, dubbed Qbox, will help researchers run science simulations deemed essential to national security.

Researchers say the DOE labs’ race for greater supercomputing speed will generate new ideas and technologies for classified and unclassified science. “Intellectual competition creates a perpetual game of leap frog, where each new system eclipses the previous leader,” said Dan Reed, director of the Renaissance Computing Institute and a member of the President’s Council of Advisors on Science and Technology (PCAST).

“This friendly competition is both healthy and inevitable,” Reed added.

DOE’s weapons program requires large, detailed models and large computing capacity. In unclassified research, access to high-performance computing systems benefits climate and atmospheric modeling, quantum chemistry and physics, materials science, engineering, and manufacturing. PCAST is conducting an assessment of the federal government’s Networking and Information Technology Research and Development portfolio, which includes a multidecade road map for computational science.

“Software remains one of the great impediments in high-performance computing,” Reed said. “Our programming models remain very low level, software development costs are high, and great human effort is required to extract substantial fractions of peak performance from current systems.”

Reed said the solution is a sustained, coordinated software R&D program. “Computing really is the third pillar of science, and its potential is limited only by our imaginations,” he said.

Petascale systems, such as the one in development at Oak Ridge, will expand research opportunities. It happens with each new order-of-magnitude increase in computing power, Reed said. “There are already discussions of transpetascale systems,” he added.

Assembling systems with high-peak performance is relatively easy. The challenge for petascale systems is sustained performance rather than peak performance, Reed said. “We’re not there yet.”
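The sustained-versus-peak distinction Reed draws can be quantified as an efficiency ratio. A minimal sketch, using the article's 207.3-teraflop sustained figure for Qbox on Blue Gene; the peak figure used here is an outside assumption, not stated in the article:

```python
# Sustained vs. peak: the fraction of a machine's theoretical peak speed
# that a real application actually achieves.
sustained_tflops = 207.3   # Qbox on Blue Gene, per the article
peak_tflops = 367.0        # assumed theoretical peak for Blue Gene; not from the article

efficiency = sustained_tflops / peak_tflops
print(f"Sustained fraction of peak: {efficiency:.0%}")
```

Even a figure above half of peak is unusually good for a full scientific application, which is why sustained performance, not peak, is the harder target.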

Oak Ridge scientists will use the unclassified Cray supercomputer to solve problems in nanotechnology, biology and energy, Cray officials said. Industrial researchers will also get time on the system through a DOE program that grants academic and corporate institutions supercomputer access for computationally intensive research that has national interest. For instance, this year, guest researchers from Boeing and DreamWorks Animation SKG will run simulations to help design more efficient aircraft and improve computer animation, respectively.

“There’s an almost insatiable demand for computing power,” Cray spokesman Steve Conway said. “The more they can get, the better off the science is going to be.”

Lawrence Livermore’s Qbox operates at a level comparable to an online game with 300 million simultaneous players. With the Qbox application, Blue Gene achieved a sustained performance of 207.3 teraflops. Blue Gene, a classified machine, belongs to the National Nuclear Security Administration.

The IBM-built Blue Gene provides analysis that NNSA needs to safeguard the nuclear weapons stockpile without going underground to test the weapons, said Dimitri Kusnezov, who leads NNSA’s Advanced Simulation and Computing Program.

The code enhancements for Blue Gene are critical to performing predictive simulations of nuclear weapons, Kusnezov said. “These simulations are vital to ensuring the safety and reliability of our nuclear weapons stockpile.”

NNSA researchers are attempting to decipher the radioactive decay of materials buried decades ago. “We have to figure out what aging means and then put [the material] under extreme conditions and see if [it] behaves like we think it’s going to behave,” Kusnezov said.

“Before, we used to take [the materials] to Nevada and blow them up,” he added. “That’s why we’re pushing our computing so aggressively.”

Supercode named Qbox

Herb Schultz, Blue Gene product manager for IBM’s Deep Computing group, said the Qbox code will let scientists at the National Nuclear Security Administration simultaneously model more molecules than previously possible and discover never-before-observed molecular dynamics.

The Q in Qbox stands for quantum, which refers to the code’s quantum mechanical descriptions of electrons. The software is a type of first principles molecular dynamics (FPMD) code designed to visualize the properties of metals under extreme temperature and pressure.

Qbox is tailored for nuclear testing, but the underlying FPMD code can run complex models on an atomic scale in a number of other disciplines, such as solid-state physics and nanotechnology. The same code could be used to study human proteins.

“Such spin-off benefits often accompany focused programmatic efforts to foster technology,” said Dimitri Kusnezov, who leads NNSA’s Advanced Simulation and Computing Program. “This was certainly true for NASA during the years of the moon landing and is true today.”

IBM said the Qbox announcement should encourage other software vendors and researchers to develop groundbreaking software for the computing community. IBM shares its facilities with smaller software outfits that cannot afford to independently experiment with supercomputing software.

“We’ve invested lab time for independent software vendors to come in and work their codes in our facilities,” Schultz said. “We think we can stimulate interest with this development in other unclassified applications elsewhere.”
