DOE raises the bar on supercomputing
Computationally intensive research creates insatiable demand for faster supercomputers
- By Aliya Sternstein
- Aug 07, 2006
The Energy Department’s Oak Ridge National Laboratory and Cray announced a $200 million deal in June to complete the world’s most powerful supercomputer in 2008.
The supercomputer, which Cray nicknamed Baker, will use optimized Advanced Micro Devices dual-core Opteron processors to reach a peak speed of one petaflop, or 1,000 trillion floating-point operations/sec (1,000 teraflops). In comparison, the average PC reaches speeds of about 0.0001 teraflops.
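The unit comparisons above reduce to simple arithmetic. A minimal sketch, using the article's own figures (the 0.0001-teraflops average-PC estimate is the article's, not a measured benchmark):

```python
# Flops unit conversions behind the article's comparison.
TERAFLOP = 1e12   # floating-point operations per second
PETAFLOP = 1e15

baker_peak = 1 * PETAFLOP          # Baker's target peak speed
avg_pc = 0.0001 * TERAFLOP         # the article's average-PC estimate

# One petaflop is 1,000 teraflops.
print(baker_peak / TERAFLOP)       # 1000.0

# So the petascale machine matches roughly ten million such PCs.
print(baker_peak / avg_pc)         # 10000000.0
```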
Later in June, DOE’s Lawrence Livermore National Laboratory and IBM announced they had deployed the most powerful computer code yet on Blue Gene, currently the world’s most powerful supercomputer. The code, dubbed Qbox, will help researchers run science simulations deemed essential to national security.
Researchers say the DOE labs’ race for greater supercomputing speed will generate new ideas and technologies for classified and unclassified science. “Intellectual competition creates a perpetual game of leapfrog, where each new system eclipses the previous leader,” said Dan Reed, director of the Renaissance Computing Institute and a member of the President’s Council of Advisors on Science and Technology (PCAST).
“This friendly competition is both healthy and inevitable,” Reed added.
DOE’s weapons program requires large, detailed models and large computing capacity. In unclassified research, access to high-performance computing systems benefits climate and atmospheric modeling, quantum chemistry and physics, materials science, engineering, and manufacturing. PCAST is conducting an assessment of the federal government’s Networking and Information Technology Research and Development portfolio, which includes a multidecade road map for computational science.
“Software remains one of the great impediments in high-performance computing,” Reed said. “Our programming models remain very low level, software development costs are high, and great human effort is required to extract substantial fractions of peak performance from current systems.”
Reed said the solution is a sustained, coordinated software R&D program. “Computing really is the third pillar of science, and its potential is limited only by our imaginations,” he said.
Petascale systems, such as the one in development at Oak Ridge, will expand research opportunities, as happens with each new order-of-magnitude increase in computing power, Reed said. “There are already discussions of transpetascale systems,” he added.
Assembling systems with high peak performance is relatively easy. The challenge for petascale systems is sustained performance rather than peak performance, Reed said. “We’re not there yet.”
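The peak-versus-sustained gap Reed describes can be quantified as the fraction of theoretical peak an application actually delivers. A minimal sketch, using Qbox's 207.3-teraflops sustained figure from this article; the roughly 360-teraflops peak for Blue Gene is an assumption for illustration, not a figure from the article:

```python
def sustained_fraction(sustained_tf: float, peak_tf: float) -> float:
    """Fraction of a machine's theoretical peak that an
    application actually sustains."""
    return sustained_tf / peak_tf

# 207.3 TF sustained (article); ~360 TF peak (assumed for illustration).
print(round(sustained_fraction(207.3, 360.0), 2))  # 0.58
```

Even this unusually efficient run sustains only a bit over half of peak; Reed's point is that extracting such fractions takes "great human effort," and most codes do far worse.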
Oak Ridge scientists will use the unclassified Cray supercomputer to solve problems in nanotechnology, biology and energy, Cray officials said. Industrial researchers will also get time on the system through a DOE program that grants academic and corporate institutions supercomputer access for computationally intensive research in the national interest. For instance, this year, guest researchers from Boeing and DreamWorks Animation SKG will run simulations to help design more efficient aircraft and improve computer animation, respectively.
“There’s an almost insatiable demand for computing power,” Cray spokesman Steve Conway said. “The more they can get, the better off the science is going to be.”
Lawrence Livermore’s Qbox operates at a level comparable to an online game with 300 million simultaneous players. With the Qbox application, Blue Gene achieved a sustained performance of 207.3 teraflops. Blue Gene, a classified machine, belongs to the National Nuclear Security Administration.
The IBM-built Blue Gene provides analysis that NNSA needs to safeguard the nuclear weapons stockpile without going underground to test the weapons, said Dimitri Kusnezov, who leads NNSA’s Advanced Simulation and Computing Program.
The code enhancements for Blue Gene are critical to performing predictive simulations of nuclear weapons, Kusnezov said. “These simulations are vital to ensuring the safety and reliability of our nuclear weapons stockpile.”
NNSA researchers are attempting to decipher the radioactive decay of materials buried decades ago. “We have to figure out what aging means and then put [the material] under extreme conditions and see if [it] behaves like we think it’s going to behave,” Kusnezov said.
“Before, we used to take [the materials] to Nevada and blow them up,” he added. “That’s why we’re pushing our computing so aggressively.”