Supercomputing takes NASA to infinity and beyond

From science to space exploration, the world’s fourth-fastest computer is an agency asset.

The role of the NASA Ames Research Center as the information technology leader for the new spaceship program signals one giant leap for U.S. supercomputing, researchers and federal officials say.

The research center at Moffett Field, Calif., is home to the supercomputer Columbia, a project of NASA’s Advanced Supercomputing Division. It ranks fourth on the Top 500 list of the world’s fastest computers.

Earlier this month, NASA officials announced that Ames is in charge of IT, software and thermal protection systems for the program to send a manned mission to the moon and then to Mars.

Because of the center's new spaceship duties and the agency's budget priorities, supercomputing funding is holding steady, said Eugene Tu, director of the Exploration Technology Directorate at NASA Ames.

Last fall, NASA Administrator Michael Griffin designated the Columbia project as an important asset and funded the program at the agency level instead of the program level, as it had been funded in the past.

In addition, the agency has recognized Columbia’s contribution to science, aeronautics and the new vision for space exploration. The division now has enough funding to upgrade the system in late 2007 with next-generation capabilities.

“Had we not gotten this stable funding and priority funding, we would have dropped down in our capabilities,” Tu said, explaining that the Ames center must return some leased components of the Silicon Graphics Inc.-built Columbia system in the fall of 2007.

The Columbia supercomputer is already heavily involved in the space exploration program. Its simulations are helping define design requirements for the spaceship and the vehicle used to launch it.

“These are tools that were not available back in the Apollo days,” Tu said.

The simulations run on processors that can capture more detail and compute more answers than ever before. That capability translates into more accurate renderings of a wide range of worst-case scenarios, from debris striking the vehicle to overheating during re-entry into the Earth's atmosphere.

Supercomputing can assist in spaceship construction by simulating the effects of those conditions on vehicle designs, according to SGI officials.

“With supercomputers, designers can simulate thousands of scenarios rather than 10 to 100 with smaller systems — greatly increasing the robustness and safety of a design,” said Michael Brown, SGI’s application segment manager for scientific research. “No matter how exhaustively pieces of a rocket are analyzed, there will always be undetected flaws that may even be within design parameters.”
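
To give a flavor of the scenario sweeps Brown describes, the sketch below shows how thousands of debris-strike cases might be farmed out across processor cores. It is written in Python for illustration only; simulate_debris_strike is a hypothetical stand-in for a real physics solver, not NASA or SGI code.

```python
# Illustrative sketch only: a parallel sweep over thousands of damage
# scenarios. simulate_debris_strike() is a hypothetical stand-in for a
# real physics code; it is not part of any NASA or SGI software.
import itertools
import random
from multiprocessing import Pool

def simulate_debris_strike(params):
    """Hypothetical placeholder: score one (mass, velocity, angle)
    scenario. A real run would invoke an impact-dynamics solver."""
    mass, velocity, angle = params
    # Toy damage metric; real codes compute this from first principles.
    return mass * velocity ** 2 * abs(angle) * random.uniform(0.9, 1.1)

if __name__ == "__main__":
    # Thousands of scenarios: every combination of mass, velocity, angle.
    masses = [0.001 * i for i in range(1, 21)]      # kg
    velocities = [100 * i for i in range(1, 11)]    # m/s
    angles = [5 * i for i in range(1, 19)]          # degrees
    scenarios = list(itertools.product(masses, velocities, angles))

    # Spread the sweep across all available cores, one scenario per task.
    with Pool() as pool:
        damage_scores = pool.map(simulate_debris_strike, scenarios)

    worst = max(zip(damage_scores, scenarios))
    print(f"{len(scenarios)} scenarios evaluated; worst case: {worst}")
```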

In addition, space debris and micrometeoroids can cause damage to a vehicle. Winds, precipitation and clouds create dynamics that can lead to icing and buffeting.

“It is important to analyze what happens to a design when this damage occurs and to develop — as much as possible — a design [that] maintains functionality and survivability in the face of flaws and damage,” Brown said.

NASA scientists are also modeling ascent scenarios that could force the crew to abort the mission.

“During the ascent, if something goes wrong, you must be able to abort the mission while keeping the crew safe,” said Rupak Biswas, acting division chief of NASA’s Advanced Supercomputing Division.

Columbia can also simulate what might happen to the spaceship’s heat shield during re-entry under various high-temperature scenarios.

The supercomputer’s high-resolution simulations give NASA scientists confidence that the answers they derive are accurate, Biswas said.

In 2007, Columbia’s output will become even more precise with the addition of new technologies, such as faster clocks and multicore processors.

Those processors fit multiple computational engines onto a single chip, thereby boosting speed without significantly adding to the supporting infrastructure. Compact efficiency is a must because Columbia is constrained by floor space, electrical power and cooling capacities.

“The industry’s push toward multiple processors enables you to pack in more computing power within the same real estate,” Biswas said.
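
As a generic illustration of that point, the short Python sketch below splits one workload across however many cores a chip exposes, then compares serial and parallel run times. Nothing in it is specific to Columbia or SGI’s software.

```python
# Generic illustration of multicore use: split one workload across all
# of a chip's cores. Nothing here is NASA- or SGI-specific.
import os
import time
from concurrent.futures import ProcessPoolExecutor

def busy_work(n):
    """CPU-bound stand-in for one slice of a simulation."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    cores = os.cpu_count() or 4
    slices = [2_000_000] * cores       # one work slice per core

    start = time.perf_counter()
    for s in slices:                   # serial: one core does everything
        busy_work(s)
    serial = time.perf_counter() - start

    start = time.perf_counter()
    with ProcessPoolExecutor() as ex:  # parallel: one slice per core
        list(ex.map(busy_work, slices))
    parallel = time.perf_counter() - start

    print(f"{cores} cores: serial {serial:.2f}s, parallel {parallel:.2f}s")
```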

Without such processors, NASA scientists can analyze only a limited set of variables at once. For example, researchers can model various shuttle designs, but they cannot simultaneously simulate the effects of weather conditions on a launch scenario.

“We do not have the power to bring in those multiple variables into one model,” Biswas said. “We are still modeling systems, but what we want to really model [are] systems of systems.... If you had a computer that was 10 times more powerful, then you’d be able to do that.”
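
A “systems of systems” run of the kind Biswas envisions couples separate models so that one discipline’s output feeds the next at each time step. The toy loop below sketches only that coupling pattern; both model functions are hypothetical placeholders rather than actual NASA codes.

```python
# Illustrative sketch of coupling two models into one "system of
# systems" run. Both model functions are hypothetical placeholders.

def weather_model(t):
    """Placeholder atmosphere model: wind speed (m/s) at time step t."""
    return 10.0 + 5.0 * (t % 3)

def vehicle_model(wind, load):
    """Placeholder structural model: updated aerodynamic load."""
    return 0.8 * load + 0.2 * wind

def coupled_launch_run(steps=10):
    """At each time step, feed the weather model's output into the
    vehicle model, so the two simulations evolve together."""
    load = 0.0
    history = []
    for t in range(steps):
        wind = weather_model(t)           # discipline 1: atmosphere
        load = vehicle_model(wind, load)  # discipline 2: structures
        history.append((t, wind, load))
    return history

for t, wind, load in coupled_launch_run():
    print(f"t={t:2d}  wind={wind:5.1f} m/s  load={load:6.2f}")
```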

Supercomputing experts say such integration of computational models across multiple disciplines represents a trend that will require continued federal funding.

“One need only look at the evolution of ocean, weather and climate modeling to see the progressive inclusion of human activities — for example, carbon emissions, sea ice — and higher-resolution, more detailed models,” said Dan Reed, director of the Renaissance Computing Institute and a member of the President’s Council of Advisors on Science and Technology. “There are multidisciplinary problems where even petascale computing will not be fully adequate to address all of the opportunities and challenges.”

NASA expects supercomputer to enable 14-day forecasts

In addition to space exploration, earth science will benefit from supercomputing at NASA’s Ames Research Center. Earth and space science account for nearly 50 percent of Columbia’s workload.

Columbia supports hurricane preparedness by modeling five-day forecasts for federal meteorologists at the National Oceanic and Atmospheric Administration.

In late 2007, NASA will upgrade the Columbia system to give the scientific community reliable 14-day forecasts, NASA officials said.