NASA computes climate change

Using supercomputers that are 10 times more powerful than those of just a few years ago, NASA scientists can now evaluate the global impact of natural and human-induced activities on the Earth's climate and more accurately predict future climate patterns.

William Feiereisen, chief of the NASA Advanced Supercomputing Division, said global climate change is one of the most important questions that the space agency is attempting to answer.

He said it is a three-part process that includes:

* Gathering data from satellites and weather stations.

* Unifying that data through modeling.

* Tying it all together "by being able to simulate these things with supercomputers."

"Until recently, we were unable to harness the amount of computing horsepower needed for something like that," Feiereisen said. But thanks to some new supercomputers from Silicon Graphics Inc., including a 512-processor SGI Origin 3800 system at NASA's Ames Research Center, global climate prediction, and humans' effect on the climate, is becoming much clearer.

"We can now make more accurate predictions about what we're doing to the climate of the Earth," he said.

Pressure is coming from the head of the space agency to keep Feiereisen and his colleagues on their toes. During a keynote address in June, NASA Administrator Daniel Goldin called on the private sector, educational institutions and government agencies to begin exploring and using next-generation, nontraditional computational models to meet the computing power needs of weather and climate-modeling programs.

"Modeling climate is the single toughest problem we have except for modeling aircraft and spacecraft," Goldin said. "Get to work," he demanded of those in attendance.

Feiereisen said that NASA is not alone in tackling the global climate question because the science has progressed to a point where no single organization could do it independently. In addition to Ames and NASA's Goddard Space Flight Center, many other government and private organizations are working in the same arena.

Despite the significant progress being made in supercomputing power and global climate modeling and prediction, Feiereisen said much work remains to be done. "We're trying to reach another 100 times improvement in performance in the next five years."

And as if that goal weren't ambitious enough, performance improvements of 1 million times will be needed in the next 10 to 15 years.

"We can easily foresee the need to answer those kinds of questions. But how to deliver the horsepower to Earth science models? That's going to be the challenge," Feiereisen said.

In other supercomputing news, the Arctic Region Supercomputing Center and the University of Alaska Fairbanks announced a new partnership July 31 with the Institute for Systems Biology in Seattle.

The affiliation will enable the centers to establish cross-institute faculty appointments and share information and technology. The agreement will advance the center and the university into the field of computational biology, while providing the institute with the computational power and experience to tackle its immense data sets.
