NSF taps two centers for continued funding

The National Science Foundation last month tapped two of its four supercomputing centers to receive continued funding as part of its revamped program to create a mega-computing environment linked by networks.

Two teams of academic and research institutions, led by the University of Illinois at Urbana-Champaign and the University of California, San Diego, were awarded contracts under the new Partnerships for Advanced Computational Infrastructure (PACI) program. Each PACI winner will receive a five-year grant that will not exceed $170 million. Both universities are home to national supercomputing centers that the foundation has funded since 1985.

Two other centers that have been funded by NSF since 1985 - the Cornell Theory Center at Cornell University and the Pittsburgh Supercomputing Center at Carnegie Mellon University - submitted unsuccessful bids under the program. Each of these centers will have up to two years to phase out its program and will receive up to $11 million through May 1999. Under the previous NSF supercomputing centers program, each center received $16 million per year.

Robert Borchers, NSF division director for Advanced Scientific Computing, said the selection of the winning teams was based on four criteria: quality of the proposed computing infrastructure, quality of proposed partners, quality of proposed management, and cost-sharing principles within the proposals.

Despite the reduction in the number of centers that will receive funding, high-performance computing research will not suffer, sources said last week. "Supercomputers are not being de-emphasized," said Larry Smarr, director of the National Center for Supercomputing Applications at the University of Illinois. "They are being re-emphasized. We will be able to have a faster upgrade path...than we would have been able to have with four centers. It's simple arithmetic."

The move to cut the number of centers funded by NSF was approved in December 1995 by the National Science Board, NSF's highest decision-making body. It was first suggested, however, in a September 1995 task force report. That report, known as the Hayes Report, concluded that recompetition would offer a "fair and orderly" method of migrating to a smaller number of supercomputing sites, given the prospect that NSF's future budget dollars for the centers program would likely support only two or three centers.

The group led by the University of Illinois, called the National Computational Science Alliance, plans to create a distributed computing environment to serve as a prototype of an advanced computational infrastructure that enables "bleeding edge" computational research. The group plans to integrate many computational, visualization and information resources into a national-scale "grid." This grid will link powerful high-performance architectures housed at leading-edge research institutions to mid-range versions of those architectures and, in turn, to end-user workstations. It will also connect dozens of visualization and virtual reality displays, massive data stores and remote instruments. Software developed by the alliance will work as a glue to unify the grid to solve complex problems, according to the group's proposal.

The alliance plans to work with major high-performance computing vendors - including Silicon Graphics Inc./Cray Research Inc., Hewlett-Packard Co./Convex Technology Center and IBM Corp. - to construct this grid. As part of its project, the group will provide access to the nation's first NT/Intel scalable teraflop supercomputer (one that can perform one trillion computations per second) and the world's first "visual supercomputer" infrastructure.

Smarr said scientists tackling increasingly complex problems, such as mapping human genetic patterns and modeling environmental cleanups, will need this type of advanced computational infrastructure to bind multiple systems - and researchers - together. "These are huge problems, and they're not going to be solved by one person," Smarr said. "It's going to take these virtual teams of scientists working together."

The National Partnership for Advanced Computational Infrastructure, led by the University of California, San Diego, plans to develop a national-scale metacomputing environment with diverse hardware and several high-end sites. The group's major focus will include data-intensive computing, digital libraries and the manipulation of large data sets across many disciplines.

The partnership plans to acquire a teraflop scalable system, which will be installed at the San Diego Supercomputer Center, as well as mid-range systems at two partner institutions: the University of Texas and the University of Michigan. "In the future we'll be able to have a scientist seamlessly acquire data from an institution at one place [and]...massage that data at another place...in total assemble something far more powerful [than 10 years ago]," said Sidney Karin, director of the National Partnership for Advanced Computational Infrastructure.
