Committee calls for $1B for IT research
- By Elana Varon
- Aug 16, 1998
Too much of the information technology research sponsored by federal agencies serves short-term, mission-specific goals, and the government should spend more money studying fundamental IT problems, according to an interim report by the President's Information Technology Advisory Committee.
To maintain its competitiveness in the global marketplace and safeguard national security, the United States must improve the reliability of software, build new tools that are capable of managing huge, complex networks and develop faster computers, the report concluded. The panel recommended increasing the federal IT research budget by $1 billion over the next five years, although most of the spending recommendations will be detailed in the final report due in February.
Meanwhile, President Clinton said the Office of Science and Technology Policy would use the report to devise a new IT research plan for the fiscal 2000 budget.
But advocates for more research money inside and outside government agreed last week that it could be hard to find the funds called for by the panel. The report noted there has been "a marked shift toward applied R&D" by federal agencies since the 1980s because of budget constraints.
"I think everyone concedes that it is important that agencies meet their near-term mission requirements,'' said Kay Howell, director of the National Coordination Office for Computing, Information and Communications. "There will have to be tradeoffs somewhere else. It's going to be a hard sell, not because I don't think people understand this is an important technology but because there are many priorities out there."
Gary Johnson, a professor at George Mason University and a member of the Institute of Electrical and Electronics Engineers' Communications and Information Policy Committee, said another $1 billion for research is not enough, but getting that much "would be a hard sell" for any program. "It may depend on a lot of factors that have nothing to do with the technical arguments," he said.
Among other proposals, the panel said federal agencies should fund more basic research into software development, saying more "scientifically sound" methods for testing software are needed to make it secure and reliable. As an example, the report cited the difficulty the government has faced upgrading large systems for the Federal Aviation Administration and the Internal Revenue Service.
Although the failures of these modernization efforts have been blamed on poor management, Susan Graham, a computer science professor at the University of California at Berkeley and a member of the advisory group, said the absence of good software development practices and tools contributes to agencies' problems managing complex system upgrades.
"These are systems you want to modify while they are in place,'' Graham said. "There needs to be some technology to maintain the integrity of the system as it changes under you. We have ways of doing all those things now, but they are not methods that scale up adequately. It's too hard, it takes too many people, and there are too many ways to go wrong.''
The report said that by focusing too much on short-term, application-oriented research, agencies are short-changing the U.S. lead in technology development. "Today we are reaping the benefits of technical advancements paved by past federal investments," according to the report. "Our future ability to harness the power and promise of this technology depends upon our willingness to maintain an adequate research base."
However, the panel may be drawing too fine a distinction between basic and applied research. According to GMU's Johnson, when it comes to software development "one of the real challenges is to get the problem statement correct." Advances in this field are closely related to how they are applied, he said, "so I would encourage software developers to stay in continuing contact with applications people."
A copy of the report is available at www.hpcc.gov/ac/interim.