2005: 7 lessons from GAO
If there's a problem in government, you can bet the Government Accountability Office will eventually investigate it. From bad management to bad contracting, from failing to be up-to-date on cybersecurity to failing programs, the government's watchdog agency will likely investigate what happened and why.
Along the way, GAO will give program officials a chance to speak out, challenge the findings, correct them or promise to do better. That certainly was the case in 2005 as GAO tackled hundreds of problems involving the use and misuse of federal information technology. GAO's findings offer plenty of lessons for agencies to learn. Here is a look at some of those reports on IT issues, the good and the bad, with examples of how GAO's scrutiny changes agencies' behavior.
Lesson #1: Keep an eye on your contractors
GAO auditors found that only five of 24 executive branch agencies had developed policies for ensuring that federal contractors protect government information on computer networks, according to a report GAO released in May.
Federal agencies have few resources at their disposal for holding contractors accountable for the security of government information on systems and networks that contractors control, the auditors found. Three tools that agency officials use to oversee contractors — contracts, oversight policies and self-assessments — have been relatively ineffective at mitigating the risks posed by contractor operations, the report states.
Those risks include unnecessary exposure to worms and viruses, weak system access controls and unauthorized release or use of government information.
Lesson #2: Learn to deliver the bad news
Federal agencies need more detailed instructions to handle and report computer security threats, such as phishing, spyware and hacking, government auditors said in a June report.
GAO auditors found that most federal officials do not understand which computer security incidents they should report or how and to whom they should report them, even though such reporting is mandatory under the Federal Information Security Management Act.
As a result, the Homeland Security Department's U.S. Computer Emergency Readiness Team, which handles incident reporting, is unable to coordinate and respond to cyberthreats that target multiple federal agencies.
To remedy the lack of accurate and comprehensive reporting, the auditors recommended that Office of Management and Budget officials increase their oversight of agencies' efforts to identify, report and respond to emerging cybersecurity threats.
Lesson #3: Think in terms of budget trade-offs
In July, GAO admonished the Federal Aviation Administration for not divulging how belt-tightening efforts, needed to finish an overdue air traffic control modernization program, were affecting aviation safety systems.
GAO auditors recommended that FAA officials clearly identify trade-offs they are making to reach their budget targets by highlighting programs slated for funding increases and reductions. Without such information, according to GAO's report, lawmakers cannot evaluate the FAA's budget requests.
For decades, GAO's auditors have criticized the air traffic control modernization program for wasting taxpayer dollars through costly schedule and performance miscalculations.
A new FAA unit, the Air Traffic Organization (ATO), was created in 2004 to streamline the agency's acquisitions.
Auditors said ATO officials don't include all the pros and cons of cuts when they submit budget proposals for senior officials and lawmakers to review.
Lesson #4: Plan for trouble
GAO gave kudos to the National Archives and Records Administration for practicing good risk management in its Electronic Records Archives program.
The program has some weaknesses, but GAO declined to make any recommendations, saying the agency already had plans in place to address those issues.
NARA officials said they were aware of the risks of not forecasting the volume of e-records they might process now and in the future, which could lead them to miscalculate the archives' size and scalability specifications.
NARA also accepted GAO's criticism for not knowing whether they will save e-files in their original formats or migrate the files to easily accessible formats.
NARA officials' self-reported risks should help them achieve their goals, according to NARA and GAO officials. At the time of the report, NARA had achieved all major milestones on or ahead of schedule.
Lesson #5: Communicate, communicate, communicate
In July, GAO took the Defense Department to task for continuing development of its new National Security Personnel System without holding adequate discussions with various stakeholders.
DOD did not identify key people interested in the personnel reforms or their concerns, according to the auditors' report. Employees were not part of the DOD working groups that drafted the plan.
"Failure to adequately consider a wide variety of people and cultural issues can lead to unsuccessful transformations," the July GAO report states.
But GAO commended DOD for using many practices associated with successful organizational transformations. Auditors cited DOD's process for designing a system that department and Bush administration officials could support, and praised the guiding principles and performance parameters that shaped the new personnel system's design process.
GAO recommended that DOD also devise a way to evaluate the effect of the new personnel system after its implementation.
In contrast to the low grades DOD earned on its collaborative report card, the Environmental Protection Agency got a thumbs up from GAO for good collaboration when it created a cross-agency e-rulemaking initiative.
"Even when an agency's suggestion was not incorporated into the system design, [those agencies] acknowledged that e-Rulemaking officials treated their concerns fairly, completely, and they understood why the suggestion was rejected," the report states.
Lesson #6: Put it in writing, distribute it widely
When GAO gave the Census Bureau advice on managing the 2010 decennial census, one theme was prevalent: Don't just say it, document it.
Census officials had developed policies and procedures to successfully manage IT in several areas, but those policies were not fully and consistently followed, according to the auditors' report.
For example, the bureau has established executive-level investment boards but does not have written procedures for how those boards should operate and make decisions on IT spending. GAO recommended creating a comprehensive repository of up-to-date investment information accessible to decision-makers.
GAO suggested that bureau officials develop and implement criteria and document policies for overseeing all IT projects. The bureau should also establish a written policy endorsing and enforcing enterprise architecture, the auditors said.
Lesson #7: Collect data...and use it
Agencies could improve the federal contracting process by reporting more information about contractors involved in suspension and debarment cases, GAO said in a September report.
The additional reporting could make it harder for excluded contractors to continue getting new contracts in defiance of their status, the auditors wrote. GAO specifically recommended that a governmentwide database of information on exclusions, the Excluded Parties List System, be modified so that each excluded company's contractor identification number would become part of the company's database listing.
That number is a unique identifier, so its mandatory inclusion would make it harder for an excluded company to get new business under a different name. GAO also recommended that agencies be required to share with other agencies any information on administrative agreements between an agency and a company.
Data is worthless if no one uses it, GAO wrote in a September report.
GAO reported that agencies are doing a good job of collecting data to measure the effectiveness of their programs, as required by the Government Performance and Results Act of 1993. They collect more now than they did in 1997, when GAO conducted a similar review. But auditors found that federal managers have not progressed much beyond where they were in 1997 in using that data to make better management decisions.