Doing dashboards right
- By Doug Beizer
- Oct 30, 2008
Software dashboards that consolidate and report performance metrics about programs, business processes or information technology infrastructure can be valuable tools for agency managers and decision-makers. Dashboards can also become seldom-used frills if executives decide they don’t deliver the information they need to do their jobs.
Experts say that a dashboard’s success or failure boils down to a few key elements.

The right metrics
“Before you even go buy the technology, know what you are trying to accomplish and examine if you’re going to be able to do it,” said Greg Cohen, chief of business management and metrics at the Coast Guard’s Acquisition Directorate. “And then you have to stick with it and take action on it, or people won’t take their metrics seriously.”

The right tool
It can help immensely if the dashboard software can incorporate many different data types. Technologies that do not require a lot of up-front schema knowledge or data modeling tend to work the best, said Matt Eichner, vice president of strategic development at Endeca.
Older dashboard systems required data modeling that had to anticipate what questions the dashboard users might ask. New systems are capable of finding data relationships automatically.
“In the intelligence and defense communities, the ability to do off-road analysis is a fundamental requirement,” Eichner said. “So dashboards must provide exploratory capabilities that any person can use. They must provide high-level summaries but also guide users through the next step of drilling into the individual pieces behind them.”
Agencies might also want a dashboard that can collect data automatically. Manual collection increases costs dramatically and can become a major burden. Because dashboards are only as good as the data going into them, determining how the data will be collected is critical.

The right processes
Starting with a small trial project is a good way for agencies to decide whether dashboards are a good option for them and work out the kinks before a bigger implementation, said Shawn O’Rourke, vice president of risk management services for American Systems, a government technology solutions provider.
“Do not try to capture all your information on the dashboard the first time out,” O’Rourke said.
Also, don’t underestimate the potential resistance to a program that will implicitly monitor people’s performance, O’Rourke said.
“When instituting a dashboard, it is important to engage the organization to communicate the dashboard’s purpose within the organization,” he said. “Open communication and clear goals will help ensure the trust in the program and its operations. Also, test driving the dashboard before it goes into full operation will help bring attention to any potential problems.”
Doug Beizer is a staff writer for Federal Computer Week.