5 pitfalls to avoid with performance management dashboards

Dashboards promise greater visibility into day-to-day agency operations, but poor data and lack of user involvement can spoil projects.

Performance management dashboards have been drawing a lot of interest lately, including from federal Chief Information Officer Vivek Kundra. He believes that gathering key performance data and presenting it in an easy-to-grasp fashion on a dashboard can help the public keep better tabs on the government and help agency decision-makers manage resources more wisely.

But experienced dashboard users say achieving success with those management tools is not as quick or easy as it might seem.

Under the hood

Here's a look at the inner workings of the dashboards profiled in this story.

Agency: South Carolina Office of Research and Statistics
Back-office components: Microsoft SQL Server, SAS software and ESRI software for geographic information systems
Presentation layer: Panorama Software’s NovaView

Agency: Tallahassee, Fla., Police Department
Back-office components: Open Text Connectivity Solutions Group’s Hummingbird business intelligence system, which extracts data from computer-aided dispatch and records management systems
Presentation layer: MyDials, provided on software-as-a-service basis

Agency: Agriculture Department’s Cooperative State Research, Education and Extension Service
Back-office components: Oracle Database
Presentation layer: Oracle’s Portal Development Environment

Agency: Washington State Transportation Improvement Board
Back-office components: Microsoft SQL Server, SAP BusinessObjects’ Xcelsius for data handling
Presentation layer: Custom dashboard running on ColdFusion server

Washington state’s Transportation Improvement Board launched a performance management dashboard in 2003 that monitors about 400 grants it gives to local agencies to fund road repairs and new construction. The board credits the tool with helping it align grants with available funding, speed disbursements and shrink the number of delayed projects. But getting to that point has not been a simple matter of dumping a few data fields into a colorful graphic.

“We’ve had to make dozens and dozens of data collection changes — most minor, some significant,” said Steve Gorcester, executive director of the board. “We measure things today automatically in real time that we didn’t even count four or five years ago.”

Identifying and then collecting meaningful data are among the first considerations for agencies interested in using dashboards. Along the way, officials can encounter a number of pitfalls — some tied to data and others related to development and deployment.

The issues that can reduce a promising project to shelfware loom particularly large now that more agencies are building dashboards to meet the Obama administration’s goals for transparency and the American Recovery and Reinvestment Act’s mandates for accountability.

Here are the five most common mistakes agencies make when creating dashboards — and how to avoid them.

1. Faulty data

Dashboards aim to provide an easy-on-the-eyes graphical presentation that lets managers readily grasp key performance trends. But all the fancy design in the world won’t compensate for faulty data.

“Dashboards are really only as good as the data,” said Ramon Barquin, chief executive officer of Barquin International, a business intelligence firm with clients in the government and private sectors. “So you want to be absolutely sure before you put the dashboard out into production that you have done all of the due diligence vis-a-vis the data.”

Unfortunately, agencies often overlook data quality when they’re scrambling to create dashboards. Those shortcomings surface later when managers examine the data more closely to glean performance trends, Gorcester said. Charts that show wild swings in productivity or funding trends that don’t exist are symptomatic of a troubled dashboard.

Such dashboards soon acquire a bad reputation, although the problem stems from poor data rather than the tool or its functionality, Barquin said.

To avoid that fate, agencies should take time early in the project to assess the soundness of the sources that will supply data to the dashboard. It’s often helpful to agree on standard data quality tools before starting a dashboard project.
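For example, a basic data-quality pass over a source extract can be scripted before any dashboard work begins. The sketch below, written in Python with pandas, checks completeness, duplicate keys, date validity and obviously impossible values; the file and column names (grants_extract.csv, grant_id, award_date, amount) are hypothetical, chosen only to illustrate the kinds of checks involved.

```python
# Minimal data-quality checks to run on a source extract before it feeds a dashboard.
# Assumes a grants extract in CSV form; file and column names are hypothetical.
import pandas as pd

df = pd.read_csv("grants_extract.csv")

# 1. Completeness: share of missing values per column.
missing = df.isna().mean().sort_values(ascending=False)
print("Fraction missing per column:\n", missing)

# 2. Uniqueness: duplicate records on the business key.
dupes = df.duplicated(subset=["grant_id"]).sum()
print(f"Duplicate grant_id rows: {dupes}")

# 3. Validity: dates that fail to parse or fall outside a plausible range.
dates = pd.to_datetime(df["award_date"], errors="coerce")
print(f"Unparseable award dates: {dates.isna().sum()}")
out_of_range = ((dates < "1990-01-01") | (dates > pd.Timestamp.today())).sum()
print(f"Awards dated before 1990 or in the future: {out_of_range}")

# 4. Reasonableness: obviously impossible amounts.
print(f"Negative grant amounts: {(df['amount'] < 0).sum()}")
```

Checks like these are cheap to run and tend to surface exactly the kinds of anomalies that otherwise show up later as phantom swings on a dashboard chart.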

2. The wrong data

Collecting the wrong information can also be a problem. Gorcester said a common thread in his discussions with agency officials is a tendency to use whatever data they have on hand rather than figuring out what data they need to create useful metrics.

“A big thing in performance management … is whether you are counting the right stuff,” Gorcester said.

For example, counting the number of phone calls an organization receives might be useful information for a call center, but it isn’t particularly meaningful for the typical government agency.

The Transportation Improvement Board uses process mapping to help point it toward data that is worth collecting. Officials use a whiteboard to diagram every step of their workflows. The mapping reveals points at which yes-or-no decisions must be made or activities branch into alternative routes. Those places tend to indicate points at which data collection should occur, Gorcester said.

“Understanding what types of performance information give you useful decision-making information is a big part of the art,” he said.

He cited the case of a customer who submitted an approval request for a construction project but didn’t hear back from the agency. When board officials looked into the matter, they discovered that they knew when such requests were processed but not when they arrived or how long they stayed in the queue awaiting action.

So the board began tracking inbound e-mail, USPS mail and phone requests. Agency staff members enter each request into a database that logs its arrival and clocks it out upon completion. Meanwhile, the board’s engineering team came up with a reasonable turnaround time for handling requests.

With that time-in-queue data now available, the agency was able to add a transaction-processing chart to its dashboard. The new element alerts managers to transactions that exceed the established time frame and has sparked competition within the agency to shorten turnaround times.
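The mechanics behind such a chart can be simple. The following Python sketch shows one way to compute business days in queue and flag requests that exceed a turnaround target; it is an illustration only, not the board’s actual system, and the 10-business-day target, the reporting date and the table layout are assumptions.

```python
# Sketch of a time-in-queue calculation behind a transaction-processing chart.
# The request log, the turnaround target and the reporting date are hypothetical.
import numpy as np
import pandas as pd

TARGET_BUSINESS_DAYS = 10  # assumed turnaround target, set here for illustration

requests = pd.DataFrame({
    "request_id": [101, 102, 103],
    "received":   pd.to_datetime(["2009-06-01", "2009-06-08", "2009-06-15"]),
    "completed":  pd.to_datetime(["2009-06-05", pd.NaT, pd.NaT]),
})

as_of = pd.Timestamp("2009-07-01")  # reporting date for this example
# Open requests are measured against the reporting date; closed ones against completion.
end = requests["completed"].fillna(as_of)

# Business days each request has spent (or spent) in the queue.
requests["days_in_queue"] = np.busday_count(
    requests["received"].values.astype("datetime64[D]"),
    end.values.astype("datetime64[D]"),
)

# Flag anything past the target so the dashboard can highlight it for managers.
requests["overdue"] = requests["days_in_queue"] > TARGET_BUSINESS_DAYS
print(requests[["request_id", "days_in_queue", "overdue"]])
```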

3. Too much data

Lack of data generally isn’t the problem at South Carolina’s Office of Research and Statistics, part of the state’s Budget and Control Board. The office maintains a data warehouse for more than 20 agencies and the state’s acute care hospitals. It also works with state, nonprofit and some private-sector customers on data integration, warehousing and dashboard solutions.

“It’s an enormous data collection we have to work with,” said David Patterson, chief of the Office of Research and Statistics’ Health and Demographics Section.

When it comes to data on the dashboard, more is not better. Too much or improper data can distract or frustrate users.

The Office of Research and Statistics consults with customers to pick data to include in dashboard projects. Patterson likened that process to a joint application design session in the software development world. The office’s technologists collaborate with customers to determine the specific data elements, presentation layer and assignment of roles for role-based access.
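Role-based access at the presentation layer often comes down to mapping each role to the data elements it is allowed to see. The short Python sketch below illustrates the idea; the role names and columns are hypothetical and are not the office’s actual scheme.

```python
# Illustrative sketch of role-based data selection for a dashboard back end.
# Role names and data elements are hypothetical.
import pandas as pd

# Columns each role is permitted to see on the dashboard.
ROLE_COLUMNS = {
    "agency_analyst": ["county", "admissions", "avg_length_of_stay", "cost_per_case"],
    "public_viewer":  ["county", "admissions"],
}

def dashboard_view(data: pd.DataFrame, role: str) -> pd.DataFrame:
    """Return only the data elements the given role is allowed to view."""
    allowed = ROLE_COLUMNS.get(role)
    if allowed is None:
        raise PermissionError(f"Unknown role: {role}")
    return data[allowed]

# Example: a public viewer sees aggregate counts but not cost detail.
sample = pd.DataFrame({
    "county": ["Richland", "Charleston"],
    "admissions": [1240, 980],
    "avg_length_of_stay": [4.2, 3.8],
    "cost_per_case": [8120.0, 7650.0],
})
print(dashboard_view(sample, "public_viewer"))
```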

4. Overly long development process

The construction of a dashboard raises another set of potential hurdles, including development approaches that don’t get functionality to users quickly enough.

Joseph Barbano, project manager for the Research, Education, and Economics Information System-Leadership Management Dashboard at the Agriculture Department, said traditional development methods can stall a dashboard effort. USDA uses the technology to track projects funded through its Cooperative State Research, Education and Extension Service and makes it accessible to the service’s staff and state partners, such as land-grant universities.

In past years, the government’s application development process involved waiting for all requirements to be identified before proceeding with a massive project that could take years to complete.

“Users would lose interest in it because they couldn’t see anything they could use on a periodic basis,” Barbano said.

However, USDA’s dashboard was built using agile programming and rapid prototyping techniques. The project team worked with a core group of users to determine high-level requirements and quickly demonstrated potential approaches to give the users a sense of the dashboard’s direction. After refining the tool, the developers released a prototype dashboard in late 2006; the first production release came out in July 2007, and developers continually add new functions.

The idea behind rapid prototyping is to “release parts of this dashboard quickly so you can build momentum for future releases of the dashboard,” Barbano said.

5. Neglected users

Dashboards are dynamic projects, which means that as users’ needs evolve, the management tool must do the same.

“If every time I go there, it isn’t telling me anything new, usage is just going to fall off,” said Greg Frost, executive services director at the Tallahassee, Fla., Police Department.

The department’s senior managers use a dashboard to track a variety of data, including response times and staffing levels.

“You have to continually meet with users and continually … add more content or it becomes a dead project,” Barbano said.

Users have flooded USDA with requests for the information they would like to see displayed on the Leadership Management Dashboard. The core user group helps prioritize those requests for inclusion in new releases, which occur about every five months.

Tallahassee uses a software-as-a-service offering called MyDials. A city systems administrator can customize the dashboard and add new features.

A dashboard’s presentation layer might also require periodic updates. Too many objects on the dashboard make for a cluttered and confusing display. Although it's only a little more than two years old, the Leadership Management Dashboard’s interface is undergoing a redesign.

“We got to the point where we needed to update our interface with a new look and feel,” said Bill Bristow, data manager at USDA’s Research, Education and Economics Information System.

The new interface will make it easier for users to get to the data they need, which is the ultimate measure of a dashboard’s success.