In state and local governments, a consolidated view of all the data and records generated by individual constituents is a distant, idealized goal. Yet it is a service vision that holds a powerful grasp on government executives. Like corporations that have integrated large chunks of customer data, public-sector agencies hope a complete data view will enable them to improve service levels, speed turnaround times and even polish their public image.
In reality, state and local governments face enormous technical, policy, and financial obstacles to integrating data from different systems and agencies. First, there are the inherent difficulties of combining data from many different, older, stovepiped systems. On top of that is the challenge of justifying the need for consolidation, which often spurs public fears that data could fall into the wrong hands and be misused.
As a result, most data consolidation in the state and local government marketplace is practical and piecemeal. "The issue of data consolidation or integration is coming up only on an application-by-application basis," said Rick Knode, vice president and general manager of operations and telecommunications for the San Diego Data Processing Center, which is wrestling with how best to create a more unified data architecture.
To help the process, a new generation of software tools has been introduced to link databases. If, for example, the San Diego center replaces a mainframe application with a client/server Oracle Corp. database application and needs to provide World Wide Web views of the old CICS or IMS data and the new Oracle data, "we look for middleware tools that can help," Knode said.
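The middleware approach Knode describes can be sketched as a facade over two back ends: a legacy record source (such as exported CICS or IMS data) and a modern relational one, with the Web tier querying only the facade. All class and field names below are hypothetical illustrations, not any vendor's actual API.

```python
class LegacyAdapter:
    """Wraps flat mainframe-style records keyed by legacy field names."""
    def __init__(self, records):
        self._records = records  # list of dicts, e.g. {"CIT-ID": ..., "LIC-NO": ...}

    def find(self, citizen_id):
        return [r for r in self._records if r["CIT-ID"] == citizen_id]


class RelationalAdapter:
    """Wraps rows from a modern relational database."""
    def __init__(self, rows):
        self._rows = rows  # list of dicts keyed by column names

    def find(self, citizen_id):
        return [r for r in self._rows if r["citizen_id"] == citizen_id]


class UnifiedView:
    """Merges both sources into one normalized record list for the Web tier."""
    def __init__(self, legacy, relational):
        self.legacy = legacy
        self.relational = relational

    def records_for(self, citizen_id):
        merged = []
        # Map each source's field names onto one shared schema.
        for r in self.legacy.find(citizen_id):
            merged.append({"citizen_id": r["CIT-ID"], "license": r["LIC-NO"]})
        for r in self.relational.find(citizen_id):
            merged.append({"citizen_id": r["citizen_id"], "license": r["license"]})
        return merged
```

The point of the pattern is that neither back end has to change: the adapter layer absorbs the differences in field names and record formats, which is exactly the niche the middleware tools mentioned above fill.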
Another common-sense approach to data consolidation is to build a centralized "data dictionary" and allow each agency to control access to its own data.
That's what North Carolina's statewide Federated Data Initiative is all about. For the past six months, state IT planners have been working on a centralized data dictionary that stores information defining, for example, whether tomatoes are fruits or vegetables and whether the state serves families or households.
"Many think this data dictionary effort is an esoteric project, but in reality it's critical," said Emilie Schmidt, chief technology officer for North Carolina's Information Technology Services. "So many data warehouses have failed. We see the need to set up the data dictionary so accurate comparisons of data from different systems can be made, and data can be shared across agencies."
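A centralized data dictionary of the kind North Carolina is building can be pictured, in miniature, as a map from each canonical term to its agreed definition plus the field name each agency uses locally. The terms, agencies and field names below are invented for illustration only.

```python
# Toy data dictionary: one shared definition per term, plus per-agency
# field names so cross-agency queries can be translated consistently.
data_dictionary = {
    "household": {
        "definition": "All persons occupying one housing unit",
        "agency_fields": {
            "revenue": "HSHLD_CD",          # legacy tax system field
            "social_services": "family_unit",  # case-management system field
        },
    },
    "tomato": {
        "definition": "Classified as a vegetable for program reporting",
        "agency_fields": {"agriculture": "CROP_TYPE"},
    },
}


def local_field(term, agency):
    """Translate a shared term into the field name an agency's system uses."""
    return data_dictionary[term]["agency_fields"][agency]
```

With such a map in place, a program in one agency can query another agency's system by canonical term rather than by guessing at local field names, which is what makes comparisons across systems trustworthy.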
In one case, the state's wildlife and fisheries agency is writing a program using the data dictionary to access specific information in the Motor Vehicle Administration's database. The goal is to speed the process of granting hunting and fishing licenses. In another case, the state's prison system may need access to information stored in court and law enforcement systems. Both applications cross agencies and create a consolidated view of data pertaining to individuals.
The Metadata Mindset
Data dictionaries are a way to tie in data that was created without any thought to single or consolidated views. Indeed, most state and local government agencies have created their own data definitions autonomously. Inside many agencies there may be different definitions for the same data, depending on the system used.
"Today it is nearly impossible to ask two people in different state agencies if two 20-hour employees are equal to one full-timer or two part-timers," said Mike Schiff, director of data warehousing strategies for Current Analysis Inc., a Sterling, Va., IT market research firm. "And without a common set of definitions for what makes a full-time employee, it's easy to add apples and oranges. In order to make meaningful measurements from data stored in operational systems, you must have common data definitions."
For almost any type of organization, this translates to creating a repository of metadata, the data about data. In simple terms, metadata explains what specific data means. If a state or local government's metadata isn't integrated, the government can't compare apples to apples. For example, some states might want to withhold state tax refunds from parents who owe child support. In order to do this, the state needs an accurate, consolidated view of each citizen, at least across child welfare and tax or revenue departments.
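Schiff's apples-and-oranges problem can be shown with a few lines of arithmetic. Two agencies report different head counts, but once both are normalized to a shared definition of a full-time equivalent (40 hours a week equals 1.0 FTE here, an assumed convention), the figures become directly comparable.

```python
def full_time_equivalents(weekly_hours, hours_per_fte=40):
    """Convert a list of weekly-hour figures into a single FTE total."""
    return sum(weekly_hours) / hours_per_fte


agency_a = [40, 40, 20, 20]  # two full-timers plus two 20-hour workers
agency_b = [40, 40, 40]      # three full-timers

# Head counts differ (4 vs. 3), but FTE totals are directly comparable.
print(full_time_equivalents(agency_a))  # 3.0
print(full_time_equivalents(agency_b))  # 3.0
```

Without the shared `hours_per_fte` definition, agency A would report four employees and agency B three, and a statewide total built from those numbers would be meaningless.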
Next, Enterprise Apps
Increasingly, software, database and tool developers are recognizing the need to integrate metadata. As a result, the companies are creating enterprise environments geared toward integration. "Just a year ago, they each had a strong data-warehousing message; now they are promoting the concept of enterprise application integration," said Jeanine Fournier, an analyst with the Boston-based Aberdeen Group.
Among the more important metadata developments in the marketplace, analysts said, are Microsoft Corp. joining the Metadata Coalition and donating its information model source code, and Ardent Software Inc.'s acquisition of Dovetail Software Inc., developer of a leading data transformation tool. Such actions could help spur metadata application development, they said.
Customers also are looking for a single vendor to fill their data analysis needs. Both IBM Corp.'s DB2, which uses tools from Information Builders Inc., and Oracle's Datawarehousebuilder II use several different analytical tools in one package.
Analytical tools from smaller developers are aimed at data extraction, transformation, migration and integration. Suppliers and products in this space include Active Software, CrossWorlds Software Inc., Oberon Software, Vitria Technology Inc., Tibco, IBM's MQ Integrator, Evolutionary Technologies International Inc., Constellar Corp., Enterworks and Prism. North Carolina, for example, is using a middleware tool from Blue Angel Software to build its metadata repository because it complies with the standard for public record data formats.
In choosing a software solution for data integration, agencies should closely examine what they want to achieve and compare that to the tools available.
Aberdeen Group's Fournier also suggested that agencies check the references of suppliers, bring them in for pilot projects and find out about their partners and strategic relationships, especially whether they have any working relationships with the major database providers. "Enterprise application integration is all about strong working partnerships among small and large software suppliers," Fournier said.
The Internet will continue to drive demand for data integration, according to analysts. Each government's desire to accomplish specific goals, such as welfare reform, Year 2000 remediation and improved responsiveness to constituents, will drive the move to a consolidated view of data. But it may take years for ever-pressured governmental bodies to build the consensus necessary to create a unified view of each constituent.
-- Barbara DePompa Reimers is a free-lance writer based in Germantown, Md. She can be reached at firstname.lastname@example.org.
Top 10 Tips for Consolidating Data
10. Pick a small project team from the technical and business ends of your organization.
9. Set practical goals. No single integration project can supply all of the information needed.
8. Pick a project for which results can be delivered in less than six months.
7. Use available tools whenever possible.
6. Watch out for dirty data. Data cleansing tools and accuracy checks are critical.
5. Judge suppliers based on how scalable and reusable their tools are.
4. Be choosy about the data you want to access.
3. Plan for growth, as the system, the users accessing it and their demands will all grow.
2. Building a consolidated view of data is a journey, not a destination.
1. Work to resolve turf wars before they arise.
Source: Enterworks Inc., Aberdeen Group
* * * * * * * * * *
New York Data Chase
Since November 1997, the state of New York has been bringing data together from multiple sources. Working via a program called Using Information in Government, several state agencies want to provide better information and analysis of the results of state services.
"We asked state and local government program managers what problems they encounter when trying to use government information to do their jobs," said Theresa Pardo, project director for the Center for Technology in Government (CTG) at the University at Albany, which is working with New York state government organizations to apply information technology to reduce costs and improve services.
There were many responses to CTG's query, ranging from not knowing what information is available, to not understanding the incentives for sharing data, to the lack of data standards, to the difficulty of selling the value of integrated data to superiors. The situation is a problem, Pardo said, because there are no existing policies to guide people in sharing data across government agencies.
"How do we identify, and then technically integrate and use the data sharing tools?" she asked. "Even the policies to perform actual sharing were missing."
Out of those initial efforts, four teams were formed to explore ways to better share and analyze data to measure the performance of state services. The teams' projects include improving the reporting of data within the Central New York Psychiatric Council, improving decision-making at several state government financial bureaus, and improving services provided to children by the child welfare agency.
Of the four teams, the one furthest along involves the Bureau of Shelter Services, which is working to gain a better understanding of services provided to the homeless in New York. The bureau's shelters, located in 57 counties statewide, each report information on services provided in different ways, ranging from handwritten reports to computerized and online case management systems. Additionally, a variety of services are doled out at different levels within the bureau and the shelters, making service outcomes difficult to track.
And while the bureau receives quarterly reports from each of the shelters, it still is impossible to tell if three people came back five times or if there were 15 different people in need of shelter in any one period.
To tackle these problems, a project team is creating standard definitions for what constitutes "providing adequate services." The team is also building what it calls a Homeless Services Integrated Management System, which will act as a repository of information provided by various shelters across the state to detail who was housed in the shelters and what other services were received.
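The bureau's counting problem can be sketched in a few lines. Quarterly reports list shelter stays, but without a shared person identifier, 15 stays could be 15 people or three people returning five times each. With an identifier (hypothetical here, since assigning one is part of what the new system must solve), both figures fall out of the same data.

```python
# Toy stay records with a shared person identifier attached.
stays = [
    {"person_id": "P1", "shelter": "Albany"},
    {"person_id": "P1", "shelter": "Albany"},
    {"person_id": "P2", "shelter": "Utica"},
    {"person_id": "P2", "shelter": "Albany"},
    {"person_id": "P3", "shelter": "Utica"},
]

total_stays = len(stays)                                # stays reported
distinct_people = len({s["person_id"] for s in stays})  # individuals served

print(total_stays, distinct_people)  # 5 3
```

Five stays, three people: the gap between those two numbers is exactly what the bureau cannot see today, and what a shared repository with consistent identifiers would expose.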
The goal is to gain a better, if not completely unified, picture of the services provided: to show, for example, whether some people have found jobs because of job training offered at the shelters. "Ultimately, we want to show that constituents are paying less money in public assistance, because formerly homeless people are now earning salaries and paying taxes," Pardo said. "The state supports the move to permanent independence but currently has no idea how close it's getting to achieving that goal via the services provided."
-- Barbara DePompa Reimers
NEXT STORY: SSG revamps IT acquisition