Unblock the data clog
Well-designed data architectures may prove a boon to future systems development, but agency users face the practical problem of getting more mileage out of the data they already own. In many cases, that requires integrating data from existing and new stand-alone systems that were not designed to interact with others.
Choosing the right data-integration method, however, is not always simple. You have to consider not only the data's age and quality, but also what applications will use the merged product. Other issues include costs and ensuring that employees have the skills to handle the project.
Here are a few of the most common integration options and their strengths and weaknesses.
1. Extract, transform and load
Extract, transform and load (ETL) tools are used to extract data from various sources, transform them into a common format and then load them into a single database.
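The basic pattern can be sketched in a few lines. This is a minimal illustration only, not any vendor's product: the source formats, field names and target schema below are all hypothetical, and real ETL tools add scheduling, lineage tracking and error handling on top of this skeleton.

```python
import csv
import io
import sqlite3

# Two hypothetical stand-alone sources with different schemas and delimiters
hr_csv = "emp_id,name\n101,Ada Lovelace\n"
payroll_csv = "ID;FULL_NAME\n102; Grace Hopper\n"

def extract(text, delimiter=","):
    """Pull raw rows out of a source feed."""
    return list(csv.DictReader(io.StringIO(text), delimiter=delimiter))

def transform(rows, id_field, name_field):
    """Map a source's native schema onto the common (id, name) format."""
    return [(int(row[id_field]), row[name_field].strip()) for row in rows]

# Load both transformed feeds into a single database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT)")
for batch in (transform(extract(hr_csv), "emp_id", "name"),
              transform(extract(payroll_csv, ";"), "ID", "FULL_NAME")):
    conn.executemany("INSERT INTO employees VALUES (?, ?)", batch)
```

The point of the transform step is that every source, whatever its native layout, ends up in one agreed format before loading, which is what makes the consolidated store queryable.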
It's the preferred method for building data warehouses, which began finding favor with large organizations in the 1990s. More recently, however, small and midsize organizations have become big users of data warehouses, said Philip Russom, an analyst at Forrester Research.
Ascential Software, which IBM recently bought, and Informatica are leaders in this market, according to Forrester researchers, with companies such as Business Objects, Oracle and SAS Institute trailing them.
Federal agencies such as the U.S. Geological Survey, the Environmental Protection Agency, the FBI and the Defense Information Systems Agency, and a number of state and local governments use such tools.
Enterprise ETL is expensive, but other benefits can outweigh the costs. Chris Kemp, manager of data management services for Washington state's Transportation Department, said the primary reason his department chose Informatica's tools for a mainframe-to-client/server migration was their ease of use.
"Cost was an issue, but it was lower down the list," Kemp said.
2. Enterprise information integration
Whereas ETL pulls data from separate sources into a central repository, enterprise information integration (EII) uses virtualization to give the appearance of a consolidated data store, while leaving the datasets in their native locations. The tools act as a metadata resource, a sort of arbiter among systems, providing detailed information for each data source.
EII is still a new and emerging market, according to Forrester analysts, earning no more than $250 million in 2004. It was originally the preserve of small companies such as MetaMatrix and Nimble Technologies, but giants such as IBM and BEA Systems have recently entered the fray to give the EII market at least the appearance of more depth.
Though sometimes perceived as a substitute for ETL, EII is increasingly cast as a supplement to it: a way of accessing information that, for one reason or another, can't be physically pulled into a data warehouse.
EII's advantage is that it requires less movement of data than ETL and far less actual data transformation. Also, most EII operations can be done via the Internet, which can mean significant cost savings. On the other hand, EII also requires close attention to issues such as data modeling and metadata management.
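The virtualization idea can be illustrated with a toy mediator. The catalog of field mappings below plays the role of the metadata resource the article describes; the source systems and field names are invented for the example, and the data stays in each source until query time.

```python
# Two hypothetical systems whose data stays in its native location
hr_system = {101: {"emp_id": 101, "name": "Ada Lovelace"}}
badge_system = {101: {"ID": 101, "OFFICE": "B-204"}}

# Metadata catalog: where each unified field really lives in each source
CATALOG = [
    (hr_system, {"emp_id": "id", "name": "name"}),
    (badge_system, {"ID": "id", "OFFICE": "office"}),
]

def virtual_record(key):
    """Assemble one consolidated record at query time; nothing is copied in advance."""
    record = {}
    for source, mapping in CATALOG:
        row = source.get(key, {})
        for native_field, unified_field in mapping.items():
            if native_field in row:
                record[unified_field] = row[native_field]
    return record
```

A caller sees one consolidated record, but only the metadata catalog is centralized; that is why EII demands the careful data modeling and metadata management noted above.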
3. Database replication
Database replication has become popular with organizations that are spread out geographically but still need access to common data. Keeping that data available at every site can be a problem with the kinds of massive datasets that many agencies work with.
Replication moves updated data between source and target systems according to a set of management tables, ensuring that the source and target copies stay consistent.
"Many organizations turn to this technique [for data integration] because they can use the replication facilities that modern databases include as a standard feature," Russom said.
However, because data replication doesn't work well among products from different vendors, it should be used to integrate data between two or more databases from the same vendor, he said. ETL is the option to use to integrate data in multivendor environments, he said.
4. Web services and data services
When Web services burst onto the scene a few years ago, they were hailed as a panacea for all kinds of integration challenges. But reality has since set in.
"There's still the challenge of semantically mapping data between the various services," said Harriet Fryman, group director of product marketing at Informatica. "There's still a need for data integration for those gray spaces between the services."
That has led many data integration vendors to develop a new product line that essentially provides data integration as a layer within service-oriented architectures. When different applications need to use the same data, they call a service that represents the data integration, and the appropriate data is passed to it. Everything needed for the integration is contained within the Web Services Description Language (WSDL) file.
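The calling pattern might look like the toy sketch below. It is an in-process stand-in for illustration only: the service name, registry and merge logic are all hypothetical, and in a real deployment the `call` step would be a network request whose interface is spelled out in the service's WSDL file.

```python
# Hypothetical registry of named service operations
SERVICES = {}

def service(name):
    """Register a function as a named service operation."""
    def register(fn):
        SERVICES[name] = fn
        return fn
    return register

@service("integrate.customer")
def integrate_customer(customer_id, sources):
    """Merge one customer's data from several backends into a single record."""
    merged = {"customer_id": customer_id}
    for fetch in sources:
        merged.update(fetch(customer_id))
    return merged

def call(name, **kwargs):
    # In a real SOA this would be a SOAP/HTTP call, not a dictionary lookup
    return SERVICES[name](**kwargs)
```

The applications themselves never touch the merge logic; they only know the service's name and interface, which is what lets the integration layer evolve independently of its callers.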
If there's any drawback, it's that agencies have not used those data services much because few organizations have implemented public-facing Web services that operate beyond their firewalls.
5. Hardware-based integration
New devices for hardware-based integration account for a minuscule part of the data-integration market now, but big corporate customers such as Motorola, British American Tobacco and Kyocera have already committed at least part of their integration efforts to the devices.
The appliances can either be used to build integration projects from scratch or plugged into existing application integration architectures. Sitting on the network, they combine all of the connectivity, data transformation and management functions needed to shuttle data among databases and applications.
Fred Meyer, president and chief executive officer of Cast Iron Systems, which makes systems for hardware-based integration, believes as many as 80 percent of the data-integration projects now done with software-based approaches could be handled by these appliances at one-fifth the cost.
"The real world is people who mostly have to deal with flat files or custom applications with fairly simple, dumb interfaces on them," he said. "And we can deal with that."
Such appliances aren't good for situations involving many manual processes that require custom coding, Meyer added.