GAO urges standardization of biometric data
The Government Accountability Office has recommended that agencies develop standards for the collection of biometric data. The lack of standards and the use of proprietary formats have inhibited information sharing, according to a new GAO report issued Oct. 15.
Standardization is important because raw biometric data alone is often of limited use. For example, software extracts information from a fingerprint image, creating an analysis of the print's features that serves as the basis for comparison, said Anil Jain, a biometrics expert in Michigan State University's computer science department. However, that data is useful only when compared with other data generated by the same set of algorithms, which often means it's useful only with prints analyzed by the same vendor's systems, he said.
“So how that information is extracted, how it is stored and how it is matched is proprietary,” Jain said. “Vendors don’t want to share what they’ve extracted from each fingerprint image because that could reveal something about their proprietary algorithm.”
Meanwhile, it’s possible to share fingerprint images themselves, but that doesn’t help much because matching is based on the features that are extracted and analyzed, not on the raw image.
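The interoperability problem Jain describes can be illustrated with a minimal sketch. Everything here is hypothetical: the vendor names, the template formats and the matching logic are invented for illustration and do not represent any real product. The point is only that when two systems extract different proprietary features from the same raw image, a template from one cannot be compared against a template from the other.

```python
# Illustrative sketch with hypothetical vendors and feature formats,
# showing why proprietary templates block cross-vendor matching.

def vendor_a_extract(image):
    """Hypothetical vendor A: stores minutiae as (x, y) coordinate pairs."""
    # Pretend these points were detected in the raw image.
    return {"format": "A", "minutiae": [(12, 40), (55, 18), (30, 77)]}

def vendor_b_extract(image):
    """Hypothetical vendor B: stores features as (angle, ridge_count) pairs."""
    return {"format": "B", "features": [(0.8, 14), (2.1, 9)]}

def vendor_a_match(template1, template2):
    """Vendor A's matcher understands only vendor A templates."""
    if template1["format"] != "A" or template2["format"] != "A":
        raise ValueError("incompatible template format")
    # Toy similarity score: fraction of shared minutiae points.
    shared = set(template1["minutiae"]) & set(template2["minutiae"])
    return len(shared) / len(template1["minutiae"])

raw_image = "raw fingerprint pixels"  # sharing this alone is not enough
ta = vendor_a_extract(raw_image)
tb = vendor_b_extract(raw_image)

score = vendor_a_match(ta, ta)  # same-vendor comparison works
try:
    vendor_a_match(ta, tb)      # cross-vendor comparison fails
except ValueError as e:
    print("cross-vendor match failed:", e)
```

In this toy setup, sharing the raw image string accomplishes nothing by itself; each system must re-extract its own template, and only templates in the same format can be scored against each other.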
Another issue on which agencies have to agree is how many fingerprints to take. In law enforcement, the standard is 10 fingers. For the Homeland Security Department’s U.S. Visitor and Immigrant Status Indicator Technology program, only two fingerprints were collected until recently.
Some progress has been made in defining a standard template for collecting fingerprint data, but agencies have to be careful how far those standards go, Jain said.
“There are only certain common things we can standardize because the vendors do not want to reveal everything they use,” he said.
Doug Beizer is a staff writer for Federal Computer Week.