House seeks clarity on FBI facial recognition database
- By Matt Leonard
- Mar 22, 2017
The FBI has expanded its access to photo databases and facial recognition technology to support its investigations. Lawmakers, however, have voiced deep mistrust of the bureau's ability to protect the images of millions of American citizens and to properly follow transparency regulations.
Kimberly Del Greco, the deputy assistant director of the FBI's Criminal Justice Information Services Division, faced tough questioning from both sides of the aisle at a March 22 hearing of the House Committee on Oversight and Government Reform.
The FBI's use of facial recognition technology was called into question last year after the Government Accountability Office issued a report saying the bureau had not updated its privacy impact assessment when the Next Generation Identification-Interstate Photo System "underwent significant changes."
"So here's the problem," said Rep. Jason Chaffetz (R-Utah), the committee chairman. "You're required by law to put out a privacy statement and you didn't and now we're supposed to trust you with hundreds of millions of people's faces."
The FBI's NGI-IPS allows law enforcement agencies to search a database of over 30 million photos to support criminal investigations. The bureau also operates an internal unit, Facial Analysis, Comparison and Evaluation (FACE) Services, which can tap other federal photo repositories as well as databases in 16 states that can include driver's license photos. Through these databases, the FBI has access to more than 411 million photos of Americans, many of whom have never been convicted of a crime.
The GAO report said the FBI was not testing the accuracy of its system on a regular basis and had not done testing to ensure that the system provides accurate results for "all allowable candidate list sizes."
Multiple witnesses, including Jennifer Lynch, senior staff attorney at the Electronic Frontier Foundation, and Alvaro Bedoya, executive director of the Center on Privacy and Technology at Georgetown Law, said that facial recognition technologies produce false positives more often for women, younger individuals and people of color.
"That is due to the training data that is used in facial recognition systems," Lynch said. "Most facial recognition systems are developed using pretty homogeneous images of people's faces, so that means mostly whites and men."
Bedoya said Georgetown's Center on Privacy and Technology recommends having citizens vote to approve their state's use of driver's license photos in such databases, especially since "most people have no idea this is happening."
Del Greco said, both in her testimony and in response to multiple questions, that privacy is of the utmost importance to the FBI and that the facial recognition results are used only as investigative leads.
Chaffetz pushed back on that argument. He noted that other biometric data, like fingerprints and DNA, are also used for investigative leads, but said collection of those is much narrower.
"DNA is a valuable investigative tool," he said. "Fingerprints are a valuable investigative lead ... what scares me is the FBI and the Department of Justice proactively trying to collect everyone's face."
The FBI's failure to update the privacy impact assessment, Chaffetz added, was yet another reason not to trust the agency with ordinary Americans' personal information.
Del Greco said the assessment was submitted to the DOJ. But Diana Maurer, a GAO director who also testified at the hearing, said it was submitted only after the technology had been used in real-world applications for years.
Matt Leonard is a former reporter for GCN.