Rights groups urge top tech vendors to stop selling facial recognition systems to governments

Warning that facial recognition systems "exacerbate historical and existing bias" that hurts disadvantaged communities, a coalition of 85 civil rights groups asks Amazon, Google and Microsoft to stop selling the tech to governments worldwide.

A coalition of civil rights organizations has written to top executives at Amazon, Google and Microsoft urging them to stop selling facial recognition software systems to governments.

The groups, which include the American Civil Liberties Union, the Electronic Frontier Foundation and the Government Accountability Project, say that "systems built on face surveillance will amplify and exacerbate historical and existing bias that harms these and other over-policed and over-surveilled communities."

The letters target specific facial recognition activities at different companies. Google comes in for some praise for the "initial steps Google has taken to recognize the harms of invasive surveillance technologies and artificial intelligence."

The group also praises Microsoft President Brad Smith for his grasp of the risks inherent in facial recognition software, but it adds that his call for some government oversight constitutes "wholly inadequate safeguards."

Amazon, a vendor of a facial recognition system dubbed Rekognition, is warned that the company "is gravely threatening the safety of community members, ignoring the protests of its own workers, and undermining public trust in its business" by continuing to do business with government and specifically by piloting Rekognition with the FBI and discussing its use with Customs and Border Protection.

According to a recent online survey by the Center for Data Innovation, the public largely supports the use of facial recognition for law enforcement and security purposes. Respondents to a series of questions keyed to the accuracy of facial recognition showed more support for fielding a system that is 100 percent accurate than for one that is 80 percent accurate. More respondents agreed than disagreed that surveillance cameras should be "strictly" limited by government, but fewer agreed that facial recognition technology should be similarly regulated -- even though such systems require cameras or some form of image collection to operate.

Facial recognition is increasingly commonplace in the federal government. The FBI and law enforcement agencies have been active in developing technology to match individuals to photographs, and the Department of Homeland Security is piloting an effort to put real-time facial recognition capabilities in place at airports. CBP is pushing out a biometric exit program that includes facial recognition software to capture images of international travelers and match them to passport photos. The Sprint 8 system has been in a testing phase at airports around the U.S., including Hartsfield-Jackson in Atlanta and Dulles in the Washington, D.C., area. Last August, CBP publicized the system's identification of a foreign national traveling on another person's documents.

A September 2018 report from the inspector general at DHS found that the Sprint 8 pilot had some successes on its own terms but faced technical and operational challenges, including latency caused by inconsistent wireless networks and problems matching very young or very old individuals to photographs. Additionally, the report found that U.S. citizens were six times more likely than non-citizens to be rejected by the Travel Verification Service algorithm, largely because fewer pre-existing photographs were available in the CBP gallery and because the U.S. collects new passport photos from citizens only every 10 years. The IG report did not touch on any of the civil liberties or privacy issues of concern to the coalition of groups.

FCW requested comment on the letter from Google, Amazon and Microsoft and will update this article with any responses.