Amazon’s decision to market a powerful face recognition tool to police is alarming privacy advocates.
Advocates like the American Civil Liberties Union (ACLU) say law enforcement agencies could use the tool, named Rekognition, to easily build a system that would automate the identification and tracking of anyone.
That could have potentially dire consequences for minorities who are already arrested at disproportionate rates, immigrants who may be in the country illegally, or political protesters, they said.
It isn’t known how many police forces have adopted the technology since its launch in late 2016 or its update last autumn, when Amazon added capabilities that allow it to identify people in videos and follow their movements almost instantly.
The Washington County Sheriff’s Office in Oregon has used it to quickly compare unidentified suspects in surveillance images to a database of more than 300,000 booking photos from the county jail, a common use of such technology around the US. The Orlando Police Department in Florida, meanwhile, is testing whether the tool can single out people of interest in public spaces and alert officers to their presence.
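The booking-photo comparison described above maps onto Rekognition's face-search API: agencies index a collection of known photos, then search each probe image against it. The sketch below, using the AWS SDK for Python (boto3), shows roughly how such a query might look; the collection name, similarity threshold, and identifiers are illustrative assumptions, not details reported here.

```python
# Sketch of matching an unidentified face against an indexed photo collection
# with AWS Rekognition. Collection name and threshold are hypothetical.

COLLECTION = "county-booking-photos"  # assumed collection of indexed photos
THRESHOLD = 90.0                      # assumed minimum similarity (percent)

def strong_matches(response, threshold=THRESHOLD):
    """Keep only the matched face IDs at or above the similarity threshold."""
    return [
        match["Face"]["ExternalImageId"]
        for match in response.get("FaceMatches", [])
        if match["Similarity"] >= threshold
    ]

def search_booking_photos(image_bytes):
    """Search one probe image against the collection (requires AWS credentials)."""
    import boto3  # imported lazily; the call below hits the live AWS service
    client = boto3.client("rekognition")
    response = client.search_faces_by_image(
        CollectionId=COLLECTION,
        Image={"Bytes": image_bytes},
        FaceMatchThreshold=THRESHOLD,
        MaxFaces=5,
    )
    return strong_matches(response)
```

The service returns candidate matches ranked by similarity; the client-side filter simply discards anything below the chosen threshold, which is where a department's policy choices (how confident is "confident enough"?) would live.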
“People should be free to walk down the street without being watched by the government,” the privacy advocacy groups wrote in a letter to Amazon on Tuesday. “Facial recognition in American communities threatens this freedom.”
The letter to Amazon followed public records requests from ACLU chapters in California, Oregon and Florida. More than two dozen organisations signed it, including the Electronic Frontier Foundation and Human Rights Watch.
In a statement, Amazon Web Services stressed that it requires all of its customers to comply with the law and to be responsible in the use of its products.
The statement said some agencies have used the tool to find abducted people, and amusement parks have used it to find lost children.
British broadcaster Sky News used Rekognition to help viewers identify celebrities at the royal wedding of Prince Harry and Meghan Markle last weekend.
Amazon’s technology isn’t that different from what face recognition companies are already selling to law enforcement agencies.
But its vast reach and its interest in recruiting more police departments, at extremely low cost, are troubling, said Clare Garvie, an associate at the Centre on Privacy and Technology at Georgetown University Law Centre.
“This raises very real questions about the ability to remain anonymous in public spaces,” Garvie said.
While police might be able to videotape public demonstrations, face recognition is not merely an extension of photography but a biometric measurement — more akin to police walking through a demonstration and demanding identification from everyone there, she said.
Last year, the Orlando, Florida, Police Department announced it would begin a pilot programme relying on Amazon’s technology to “use existing city resources to provide real-time detection and notification of persons of interest, further increasing public safety”.
Orlando has a network of public safety cameras, and in a presentation posted to YouTube this month, Ranju Das, who leads Amazon Rekognition, said the company would receive feeds from the cameras, search them against photos of people being sought by law enforcement and notify police of any hits.
“It’s about recognising people, it’s about tracking people, and then it’s about doing this in real time, so that the law enforcement officers… can be then alerted in real time to events that are happening,” he said.
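The real-time pipeline Das describes corresponds to Rekognition's stream-processor feature, which watches a video stream and emits face-search results as they occur. A minimal sketch follows; every name, ARN, and the collection ID are placeholders for illustration, not details from the Orlando deployment.

```python
# Sketch of a real-time face-search stream processor with AWS Rekognition.
# All names, ARNs, and the collection ID below are hypothetical placeholders.

def face_search_settings(collection_id, threshold=90.0):
    """Build the FaceSearch settings block for a stream processor."""
    return {
        "FaceSearch": {
            "CollectionId": collection_id,
            "FaceMatchThreshold": threshold,
        }
    }

def create_processor(name, video_stream_arn, data_stream_arn, role_arn,
                     collection_id):
    """Create a processor that searches a live video feed (needs AWS creds)."""
    import boto3  # imported lazily; this call hits the live AWS service
    client = boto3.client("rekognition")
    return client.create_stream_processor(
        Name=name,
        Input={"KinesisVideoStream": {"Arn": video_stream_arn}},
        Output={"KinesisDataStream": {"Arn": data_stream_arn}},
        RoleArn=role_arn,
        Settings=face_search_settings(collection_id),
    )
```

Frames arrive on a Kinesis video stream, Rekognition searches each detected face against the named collection, and match events flow out on a data stream, which is the "alerted in real time" step a consumer application would subscribe to.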
The Orlando Police Department (OPD) said in an email that it “is not using the technology in an investigative capacity or in any public spaces at this time”.
The testing has been limited to eight city-owned cameras and a handful of officers who volunteered to have their images used to see if the technology works, Sergeant Eduardo Bernal wrote in an email on Tuesday.
“As this is a pilot and not being actively used by OPD as a surveillance tool, there is no policy or procedure regarding its use as it is not deployed in that manner,” Bernal wrote.