Looks like Orlando won’t quit its controversial test of Amazon’s facial recognition software after all.
The city of Orlando and Orlando Police Department released a joint statement on Monday announcing the city would continue testing Rekognition, Amazon's deep learning facial recognition technology, which has the power to identify every face in a crowd.
Last month, the American Civil Liberties Union sent a letter to Orlando lawmakers claiming the city started testing the program "without inviting a public debate, obtaining local legislative authorization, or adopting rules to prevent harm to Orlando community members," and demanded that it "immediately" stop using it.
Orlando did stop using Rekognition, but the decision wasn't due to the outcry from privacy and anti-surveillance advocates. Instead, the trial contract simply expired, which left open the possibility of using Rekognition again.
"Partnering with innovative companies, like Amazon Web Services, to test new technology is one way to ensure we offer the best in tools, training and technology for the men and women whose job it is to keep our community safe," reads the joint statement sent to Mashable.
"To that end, the City of Orlando will continue to test Amazon Rekognition facial recognition software to determine if this technology could reliably identify specific individuals as they come within view of specific cameras."
Although the new trial period will not use any images of the public for testing (seven police officers volunteered to have Rekognition take photos of their faces), privacy advocates are still unhappy with its usage.
Adam Schwartz of the Electronic Frontier Foundation, a digital rights group working with the ACLU to advocate against cities' use of Rekognition, said he fears this will deter people from fully participating in democratic activities like protests.
"Many of these kinds of technology are deployed disproportionately in low-income and minority and immigrant communities," Schwartz said.
"The tech that exists today has a higher error rate — there are more false positives with people of color and women and younger people, and we think that use of this technology will reinforce existing injustices that will further harm people."
He said the EFF believes the ideal facial recognition policy is not allowing law enforcement agencies to use such technology.
Amazon disputes claims that Rekognition is being used for surveillance. An Amazon Web Services spokesperson wrote in a statement to Mashable that if the technology were being abused, Amazon would suspend the abusers.
"Our quality of life would be much worse today if we outlawed new technology because some people could choose to abuse the technology. Imagine if customers couldn’t buy a computer because it was possible to use that computer for illegal purposes?" the AWS spokesperson wrote.
"Like any of our AWS services, we require our customers to comply with the law and be responsible when using Amazon Rekognition."
Details about how Orlando will use the technology are still being worked out, but should be finalized within the next few days.