Amazon’s facial recognition software falsely matches over 2 dozen Mass. athletes with mugshots, ACLU says

Amazon’s facial recognition technology falsely matched pictures of professional athletes in New England with mugshots, according to a newly released report from the American Civil Liberties Union.

The ACLU of Massachusetts ran 188 official headshots of Bruins, Celtics, Red Sox and Patriots players against a database of about 20,000 mugshots; the software falsely matched 27 athletes, including Chris Sale and Gordon Hayward.

“There is no regulation or oversight or transparency at all required by state law with respect to how government agencies use these tools,” said Kade Crockford of the ACLU of Massachusetts. “We want people to know that this is dangerous stuff. You could be falsely ID’d.”

Patriots safety Duron Harmon was also incorrectly matched and spoke out against the program.

“This technology is flawed. If it misidentified me, my teammates, and other professional athletes in an experiment, imagine the real-life impact of false matches,” Harmon said in a statement released by the organization.

Harmon said he supports legislation from Senate Majority Leader Cynthia Stone Creem that calls for a moratorium on face surveillance technology.

“I think it is critical that we step back and say, ‘Let’s get together and talk about it. If there is a reason to do it, let’s make it work the right way,’” the senator said. “I’m not saying we will never use this. But, I am saying we just can’t go so quickly without any rules.”

In a statement to 7NEWS, Amazon Web Services said the ACLU is misrepresenting the information and that its technology has a long list of benefits.

“The ACLU is once again misusing and misrepresenting Amazon Rekognition to make headlines. As we’ve said many times in the past, when used with the recommended 99 percent confidence threshold and as one part of a human-driven decision, facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking,” the statement read in part. “We continue to advocate for federal legislation of facial recognition technology to ensure responsible use.”

“It is totally possible that in its unregulated state here in Massachusetts, these systems could be used to target people for their political speech, to target people inappropriately for the neighborhood they live in or the color of their skin,” Crockford said.

A hearing on Stone Creem’s bill is set to be held at the State House on Tuesday. There will not be a vote, but a committee is expected to decide on the issue in the coming weeks.

(Copyright (c) 2019 Sunbeam Television. All Rights Reserved. This material may not be published, broadcast, rewritten, or redistributed.)