According to a recent report from The Guardian, South Wales Police’s facial recognition system wrongly flagged many of the people heading to the Welsh capital for the 2017 Champions League final as potential criminals.
The system flagged 2,470 matchgoers as potential criminals, and 2,297 of those matches turned out to be false, a false-positive rate of roughly 92%. Nor was this an isolated failure.
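Taking the reported figures at face value, the false-positive rate follows from a single division. This is just a sketch of the arithmetic; note that the exact quotient is a shade under 93%, which the report rounds to 92%.

```python
# False-positive rate implied by the reported figures:
# 2,470 total matches flagged, of which 2,297 were false.
total_matches = 2470
false_matches = 2297

false_positive_rate = false_matches / total_matches
print(f"False-positive rate: {false_positive_rate:.1%}")
```

Running this prints a rate of about 93.0%, consistent with the roughly 92% figure cited in press coverage.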
A journalistic investigation found the same pattern at earlier events. According to Wired, South Wales Police’s facial recognition system has been consistently inaccurate: in 2017, the same system generated 90% false positives at a boxing match and 87% false positives at a rugby match.
British Police’s Reaction
When contacted for comment, South Wales Police defended the system, arguing that “technical issues” are “normal” for such automated systems.
“Of course no facial recognition system is 100 percent accurate under all conditions,” a written statement from the police reads, adding that more false positives are to be expected. The statement also stressed that none of the false matches led to an arrest and that no members of the public were harmed.
According to the police, the facial recognition system correctly identified 2,000 people as criminals in its first nine months of operation, leading to 450 arrests.
Police chief Matt Jukes told reporters that the technology is essential for keeping the public safe, especially when police have to manage massive crowds. Around 170,000 people attended the 2017 Champions League final.
However, the system mislabeled more people as criminals at this single event than it has correctly identified over its entire deployment. The police insisted that the system is more accurate in non-crowded situations.
Image Source: PxHere