Facial recognition tech used by UK cops is 'dangerously inaccurate'

Overall, the facial recognition technology used by UK cops has resulted in inaccurate matches 95 percent of the time.


What’s “dangerously inaccurate” and puts the burden of proof on citizens who are incorrectly matched as a person of interest? It is the automated facial recognition technology being used by police in the UK. The face recognition runs alongside surveillance cameras on the street, scanning crowds and public spaces to match faces in real time against folks on watchlists.

How wildly inaccurate is the technology? The Metropolitan Police, for example, managed to correctly identify only two people, according to a new report by Big Brother Watch (pdf). However, neither of them was a criminal — one had been incorrectly placed on a watchlist, and the other was on a “mental health-related watchlist.”

In London, the police facial recognition tech has helped the cops make zero arrests — zip, zilch, none. What it did manage to do was incorrectly match the faces of innocent citizens to police-created watchlists a whopping 98 percent of the time.

And in South Wales, 91 percent of “matches” made by the police’s face recognition tech were dead wrong. But, hey, they didn’t let that stop them from storing the biometric photos of all 2,451 innocent people wrongly identified by the system for 12 months. Big Brother Watch noted that the policy “is likely to be unlawful.”
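A rough sanity check of those South Wales figures: if the 2,451 wrongly identified people correspond to the reported 91 percent false-match rate, you can back out roughly how many matches the system produced in total and how few were real. This is only an illustration built on the article's two numbers, not data from the report itself.

```python
# Back-of-the-envelope check of the South Wales figures cited above.
# Assumption: the 2,451 false matches make up roughly 91% of all
# matches the system produced, as the article states.
false_matches = 2451
false_rate = 0.91

total_matches = round(false_matches / false_rate)  # implied total alerts
true_matches = total_matches - false_matches       # implied correct alerts

# Precision: the chance that a flagged face was actually a correct match.
precision = true_matches / total_matches

print(f"Implied total matches: ~{total_matches}")
print(f"Implied true matches:  ~{true_matches}")
print(f"Precision:             ~{precision:.0%}")
```

On those assumptions, fewer than one in ten alerts pointed at the right person — which is the context for the "interventions" described next.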

The South Wales cops reportedly staged “interventions with 31 innocent members of the public incorrectly identified by the system who were then asked to prove their identity and thus their innocence.”

NeoFace Watch not as accurate as NEC Corp. claims

Both the Metropolitan and South Wales police forces use NeoFace Watch, an automated, AI-powered facial recognition product made by the NEC Corporation.

NeoFace is branded as the world’s “fastest and most accurate face recognition technology.” That’s fairly scary when you consider how wildly inaccurate it has been for the UK cops. NeoFace has also, according to Big Brother Watch, “not been tested for demographic accuracy biases.”

Yet the facial recognition tech has been deployed “at shopping centers, festivals, sports events, concerts, community events — and even at a peaceful demonstration. One police force even used the surveillance tool to keep innocent people with mental health issues away from a public event.”

So, how many face photos of innocent people are the cops sitting on? Big Brother Watch noted:

Out of the 35 police forces that responded to our Freedom of Information request, not one was able to tell us how many photos they hold of innocent people in their custody image database.

Big Brother Watch warned that “automated facial recognition cameras are biometric identification checkpoints that risk making members of the public walking ID cards.”

As for how the cops defended the system, South Wales police claimed it had a “number of safeguards” in place, while Metropolitan cops told the BBC, “Regarding ‘false’ positive matches — we do not consider these as false-positive matches because additional checks and balances are in place to confirm identification following system alerts.”

Nevertheless, Big Brother Watch Legal & Policy Officer Griff Ferris wrote:

The UK already has one of the world’s largest CCTV networks. Adding real-time facial recognition to our surveillance state’s already worryingly militaristic arsenal would fundamentally change policing in the UK, and indeed the health of our democracy. Innocent citizens being constantly tracked, located and identified — or, as is currently most likely, misidentified as a criminal — by an artificially intelligent camera system conjures up images of futuristic dystopian societies that even Orwell would be proud of.
