Overall, the facial recognition technology used by UK police has produced inaccurate matches 95 percent of the time.

What's "dangerously inaccurate" and puts the burden of proof on citizens who are incorrectly matched as a person of interest? The automated facial recognition technology being used by police in the UK. The face recognition runs alongside surveillance cameras on the street, scanning crowds and public spaces to match faces in real time against people on watchlists.

How wildly inaccurate is the technology? The Metropolitan Police, for example, have managed to correctly identify only two people, according to a new report by Big Brother Watch (pdf). Yet neither of those two was a criminal: one was incorrectly on a watchlist, and the other was on a "mental health-related watchlist."

In London, the police facial recognition tech has helped the cops make a grand total of zero arrests. Zero. Zip. None. What it did manage to do was incorrectly match the faces of innocent citizens to police-created watchlists a whopping 98 percent of the time.

And in South Wales, 91 percent of "matches" made by the police's face recognition tech were dead wrong. But, hey, that didn't stop them from storing the biometric photos of all 2,451 innocent people wrongly identified by the system for 12 months. Big Brother Watch noted that the policy "is likely to be unlawful." The South Wales police reportedly staged "interventions with 31 innocent members of the public incorrectly identified by the system who were then asked to prove their identity and thus their innocence."

NeoFace Watch not as accurate as NEC Corp. claims

Both the Metropolitan and South Wales police forces use NeoFace Watch, an automated, AI facial recognition product made by the NEC Corporation. NeoFace is branded as the world's "fastest and most accurate face recognition technology." That's fairly scary when you consider how wildly inaccurate it has been for UK police.
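Taking the article's figures at face value, the scale of the failure can be sanity-checked with a bit of arithmetic. This is a rough sketch using only the numbers quoted above; the rounded totals it derives are my own estimates, not figures taken from the Big Brother Watch report:

```python
# Back-of-envelope check of the quoted figures (my own arithmetic,
# not numbers from the report itself).

# South Wales: 2,451 innocent people wrongly flagged, a 91% false-match rate.
false_matches = 2451
false_rate = 0.91

total_alerts = round(false_matches / false_rate)  # implied total alerts
true_matches = total_alerts - false_matches       # implied correct matches
print(total_alerts, true_matches)                 # -> 2693 242

# Metropolitan Police: 2 correct identifications at a 98% false-match rate.
met_true = 2
met_total = round(met_true / (1 - 0.98))          # implied total alerts
print(met_total, met_total - met_true)            # -> 100 98
```

In other words, the South Wales figures imply only a couple of hundred genuine matches against thousands of alerts, and the Met's two correct identifications imply roughly a hundred alerts in total, consistent with the 98 percent figure.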
NeoFace has also, according to Big Brother Watch, "not been tested for demographic accuracy biases." Yet the facial recognition tech has been deployed "at shopping centers, festivals, sports events, concerts, community events — and even at a peaceful demonstration. One police force even used the surveillance tool to keep innocent people with mental health issues away from a public event."

So, how many face photos of innocent people are the cops sitting on? Big Brother Watch noted:

"Out of the 35 police forces that responded to our Freedom of Information request, not one was able to tell us how many photos they hold of innocent people in their custody image database."

Big Brother Watch warned that "automated facial recognition cameras are biometric identification checkpoints that risk making members of the public walking ID cards."

As for how the cops defended the system, South Wales police claimed it had a "number of safeguards" in place, while the Metropolitan Police told the BBC, "Regarding 'false' positive matches — we do not consider these as false-positive matches because additional checks and balances are in place to confirm identification following system alerts."

Nevertheless, Big Brother Watch legal and policy officer Griff Ferris wrote:

"The UK already has one of the world's largest CCTV networks. Adding real-time facial recognition to our surveillance state's already worryingly militaristic arsenal would fundamentally change policing in the UK, and indeed the health of our democracy. Innocent citizens being constantly tracked, located and identified — or, as is currently most likely, misidentified as a criminal — by an artificially intelligent camera system conjures up images of futuristic dystopian societies that even Orwell would be proud of."