Police Facial Recognition Technology Fails
Facial recognition technology used by Met and South Wales police is wrong nine times out of 10.
17:53 17 May 2018
Police attempts to use cameras to recognise people from their faces have been unsuccessful, with the wrong person picked out nine times out of 10.
A report from Big Brother Watch says that the Metropolitan Police used a technology in which computer databases of faces are linked to CCTV and other cameras. The technology was trialled at the 2017 Remembrance Sunday event to spot people on a mental health watch list. The results of the trial show that the system was wrong 98 per cent of the time, falsely telling officers on 102 occasions that it had spotted a suspect.
Silkie Carlo, the director of Big Brother Watch, said: “Real-time facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the UK. Members of the public could be tracked, located and identified – or misidentified – everywhere they go.
“We’re seeing ordinary people being asked to produce ID to prove their innocence as police are wrongly identifying thousands of innocent citizens as criminals.
“It is deeply disturbing and undemocratic that police are using a technology that is almost entirely inaccurate, that they have no legal power for and that poses a major risk to our freedoms.
“This has wasted millions in public money and the cost to our civil liberties is too high. It must be dropped.”