Police facial recognition tech 'dangerous and inaccurate'


A member of staff from the human rights organisation Liberty who observed the Met Police's operation at Notting Hill Carnival a year ago claimed the technology led to at least 35 false positives, five people being unduly stopped and one wrongful arrest.

Automated facial recognition (AFR) is an artificial intelligence (AI) system that runs alongside surveillance cameras on the street, recognising people's faces in real time and matching them against watch-lists created by the police.
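To make the matching process concrete, here is a deliberately simplified Python sketch of how such a pipeline works in principle: each detected face is reduced to a numeric "embedding" vector and compared against the embeddings of everyone on a watch-list, with any similarity above a threshold raising an alert. Everything in it (the 128-dimensional embeddings, the 0.6 threshold, the function and variable names, the random stand-in data) is a hypothetical illustration, not the Met's or South Wales Police's actual system.

```python
import numpy as np

# Hypothetical illustration only: real AFR systems use a trained face model
# to produce embeddings; random vectors stand in for those here.

EMBEDDING_DIM = 128      # assumed embedding size
MATCH_THRESHOLD = 0.6    # assumed alert threshold

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two face embeddings, from -1 (opposite) to 1 (identical)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(face: np.ndarray, watchlist: dict) -> tuple | None:
    """Return (name, score) for the best watch-list match above the
    threshold, or None if nobody on the list is similar enough."""
    name, embedding = max(
        watchlist.items(),
        key=lambda item: cosine_similarity(face, item[1]),
    )
    score = cosine_similarity(face, embedding)
    return (name, score) if score >= MATCH_THRESHOLD else None

# Stand-in data: 100 watch-list entries and one ordinary passer-by.
rng = np.random.default_rng(0)
watchlist = {f"suspect_{i}": rng.normal(size=EMBEDDING_DIM) for i in range(100)}
passerby = rng.normal(size=EMBEDDING_DIM)

hit = check_against_watchlist(passerby, watchlist)
print("alert:", hit if hit else "no match")
```

The threshold is the crux of the accuracy debate: set it low and the system flags many innocent passers-by (the false positives the campaigners count); set it high and it misses the people it is meant to find.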

"We're seeing ordinary people being asked to produce ID to prove their innocence as police are wrongly identifying thousands of innocent citizens as criminals".

Police records suggest the technology is grossly unreliable, however, and authorities who continue using AFR risk violating British privacy laws, according to Big Brother Watch, a nonprofit civil liberties group that released the report.

The software is used at major events such as sporting fixtures and music concerts to detect people on a watch-list, including wanted criminals.

That the police consider these embarrassing inaccuracy rates acceptable - claiming that "no facial recognition system is 100% accurate" - is even more worrying, as South Wales Police has already planned further deployments and the Metropolitan Police is expanding its use of facial recognition throughout 2018.

Police facial recognition cameras have already been trialled at large events across the United Kingdom, including football matches and festivals.

The Metropolitan Police used facial recognition at London's Notting Hill Carnival in both 2016 and 2017, and also at a Remembrance Sunday event.

"Facial recognition has always been feared as a feature of a future authoritarian society, with its potential to turn CCTV cameras into identity checkpoints, creating a world where citizens are intensively watched and tracked", it said.

Freedom of Information requests by Big Brother Watch have shown that 91 per cent of the so-called "matches" flagged by South Wales Police's technology were wrong.

The Met admitted that, as a result of using facial recognition, it has stored 102 innocent people's biometric data for 30 days. South Wales Police also deployed the technology at protests against an arms fair. Big Brother Watch argues that the police's use of this intrusive and oppressive technology is likely incompatible with the UK's human rights laws, as it poses a significant threat to people's right to privacy and freedom of expression, and that police must stop using it now.

Big Brother Watch's campaign, calling on United Kingdom public authorities to immediately stop using automated facial recognition software with surveillance cameras, is backed by David Lammy MP and 15 rights and race equality groups including Article 19, Football Supporters Federation, Index on Censorship, Liberty, Netpol, Police Action Lawyers Group, the Race Equality Foundation, and Runnymede Trust.

"Real-time facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the U.K. Members of the public could be tracked, located and identified - or misidentified - everywhere they go", said Silkie Carlo, the watchdog's director.
