New CCTV use rules criticised as bare bones

A proposed code of practice covering police use of live facial recognition (LFR) in England and Wales has been criticised for offering ‘unclear guidance’.

Former CCTV watchdog Tony Porter said the new rules were ‘bare bones’, whilst two campaign groups, Liberty and Big Brother Watch, have called for the practice to be ended entirely.

Former Surveillance Camera Commissioner Porter has previously produced 72 pages of guidance on the use of LFR for police forces in England and Wales, covering ethics, governance, leadership and the technical development of watch-lists. He said the proposed Home Office code doesn’t provide ‘much guidance to law enforcement’, but rather ‘a great deal of guidance to the public as to how the technology will be deployed’.

The Home Office said the new guidelines, included in the first update to the Surveillance Camera Code of Practice in eight years, empowered police and maintained public trust. Live facial-recognition systems compare faces captured on closed-circuit television with those on a watch-list, alerting officers to a match.

The new code, which covers CCTV use by local authorities and the police, says LFR deployments should: take into account any potential adverse impact on protected groups; be justified and proportionate; quickly delete any unused biometric data collected; follow an authorisation process; and set out and publish the categories of people sought on the watch-list and the criteria on which the decision to deploy is based.

In August 2020, Ed Bridges won a court case after twice being filmed by South Wales Police's automatic facial-recognition van in Cardiff. In its judgement, the Court of Appeal said more checks should have been made to ensure the live facial-recognition algorithm used had no gender or racial bias - and tighter regulations were needed.

Liberty’s Megan Goulding, who worked on Bridges's case, told BBC News: "One year since our case led the court to agree that this technology violates our rights and threatens our liberty, these guidelines fail to properly account for either the court's findings or the dangers created by this dystopian surveillance tool.

"Facial recognition will not make us safer, it will turn public spaces into open-air prisons and entrench patterns of discrimination that already oppress entire communities.”
