
Watchdog urges halt to public facial recognition tech use

The Equality and Human Rights Commission has said that the mass screening of the public by police officers using facial recognition software must be halted because it could amplify racial discrimination and stifle free expression.

Police forces in London and south Wales have been at the forefront of using automated facial recognition (AFR) technology, which uses cameras to capture images of faces and checks them against databases of wanted suspects. It has been deployed to scan shoppers in Stratford and at Oxford Circus in the capital, and by South Wales police to monitor football fans.

The EHRC said the technology should be suspended until its impact has been independently scrutinised and laws governing its application improved.

Professor Peter Fussey, a surveillance expert at the University of Essex who conducted the only independent review of the Metropolitan police’s public trials on behalf of the force, found that the technology was verifiably accurate in just 19 per cent of cases.

The EHRC noted in a report to the United Nations on civil and political rights in the UK that evidence indicates many AFR algorithms ‘disproportionately misidentify black people and women and therefore operate in a potentially discriminatory manner’.

Rebecca Hilsenrath, chief executive at the EHRC, said: “The law is clearly on the back foot with invasive AFR and predictive policing technologies. It is essential that their use is suspended until robust, independent impact assessments and consultations can be carried out, so that we know exactly how this technology is being used and are reassured that our rights are being respected.”
