Utilising facial recognition to keep the public safe
Facial recognition has been in the news for its use at high-profile events such as Harry Styles concerts. But these articles in the mainstream press often highlight the privacy concerns surrounding the technology. Without a high-profile success case, it's hard to convince the public of the technology's viability.
Ahead of Harry Styles’ concerts in Cardiff on 20 and 21 June, fans were warned that they could be scanned by live facial recognition cameras deployed in the area by South Wales Police. The cameras were to be used to identify people wanted for priority offences.
South Wales Police stated: “it’s being deployed specifically to seek out wanted individuals. Fully appreciate the concert has a young audience, however concert-goers won’t be the only people in the city centre during this time.”
How can it be used?
Live facial recognition (LFR) works by comparing faces with a watchlist, using artificial intelligence. The police stated that if you are not on a watchlist, the biometric data collected won't be stored and will be deleted immediately.
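The core of that comparison step can be illustrated in a few lines of code. The sketch below is a hypothetical simplification, not South Wales Police's actual system: it assumes each face has already been converted into a numeric embedding vector, scores a detected face against every watchlist entry by cosine similarity, and discards the data when no entry passes the threshold. The function name, the tiny 4-dimensional embeddings and the 0.8 threshold are all illustrative choices; real systems use embeddings with hundreds of dimensions and carefully calibrated thresholds.

```python
import numpy as np

def match_against_watchlist(probe, watchlist, threshold=0.8):
    """Return the index of the best watchlist match, or None if no
    entry's cosine similarity to the probe reaches the threshold.

    probe: 1-D embedding vector for one detected face.
    watchlist: 2-D array, one embedding row per watchlisted person.
    """
    # Normalise so the dot product equals cosine similarity
    probe_n = probe / np.linalg.norm(probe)
    wl_n = watchlist / np.linalg.norm(watchlist, axis=1, keepdims=True)
    scores = wl_n @ probe_n
    best = int(np.argmax(scores))
    return best if scores[best] >= threshold else None

# Hypothetical watchlist of two embeddings (illustrative 4-D vectors)
watchlist = np.array([[1.0, 0.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0, 0.0]])
probe = np.array([0.0, 0.0, 1.0, 0.0])  # a face not on the watchlist

if match_against_watchlist(probe, watchlist) is None:
    # No match: per the police statement, the biometric data
    # is not stored and is deleted immediately
    del probe
```

In this sketch the deletion in the no-match branch mirrors the stated policy: only faces that match a watchlist entry are retained for any further action.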
At a Beyoncé concert earlier in the year, the force said the technology would be used “to support policing in the identification of persons wanted for priority offences… to support law enforcement… and to ensure the safeguarding of children and vulnerable persons”.
South Wales Police have used the technology at previous events. At a rugby match, the technology scanned 108,540 faces, resulting in the arrests of two people.
South Wales Police has an LFR FAQ page on its website. It states: "The specific purpose for Live Facial Recognition deployment is: To support Policing in the identification of persons wanted for priority offences, to support law enforcement including the administration of justice (through arrest of persons wanted on warrant or unlawfully at large/recall to prison), and to ensure and promote the safeguarding of children and vulnerable persons at risk."
The website also lists occasions where the force has used LFR. As well as the events already mentioned, the technology was used at Pride Cymru in August 2022 and the Wales Airshow in July 2023.
The website says: “Live Facial Recognition technology is used as an efficient and effective policing tactic to prevent and detect crime, and protect the most vulnerable in our society.”
Biometrics and surveillance camera commissioner Professor Fraser Sampson recently commissioned an independent gap analysis by Professors Pete Fussey and William Webster. Fussey and Webster highlighted that the use of such technology is likely to increase. They said: “the Policing Minister expressed his desire to embed facial recognition technology in policing and is considering what more the government can do to support the police on this. Such embedding is extremely likely to include exploring integration of this technology with police body worn video”.
Persuading the public
So if use of the technology is to increase, how can the public be persuaded to accept it?
In his annual report, published in February, Sampson argues that the extent to which the public will tolerate facial recognition will depend on whether they believe measures are in place to ensure it is used lawfully and responsibly.
There are two main ways to achieve this. The first is to update the legislation. Legislation is currently in the works in the form of the Data Protection and Digital Information (No.2) Bill, which is still at the early stages of its journey through parliament. There needs to be legal oversight over how and when the technology is used and by whom. For example, it may be acceptable to the public for the police to use LFR at concerts, but the public are less likely to welcome it when used by shops, schools or even private users.
There was criticism when Sports Direct announced that the use of LFR had cut crime in its shops. Fifty MPs and peers supported a letter opposing the use of LFR by Frasers Group.
The legislation also needs to cover how the data collected will be stored and used, who will have access to it and when it should be deleted.
The other factor in changing the public’s opinion is to keep the public informed. Forces need to be clear about when they are using the technology and how, so the public can trust them.
Another aspect to consider is that, so far, there have been no high-profile success stories. Perhaps if the public became aware of an occasion or occasions where the technology had been used to identify a threat to the public and potentially prevent a crime, they would be more accepting of its use.
One high-profile use case is set to be the Paris Olympics next year. Real-time cameras are set to use AI to detect suspicious activity, such as abandoned luggage and unexpected crowds. However, a new law allows police to use CCTV algorithms to pick up anomalies such as crowd rushes, fights or unattended bags, while outlawing the use of LFR to trace "suspicious" individuals. Introducing technology like this could be a stepping stone to getting the public to trust LFR.