Feature

Ensuring legitimacy and accountability

Many police forces and other security services have been using facial recognition technology for some years now. However, it remains a hotly debated topic, raising privacy concerns and struggling to win over the public.

In February, Professor Fraser Sampson, the biometrics and surveillance camera commissioner, published his annual report in which he mentioned facial recognition technology.

The Commissioner is responsible for overseeing police use of DNA and fingerprints in England, Wales and Northern Ireland, and for encouraging the proper use of public space surveillance cameras.

The report, which was submitted to the Home Secretary in November, sets out Professor Sampson’s findings in relation to his statutory responsibilities, and other observations about the use of biometrics and overt surveillance. Among other topics, it also covers facial recognition technology.

Professor Sampson said: “The areas of biometrics and surveillance are becoming both increasingly important and increasingly inter-related. In recent years we have seen an explosion of surveillance technology in the public and private realms, with devices such as drones and body worn video, dashcams and smart doorbells. At the same time, there have been enormous advances in the power of AI to exploit the vast amount of surveillance data now being produced.

“I believe that many of the issues raised in my report show that we urgently need to wake up to the opportunities presented, and the threats posed by, the explosion of capability in AI-driven biometric surveillance. If we fail, we risk missing out on the potential benefits it can offer and exposing ourselves to the potential dangers it poses.

“Now more than ever, we need a clear, comprehensive and coherent framework to ensure proper regulation and accountability in these crucial areas.”

Legislation
Sampson notes that the police are using biometric surveillance technology such as facial recognition, though there remains uncertainty around the regulatory framework for ensuring legitimacy and accountability when they do.

He outlines the two sides of the debate: “Biometric surveillance technologies can undoubtedly be intrusive to privacy and raise other human rights considerations, but there is no question that they can also be powerful weapons in the fight against serious crime and safeguard other fundamental rights such as the right to life and freedom from degrading or inhumane treatment.” This debate is likely to continue for some time, with both sides arguing their case and competing for public support.

Sampson argues that public tolerance of facial recognition will depend on whether people believe measures are in place to ensure that the technology is used lawfully and responsibly.

Parliament is considering legislation for reform and Sampson points out the need to address questions surrounding the legitimate role for new technology such as facial recognition in biometric surveillance by the police and law enforcement: “The ramifications of AI-driven facial recognition in policing and law enforcement are [ … ] profound enough to be taken seriously and close enough to require our immediate attention.”

Concerns
The revised Surveillance Camera Code of Practice was approved by Parliament in January 2022 and addresses the use of public space surveillance, including facial recognition technology, by the police and local authorities. Sampson has advised that the code could usefully be adopted across government departments to address some of the concerns about surveillance companies and their practices.

In his report, Sampson points out some concerns that have been raised around the use of facial recognition technology, including the potential for racial and gender bias; accuracy of the technology; a need for greater transparency and governance in the use of LFR; accuracy of reporting of false positives in the media; proportionality arguments particularly with reference to the rate of ‘success’ compared to the number of faces scanned; and the legal basis for deployment of the technology together with the need for independent authorisation.

There is also concern that the technology could be hacked for nefarious purposes.
There has also been much discussion about where the technology comes from. China is the world’s leading exporter of the technology, and there is concern that foreign governments may have access to the data generated by the technology that is exported.

A recent survey by the commissioner found that, of the 39 police forces that responded, 18 say their external camera systems, and at least 24 say their internal camera systems, use equipment about which there have been security or ethical concerns (including Dahua, Hikvision, Honeywell, Huawei and Nuuo).

Sampson said: “It is abundantly clear from this detailed analysis of the survey results that the police estate in the UK is shot through with Chinese surveillance cameras. It is also clear that the forces deploying this equipment are generally aware that there are security and ethical concerns about the companies that supply their kit.

“There has been a lot in the news in recent days about how concerned we should be about Chinese spy balloons 60,000 feet up in the sky. I do not understand why we are not at least as concerned about the Chinese cameras 6 feet above our head in the street and elsewhere.

“Parliament has already acted to curtail the use of equipment made by several Chinese manufacturers from some areas of public life where security is key. Myself and others have been saying for some time that we should, both for security and ethical reasons, really be asking ourselves whether it is ever appropriate for public bodies to use equipment made by companies with such serious questions hanging over them.”

Criticisms
There are several well-publicised cases in which the use of facial recognition technology has been criticised. For example, speaking to the BBC, Clearview CEO Hoan Ton-That revealed that the company has run nearly a million searches for US police. He also revealed that Clearview now holds 30bn images scraped, without users’ permission, from platforms such as Facebook. The company has been fined several times in Europe and Australia for breaches of privacy. The technology allows a law enforcement customer to upload a photo of a face and then find matches in a database of billions of images the company has collected.

The company is banned from selling its services to most US companies, after being taken to court in Illinois for breaking privacy law. However, this ban does not apply to the police.

Facial recognition technology is also being used by some governments to curb dissent and target protesters. A Reuters review of more than 2,000 court cases in Russia has revealed how the technology is being used to identify opponents of the regime.

In September, the Iranian government announced that it was planning to use facial recognition technology on public transport to identify women who are not complying with laws on wearing the hijab.

This month, it was announced that the Iranian government had started to install cameras to identify women not wearing the hijab.

In Scotland, a council has been criticised by data watchdog the Information Commissioner’s Office (ICO) for using facial recognition technology in nine schools.

Successful use cases
On the other hand, there are examples of where the technology has been used to apprehend criminals.
In South Africa, six men were arrested for a series of heists after being identified through facial recognition technology. The suspects were found after facial recognition analysis was carried out on the CCTV footage from the stores they robbed.

This example adds to the argument that facial recognition technology can help keep the public safe.
In Australia, a recent survey has found that 72 per cent of the 4000 people asked want more facial recognition at airports to speed up the customs process. Adam Schwab, CEO and co-founder of Luxury Escapes, which carried out the research, told news.com.au: “Facial recognition technology is just one of many ways Australian travellers, and the travel industry, continue to look for ways to make travel safer, more efficient and less stressful for all.”

In airports, use of facial recognition technology is twice as fast as fingerprint scanning, and is also not subject to passenger error.

Facial recognition technology can be used to search for missing people. In 2020, Indian police used a facial recognition app to reunite thousands of missing and trafficked children with their families. Thousands of children go missing every year and many are trafficked to work in eateries, handicraft industries, brick kilns, factories or into begging and brothels. The technology was used to reunite more than 1500 children with their families.

It has even been used in casinos to bar entry to gamblers who have requested to be excluded.

Justification
Sampson argues that for the use of facial recognition technology to be justified, it needs to be proportionate: that is, we need to know how many people have been arrested as a result of its use, compared with how many people have been scanned.

The National Physical Laboratory recently published independent research into the Met’s deployment of facial recognition.

The study, which was entitled ‘Facial Recognition Technology in Law Enforcement’, tested the accuracy, in operational conditions, of the algorithm used by the Met in terms of different demographics.
The research found that there are settings at which the algorithm can be operated where there is no statistically significant difference in performance across demographics.

It was also found that, when used at a threshold setting of 0.6 or above, the correct match rate (True Positive Identification Rate) was 89 per cent, and the incorrect match rate (False Positive Identification Rate) was 0.017 per cent. The chance of a false match is therefore just 1 in 6,000 people walking past the camera.
When used at a threshold setting of 0.6 or above, any differences in matches across groups were not statistically significant, meaning performance was the same across race and gender.
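The relationship between these two figures can be checked with simple arithmetic: a False Positive Identification Rate of 0.017 per cent works out at roughly one false match per 6,000 faces scanned. The sketch below is illustrative only; the variable names are assumptions, not code from the NPL study.

```python
# Illustrative conversion of the NPL-reported rates; not the actual study code.

def rate_to_one_in_n(rate_percent: float) -> int:
    """Convert a percentage rate to an approximate '1 in N' figure."""
    return round(100 / rate_percent)

tpir = 89.0    # True Positive Identification Rate, per cent (threshold >= 0.6)
fpir = 0.017   # False Positive Identification Rate, per cent

print(f"Correct matches: {tpir} per cent")
print(f"False match odds: about 1 in {rate_to_one_in_n(fpir):,}")
# 100 / 0.017 ≈ 5,882, which the report rounds to "1 in 6,000"
```

This is why a seemingly tiny percentage still matters operationally: at a busy deployment scanning tens of thousands of passers-by, even a 0.017 per cent rate produces several false matches, each of which the Met says is manually reviewed by an officer.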

With regards to Retrospective Facial Recognition, the true positive identification rate for high quality images was 100 per cent.

The Met says it will use “Facial Recognition Technology as a first, but significant, step towards precise community-based crime fighting.”

According to the Met: “Live Facial Recognition (LFR) enables us to be more focussed in our approach to tackle crime, including robbery and violence against women and girls.”

Lindsey Chiswick, director of intelligence for the Met said: “Live Facial Recognition technology is a precise community crime fighting tool. Led by intelligence, we place our effort where it is likely to have the greatest effect. It enables us to be more focused in our approach to tackle crime, including robbery and violence against women and girls.

“This is a significant report for policing, as it is the first time we have had independent scientific evidence to advise us on the accuracy and any demographic differences of our Facial Recognition Technology.

“We commissioned the work so we could get a better understanding of our facial recognition technology, and this scientific analysis has given us a greater insight into its performance for future deployments.
“We know that at the setting we have been using it, the performance is the same across race and gender and the chance of a false match is just 1 in 6000 people who pass the camera. All matches are manually reviewed by an officer. If the officer thinks it is a match, a conversation will follow to check.

“The study was large enough to ensure any demographic differences would be seen. However, he has also been able to extrapolate these figures to reflect results more representative of watch list size for previous LFR deployments.”

While there are some success stories, it is clear there is still a long way to go to gain the public’s trust in the use of facial recognition technology. However, there are ways to do this: openness, legislation, justification and careful management of the technology. Any use needs to be lawful, justified and responsible.
