AI safeguards needed against mass surveillance

The European Parliament’s Civil Liberties Committee has said that the use of artificial intelligence in law enforcement and the judiciary should be subject to strong safeguards and human oversight.

MEPs highlight in a new report the need for democratic guarantees and accountability for the use of artificial intelligence in law enforcement.

MEPs worry that the use of AI systems in policing could lead to mass surveillance, breaching the key EU principles of proportionality and necessity. The committee warns that otherwise legal AI applications could be repurposed for this kind of surveillance.

The draft resolution highlights the potential for bias and discrimination in the algorithms on which AI and machine-learning systems are based. As a system’s results depend on the quality of its inputs, algorithmic bias must be taken into account.

Addressing specific techniques available to the police and the judiciary, the committee notes that AI should not be used to predict behaviour based on past actions or group characteristics. On facial recognition, MEPs note that different systems have different implications, and they demand a permanent ban on the use of biometric features such as gait, fingerprints, DNA or voice to recognise people in publicly accessible spaces.

The use of biometric data for remote identification is of particular concern to MEPs. They describe automated recognition border-control gates and the iBorderCtrl project as problematic, say both should be discontinued, and urge the Commission to open infringement procedures against member states if necessary.

Rapporteur Petar Vitanov said: “The use of AI is growing exponentially, and things that we thought possible only in sci-fi books and movies – predictive policing, mass surveillance using biometric data – are a reality in some countries. I am satisfied that the majority of the Civil Liberties Committee recognises the inherent danger of such practices for our democracy. Technical progress should never come at the expense of people’s fundamental rights.”
