High Court judges have ruled that the use of automated facial recognition (AFR) by police is lawful, in a landmark decision that privacy campaigners fear could set a precedent for the proliferation of biometric mass surveillance.

The world's first case against AFR was brought by human rights group Liberty on behalf of Ed Bridges, a Cardiff resident whose face was scanned both at an anti-arms protest and while doing his Christmas shopping during a trial of AFR by South Wales Police.

Judges ruled that South Wales Police's use of AFR was consistent with the requirements of the Human Rights Act and existing data protection legislation. They found that the current legal regime was adequate for the deployment, although they added that it would have to be subject to periodic review.

Proponents of the technology argue that it can improve the quality and efficiency of policing while cutting costs. Jason Tooley, chief revenue officer at biometric authentication company Veridium and board member of techUK, called the decision "a victory for technology innovation".

"As police forces recognise that biometrics can drive improved policing, there is evidently a need to focus on how the technology can be implemented quickly by officers whilst gaining widespread public acceptance," he said.

"The use of biometrics has been proven to greatly enhance identity verification at scale, as seen in many countries where officers currently use consumer technology to verify suspects on-demand."

Liberty disagreed, arguing that AFR is authoritarian, discriminatory and breaches human rights.

"Facial recognition is a highly intrusive surveillance technology that allows the police to monitor and track us all," said Liberty lawyer Megan Goulding. "It is time that the government recognised the danger this dystopian technology presents to our democratic values and banned its use. Facial recognition has no place on our streets."

Public opinion on AFR is more mixed. In a survey of 4,109 adults undertaken by the Ada Lovelace Institute and YouGov, 55 percent of respondents said that the government should limit police use of facial recognition to specific circumstances, while 46 percent wanted the right to opt-out of its use.

Whether those demands are met remains to be seen.

What are the implications of the case?

The High Court decision may be a landmark, but it is not necessarily the end of the journey for opponents of AFR use by police. An appeal against the decision is expected and the Information Commissioner's Office has announced that it will be reviewing the judgment carefully.

Whatever the outcome of the appeal and the ICO's review, the decision does not amount to blanket approval for use of the technology by law enforcement agencies.

The judges found that the use of AFR by South Wales Police was lawful because, on each occasion the technology was used, it was deployed for a limited time and for specific and limited purposes.

They added that members of the public could not be identified by name and that any personal data collected would be deleted immediately after it had been processed – unless the image was matched to a person on the police watchlist. Police forces that use the technology differently may find their deployments deemed unlawful.

It is also worth noting that the rules that regulate police use of AFR are not the same as those that govern the private sector, although the lines between the two can be more blurry than they first appear, as the controversial deployment in London's King's Cross Central development illustrated. 

On the same day that the court in Cardiff made its decision, London Mayor Sadiq Khan revealed that the Metropolitan Police had admitted sharing images related to facial recognition with the King's Cross developer, despite initially denying its involvement with the deployment.

Together, the two incidents have brought AFR sharply into public focus.

What next for AFR?

Professor Peter Fussey and Dr Daragh Murray, members of the ESRC Human Rights, Big Data & Technology Project based at the University of Essex Human Rights Centre, expect the ruling to provoke significant debate and perhaps legal challenges.

In a Twitter thread, they noted that the court's "conclusion regarding this being sufficient protection against arbitrariness will likely be contested" as it "largely rests on intrusive non-covert classification". They also raised doubts over the claim that the deployment was in accordance with the right to privacy.

Bridges confirmed that he would be appealing the decision.

"Obviously I'm disappointed by elements of today's judgement, but the fight doesn't end here," he
tweeted. "This is not an issue that will disappear, and recent months have underlined that there are serious
flaws, both technical and ethical, with automatic facial recognition.

"This sinister technology undermines our privacy and I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance. As such, I will be appealing today's judgment."

The progress of his appeal will be followed closely – by opponents and supporters of AFR alike.