London is estimated to have more CCTV cameras than any other global city outside of China, and recent revelations about the use of facial recognition technology across the capital suggest the level of surveillance is being turned up another notch.
There is a growing awareness of the use of facial recognition by UK law enforcement agencies, but its increasing prevalence in the private sector had largely escaped the public’s attention, until the press discovered a system had been quietly installed by a property developer in London’s King’s Cross.
Argent, the developer behind the King’s Cross project, has admitted using facial recognition to "ensure public safety" across the 67-acre, 50-building site, which thousands of people pass through every day.
The news sparked an outcry: privacy campaigners condemned the deployment, the Mayor of London requested more information from Argent, and finally the Information Commissioner’s Office (ICO) launched an investigation into the deployment.
But the controversy does not necessarily mean that Argent has broken the law, largely because the rules that govern the technology are so unclear.
There is no specific regulation governing the use of facial recognition technology. The UK’s biometrics commissioner has confirmed that it does not fall under his remit because current legislation only recognises DNA and fingerprints as biometrics. The technology does come under the jurisdiction of the Law Enforcement Facial Images and New Biometrics Oversight and Advisory Board, but this body only oversees the use of the technology by police.
Human rights laws could potentially be invoked if judges decide that the technology breaches privacy rules by capturing people’s biometric data without their knowledge or consent. Equality laws could similarly be called upon if the technology is disproportionately misidentifying women and ethnic minorities. However, no precedent for this has yet been set in the courts.
Tamara Quinn, a partner who specialises in data protection at international law firm Osborne Clarke, told Techworld that GDPR will provide the primary set of rules.
"Facial recognition biometrics will count as personal data and therefore any use of them is going to come within the remit of the GDPR or the Data Protection Act, and will be looked at by the Information Commissioner's Office," she said.
Under GDPR, a lawful basis is required for the processing of personal data, which includes any information related to an identifiable person.
Facial recognition faces a further barrier, because the processing enables the unique identification of that person. This designates biometric information as a “special category of personal data”, which is normally prohibited from processing without the subject’s consent.
Argent is yet to confirm whether this consent has been gained. A spokesperson for the King's Cross project told Techworld that the company is working collaboratively with the ICO on its inquiry and will comment further in due course, but it has provided no details on the legal basis for its use of the technology.
Quinn believes that compliance will be hard to demonstrate.
“You've got to provide those notices that were clogging our inboxes up a year or so ago, and you can imagine that's obviously a little more challenging if you are using a system that’s surveying public places, because how do you make sure that you're informing individuals?
“I'm a lawyer, so there are exceptions in some cases, for all these obligations, but they're not easy exceptions to get yourself in. Any companies using these systems have got to be very careful about what they're doing to make sure they can do it at all.”
Private sector preparations
GBG is one of the companies keeping a close eye on the ICO investigation. Gus Tomlinson, the identity data intelligence firm's head of strategy, hopes the outcome will help to clarify the rules.
"If I look at GDPR, and the whole use of data by the public or the private sector, what's important is that balance test: is using data and technology in this way in the best interest of an individual and, does that outweigh potential negative things like infringements on privacy?
“If there was that clarity, then it would be much easier for companies to be able to use this technology in the best way, and consumers and data subjects would feel more confident that their data is being used in their best interests," she said.
GBG uses facial recognition in its document authentication system, which verifies a person's identity by comparing an image of them to the photograph on their ID card, passport or driving licence.
Because the consumer willingly submits to that check, GBG would not face the same restrictions as Argent or any other company using the technology on unsuspecting members of the public.
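The 1:1 verification described here is conceptually simple: reduce each face image to a numerical embedding, then measure how similar the two embeddings are. The sketch below illustrates the idea only — the embedding values, threshold and function names are hypothetical, not details of GBG's actual system.

```python
# Hypothetical sketch of 1:1 face verification for document authentication:
# compare an embedding of a live photo against an embedding of the photo
# printed on the ID document. In practice the embeddings would come from a
# trained face-embedding model; here they are toy vectors for illustration.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(live_embedding, document_embedding, threshold=0.8):
    """Return True if the two face embeddings likely show the same person."""
    return cosine_similarity(live_embedding, document_embedding) >= threshold

# Toy embeddings standing in for model output
same_person = verify([0.9, 0.1, 0.4], [0.88, 0.12, 0.41])      # very similar vectors
different_person = verify([0.9, 0.1, 0.4], [0.1, 0.95, 0.2])   # dissimilar vectors
```

The threshold is the key design choice in such systems: set too low, it admits impostors; set too high, it rejects legitimate document holders — and, as discussed below, error rates can differ across demographic groups.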
Tomlinson advises other companies deploying facial recognition to be transparent about how their algorithms work and take accountability for any potential bias in their systems.
“Computers, just like human brains, learn by patterns and unfortunately, the patterns they learn depend on what data is fed into them, and that's where you can get some bias across gender and across race,” she said. “Technology companies need to take responsibility to make sure that doesn't exist."
London will remain a global leader in surveillance for the foreseeable future, but the outcome of the ICO investigation could stunt the growth of automated facial recognition, or at least give its proponents pause for thought.
"There's been a lot of publicity around this," said Tomlinson. "I think it's a bit of a wakeup call for companies to realise that it's a complicated area and they need to make sure that they're looking at all the laws so they can make sure that if they do choose to use it, that they can do so in a way that is compliant and is not going to attract the attention of journalists or regulators."