UK – The Court of Appeal has ruled that South Wales Police’s use of facial recognition technology breaches privacy.
It follows a legal challenge from Cardiff resident Ed Bridges and civil rights group Liberty over the use of automated facial recognition (AFR) by the police force.
The technology, known as ‘AFR Locate’, extracts images of faces captured in a live camera feed and automatically compares them to faces on a watch list. The software automatically deletes the facial image captured if no match is detected.
In September 2019, the High Court ruled that South Wales Police’s use of the technology was not unlawful. Liberty challenged the ruling at the Court of Appeal in June 2020.
The court upheld three of the five grounds raised in the appeal. It held that there was no clear guidance on where ‘AFR Locate’ could be used and who could be placed on a watch list, that a data protection impact assessment (DPIA) was deficient, and that the police force had not investigated whether the technology exhibited racial or gender bias.
Liberty lawyer Megan Goulding said: “This judgment is a major victory in the fight against discriminatory and oppressive facial recognition.”
A spokesperson for the Information Commissioner’s Office (ICO) said: “We welcome the Court of Appeal’s judgment that provides clarification on the police use of live facial recognition technology in public places.
“Facial recognition relies on sensitive personal data, and balancing people’s right to privacy with the surveillance technology the police need to carry out their role effectively is challenging. But for the public to have trust and confidence in the police and their actions, there needs to be a clear legal framework.”
The ICO has previously called for the establishment of a statutory code of practice on live facial recognition technology in public.