13 August 2020

Surveillance Society – Court of Appeal Puts Brakes on Police Scheme

Several police forces have been using facial recognition technology to detect suspects. Cameras placed in public places can scan thousands of faces as they pass by, matching those images with images on a database. The technology can assist in the apprehension of wanted offenders, and potentially track people as they go about their lawful business.

To many, if you have nothing to hide, you have nothing to fear; for others, this technology is a significant infringement of civil liberties and the start of a sinister world of state surveillance.

In a critical judgment, the Court of Appeal has provided clarification on the use of such technology.

What was the case about?

The appeal concerned the lawfulness of the use of automated facial recognition technology (“AFR”) in a pilot project by the South Wales Police Force (“SWP”). AFR is a new technology used to assess whether two facial images depict the same person. The specific type of AFR at issue, known as AFR Locate, works by extracting faces captured in a live feed from a camera and automatically comparing them to faces on a watchlist. If no match is detected, the software will automatically delete the facial image captured from the live feed. If a match is detected, the technology produces an alert and the person responsible for the technology, usually a police officer, will review the images to determine whether to make an intervention.

SWP deployed AFR Locate on about 50 occasions between May 2017 and April 2019 at a variety of public events. These deployments were overt, rather than secret. The watchlists used in deployments included persons wanted on warrants, persons who had escaped from custody, persons suspected of having committed crimes, persons who may be in need of protection, vulnerable persons, persons of possible interest to SWP for intelligence purposes, and persons whose presence at a particular event caused particular concern.

To date, SWP watchlists have comprised between 400 and 800 people, and the maximum capacity of a watchlist is 2,000 images. AFR Locate is capable of scanning 50 faces per second. Over the 50 deployments undertaken in 2017 and 2018, it is estimated that around 500,000 faces may have been scanned. The overwhelming majority of faces scanned will be of persons not on a watchlist, and those images will therefore be automatically deleted.

What did the court decide?

The court rejected several challenges (brought by Mr Bridges) but upheld the following:

(1) Although the legal framework comprised primary legislation (the Data Protection Act 2018, “DPA 2018”), secondary legislation (the Surveillance Camera Code of Practice), and local policies promulgated by SWP, there was no clear guidance on where AFR Locate could be used or who could be put on a watchlist. The Court held that this left too broad a discretion to individual police officers to meet the standard required by Article 8(2) of the European Convention on Human Rights.

(2) The High Court was wrong to hold that SWP provided an adequate “data protection impact assessment” (“DPIA”) as required by section 64 of the DPA 2018. The Court found that, as the DPIA was written on the basis that Article 8 was not infringed, the DPIA was deficient.

(3) The High Court was wrong to hold that SWP complied with the public sector equality duty (“PSED”). The Court held that the purpose of the PSED is to ensure that public authorities give thought to whether a policy will have a potentially discriminatory impact. SWP erred by not taking reasonable steps to enquire whether the AFR Locate software was biased on racial or sex grounds. The Court did note, however, that there was no clear evidence that the AFR Locate software was in fact biased on the grounds of race and/or sex.

Will the police appeal?

The police have confirmed that they do not intend to appeal the ruling.

Does this mean that facial recognition will not now be used?

The ruling does not ban the use of facial recognition technology, but it does strengthen the safeguards that must be in place.

In a statement, South Wales Police said:

‘We are pleased that the court has acknowledged that there was no evidence of bias or discrimination in our use of the technology. But questions of public confidence, fairness and transparency are vitally important, and the Court of Appeal is clear that further work is needed to ensure that there is no risk of us breaching our duties around equality. In 2019 we commissioned academic analysis of this question and although the current pandemic has disrupted its progress, this work has restarted and will inform our response to the Court of Appeal’s conclusions.’

Our work

We continue to monitor closely the use and lawfulness of emerging technologies, ensuring that, at all stages, our clients’ fundamental human rights are protected.

[Image credit: “Surveillance” by jonathan mcintosh is licensed under CC BY-SA 2.0]

How can we help?

If you need specialist advice, then get in touch with Oliver Gardner on 0161 872 9999 or by email: oliver.gardner@howardssolicitors.com and let us help. We can advise on a plea, defences and potential sentences in a wide range of circumstances.