On 11 August 2020, the Court of Appeal handed down its judgment on the appeal against the High Court’s decision upholding South Wales Police’s (‘SWP’) use of CCTV facial recognition. We wrote about the High Court’s judgment in September last year, which can be viewed here.

As a quick recap of the case, SWP used automated facial recognition (‘AFR’) software to check faces captured on a live CCTV feed against faces on a ‘watchlist’ in real time. If no match was found, the image taken from the CCTV feed was deleted automatically once the comparison was made; if a match was found, the matching images were reviewed by an AFR operator. Edward Bridges, assisted by the human rights organisation Liberty, claimed that SWP’s use of AFR was compatible neither with human rights law nor with data protection legislation, and brought a claim for judicial review of SWP’s decision to deploy AFR. The High Court dismissed the claim.

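For readers less familiar with how such systems operate, the match-or-delete workflow the courts examined can be pictured with a short, purely illustrative Python sketch. Everything in it — the Face type, the cosine-similarity metric, the 0.80 threshold and the helper names — is an assumption made for illustration; it does not describe the vendor software SWP actually deployed.

```python
from dataclasses import dataclass


@dataclass
class Face:
    embedding: list[float]  # biometric template extracted from the CCTV feed
    source_image: bytes     # the cropped face image it was derived from


MATCH_THRESHOLD = 0.80  # assumed similarity cut-off, for illustration only


def similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embeddings (a stand-in metric)."""
    dot = sum(x * y for x, y in zip(a, b))
    norms = (sum(x * x for x in a) ** 0.5) * (sum(x * x for x in b) ** 0.5)
    return dot / norms if norms else 0.0


def refer_to_operator(face: Face, matches: list[tuple[str, float]]) -> None:
    """Stand-in for the human review step: an AFR operator checks the match."""
    print(f"Operator review required for: {[person for person, _ in matches]}")


def process_face(face: Face, watchlist: dict[str, list[float]]) -> None:
    """Compare one detected face against the watchlist in real time."""
    matches = [(person, s) for person, ref in watchlist.items()
               if (s := similarity(face.embedding, ref)) >= MATCH_THRESHOLD]
    if matches:
        refer_to_operator(face, matches)  # possible match: human review
    else:
        # No match: drop the biometric data straight away, mirroring the
        # automatic deletion described in the judgment.
        del face


# Hypothetical usage: an identical embedding scores 1.0 and is flagged.
process_face(Face(embedding=[0.1, 0.9], source_image=b""),
             {"person_on_watchlist": [0.1, 0.9]})
```
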
The Court of Appeal decision

While the Court reaffirmed that the use of AFR was a proportionate interference with human rights, it overturned the High Court’s decision in several respects. In particular, the Court found that:

  • There was insufficient operational guidance for the software, including guidance as to where it could be used and who could be placed on a watchlist. In particular, SWP’s Standard Operating Procedure imposed no normative requirement as to where or when monitoring could take place, or on what grounds SWP could consider using AFR (e.g. on what basis it might believe that people on the watchlist would be present at a particular location);
  • SWP’s data protection impact assessment was deficient, particularly because it proceeded on the basis that data subjects’ rights under Article 8 of the European Convention on Human Rights were not engaged (whereas the Court found that they were), and so the risks to the rights and freedoms of data subjects were not adequately assessed; and
  • SWP did not take reasonable steps to find out whether the AFR software had a racial or gender bias, particularly as it was a novel and controversial technology. Notably, however, the Court did not find that the software had such a bias, nor did it assess the extent to which any such bias might be present (one way such testing might begin is sketched after this list).

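On the bias point, the sketch below illustrates one way such testing might begin: comparing false-match rates across demographic groups on a labelled evaluation set. The data format, group labels and threshold are assumptions made for illustration only; the judgment does not prescribe any particular testing method, and this is not how SWP’s software was in fact assessed.

```python
from collections import defaultdict


def false_match_rates(trials, threshold=0.80):
    """trials: iterable of (group, score, is_same_person) tuples, where
    `score` is the similarity the system assigned to a pair of faces.
    Returns the false-match rate per demographic group."""
    attempts = defaultdict(int)
    false_matches = defaultdict(int)
    for group, score, is_same_person in trials:
        if not is_same_person:          # impostor comparison
            attempts[group] += 1
            if score >= threshold:      # system wrongly declares a match
                false_matches[group] += 1
    return {g: false_matches[g] / attempts[g] for g in attempts if attempts[g]}


# Hypothetical data: materially different rates between groups would
# warrant further investigation before deployment.
rates = false_match_rates([
    ("group_a", 0.91, False), ("group_a", 0.42, False),
    ("group_b", 0.55, False), ("group_b", 0.61, False),
])
print(rates)  # {'group_a': 0.5, 'group_b': 0.0}
```
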
The Court granted declaratory relief to reflect the points above. It is understood that SWP does not intend to appeal the judgment.

Comment

This judgment follows the spirit of the ICO’s comments made shortly after the High Court case (which can be read here, with the full opinion here), in which the ICO urged police forces to ‘slow down’ and ‘justify [AFR’s] use’. The decision was also welcomed by the Surveillance Camera Commissioner, whose statement may be read here.
This case demonstrates the importance of assessing the risks of implementing and using novel technologies that process biometric information, even where the biometric data is held only temporarily. Organisations hoping to use similar technologies should draft a watertight operational policy governing the technology’s use and conduct a thorough data protection impact assessment.