In a decision (French only) dated 27 February 2020, the French Administrative Court of Marseille invalidated the deliberation of the Provence-Alpes-Côte d’Azur Regional Council authorizing the setting up, on an experimental basis, of a facial recognition mechanism in two high schools in order to (i) better control and speed up the entry of students into the high schools and (ii) control access to the premises by occasional visitors.
This decision is important as it is the first administrative court decision in France about facial recognition. Since the GDPR entered into force, it is also the first French administrative court decision relating to data protection that is not based on a deliberation issued by the French Data Protection Authority (CNIL), something that was already quite uncommon before the GDPR entered into force.
Facts and procedure
In October 2017, the President of the Provence-Alpes-Côte d’Azur Regional Council consulted the CNIL to request its assistance in setting up, on an experimental basis, a facial recognition system in two high schools in the South of France to be used at the entry and inside the premises to control access of students and visitors. Although the experiment had not been authorized by the CNIL, the Regional Council, in a deliberation (French only) dated 14 December 2018, decided to launch it. Expressly opposed to this measure, several French data protection and human and civil rights associations, including the French association “La Quadrature du Net“, filed an action for annulment of the Regional Council’s deliberation before the French Administrative Court of Marseille on 14 February 2019.
In the meantime, the Regional Council pursued its discussions with the CNIL and communicated to it the data protection impact assessment (DPIA) drafted for the facial recognition experiment. On 29 October 2019, the CNIL finally published on its website a press release (French only) in which it considered, based on the finalized version of the DPIA communicated by the Regional Council, that the experiment, which concerned students, most of whom were minors, with the sole aim of making access to their high schools more fluid and secure, was neither necessary nor proportionate to achieve the intended purposes.
In its decision (French only) of 27 February 2020, the French Administrative Court of Marseille took up most of the points raised by the CNIL in its press release and invalidated the decision of the Regional Council insofar as it (i) had not provided sufficient guarantees to obtain free and informed consent of students to the processing and (ii) did not demonstrate that the purpose of checking the entrances to the high schools could not be achieved by other, less intrusive means.
Key grounds of the decision
Although it would be practical for the high schools to implement “virtual gates” to facilitate the daily work of security guards, the Court considered that the proportionality criterion was not satisfied, as controls can be carried out by less intrusive means.
The Court thus applied the “less intrusive means” test often used by the CNIL to assess whether a given solution is proportionate. It found a lack of proportionality, stating that other, less intrusive means, such as badge-based access control or a CCTV system, could be more appropriate.
The path taken by the administrative court to find that consent was not validly collected is not surprising, as it reiterates the analytical framework usually applied to assess whether consent is freely given and informed: here, consent was collected through a simple form to be signed by high school students, who were in a subordinate position vis-à-vis their respective high schools’ directors.
It is, however, surprising that the Regional Council relied on consent rather than another legal basis, as consent is, in several respects, a legal basis that is at once fragile and demanding: numerous and binding criteria for obtaining valid consent, the possibility of withdrawal at any time, the risk that the controller may be unable to carry out the proposed processing operation if the data subject withholds consent, etc.
Processing of minors’ personal data
The Court remained silent on the particularity of the mechanism, namely the processing of biometric data of high school students, most of whom are minors.
This raises the more general question of the absence of any specific provision in the GDPR or the French Data Protection Act concerning the processing of sensitive biometric data of minors over the age of fifteen (the age of digital consent set by the French Data Protection Act, pursuant to Article 8 GDPR), even though minors are presumed, irrespective of age, to be so-called vulnerable persons within the meaning of the GDPR.
Key takeaways of the decision
Experimentation does not preclude privacy requirements
The decision highlights that the experimental nature of the facial recognition processing does not preclude the application of data protection principles, particularly as the processing here involved sensitive data in the form of biometric data and vulnerable persons such as minors.
Facial recognition, as innovative processing of biometric data, requires higher demonstration of compliance with privacy standards
When personal data is processed in innovative ways such as facial recognition, data protection authorities expect controllers to demonstrate a particularly high level of compliance with data protection principles, given how difficult it is to ensure compliance when deploying facial recognition tools.
The European Commission, for example, stated in its White Paper on artificial intelligence adopted on 19 February 2020 that processing, gathering and using biometric data for remote identification purposes “carries specific risks for fundamental rights” and must be “duly justified, proportionate and subject to adequate safeguards”.
Appropriate legal basis for facial recognition must be examined in detail
Apart from consent, which here did not satisfy the criteria of free and informed consent, the controller could have identified its legitimate interest as the legal basis. However, that might not have been valid either.
Indeed, the CNIL has already held that when biometric data processing is implemented for reasons of convenience, i.e. outside of security or public health imperatives, it cannot be imposed on the data subjects, notwithstanding the legitimate interest of the data controller (CNIL Deliberation No. 2018-012 of 18 January 2018, refusing the implementation of facial recognition processing for access control of members to the premises of a sport association).
Where no strong security reasons or societal issues justify the processing, the identification of the legal basis for the implementation of facial recognition is therefore a crucial point that requires a high level of attention.
As of today, it has not been confirmed whether the Regional Council has appealed the decision from the Court. It could still appeal as the time limit for appealing has been extended in France due to the current coronavirus pandemic.
Alexandra Antalis, a trainee in our Paris office, contributed to this entry.