Court said NO to Facial Recognition Technology
FIRST EVER DECISION OF A FRENCH COURT APPLYING GDPR TO FACIAL RECOGNITION
In a ruling of 27 February 2020, a French court cancelled a decision by the South-East region of France (Provence-Alpes-Côte d’Azur – PACA) to undertake a series of tests using facial recognition at the entrance of two high schools, finding that this would be illegal. This is a widely noted decision: the first by a French court applying the General Data Protection Regulation (GDPR) to facial recognition technologies (FRTs).
It is a great win for La Quadrature du Net, the Ligue des droits de l’Homme, the parents’ association of state schools of the Alpes-Maritimes and the CGT Educ’Action union of the Alpes-Maritimes.
Following previous recommendations by the CNIL, the administrative court ruled that the processing of biometric data is subject to strict conditions.
The Provence-Alpes-Côte d’Azur region had signed a tripartite agreement with the schools and Cisco International Limited to trial access control, facial recognition and tracking in the two high schools.
Facial recognition involves the processing of biometric data, a special category of data under the General Data Protection Regulation. As such, its processing is in principle prohibited.
Article 9 lists the conditions for processing special category data:
(a) Explicit consent
(b) Employment, social security and social protection (if authorised by law)
(c) Vital interests
(d) Not-for-profit bodies
(e) Made public by the data subject
(f) Legal claims or judicial acts
(g) Reasons of substantial public interest (with a basis in law)
(h) Health or social care (with a basis in law)
(i) Public health (with a basis in law)
(j) Archiving, research and statistics (with a basis in law)
The region had based the processing on the students’ consent and, for minors, their parents’ consent, collected on a signed form. The court objected to the legality of the project on the basis of the imbalance of power.
Article 9(2)(a) permits the processing of special category data where:
“the data subject has given explicit consent to the processing of those personal data for one or more specified purposes”.
In order to be valid, explicit consent must be freely given, specific, affirmative (opt-in) and unambiguous, and able to be withdrawn at any time. In practice, the extra requirements for consent to be ‘explicit’ are likely to be:
- explicit consent must be confirmed in a clear statement (whether oral or written), rather than by any other type of affirmative action;
- it must specify the nature of the special category data; and
- it should be separate from any other consents.
It must be ensured that data subjects have a genuine choice over whether and how their data are used.
In particular, where consent is required as a condition of access to services, or wherever there is a position of power over the individual, such as that of a public authority or an employer, consent cannot be freely given and is deemed invalid.
Where a local authority requires consent from students, without offering any alternative and against a clear imbalance of power, it is no surprise that the administrative court invalidated the facial recognition monitoring in schools.
It is unreasonable to believe that ALL students, present and future, would freely consent to having their biometrics processed with no alternative offered. Moreover, the system would concern all staff and visitors, including minors.
Having found an imbalance of power, the court went on to address the lack of proportionality, ruling that the security aim could be achieved in a less intrusive manner.
Following the CNIL’s opinion, the court ruled that facial recognition is a disproportionate measure for managing access to a school, especially since alternative measures that are far less detrimental to rights, such as student badges, exist to achieve the same result.
On this subject, another case will be heard on Monday, 2 March 2020 at the Marseille administrative court: an appeal filed by the Ligue des Droits Humains against automated video surveillance, which identifies, among other things, “suspicious behaviour” of individuals on the public highway.
We have previously warned about the risks of biometric data processing in surveillance and the inherent security issues of a data-hungry technology. Every database creates a risk of data leaks, and Clearview’s database has since suffered a major breach:
‘Clearview AI Faces California, Illinois Lawsuit After Breach’ (1). The question remains how to assess monetary compensation for the harm caused by the loss of control over something as unique as biometrics.
‘Clearview AI, a controversial facial recognition app being used by US law enforcement to identify suspects and other people, is facing another lawsuit. The new suit, filed Thursday, seeks class-action status and $5 million in damages for what it calls willful, reckless or negligent violations of biometrics laws in Illinois by Clearview and CDW,’ reports CNET.
‘New York facial recognition startup Clearview AI – which has amassed a huge database of more than three billion images scraped from employment sites, news sites, educational sites, and social networks including Facebook, YouTube, Twitter, Instagram and Venmo – is being sued in a potential class action lawsuit that claims the company gobbled up photos out of “pure greed” to sell to law enforcement.‘ writes Lisa Vaas in Naked Security.
Facial recognition is spreading faster than you realise.
Anna Merlan writes ‘Here’s the File Clearview AI Has Been Keeping on Me, and Probably on You Too’.
“The first ever independent evaluation into the Metropolitan Police’s use of facial recognition technology suggests it is 81% inaccurate and could be illegal.” – Sky News
Concerning the processing of biometrics, Susan Brown, Executive Chairwoman and Founder of Zortrex, wrote:
“Time to wake up! Data Pipeline – Personalisation Marketing/advertising – taking our identities, the bad actors see this open air gap and taking over the data output and input. Biometrics is not security, your actually giving away your DNA! If governments companies do not start securing identities now, going to be disastrous. You cannot replace your DNA or your Identity. Every day, mass surveillance, greed, humans are the product, if we do not start securing the future children now and their education and their financial banking, their medical history then everyone may just give up while the world comes to an end. “
Finally, the UK ICO gives an example in its guidance:
A gym introduces a facial recognition system to allow members access to the facilities. It requires all members to agree to facial recognition as a condition of entry – there is no other way to access the gym. This is not valid consent as the members are not being given a real choice – if they do not consent, they cannot access the gym. Although facial recognition might have some security and convenience benefits, it is not objectively necessary in order to provide access to gym facilities, so consent is not freely given.
However, if the gym provides an alternative, such as a choice between access via facial recognition and access via a membership card, consent could be considered freely given. The gym could rely on explicit consent for processing the biometric facial scans of the members who indicate that they prefer that option.
Despite that, UK ICO Approves Unconsented Facial Recognition At Security Conferences
This Filter Makes Your Photos Invisible to Facial Recognition.
The European regulatory corpus applicable to facial recognition is composed of:
- 2012: Article 29 Working Party, Opinion 3/2012 on developments in biometric technologies
- 2016: Directive (EU) 2016/680, the Law Enforcement Directive
- 2016: Regulation (EU) 2016/679, the General Data Protection Regulation (GDPR)
- 2019: EU Agency for Fundamental Rights (FRA), Facial recognition technology: fundamental rights considerations in the context of law enforcement
- 2020: European Commission (EC), White Paper on Artificial Intelligence