French family welfare scoring algorithm challenged by La Quadrature du Net


The French advocacy group La Quadrature du Net (“LQDN”) has challenged the use of a scoring algorithm applied by the French welfare system to certain recipients of family benefits. The case, which will be heard by the Conseil d’État (France’s highest administrative court), raises interesting arguments about algorithmic processing in public administration and the issue of algorithmic discrimination. The challenge is directed against the Caisse Nationale des Allocations Familiales (the National Family Allowance Fund, “CNAF”). CNAF employs an algorithm to identify the individuals most likely to have received an undue payment, that is, a payment they should not have received. These individuals are then checked by the local Caisses d’Allocations Familiales (CAF).

This piece provides an account of LQDN’s legal submission to the Conseil d’État challenging the continued use of this algorithm. It first explains how the algorithm works in practice in the French welfare system, then analyses the two legal arguments put forward by LQDN.

How does the scoring algorithm work?

The algorithm assigns a risk score to each benefit recipient. This score corresponds to the alleged probability that the person has received an undue payment, i.e. a benefit from the French welfare state that they should not have received. The higher the score, the greater the likelihood that a recipient is subjected to checks; once a certain score is reached, the recipient’s file is sent to CAF inspectors. The stated purpose of the processing is not fraud detection but simply the detection of situations in which an undue payment has been made.

LQDN notes that CNAF has persistently refused to disclose the version of the code it currently uses to assign risk scores. LQDN has access to several earlier models and their variables, dating back to 2010. The three risk scoring models disclosed by CNAF are constructed using logistic regression[1]. This machine learning technique combs through historical data in order to learn which characteristics are most predictive of a target outcome (e.g. whether or not a payment was made in error)[2].
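Neither the current model’s variables nor its coefficients are public, so the sketch below is only a rough illustration of the general technique. The variable names, training data and flagging threshold are invented for the example, and scikit-learn’s LogisticRegression stands in for whatever implementation CNAF actually uses.

```python
# Hypothetical sketch of logistic-regression risk scoring of the kind
# described above. Variables, data and threshold are illustrative
# assumptions, not CNAF's actual model.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row is one (hypothetical) recipient file; each column a binary
# variable of the kind reportedly used, e.g. recent change in family
# situation, single-parent household, receipt of a particular allowance.
X_train = np.array([
    [1, 1, 1],
    [0, 0, 0],
    [1, 0, 1],
    [0, 1, 1],
    [0, 0, 1],
    [1, 1, 0],
])
# Target learned from past checks: 1 = undue payment found, 0 = none.
y_train = np.array([1, 0, 0, 1, 0, 1])

model = LogisticRegression().fit(X_train, y_train)

# The "risk score" is the predicted probability of an undue payment.
new_files = np.array([[1, 1, 1], [0, 0, 0]])
scores = model.predict_proba(new_files)[:, 1]

THRESHOLD = 0.7  # hypothetical cut-off above which a file goes to inspectors
for score in scores:
    flagged = "sent for inspection" if score > THRESHOLD else "not flagged"
    print(f"risk score {score:.2f}: {flagged}")
```

The point the example makes concrete is the one LQDN’s submission turns on: once the coefficients have been learned, the score and the decision to send a file for inspection follow mechanically from the input variables, with no human evaluation in between.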

This processing affects 32.3 million people, including 13.5 million children, because it encompasses the personal data not only of beneficiaries but also of their relatives. A significant breadth of personal data is therefore used in this algorithmic processing.

Legal arguments:

The use of this practice by the French authorities has been challenged on several grounds, in particular under Article 22 GDPR and on the grounds of the direct and indirect discrimination against citizens that it allegedly produces.

Article 22 GDPR challenge:

Article 22[3] states that a data subject has the right not to be subject to a decision based solely on automated processing which produces legal effects concerning them or similarly significantly affects them. In its submission, LQDN considers the recent case of SCHUFA[4], which held that for Article 22 to apply there must be a ‘decision’, that this decision must be based exclusively on automated processing, including profiling, and that it must produce legal effects concerning the data subject or affect them ‘significantly in a similar way’.

LQDN argues that the processing engaged in here is exclusively automated: the score is calculated entirely automatically, through the strict application of variables without any human evaluation. On the question of human intervention, it states that the fact that the score is used to guide human checks carried out by CAF has no bearing on the exclusively automated nature of the score itself. It further argues that the three cumulative conditions set out in SCHUFA[5] are met in this case.

It should be noted that Article 22(2)[6] provides exceptions to this prohibition: where the decision is necessary for entering into, or performing, a contract between the data subject and the data controller; where it is authorised by Union or Member State law to which the controller is subject and which lays down suitable measures to safeguard the data subject’s rights, freedoms and legitimate interests; or where it is based on the data subject’s explicit consent. LQDN argues that none of these exceptions applies to this case.

Applying this law to the case at hand, LQDN says that consent was not obtained from the data subjects, that there was no contractual relationship with them, and that legitimate interest cannot constitute a legal basis for a public authority in the exercise of its tasks. It argues that the targeting of checks is not intended to safeguard the vital interests of the data subjects or of any other natural person, and that it does not constitute a mission carried out in the public interest.

Article 22[7] also prohibits decisions based on special categories of personal data as articulated in Article 9[8] of the GDPR, unless the decision is based on explicit consent or is necessary for reasons of public interest in the area of public health. LQDN argues that the algorithmic processing at issue involves sensitive data within the meaning of Article 9[9] and does not fall within either exception.

In sum, LQDN argues that the algorithmic processing constitutes profiling within the meaning of Article 22[10] and does not fall within any of the stated exceptions. This would make the practice carried out by CNAF illegal under EU law.

Issue of direct and indirect discrimination:

The issue of direct and indirect discrimination against individuals through the use of this algorithmic processing is also raised in the LQDN submission. Article 14[11] of the European Convention on Human Rights prohibits discrimination on the grounds of sex, race, colour, language, religion, political or other opinion, national or social origin, association with a national minority, property, birth or other status.

LQDN argues that the inclusion of some variables can result in indirect discrimination against beneficiaries. The concept of indirect discrimination is defined in Article 2 of Directive 2000/43/EC[12], and there is a two-stage test for identifying it. First, it must be established whether the apparently neutral criterion complained of puts a protected group at a particular disadvantage. Then it must be ascertained whether the criterion is objectively justified by a legitimate aim and whether the means of achieving that aim are appropriate and necessary. Furthermore, Recital 71 of the GDPR requires data controllers to take steps to prevent, inter alia, discriminatory effects on natural persons on the basis of racial or ethnic origin, political opinion, religion or beliefs, trade union membership, genetic or health status or sexual orientation[13].

The model’s variables include family situation, for instance whether an individual has had a change in their family situation (such as a divorce) in the last 18 months, as well as variables such as age and disability. LQDN argues that single-parent families are subjected to greater checks and that this constitutes indirect discrimination on the grounds of family status and gender. The vast majority of single-parent families are headed by a single mother (95% of this group are women), so variables that add to this group’s risk score indirectly discriminate against women.
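To make the alleged mechanism concrete, the following sketch uses entirely hypothetical coefficients to show how a positive weight on a facially neutral variable such as “single-parent household” raises the risk score of a group that is 95% female, even though gender itself is never an input.

```python
# Hypothetical illustration of how a facially neutral variable can produce
# a disparate impact. Coefficients and variables are invented for this sketch.
import math

def risk_score(features: dict[str, int],
               coefficients: dict[str, float],
               intercept: float) -> float:
    """Logistic regression score: sigmoid of the weighted sum of variables."""
    z = intercept + sum(coefficients[name] * value
                        for name, value in features.items())
    return 1 / (1 + math.exp(-z))

coefficients = {
    "recent_family_change": 0.9,  # e.g. divorce in the last 18 months
    "single_parent": 1.2,         # facially neutral, but ~95% of this group are women
    "disability_allowance": 0.6,
}
intercept = -2.0

two_parent = {"recent_family_change": 0, "single_parent": 0, "disability_allowance": 0}
single_mother = {"recent_family_change": 1, "single_parent": 1, "disability_allowance": 0}

print(f"two-parent household:    {risk_score(two_parent, coefficients, intercept):.2f}")
print(f"single-parent household: {risk_score(single_mother, coefficients, intercept):.2f}")
```

With these invented numbers the single-parent file scores roughly 0.52 against 0.12 for the two-parent file, so the additional checks fall disproportionately on women: the criterion is neutral on its face but not in effect, which is precisely the pattern the two-stage indirect discrimination test is designed to catch.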

Alongside these examples of indirect discrimination, LQDN also argues that CNAF engages in direct discrimination on the grounds of disability, age, family situation and gender, as these are determinative variables within the model.

Conclusion:

The use of such algorithms by public authorities is a paramount issue in democratic societies today. As more governments seek to cut welfare costs, new technologies facilitate new forms of “prediction” of citizen behaviour. This is often not a prediction of an individual’s behaviour derived from an individual examination; rather, it is the assignment of an individual to categories formed through machine learning, categories which are expected to produce certain outcomes. This “prediction” then determines the level of oversight a citizen is subject to when interacting with essential services. That has grave consequences for the dignity of the individual and undermines the notion of equality of persons in society. The legality of these practices is still to be determined: LQDN has invited the Conseil d’État to refer three questions to the Court of Justice of the European Union for a preliminary ruling on some of the issues raised in the case. Time will tell whether these practices are lawful.

As this new technological revolution dawns, the case also provokes reflection on a more consequential question: whether the safeguards for fundamental rights introduced in new legislative efforts such as the AI Act are robust enough to protect citizens.

References:

[1] Manon Romain et al, ‘How We Investigated France’s Mass Profiling Machine’ (Lighthouse Reports, 4 December 2023) <https://www.lighthousereports.com/methodology/how-we-investigated-frances-mass-profiling-machine/> accessed 29 November 2024

[2] Ibid.

[3] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA relevance) Article 22

[4] Case C-634/21 SCHUFA Holding (CJEU, 7 December 2023)

[5] Ibid.

[6] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA relevance) Article 22(2)

[7] Ibid.

[8] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA relevance) Article 9

[9] Ibid.

[10] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA relevance) Article 22

[11] European Convention on Human Rights, Article 14

[12] Council Directive 2000/43/EC of 29 June 2000 implementing the principle of equal treatment between persons irrespective of racial or ethnic origin, Article 2

[13] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA relevance) Recital 71
