In case C-634/21, the CJEU provided an innovative interpretation of “automated decision” under Art. 22 GDPR in relation to algorithmic profiling by credit rating agencies.
SCHUFA is an agency that provides credit information to banks and other institutions through an AI-based risk scoring system. A German citizen, whose loan application had been refused on the basis of information provided by SCHUFA, asked the agency to disclose how her credit risk score was determined. SCHUFA, however, objected that this information was covered by trade secrecy, and gave the applicant only general information about the data used in the algorithmic profiling, without any details on the variables that influenced her individual score.
The applicant therefore lodged a complaint with the German DPA, which rejected it. She then brought an action against that decision before the competent German administrative court, which referred questions to the CJEU for a preliminary ruling on the application of Art. 22 GDPR to the case at hand.
The CJEU concluded that, under certain circumstances, credit risk scoring by an agency falls within the scope of Art. 22 GDPR, even though the agency and the lender are separate entities that intervene at different stages of the credit-granting process.
In fact, the Court argues, credit risk scoring qualifies as an “automated decision” under Art. 22 GDPR whenever it produces a “legal effect”, or otherwise an effect of “equivalent or similarly significant impact”, on the individual addressed by the decision. The existence of such an “effect” depends mainly on the behaviour of lenders, i.e., on the weight attributed to algorithmic risk scores during negotiations.
Accordingly, the CJEU referred the case back to the (German) national court to establish whether i) in the case at hand, risk scores played a determining role in the refusal of credit; if so, whether ii) credit risk scoring qualifies as a lawful exception to the general prohibition on relying on automated decisions; and, if so, whether iii) SCHUFA, as a credit scoring agency, complies with the requirements set by the GDPR for (lawful) automated decisions.
This holding by the CJEU forces data providers to comply with data protection law not only when ADM (automated decision-making) is the sole component of a decision-making process, but also in every other case where ADM plays a determining, albeit not exclusive, role in the choices of the data user. As a result of this interpretation, several data providers might be required to disclose meaningful information about the logic involved in algorithmic risk assessment, paving the way for a general legal requirement of AI explainability in credit risk scoring and elsewhere.
Naturally, the applicability of Art. 22 GDPR to the case at hand ultimately depends on the role that, according to the national court, risk scoring played in the refusal of credit.
Furthermore, the CJEU left it to the German administrative court to establish whether SCHUFA complies with the requirements set by the GDPR relating to individual explanations, human intervention in the process, risk management and documentation. Accordingly, the German court might exempt SCHUFA from some of these requirements by arguing that the obligations related to risk management and explainability should be subject to considerations of technical feasibility and trade secrecy.
SCHUFA was also the subject of a further ruling (Joined Cases C-26/22 and C-64/22) concerning the storage of insolvency data, for the purpose of credit risk scoring, for longer than such data are retained in public registers.
The CJEU ruled that storing such data beyond the retention period of the public registers violates the GDPR, whereas storing the same data for a period equal to that of the public registers is subject to a proportionality assessment by the (German) national court.