The evolution of recent administrative case-law
Traditionally, Italian administrative courts (Tribunali Amministrativi Regionali – TAR) have been regarded as conservative institutions, showing a certain reluctance to interpret the law in light of technological innovation. This prejudice, however, has repeatedly proven to be unjustified, as recent decisions on algorithms and administrative procedures have shown. Conversely, one could question whether such “technological enthusiasm” properly addresses certain privacy issues that software-based decisions can raise, with particular regard to the right of individuals not to be subject to a decision based solely on automated processing[1].
Just two years ago, the administrative court of Rome, by decision no. 6606/2019[2], held that algorithm-based administrative decisions are to be deemed illegitimate per se. The case at stake concerned a mobility procedure for certain categories of civil servants (i.e. school managers), which had been entirely entrusted to an algorithm: the TAR held that this kind of administrative procedure cannot be delegated to software and that assessments of the personal situation of candidates must be carried out by humans, as algorithms are per se unable to comply with the principles of impartiality, effectiveness and transparency which must inspire public action. Significantly, no reference was made to Article 22 GDPR. However, this holding conflicted with other decisions already rendered in 2017 by the same section of the same Tribunal[3], which clearly admitted that algorithms can be part of administrative procedures.
As of today, the “anti-technological” approach of decision no. 6606/2019 can be considered definitively superseded. A few months after its issuance, indeed, the State Council (Consiglio di Stato) rendered three landmark decisions (no. 2270/2019[4], no. 8472/2019[5] and no. 8474/2019[6]), marking a U-turn with respect to the previous case-law.
In the aforementioned decisions, the State Council started from the assumption that a higher degree of digitization is crucial to improving the quality of the services rendered by the Public Administration. Automated decision-making was recognized as particularly beneficial with regard to serial and standardized procedures (ironically, reference was made to selection procedures set up to hire a high number of school managers, similar to those at issue in decision no. 6606/2019): as a consequence, the State Council held that algorithm-based administrative decisions must be welcomed, rather than regarded with suspicion, because they are in line with the general principles of efficiency and speed of administrative procedures, set forth by Law no. 241/1990 and by Article 97 of the Italian Constitution. In light of this, the State Council clarified that algorithms have a specific legal value whenever they actively contribute to shaping administrative action and must therefore be designed taking into account the general principles governing administrative procedures under Italian law (e.g. transparency, effectiveness, proportionality, rationality, non-discrimination); moreover, administrative courts shall have the right to vet them.
In addition, the highest administrative court clarified that, whenever a software house makes an algorithm available to a Public Administration for its institutional purposes, confidentiality undertakings cannot be used to oppose access requests filed by third parties, if said requests are grounded on the need to seek a remedy against a prejudice allegedly caused by the algorithm itself: indeed, whenever an algorithm becomes part of an administrative decision-making process, transparency requirements automatically apply and shall prevail over commercial agreements[7].
As regards algorithms and the accessibility of the related source code, reference shall be made to recent decisions no. 7370/2020[8] and no. 7526/2020[9] of the administrative court of Rome. In these cases too, the appellants were contesting the Ministry of Education’s refusal to disclose the source code of the algorithm used to manage the open competition launched for the selection of school managers. The appellants argued that the software which collected the exam answers was poorly engineered and thus failed to save all the inputs of the participants, who were consequently excluded from the competition: for this reason, they asked for the source code to be disclosed. In this context, the algorithm and the related source code were regarded by the administrative court as the tools by which the Ministry collected the answers to the tests, as such serving a public interest and falling within the scope of the “administrative documents” subject to freedom of information laws[10].
In the cases at stake, the source code was deemed ancillary to the carrying out of the competition, which in turn amounted to the exercise of an administrative function: for this reason, the source code was regarded as an administrative document falling within the scope of freedom of information laws. This is an important point to highlight, because the source code was deemed a public document given its use in a public procedure: this means that a request to disclose source code used in a procedure other than a public one would presumably not have the same chances of success.
Significantly, the developer of the source code tried to rebut the disclosure request by arguing that it would have jeopardized its ability to further monetize the algorithm at stake on the market. However, considering that access was requested with a view to starting judicial proceedings against the defendant, and given the material impact of the algorithm on the public procedure, the administrative court dismissed the argument, holding that, in such a scenario, transparency obligations must prevail over any other commercial consideration.
In light of the above, it is reasonable to say that, as regards the relationship between administrative decision-making processes and algorithms, the following principles are by now well-established from an administrative law standpoint:
- whenever used to make an administrative decision, algorithms and the related source code are to be considered administrative documents: as such, they (i) must be engineered in compliance with the general principles governing administrative procedures, (ii) must – upon request – be disclosed to interested third parties claiming to have suffered damage due to the inadequate design of the tool and (iii) can be vetted by administrative courts,
- IP rights cannot generically be leveraged to oppose the disclosure of the source code of algorithms grounding administrative decisions, because whenever administrative action is concerned, transparency shall prevail,
- the software house which designed the algorithm must be involved in the Public Administration’s decision on whether or not to disclose the source code, as well as in any subsequent litigation[11],
- it does not make sense to argue that the disclosure of a source code per se renders the related algorithm no longer usable by the Public Administration, given that, under Legislative Decree no. 82/2005, public bodies are required to give preference to open source software[12].
GDPR-related issues
Moving forward, while it now seems reasonable to state that the use of algorithms in the public decision-making process is generally admitted (at least as far as serial and standardized procedures are concerned), a number of new issues might arise in the interplay among various areas of the law.
In particular, some of the above mentioned administrative courts’ decisions highlight a possible intersection between administrative law principles and the GDPR, in light of the general right of individuals, as recognized by Article 22 GDPR, not to be subject to decisions based solely on automated processing – including profiling – producing legal effects on the individual or significantly affecting him or her.
In general terms, Article 22 GDPR has a number of implications, namely (see also the illustrative sketch after this list):
- automated decisions are allowed only if necessary for entering into or performing a contract, or following the data subject’s explicit consent, or if authorized by a law providing for measures to safeguard individuals’ rights and freedoms;
- in the first two cases above, data controllers shall implement suitable measures to safeguard individuals’ rights, including at least the right to obtain human intervention, to express their point of view and to contest the decision;
- data subjects have the right to be informed about the existence of such automated decisions as well as about the logic involved, according to Articles 13 and 14 GDPR (either when data is collected directly from him/her or when collected from third parties).
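Purely by way of illustration, the decision logic underlying Article 22 can be pictured as a simple validation routine. The Python snippet below is a minimal, non-authoritative sketch: the labels used for the lawful bases and safeguards, as well as the data structure, are assumptions made for the example, not terms drawn from the GDPR or from any official tool.

```python
from dataclasses import dataclass, field

# Hypothetical model of the Article 22 GDPR conditions listed above.
# The lawful-basis and safeguard labels are illustrative assumptions.
ALLOWED_BASES = {"contract", "explicit_consent", "authorized_by_law"}
REQUIRED_SAFEGUARDS = {"human_intervention", "express_point_of_view", "contest_decision"}

@dataclass
class AutomatedDecision:
    lawful_basis: str                      # e.g. "contract"
    safeguards: set = field(default_factory=set)
    notice_given: bool = False             # Articles 13/14 information duty

def article_22_check(decision: AutomatedDecision) -> list:
    """Return a list of (illustrative) compliance issues; empty means none found."""
    issues = []
    if decision.lawful_basis not in ALLOWED_BASES:
        issues.append("no valid lawful basis for a solely automated decision")
    # For contract and explicit consent, the controller must implement safeguards
    if decision.lawful_basis in {"contract", "explicit_consent"}:
        missing = REQUIRED_SAFEGUARDS - decision.safeguards
        if missing:
            issues.append(f"missing safeguards: {sorted(missing)}")
    if not decision.notice_given:
        issues.append("data subject not informed of the automated decision and its logic")
    return issues

# Example: a consent-based decision lacking the human-intervention safeguard
print(article_22_check(AutomatedDecision(
    lawful_basis="explicit_consent",
    safeguards={"express_point_of_view", "contest_decision"},
    notice_given=True,
)))
```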
The State Council expressly referred to Article 22 above and to the general principles laid down by the GDPR in two recent decisions (namely, decisions no. 8472/2019 and no. 8474/2019). In particular, the highest Italian administrative court recognized that the existence of this principle in the Italian legal system implies that any automated decision implemented by a Public Administration shall comply with the principles of transparency, human intervention and non-discrimination, as well as with the principle of imputability of the decision to the competent administrative body. In such a context, reference to Article 22 GDPR was made in order to (i) recognize that an algorithm can be used by the Public Administration insofar as the decision remains attributable to the competent body and (ii) ascertain that the use of the algorithm was not – per se – unlawful, the irregularities rather lying in the outcome of the automated decision-making process and in the lack of transparency as to the logic of the algorithm itself. This explanation, however, could prove insufficient to tackle a potential jurisdictional issue.
Indeed, shifting from a purely administrative law standpoint to a more general perspective, the implementation of automated administrative procedures seems to open up new avenues of legal action for individuals who have suffered damage from being subject to automated decisions. In this respect, whilst on the one hand an increase in claims whereby individuals challenge the adequacy of the algorithms grounding administrative decisions is to be expected – in terms of whether the algorithms themselves comply with the general principles governing administrative decisions under Italian law –, on the other hand the same automated processes could ignite a number of actions before civil courts, whereby individuals could claim to have suffered damage from a violation of the GDPR. Litigation before administrative and civil courts could overlap, and so could the resulting court decisions.
In other words, a data subject could not only challenge the automated decision before an administrative court on any question concerning both the possibility for the public entity to use an algorithm to take the decision and the logic underlying the functioning of the algorithm itself, but also either bring a claim before the Italian Data Protection Authority or start an action before an Italian civil court, in order to have violations of the GDPR rules on automated decisions and profiling by the public administration ascertained. Since the aims of the two litigations differ (i.e. obtaining (i) the annulment of the administrative decision from the administrative court and (ii) compensation for damages from the civil judge), it cannot be excluded that these actions may be brought simultaneously.
The extreme consequence of the above reasoning could be that any time a public entity carries out an automated decision-making activity involving the processing of personal data, it could trigger a multi-sided litigation risk, given the complex nature of the interests impacted and the interplay of the legal regimes in which they lie.
Moreover, procedural issues should also be considered. Indeed, going back to Article 22 GDPR, a public entity should be allowed to ground its action (having legal effect or other significant consequences for a data subject) on an algorithm only if the relevant personal data processing is based on consent, necessary for a contract or authorized by law. If none of these conditions is met, will the data subject be allowed to bring an action before the administrative courts, the civil courts, or both? The question is open and could possibly ignite a conflict between the two jurisdictions that could escalate up to the Supreme Court of Cassation (Corte di Cassazione).
In addition, public entities relying on automated means for hiring or mobility purposes (or any other purpose) should take all the steps required by the GDPR: among those outlined above, an information notice should be provided to data subjects and a proper impact assessment pursuant to Article 35 GDPR should be conducted. It is reasonable to hold that such additional steps shall become part of the administrative action, a development that requires new awareness of GDPR compliance on the part of public entities.
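As a purely illustrative companion to the sketch above, these additional steps could be folded into a pre-deployment checklist. The step names and structure below are assumptions made for the example, not requirements drawn verbatim from the GDPR.

```python
# Hypothetical pre-deployment checklist for a public entity relying on
# automated decision-making; step names are illustrative assumptions.
PRE_DEPLOYMENT_STEPS = {
    "information_notice": "notice to data subjects under Articles 13/14 GDPR",
    "dpia": "data protection impact assessment under Article 35 GDPR",
    "lawful_basis": "lawful basis for solely automated decisions (Article 22 GDPR)",
    "safeguards": "human intervention, right to be heard, right to contest",
}

def missing_steps(completed: set) -> dict:
    """Return the steps not yet completed, with a short description of each."""
    return {step: desc for step, desc in PRE_DEPLOYMENT_STEPS.items()
            if step not in completed}

# Example: the DPIA is still outstanding before the hiring algorithm goes live
print(missing_steps({"information_notice", "lawful_basis", "safeguards"}))
```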
Conclusions
To conclude, in less than two years the case-law on the compliance of algorithm-based public decisions with the general principles governing Italian administrative law has marked a U-turn. At the same time, however, new challenges might arise in the interplay between administrative and data protection law: on the one hand, public entities might become subject to actions brought simultaneously before civil and administrative courts when they decide based on personal data processing, such as in the context of mobility or hiring procedures (but also in the field of public procurement); on the other hand, certain data protection requirements might start becoming relevant for administrative courts as they become part of the administrative action and hence of the related administrative litigation. Therefore, alongside simplification, Public Administrations are likely to face an increasing level of complexity in structuring administrative procedures that comply with several legal requirements. Time will tell whether public decision-makers will prove worthy of the task.
[1] Please refer to Article 22 of the EU General Data Protection Regulation no. 2016/679 (GDPR).
[2] Administrative court of Rome, Section III bis, decision no. 6606 of 27 May 2019.
[3] Reference can be made, by way of example, to decisions (i) no. 3742 of 21 March 2017 and (ii) no. 3769 of 22 March 2017, both rendered by Section III bis of that tribunal. In both litigations, the appellants were challenging the Ministry of Education’s refusal to disclose the source code of the algorithm used to manage schoolteachers’ mobility procedures.
[4] State Council, Section VI, decision no. 2270 of 8 April 2019.
[5] State Council, Section VI, decision no. 8472 of 13 December 2019.
[6] State Council, Section VI, decision no. 8474 of 13 December 2019.
[7] Please refer to State Council, Section VI, decision no. 8472 of 13 December 2019.
[8] Administrative court of Rome, Section III bis, decision no. 7370 of 30 June 2020.
[9] Administrative court of Rome, Section III bis, decision no. 7526 of 1 July 2020.
[10] Reference shall be made to Articles 22 et seq. of Law no. 241/1990 and to Legislative Decree no. 33/2013.
[11] Reference shall be made to State Council, Section VI, decision no. 30 of 2 January 2020.
[12] Articles 68 and 69 of Legislative Decree no. 82/2005 (Code of Digital Public Administration).