A new tile for the EU content moderation governance mosaic? The proposal for a Child Sexual Abuse Material Regulation

In May 2022, the Commission submitted a proposal for a new EU Regulation on online child sexual abuse material (CSAM Regulation proposal)[1], with a view to harmonising the legal framework on the prevention of and fight against online child sexual abuse by providing «legal certainty to providers as to their responsibility to assess and mitigate risks and, where necessary, to detect, report and remove such abuse on their services»[2]. At the same time, the proposal stresses the importance of balancing such moderation activities with the need to guarantee full respect for the relevant fundamental rights as laid down in the Charter of Fundamental Rights of the European Union and as general principles of EU law[3].

The significance of the proposal for online content moderation governance in the European Union is evident, not least because it represents one of the first attempts at introducing sector-specific legislation in this field following the debate about, and subsequent approval of, the Digital Services Act (DSA)[4]: the CSAM Regulation proposal is, in fact, self-declaredly intended as a lex specialis vis-à-vis the more general, horizontal framework established by the DSA[5]. The present contribution thus aims at describing the content of the proposal, focusing in particular on the introduction of new rules on detection orders concerning child sexual abuse material.

  1. The new rules for relevant service providers

The proposed CSAM Regulation aims at establishing rules directed specifically at the providers of online services considered to be the most vulnerable to misuse for the illegal activities it addresses. Relevant information society services include, notably, hosting services and interpersonal communication services, but also software application stores and Internet access services[6]. All these actors would thus be subjected to a range of duties, although the intensity of the new obligations inherently depends on the very nature of the services provided – with most obligations falling on providers of hosting and interpersonal communications services[7].

The aim of the suggested framework is to counter three types of content and activities: known child sexual abuse material (known CSAM); new child sexual abuse material (new CSAM); and the solicitation of children (grooming). The first two categories consist of material constituting “child pornography” or a “pornographic performance” as defined by the Directive on Child Sexual Abuse[8]; the first group differs from the second in that it covers material that has already been detected and identified as such. “Grooming”, on the other hand, refers to the conduct of soliciting minors for sexual purposes via the relevant information society services.

First, with respect to such criminal conduct, providers of hosting and interpersonal communication services must establish a risk assessment and risk mitigation system. Because of its sectoral, vertical perspective, as opposed to the horizontal and more general one characterising the DSA, the CSAM Regulation proposal lists in detail the criteria that providers should take into account when carrying out their own risk assessments: notably, it specifies what elements should be considered with regard to the risk of grooming, a conduct which the drafters of the Regulation perceive as especially insidious (and harmful)[9]. Following this assessment, the proposal requires providers of hosting and interpersonal communications services to take «reasonable mitigation measures, tailored to the risk identified […] to minimise that risk»[10]: these measures, moreover, will have to be proportionate and applied in a «diligent and non-discriminatory manner, having due regard, in all circumstances, to the potential consequences of the mitigation measures for the exercise of fundamental rights of all parties affected»[11].

Second, those providers must also comply with the detection orders that national judicial or independent administrative authorities may decide to issue at the request of the competent Coordinating Authority. When reached by such orders, providers of hosting and interpersonal communication services must put in place mechanisms to identify the dissemination to the public of known or new CSAM and/or the carrying out of activities constituting solicitation of children, namely through the installation and operation of dedicated technologies[12]. Providers of those services are also required to report to national authorities any information they become aware of indicating the potential carrying out of online child sexual abuse on their services[13]. In addition, providers of hosting and interpersonal communications services may be reached by removal orders issued by national judicial or independent administrative authorities[14].

  2. The rules on detection orders: critical perspectives

This set of new duties and obligations inherently translates into a form of delegation of “policing” and law enforcement activities from the State to online intermediaries. The CSAM Regulation proposal is, indeed, seemingly aware of the possible implications that the newly envisaged regime may have for the enjoyment, by recipients of the services, of their fundamental rights as protected by the EU Charter.

In this respect, the new obligation to comply with detection orders appears particularly challenging, especially in light of the different types of content that may fall within the scope of the orders themselves. Indeed, known CSAM, new CSAM and “grooming” each pose significantly different challenges for the purposes of their (automated) detection. Whereas known CSAM can quite easily be detected and recognised through relatively simple artificial intelligence (AI) systems for content moderation, by comparing uploaded items with those already stored in dedicated repositories, the identification of new CSAM may not always be as straightforward. The risk of collateral censorship and over-removal is therefore particularly acute in the case of new CSAM. As for grooming, the proposal’s Explanatory Memorandum itself recognises the inherent challenges of its detection. These challenges relate not only to the technical difficulties faced by AI in understanding, semantically, when an adult is actually engaging in grooming, but also to the significant impact that its detection might have on users’ right to privacy and to the secrecy of their communications[15].
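To make the contrast concrete, the sketch below illustrates the general logic of detecting known CSAM: matching each upload against a repository of previously identified material. It is only a minimal illustration under stated assumptions: the names are invented, and a cryptographic digest is used for readability, whereas deployed systems rely on perceptual hashing (robust to re-encoding and minor alterations) and, for new material and grooming, on classifiers rather than lookups.

```python
import hashlib

# Hypothetical set of digests of material already identified and verified as
# CSAM (in practice, indicators would be maintained by a dedicated body such
# as the proposed EU Centre). The entries here are placeholders only.
KNOWN_MATERIAL_DIGESTS: set[str] = set()


def fingerprint(item: bytes) -> str:
    """Return a stable fingerprint of an uploaded item.

    SHA-256 is used purely for illustration: it matches only bit-identical
    copies, whereas real detection tools use perceptual hashes that also
    match re-encoded or slightly altered versions of known material.
    """
    return hashlib.sha256(item).hexdigest()


def is_known_material(item: bytes) -> bool:
    """Check an uploaded item against the repository of known material."""
    return fingerprint(item) in KNOWN_MATERIAL_DIGESTS


# Usage sketch: a provider subject to a detection order would run such a
# check at upload time and, on a match, report through the channels the
# Regulation prescribes rather than deciding unilaterally what to do.
if __name__ == "__main__":
    print(is_known_material(b"example upload"))  # False: the set above is empty
```

New CSAM and grooming cannot be handled by lookups of this kind: they require models trained to recognise previously unseen images or conversational patterns, which is precisely where the error margins underlying the over-removal and privacy concerns discussed above originate.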

For these reasons, the choice of the EU lawmaker in the CSAM Regulation proposal has been to scale the new obligations accordingly and to put in place «robust limits and safeguards», with a view to ensuring as far as possible the simultaneous protection of constitutionally relevant interests. Notably, the CSAM Regulation proposal envisages «adjusted criteria for the imposition of the detection orders, a more limited period of application of those orders and reinforced reporting requirements during that period», as well as «strong oversight mechanisms»[16]. Therefore, not only does the proposal define an elaborate procedural system of guarantees for the adoption and implementation of any detection order, by setting as necessary conditions for its adoption the existence of evidence of a significant risk of the service being used for the purpose of child sexual abuse and the presence of reasons for issuing the order that outweigh its negative consequences for the rights and legitimate interests of all parties affected[17], but it also lays down additional rules for detection orders concerning grooming[18].

Additionally, some concerns may emerge with respect to the compatibility of the new detection orders with the prohibition of general monitoring obligations established by the e-Commerce Directive and subsequently confirmed by the DSA. Indeed, as a general principle, EU law does not permit the imposition of duties to monitor user-generated and third-party content with a view to detecting illegal content, as such a measure is considered disproportionate and at odds with the needs of the Digital Single Market and the protection of fundamental rights.

As a matter of fact, the regime envisaged by the new proposal is careful, on the one hand, to circumscribe the limited cases in which such orders can be adopted and, on the other hand, to provide significant substantive and procedural safeguards to prevent the new detection orders from breaching that principle. Most notably, the CSAM Regulation proposal specifies the contents that such orders must include, with a view to limiting as much as possible the risk of overbreadth[19]. Be that as it may, the application of the new Regulation’s rules, if adopted, will require careful consideration on the part of the relevant judicial and/or administrative authorities when adopting and executing such orders.

  3. Conclusions

Overall, the rules of the CSAM proposal dedicated to the issuance of detection orders reveal an acknowledgement of the inherent challenges connected to the imposition of such orders, and they seek to strike a balance that takes into account the essential need to protect and ensure the fundamental rights central to European constitutional principles. While maintaining the protection of minors from sexual abuse as the lodestar of the developing regulation, the Commission’s draft aspires to prevent the new framework from translating into a negation of individuals’ fundamental rights to freedom of expression, privacy and data protection: an aspiration which is, ultimately, the trait d’union connecting the developing Union framework against online illegal and harmful content.

 

[1] Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse, COM(2022) 209 final, 11 May 2022.

[2] Ibid. art. 3.

[3] Ibid.

[4] Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act). Cf. the dedicated symposium on this blog: Simposio: Verso il Digital Services Act, in medialaws.eu.

[5] COM(2022) 209 final, cit., recital 8.

[6] Ibid. art. 2(f).

[7] The rules concerning software application stores and access providers are fewer and rather less invasive. With respect to the former, ibid. art. 6 lists the obligations they are subject to; access providers, pursuant to arts 16 ff., may instead be subjected to orders requiring them to «take reasonable measures to prevent users from accessing known child sexual abuse material indicated by all uniform resource locators on the list of uniform resource locators included in the database of indicators […] provided by the EU Centre». On the risk-based approach, also in the context of the DSA, see G. De Gregorio – P. Dunn, The European Risk-based Approaches: Connecting Constitutional Dots in the Digital Age, in Common Market Law Review, 59-2, 2022, 473 ff.

[8] Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA, art. 2(c) and (e).

[9] COM(2022) 209 final, cit., art. 3(2)(e).

[10] Ibid. art. 4(1).

[11] Ibid. art. 4(2)(c).

[12] Ibid. arts 7-11.

[13] Ibid. arts 12-13. Such a provision clearly builds upon and specifies (explicitly extending its scope of application to the dissemination of CSAM and to grooming) the general rule, established by the DSA, pursuant to which providers of hosting services must promptly inform the authorities of any information they become aware of «giving rise to a suspicion that a criminal offence involving a threat to the life or safety of a person or persons has taken place, is taking place or is likely to take place» (art. 18(1)).

[14] Ibid. arts 14-15.

[15] Thus COM(2022) 209 final, cit., 14: «Detecting ‘grooming’ would have a positive impact on the fundamental rights of potential victims especially by contributing to the prevention of abuse; if swift action is taken, it may even prevent a child from suffering harm. At the same time, the detection process is generally speaking the most intrusive one for users (compared to the detection of the dissemination of known and new child sexual abuse material), since it requires automatically scanning through texts in interpersonal communications. It is important to bear in mind in this regard that such scanning is often the only possible way to detect it and that the technology used does not ‘understand’ the content of the communications but rather looks for known, pre-identified patterns that indicate potential grooming».

[16] Ibid. 15.

[17] Ibid. art. 7(4).

[18] For instance, ibid. art. 7(3) establishes that, prior to the submission of the Coordinating Authority’s request to the relevant national authority, the provider of hosting and interpersonal communications services is required to conduct a data protection impact assessment and a prior consultation procedure, as referred to in arts 35-36 of Regulation (EU) 2016/679 (GDPR), in relation to the measures it intends to adopt to execute the order and contained in its draft implementation plan. Additionally, art. 7(9) states that detection orders concerning grooming activities cannot exceed 12 months.

[19] Ibid. art. 7(9).
