Disinformation, filter bubbles, and echo chambers radically change how people express themselves online, amplifying divisions that increasingly polarize society. Recent events, such as the Russian invasion of Ukraine and the COVID-19 pandemic, have shown how damaging disinformation can be, as it radicalizes public opinion and dangerously influences public policymaking[1]. The European Union has shown that it is aware of the risks posed by disinformation and, in particular, of the role played by digital platforms, as confirmed by the introduction of the Digital Services Act[2].
As regards soft law, on 16 June 2022 the European Commission published the Enhanced Code against Disinformation. This instrument exemplifies the European strategy already outlined in the European Commission Guidance on Strengthening the Code of Practice on Disinformation, the Communication on the European Democracy Action Plan, the Communication Tackling online disinformation: a European approach, and the 2018 Council Conclusions.
The Code is the result of a new drafting effort led by Prof. Oreste Pollicino, with the support of Dr. Giovanni De Gregorio, and it strengthens the European Union’s strategy against disinformation.
This path had, in fact, already begun in 2018, when the European Commission convened a group of experts to define a first strategy against disinformation: a first document built entirely on self-regulation by the parties involved. This choice, while a laudable first attempt to systematize and regulate a problem that extends beyond the digital sphere and, on closer inspection, affects the democratic life of many countries, proved insufficient. Self-regulation, which echoes the well-known U.S. metaphor of the “free marketplace of ideas”[3], sits poorly with a regulatory need that is not grounded in the prevalence of freedom of expression over the other rights at stake.
Within this framework, it appeared necessary to strengthen the structure of the 2018 Code, trading autonomy for a joint and procedurally verifiable commitment. On these premises, the need emerged to further reinforce the Code so that it could better capture the pressing phenomenon of disinformation. In addition to opening up to signatories beyond the big web giants, the Code imposes specific obligations to demonetize disinformation (Commitments 1 and 2), strengthens the position of users (Commitment 17 et seq.), and raises the level of security (Chapter IV) and of transparency of systems, including artificial intelligence (Commitment 15)[4]. Moreover, given the forthcoming entry into force of the Digital Services Act, the Code will become a co-regulatory instrument for very large online platforms, with the consequent possibility for the European institutions to impose sanctions for non-compliance with the obligations it contains[5].
In this context, and in light of the recent adoption of the Enhanced Code Against Disinformation, MediaLaws is inaugurating an online symposium to foster a high-level debate on the evolving legal and regulatory framework for disinformation in the digital sphere. Legal, economic, and social opinions and reflections on the issues briefly analysed above are welcome. Contributions addressing the following questions are also invited:
- What scope will the Enhanced Code Against Disinformation have?
- How does the Code fit in with the proposed DSA Regulation, particularly with the obligations placed on platforms?
- What impact will the Code be able to have globally? What are the effects on “digital sovereignty”?
- What is the impact of the Code on human rights and fundamental values of the Union?
- What are the distinctive aspects of adopting a self-regulatory code instead of a co-regulatory one?
- What meaning can be drawn from the Commission’s choice in this case to combine hard law and soft law sources? Can it represent a model?
Blog posts must be sent to submissions@medialaws.eu.
Authors are invited to follow the MediaLaws Blog’s Guidelines.
[1] O. Pollicino, G. De Gregorio, P. Dunn, Digitisation and the central role of intermediaries in a post-pandemic world, MediaLaws, 17 November 2021.
[2] The impact of the Digital Services Act was already discussed in a dedicated Symposium.
[3] We refer to Justice Holmes’ famous dissenting opinion in Abrams v. United States (1919).
[4] For a further look, O. Pollicino, The Road Towards a Strengthened Code Against Disinformation: About Metaphors in Free Speech and the Need to Handle Them Carefully, ELI, June 2022; O. Pollicino, Sulla disinformazione arriva un nuovo codice Ue, Il Sole 24 Ore, 16 June 2022.
[5] See article 35 of the DSA, as recalled by the Code’s Preamble, lett. i).