The Game of Boards: The role of authorities in concerting the Digital Services Act and the Media Freedom Act for protecting media freedom


In May 2024, the European Media Freedom Act (EMFA) entered into force, marking the first comprehensive European Union regulation encompassing all media. The EMFA underscores the importance of media freedom and pluralism for democracy and the rule of law, and frames them as essential for a well-functioning internal market. It also seeks to address the significant power imbalance between the media and very large online platforms, which have grown into key intermediaries between the media, their audiences, and advertisers. For the first time, an EU law establishes that media content is distinct from other types of content on very large online platforms. Media freedom and pluralism, as fundamental EU rights and values, require special treatment in the content moderation practices of global technology companies. However, this special treatment applies only if the content does not contribute to systemic risks as defined by the Digital Services Act (DSA). Because of the complexity of these two EU laws and the interplay between them, particularly regarding fundamental rights and the platforms’ operationalisation of systemic risks, competent authorities will play a crucial role in enforcement. Both laws establish EU platforms for regulators to cooperate and advise the Commission on relevant matters: the European Board for Digital Services under the DSA and the European Board for Media Services under the EMFA. What remains to be defined is how these Boards will communicate and cooperate in the areas where the EMFA and the DSA intersect.

The special treatment of content from media service providers on very large online platforms is mandated by Article 18 of the EMFA. The provision requires very large online platforms (VLOPs)[1] to offer a functionality that allows media service providers to declare themselves as such, with a set of conditions included in the declaration: transparency of ownership; editorial independence from “Member States, political parties, third countries and entities controlled or financed by third countries”; and either being subject to regulatory requirements for editorial responsibility and oversight by a competent national regulatory authority or adhering to a widely recognised self-regulatory mechanism. Furthermore, media that seek to use this provision must not provide artificial-intelligence-generated content without human review or editorial control. VLOPs may verify a declaration with a national authority or self-regulatory body in case of “reasonable doubt”. They must inform the media whose declarations are accepted and make that information public and easily accessible. VLOPs are also required to provide contact details through which the media can communicate with them directly and quickly. This is a significant step forward for smaller media in general, and especially for media in smaller EU countries, where outlets were often ignored by platforms and lacked direct communication channels. Frequently, they had no recourse when their content was unfairly removed or its visibility reduced, harming their advertising potential and already fragile business models.

Article 18 of the EMFA further requires VLOPs, whenever they intend to suspend or restrict the visibility of media content, to first provide a statement of reasons to the media service provider concerned and allow it 24 hours to respond, or a shorter time frame in crisis situations. This fundamentally changes content moderation practices vis-à-vis the media, which previously operated on a “remove and notify” basis. In some cases, such notifications were missing altogether or too general to be informative, and visibility reductions often came without any individual notification. This should no longer be the case for media once the EMFA becomes fully applicable in August 2025. However, this provision of the EMFA applies only when platforms act on the basis of their terms of use, not when they act against systemic risks, as introduced by the DSA.

Article 34 of the DSA outlines four broad categories of systemic risks, one of them being negative effects on the exercise of fundamental rights, explicitly mentioning the right to freedom of expression and information, including media freedom and pluralism. Another concerns “negative effects on civic discourse and electoral processes, and public security”. The widely recognised risk of disinformation[2] can be classified under both categories, as it threatens civic discourse and electoral processes and undermines fundamental rights such as freedom of expression and access to (accurate) information. At the same time, tackling disinformation may itself have negative effects on freedom of expression, especially when a few technology companies are entrusted to determine when expression is illegal or harmful (see, for instance: Bayer et al., 2021; Nenadić & Verza, 2022; Vese, 2022; Article 19, n.d.). These platforms are primarily global entities with significant market power and business interests. They aim for cost-effective, minimal compliance with regulatory requirements, rather than investing fully in comprehensive implementation and a nuanced understanding of media pluralism and information integrity in the European context. Regular and informed monitoring of how platforms implement systemic risk assessment and mitigation (Articles 34 and 35 of the DSA) is therefore crucial. This area concerns fundamental rights, and the way these rights and systemic risks are operationalised under the DSA will determine the potential of both the DSA and the EMFA to safeguard media freedom in platforms’ content moderation. The role of competent national authorities, through their EU platforms – the European Board for Digital Services and the European Board for Media Services – and their cooperation will be essential in ensuring compliance and safeguarding fundamental rights.

The power to supervise very large online platforms and search engines, and to enforce systemic risk assessment and mitigation under the DSA, resides with the European Commission. In this effort, the Commission is supported by the digital services coordinator (DSC) of the country where the major platforms have their EU headquarters. The role of the other national DSCs is not elaborated in detail in Articles 34 and 35, but they play a crucial role in the DSA’s implementation more generally, as the primary competent authorities designated by Member States, bringing an understanding of specific national contexts and contributing significantly to the overall framework for overseeing systemic risk assessment and mitigation. Under Article 35, the European Board for Digital Services and the Commission cooperate to publish annual reports on the systemic risks stemming from the design or use of very large online platforms and search engines. These reports are expected to detail risk identification and assessment, as well as mitigation guidelines and best practices, considering both national and Union-wide levels.

The Member States were granted the freedom to determine an appropriate body for the DSC role, which resulted in varied approaches: some designated media authorities (including Ireland, where many VLOPs have their EU headquarters); some designated competition authorities; the largest number chose regulators of electronic communications, telecommunications, and postal services; and France, Italy, and Slovenia assigned converged communications authorities.[3] Furthermore, Member States can designate additional authorities to support DSCs in enforcing the DSA. In some cases, however, media authorities are not included in this regulatory framework, raising concerns about the alignment of the DSA with the EMFA. Croatia is an example where the media authority is not among the several authorities designated to implement the DSA. In countries where converged or media regulators serve as DSCs, a single authority enforces both laws, which facilitates coordination at the points where the two laws meet. To overcome the challenges arising from this variety, effective communication and coordination between the Boards will be crucial.

The European Board for Digital Services is composed of the DSCs. The European Board for Media Services gathers representatives of national regulatory authorities in the media sector. The latter succeeds ERGA, the European Regulators Group for Audiovisual Media Services, reflecting the extension of its remit to the entire media sector and, in part, to online platforms. Both Boards formally have an independent advisory role, but their guidelines and recommendations will in fact be crucial for adequate and harmonised enforcement of the two new pieces of EU law, especially on complex, sensitive, and intersecting issues. The EMFA establishes a consultation mechanism (Article 12) for the Board for Media Services to engage with relevant stakeholders on matters beyond the audiovisual media sector. This mechanism allows the Board to involve experts and conduct extensive consultations, recognising the complexity and evolving nature of the media and information environment. Similarly, the Board for Digital Services is expected to support and promote the development and implementation of European standards and guidelines under the DSA, including identifying emerging issues and potential systemic risks. This Board, too, is encouraged by law to engage with experts.

National regulatory authorities are crucial for the proper application of both laws across the Union. The Boards, as EU platforms of competent regulatory authorities, play a vital role in addressing complex concerns related to how platforms operationalise systemic risks. Such operationalisation determines when the EMFA’s special treatment of media in content moderation applies, and how effectively media freedom and pluralism are safeguarded within the platform environment. The authorities gathered under these Boards have vast practical expertise. Leveraging the Boards as platforms to facilitate communication, exchange, and consultation among the various authorities represented on them, as well as with external experts, will be key to ensuring adequate enforcement of the DSA and the EMFA.

[1] Online platforms with at least 45 million average monthly active users in the EU.

[2] See, for instance, EC (2018): https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A52018DC0236 and EC (2021): https://eur-lex.europa.eu/legal-content/en/TXT/?uri=CELEX%3A52021DC0262

[3] Full list available here: https://digital-strategy.ec.europa.eu/en/policies/dsa-dscs
