- Background
On 20 January 2025, the European Commission and the European Board for Digital Services announced the integration of the Revised Code of Conduct on Countering Illegal Hate Speech Online (“Code of Conduct+”) into the Digital Services Act (“DSA”) regulatory framework. This step aligns voluntary commitments by online platforms with legally binding obligations under the DSA.
The original Code of Conduct, introduced in 2016, was designed to combat the spread of illegal hate speech online through voluntary measures. The updated version, now embedded in the DSA, strengthens enforcement mechanisms and increases platform accountability in identifying, moderating, and removing illegal hate speech in compliance with EU and national legal frameworks.
Subsequently, on 13 February 2025, the European Commission further strengthened the regulatory framework by integrating the Code of Practice on Disinformation into the DSA. Originally established in 2018 and reinforced in 2022, this Code aims to combat the spread of disinformation by fostering cooperation among online platforms, advertisers, and other key stakeholders. Its inclusion in the DSA ensures a more robust and enforceable approach to tackling disinformation within the European Union.
- Key provisions of the integrated codes
2.1 Strengthened content moderation mechanisms
The Code of Conduct+ imposes stricter obligations on online platforms regarding the detection and removal of illegal hate speech. It ensures that content moderation processes are timely and effective, preventing the proliferation of hate speech before it causes harm. Similarly, the Code of Practice on Disinformation introduces obligations for reducing the visibility of disinformation by:
- Demonetising false or misleading content to remove financial incentives
- Enhancing fact-checking partnerships to provide verified content
- Implementing measures to prevent the amplification of disinformation through algorithmic adjustments
These provisions aim to strengthen platform accountability and help mitigate the negative societal impact of both hate speech and disinformation.
2.2 Transparency and accountability standards
A core requirement of both codes is increased transparency in content moderation policies, requiring platforms to provide publicly accessible information regarding:
- The scope of illegal content removal
- The role of recommendation algorithms in amplifying flagged content
- The reach of illegal or misleading content before it is removed
In compliance with DSA risk mitigation obligations, Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) must conduct annual independent audits to assess compliance.
Additionally, the Code of Practice on Disinformation requires platforms to report on the prevalence and impact of misleading content and to enhance cooperation with fact-checkers and researchers to counteract online falsehoods.
2.3 Collaboration with civil society and independent experts
The Code of Conduct+ fosters stronger partnerships between platforms and non-governmental organisations (NGOs), public institutions, and academic entities specialised in hate speech monitoring.
Under this commitment, platforms must review at least two-thirds of flagged hate speech content within 24 hours of notification. Likewise, the Code of Practice on Disinformation mandates close cooperation between platforms and fact-checking organisations to ensure faster removal of misleading content on the one hand, and greater public access to verified, fact-checked information on the other.
- Signatories and their commitments
Both the Code of Conduct+ and the Code of Practice on Disinformation have been endorsed by leading digital platforms, spanning social media, video-sharing services, search engines, and messaging applications.
Adhering to Codes of Conduct under Article 45 of the DSA is a voluntary commitment. However, once a platform chooses to adhere, it is expected to respect the commitments outlined in the relevant code(s). While these commitments do not replace legally binding obligations under the DSA, they serve as a structured framework to help platforms align with regulatory expectations, particularly regarding risk mitigation.
As such, for VLOPs and VLOSEs, participation in such codes can play a crucial role in demonstrating compliance with the systemic risk mitigation requirements under Article 34 of the DSA. By adhering to a Code of Conduct, these platforms can reinforce their risk management strategies, show cooperation with regulators, and proactively address potential harms associated with illegal content, disinformation, or other risks covered by the DSA.
Moreover, under the DSA’s supervisory and enforcement mechanisms, VLOPs and VLOSEs are subject to mandatory annual audits to assess their compliance with their obligations, including those linked to their commitments under Codes of Conduct. These audits ensure transparency and accountability, providing regulators with insight into how platforms implement content moderation, risk mitigation measures, and other safeguards outlined in the codes.
By integrating voluntary commitments within a broader regulatory framework, the DSA encourages platforms to take proactive steps in maintaining a safer digital environment while ensuring regulatory oversight and accountability.
- Compliance with the DSA
The integration of these codes serves as a complementary mechanism to the legally binding obligations imposed by the DSA. Platforms must take proactive measures to mitigate systemic risks, including:
- The dissemination of illegal hate speech;
- The spread of coordinated disinformation campaigns;
- The negative impact of algorithmic amplification of harmful content.
By implementing these measures, platforms not only align with regulatory requirements but also reduce their exposure to sanctions under the DSA’s enforcement framework.
Following the integration of both the Code of Conduct+ and the Code of Practice on Disinformation, the European Commission has outlined the next steps to ensure the effective implementation and enforcement of these commitments.
The Code of Practice on Disinformation will become fully enforceable on 1 July 2025, at which point compliance will be subject to independent audits. These audits will align with the DSA’s broader supervisory and enforcement mechanisms, ensuring that platforms adhere to the transparency and risk mitigation measures outlined in the regulation.
- Policy implications and the Commission’s perspective
The European Commission has welcomed the integration of both codes into the DSA framework, emphasising the importance of multi-stakeholder cooperation in regulating digital spaces.
In a public statement, Executive Vice-President Henna Virkkunen, responsible for Tech Sovereignty, Security, and Democracy, affirmed:
“In Europe, there is no place for illegal hate, either offline or online. I welcome the stakeholders’ commitment to a strengthened Code of Conduct under the Digital Services Act. Cooperation among all parties involved is the way forward to ensure a safe digital space for all.”
Similarly, Commissioner for the Internal Market, Martin Šimečka, stated:
“The integration of the Code of Practice on Disinformation into the Digital Services Act is a critical milestone in our efforts to fight online falsehoods. This ensures that platforms are held accountable and that European citizens can trust the information they encounter online.”
- Next steps and conclusion
The integration of both the Code of Conduct+ and the Code of Practice on Disinformation into the DSA marks a significant evolution in the EU’s regulatory approach to online content governance. By transitioning from voluntary commitments to enforceable obligations, the European Commission ensures that digital platforms remain accountable for their content moderation policies while fostering a more transparent and safer online environment.
Furthermore, this integration serves as a model for regulatory frameworks worldwide, demonstrating how binding obligations can be combined with industry collaboration to mitigate harmful content while preserving freedom of expression. The coming months will be crucial for monitoring platform compliance, particularly as VLOPs and VLOSEs prepare for their first regulatory audits under the DSA enforcement regime.
On a side note, within the framework of the Digital Services Act’s provisions on Codes of Conduct, the Commission was expected to encourage the development of two additional voluntary codes by 18 February 2025. The first, under Article 46, is the Code of Conduct for Online Advertising, which aims to enhance transparency and fairness throughout the online advertising value chain. The second, as provided for in Article 47, is the Code of Conduct for Accessibility, designed to improve access to online services for persons with disabilities. Once developed, both codes are scheduled for application by 18 August 2025. However, to date, no public updates have been provided on their progress.