European Commission

Strengthening online platforms' responsibility

Overview

The EU Code of Practice on Disinformation

The EU Code of Practice on Disinformation, established in 2018, is the world’s first voluntary self-regulatory instrument for online platforms, setting standards and commitments to fight disinformation.

The Code was strengthened in June 2022, with 34 signatories agreeing to increase the transparency and accountability of their platforms’ actions. The Transparency Centre provides information on the Code and on the actions signatories have implemented.

On 13 February 2025, the Commission and the European Board for Digital Services endorsed the integration of the 2022 Code of Practice into the framework of the Digital Services Act (DSA) as a Code of Conduct on Disinformation. This integration will make the Code a benchmark for determining platforms' compliance with the DSA.

The Digital Services Act

In August 2023, the Digital Services Act (DSA) became legally enforceable, regulating online intermediaries and platforms. Its main goal is to prevent illegal and harmful activities online, including the spread of disinformation. It ensures user safety, protects fundamental rights, and creates a fair and open online platform environment.

Very Large Online Platforms and Very Large Online Search Engines must conduct and share annual risk assessments on illegal content disseminated through their services, and adjust their mitigation measures accordingly.

The Artificial Intelligence (AI) Act

The AI Act is the world's first-ever legal framework on AI. It addresses the risks of AI and positions Europe to play a leading role globally.

The aim of the new rules is to foster trustworthy AI in Europe and beyond, by ensuring that AI systems respect fundamental rights, safety, and ethical principles and by addressing risks of very powerful and impactful AI models.

Transparency of political advertising

On 9 April 2024, the new Regulation on the transparency and targeting of political advertising entered into force. The Regulation aims to ensure that the provision of political advertising fully respects fundamental rights and that voters are better placed to make well-informed choices.

Under the new rules, political adverts must be clearly labelled as such, and must also provide information on who paid for them, how much was paid, to which elections, referendums or regulatory processes they are linked, and whether any targeting techniques were used.

Most of its provisions apply as of 10 October 2025.

Countering illegal hate speech online

To prevent and counter the spread of illegal hate speech online, the Commission created a Code of Conduct which several of the largest online platforms have signed up to.

Under the Code of Conduct, companies must remove content flagged for hate speech from their platforms within 24 hours.

In January 2025, a revised Code of Conduct was integrated into the framework of the Digital Services Act. The Code of Conduct on countering illegal hate speech online + builds on the initial Code of Conduct by strengthening the way online platforms deal with content that EU and national laws define as illegal hate speech.

Addressing the dissemination of terrorist content online

The Regulation to address the dissemination of terrorist content online has applied since 7 June 2022. Under the Regulation, terrorist content must be taken down within one hour of being identified online. This applies to online platforms offering services in the EU, to ensure the safety and security of citizens.