
For the first time, a common set of rules on intermediaries' obligations and accountability across the single market will open up new opportunities to provide digital services across borders, while ensuring a high level of protection for all users, no matter where they live in the EU.

What are the key goals of the Digital Services Act?

The new rules are proportionate, foster innovation, growth and competitiveness, and facilitate the scaling up of smaller platforms, SMEs and start-ups. The responsibilities of users, platforms, and public authorities are rebalanced according to European values, placing citizens at the centre. The rules:

  • Better protect consumers and their fundamental rights online
  • Establish a powerful transparency and accountability framework for online platforms
  • Foster innovation, growth and competitiveness within the single market
For citizens
  • Better protection of fundamental rights
  • More choice, lower prices
  • Less exposure to illegal content
For providers of digital services
  • Legal certainty, harmonisation of rules
  • Easier to start-up and scale-up in Europe
For business users of digital services
  • More choice, lower prices
  • Access to EU-wide markets through platforms
  • Level playing field against providers of illegal content
For society at large
  • Greater democratic control and oversight over systemic platforms
  • Mitigation of systemic risks, such as manipulation or disinformation

Which providers are covered?


The Digital Services Act includes rules for online intermediary services, which millions of Europeans use every day. The obligations of different online players match their role, size and impact in the online ecosystem.

New obligations

Intermediary services (baseline obligations)

  • Transparency reporting
  • Requirements on terms of service to take due account of fundamental rights
  • Cooperation with national authorities following orders
  • Points of contact and, where necessary, a legal representative

Hosting services (cumulative with the obligations above)

  • Notice and action, and an obligation to provide information to users
  • Reporting of criminal offences

Online platforms (cumulative with the obligations above)

  • Complaint and redress mechanism and out-of-court dispute settlement
  • Trusted flaggers
  • Measures against abusive notices and counter-notices
  • Special obligations for marketplaces, e.g. vetting credentials of third-party suppliers ("KYBC"), compliance by design, random checks
  • Bans on targeted adverts aimed at children or based on special characteristics of users
  • Transparency of recommender systems
  • User-facing transparency of online advertising

Very large online platforms (cumulative with the obligations above)

  • Risk management obligations and crisis response
  • External and independent auditing, internal compliance function and public accountability
  • User choice not to have recommendations based on profiling
  • Data sharing with authorities and researchers
  • Codes of conduct
  • Crisis response cooperation

What is the impact of new obligations?

Concretely, this means:

  • measures to counter illegal goods, services or content online, such as a mechanism for users to flag such content and for platforms to cooperate with “trusted flaggers”
  • new obligations on the traceability of business users in online marketplaces, to help identify sellers of illegal goods, and a requirement that online marketplaces make reasonable efforts to randomly check whether products or services have been identified as illegal in any official database
  • effective safeguards for users, including the possibility to challenge platforms’ content moderation decisions
  • a ban on certain types of targeted advertising on online platforms (when ads target children or rely on special categories of personal data, such as ethnicity, political views or sexual orientation)
  • transparency measures for online platforms on a variety of issues, including on the algorithms used for recommendations
  • obligations for very large online platforms and very large online search engines to prevent the misuse of their systems, by taking risk-based action and undergoing independent audits of their risk management systems
  • access for researchers to key data of the largest platforms and search engines, in order to understand how online risks evolve
  • an oversight structure to address the complexity of the online space: EU countries will have the primary role, supported by a new European Board for Digital Services; for very large platforms, the Commission will carry out supervision and enforcement

What are the next steps?

Following the entry into force of the Digital Services Act on 16 November 2022, online platforms will have 3 months (until 17 February 2023) to publish the number of their active end users on their websites. The Commission also invites all online platforms to notify these published numbers to it. Based on these user numbers, the Commission will assess whether a platform should be designated a very large online platform or search engine. Following such a designation decision by the Commission, the entity in question will have 4 months to comply with the obligations under the DSA, including carrying out and providing to the Commission its first annual risk assessment. EU Member States will need to empower their Digital Services Coordinators by 17 February 2024, the general date of entry into application of the DSA, when it becomes fully applicable for all entities in its scope.


27 OCTOBER 2022
Regulation on Digital Services Act