For the first time, a common set of rules on intermediaries' obligations and accountability across the single market will open up new opportunities to provide digital services across borders, while ensuring a high level of protection to all users, no matter where they live in the EU.
What are the key goals of the Digital Services Act?
The new rules are proportionate, foster innovation, growth and competitiveness, and facilitate the scaling up of smaller platforms, SMEs and start-ups. The responsibilities of users, platforms, and public authorities are rebalanced according to European values, placing citizens at the centre. The rules:
- Better protect consumers and their fundamental rights online
- Establish a powerful transparency and a clear accountability framework for online platforms
- Foster innovation, growth and competitiveness within the single market
For citizens:
- Better protection of fundamental rights
- More choice, lower prices
- Less exposure to illegal content

For providers of digital services:
- Legal certainty, harmonisation of rules
- Easier to start up and scale up in Europe

For business users of digital services:
- More choice, lower prices
- Access to EU-wide markets through platforms
- Level playing field against providers of illegal content

For society at large:
- Greater democratic control and oversight over systemic platforms
- Mitigation of systemic risks, such as manipulation or disinformation
Which providers are covered?
The Digital Services Act includes rules for online intermediary services, which millions of Europeans use every day. The obligations of different online players match their role, size and impact in the online ecosystem.

- Intermediary services offering network infrastructure: Internet access providers, domain name registrars, including also:
  - Hosting services such as cloud and webhosting services, including also:
    - Online platforms bringing together sellers and consumers, such as online marketplaces, app stores, collaborative economy platforms and social media platforms.
      - Very large online platforms pose particular risks in the dissemination of illegal content and societal harms. Specific rules are foreseen for platforms reaching more than 10% of the 450 million consumers in Europe (i.e. 45 million users).

All online intermediaries offering their services in the single market, whether they are established in the EU or outside, will have to comply with the new rules. Micro and small companies will have obligations proportionate to their ability and size while ensuring they remain accountable. In addition, even if micro and small companies grow significantly, they would benefit from a targeted exemption from a set of obligations during a transitional 12-month period.
New obligations | Intermediary services (cumulative obligations) | Hosting services (cumulative obligations) | Online platforms (cumulative obligations) | Very large platforms (cumulative obligations) |
---|---|---|---|---|
Transparency reporting | ● | ● | ● | ● |
Requirements on terms of service to take due account of fundamental rights | ● | ● | ● | ● |
Cooperation with national authorities following orders | ● | ● | ● | ● |
Points of contact and, where necessary, legal representative | ● | ● | ● | ● |
Notice and action and obligation to provide information to users | | ● | ● | ● |
Reporting criminal offences | | ● | ● | ● |
Complaint and redress mechanism and out-of-court dispute settlement | | | ● | ● |
Trusted flaggers | | | ● | ● |
Measures against abusive notices and counter-notices | | | ● | ● |
Special obligations for marketplaces, e.g. vetting credentials of third-party suppliers ("KYBC"), compliance by design, random checks | | | ● | ● |
Bans on targeted adverts to children and those based on special characteristics of users | | | ● | ● |
Transparency of recommender systems | | | ● | ● |
User-facing transparency of online advertising | | | ● | ● |
Risk management obligations and crisis response | | | | ● |
External & independent auditing, internal compliance function and public accountability | | | | ● |
User choice not to have recommendations based on profiling | | | | ● |
Data sharing with authorities and researchers | | | | ● |
Codes of conduct | | | | ● |
Crisis response cooperation | | | | ● |
What is the impact of new obligations?
The Digital Services Act significantly improves the mechanisms for the removal of illegal content and for the effective protection of users’ fundamental rights online, including the freedom of speech. It also creates a stronger public oversight of online platforms, in particular for platforms that reach more than 10% of the EU’s population.
This means concretely:
- measures to counter illegal goods, services or content online, such as a mechanism for users to flag such content and for platforms to cooperate with “trusted flaggers”
- new obligations on the traceability of business users in online marketplaces, to help identify sellers of illegal goods, and reasonable efforts by online marketplaces to randomly check whether products or services have been identified as illegal in any official database
- effective safeguards for users, including the possibility to challenge platforms’ content moderation decisions
- ban on certain types of targeted adverts on online platforms (when they target children or when they use special categories of personal data, such as ethnicity, political views or sexual orientation)
- transparency measures for online platforms on a variety of issues, including on the algorithms used for recommendations
- obligations for very large platforms and very large online search engines to prevent the misuse of their systems by taking risk-based action and by submitting to independent audits of their risk management systems
- access for researchers to key data of the largest platforms and search engines, in order to understand how online risks evolve
- oversight structure to address the complexity of the online space: EU countries will have the primary role, supported by a new European Board for Digital Services; for very large platforms, supervision and enforcement will be carried out by the Commission
What are the next steps?
Following the entry into force of the Digital Services Act on 16 November 2022, online platforms will have 3 months (until 17 February 2023) to publish the number of their active end users on their websites. The Commission is also inviting all online platforms to notify it of the published numbers. Based on these user numbers, the Commission will assess whether a platform should be designated a very large online platform or search engine. Following such a designation decision by the Commission, the entity in question will have 4 months to comply with the obligations under the DSA, including carrying out and providing to the Commission its first annual risk assessment. EU Member States will need to empower their Digital Services Coordinators by 17 February 2024, the general date of application of the DSA, when it becomes fully applicable for all entities in its scope.