What is the Digital Services Act (DSA)?
The Digital Services Act is the most important and most ambitious regulation in the world for protecting the digital space against the spread of illegal content and for protecting users’ fundamental rights. No other legislative act in the world regulates social media, online marketplaces, very large online platforms (VLOPs) and very large online search engines (VLOSEs) with this level of ambition. The rules are designed asymmetrically: larger intermediary services with significant societal impact (VLOPs and VLOSEs) are subject to stricter rules.
Under the Digital Services Act, platforms will not only have to be more transparent, but will also be held accountable for their role in disseminating illegal and harmful content.
Amongst other things, the DSA:
1. Lays down special obligations for online marketplaces in order to combat the online sale of illegal products and services;
2. Introduces measures to counter illegal content online and obligations for platforms to react quickly, while respecting fundamental rights;
3. Protects minors online by prohibiting platforms from using targeted advertising based on the use of minors’ personal data as defined in EU law;
4. Imposes certain limits on the presentation of advertising and on the use of sensitive personal data for targeted advertising, including gender, race and religion;
5. Bans misleading interfaces known as ‘dark patterns’ and other practices designed to mislead users.
Stricter rules apply for very large online platforms and search engines (VLOPs and VLOSEs), which will have to:
1. Offer users a system for recommending content that is not based on profiling;
2. Analyse the systemic risks they create: risks related to the dissemination of illegal content and negative effects on fundamental rights, electoral processes, gender-based violence and mental health.
In the context of the Russian military invasion of Ukraine, involving grave and widespread violations of the human rights of the Ukrainian people, and its particular impact on the manipulation of online information, the Digital Services Act introduces a crisis response mechanism. This mechanism will make it possible to analyse the impact of the activities of VLOPs and VLOSEs on the crisis and to decide rapidly on proportionate and effective measures to ensure the respect of fundamental rights.
Important notes:
1. On 25 August 2023, the Digital Services Act came into effect for very large online platforms and very large online search engines.
It becomes fully applicable to other entities on 17 February 2024.
2. The Digital Services Act package includes:
a. The Digital Services Act,
b. The Digital Markets Act.
Both legislative acts were adopted by the Council and the European Parliament in 2022, and the final texts are available on the "Links" page.
The Digital Markets Act (DMA) affects “gatekeeper platforms” such as Google, Amazon and Meta, and covers the requirement for user consent before personal data is processed for targeted advertising. It is notable that most of the companies affected by the Digital Markets Act and the Digital Services Act are based in the United States of America.
If you believe that the sanctions for GDPR violations are very strict (up to 4% of global annual turnover), you will be surprised by the sanctions for Digital Services Act violations (up to 6% of global annual turnover) and for Digital Markets Act violations (up to 10% of global annual turnover, or up to 20% in the case of repeat offences).
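As a rough illustration of these ceilings, the short sketch below computes the maximum possible fines for a hypothetical company. The turnover figure is invented, and only the headline percentages quoted above are used (the fixed minimum amounts that some of these regimes also provide for are left aside).

```python
# Illustrative sketch: maximum fine ceilings under the GDPR, DSA and DMA,
# based on the percentages quoted above. The turnover figure is hypothetical.

FINE_CEILINGS = {
    "GDPR": 0.04,                   # up to 4% of global annual turnover
    "DSA": 0.06,                    # up to 6% of global annual turnover
    "DMA": 0.10,                    # up to 10% of global annual turnover
    "DMA (repeat offence)": 0.20,   # up to 20% for repeated infringements
}

def max_fine(global_annual_turnover_eur: float) -> dict[str, float]:
    """Return the maximum possible fine per regime for a given turnover."""
    return {regime: rate * global_annual_turnover_eur
            for regime, rate in FINE_CEILINGS.items()}

if __name__ == "__main__":
    turnover = 50_000_000_000  # hypothetical: EUR 50 billion global annual turnover
    for regime, ceiling in max_fine(turnover).items():
        print(f"{regime}: up to EUR {ceiling:,.0f}")
```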
4 November 2024 - Implementing Regulation standardising the format, content, and reporting periods for transparency reports under the Digital Services Act (DSA).
The Regulation establishes uniform reporting templates and periods. Providers will have to start collecting data according to the Implementing Regulation as of 1 July 2025, with the first harmonised reports due at the beginning of 2026. The reporting periods for providers of VLOPs and VLOSEs will now be aligned, depending on their dates of designation.
To ensure consistency between the transparency tools of the DSA, the requirements for submitting statements of reasons to the DSA Transparency Database will be updated to be aligned with the data categories in the Implementing Regulation. Providers will have to submit statements of reasons according to the new requirements starting from 1 July 2025, same as for the transparency reporting templates.
26 March 2024 - Commission publishes guidelines under the DSA for the mitigation of systemic risks online for elections.
The European Commission has published guidelines on recommended measures to Very Large Online Platforms and Search Engines to mitigate systemic risks online that may impact the integrity of elections, with specific guidance for the upcoming European Parliament elections in June.
Under the Digital Services Act (DSA), designated services with more than 45 million active users in the EU have the obligation to mitigate the risks related to electoral processes, while safeguarding fundamental rights, including the right to freedom of expression.
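The 45 million threshold corresponds to roughly 10% of the Union’s population; the one-line check below makes that arithmetic explicit (the population figure is approximate).

```python
# The DSA ties the VLOP/VLOSE threshold to roughly 10% of the Union's population.
# The population figure below is approximate.
EU_POPULATION = 450_000_000
THRESHOLD_SHARE = 0.10
print(f"{int(EU_POPULATION * THRESHOLD_SHARE):,}")  # 45,000,000 average monthly active recipients
```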
These guidelines recommend mitigation measures and best practices to be undertaken by Very Large Online Platforms and Search Engines before, during, and after electoral events, such as to:
1. Reinforce their internal processes, including by setting up internal teams with adequate resources and by using available analysis and information on local context-specific risks, and on how users use their services to search for and obtain information before, during and after elections, in order to improve their mitigation measures.
2. Implement elections-specific risk mitigation measures tailored to each individual electoral period and local context. Among the mitigation measures included in the guidelines, Very Large Online Platforms and Search Engines should promote official information on electoral processes, implement media literacy initiatives, and adapt their recommender systems to empower users and reduce the monetisation and virality of content that threatens the integrity of electoral processes. Moreover, political advertising should be clearly labelled as such, in anticipation of the new regulation on the transparency and targeting of political advertising.
3. Adopt specific mitigation measures linked to generative AI: Very Large Online Platforms and Search Engines whose services could be used to create and/or disseminate generative AI content should assess and mitigate specific risks linked to AI, for example by clearly labelling content generated by AI (such as deepfakes), adapting their terms and conditions accordingly and enforcing them adequately.
4. Cooperate with EU level and national authorities, independent experts, and civil society organisations to foster an efficient exchange of information before, during and after the election and facilitate the use of adequate mitigation measures, including in the areas of Foreign Information Manipulation and Interference (FIMI), disinformation and cybersecurity. Adopt specific measures, including an incident response mechanism, during an electoral period to reduce the impact of incidents that could have a significant effect on the election outcome or turnout.
5. Assess the effectiveness of the measures through post-election reviews. Very Large Online Platforms and Search Engines should publish a non-confidential version of such post-election review documents, providing opportunity for public feedback on the risk mitigation measures put in place.
The guidelines include specific measures ahead of the upcoming European elections. Given their unique cross-border and European dimension, Very Large Online Platforms and Search Engines should ensure that sufficient resources and risk mitigation measures are available and distributed in a way that is proportionate to the risk assessments. The guidelines also encourage close cooperation with the European Digital Media Observatory (EDMO) Task Force on the 2024 European elections.
Next Steps
The specific mitigation measures that a Very Large Online Platform or Search Engine should take depend on the specificities of their service and on their risk profile. The guidelines represent best practices for mitigating risks related to electoral processes at this moment in time.
As such, Very Large Online Platforms and Search Engines which do not follow these guidelines must prove to the Commission that the measures undertaken are equally effective in mitigating the risks. Should the Commission receive information casting doubt on the suitability of such measures, it can request further information or start formal proceedings under the Digital Services Act.
As an additional element of readiness, the Commission plans a stress test with relevant stakeholders at the end of April to exercise the most effective use of the instruments and the cooperative mechanisms that have been put in place.
26 September 2023 - The European Commission has launched the DSA Transparency Database.
Under the DSA, all providers of hosting services are required to provide users with clear and specific information, so-called statements of reasons, whenever they remove or restrict access to certain content.
Article 17 DSA requires providers of hosting services to provide affected recipients of the service with clear and specific reasons for restrictions on content that is allegedly illegal or incompatible with the provider’s terms and conditions. In other words, providers of hosting services need to inform their users of the content moderation decisions they take and explain the reasons behind those decisions. A statement of reasons is an important tool to empower users to understand and potentially challenge content moderation decisions taken by providers of hosting services.
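As a rough model of what Article 17(3) requires such a statement of reasons to contain, the sketch below captures its main elements as a data structure; the field names are illustrative shorthand, not the official schema of the Transparency Database.

```python
from dataclasses import dataclass
from typing import Optional

# Rough model of the main elements of an Article 17(3) statement of reasons.
# Field names are illustrative shorthand, not the official DSA Transparency
# Database schema.

@dataclass
class StatementOfReasons:
    restriction_type: str              # e.g. removal, disabling of access, demotion,
                                       # visibility restriction, account suspension
    territorial_scope: Optional[str]   # where relevant, the territorial scope of the decision
    duration: Optional[str]            # where relevant, the duration of the restriction
    facts_and_circumstances: str       # facts relied on, incl. whether a notice triggered it
    automated_detection: bool          # whether the content was detected by automated means
    automated_decision: bool           # whether the decision itself was taken by automated means
    legal_ground: Optional[str]        # for allegedly illegal content: the legal ground relied on
    contractual_ground: Optional[str]  # for terms-and-conditions violations: the contractual ground
    redress_options: list[str]         # internal complaint handling, out-of-court dispute
                                       # settlement, judicial redress
```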
The new database will collect these statements of reasons in accordance with Article 24(5) of the DSA. This makes this database a first-of-its-kind regulatory repository, where data on content moderation decisions taken by providers of online platforms active in the EU are accessible to the general public at an unprecedented scale and granularity, enabling more online accountability.
For now, only Very Large Online Platforms (VLOPs) need to submit data to the database as part of their DSA compliance. From 17 February 2024, all providers of online platforms, with the exception of micro and small enterprises, will have to submit data on their content moderation decisions.
Thanks to the Transparency Database users can view summary statistics (currently in beta version), search for specific statements of reasons, and download data. The Commission will add new analytics and visualisation features in the coming months and in the meantime welcomes any feedback on its current configuration.
The source code of the database is publicly available. Together with the Code of Practice on Disinformation and further transparency-enhancing measures under the DSA, the new database allows all users to act in a more informed manner in relation to the spread of illegal and harmful content online.
25 April 2023 - The European Commission adopted the first designation decisions under the Digital Services Act (DSA).
The European Commission designated 17 Very Large Online Platforms (VLOPs) and 2 Very Large Online Search Engines (VLOSEs) that reach at least 45 million monthly active users.
Very Large Online Platforms:
- Alibaba AliExpress
- Amazon Store
- Apple AppStore
- Booking.com
- Facebook
- Google Play
- Google Maps
- Google Shopping
- Instagram
- LinkedIn
- Pinterest
- Snapchat
- TikTok
- Twitter
- Wikipedia
- YouTube
- Zalando
Very Large Online Search Engines:
- Bing
- Google Search
Following their designation, the companies will now have to comply, within four months, with the full set of new obligations under the DSA. These aim at empowering and protecting users online, including minors, by requiring the designated services to assess and mitigate their systemic risks and to provide robust content moderation tools.
This includes:
More user empowerment:
- Users will get clear information on why they are recommended certain information and will have the right to opt out of recommendation systems based on profiling;
- Users will be able to report illegal content easily and platforms have to process such reports diligently;
- Advertisements cannot be displayed based on the sensitive data of the user (such as ethnic origin, political opinions or sexual orientation);
- Platforms need to label all ads and inform users on who is promoting them;
- Platforms need to provide an easily understandable, plain-language summary of their terms and conditions, in the languages of the Member States where they operate.
Strong protection of minors:
- Platforms will have to redesign their systems to ensure a high level of privacy, security, and safety of minors;
- Targeted advertising based on profiling towards children is no longer permitted;
- Special risk assessments, including on negative effects on mental health, will have to be provided to the Commission four months after designation and made public at the latest a year later;
- Platforms will have to redesign their services, including their interfaces, recommender systems, terms and conditions, to mitigate these risks.
More diligent content moderation, less disinformation:
- Platforms and search engines need to take measures to address risks linked to the dissemination of illegal content online and to negative effects on freedom of expression and information;
- Platforms need to have clear terms and conditions and enforce them diligently and non-arbitrarily;
- Platforms need to have a mechanism for users to flag illegal content and act upon notifications expeditiously;
- Platforms need to analyse their specific risks, and put in place mitigation measures – for instance, to address the spread of disinformation and inauthentic use of their service.
More transparency and accountability:
- Platforms need to ensure that their risk assessments and their compliance with all the DSA obligations are externally and independently audited;
- They will have to give access to publicly available data to researchers; later on, a special mechanism for vetted researchers will be established;
- They will need to publish repositories of all the ads served on their interface;
- Platforms need to publish transparency reports on content moderation decisions and risk management.
Within four months of notification of the designation decisions, the designated platforms and search engines will need to adapt their systems, resources and processes for compliance, set up an independent system of compliance, and carry out and report to the Commission their first annual risk assessment.
Risk assessment
Platforms will have to identify, analyse and mitigate a wide array of systemic risks ranging from how illegal content and disinformation can be amplified on their services, to the impact on the freedom of expression and media freedom. Similarly, specific risks around gender-based violence online and the protection of minors online and their mental health must be assessed and mitigated. The risk mitigation plans of designated platforms and search engines will be subject to an independent audit and oversight by the Commission.
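To make the scope of this exercise concrete, the sketch below enumerates the four broad categories of systemic risk listed in Article 34 DSA and models a skeletal assessment record; the names and structure are illustrative shorthand, not an official taxonomy or tool.

```python
from dataclasses import dataclass, field
from enum import Enum

# Illustrative sketch of the four broad systemic risk categories in Article 34 DSA.
# Names and structure are our own shorthand, not an official taxonomy or tool.

class SystemicRisk(Enum):
    ILLEGAL_CONTENT = "dissemination of illegal content"
    FUNDAMENTAL_RIGHTS = "negative effects on the exercise of fundamental rights"
    CIVIC_DISCOURSE_ELECTIONS_SECURITY = "civic discourse, electoral processes, public security"
    GBV_HEALTH_MINORS_WELLBEING = "gender-based violence, public health, minors, well-being"

@dataclass
class RiskAssessmentEntry:
    risk: SystemicRisk
    findings: str                                   # how the service's design or use amplifies the risk
    mitigation_measures: list[str] = field(default_factory=list)

def annual_risk_assessment(entries: list[RiskAssessmentEntry]) -> dict[str, list[str]]:
    """Group planned mitigation measures by risk category (illustrative only)."""
    return {entry.risk.value: entry.mitigation_measures for entry in entries}
```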
19 October 2022 - We have the final text of Regulation (EU) 2022/2065 on a Single Market For Digital Services (Digital Services Act).
The Digital Services Act Regulation applies to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their role in the intermediation and spread of unlawful or otherwise harmful information and activities.
In practice, certain providers of intermediary services intermediate in relation to services that may or may not be provided by electronic means, such as remote information technology services, transport, accommodation or delivery services. This Regulation applies only to intermediary services and does not affect requirements set out in Union or national law relating to products or services intermediated through intermediary services, including in situations where the intermediary service constitutes an integral part of another service which is not an intermediary service, as recognised in the case-law of the Court of Justice of the European Union.
In order to ensure the effectiveness of the rules laid down in this Regulation and a level playing field within the internal market, those rules should apply to providers of intermediary services irrespective of their place of establishment or their location, in so far as they offer services in the Union, as evidenced by a substantial connection to the Union.
Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in the absence of such an establishment, where the number of recipients of the service in one or more Member States is significant in relation to the population thereof, or on the basis of the targeting of activities towards one or more Member States.
The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or the use of a relevant top-level domain.
The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in a language used in that Member State, or from the handling of customer relations such as by providing customer service in a language generally used in that Member State.
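Those factors can be read as a simple scoping test; the sketch below is a loose, non-authoritative paraphrase of that logic, with all parameter names invented for illustration.

```python
# Loose, non-authoritative paraphrase of the "substantial connection to the Union"
# test described above. All parameter names are illustrative.

def has_substantial_connection_to_union(
    established_in_union: bool,
    significant_recipients_in_a_member_state: bool,
    targeting_factors: list[str],
) -> bool:
    """A provider is in scope where it is established in the Union, has a significant
    number of recipients in one or more Member States, or targets its activities
    towards one or more Member States (e.g. local language or currency, national
    app store availability, local advertising, customer service in a local
    language, use of a relevant top-level domain)."""
    return (
        established_in_union
        or significant_recipients_in_a_member_state
        or len(targeting_factors) > 0
    )
```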
This Regulation fully harmonises the rules applicable to intermediary services in the internal market with the objective of ensuring a safe, predictable and trusted online environment, addressing the dissemination of illegal content online and the societal risks that the dissemination of disinformation or other content may generate, and within which fundamental rights enshrined in the Charter are effectively protected and innovation is facilitated.
Accordingly, Member States should not adopt or maintain additional national requirements relating to the matters falling within the scope of this Regulation, unless explicitly provided for in this Regulation, since this would affect the direct and uniform application of the fully harmonised rules applicable to providers of intermediary services in accordance with the objectives of this Regulation.
In order to achieve the objective of ensuring a safe, predictable and trustworthy online environment, for the purpose of this Regulation the concept of ‘illegal content’ should broadly reflect the existing rules in the offline environment. In particular, the concept of ‘illegal content’ should be defined broadly to cover information relating to illegal content, products, services and activities.
In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that the applicable rules render illegal in view of the fact that it relates to illegal activities.
Illustrative examples include the sharing of images depicting child sexual abuse, the unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the sale of products or the provision of services in infringement of consumer protection law, the non-authorised use of copyright protected material, the illegal offer of accommodation services or the illegal sale of live animals.
In contrast, an eyewitness video of a potential crime should not be considered to constitute illegal content, merely because it depicts an illegal act, where recording or disseminating such a video to the public is not illegal under national or Union law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is in compliance with Union law and what the precise nature or subject matter is of the law in question.
The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, meaning making the information easily accessible to recipients of the service in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question.
Accordingly, where access to information requires registration or admittance to a group of recipients of the service, that information should be considered to be disseminated to the public only where recipients of the service seeking to access the information are automatically registered or admitted without a human decision or selection of whom to grant access.
Interpersonal communication services, such as emails or private messaging services, fall outside the scope of the definition of online platforms as they are used for interpersonal communication between a finite number of persons determined by the sender of the communication.
However, the obligations set out in this Regulation for providers of online platforms may apply to services that allow the making available of information to a potentially unlimited number of recipients, not determined by the sender of the communication, such as through public groups or open channels. Information should be considered disseminated to the public within the meaning of this Regulation only where that dissemination occurs upon the direct request by the recipient of the service that provided the information.
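Read together, these recitals amount to a simple rule of thumb, sketched below in illustrative code; the parameters are our own shorthand, not terms of the Regulation.

```python
# Illustrative shorthand for the 'dissemination to the public' notion described above.

def disseminated_to_the_public(
    potentially_unlimited_audience: bool,
    requires_registration_or_admission: bool,
    admission_is_automatic: bool,
) -> bool:
    """Information is disseminated to the public when it is made easily accessible
    to recipients of the service in general; where registration or admission to a
    group is required, only if recipients are admitted automatically, without a
    human decision on whom to grant access."""
    if not potentially_unlimited_audience:
        return False  # interpersonal messaging to a finite, sender-chosen set is out of scope
    if requires_registration_or_admission:
        return admission_is_automatic
    return True
```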
Intermediary services span a wide range of economic activities which take place online and that develop continually to provide for transmission of information that is swift, safe and secure, and to ensure convenience of all participants of the online ecosystem.
For example, ‘mere conduit’ intermediary services include generic categories of services, such as internet exchange points, wireless access points, virtual private networks, DNS services and resolvers, top-level domain name registries, registrars, certificate authorities that issue digital certificates, voice over IP and other interpersonal communication services, while generic examples of ‘caching’ intermediary services include the sole provision of content delivery networks, reverse proxies or content adaptation proxies. Such services are crucial to ensure the smooth and efficient transmission of information delivered on the internet.
Examples of ‘hosting services’ include categories of services such as cloud computing, web hosting, paid referencing services or services enabling sharing information and content online, including file storage and sharing.
Intermediary services may be provided in isolation, as a part of another type of intermediary service, or simultaneously with other intermediary services. Whether a specific service constitutes a ‘mere conduit’, ‘caching’ or ‘hosting’ service depends solely on its technical functionalities, which might evolve in time, and should be assessed on a case-by-case basis.
Providers of intermediary services should also be required to designate a single point of contact for recipients of services, enabling rapid, direct and efficient communication in particular by easily accessible means such as telephone numbers, email addresses, electronic contact forms, chatbots or instant messaging. It should be explicitly indicated when a recipient of the service communicates with chatbots. Providers of intermediary services should allow recipients of services to choose means of direct and efficient communication which do not solely rely on automated tools. Providers of intermediary services should make all reasonable efforts to guarantee that sufficient human and financial resources are allocated to ensure that this communication is performed in a timely and efficient manner.
Providers of intermediary services that are established in a third country and that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives to the relevant authorities and make it publicly available.
In order to comply with that obligation, such providers of intermediary services should ensure that the designated legal representative has the necessary powers and resources to cooperate with the relevant authorities.
This could be the case, for example, where a provider of intermediary services appoints a subsidiary undertaking of the same group as the provider, or its parent undertaking, if that subsidiary or parent undertaking is established in the Union. However, it might not be the case, for instance, when the legal representative is subject to reconstruction proceedings, bankruptcy, or personal or corporate insolvency. That obligation should allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers.
It should be possible for a legal representative to be mandated, in accordance with national law, by more than one provider of intermediary services. It should be possible for the legal representative to also function as a point of contact, provided the relevant requirements of this Regulation are complied with.
4 October 2022 - The Council approved the Digital Services Act (DSA).
Following the Council’s approval, and given that the European Parliament had already approved it, the legislative act was adopted.
Next step: After being signed by the President of the European Parliament and the President of the Council, it will be published in the Official Journal of the European Union, and will start to apply fifteen months after its entry into force.
5 July 2022 - Text of the European Parliament legislative resolution on the proposal for a regulation on a Single Market For Digital Services (Digital Services Act).
This Regulation fully harmonises the rules applicable to intermediary services in the internal market with the objective to ensure a safe, predictable and trusted online environment, addressing the dissemination of illegal content online and the societal risks that the dissemination of disinformation or other content may generate, where fundamental rights enshrined in the Charter are effectively protected and innovation is facilitated.
Accordingly, Member States should not adopt or maintain additional national requirements on the matters falling within the scope of this Regulation, unless explicitly provided for in this Regulation, since this would affect the direct and uniform application of the fully harmonised rules applicable to providers of intermediary services in accordance with the objectives of this Regulation.
This does not preclude the possibility of applying other national legislation to providers of intermediary services, in compliance with Union law, where the provisions of national law pursue other legitimate public interest objectives than those pursued by this Regulation.