The Digital Services Act (DSA)



What is the Digital Services Act (DSA)?

The Digital Services Act is the world’s most ambitious regulation for protecting the digital space against the spread of illegal content and for protecting users’ fundamental rights. No other legislative act regulates social media, online marketplaces, very large online platforms (VLOPs) and very large online search engines (VLOSEs) with this level of ambition. The rules are designed asymmetrically: larger intermediary services with significant societal impact (VLOPs and VLOSEs) are subject to stricter rules.

Under the Digital Services Act, platforms will not only have to be more transparent; they will also be held accountable for their role in disseminating illegal and harmful content.

Amongst other things, the DSA:

1. Lays down special obligations for online marketplaces in order to combat the online sale of illegal products and services;

2. Introduces measures to counter illegal content online and obligations for platforms to react quickly, while respecting fundamental rights;

3. Protects minors online by prohibiting platforms from using targeted advertising based on the use of minors’ personal data as defined in EU law;

4. Imposes certain limits on the presentation of advertising and on the use of sensitive personal data for targeted advertising, including gender, race and religion;

5. Bans misleading interfaces known as ‘dark patterns’, as well as practices designed to mislead users.

Stricter rules apply for very large online platforms and search engines (VLOPs and VLOSEs), which will have to:

1. Offer users at least one recommender system that is not based on profiling;

2. Analyse the systemic risks they create: risks related to the dissemination of illegal content, and negative effects on fundamental rights, on electoral processes, and on gender-based violence or mental health.

In the context of the Russian military invasion of Ukraine, involving grave and widespread violations of the human rights of the Ukrainian people, and its particular impact on the manipulation of online information, the Digital Services Act introduces a crisis response mechanism. This mechanism will make it possible to analyse the impact of the activities of VLOPs and VLOSEs on the crisis, and to decide rapidly on proportionate and effective measures to ensure respect for fundamental rights.


Important notes:

1. This Regulation will apply from 17 February 2024 (according to Article 93, Entry into force and application). Some provisions apply from 16 November 2022.


2. The Digital Services Act package includes:

a. The Digital Services Act,

b. The Digital Markets Act.

Both legislative acts were adopted by the Council and the European Parliament in 2022, and the final texts are available on the "Links" page.

The Digital Markets Act (DMA) targets “gatekeeper platforms” like Google, Amazon and Meta, and requires user consent before personal data is processed for targeted advertising. Notably, most of the companies affected by the Digital Markets Act and the Digital Services Act are based in the United States of America.

If you consider the sanctions for GDPR violations strict (up to 4% of global annual turnover), you may be surprised by the sanctions for Digital Services Act violations (up to 6% of global annual turnover) and for Digital Markets Act violations (up to 10% of global annual turnover, or up to 20% in the case of a repeat offence).
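As a rough illustration of how these caps compare, the sketch below computes the maximum possible fine for a hypothetical company. The turnover figure and the helper function are assumptions for illustration only; actual fines are set case by case by the competent authorities, and these rates are upper bounds, not fixed amounts.

```python
# Illustrative comparison of the maximum-fine caps discussed above.
# The turnover figure is hypothetical; the rates are the statutory caps
# (GDPR up to 4%, DSA up to 6%, DMA up to 10%, or 20% for repeat offences).

def max_fine(global_annual_turnover: float, cap_rate: float) -> float:
    """Upper bound of a fine, given a cap expressed as a fraction of turnover."""
    return global_annual_turnover * cap_rate

turnover = 10_000_000_000  # hypothetical: EUR 10 billion global annual turnover

caps = {
    "GDPR (up to 4%)": 0.04,
    "DSA (up to 6%)": 0.06,
    "DMA (up to 10%)": 0.10,
    "DMA, repeat offence (up to 20%)": 0.20,
}

for label, rate in caps.items():
    print(f"{label}: up to EUR {max_fine(turnover, rate):,.0f}")
```

For a company of this hypothetical size, the gap between the GDPR cap (EUR 400 million) and the DMA repeat-offence cap (EUR 2 billion) shows how much further the new acts go.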


19 October 2022 - We have the final text of Regulation (EU) 2022/2065 on a Single Market For Digital Services (Digital Services Act).

The Digital Services Act Regulation applies to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their role in the intermediation and spread of unlawful or otherwise harmful information and activities.

In practice, certain providers of intermediary services intermediate in relation to services that may or may not be provided by electronic means, such as remote information technology services, transport, accommodation or delivery services. This Regulation applies only to intermediary services and does not affect requirements set out in Union or national law relating to products or services intermediated through intermediary services, including in situations where the intermediary service constitutes an integral part of another service which is not an intermediary service, as recognised in the case-law of the Court of Justice of the European Union.

In order to ensure the effectiveness of the rules laid down in this Regulation and a level playing field within the internal market, those rules should apply to providers of intermediary services irrespective of their place of establishment or their location, in so far as they offer services in the Union, as evidenced by a substantial connection to the Union.

Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in the absence of such an establishment, where the number of recipients of the service in one or more Member States is significant in relation to the population thereof, or on the basis of the targeting of activities towards one or more Member States.

The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or the use of a relevant top-level domain.

The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in a language used in that Member State, or from the handling of customer relations such as by providing customer service in a language generally used in that Member State.

This Regulation fully harmonises the rules applicable to intermediary services in the internal market with the objective of ensuring a safe, predictable and trusted online environment, addressing the dissemination of illegal content online and the societal risks that the dissemination of disinformation or other content may generate, and within which fundamental rights enshrined in the Charter are effectively protected and innovation is facilitated.

Accordingly, Member States should not adopt or maintain additional national requirements relating to the matters falling within the scope of this Regulation, unless explicitly provided for in this Regulation, since this would affect the direct and uniform application of the fully harmonised rules applicable to providers of intermediary services in accordance with the objectives of this Regulation.

In order to achieve the objective of ensuring a safe, predictable and trustworthy online environment, for the purpose of this Regulation the concept of ‘illegal content’ should broadly reflect the existing rules in the offline environment. In particular, the concept of ‘illegal content’ should be defined broadly to cover information relating to illegal content, products, services and activities.

In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that the applicable rules render illegal in view of the fact that it relates to illegal activities.

Illustrative examples include the sharing of images depicting child sexual abuse, the unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the sale of products or the provision of services in infringement of consumer protection law, the non-authorised use of copyright protected material, the illegal offer of accommodation services or the illegal sale of live animals.

In contrast, an eyewitness video of a potential crime should not be considered to constitute illegal content, merely because it depicts an illegal act, where recording or disseminating such a video to the public is not illegal under national or Union law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is in compliance with Union law and what the precise nature or subject matter is of the law in question.

The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, meaning making the information easily accessible to recipients of the service in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question.

Accordingly, where access to information requires registration or admittance to a group of recipients of the service, that information should be considered to be disseminated to the public only where recipients of the service seeking to access the information are automatically registered or admitted without a human decision or selection of whom to grant access.

Interpersonal communication services, such as emails or private messaging services, fall outside the scope of the definition of online platforms as they are used for interpersonal communication between a finite number of persons determined by the sender of the communication.

However, the obligations set out in this Regulation for providers of online platforms may apply to services that allow the making available of information to a potentially unlimited number of recipients, not determined by the sender of the communication, such as through public groups or open channels. Information should be considered disseminated to the public within the meaning of this Regulation only where that dissemination occurs upon the direct request by the recipient of the service that provided the information.

Intermediary services span a wide range of economic activities which take place online and that develop continually to provide for transmission of information that is swift, safe and secure, and to ensure convenience of all participants of the online ecosystem.

For example, ‘mere conduit’ intermediary services include generic categories of services, such as internet exchange points, wireless access points, virtual private networks, DNS services and resolvers, top-level domain name registries, registrars, certificate authorities that issue digital certificates, voice over IP and other interpersonal communication services, while generic examples of ‘caching’ intermediary services include the sole provision of content delivery networks, reverse proxies or content adaptation proxies. Such services are crucial to ensure the smooth and efficient transmission of information delivered on the internet.

Examples of ‘hosting services’ include categories of services such as cloud computing, web hosting, paid referencing services or services enabling sharing information and content online, including file storage and sharing.

Intermediary services may be provided in isolation, as a part of another type of intermediary service, or simultaneously with other intermediary services. Whether a specific service constitutes a ‘mere conduit’, ‘caching’ or ‘hosting’ service depends solely on its technical functionalities, which might evolve in time, and should be assessed on a case-by-case basis.

Providers of intermediary services should also be required to designate a single point of contact for recipients of services, enabling rapid, direct and efficient communication in particular by easily accessible means such as telephone numbers, email addresses, electronic contact forms, chatbots or instant messaging. It should be explicitly indicated when a recipient of the service communicates with chatbots. Providers of intermediary services should allow recipients of services to choose means of direct and efficient communication which do not solely rely on automated tools. Providers of intermediary services should make all reasonable efforts to guarantee that sufficient human and financial resources are allocated to ensure that this communication is performed in a timely and efficient manner.

Providers of intermediary services that are established in a third country and that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives to the relevant authorities and make it publicly available.

In order to comply with that obligation, such providers of intermediary services should ensure that the designated legal representative has the necessary powers and resources to cooperate with the relevant authorities.

This could be the case, for example, where a provider of intermediary services appoints a subsidiary undertaking of the same group as the provider, or its parent undertaking, if that subsidiary or parent undertaking is established in the Union. However, it might not be the case, for instance, when the legal representative is subject to reconstruction proceedings, bankruptcy, or personal or corporate insolvency. That obligation should allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers.

It should be possible for a legal representative to be mandated, in accordance with national law, by more than one provider of intermediary services. It should be possible for the legal representative to also function as a point of contact, provided the relevant requirements of this Regulation are complied with.


4 October 2022 - The Council approved the Digital Services Act (DSA).

Following the Council’s approval, and since the European Parliament had already approved it, the legislative act was adopted.

Next step: After being signed by the President of the European Parliament and the President of the Council, it will be published in the Official Journal of the European Union, and will start to apply fifteen months after its entry into force.


5 July 2022 - Text of the European Parliament legislative resolution on the proposal for a regulation on a Single Market For Digital Services (Digital Services Act).

This Regulation fully harmonises the rules applicable to intermediary services in the internal market with the objective of ensuring a safe, predictable and trusted online environment, addressing the dissemination of illegal content online and the societal risks that the dissemination of disinformation or other content may generate, where fundamental rights enshrined in the Charter are effectively protected and innovation is facilitated.

Accordingly, Member States should not adopt or maintain additional national requirements on the matters falling within the scope of this Regulation, unless explicitly provided for in this Regulation, since this would affect the direct and uniform application of the fully harmonised rules applicable to providers of intermediary services in accordance with the objectives of this Regulation.

This does not preclude the application of other national legislation to providers of intermediary services, in compliance with Union law, where the provisions of national law pursue legitimate public interest objectives other than those pursued by this Regulation.


Contact us

Cyber Risk GmbH
Dammstrasse 16
8810 Horgen
Tel: +41 79 505 89 60
Email: george.lekatis@cyber-risk-gmbh.com

Web: https://www.cyber-risk-gmbh.com


We process and store data in compliance with both the Swiss Federal Act on Data Protection (FADP) and the EU General Data Protection Regulation (GDPR). The service provider is Hostpoint. The servers are located in the Interxion data center in Zürich, the data is saved exclusively in Switzerland, and the support, development and administration activities are also based entirely in Switzerland.


Understanding Cybersecurity in the European Union.

1. The NIS 2 Directive

2. The European Cyber Resilience Act

3. The Digital Operational Resilience Act (DORA)

4. The Critical Entities Resilience Directive (CER)

5. The Digital Services Act (DSA)

6. The Digital Markets Act (DMA)

7. The European Health Data Space (EHDS)

8. The European Chips Act

9. The European Data Act

10. European Data Governance Act (DGA)

11. The Artificial Intelligence Act

12. The European ePrivacy Regulation

13. The European Cyber Defence Policy

14. The Strategic Compass of the European Union

15. The EU Cyber Diplomacy Toolbox