The final text of the Digital Services Act (DSA)



Preamble, Recitals 51 to 60, Digital Services Act (DSA)


(51) Having regard to the need to take due account of the fundamental rights guaranteed under the Charter of all parties concerned, any action taken by a provider of hosting services pursuant to receiving a notice should be strictly targeted, in the sense that it should serve to remove or disable access to the specific items of information considered to constitute illegal content, without unduly affecting the freedom of expression and of information of recipients of the service.

Notices should therefore, as a general rule, be directed to the providers of hosting services that can reasonably be expected to have the technical and operational ability to act against such specific items. The providers of hosting services who receive a notice for which they cannot, for technical or operational reasons, remove the specific item of information should inform the person or entity who submitted the notice.


(52) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and non-arbitrary processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue.

Those fundamental rights include but are not limited to: for the recipients of the service, the right to freedom of expression and of information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy; for the service providers, the freedom to conduct a business, including the freedom of contract; for parties affected by illegal content, the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination. Providers of hosting services should act upon notices in a timely manner, in particular by taking into account the type of illegal content being notified and the urgency of taking action.

For instance, such providers can be expected to act without delay when allegedly illegal content involving a threat to life or safety of persons is being notified. The provider of hosting services should inform the individual or entity notifying the specific content without undue delay after taking a decision whether or not to act upon the notice.


(53) The notice and action mechanisms should allow for the submission of notices which are sufficiently precise and adequately substantiated to enable the provider of hosting services concerned to take an informed and diligent decision, compatible with the freedom of expression and of information, in respect of the content to which the notice relates, in particular whether or not that content is to be considered illegal content and is to be removed or access thereto is to be disabled.

Those mechanisms should be such as to facilitate the provision of notices that contain an explanation of the reasons why the individual or the entity submitting a notice considers that content to be illegal content, and a clear indication of the location of that content. Where a notice contains sufficient information to enable a diligent provider of hosting services to identify, without a detailed legal examination, that it is clear that the content is illegal, the notice should be considered to give rise to actual knowledge or awareness of illegality.

Except for the submission of notices relating to offences referred to in Articles 3 to 7 of Directive 2011/93/EU of the European Parliament and of the Council (26), those mechanisms should ask the individual or the entity submitting a notice to disclose its identity in order to avoid misuse.
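The elements that recital 53 says a notice should contain can be sketched as a simple data model. This is an illustrative sketch only: the field names and the `Notice` class are assumptions for illustration, not an official schema prescribed by the Regulation.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Notice:
    """Illustrative model of a notice under the DSA notice and action
    mechanism. Field names are assumptions, not an official schema."""
    reasons: str                       # why the submitter considers the content illegal
    content_location: str              # clear indication (e.g. exact URL) of the content
    submitter_identity: Optional[str]  # may be withheld for notices relating to offences
                                       # under Articles 3 to 7 of Directive 2011/93/EU

    def is_sufficiently_substantiated(self) -> bool:
        # Per recital 53, a diligent provider needs both an explanation of the
        # reasons and a clear indication of where the content is located.
        return bool(self.reasons.strip()) and bool(self.content_location.strip())
```

A notice missing either element would not, on this sketch, give rise to the actual knowledge or awareness the recital describes.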


(54) Where a provider of hosting services decides, on the ground that the information provided by the recipients is illegal content or is incompatible with its terms and conditions, to remove or disable access to information provided by a recipient of the service or to otherwise restrict its visibility or monetisation, for instance following receipt of a notice or acting on its own initiative, including exclusively by automated means, that provider should inform in a clear and easily comprehensible way the recipient of its decision, the reasons for its decision and the available possibilities for redress to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression.

That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Where the decision was taken following receipt of a notice, the provider of hosting services should only reveal the identity of the person or entity who submitted the notice to the recipient of the service where this information is necessary to identify the illegality of the content, such as in cases of infringements of intellectual property rights.


(55) Restriction of visibility may consist in demotion in ranking or in recommender systems, as well as in limiting accessibility by one or more recipients of the service or blocking the user from an online community without the user being aware (‘shadow banning’). The monetisation via advertising revenue of information provided by the recipient of the service can be restricted by suspending or terminating the monetary payment or revenue associated to that information.

The obligation to provide a statement of reasons should however not apply with respect to deceptive high-volume commercial content disseminated through intentional manipulation of the service, in particular inauthentic use of the service such as the use of bots or fake accounts or other deceptive uses of the service. Irrespective of other possibilities to challenge the decision of the provider of hosting services, the recipient of the service should always have a right to effective remedy before a court in accordance with the national law.
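The logic of recitals 54 and 55 - a statement of reasons is owed for any of the listed restrictions, except for deceptive high-volume commercial content spread through inauthentic use - can be sketched as follows. The enum values and function name are illustrative assumptions, not terms defined by the Regulation.

```python
from enum import Enum, auto
from typing import Optional


class Restriction(Enum):
    """Restriction types recitals 54-55 associate with a statement of reasons."""
    REMOVAL = auto()
    DISABLING_ACCESS = auto()
    VISIBILITY = auto()     # e.g. demotion in ranking or in recommender systems
    MONETISATION = auto()   # suspending or terminating associated advertising revenue


def statement_of_reasons_required(restriction: Optional[Restriction],
                                  deceptive_high_volume_commercial: bool) -> bool:
    # No restrictive decision: nothing to explain.
    if restriction is None:
        return False
    # Recital 55 exempts deceptive high-volume commercial content disseminated
    # through intentional manipulation (e.g. bots or fake accounts).
    return not deceptive_high_volume_commercial
```

Note that even where the exemption applies, the recipient retains the right to an effective remedy before a court under national law.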


(56) A provider of hosting services may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the provider of hosting services is aware, the suspicion that that recipient may have committed, may be committing or is likely to commit a criminal offence involving a threat to the life or safety of a person or persons, such as offences specified in Directive 2011/36/EU of the European Parliament and of the Council, Directive 2011/93/EU or Directive (EU) 2017/541 of the European Parliament and of the Council.

For example, specific items of content could give rise to a suspicion of a threat to the public, such as incitement to terrorism within the meaning of Article 21 of Directive (EU) 2017/541. In such instances, the provider of hosting services should inform without delay the competent law enforcement authorities of such suspicion. The provider of hosting services should provide all relevant information available to it, including, where relevant, the content in question and, if available, the time when the content was published, including the designated time zone, an explanation of its suspicion and the information necessary to locate and identify the relevant recipient of the service.

This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by providers of hosting services. Providers of hosting services should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities.
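The information recital 56 expects a provider to pass to law enforcement - the content in question, the publication time including the designated time zone, an explanation of the suspicion, and the information needed to locate and identify the recipient - can be sketched as below. The function and its parameter names are illustrative assumptions; note the check that the timestamp actually carries time zone information.

```python
from datetime import datetime


def build_law_enforcement_report(content: str,
                                 published_at: datetime,
                                 suspicion_explanation: str,
                                 recipient_locator: dict) -> dict:
    """Illustrative sketch of a report per recital 56; not an official format."""
    # Recital 56 asks for the publication time "including the designated time
    # zone", so reject naive (zone-less) timestamps.
    if published_at.tzinfo is None:
        raise ValueError("publication time must be timezone-aware")
    return {
        "content": content,
        "published_at": published_at.isoformat(),  # ISO 8601 with UTC offset
        "suspicion": suspicion_explanation,
        "recipient": recipient_locator,
    }
```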


(57) To avoid disproportionate burdens, the additional obligations imposed under this Regulation on providers of online platforms, including platforms allowing consumers to conclude distance contracts with traders, should not apply to providers that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC.

For the same reason, those additional obligations should also not apply to providers of online platforms that previously qualified as micro or small enterprises during a period of 12 months after they lose that status. Such providers should not be excluded from the obligation to provide information on the average monthly active recipients of the service at the request of the Digital Services Coordinator of establishment or the Commission. However, considering that very large online platforms or very large online search engines have a larger reach and a greater impact in influencing how recipients of the service obtain information and communicate online, such providers should not benefit from that exclusion, irrespective of whether they qualify or recently qualified as micro or small enterprises.

The consolidation rules laid down in Recommendation 2003/361/EC help ensure that any circumvention of those additional obligations is prevented. Nothing in this Regulation precludes providers of online platforms that are covered by that exclusion from setting up, on a voluntary basis, a system that complies with one or more of those obligations.


(58) Recipients of the service should be able to easily and effectively contest certain decisions of providers of online platforms concerning the illegality of content or its incompatibility with the terms and conditions that negatively affect them. Therefore, providers of online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions that aim to ensure that the systems are easily accessible and lead to swift, non-discriminatory, non-arbitrary and fair outcomes, and are subject to human review where automated means are used. Such systems should enable all recipients of the service to lodge a complaint and should not set formal requirements, such as referral to specific, relevant legal provisions or elaborate legal explanations.

Recipients of the service who submitted a notice through the notice and action mechanism provided for in this Regulation or through the notification mechanism for content that violates the terms and conditions of the provider of online platforms should be entitled to use the complaint mechanism to contest the decision of the provider of online platforms on their notices, including when they consider that the action taken by that provider was not adequate. The possibility to lodge a complaint for the reversal of the contested decisions should be available for at least six months, to be calculated from the moment at which the provider of online platforms informed the recipient of the service of the decision.
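The six-month window in recital 58 runs from the moment the recipient is informed of the decision. A minimal sketch of that calculation is below; calendar-month arithmetic (with end-of-month clamping) is an assumption of this sketch, since the Regulation itself does not prescribe how the period is computed.

```python
import calendar
from datetime import date


def complaint_deadline(decision_notified_on: date, months: int = 6) -> date:
    """Earliest date until which the complaint window must stay open
    ('at least six months' from notification). Illustrative only."""
    month_index = decision_notified_on.month - 1 + months
    year = decision_notified_on.year + month_index // 12
    month = month_index % 12 + 1
    # Clamp to the last day of the target month (e.g. 31 Aug + 6 months -> 28 Feb).
    day = min(decision_notified_on.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)


def can_still_complain(decision_notified_on: date, today: date) -> bool:
    return today <= complaint_deadline(decision_notified_on)
```

Since six months is a minimum, a platform could of course keep the window open longer.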


(59) In addition, provision should be made for the possibility of engaging, in good faith, in the out-of-court dispute settlement of such disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The independence of the out-of-court dispute settlement bodies should be ensured also at the level of the natural persons in charge of resolving disputes, including through rules on conflict of interest.

The fees charged by the out-of-court dispute settlement bodies should be reasonable, accessible, attractive, inexpensive for consumers and proportionate, and assessed on a case-by-case basis. Where an out-of-court dispute settlement body is certified by the competent Digital Services Coordinator, that certification should be valid in all Member States. Providers of online platforms should be able to refuse to engage in out-of-court dispute settlement procedures under this Regulation when the same dispute, in particular as regards the information concerned and the grounds for taking the contested decision, the effects of the decision and the grounds raised for contesting the decision, has already been resolved by or is already subject to an ongoing procedure before the competent court or before another competent out-of-court dispute settlement body.

Recipients of the service should be able to choose between the internal complaint mechanism, an out-of-court dispute settlement and the possibility to initiate, at any stage, judicial proceedings. Since the outcome of the out-of-court dispute settlement procedure is not binding, the parties should not be prevented from initiating judicial proceedings in relation to the same dispute.

The possibilities to contest decisions of providers of online platforms thus created should leave unaffected in all respects the possibility to seek judicial redress in accordance with the laws of the Member State concerned, and therefore should not affect the exercise of the right to an effective judicial remedy under Article 47 of the Charter. The provisions in this Regulation on out-of-court dispute settlement should not require Member States to establish such out-of-court settlement bodies.


(60) For contractual consumer-to-business disputes regarding the purchase of goods or services, Directive 2013/11/EU ensures that Union consumers and businesses in the Union have access to quality-certified alternative dispute resolution entities. In this regard, it should be clarified that the rules of this Regulation on out-of-court dispute settlement are without prejudice to that Directive, including the right of consumers under that Directive to withdraw from the procedure at any stage if they are dissatisfied with the performance or the operation of the procedure.



Note: This is the final text of the Digital Services Act. The full name is "Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act)".




