The final text of the Digital Services Act (DSA)




Article 35, Mitigation of risks - the Digital Services Act (DSA)


1. Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:


(a) adapting the design, features or functioning of their services, including their online interfaces;


(b) adapting their terms and conditions and their enforcement;


(c) adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;


(d) testing and adapting their algorithmic systems, including their recommender systems;


(e) adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;


(f) reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;


(g) initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;


(h) initiating or adjusting cooperation with other providers of online platforms or of online search engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;


(i) taking awareness-raising measures and adapting their online interface in order to give recipients of the service more information;


(j) taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;


(k) ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.
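

Note: Point (k) above translates directly into interface and API work. The sketch below shows one hypothetical way a provider might implement both halves of the obligation: a prominent marking rendered whenever generated or manipulated media is presented, and an easy-to-use control through which recipients can indicate such content. Every name in it (MediaItem, renderWithProvenanceLabel, the /api/v1/media-indications endpoint) is an illustrative assumption, not something the Regulation prescribes.

// Hypothetical sketch of an Article 35(1)(k) mitigation, in TypeScript.
// Names, markup and endpoint are assumptions for illustration only.

interface MediaItem {
  id: string;
  kind: "image" | "audio" | "video";
  // Set upstream, e.g. by detection tooling or uploader self-declaration;
  // how that determination is made is outside this sketch.
  isGeneratedOrManipulated: boolean;
}

// First half of point (k): make the item "distinguishable through prominent
// markings" when it is presented on the online interface.
function renderWithProvenanceLabel(item: MediaItem): string {
  const banner = item.isGeneratedOrManipulated
    ? `<div class="provenance-banner" role="note">AI-generated or manipulated ${item.kind}</div>`
    : "";
  return `${banner}<media-player src="/media/${item.id}"></media-player>`;
}

// Second half: an "easy to use functionality" for recipients of the service
// to indicate such information - here, a single-call report endpoint.
async function indicateAsGenerated(itemId: string, reporterId: string): Promise<void> {
  await fetch("/api/v1/media-indications", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ itemId, reporterId, reason: "generated-or-manipulated" }),
  });
}

Rendering the marking server-side, as sketched here, is one way to keep it prominent and hard to suppress on the client; the indications collected by the endpoint could in turn feed the content moderation processes referred to in point (c).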


2. The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:


(a) identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online platforms and of very large online search engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;


(b) best practices for providers of very large online platforms and of very large online search engines to mitigate the systemic risks identified.


Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.
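

Note: The breakdown this subparagraph requires maps onto a simple aggregate structure. A minimal illustration follows, with field names that are assumptions rather than anything prescribed by the Regulation.

// Hypothetical shape of one entry in the Board's annual report, in TypeScript.
interface SystemicRiskEntry {
  risk: string;                                      // e.g. "dissemination of illegal content"
  occurrencesByMemberState: Record<string, number>;  // ISO 3166-1 alpha-2 code -> count
}

// The Union-wide view is the aggregate of the per-Member-State figures,
// which keeps the two required breakdowns consistent with each other.
function unionWideTotal(entry: SystemicRiskEntry): number {
  return Object.values(entry.occurrencesByMemberState).reduce((a, b) => a + b, 0);
}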


3. The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.



Note: This is the final text of the Digital Services Act. The full name is "Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act)".




