Recitals 91-100, Digital Services Act (DSA)
(91) In times of crisis, there might be a need for certain specific measures to be taken urgently by providers of very large online platforms, in addition to measures they would be taking in view of their other obligations under this Regulation. In that regard, a crisis should be considered to occur when extraordinary circumstances occur that can lead to a serious threat to public security or public health in the Union or significant parts thereof.
Such crises could result from armed conflicts or acts of terrorism, including emerging conflicts or acts of terrorism, natural disasters such as earthquakes and hurricanes, as well as from pandemics and other serious cross-border threats to public health. The Commission should be able to require, upon recommendation by the European Board for Digital Services (‘the Board’), providers of very large online platforms and providers of very large online search engines to initiate a crisis response as a matter of urgency.
Measures that those providers may identify and consider applying may include, for example, adapting content moderation processes and increasing the resources dedicated to content moderation, adapting terms and conditions, relevant algorithmic systems and advertising systems, further intensifying cooperation with trusted flaggers, taking awareness-raising measures and promoting trusted information and adapting the design of their online interfaces.
The necessary requirements should be provided for to ensure that such measures are taken within a very short time frame and that the crisis response mechanism is only used where, and to the extent that, this is strictly necessary and any measures taken under this mechanism are effective and proportionate, taking due account of the rights and legitimate interests of all parties concerned. The use of the mechanism should be without prejudice to the other provisions of this Regulation, such as those on risk assessments and mitigation measures and the enforcement thereof and those on crisis protocols.
(92) Given the need to ensure verification by independent experts, providers of very large online platforms and of very large online search engines should be accountable, through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. In order to ensure that audits are carried out in an effective, efficient and timely manner, providers of very large online platforms and of very large online search engines should provide the necessary cooperation and assistance to the organisations carrying out the audits, including by giving the auditor access to all relevant data and premises necessary to perform the audit properly, including, where appropriate, to data related to algorithmic systems, and by answering oral or written questions.
Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Providers of very large online platforms and of very large online search engines should not undermine the performance of the audit. Audits should be performed according to best industry practices and high professional ethics and objectivity, with due regard, as appropriate, to auditing standards and codes of practice. Auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks.
This guarantee should not be a means to circumvent the applicability of audit obligations in this Regulation. Auditors should have the necessary expertise in the area of risk management and technical competence to audit algorithms. They should be independent, in order to be able to perform their tasks in an adequate and trustworthy manner. They should comply with core independence requirements for prohibited non-auditing services, firm rotation and non-contingent fees. If their independence and technical competence is not beyond doubt, they should resign or abstain from the audit engagement.
(93) The audit report should be substantiated, in order to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to, the measures taken by the providers of the very large online platform and of the very large online search engine to comply with their obligations under this Regulation. The audit report should be transmitted to the Digital Services Coordinator of establishment, the Commission and the Board following the receipt of the audit report.
Providers should also transmit upon completion without undue delay each of the reports on the risk assessment and the mitigation measures, as well as the audit implementation report of the provider of the very large online platform or of the very large online search engine showing how they have addressed the audit’s recommendations. The audit report should include an audit opinion based on the conclusions drawn from the audit evidence obtained.
A ‘positive opinion’ should be given where all evidence shows that the provider of the very large online platform or of the very large online search engine complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A ‘positive opinion’ should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit.
A ‘negative opinion’ should be given where the auditor considers that the provider of the very large online platform or of the very large online search engine does not comply with this Regulation or the commitments undertaken. Where the audit opinion could not reach a conclusion for specific elements that fall within the scope of the audit, an explanation of reasons for the failure to reach such a conclusion should be included in the audit opinion. Where applicable, the report should include a description of specific elements that could not be audited, and an explanation of why these could not be audited.
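The opinion categories described in this recital form a small closed set: positive, positive with comments, negative, and elements on which no conclusion could be reached. As a purely illustrative sketch (not part of the Regulation, and with all names being assumptions made here for illustration), a compliance team might model an audit report like this:

```python
from dataclasses import dataclass, field
from enum import Enum


class AuditOpinion(Enum):
    """Opinion categories described in recital 93 (illustrative model only)."""
    POSITIVE = "positive"                      # all evidence shows compliance
    POSITIVE_WITH_COMMENTS = "positive+notes"  # remarks with no substantial effect
    NEGATIVE = "negative"                      # non-compliance found


@dataclass
class AuditReport:
    """Minimal record of an audit outcome (hypothetical schema)."""
    provider: str
    opinion: AuditOpinion
    comments: list[str] = field(default_factory=list)
    # Elements within the audit's scope that could not be audited,
    # each mapped to the explanation required by the recital.
    unaudited_elements: dict[str, str] = field(default_factory=dict)
```

The recital's requirement that an inconclusive element be accompanied by an explanation is reflected here by storing each such element together with its reason.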
(94) The obligations on assessment and mitigation of risks should trigger, on a case-by-case basis, the need for providers of very large online platforms and of very large online search engines to assess and, where necessary, adjust the design of their recommender systems, for example by taking measures to prevent or minimise biases that lead to the discrimination of persons in vulnerable situations, in particular where such adjustment is in accordance with data protection law and when the information is personalised on the basis of special categories of personal data referred to in Article 9 of Regulation (EU) 2016/679.
In addition, and complementing the transparency obligations applicable to online platforms as regards their recommender systems, providers of very large online platforms and of very large online search engines should consistently ensure that recipients of their service enjoy alternative options which are not based on profiling, within the meaning of Regulation (EU) 2016/679, for the main parameters of their recommender systems. Such choices should be directly accessible from the online interface where the recommendations are presented.
(95) Advertising systems used by very large online platforms and very large online search engines pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s or search engine’s online interface.
Very large online platforms or very large online search engines should ensure public access to repositories of advertisements presented on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality.
Repositories should include the content of advertisements, including the name of the product, service or brand and the subject matter of the advertisement, and related data on the advertiser, and, if different, the natural or legal person who paid for the advertisement, and the delivery of the advertisement, in particular where targeted advertising is concerned. This information should include both information about targeting criteria and delivery criteria, in particular when advertisements are delivered to persons in vulnerable situations, such as minors.
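The recital effectively specifies the minimum fields of an advertisement repository record: the advertisement content, the product, service or brand, the subject matter, the advertiser, the payer where different, and the targeting and delivery criteria. As an illustrative sketch only (the field names below are assumptions, not terms of the Regulation), such a record could look like:

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class AdRepositoryEntry:
    """One advertisement record mirroring the fields listed in recital 95
    (hypothetical schema for illustration)."""
    content: str                       # the advertisement as presented
    product_service_or_brand: str
    subject_matter: str
    advertiser: str
    payer: Optional[str] = None        # only recorded if different from advertiser
    # Targeting and delivery parameters, e.g. {"age": "18-24"}; particularly
    # relevant where ads reach persons in vulnerable situations such as minors.
    targeting_criteria: dict[str, str] = field(default_factory=dict)
    delivery_criteria: dict[str, str] = field(default_factory=dict)
```

Keeping the payer optional reflects the recital's "if different" qualification: when the advertiser pays for its own advertisement, no separate payer is recorded.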
(96) In order to appropriately monitor and assess the compliance of very large online platforms and of very large online search engines with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data, including data related to algorithms.
Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the very large online platform’s or the very large online search engine’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, including, where appropriate, training data and algorithms, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Such data access requests should not include requests to produce specific information about individual recipients of the service for the purpose of determining compliance of such recipients with other applicable Union or national law.
Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing providers of online platforms, providers of online search engines, Digital Services Coordinators, other competent authorities, the Commission and the public.
(97) This Regulation therefore provides a framework for compelling access to data from very large online platforms and very large online search engines to vetted researchers affiliated to a research organisation within the meaning of Article 2 of Directive (EU) 2019/790, which may include, for the purpose of this Regulation, civil society organisations that are conducting scientific research with the primary goal of supporting their public interest mission.
All requests for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including the protection of personal data, trade secrets and other confidential information, of the very large online platform or of the very large online search engine and any other parties concerned, including the recipients of the service. However, to ensure that the objective of this Regulation is achieved, consideration of the commercial interests of providers should not lead to a refusal to provide access to data necessary for the specific research objective pursuant to a request under this Regulation.
In this regard, whilst without prejudice to Directive (EU) 2016/943 of the European Parliament and of the Council (32), providers should ensure appropriate access for researchers, including, where necessary, by taking technical protections such as through data vaults. Data access requests could cover, for example, the number of views or, where relevant, other types of access to content by recipients of the service prior to its removal by the providers of very large online platforms or of very large online search engines.
(98) In addition, where data is publicly accessible, such providers should not prevent researchers meeting an appropriate subset of criteria from using this data for research purposes that contribute to the detection, identification and understanding of systemic risks. They should provide access to such researchers including, where technically possible, in real-time, to the publicly accessible data, for example on aggregated interactions with content from public pages, public groups, or public figures, including impression and engagement data such as the number of reactions, shares, comments from recipients of the service.
Providers of very large online platforms or of very large online search engines should be encouraged to cooperate with researchers and provide broader access to data for monitoring societal concerns through voluntary efforts, including through commitments and procedures agreed under codes of conduct or crisis protocols. Those providers and researchers should pay particular attention to the protection of personal data, and ensure that any processing of personal data complies with Regulation (EU) 2016/679. Providers should anonymise or pseudonymise personal data except in those cases that would render impossible the research purpose pursued.
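The recital's requirement to anonymise or pseudonymise personal data before sharing it with researchers can be illustrated with a keyed hash that replaces direct identifiers with stable tokens. This is a minimal sketch under stated assumptions (a provider-held secret key, identifiers as strings), not a compliance recipe: real deployments need proper key management, and under Regulation (EU) 2016/679 pseudonymised data remains personal data.

```python
import hashlib
import hmac

# Assumption for illustration: a secret key held by the provider and never
# shared with researchers, so tokens cannot be reversed without it.
SECRET_KEY = b"replace-with-a-managed-secret"


def pseudonymise(user_id: str) -> str:
    """Replace a direct identifier with a stable keyed token (HMAC-SHA-256).

    The same input always yields the same token, so researchers can still
    count distinct recipients without ever seeing the underlying identifier.
    """
    return hmac.new(SECRET_KEY, user_id.encode("utf-8"), hashlib.sha256).hexdigest()
```

Using a keyed HMAC rather than a plain hash prevents re-identification by brute-forcing common identifiers, which is why it is a common pseudonymisation baseline.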
(99) Given the complexity of the functioning of the systems deployed and the systemic risks they present to society, providers of very large online platforms and of very large online search engines should establish a compliance function, which should be independent from the operational functions of those providers.
The head of the compliance function should report directly to the management of those providers, including for concerns of non-compliance with this Regulation. The compliance officers that are part of the compliance function should have the necessary qualifications, knowledge, experience and ability to operationalise measures and monitor the compliance with this Regulation within the organisation of the provider of the very large online platform or of the very large online search engine.
Providers of very large online platforms and of very large online search engines should ensure that the compliance function is involved, properly and in a timely manner, in all issues which relate to this Regulation including in the risk assessment and mitigation strategy and specific measures, as well as assessing compliance, where applicable, with commitments made by those providers under the codes of conduct and crisis protocols they subscribe to.
(100) In view of the additional risks relating to their activities and their additional obligations under this Regulation, additional transparency requirements should apply specifically to very large online platforms and very large online search engines, notably to report comprehensively on the risk assessments performed and subsequent measures adopted as provided by this Regulation.
Note: This is the final text of the Digital Services Act. The full name is "Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act)".