DSA obligations are now binding for very large online platforms and search engines


published 10 October 2023 | reading time approx. 3 minutes


As of 25 August 2023, the obligations set out in the Digital Services Act (“DSA”) for very large online platforms and very large online search engines became directly applicable throughout the European Union. From the strengthening of online protection for minors, to the extension of transparency obligations for content moderation mechanisms, from the fight against fake news to the clampdown on the publication of illegal content: let us guide you through the main requirements.

 
  
Before proceeding, it is crucial to recall that only platforms and search engines with an average of 45 million or more monthly active recipients in the European Union are considered “very large”. The relevant entities were expressly designated by the European Commission in a specific act dated 25 April 2023, among which we find – unsurprisingly – some major players in the technology sector, including Google, Meta, Apple and Microsoft.
  
The strategic importance of these entities is all the more evident in light of the goal that the European legislator intends to achieve through the DSA, namely the creation of a safe, predictable and trusted online environment that facilitates innovation and effectively protects the fundamental rights of individuals. It is clear that this mission cannot be accomplished without engaging the players that the Commission has identified as holding a dominant position in the digital market.
 

General obligations

What does the DSA require of such entities? Firstly, there is a series of broader obligations that the DSA addresses to all intermediary service providers falling within its scope, without distinction. These entities are required, in brief, to:
  • designate a single point of contact to communicate directly with the competent authorities and the recipients of the service;
  • supplement their terms and conditions with information on any restrictions imposed, in clear, simple, intelligible, user-friendly and unambiguous language;
  • ensure greater transparency of content moderation logic and procedures, with particular focus on algorithmic decision-making and on human verification and oversight.

  

Obligations regardless of size

Secondly, the DSA tailors certain requirements specifically to online platform providers, regardless of their size. Among these, the following are worth mentioning:
  • adopt illegal content reporting mechanisms;
  • provide the recipients of the service with clear and specific statements of reasons for restrictions imposed on illegal content or on information incompatible with their terms and conditions;
  • ensure access by recipients of the service to an internal complaint handling system related to the services provided;
  • temporarily suspend the provision of services to parties that frequently provide manifestly illegal content;
  • design online interfaces so that they enable recipients to take free and informed decisions, avoiding any manipulation or deception (which should be read, in simple terms, as a general ban on dark or deceptive patterns);
  • provide detailed information on advertising activities carried out within the platform, in a clear, concise, unambiguous and real-time manner;
  • ensure greater transparency regarding the use and operation of any content recommendation systems;
  • increase the protection of minors in the digital world, including through an express prohibition on the use of profiling-based advertising directed at them;
 

Obligations exclusive to very large online platforms and search engines

Finally, the DSA establishes a series of obligations dedicated exclusively to very large online platforms and very large online search engines, which include the duty to:
  • assess the systemic risks arising from the design, operation and use of the service – with particular attention to any algorithmic systems deployed – and duly mitigate such risks;
  • collaborate with the European Commission in the context of crisis response;
  • undergo, at their own expense and at least once a year, independent audits to assess the degree of compliance with the DSA;
  • include, among their recommender systems, at least one option that is not based on profiling of the recipient of the service;
  • upon request, provide access to the data necessary to monitor compliance with the requirements of the DSA;
  • establish an internal independent function to monitor compliance with the DSA.
 

High penalties in the event of violation

Only time will tell how the entities subject to the obligations described above will cope with these burdensome measures. What we do know is that the fines for violations are substantial, even exceeding those provided for under the GDPR: penalties may amount to up to 6 percent of the annual worldwide turnover of the intermediary service provider concerned in the preceding financial year.