EU Digital Services Act to Revolutionize Legal Landscape for Online Intermediaries


On July 5, 2022, the European Parliament approved the final draft text of the EU Digital Services Act (DSA), a landmark law that will transform the EU’s legal framework for regulating online content. The Council is expected to adopt the DSA in September, which could allow it to apply to some of the world’s largest online platforms and search engines as early as the first half of 2023. 
 
The DSA marks a monumental shift in the online regulatory landscape, establishing a harmonized, EU-wide framework for intermediary services in relation to the content that they transmit, host, and, in the case of online platforms, make available to the public. While it largely preserves the intermediary liability principles established in the E-Commerce Directive of 2000, the DSA goes much further, introducing new requirements for intermediaries and empowering regulators with broad investigative and enforcement powers to address non-compliance at both the national and EU levels.
 
Below, we provide an overview of the DSA’s scope, as well as its key requirements concerning notice & action; user appeals and out-of-court dispute settlement; know your business customer (KYBC) obligations; transparency, including for online advertising and recommender systems; risk assessments, mitigation, audits, and information requests; and the compliance function. We also set out how the DSA’s obligations will apply to different types of intermediary services.

 

1. Scope

The DSA applies to "intermediary services" offered to EU users, which consist of "mere conduit," "caching," and "hosting" services, as well as "online search engines." The DSA imposes additional obligations on "online platforms" ("hosting" services – i.e., services that store information provided by, and at the request of, users of the service – that disseminate information to the public), and still further obligations on "very large online platforms" (VLOPs) and "very large online search engines" (VLOS), i.e., those with at least 45 million average monthly active users in the EU that are designated as such by the European Commission. Obligations are cumulative, and the most onerous obligations apply only to VLOPs and VLOS. 
 
The recitals to the DSA confirm that a provider offering multiple categories of intermediary services (e.g., mere conduit, hosting, VLOP, etc.) need only comply, with respect to each service it offers, with the portions of the DSA that apply to such service. 

 

2. Notice & Action

Under the DSA’s "notice & action" regime, hosting service providers must provide an electronic reporting mechanism that allows any individual or entity to easily report specific items of allegedly illegal content. Providers must review and act upon these reports "without undue delay" and, where applicable, must expeditiously remove any reported illegal content. In addition, providers must notify affected users of their decisions and provide a formal "statement of reasons" explaining any decision to restrict the visibility of specific items of information (e.g., by removing, disabling access to, or demoting content, or by suspending or terminating an account). 
 
Online platform providers must ensure that reports from "trusted flaggers" (certified independent entities with expertise and competence in reporting illegal content) are processed and decided on a priority basis.

 

3. User Appeals and Out-of-Court Dispute Settlement

The DSA requires all online platform providers to give users, including individuals and entities that report allegedly illegal content via the notice & action mechanism, access to an effective electronic internal complaints mechanism for appealing certain decisions regarding reported content and accounts. In addition, the DSA provides for the establishment of independent out-of-court dispute settlement bodies, which can issue non-binding decisions on certain disputes relating to accounts or content.

 

4. Know your Business Customer (KYBC)

The DSA sets forth new information-gathering and verification requirements for online platforms that allow traders to use their services to conclude distance contracts with consumers. Specifically, such online platforms must obtain, and make reasonable efforts to verify, certain trader traceability information before allowing traders to use their platforms. Some of that traceability information must be made available to users on the platform’s online interface, and online platforms must take certain steps to protect consumers, including conducting random checks for illegal products and services and notifying users if they become aware that an illegal product or service has been offered.

 

5. Transparency Requirements

The DSA imposes extensive transparency requirements on intermediaries regarding both content moderation and online advertising, with additional requirements for VLOPs and certain obligations also applying to VLOS. For example, all intermediary service providers must publish periodic transparency reports that include, among other things, details of any content moderation engaged in during the relevant period.  
 
In addition, all online platforms must display certain mandatory information on each advertisement that users encounter, including a prominent marking identifying the content as an ad; who posted the ad (or on whose behalf the ad is displayed); who paid for the ad (if different); and meaningful information about the main parameters used to determine that the ad should appear to that user and, where applicable, how to change those parameters. VLOPs and VLOS must also make publicly available online repositories of the ads that appear on their platforms that include certain information about those ads. 
 
All online platforms must also design, organize, and operate their online interfaces in a manner that does not deceive, manipulate, or otherwise materially distort or impair users’ ability to make free and informed decisions. They must set out in their terms of service, among other things, the main parameters used in their content recommender systems, as well as any options available to users to modify or influence those parameters. VLOPs and VLOS must provide users with at least one option for each of their recommender systems that is not based on profiling within the meaning of the GDPR. 

 

6. Risk Assessment, Mitigation, Audits, and Information Requests

The DSA requires VLOPs and VLOS to conduct risk assessments for systemic risks, and to put in place reasonable, proportionate, and effective mitigation measures. All online platforms are also subject to an obligation to put in place certain measures to ensure a high level of privacy, safety, and security of minors on their service.
 
In addition, VLOPs and VLOS are subject to annual independent audits (at their own expense) and information requests, in order to monitor and assess compliance with certain DSA obligations.

 

7. Compliance Function

Each VLOP and VLOS must establish an independent compliance function that reports directly to the management body of the provider and can raise concerns and warn the management body of non-compliance risks.

 

Application of DSA Obligations to Intermediary Services

As noted above, the DSA imposes cumulative obligations on the different categories of intermediary services, with certain of the VLOP obligations also applying to VLOS, as summarized below:

 

The DSA has different categories of regulated entity, and obligations are cumulative: each category is also subject to the obligations imposed on the broader categories that precede it.

All intermediary services (including hosting services, online platforms, and very large online platforms):
- Transparency reporting
- Requirements on terms of service, taking due account of fundamental rights
- Cooperation with national authorities following orders
- Points of contact and, where necessary, a legal representative
- No general monitoring obligation
- Compensation

Hosting services (including online platforms and very large online platforms):
- Notice and action, and the obligation to provide information to users
- Reporting of criminal offences/harm

Online platforms (including very large online platforms):
- Complaint and redress mechanism and out-of-court dispute settlement
- Requirements for online marketplaces, e.g., vetting the credentials of third-party suppliers ("KYBC"), compliance by design, and random checks
- Trusted flaggers
- Measures against abusive notices and counter-notices
- User-facing transparency of online advertising
- Transparency of recommender systems and user choice for access to information
- Protections for minors and ban on targeted adverts to children and those based on special characteristics

Very large online platforms* only:
- Crisis response cooperation
- Risk management obligations and compliance officer
- External and independent risk auditing and public accountability
- Data sharing with authorities and researchers
- Supervisory fee
- Codes of conduct
- User choice not to have recommendations based on profiling

*The obligations in Ch. 3, Sec. 4 DSA (save for certain reporting obligations) also apply to Very Large Online Search Engines pursuant to Art. 33a(1) DSA, which states as follows:

"This Section, with the exception of Article 33(1a) and (1b), shall apply to online search engines which reach a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, and which are designated as very large online search engines in accordance with Article 25(4)."
 

White & Case means the international legal practice comprising White & Case LLP, a New York State registered limited liability partnership, White & Case LLP, a limited liability partnership incorporated under English law and all other affiliated partnerships, companies and entities.

This article is prepared for the general information of interested persons. It is not, and does not attempt to be, comprehensive in nature. Due to the general nature of its content, it should not be regarded as legal advice.

© 2022 White & Case LLP

 
