The DSA Transparency Database (TDB) can be viewed here.
To recap, the DSA sets out asymmetric obligations for different types of digital service providers, depending on the nature of their services and the scope of their reach and societal impact. The DSA breaks digital service providers down into four categories, each broadly a narrower subset of the one before: (i) online intermediaries; (ii) hosting services (services that consist of the storage of information provided by, and at the request of, a recipient of the service, e.g. cloud and web hosting services); (iii) online platforms (hosting services that publicly disseminate user information, e.g. social media platforms, content sharing platforms and online marketplaces); and (iv) very large online platforms (VLOPs) and very large online search engines (VLOSEs). Some obligations apply to all four of these categories whilst others apply only to some of them, with the highest tier of VLOPs/VLOSEs being subject to all of the DSA’s rules.
In particular, the DSA imposes a number of content moderation obligations on providers of hosting services (which include online platforms). Article 17 of the DSA requires all hosting providers to provide a clear and specific statement of reasons (SOR) to any affected service user whenever they remove or otherwise restrict the availability of, or access to, content provided by that user, or restrict their provision of the service to that user (i.e. they must inform users of the content moderation decisions they take and explain the reasons behind those decisions). This obligation applies irrespective of whether the action is taken because the content is considered illegal or incompatible with the hosting provider’s terms and conditions, and irrespective of whether it follows a notice or the hosting provider’s own-initiative investigations.
Note that this obligation currently applies only to VLOPs and VLOSEs, but will extend to all other hosting service providers (including online platforms) from 17 February 2024.
At a minimum, each SOR must include the following (see the illustrative sketch after this list):
- Details on the consequences of the decision, its territorial scope and duration
- The facts and circumstances relied on in taking the decision
- Information on any use made of automated means in taking the decision
- In cases involving allegedly illegal content, a reference to the legal ground relied on and an explanation of why the content is considered illegal on that ground
- In cases concerning content that is incompatible with the provider’s terms and conditions, a reference to the contractual ground relied on and an explanation of why the content is considered incompatible with it
- Clear and user-friendly information on the possibilities for redress available to the user.
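By way of illustration only, the sketch below shows one way a hosting provider might capture these minimum fields in its own content moderation tooling. The field names are hypothetical and do not reflect any official schema.

```python
# Purely illustrative sketch: one way a hosting provider might record the
# minimum Article 17 content of an SOR internally. Field names are
# hypothetical and do not reflect any official schema.
from dataclasses import dataclass
from typing import Optional


@dataclass
class StatementOfReasons:
    decision_taken: str                # e.g. "removal", "visibility restriction", "account suspension"
    territorial_scope: list[str]       # e.g. ["EU"] or specific Member States
    duration: Optional[str]            # e.g. "permanent" or an end date
    facts_and_circumstances: str       # the facts and circumstances relied on in taking the decision
    automated_detection: bool          # whether automated means were used to detect the content
    automated_decision: bool           # whether automated means were used to take the decision
    legal_ground: Optional[str]        # where the content is considered illegal, the legal ground and why
    contractual_ground: Optional[str]  # where the content breaches terms and conditions, the clause and why
    redress_options: list[str]         # e.g. internal complaints, out-of-court dispute settlement, courts
```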
In addition, providers of online platforms are also required under Article 24(5) of the DSA to submit their SORs to the TDB (unless they benefit from the micro and small enterprise exemption, i.e. they have fewer than 50 employees and an annual turnover of no more than €10 million). This requirement is intended to ensure transparency, enable scrutiny of content moderation decisions and help monitor the spread of illegal and harmful content online. SORs must be submitted without “undue delay” and in an automated manner to allow close to real-time updates of the database where technically possible. Online platform providers must also ensure that the SORs they submit do not contain personal data.
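On the personal data point, one purely illustrative approach is to run any free-text SOR fields through a redaction step before submission. The simple patterns below are assumptions made for illustration only; a production pipeline would need a far more robust approach.

```python
# Illustrative sketch of stripping obvious personal data (email addresses and
# phone-number-like strings) from a free-text SOR field before submission.
# The patterns are assumptions for illustration, not a complete solution.
import re

EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_PATTERN = re.compile(r"\+?\d[\d\s().-]{7,}\d")


def redact_personal_data(text: str) -> str:
    """Replace email addresses and phone-number-like strings with placeholders."""
    text = EMAIL_PATTERN.sub("[REDACTED EMAIL]", text)
    text = PHONE_PATTERN.sub("[REDACTED PHONE]", text)
    return text


facts = "Reported by jane.doe@example.com; user posted contact number +32 2 123 45 67."
print(redact_personal_data(facts))
# -> "Reported by [REDACTED EMAIL]; user posted contact number [REDACTED PHONE]."
```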
Over 63 million SORs have already been submitted to the TDB. The level of detail provided in each SOR varies considerably, as do the reasons cited for removal (for example, some providers simply cite “incompatible with Community Guidelines/Policies” or “incompatible with terms and conditions” as the ground for removal, without any further explanation). Under the DSA, service users can challenge an online platform provider’s content moderation decisions through the provider’s internal complaint-handling mechanisms, an out-of-court dispute settlement body or before a national court in accordance with applicable law. Judging by the number of SORs being submitted to the TDB on an hourly basis, the complaint-handling teams of VLOPs are going to be kept very busy for the foreseeable future!
How can in-scope service providers prepare for the DSA’s content moderation and SOR obligations?
Digital service providers should first assess whether any of their services qualify as hosting services under the DSA. If so, they should ensure that they can comply with the various content moderation and SOR obligations under the DSA which apply to hosting services. These include implementing a “notice and action” mechanism under Article 16 of the DSA to allow any individual or entity to notify the provider of content they consider to be illegal. Hosting providers must ensure that these mechanisms are easy to access and user-friendly, allow notices to be submitted exclusively by electronic means, and facilitate the submission of sufficiently precise and adequately substantiated notices which identify the content in question (for example, its exact electronic location) and explain why the notifier alleges it is illegal; in theory, this should allow the provider to properly assess whether the information is in fact illegal and should be removed. Hosting providers must process such notices and take their content moderation decisions in a timely, diligent, non-arbitrary and objective manner, and will then need to co-ordinate their content moderation processes with the SOR information requirements under Article 17 of the DSA.
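As a purely illustrative example, the sketch below shows the kind of information an Article 16 notice form might capture so that notices are sufficiently precise and substantiated. The field names are hypothetical and not drawn from any official schema.

```python
# Illustrative sketch of how an Article 16 "notice and action" submission
# might be captured internally. Field names are hypothetical.
from dataclasses import dataclass
from typing import Optional


@dataclass
class IllegalContentNotice:
    content_location: str          # exact electronic location, e.g. a URL to the content
    explanation: str               # sufficiently substantiated explanation of why the content is allegedly illegal
    notifier_name: Optional[str]   # name of the notifying individual or entity, where required
    notifier_email: Optional[str]  # contact email of the notifier, where required
    good_faith_statement: bool     # confirmation that the notice is submitted in good faith
    received_at: str               # timestamp, to help evidence timely and diligent processing
```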
If any of the hosting provider’s services fall within the wide definition of an online platform (and the provider cannot benefit from the micro/small enterprise exemption), it must also submit its SORs to the TDB. To facilitate this, the European Commission has provided an API that allows online platform providers issuing large numbers of SORs to submit them automatically, without using the web interface; the API documentation and the database’s source code are publicly available on GitHub. The Commission will add new features to the database in the coming months and a Research API is being considered.
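The sketch below is a minimal illustration of what an automated submission to the TDB might look like. The endpoint path, request fields and authentication token shown here are assumptions made for illustration only; the actual schema, endpoints and onboarding steps are set out in the Commission’s GitHub repository for the Transparency Database.

```python
# Illustrative sketch of an automated SOR submission to the DSA Transparency
# Database API. The endpoint path, field names and token below are assumptions
# for illustration; consult the Commission's published API documentation for
# the real schema and onboarding process.
import requests

TDB_ENDPOINT = "https://transparency.dsa.ec.europa.eu/api/v1/statement"  # assumed path
API_TOKEN = "<platform-specific token issued via the TDB onboarding process>"  # placeholder

payload = {
    # Hypothetical field names broadly mirroring the Article 17 minimum content.
    "decision_visibility": "CONTENT_REMOVED",
    "decision_ground": "INCOMPATIBLE_CONTENT",
    "incompatible_content_ground": "Breach of community guidelines on spam",
    "content_type": "TEXT",
    "automated_detection": "Yes",
    "automated_decision": "Not automated",
    "territorial_scope": ["EU"],
    "application_date": "2023-11-20",
}

response = requests.post(
    TDB_ENDPOINT,
    json=payload,
    headers={"Authorization": f"Bearer {API_TOKEN}", "Accept": "application/json"},
    timeout=30,
)
response.raise_for_status()
print("SOR submitted, response:", response.json())
```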
As mentioned above, only VLOPs and VLOSEs are currently caught by these content moderation requirements. However, 17 February 2024 is just around the corner, so any business whose digital services fall within the scope of these content moderation obligations (as well as the wider DSA obligations) should be making preparations now to ensure it can comply with all relevant obligations by that deadline.
Given the looming deadline, in-scope businesses should now be taking the time to review their terms and conditions, interfaces, internal processes (such as content moderation, algorithmic decision-making systems, recommender systems, take down mechanisms and complaints procedures) and governance functions to identify any compliance gaps in respect of their obligations under the DSA and put in place plans to remediate any gaps identified before the 17 February 2024 deadline. See our article here for more information on the scope of the DSA and its key provisions.
Please contact Bryony Long, Mary Traynor or Sam Berriman if you have any questions about this article or other DSA related requirements including whether you fall within the scope of the DSA.