
Jonny Shipp

Ruairí Harrison

Aebha Curtis

October 7th, 2024

Resolving content disputes outside the courtroom using the Digital Services Act


Article 21 of the EU Digital Services Act (DSA) establishes a mechanism for out-of-court settlement of disputes between users and platforms. The Internet Commission’s Jonny Shipp, Ruairí Harrison, and Aebha Curtis examine how this might work in practice, and potential challenges along the way.

Out-of-court Dispute Settlement in the DSA – An Overview

In 2022, the EU adopted the Digital Services Act (DSA), a new digital rulebook affecting EU users, platforms and businesses alike. The DSA’s core objectives are to improve the safety of EU internet users whilst protecting fundamental rights such as freedom of expression, and to empower users to assert their rights online.

Article 21 DSA’s out-of-court dispute settlement (ODS) mechanism is one unique tool for achieving these objectives, with potentially far-reaching implications for users: it allows users to challenge a platform’s content-related decisions before an independent, expert non-judicial body. Previously, a user wishing to challenge a platform’s moderation decision could do so only via the platform’s internal complaints process (which typically only the larger platforms offer) or in a court of law. Given the potential for bias in the former and the costs involved in the latter, ODS offers a novel redress route which EU regulators believe will be affordable, independent and easily accessible.

What is ODS?

ODS (also known as Alternative Dispute Resolution, or ADR) is a non-judicial form of redress that allows consumers to resolve a complaint with a business via an independent third-party body. Article 21 DSA marks the first time ODS has been applied to content-related disputes, but the mechanism has proved effective in sectors including aviation, public transport, telecommunications and energy, and has played a role online in domain name disputes since 2000. Across numerous markets, ODS has been indispensable in restoring consumer trust and recalibrating the power imbalance between businesses and consumers. Effective redress gives users the tools to protect their own interests and to participate in the administration of justice, enhancing their sense of agency and, in turn, building trust between the user and the service.

In the case of online platforms, Article 21 DSA introduces ODS bodies that will be certified by national regulators in the EU (the Digital Services Coordinators, or DSCs) to settle content moderation disputes. This covers platforms’ decisions to remove content or disable access to it, but also decisions merely to demote content (as with shadow banning, a common platform practice of significantly reducing the visibility of content).

To be certified, ODS bodies must satisfy several conditions, most notably ‘independence’ and ‘expertise’. These conditions are the focus of the working paper we presented earlier this year at the DSA and Platform Regulation Conference at the Amsterdam Law School, entitled “Settling DSA-related Disputes Outside the Courtroom: The Opportunities and Challenges Presented by Article 21 of the Digital Services Act”. The paper examines these conditions and anticipates the kinds of complaints that ODS bodies may resolve.

The Requirements of ODS Bodies under the DSA

Independence 

Under the new EU rules, ODS bodies must prove to the certifying DSCs that they will be ‘impartial’ and ‘(financially) independent’ of both platforms and users. Our paper explores how minimum thresholds should be coordinated in line with a recent Commission Recommendation on quality requirements for dispute resolution procedures, which explicitly addresses these conditions. Tying together the most pragmatic understanding of impartiality and independence with the way they are presented in the Commission Recommendation, we found that ‘independence’ means not being subject to instructions from either party, while ‘impartiality’ requires bodies to guarantee that their remuneration is not linked to the outcome of the procedure.

Our paper then addresses where bias may enter the ODS ecosystem, from the perspective of both the claimant and the ODS body. We take the Meta Oversight Board as a case study and examine borderline cases, such as ex-employees with sufficient expertise but questionable independence, presenting scenarios that highlight the difficulty of establishing these minimum standards. We also discuss how threats to independence may come from parties not explicitly involved in the dispute, for instance interest groups or state interests, since such scenarios are not directly addressed in the regulation.

One overarching conclusion, which also applies to the ‘expertise’ condition, is that concerns about malicious claims and an inherent pro-claimant bias can be addressed by clearly defining these conditions through coordinated guidance at national (DSC) level. The independence of these bodies is essential to building effective engagement and establishing users’ trust in this redress tool; transparency about the funding and fee structures of ODS bodies will therefore do the most to maximise trust in them and to curb biased decision-making.

Expertise 

ODS bodies must also prove their ‘expertise’ in resolving content disputes. They must have expertise not only in the ‘subject matter’ of the complaint (for example, illegal and/or harmful types of content) but also in at least one EU language and in the resolution of complaints more generally. Previous EU ADR rules provide useful context for fleshing out the DSA’s ‘expertise’ condition, in that they highlight the value of legal knowledge in establishing expertise. Thus, although not explicitly mentioned in the DSA, we argue that the expertise required of ODS bodies should encompass a minimum level of legal competence. This is crucial for complex fundamental rights disputes, most notably those pitting privacy against freedom of expression online.

We also discuss the importance of further delineating what type of expertise each ODS body has. The emerging landscape of ODS bodies may be segmented in several ways, with bodies having particular expertise regarding individual platforms (e.g. Instagram), types of platforms (e.g. gaming platforms), types of content (e.g. hate speech), and/or state-specific expertise (relating to local culture and/or language).

The natural response to the need for state-specific expertise is for states to establish their own state-backed ODS bodies. Austria and Hungary have announced that they will do so, and others may follow. However, questions arise as to how these state bodies can claim expertise across the vast array of content disputes that may land on their doorstep, and how regulators can guarantee such a broad ‘scope’ of expertise. Further uncertainty surrounds how state-backed ODS bodies’ expertise can be shared effectively throughout the market, particularly for recurring types of disputes which proliferate across the EU and may require a coordinated approach. Effective dispute resolution for many platform types, such as dating platforms, online marketplaces and video-sharing platforms, might be achieved more easily at a pan-European level than across all services at a national level.

We argue that one key means of addressing this potential knowledge gap in the ODS market would be the creation of a hub for all ODS bodies, be they state-backed or private. ‘Expertise’ can be enhanced and shared at an industry-wide level by establishing mechanisms for the exchange of knowledge about the market, innovation and best practices. Such a hub could bring together existing and new organisations that have developed, or are developing, expertise; ensure that cases are referred efficiently to the bodies best placed to handle them; and consolidate insights about emerging risks and harms so these can be communicated to the industry to drive improvements.

ODS in Practice: Opportunities and Challenges

To be effective, the recommendations of ODS bodies will need to reflect a balance of punishment and rehabilitation appropriate to the environment and its norms and values. For example, an online dating platform may be directed towards adopting specific approaches for first-time offenders versus repeat offenders. In a child-focused environment, the emphasis may be more appropriately rehabilitative or educational where the offending user is a child, just as the punitive element should be firmer where the offending user is an adult. ODS bodies should recommend that platforms adjust their approach as new issues emerge and ensure they are able to scale their operations. Their rules of procedure will need to address the admissibility and reliability of evidence, while allowing for the fact that these are dynamic environments in which issues extend across platforms and jurisdictions.

By engaging effectively with ODS bodies, platforms will have new opportunities to improve user experiences across national borders and to receive helpful, system-level insights. Our paper discusses these opportunities and challenges in detail, and we are keen to engage with all stakeholders as we develop our thinking in this field.

***

The Internet Commission was established in 2018 with the support of WayraArm and LSE’s Department of Media and Communications. It engaged closely with the development of both the UK Online Safety Act and the EU’s DSA. In 2022, The Internet Commission was acquired by Trust Alliance Group, an organisation that also operates the UK Energy Ombudsman and the UK Communications Ombudsman. Our mission is to contribute to a safer and fairer internet for citizens across the world by helping digital organisations understand their impact on society and improve their trust and safety practices. 

In anticipation of digital legislation, we developed a pioneering framework that became the basis of our two accountability reports on platform content moderation and business cultures, showing how independent evaluation can support digital responsibility. As the DSA came into force, we turned our attention to the opportunities and challenges presented by the out-of-court dispute settlement (ODS) provision in Article 21 DSA. This post is based on a working paper on the legal and practical issues for regulators and companies as they implement this provision. 

***

This article gives the views of the authors and does not represent the position of the Media@LSE blog, nor of the London School of Economics and Political Science.

About the authors

Jonny Shipp

Jonny Shipp is an Independent Consultant and Visiting Fellow at the London School of Economics. He founded The Internet Commission in 2017 and served as its Executive Chair from 2020 until securing its acquisition by Trust Alliance Group (TAG) in 2022. He is retained by TAG as a specialist in digital policy, European affairs, and out-of-court dispute resolution.

Ruairí Harrison

Ruairí Harrison is a Digital Policy Analyst at The Internet Commission. After completing a traineeship at the European Commission’s Digital Services Act Unit and later contributing to The Internet Commission’s Accountability Report 2.0, Ruairí joined The Internet Commission full-time in August 2022. He has an extensive background in European digital rights and has more recently been focusing on the DSA’s redress provisions.

Aebha Curtis

Aebha Curtis is a Strategic Insight Analyst at Trust Alliance Group (TAG), having previously led the quantitative and qualitative analysis for The Internet Commission's Accountability Reports. Aebha has a depth of expertise in tech policy and a proven track record of delivering impactful analyses across a wide range of organisations and projects, including the UK Government's Verification of Children Online project. She recently completed a Graduate Diploma in Law to deepen her understanding of digital regulations and legislation.

Posted In: Internet Governance
