
Aleksandra Kuczerawy

August 9th, 2023

Understanding the Digital Services Act: EU’s stricter rules for large tech companies focusing on the protection of minors


Estimated reading time: 3 minutes


From August 25th, the European Union is tightening its grip on large tech companies through the implementation of the Digital Services Act (DSA), a set of rules aimed at protecting users of online platforms, with a specific focus on the largest online platforms. In this blog, Leen d’Haenens interviews Aleksandra Kuczerawy from KU Leuven’s Centre for IT & IP Law about the DSA’s potential impact on the tech industry and how the EU intends to improve the protection of minors online in relation to content moderation, advertising, and transparency about how content-recommending algorithms work.

The DSA is a regulation that imposes stricter rules on online platforms. It takes effect in August 2023 for the designated Very Large Online Platforms (VLOPs) and becomes applicable to all other platforms as of February 2024. Violations of the DSA may result in significant penalties, including unprecedented fines and, potentially, a temporary suspension of services in EU countries. The DSA represents a new experiment in regulating large tech companies. Despite concerns about the enforcement powers granted to the European Commission, the DSA is a necessary step. The sanctions for violations, including fines of up to 6% of a company’s global annual revenue, are designed to be a deterrent and proportionate to the infringement.

Can you tell us more about the Digital Services Act (DSA) and which companies it applies to?

The DSA is a new law that will apply from August 25th, 2023 to a selection of Very Large Online Platforms (VLOPs) operating in EU countries, and from February 2024 to other online services operating in the EU. The law was passed in April 2022, but the EU only recently announced which companies are designated as VLOPs. These are platforms that each have more than 45 million average monthly active users in the EU. Tech companies like Google, Meta, and Microsoft are subject to the rules for multiple services (e.g. Google Play and Google Maps). The list of designated services includes eight social media platforms (Facebook (Meta), TikTok, Twitter, YouTube, Instagram, LinkedIn, Pinterest, and Snapchat); five marketplaces (Amazon, Booking, AliExpress, Google Shopping, and Zalando); two application stores (Google Play Store and Apple App Store); two search engines designated as Very Large Online Search Engines, or VLOSEs (Google Search and Bing); and two other online platforms (Google Maps and Wikipedia).

Why is there a need for this new law when the EU has previously successfully imposed financial penalties on big tech players that break the rules?

We have, indeed, seen large fines for big tech before, but they were imposed on the basis of different rules, specifically for anti-competitive behaviour. The goal of the DSA is to make the digital space safer but also to strengthen the protection of fundamental rights online. It focuses on combatting illegal content online but also contains rules in other areas, e.g., prohibiting advertising that targets minors.

The DSA is also meant to update the rules on the liability of intermediary online services for content posted by their users, contained in the E-Commerce Directive from 2000. Under both the E-Commerce Directive and the DSA, online services are exempted, under certain conditions, from liability for illegal content originating from their users. Specifically, they must remove illegal content posted by their users once they become aware of its illegal character, although they do not have to actively search for such content. But platforms also remove posts that are not illegal, simply because they are considered “inappropriate” under the platform’s terms and conditions. The new legislation does not specify what content is illegal; this is left to the EU countries. Nor does it prohibit platforms from removing content that violates their Terms and Conditions but is not illegal. The novelty of the DSA is that it contains procedural rules for platforms, clarifying how the content moderation process should work, for example, what the procedure for notifying platforms about illegal content should look like. It also says that platforms can make their own rules about the content they allow, but they have to state those rules clearly in their Terms and Conditions so users know what to expect.

Why did the EU only recently announce which companies would be covered by the DSA?

The DSA consists of layered obligations, with an increasing number of rules for different services depending on their type and reach. The strictest rules apply to VLOPs. Platforms had to submit information about their number of active users to the European Commission (EC) by February 2023. The EC then carefully reviewed the provided information to determine which companies should qualify as VLOPs. It wanted to make sure that only the largest tech companies were covered by the strictest rules. The decision was made to cover platforms that each have more than 45 million average monthly active users in the EU. The Commission is also looking at whether four to five more companies should be added to the list, but that decision has not yet been released.

Are the companies that fall under the Digital Services Act also the ones that may play a critical role in safeguarding the rights of minors online?

The scope of the DSA is very broad. We have 19 VLOPs and VLOSEs, but many other online services offered in the EU will have to comply, to varying extents, with the new rules. For example, they have to put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security for minors, and they will not be allowed to present ads targeting minors. The designated VLOPs will have an obligation to identify, analyse, and assess any systemic risks resulting from the design, functioning, and use made of their services. These systemic risks cover different areas, for example, the spread of illegal content via their platforms, negative impacts on minors, and negative consequences for the physical and mental well-being of all users. Although we will have to see what this process looks like in practice, it will definitely provide more information about the risks for minors. It will also allow platforms to propose measures to mitigate the identified risks. The proposed measures will then be reviewed by independent auditors, who can propose additional mitigation measures. Hopefully, altogether these steps will lead to better safeguarding of the rights of minors in the future.

As an expert in the field, what recommendations would you suggest to complement the current legal framework of the Digital Services Act and better protect the rights of children online?

The main purpose of the DSA is to make the digital space safer, for example by facilitating the fight against illegal content online, but also to strengthen the protection of fundamental rights online. We should not forget that children also have the right to privacy, the right to freedom of expression, and the right to access information. Any measures introduced to protect children online should take that into account. I think we need to focus on improving children’s media literacy skills: teaching them about online risks, teaching them to critically assess the information they find, and equipping them with some basic verification tools. Programmes like that already exist, but they are not very consistent and vary significantly across countries and regions, which creates a digital gap and starts a pattern of digital exclusion very early on.

The Digital Services Act (DSA) will take effect soon, and it is set to have a significant impact on online platforms operating in EU countries. Can you tell us which rules of the DSA will have the greatest impact on online platforms?

Content moderation is a big challenge for social media platforms, and it’s also a topic that causes dissatisfaction among many users. The DSA imposes certain procedures on platforms for the content moderation process, which means that if, for example, Facebook wants to moderate a post or a comment, it must notify the author and provide a statement of reasons for the decision. It also has to provide information about available ways to appeal the decision. The number of notifications related to content moderation will be enormous, and the effort required to operate an appeal system effectively will be significant. Platforms will additionally have to get involved in out-of-court dispute resolution if a user chooses that option to object to a removal or blocking decision. These rules are not only for VLOPs, and it will take a lot of effort for companies (especially smaller ones) to comply with all the new obligations. We will probably see a lot of automation, but it will still require a significant workforce, at least at the beginning.

In addition, the restrictions around targeted ads will have a big impact. Online advertising may no longer target minors or rely on profiling based on sensitive data such as political beliefs. And then there is the systemic risk assessment for VLOPs in multiple areas, e.g., illegal content, respect for fundamental rights, possible negative impacts on minors, impacts on mental well-being, or gender-based violence. At the moment, there is no clear methodology for conducting such assessments, so it will take some time before this approach becomes fully operational.

Where would you place the emphasis on ensuring the online world is as safe as possible for children? Additionally, in your opinion, where should the responsibility lie for implementing these measures?

I think there will always be risks for children online. We don’t need to panic and prohibit everything, but we need to educate them and prepare them for a technology-intensive future. We should also insist on a human-centred and human rights-based approach from the developers and creators of new technologies, which is something the DSA attempts to achieve. It effectively requires online platforms to actively protect the fundamental rights of their users. We will see how this works out in practice.

In conclusion, the Digital Services Act marks a significant step forward in regulating the largest online platforms and protecting the rights of users. However, it is important to remember that legislation alone cannot solve all the challenges that arise in the online world. Continued research, education, and collaboration between stakeholders will be essential to ensure a safer and more responsible digital environment for all.

Notes


This text was originally published on the ySKILLS blog and has been re-posted with permission and small edits.

This post represents the views of the authors and not the position of the Parenting for a Digital Future blog, nor of the London School of Economics and Political Science.

Featured image: photo by Agung Pandit Wiguna on Pexels

About the author

Aleksandra Kuczerawy

Aleksandra Kuczerawy is a postdoctoral researcher at KU Leuven’s Centre for IT & IP Law. She works on the topics of AI, big data and smart cities, intermediary liability, content moderation, freedom of expression, and platform governance. She holds a PhD from KU Leuven and is the author of the book Intermediary Liability and Freedom of Expression in the EU: from concepts to safeguards. In June 2020 she was awarded a grant from the Research Foundation – Flanders (FWO) to conduct three years of research on Online Content Moderation and the Rule of Law.

Posted In: Reflections