
Karissa Dzurik

January 11th, 2023

To protect our children, social media companies must be held accountable


The beauty of childhood is its simplicity. A child’s years should be free from the responsibility and pressure that the transition into adulthood brings. This idyllic childhood is secured only when those around the child protect it. Unfortunately, when it comes to children’s online lives, the United States’ legal landscape has failed to protect young people from adult dangers. For www.parenting.digital, Karissa Dzurik discusses the implications of under-protecting children in a digital environment where machine learning is used to recommend content, some of it harmful, to children.

The Children’s Online Privacy Protection Act (COPPA), enacted in 1998, still serves as the primary source of online protection for children in the United States. Despite seismic changes to the online landscape since its enactment, the regulation has seen little modernisation. Many argue that COPPA has also gone largely unenforced: the statute is structured without a private right of action, leaving enforcement to the Federal Trade Commission. Because COPPA woefully under-protects children in a dynamic online environment, parents and advocates have instead found success challenging companies under other statutes in court. There is, however, a significant hurdle when it comes to challenging social media companies: Section 230.

Section 230 of the Communications Decency Act was enacted two years before COPPA, and it bars suits that treat an “interactive computer service” as the publisher of “any information provided by another information content provider.” The section creates a barrier against individuals bringing suit by shielding social media platforms from liability for content disseminated through their services. Existing case law interprets the term “publisher,” and with it which companies are protected, to include websites that perform “traditional editorial functions,” including reviewing, editing, and deciding whether to publish content. As long as a website can claim that the content that caused the harm was created by a third party, the platform is immune from liability.

But how far should Section 230’s protections extend? This is exactly the question the Supreme Court is likely to address in Gonzalez v. Google this upcoming term.

Gonzalez arises out of a 2015 ISIS attack that killed Nohemi Gonzalez, a 23-year-old U.S. citizen. Her parents sued Google under the Anti-Terrorism Act, alleging that Google-owned YouTube’s artificial-intelligence-powered algorithms targeted users likely to be interested in ISIS content and presented those videos to them. Section 230 barred Gonzalez’s case in both the district and appellate courts, but with the Supreme Court taking the case, the justices are free to decide what they believe is Section 230’s rightful scope.

YouTube’s recommendation system uses machine learning to study users’ behaviour and offer videos the user may like, even if that specific user has never sought out or watched that type of video content before. As the VP of Engineering at YouTube, Cristos Goodrow, wrote on the video streaming service’s own blog:

there’s an audience for almost every video, and the job of our recommendation system is to find that audience.


Gonzalez argues that YouTube’s method of delivering ISIS content to vulnerable individuals extends beyond the role of a publisher and into that of an accomplice.

In the lower courts, plaintiffs have found success circumventing Section 230 through a theory of product liability. A Ninth Circuit case in 2021 found that Section 230 did not protect Snap, Inc. from liability in a negligent design suit over Snapchat’s “Speed Filter.” Three teenage boys died when their car crashed while travelling at approximately 113 mph, just minutes after they had been using the filter to capture their extreme speed. Because the suit asked whether Snap had violated its duty to refrain from designing a product with unreasonable risk, independent of any content third parties posted, the case fell outside Section 230.

Citing the Snapchat case, a District Court in Portland, Oregon, recently found that Section 230 did not bar a sexual abuse case against Omegle, a popular online chat and video site. In that case, Omegle had paired an 11-year-old girl with a man in his late thirties who then sexually abused her. The plaintiffs targeted the company’s matching system, which pairs users to chat, and alleged product liability through negligent and defective design. Because Omegle could have designed its service to avoid matching minors with adults, the question of whether that failure constituted negligent design is independent of any content users posted, and the case therefore falls outside Section 230.

The Court should continue down this path and use the Gonzalez case to clarify the definition of a “publisher” under Section 230, realigning the statute with what Congress had in mind when it passed the law in 1996. Although tech companies claim that this narrowing would expose online platforms to unreasonable liability and limit their incentive to grow, reach users, and build systems to monitor content, the potential benefits outweigh the risks.

Large numbers of children are active on social media, where they may be exposed to online bullying, sex trafficking, and suicidal ideation. Platforms like Snapchat, Omegle, and YouTube can be sources of severe danger, particularly for vulnerable children and youths. Without the threat of liability, these corporations are unlikely ever to make the changes necessary to protect this vulnerable population. Allowing social media companies, free of any legal liability, to rely on machine learning that exposes individuals to content they might never otherwise have found is a risk that society, and the Court, should not be willing to take.

First published at www.parenting.digital, this post represents the views of the authors and not the position of the Parenting for a Digital Future blog, nor of the London School of Economics and Political Science.

You are free to republish the text of this article under Creative Commons licence crediting www.parenting.digital and the author of the piece. Please note that images are not included in this blanket licence.

Featured image: photo by Omkar Patyane from Pexels

About the author

Karissa Dzurik

Karissa Dzurik is a second-year student at the Levin College of Law and a Fellow with the University of Florida Levin College of Law’s Center on Children and Families. She is passionate about juvenile justice and online privacy.

Posted In: Reflections