Trial Magazine
Theme Article
An Unsafe Space
Social media use is at a record high among teens but poses serious mental health and safety risks. Here are some potential ways to hold platforms accountable.
August 2022

In the past 20 years, social media has transformed public and private life. Social media use among Americans grew from 5% in 2005 to 72% in 2021.1 Use is even higher among teens: 85% use social media, and 45% say they are online “almost constantly.”2 While these platforms bring people together and provide safe spaces for marginalized groups, they have also caused significant harm to their users, particularly teens.3
In December 2021, the U.S. Surgeon General issued an advisory warning of a youth mental health crisis caused in part by the overuse of social media.4 Even Facebook’s own research—disclosed in 2021 by whistleblower Frances Haugen—revealed the harms of social media, from poor body image to anxiety to depression.5 Most alarming is the Centers for Disease Control and Prevention’s finding of a 146% increase in suicide deaths among 12- to 16-year-olds between 2008 and 2019.6
Social media use is not just linked to psychological harm—it’s also associated with sexual exploitation and abuse of minors. Social media platforms do not verify the self-reported age or identity of their users or prevent the exchange of explicit sexual material with minors. Adults posing as kids can interact with minor users, induce minors to share explicit photographs, and groom children to engage in online sexual acts. Sex traffickers also use social media platforms to recruit underage users into sexual exploitation.7
Despite strong evidence that social media is harming American youth, social media companies remain largely unregulated and enjoy statutory immunity from legal claims arising from third-party content on their platforms.8 Most attempts to hold social media companies accountable for harms inflicted on minor users have been unsuccessful. Social media platforms have been held immune in cases involving allegations of online sex trafficking,9 failure to implement safety measures that would prevent minors from lying about their age,10 and facilitating illegal drug sales.11 However, harnessing traditional products liability theories may offer an opportunity for legal recovery.
Products Liability Claims
Internet platforms have historically been treated solely as service providers, not product manufacturers.12 However, the argument for their treatment as a product, rather than a service, is a strong one. Products liability case law has steadily progressed toward recognizing intangible goods such as computer software and algorithms as products.13
Social media companies have sophisticated algorithms that use artificial intelligence and operant conditioning techniques to maximize the amount of time that users spend on their platforms.14 These algorithms anticipate the content that will be attractive to the user and are intentionally designed to be habit-forming.15 Traits such as the personalization of platforms to each consumer’s preferences categorize software as a good under commercial law, and it is arguably “disconsonant to insist on a different standard” under tort law.16
Strict products liability. This is one potential avenue to counter the growing harms of social media use. The rationale for strict products liability, first articulated in 1944, still resonates: “Public policy demands that responsibility be fixed wherever it will most effectively reduce the hazards to life and health inherent in defective products that reach the market. It is evident that the manufacturer can anticipate some hazards and guard against the recurrence of others, as the public cannot.”17 Placing liability on manufacturers who are best positioned to identify and address product hazards is particularly appropriate given the wide disparity in information between social media companies and users.18
Design defect. In recent cases, plaintiffs have argued that social media platforms used by minors are unreasonably dangerous in their design.19 Minor users and their parents are not aware that social media companies—whose profits are directly linked to user engagement levels—intentionally design their platforms to be addictive.
In addition, the algorithms designed to maximize user engagement operate to direct minor users to harmful and dangerous content and to abusive and exploitative adult users. For example, adolescent girls interested in healthy food choices are unknowingly directed by Instagram to sites promoting unnatural body images and anorexic behaviors, and TikTok algorithms direct minor users to content encouraging fatal “blackout challenges” that have resulted in young children hanging themselves.20 African American boys interested in hip hop music are directed to content promoting gun violence,21 and adolescent girls are connected with adult users who encourage them to exchange sexually explicit content.
As designers, operators, and monitors of algorithms, social media companies are aware of how to guard against these hazards but have negligently failed to undertake reasonable product modifications to safeguard their minor users from foreseeable harm. Pending federal district court cases allege that minors have committed suicide or suffered severe psychological harm from social media use or addiction.22 The complaints in these cases point to design defects in the algorithms that power the defendants’ social media platforms.
Failure to warn. Plaintiffs in pending cases also assert failure-to-warn claims based on undisclosed hazards arising from foreseeable product use.23 A basic principle of products liability and negligence law is to provide users with a fair appreciation of the risks associated with hazardous products. Despite mounting evidence that social media companies know that their products can hurt minors, these companies make no attempt to warn them and their parents of the addictive nature of these products and the well-documented mental and physical health consequences of excessive use. They also fail to warn of platform algorithms’ tendencies to direct minor users to harmful content and the widespread practice of adults who use social media platforms to solicit sexually explicit content from minors.
Heightened standard of care. When a child has been harmed by a social media platform, the legal basis for liability is particularly strong. Under the attractive nuisance doctrine, a device of an unusually attractive nature may be “especially alluring to children of tender years” and may impliedly invite children to come on the premises (in this case, the platform).24 Although social media platforms generally require users to be at least 13 years old, much younger children use these platforms. These child users are particularly susceptible to algorithms designed to enhance user engagement.
In recently filed cases, plaintiffs argue that a heightened standard of care applies to social media companies under the attractive nuisance doctrine.25 These plaintiffs argue that social media platforms are readily available to children and invite them as users, yet companies have failed to take steps to prevent abuses such as predatory communications, online bullying, and child sex trafficking.
Overcoming Section 230
Despite efforts to hold social media companies accountable for harms to minors, these companies often successfully invoke Section 230 of the Communications Decency Act to evade liability.26
Section 230 immunizes companies from liability for third-party content posted on their platforms, declaring that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”27 Courts have interpreted Section 230 “to confer sweeping immunity on some of the largest companies in the world,”28 holding that because virtually all content on social media platforms comes from third parties, any claims arising from that content are preempted.29
In 2018, Congress passed the Fight Online Sex Trafficking Act and Stop Enabling Sex Traffickers Act (FOSTA-SESTA), amending Section 230 and creating a narrow statutory exception.30 The law clarified that Section 230 does not impair or limit “any claim in a civil action” brought against a provider that knowingly “recruits, entices, harbors, transports, provides, obtains, advertises, maintains, patronizes, or solicits by any means a person” or benefits from participation in such a venture.31
However, Section 230 remains a daunting challenge for any plaintiff seeking to hold social media companies accountable for harms caused by their platforms. A few recent cases offer guidance to those who come up against a social media company claiming immunity under Section 230.
Snapchat cases. Despite Section 230’s broad immunity, in 2018, the Georgia Court of Appeals allowed products liability claims brought against Snap in connection with its “Speed Filter” to move forward in Maynard v. Snapchat Inc.32 That case involved a driver who struck the plaintiffs’ vehicle while using a feature on the Snapchat app that allows users to superimpose a speedometer on an image to show how fast they were moving.33 The plaintiffs brought design defect claims, alleging Snap’s product “encourages” dangerous speeding.34 The court concluded that the claims did not arise out of third-party content but rather from the design of Snap’s product, so Section 230 did not bar them.35
The Ninth Circuit recently adopted the reasoning in Maynard. In Lemmon v. Snap, Inc., another case involving Snapchat’s Speed Filter, a young driver posted his speed on Snapchat shortly before a fatal crash that killed the driver and his passenger.36 The boys’ parents sued Snap, alleging that it encouraged their sons to drive at dangerous speeds and caused the boys’ deaths through the negligent design of its smartphone application. The district court dismissed the action under Section 230; however, the Ninth Circuit reversed—it found that the plaintiffs did not seek to hold Snap responsible as a publisher or speaker but rather merely sought to “hold Snapchat liable for its own conduct, principally for the creation of the Speed Filter.”37
Facebook cases. The Texas Supreme Court’s 2021 decision in In re Facebook, Inc. demonstrates the challenges of using products liability theory to avoid Section 230 and the opportunity that FOSTA provides.38 In that case, three minor girls alleged they were victims of sex trafficking and had become entangled with their abusers through Facebook.39 The three plaintiffs alleged they were contacted on Facebook or Instagram by adult males, groomed to send naked photographs that were sold over the internet, and ultimately lured into sex trafficking.
The plaintiffs sued Facebook under a Texas statute that creates a civil cause of action against anyone “who intentionally or knowingly benefits from participating in a venture that traffics another person.”40 They alleged Facebook violated this statute by “knowingly facilitating the sex trafficking” and “creating a breeding ground for sex traffickers to stalk and entrap survivors.”41 The Texas Supreme Court concluded that these allegations fell under FOSTA and thus were not barred by Section 230.42
The plaintiffs also brought products liability claims under the theory that, “as a manufacturer, Facebook is responsible for the defective and unreasonable products” that were “marketed to children under the age of 18, without providing adequate warnings and/or instructions regarding the dangers of ‘grooming’ and human trafficking.”43 The plaintiffs argued that these claims did not treat Facebook as a “publisher” or “speaker” because they did not seek to hold the company liable for exercising any sort of editorial function over its users’ communications.44
The Texas Supreme Court squarely acknowledged that “Section 230 is no model of clarity, and there is ample room for disagreement about its scope.”45 Nevertheless, the court noted that “federal and state courts have uniformly held that Section 230 ‘should be construed broadly in favor of immunity’” and concluded that the plaintiffs’ products liability claims were barred by Section 230.46
The plaintiffs petitioned the U.S. Supreme Court for certiorari—and 25 state attorneys general joined as amici curiae.47 However, the Court denied certiorari on jurisdictional grounds because the Texas action was not concluded.48 While agreeing that the Court lacked jurisdiction, Justice Clarence Thomas issued a statement excoriating the broad interpretation of Section 230 by appellate courts and urging that “we should, however, address the proper scope of immunity under Section 230 in an appropriate case.”49
Section 230’s broad interpretation has been criticized extensively, and there’s been a push to narrow its scope.50 With actions pending in multiple federal circuits, the hope is that “an appropriate case” will soon reach the Supreme Court.
Meanwhile, litigators must continue to confront the effects of social media platforms on America’s youth and push for the companies that design them to be held accountable.
Matthew P. Bergman is the founder of the Social Media Victims Law Center and Bergman Draper Oslund Udo in Seattle and an adjunct professor at Lewis & Clark Law School in Portland, Ore. He can be reached at matt@socialmediavictims.org. The views expressed in this article are the author’s and do not represent Trial or AAJ.
Notes
1. Pew Research Ctr., Social Media Fact Sheet, Apr. 7, 2021, https://www.pewresearch.org/internet/fact-sheet/social-media/.
2. Monica Anderson & Jingjing Jiang, Teens, Social Media & Technology 2018, Pew Research Ctr., May 31, 2018, https://tinyurl.com/5bv658r3.
3. See generally Social Media, Politics and the State: Protests, Revolutions, Riots, Crime and Policing in the Age of Facebook, Twitter and YouTube (Daniel Trottier & Christian Fuchs eds., 2015). See also Hunt Allcott, Matthew Gentzkow & Lena Song, Digital Addiction (Nat’l Bureau Econ. Research, Working Paper No. 28936, 2022), https://www.nber.org/papers/w28936 (attributing self-control problems to 31% of social media use).
4. U.S. Surgeon General’s Advisory, Protecting Youth Mental Health 25 (2021), https://www.hhs.gov/sites/default/files/surgeon-general-youth-mental-health-advisory.pdf; see also Ctrs. for Disease Control & Prevention, Fatal Injury Reports, National, Regional and State (1981–2020), https://wisqars.cdc.gov/fatal-reports.
5. A 2019 Facebook document stated: “We make body image issues worse for one in three teen girls.” And a March 2020 presentation to Facebook executives stated: “Thirty-two per cent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” and “teens blame Instagram for increases in the rate of anxiety and depression. This reaction was unprompted and consistent across all groups.” See Damien Gayle, Facebook Aware of Instagram’s Harmful Effect on Teenage Girls, Leak Reveals, Guardian, Sept. 14, 2021, https://tinyurl.com/3b383ssw.
6. Ctrs. for Disease Control & Prevention, supra note 4.
7. A recent Facebook internal report revealed that the platform “enables all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via complex real-world networks.” See Clare Duffy, Facebook Has Known It Has a Human Trafficking Problem For Years. It Still Hasn’t Fully Fixed It, CNN, Oct. 25, 2021, https://tinyurl.com/57n49vm7.
8. Communications Decency Act of 1996, 47 U.S.C. §230 (2018).
9. See, e.g., In re Facebook, Inc., 625 S.W.3d 80, 89–93 (Tex. 2021), cert. denied, 142 S. Ct. 1087 (2022); Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 18–24 (1st Cir. 2016).
10. Doe v. MySpace, Inc., 528 F.3d 413, 418 (5th Cir. 2008).
11. Dyroff v. Ultimate Software Grp., Inc., 934 F.3d 1093, 1094 (9th Cir. 2019), cert. denied, 140 S. Ct. 2761 (2020).
12. See, e.g., Zeran v. Am. Online, Inc., 129 F.3d 327, 330 (4th Cir. 1997); Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157, 1162 (9th Cir. 2008) (en banc); Doe v. MySpace, Inc., 528 F.3d at 418.
13. Allison Zakon, Comment, Optimized for Addiction: Extending Product Liability Concepts to Defectively Designed Social Media Algorithms and Overcoming the Communications Decency Act, 2020 Wis. L. Rev. 1107, 1121 (2020), https://tinyurl.com/59a424tx.
14. Bill Davidow, Skinner Marketing: We’re the Rats, and Facebook Likes Are the Reward, The Atlantic, June 13, 2013, https://tinyurl.com/5bknetbp. For more on operant conditioning generally, see Kendra Cherry, What Is Operant Conditioning and How Does It Work?, Verywell Mind, June 3, 2020, https://tinyurl.com/3ddswme7.
15. See, e.g., Nir Eyal, Hooked: How to Build Habit-Forming Products (2014). Social media companies do not charge consumers for use of their platforms—rather, they sell advertising on their platforms targeted to individual users based on their demographic profile and internet browsing history. The more time that users are engaged on a particular social media platform, the more their exposure to advertising and the greater the profits earned from the platforms. Companies also sell users’ personal data to consumer product and service providers. See also Samuel M. Roth, Data Snatchers: Analyzing TikTok’s Collection of Children’s Data and Its Compliance with Modern Data Privacy Regulations, 22 J. High Tech. L. 1, 37 (2021).
16. Zakon, supra note 13, at 1124.
17. Escola v. Coca Cola Bottling Co. of Fresno, 24 Cal. 2d 453, 462 (Cal. 1944).
18. See, e.g., id. at 461 (noting that the defendant had “exclusive control over both the charging and inspection of the bottles”).
19. See, e.g., Rodriguez v. Meta Platforms, Inc., No. 3:22-CV-00401 (N.D. Cal. Jan. 20, 2022) (wrongful death action on behalf of 11-year-old girl who became addicted to social media and committed suicide on Snapchat); Doffing v. Meta Platforms, Inc., No. 1:22-CV-00100 (D. Or. Jan. 20, 2022) (personal injury claim on behalf of 16-year-old girl who became addicted to social media and was groomed to exchange sexually inappropriate images with adults, suffers from severe depression and anxiety, and developed an eating disorder).
20. See Complaint, Anderson v. TikTok, Inc., No. 2:22-cv-01849 (E.D. Pa. May 12, 2022), https://tinyurl.com/f45b5jdz.
21. The author is currently handling four cases in which young African American men using social media were directed to content promoting guns and gang violence and suffered harm or took their lives. For more on how social media has exacerbated the spread of gun violence among young African Americans, see Desmond Upton Patton et al., Youth Gun Violence Prevention in a Digital Age, 141 Pediatrics Perspectives 1 (2018), https://tinyurl.com/yckk3mae.
22. See Rodriguez, No. 3:22-CV-00401; Doffing, No. 1:22-CV-00100; Anderson, No. 2:22-cv-01849.
23. Id.
24. Texas Utilities Elec. Co. v. Timmons, 947 S.W.2d 191, 193 (Tex. 1997) (quoting Banker v. McLaughlin, 208 S.W.2d 843, 847–48 (Tex. 1948)).
25. See, e.g., Complaint, Rodriguez, No. 3:22-CV-00401.
26. 47 U.S.C. §230.
27. Id.
28. Malwarebytes, Inc. v. Enigma Software Grp. USA, LLC, 141 S. Ct. 13 (2020).
29. See, e.g., Zeran, 129 F.3d 327.
30. Pub. L. No. 115-164, 132 Stat. 1253 (2018).
31. Id.
32. 816 S.E.2d 77, 79 (2018).
33. Id.
34. Id.
35. Id. at 83.
36. 995 F.3d 1085, 1088 (9th Cir. 2021).
37. Id. at 1093 (citations and quotations omitted).
38. In re Facebook, Inc., 625 S.W.3d 80, 83 (Tex. 2021), cert. denied sub nom., Doe v. Facebook, Inc., 142 S. Ct. 1087 (2022).
39. Id. at 83.
40. Tex. Civ. Prac. & Rem. Code §98.002.
41. In re Facebook, 625 S.W.3d at 96.
42. Id. at 83.
43. Id. at 85 (internal quotations omitted).
44. Id. at 93.
45. Id. at 83.
46. Id. at 90 (citing Force v. Facebook, Inc., 934 F.3d 53, 64 (2d Cir. 2019)).
47. Brief for the State of Texas and 24 Other States as Amici Curiae in Support of Petitioner, Doe v. Facebook, 142 S. Ct. 1087 (2022), https://tinyurl.com/mr2vsxfa.
48. Doe v. Facebook, 142 S. Ct. at 1087.
49. Id. at 1088.
50. See Hearing to Examine Section 230 Immunity: Focusing on Big Tech., Before the S. Comm. on Commerce, Science & Transportation, 116th Congress (2020), https://tinyurl.com/3madz9rr.