Trinity Chapman 

On October 24, 2023, thirty-three states filed suit against Meta,[1] alleging that its social media content harms and exploits young users.[2]  The plaintiffs go on to allege that Meta’s services are intentionally addictive, promoting compulsive use and leading to severe mental health problems in younger users.[3]  The lawsuit points to specific aspects of Meta’s services that the states believe cause harm.  The complaint asserts that “Meta’s recommendation Algorithms encourage compulsive use” and are harmful to minors’ mental health,[4] and that the use of “social comparison features such as ‘likes’” causes further harm.[5]  The suit further asserts that push notifications from Meta’s products disrupt minors’ sleep and that the company’s use of visual filters “promote[s] eating disorders and body dysmorphia in youth.”[6]

Social media plays a role in the lives of most young people.  A recent Advisory by the U.S. Surgeon General revealed that 95% of teens ages thirteen to seventeen and 40% of children ages eight to twelve report using social media.[7] The report explains that social media has both negative and positive effects.[8]  On one hand, social media connects young people with like-minded individuals online, offers a forum for self-expression, fosters a sense of acceptance, and promotes social connections.[9]  Despite these positive effects, social media harms many young people; researchers have linked greater social media use to poor sleep, online harassment, lower self-esteem, and symptoms of depression.[10]  Social media content undoubtedly impacts the minds of young people—often negatively.  However, the question remains as to whether companies like Meta should be held liable for these effects.

This is far from the first time that Meta has faced suit for its alleged harm to minors.  For example, in Rodriguez v. Meta Platforms, Inc., the mother of Selena Rodriguez, an eleven-year-old social media user, sued Meta after her daughter’s death by suicide.[11]  There, the plaintiff alleged that Selena’s tragic death was caused by her “addictive use and exposure to [Meta’s] unreasonabl[y] dangerous and defective social media products.”[12]  Similarly, in Heffner v. Meta Platforms, Inc., a mother sued Meta after her eleven-year-old son’s suicide.[13]  That complaint alleged that Meta’s products “psychologically manipulat[ed]” the boy, leading to social media addiction.[14]  Rodriguez and Heffner are illustrative of the type of lawsuit regularly filed against Meta.

A.        The Communications Decency Act

In defending such suits, Meta invariably invokes the Communications Decency Act.  Section 230 of the act provides that providers of interactive computer services “shall not be treated as the publisher or speaker of any information provided by another information content provider.”[15]  In effect, the statute shields online services from liability arising from the effects of third-party content.  In asserting the defense, internet companies present a “hands off” picture of their activities; rather than playing an active role in the content that users consume, companies depict themselves as merely opening a forum through which third parties may produce content.[16]

Plaintiffs have responded with incredulity to this application of the act by online service providers, and the act’s exact scope remains unsettled.[17]  In Gonzalez v. Google LLC, the parents of a man who died during an ISIS terrorist attack sued Google, alleging that YouTube’s algorithm recommended ISIS videos to some users, aiding the group’s recruitment efforts.[18]  In defense, Google relied on Section 230 of the Communications Decency Act.[19]  The Ninth Circuit ruled that Section 230 barred the plaintiffs’ claims,[20] but the Supreme Court vacated the Ninth Circuit’s ruling on other grounds, leaving unanswered questions about the act’s scope.[21]

Despite that uncertainty, the defense retains a high likelihood of success.  In the October 24 lawsuit, the viability of Meta’s Section 230 defense will turn on how active a role the court determines Meta played in recommending harmful content and exposing minors to it.

B.        Product Liability

The October 24 complaint against Meta alleges theories of product liability.[22] In framing their product liability claims, plaintiffs focus on the harmful design of Meta’s “products” rather than the harmful content to which users may be exposed.[23] The most recent lawsuit alleges that “Meta designed and deployed harmful and psychologically manipulative product features to induce young users’ compulsive and extended use.”[24]

A look at Meta’s defense in Rodriguez is predictive of how the company will respond to the October 24 suit.  There, the company disputed that Instagram even qualifies as a “product.”[25]  Meta’s motion to dismiss observed that product liability law focuses on “tangible goods” or “physical articles” and contrasted those concepts with the “algorithm” Instagram uses to recommend content.[26]  Given traditional notions of what constitutes a “product,” Meta’s defenses are poised to succeed.  As Meta suggested in its motion to dismiss Rodriguez’s suit, content recommendations, features such as “likes,” and communications from third parties fall outside what courts typically consider a “product.”[27]

To succeed on a product liability theory, plaintiffs must advocate for a more modern conception of what counts as a “product.”  Strong arguments may exist for such a shift; the world of technology has transformed completely since the ALI defined product liability in the Restatement (Second) of Torts.[28]  Still, given this well-settled law, plaintiffs are likely to face an uphill battle.

C.        Whose Job Is It Anyway?

Lawsuits against Meta pose large societal questions about the role of courts and parents in ensuring minors’ safety.  Some advocates place the onus on the companies themselves, urging top-down restrictions on minors’ access to social media.[29]  Others emphasize the role of parents and families in protecting minors from unsafe exposure to social media content;[30] parents, families, and communities may be in better positions than tech giants to know, understand, and combat the struggles that teens face.  Regardless of who is to blame, nearly everyone can agree that the problem needs to be addressed.


[1] In 2021, the Facebook Company changed its name to Meta. Meta now encompasses social media apps like WhatsApp, Messenger, Facebook, and Instagram. See Introducing Meta: A Social Technology Company, Meta (Oct. 28, 2021), https://about.fb.com/news/2021/10/facebook-company-is-now-meta/.

[2] Complaint at 1, Arizona v. Meta Platforms, Inc., No. 4:23-cv-05448 (N.D. Cal. Oct. 24, 2023) [hereinafter October 24 Complaint] (“[Meta’s] [p]latforms exploit and manipulate its most vulnerable users: teenagers and children.”).

[3] Id. at 23.

[4] Id. at 28.

[5] Id. at 41.

[6] Id. at 56.

[7] U.S. Surgeon General, Advisory: Social Media and Youth Mental Health 4 (2023).

[8] Id. at 5.

[9] Id. at 6.

[10] Id. at 7.

[11] Complaint at 2, Rodriguez v. Meta Platforms, Inc., No. 3:22-cv-00401 (Jan. 20, 2022) [hereinafter Rodriguez Complaint].

[12] Id.

[13] Complaint at 2, Heffner v. Meta Platforms, Inc., No. 3:22-cv-03849 (June 29, 2022).

[14] Id. at 13.

[15] 47 U.S.C.S. § 230(c)(1) (LEXIS through Pub. L. No. 118-19).

[16] See, e.g., Dimeo v. Max, 433 F. Supp. 2d 523 (E.D. Pa. 2006), aff’d, 248 F. App’x 280 (3d Cir. 2007). Dimeo is just one example of the strategy used repeatedly by Meta and other social media websites.

[17] Gonzalez v. Google LLC, ACLU, https://www.aclu.org/cases/google-v-gonzalez-llc (last updated May 18, 2023).

[18] Gonzalez v. Google LLC, 2 F.4th 871, 880–81 (9th Cir. 2021).

[19] Id. at 882.

[20] Id. at 881.

[21] Gonzalez v. Google LLC, 598 U.S. 617, 622 (2023).

[22] October 24 Complaint, supra note 2, at 145–98.

[23] Id. at 197.

[24] Id. at 1.

[25] Motion to Dismiss, Rodriguez v. Meta Platforms, Inc., No. 3:22-cv-00401 (June 24, 2022).

[26] Id.

[27] Id.

[28] Restatement (Second) of Torts § 402A (Am. L. Inst. 1965).

[29] Rachel Sample, Why Kids Shouldn’t Get Social Media Until They Are Eighteen, Medium (June 14, 2020), https://medium.com/illumination/why-kids-shouldnt-get-social-media-until-they-are-eighteen-2b3ef6dcbc3b.

[30] Jill Filipovic, Opinion: Parents, Get Your Kids Off Social Media, CNN (May 23, 2023, 6:10 PM), https://www.cnn.com/2023/05/23/opinions/social-media-kids-surgeon-general-report-filipovic/index.html.