The Estate of Bride v. Yolo Techs.

United States Court of Appeals, Ninth Circuit
Aug 22, 2024
No. 23-55134 (9th Cir. Aug. 22, 2024)

Opinion

THE ESTATE OF CARSON BRIDE, by and through his appointed administrator KRISTIN BRIDE; A. K., by and through her legal guardian Jane Doe 1; A. C., by and through her legal guardian Jane Doe 2; A. O., by and through her legal guardian Jane Doe 3; TYLER CLEMENTI FOUNDATION, on behalf of themselves and all others similarly situated, Plaintiffs-Appellants, v. YOLO TECHNOLOGIES, INC., Defendant-Appellee.

Argued and Submitted April 11, 2024
Pasadena, California

Appeal from the United States District Court for the Central District of California
Fred W. Slaughter, District Judge, Presiding
D.C. No. 2:21-cv-06680-FWS-MRW

COUNSEL

Juyoun Han (argued), Eric M. Baum, Andrew Clark, and Jonathan Axel, Eisenberg & Baum LLP, New York, New York, for Plaintiffs-Appellants.

Ramnik S. Pujji (argued), Carol Yur, and Emma Moralyan, Dentons U.S. LLP, Los Angeles, California, for Defendant-Appellee.

Megan Iorio and Tom McBrien, Electronic Privacy Information Center, Washington, D.C., for Amici Curiae Electronic Privacy Information Center and Fairplay.

Before: Eugene E. Siler, [*] Carlos T. Bea, and Sandra S. Ikuta, Circuit Judges.

SUMMARY [**]

Communications Decency Act

The panel reversed the district court's dismissal of plaintiffs' misrepresentation claims and affirmed the district court's dismissal of plaintiffs' products liability claims in their diversity class action alleging that YOLO Technologies violated multiple state tort and product liability laws by developing an anonymous messaging app that promised to unmask bullying and abusive users but never actually did so.

The district court held that § 230 of the Communications Decency Act, which protects apps and websites that receive content posted by third-party users from liability for any content posted on their services, immunized YOLO from liability on plaintiffs' claims and dismissed the complaint.

Reversing the district court's dismissal of plaintiffs' misrepresentation claims, the panel held that the claims survived because plaintiffs seek to hold YOLO accountable for its promise to unmask or ban users who violated the terms of service, and not for a failure to take certain moderation actions.

Affirming the district court's dismissal of plaintiffs' products liability claims, the panel held that § 230 precludes liability because plaintiffs' product liability theories attempt to hold YOLO liable as a publisher of third-party content.

OPINION

SILER, Circuit Judge:

Appellee YOLO Technologies developed an extension for use on the Snapchat application ("app") which allowed users to ask public questions and send and receive anonymous responses. YOLO informed all users that it would reveal the identities of, and ban, anyone who engaged in bullying or harassing behavior. Appellants, three living minor children and the estate of a fourth, all suffered extreme harassment and bullying through YOLO, resulting in acute emotional distress and, in the case of Carson Bride, death by suicide. They brought this diversity class action alleging that YOLO violated multiple state tort and product liability laws by developing an anonymous messaging app that promised to unmask, and thereby prevent, bullying and abusive users, but never actually did so.

The district court held that § 230 of the Communications Decency Act immunized YOLO from these claims and dismissed the complaint. We affirm in part and reverse in part, holding that § 230 bars Plaintiffs' products liability claims but not their misrepresentation claims.

I.

A.

YOLO Technologies developed its app as an extension of the already popular Snapchat app. Marketed mainly toward teenagers in mobile app stores, the app achieved tremendous popularity, reaching the top of the download charts within a week of its launch. It eventually reached ten million active users.

Anonymity was YOLO's key feature. Users would install the app and use it to post public questions and polls for their followers. Other users, also using YOLO, could respond to the questions or polls anonymously, unless they chose to "swipe up" and voluntarily disclose their identity as part of their answer. Without such voluntary revelation, the recipient would not know the responder's account nickname, user information, or any other identifying data.

Anonymous messaging applications, even ones marketed specifically to teens, are not new inventions. Plaintiffs contend that "it [has] long been understood that anonymous online communications pose a significant danger to minors, including by increasing the risk of bullying and other antinormative behavior." In fact, prior applications with anonymous communication features had caused "teenagers [to] take[] their own lives after being cyberbullied."

As a hedge against these potential problems, YOLO added two "statements" to its application: a notification to new users promising that they would be "banned for any inappropriate usage," and another promising to unmask the identity of any user who "sen[t] harassing messages" to others. But, Plaintiffs argue, with a staff of no more than ten people, there was no way YOLO could monitor the traffic of ten million active daily users to make good on its promise, and it in fact never did. Many user reviews of the YOLO app on Apple's app store reflected frustration with harassing and bullying behavior.

B.

Plaintiffs A.K., A.C., A.O., and Carson Bride all downloaded the YOLO extension and used it on the Snapchat app. All four were inundated with harassing, obscene, and bullying messages including "physical threats, obscene sexual messages and propositions, and other humiliating comments." Users messaged A.C. suggesting that she kill herself, just as her brother had done. A.O. was sent a sexual message, and her friend was told she was a "whore" and "boy-obsessed." A.K. received death threats, was falsely accused of drug use, mocked for donating her hair to a cancer charity, and exhorted to "go kill [her]self," which she seriously considered. She suffered for years thereafter. Carson Bride was subjected to constant humiliating messages, many sexually explicit and highly disturbing. Despite his efforts, Carson was unable to unmask the users who were sending these messages and discover their identities. On June 23, 2020, Carson hanged himself at his home.

A.K. attempted to utilize YOLO's promised unmasking feature but received no response. Carson searched the internet diligently for ways to unmask the individuals sending him harassing messages, with no success. Carson's parents continued his efforts after his death, first using YOLO's "Contact Us" form on its Customer Support page approximately two weeks after his death. There was no answer. Approximately three months later, his mother Kristin Bride sent another message, this time to YOLO's law enforcement email, detailing what happened to Carson and the messages he received in the days before his death. The email message bounced back as undeliverable because the email address was invalid. She sent the same to the customer service email and received an automated response promising an answer that never came. Approximately three months later, Kristin reached out to a professional friend who contacted YOLO's CEO on LinkedIn, a professional networking site, with no success. She also reached out again to YOLO's law enforcement email, with the same result as before.

Kristin Bride filed suit against YOLO and other defendants no longer part of the action. The first amended complaint alleged twelve causes of action including product liability based on design defects and failure to warn, negligence, fraudulent and negligent misrepresentation, unjust enrichment, and violations of Oregon, New York, Colorado, Pennsylvania, Minnesota, and California tort law. Plaintiffs' counsel agreed at a hearing that the state law claims were all based in "misrepresentation, intentional and negligent." Forty-eight hours after Plaintiffs filed this suit, Snap suspended YOLO's access to its application and later announced a complete ban on anonymous messaging apps in its app store.

C.

Plaintiffs' theories essentially fall into two categories: products liability and misrepresentation. Counsel admitted that the state law claims all fell under misrepresentation, and YOLO splits them between products liability and misrepresentation.

The products liability claims allege that YOLO's app is inherently dangerous because of its anonymous nature and that it was negligent for YOLO to ignore the history of teen suicides stemming from cyberbullying on anonymous apps. Plaintiffs based their products liability claim solely on the anonymity of YOLO's app at the district court and through initial briefing at this court.

In their reply brief, Plaintiffs advance a new theory that several of YOLO's features taken together created liability. YOLO moved to strike this argument because it was raised for the first time in the reply brief. We agree and will grant the motion. Our grant of this motion, however, does not affect any possible motions in the district court to amend the complaint on remand.

Plaintiffs' misrepresentation claims are based on their allegation that YOLO alerted all new users that bullying and harassing behavior would result in the offending user being banned and unmasked, but YOLO never followed through on this threat despite A.K.'s requests and Kristin Bride's emails.

The district court granted YOLO's motion to dismiss, finding that the entire complaint sought to hold YOLO responsible for the content of messages posted on its app by users and not for any separate duty or obligation to the Plaintiffs. The court relied heavily on Dyroff v. Ultimate Software Group, Inc., 934 F.3d 1093 (9th Cir. 2019), which involved a lawsuit against a completely anonymous website through which the plaintiff's deceased son purchased fentanyl-laced drugs. The district court found this matter essentially on all fours with Dyroff and dismissed the suit.

II.

We review de novo the district court's decision to grant YOLO's motion to dismiss under Federal Rule of Civil Procedure 12(b)(6). Puri v. Khalsa, 844 F.3d 1152, 1157 (9th Cir. 2017). Questions of statutory interpretation are reviewed de novo as well. Collins v. Gee W. Seattle LLC, 631 F.3d 1001, 1004 (9th Cir. 2011). And we take all factual allegations in the complaint as true and "construe the pleadings in the light most favorable to the nonmoving party." Rowe v. Educ. Credit Mgmt. Corp., 559 F.3d 1028, 1029-30 (9th Cir. 2009) (quoting Knievel v. ESPN, 393 F.3d 1068, 1072 (9th Cir. 2005)).

A.

The Internet was still in its infancy when Congress passed the Communications Decency Act ("CDA") in 1996. 47 U.S.C. § 230; Batzel v. Smith, 333 F.3d 1018, 1026-27 (9th Cir. 2003). Even at its young age, legislators recognized its tremendous latent potential. Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1099 (9th Cir. 2009). However, because of the unprecedented reach and speed of the new forum, that potential would be significantly limited if courts imposed traditional publisher liability on internet platforms. See Doe v. Internet Brands, Inc., 824 F.3d 846, 851-52 (9th Cir. 2016). Traditional publisher liability held that if a publisher took upon itself the task of moderating or editing the content that appeared within its pages, it became responsible for anything tortious written there. Id. at 852. A New York state court perfectly illustrated this danger when it found that an online message board became a publisher responsible for the offensive content of any messages "because it deleted some offensive posts but not others." Id. In light of the sheer volume of internet traffic, this presented providers with a "grim choice": voluntarily filter some content and risk overlooking problems and thereby incurring tort liability, or take a hands-off approach and let the trolls run wild. Id.

To address this problem, Congress enacted § 230 of the CDA. This section allows services "to perform some editing on user-generated content without thereby becoming liable for all defamatory or otherwise unlawful messages they didn't edit or delete." Fair Hous. Council of San Fernando Valley v. Roommates.Com, LLC, 521 F.3d 1157, 1163 (9th Cir. 2008) (en banc) [hereinafter Roommates]. Congress included a policy statement within § 230 concluding that "[i]t is the policy of the United States . . . to promote the continued development of the Internet and other interactive computer services and other interactive media." 47 U.S.C. § 230(b)(1). To that end, the law sought to encourage the development and use of technologies that would allow users to filter and control the content seen by themselves or their children. Id. § 230(b)(3)-(4).

The operative section of the law, § 230(c), titled "Protection for 'Good Samaritan' blocking and screening of offensive material," is divided into two working parts. Id. § 230(c). The first broadly states that no service provider "shall be treated as the publisher or speaker of any information provided by another information content provider," or, more colloquially, by a third-party user of the service. Id. § 230(c)(1). The second part protects actions taken by a service provider to moderate and restrict material it "considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable." Id. § 230(c)(2). Section 230 expressly preempts any state laws with which it may conflict. Id. § 230(e)(3).

In short, § 230 protects apps and websites which receive content posted by third-party users (i.e., Facebook, Instagram, Snapchat, LinkedIn, etc.) from liability for any of the content posted on their services, even if they take it upon themselves to establish a moderation or filtering system, however imperfect it proves to be. This immunity persists unless the service is itself "'responsible, in whole or in part, for the creation or development of' the offending content." Roommates, 521 F.3d at 1162 (quoting 47 U.S.C. § 230(f)(3)).

This robust immunity applies to "(1) a provider or user of an interactive computer service (2) whom a plaintiff seeks to treat, under a state law cause of action, as a publisher or speaker (3) of information provided by another information content provider." Barnes, 570 F.3d at 1100-01 (footnote omitted). The parties agree that YOLO is an interactive computer service under § 230, and therefore satisfies the first prong. See 47 U.S.C. § 230(f)(2). YOLO is clearly the developer of the YOLO app, which allows users to communicate anonymously, send polls and questions, and send and receive anonymous responses.

The second Barnes prong considers whether the cause of action alleged in the complaint seeks to plead around the CDA's strictures and treat the defendant as a "publisher or speaker" of third-party content. See 47 U.S.C. § 230(c)(1). "[W]hat matters is not the name of the cause of action . . . [but] whether the cause of action inherently requires the court to treat the defendant as the 'publisher or speaker' of content provided by another." Barnes, 570 F.3d at 1101-02 (listing successful cases against services that failed to qualify for § 230 immunity). The act of "publication involves reviewing, editing, and deciding whether to publish or to withdraw from publication third-party content." Id. at 1102.

It is imperative to consider that "neither [subsection 230(c)] nor any other declares a general immunity from liability deriving from third-party content." Id. at 1100. Indeed, that could not be true; for most applications of § 230 in our internet age involve social media companies, which nearly all provide some form of platform for users to communicate with each other. In cases such as these, "[p]ublishing activity is a but-for cause of just about everything [defendants are] involved in. [They are] internet publishing business[es]." Internet Brands, 824 F.3d at 853; see also Calise v. Meta Platforms, Inc., 103 F.4th 732, 742 (9th Cir. 2024) ("Putting these cases together, it is not enough that a claim, including its underlying facts, stems from third-party content for § 230 immunity to apply."). The proper analysis requires a close examination of the duty underlying each cause of action to decide if it "derives from the defendant's status or conduct as a publisher or speaker." Barnes, 570 F.3d at 1107. Therefore, services can still be liable under traditional tort theories if those theories do not require the services to exercise some kind of publication or editorial function. Id. at 1102.

B.

In short, we must engage in a "careful exegesis of the statutory language" to determine if these claims attempt to treat YOLO as the "publisher or speaker" of the allegedly tortious messages. Id. at 1100. This exacting analysis helps us avoid "exceed[ing] the scope of the immunity provided by Congress." Internet Brands, 824 F.3d at 853 (quoting Roommates, 521 F.3d at 1164 n.15). After all, § 230 immunity is extraordinarily powerful, granting complete immunity where it applies and, in the process, preempting even the will of the people as expressed in their state legislatures. See 47 U.S.C. § 230(e)(3) (preempting state law). Our analysis, therefore, "ask[s] whether the duty that the plaintiff alleges the defendant violated derives from the defendant's status or conduct as a 'publisher or speaker.' If it does, section 230(c)(1) precludes liability." Barnes, 570 F.3d at 1102. But if it does not, then the suit may proceed as against the claim of immunity based on § 230(c)(1).

In light of this, we have explicitly disclaimed the use of a "but-for" test because it would vastly expand § 230 immunity beyond Congress' original intent. See Internet Brands, 824 F.3d at 853.

Our opinion in Calise v. Meta Platforms, published earlier this year, clarified the required duty analysis that originated in Barnes v. Yahoo, Lemmon v. Snap, Inc., and HomeAway.com, Inc. v. City of Santa Monica. Calise, 103 F.4th at 742 ("Our cases instead require us to look to the legal 'duty.' 'Duty' is 'that which one is bound to do, and for which somebody else has a corresponding right.'" (quoting Duty, Black's Law Dictionary (11th ed. 2019))). We now conduct a two-step analysis. Id. First, we examine the "right from which the duty springs." Id. (quotations omitted). Does it stem from the platform's status as a publisher (in which case it is barred by § 230)? Or does it spring from some other obligation, such as a promise or contract (which, under Barnes, is distinct from publication and not barred by § 230)? Second, we ask what "this duty requir[es] the defendant to do." Id. If it requires that YOLO moderate content to fulfill its duty, then § 230 immunity attaches. See id.; HomeAway.com, Inc. v. City of Santa Monica, 918 F.3d 676, 682 (9th Cir. 2019).

We emphasize, however, that this does not mean immunity attaches anytime YOLO could respond to a legal duty by removing content. See HomeAway.com, Inc. v. City of Santa Monica, 918 F.3d 676, 682 (9th Cir. 2019). Instead, we look at what the purported legal duty requires: "specifically, whether the duty would necessarily require an internet company to monitor third-party content." Id. For immunity to attach at this second step, moderation must be more than one option in YOLO's menu of possible responses; it must be the only option.

Barnes perfectly illustrates the duty distinction reemphasized in Calise. In that case, Barnes's estranged boyfriend posted nude images of her on a fake profile on Yahoo's website, and she reached out to Yahoo to get them removed. Barnes, 570 F.3d at 1098-99. Yahoo's Director of Communications promised Barnes over the phone that she would personally facilitate the removal of the offending fake profile. Id. at 1099. Nothing happened and Barnes sued, alleging negligent undertaking and promissory estoppel. Id. Skeptical of Barnes's negligent undertaking claim, we held that it was simply a defamation claim recast as negligence and asked,

[W]hat is the undertaking that Barnes alleges Yahoo failed to perform with due care? The removal of the indecent profiles that her former boyfriend posted on Yahoo's website.
But removing content is something publishers do, and to impose liability on the basis of such conduct necessarily involves treating the liable party as a publisher of the content it failed to remove.
Id. at 1103; see 47 U.S.C. § 230(c)(1). We determined that Barnes's negligent undertaking claim faulted Yahoo for failure to remove content, and "such conduct is publishing conduct . . . that can be boiled down to" editorial behavior. Id. at 1103 (emphasis and quotations omitted) (quoting Roommates, 521 F.3d at 1170-71). Such claims are explicitly foreclosed by § 230(c)(1).

Barnes's promissory estoppel claim, however, fared better. Because this claim "is a subset of a theory of recovery based on a breach of contract," it was not ultimately grounded in Yahoo's failure to remove content, but in their failure to honor a "private bargain[]." Id. at 1106 (quotations omitted). While yes, that was a promise to moderate content, the underlying obligation upon which Barnes relied was not an obligation to remove a profile, but the promise itself. Id. at 1107-09. As we noted, Barnes did "not seek to hold Yahoo liable as a publisher or speaker of third-party content, but rather as the counter-party to a contract, as a promisor who [had] breached." Id. at 1107. Section 230 only "precludes liability when the duty the plaintiff alleges the defendant violated derives from the defendant's status or conduct as a publisher or speaker." Id. We justified the distinction because of where the individual claims derive liability: negligent undertaking is grounded in "behavior that is identical to publishing or speaking," whereas "[p]romising is different because it is not synonymous with the performance of the action promised." Id. "[W]hereas one cannot undertake to do something without simultaneously doing it, one can, and often does, promise to do something without actually doing it at the same time." Id. Therefore, contractual liability stood where negligence fell.

The question of whether § 230 immunity applies is not simply a matter of examining the record to see if "a claim, including its underlying facts, stems from third-party content." Calise, 103 F.4th at 742. Nor is there a bright-line rule allowing contract claims and prohibiting tort claims that do not require moderating content, for that would be inconsistent with those cases where we have allowed tort claims to proceed, see Internet Brands, 824 F.3d 846 (negligent failure to warn claim survived § 230 immunity); Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021) (authorizing a products liability claim based in negligent design), and contradict our prior position that the name of a cause of action is irrelevant to immunity, Barnes, 570 F.3d at 1102 ("[W]hat matters is not the name of the cause of action . . . what matters is whether the cause of action inherently requires the court to treat the defendant as the 'publisher or speaker' of content provided by another."). Instead, we must engage in a careful inquiry into the fundamental duty invoked by the plaintiff and determine if it "derives from the defendant's status or conduct as a 'publisher or speaker.'" Id.

C.

We now conduct that inquiry here. The parties divide the claims into two categories, misrepresentation and products liability, and we will continue that distinction in our analysis.

1.

Turning first to Plaintiffs' misrepresentation claims, we find that Barnes controls. YOLO's representation to its users that it would unmask and ban abusive users is sufficiently analogous to Yahoo's promise to remove an offensive profile. Plaintiffs seek to hold YOLO accountable for a promise or representation, and not for failure to take certain moderation actions. Specifically, Plaintiffs allege that YOLO represented to anyone who downloaded its app that it would not tolerate "objectionable content or abusive users" and would reveal the identities of anyone violating these terms. They further allege that all Plaintiffs relied on this statement when they elected to use YOLO's app, but that YOLO never took any action, even when directly requested to by A.K. In fact, considering YOLO's staff size compared to its user body, it is doubtful that YOLO ever intended to act on its own representation.

While it is certainly an open question whether YOLO has any defenses to enforcement of its promise, at this stage we cannot say that § 230 categorically prohibits Plaintiffs from making the argument. YOLO may argue that it did not intend to induce reliance on the promise by the Plaintiffs, or that the statements were not promises made to Plaintiffs but instead warnings to others. But we treat "the outwardly manifested intention to create an expectation on the part of another as a legally significant event. That event generates a legal duty distinct from the conduct at hand," a duty which we will enforce. Barnes, 570 F.3d at 1107.

The district court oversimplified the proper analysis for § 230 immunity and essentially dismissed the claims because malicious third-party postings were involved or would have to be edited by YOLO. In its own words, "Plaintiffs' claims that [YOLO] . . . misrepresented their applications' safety would not be cognizable" without the harmful behavior of third-party users, and therefore immunity applies. The proper analysis is to examine closely the duty underlying each cause of action and decide if it "derives from the defendant's status or conduct as a publisher or speaker." Id. If it does, then § 230(c)(1) immunizes the defendant from liability on that claim.

In summary, Barnes is on all fours with Plaintiffs' misrepresentation claims here. YOLO repeatedly informed users that it would unmask and ban users who violated the terms of service. Yet it never did so, and may have never intended to. Plaintiffs seek to enforce that promise, made multiple times to them and upon which they relied, to unmask their tormentors. While yes, online content is involved in these facts, and content moderation is one possible solution for YOLO to fulfill its promise, the underlying duty being invoked by the Plaintiffs, according to Calise, is the promise itself. See Barnes, 570 F.3d at 1106-09. Therefore, the misrepresentation claims survive.

2.

Next, we address the product liability claims. In general, these claims assert that YOLO's app is inherently dangerous because of its anonymous nature, and that previous high-profile suicides and the history of cyberbullying should have put YOLO on notice that its product was unduly dangerous to teenagers. We hold that § 230 precludes liability on these claims.

Plaintiffs first allege product liability claims for design defect and negligence. The defective design claim alleges that YOLO "developed, designed, manufactured, marketed, sold, and distributed to at least hundreds of thousands of minors" a product that was unreasonably dangerous because of its anonymity. They claim that the bare fact of YOLO's anonymity made it uniquely dangerous to minors and that YOLO should have known this because prior anonymous applications had a deleterious effect on minor users. The negligence claim is similar, claiming that YOLO failed to "protect users from an unreasonable risk of harm arising out of the use of their app[]." Failure to mitigate this "foreseeable risk of harm," Plaintiffs claim, makes YOLO liable.

Plaintiffs also allege products liability claims under a failure to warn theory. The alleged risks are the same as those for defective design and negligence, but the claims are centered more on YOLO's alleged failure to disclose these risks to users when they downloaded the YOLO app. Plaintiffs therefore ask for compensatory damages, pecuniary loss, and loss of society, companionship, and services to Carson Bride's parents, and punitive damages "based on [YOLO's] willful and wanton failure to warn of the known dangers" of its product.

At root, all Plaintiffs' product liability theories attempt to hold YOLO responsible for users' speech or YOLO's decision to publish it. For example, the negligent design claim faults YOLO for creating an app with an "unreasonable risk of harm." What is that harm but the harassing and bullying posts of others? Similarly, the failure to warn claim faults YOLO for not mitigating, in some way, the harmful effects of the harassing and bullying content. This is essentially faulting YOLO for not moderating content in some way, whether through deletion, change, or suppression.

Our decision in Lemmon v. Snap, Inc. does not help Plaintiffs. In that case, parents of two teens killed while speeding sued the company that owns Snapchat. Lemmon, 995 F.3d at 1087. They alleged that the boys had been speeding because of a feature on the Snapchat app that allowed users to overlay their current speed onto photos and videos. Id. at 1088-89. It was widely believed that Snapchat would reward users with in-app rewards of some kind if they attained a speed over 100 mph. Id. at 1089. The boys operated the filter moments before their deaths. Id. at 1088. The parents brought negligent design claims alleging that Snapchat, despite numerous news articles, an online petition about the inherent problems with the filter, "at least three accidents," and "at least one other lawsuit," continued to offer a feature that "incentiviz[ed] young drivers to drive at dangerous speeds." Id. at 1089. The district court dismissed the complaint on § 230 grounds. Id. at 1090. On appeal, we held that the negligent design claims were not an attempt "to treat a defendant as a 'publisher or speaker' of third-party content." Id. at 1091. Instead, the parents sought to hold Snap liable for creating (1) Snapchat, (2) the speed filter, and (3) an incentive structure that enticed users to drive at unsafe speeds. Id. In clarifying that the parents' product liability claim was not "a creative attempt to plead around the CDA," we explained that the claim did "not depend on what messages, if any, a Snapchat user employing the Speed Filter actually sends." Id. at 1094. As a result, the claim did not depend on third-party content. Id.

Here, Plaintiffs allege that anonymity itself creates an unreasonable risk of harm. But we refuse to endorse a theory that would classify anonymity as a per se inherently unreasonable risk to sustain a theory of product liability. First, unlike in Lemmon, where the dangerous activity the alleged defective design incentivized was the dangerous behavior of speeding, here, the activity encouraged is the sharing of messages between users. See id. Second, anonymity is not only a cornerstone of much internet speech, but it is also easily achieved. After all, verification of a user's information through government-issued ID is rare on the internet. Thus we cannot say that this feature was uniquely or unreasonably dangerous.

Similarly, Internet Brands provides no cover for Plaintiffs' failure to warn theory. In that case, we upheld liability against a professional networking site for models under a failure to warn theory. Internet Brands, 824 F.3d at 848. Plaintiff created a profile on the website Model Mayhem, owned by Internet Brands, advertising her services as a model. Id. Meanwhile, the site's owners were aware that a pair of men had been using the site to set up fake auditions, lure women to "auditions" in Florida, and then rape them. Id. at 848-49. Yet the owners did not warn plaintiff, and she fell victim to the scheme. Id. at 848. We reasoned that plaintiff sought to hold defendant liable under a traditional tort theory, the duty to warn, which had no bearing on Model Mayhem's decision to publish any information on its site. Id. at 851. After all, plaintiff had posted her own profile on the website, and did not allege that the rapists had posted anything on the website. Id. Therefore, § 230 was no protection.

In short, the defendant in Internet Brands failed to warn of a known conspiracy operating independent of the site's publishing function. Id. But here, there was no conspiracy to harm that could be defined with any specificity. It was merely a general possibility of harm resulting from use of the YOLO app, and which largely exists anywhere on the internet. We cannot hold YOLO responsible for the unfortunate realities of human nature.

Finally, we clarify the extent to which Dyroff v. Ultimate Software Group is applicable, but not dispositive, here. In that case, a grieving mother sued an anonymous website that allowed users to post whatever they wanted, anonymously, and receive anonymous replies. Dyroff, 934 F.3d at 1094-95. Her son purchased drugs using the site and died because the drugs he purchased were laced with fentanyl. Id. at 1095. As we explained, "[s]ome of the site's functions, including user anonymity and grouping, facilitated illegal drug sales." Id. at 1095. The mother sued, alleging that the site had allowed users to engage in illegal activity, that the website's recommendation algorithm had promoted and enabled these communications, and that defendant failed to moderate the website's content to eliminate these problems. Id. We concluded that § 230(c) granted defendant immunity from these claims. Id. at 1096. First, we noted that § 230 "provides that website operators are immune from liability for third-party information . . . unless the website operator 'is responsible, in whole or in part, for the creation or development of [the] information.'" Id. (brackets in original) (quoting 47 U.S.C. § 230(c)(1), (f)(3)). We then looked at whether the claims "inherently require[] the court to treat the defendant as the 'publisher or speaker' of content provided by another." Id. at 1098 (brackets in original) (quoting Barnes, 570 F.3d at 1102). Because the automated processes contained in the site's algorithm were not themselves content but merely "tools meant to facilitate the communication and content of others," we found the second Barnes prong satisfied. Id. Finally, the third Barnes prong was satisfied because the content was clearly developed by others, not the defendant. Id. at 1098. Unlike in Roommates, where the defendant played a role in developing the illegal content by requiring users to answer particular questions, the defendant in Dyroff merely provided a "blank text box" that users could utilize however they wanted. Id. at 1099.

In our view, Plaintiffs' product liability theories similarly attempt to hold YOLO liable as a publisher of third-party content, based in part on the design feature of anonymity. To be sure, our opinion in Dyroff did not rely on anonymity for its § 230 analysis. See id. at 1096-99. But our analysis of Plaintiffs' product liability claims is otherwise consistent with Dyroff's reasoning: here, the communications between users were direct, rather than suggested by an algorithm, and YOLO similarly provided users with a blank text box. These facts fall within Dyroff's ambit. As we have recognized, "No website could function if a duty of care was created when a website facilitates communication, in a content-neutral fashion, of its users' content." Id. at 1101. Though the claims asserted in Dyroff were different than the claims asserted here, our conclusion is consistent with Dyroff's reasoning.

In summary, Plaintiffs' product liability claims attempt to hold YOLO responsible as the speaker or publisher of harassing and bullying speech. Those product liability claims that fault YOLO for not moderating content are foreclosed, see supra at 18; otherwise, nothing about YOLO's app was so inherently dangerous that we can justify these claims, and unlike Lemmon, YOLO did not turn a blind eye to the popular belief that there existed in-app features that could only be accessible through bad behavior. Lemmon, 995 F.3d at 1089-90 (describing how users thought that exceeding 100 mph while using the Snapchat app would produce a reward). And to the degree that the online environment encouraged and enabled such behavior, that is not unique to YOLO. It is a problem which besets the entire internet. Thus, § 230 immunizes YOLO from liability on these claims.

D.

In holding that the Plaintiffs' misrepresentation claims may proceed, we adhere to long-established circuit precedent. We must strike a delicate balance by giving effect to the intent of Congress as expressed in the statute while not expanding the statute beyond the legislature's expressed intent in the face of quickly advancing technology. Today's decision does not expand liability for internet companies or make all violations of their own terms of service into actionable claims. To the degree that such liability exists, it already existed under Barnes and Calise, and nothing we do here extends that legal exposure to new arenas. Section 230 prohibits holding companies responsible for moderating or failing to moderate content. It does not immunize them from breaking their promises. Even if those promises regard content moderation, the promise itself is actionable separate from the moderation action, and that has been true at least since Barnes. In our caution to ensure § 230 is given its fullest effect, we must resist the corollary urge to extend immunity beyond the parameters established by Congress and thereby create a free-wheeling immunity for tech companies that is not enjoyed by other players in the economy.

III.

We therefore REVERSE the district court's grant of YOLO's motion to dismiss the misrepresentation claims but AFFIRM in all other respects. YOLO's motion to strike is GRANTED.

[*]The Honorable Eugene E. Siler, United States Circuit Judge for the U.S. Court of Appeals for the Sixth Circuit, sitting by designation.

[**] This summary constitutes no part of the opinion of the court. It has been prepared by court staff for the convenience of the reader.

