Opinion
Case No.: 22cv619-LAB-MDD
2023-06-05
Byron Everard, Ma, John K. Buche, Buche & Associates, P.C., La Jolla, CA, Juyoun Han, Pro Hac Vice, Eisenberg & Baum LLP, New York, NY, for Plaintiffs. Jessica Lynn Grant, Michael Burshteyn, Ernesto Rojas Guzman, Morrison and Foerster LLP, San Francisco, CA, Joseph Alexander Lawrence, Tamara Raquel Wiesebron, Morrison & Foerster LLP, New York, NY, for Defendant Snap Inc. David R. Singer, Kate Spelman, Jenner & Block LLP, Los Angeles, CA, for Defendant Apple Inc. Benjamin D. Margo, Pro Hac Vice, Brian M. Willen, Pro Hac Vice, Vivek Vijay Tata, Pro Hac Vice, Wilson Sonsini Goodrich & Rosati, New York, NY, Natalie Jordana Morgan, Wilson Sonsini Goodrich and Rosati, San Diego, CA, for Defendant Google LLC.
ORDER:
(1) GRANTING MOTIONS TO DISMISS FIRST AMENDED COMPLAINT [Dkt. 51, 53, 54];
(2) DENYING AS MOOT MOTION TO STRIKE ALLEGATIONS [Dkt. 52];
(3) GRANTING MOTION TO SEAL [Dkt. 64]; AND
(4) DENYING MOTION FOR RULE 11 SANCTIONS [Dkt. 67]
Larry Alan Burns, United States District Judge
Minor Plaintiffs L.W., C.A., and C.O. (collectively, "Plaintiffs") commenced this suit against Defendants Snap Inc. ("Snap"), Apple Inc. ("Apple"), and Google LLC ("Google") (collectively, "Defendants") for claims stemming from allegations that Snap's Snapchat application, available for download through the Apple Store and Google Play, is an inherently dangerous software product that Defendants deceptively advertise and promote in a way that facilitates sex crimes against children. Plaintiffs' First Amended Complaint ("FAC") asserts ten causes of action against Defendants, including for product liability, fraudulent and negligent misrepresentation, and violation of various state consumer protection laws.
Defendants separately filed motions to dismiss the FAC, (Dkt. 51, 53, 54), and Defendant Apple also filed a Motion to Strike Class Allegations, (Dkt. 52). On November 15, 2022, the Court held a hearing on the respective motions to dismiss. Defendant Snap filed a Motion for Rule 11 Sanctions against Plaintiffs, citing certain offending allegations made in the FAC, (Dkt. 67), and requesting further that the Motion for Sanctions be sealed, (Dkt. 64). The Court determined the latter motions would be decided on the papers. (Dkt. 77).
The Court has read all materials submitted in support of and in opposition to the respective motions, and rules as follows.
I. BACKGROUND
Snapchat is "a widely popular photo sharing application" that allows users to exchange photos and messages and engage in video chats with one another. (FAC ¶ 83). A distinguishing feature of Snapchat is its ephemeral nature—meaning communications automatically disappear after being opened. (Id. ¶ 86). Users can connect with one another by either searching for another user and requesting to add them as a "friend," or by using the "Quick Add" function, which "suggests that a user add another user as a friend . . . based on who you're already friends with, who you subscribe to, and other factors." (Id. ¶ 76). Snapchat can be downloaded on users' mobile phones through the Apple Store or Google Play, which are "digital distribution platform[s] where individuals can buy and download digital software and applications," like Snapchat. (Id. ¶¶ 135, 151).
When Plaintiff L.W. was 12 years old, "on or about" September 5, 2018, adult Perpetrator B.P. first approached her on Instagram. (Id. ¶¶ 2-3). B.P. asked L.W. to connect with him on Snapchat, and began conversing with her regularly. (Id. ¶¶ 3-4). On September 11, 2018, B.P. "demanded" a nude photograph from L.W. and sent her a picture of his erect penis. (Id. ¶ 6). Over the next two-and-a-half years, and through April 15, 2021, B.P. sexually groomed L.W. by "manipulat[ing] and coerc[ing] her" to send him pornographic images and videos of herself over Snapchat. In turn, he sent her hundreds of pornographic photos and videos of himself. (Id. ¶¶ 7, 11). Although L.W. tried to block B.P. on numerous occasions, he was able to resume contact through Instagram or a fake account and ask L.W. to reconnect with him on Snapchat until she yielded to his request. (Id. ¶ 13). Further, B.P. used the app Chitter to distribute the Child Sexual Abuse Material ("CSAM") of L.W. to others. (Id. ¶¶ 21-23). "B.P. admitted that he solely used Snapchat with L.W.—and not any other social media platform—to gain new CSAM and transmit his pornographic images and videos to her because he 'kn[e]w the chats [would] go away' on Snapchat." (Id. ¶ 14).
Plaintiff C.A. was also 12 years old when, in 2021, an adult "perpetrator" connected with her on Twitter. (Id. ¶¶ 43-44, 47). "The perpetrator [then] . . . connected with her on [S]napchat." (Id. ¶ 45). The perpetrator had been "charged with serious sexual crimes against minor victims," but "Snapchat enabled him to make a new account without any issues," and "[u]pon information and belief . . . knew that this sex offender was using its platform but failed to stop him." (Id. ¶¶ 49-50). The perpetrator proceeded to request CSAM from C.A. and sent her CSAM depicting other minors, and through manipulation and coercion, obtained 20 to 30 sexually explicit pictures and 10 to 15 sexually explicit videos from her. (Id. ¶¶ 51-54). In March 2021, the perpetrator traveled to C.A.'s state and pressured her to engage in sexual acts with him, which he filmed and later distributed online. (Id. ¶¶ 54-55). "In April 2021, another perpetrator connected with C.A. on Kik, a social media app," and asked to move their conversations to Snapchat. (Id. ¶¶ 59-60). "On Snapchat, the second perpetrator sent C.A. explicit photos and videos" of himself for the next several weeks. (Id. ¶¶ 61-62).
Plaintiff C.O. was 11 years old when, in 2018, a "perpetrator" connected with her on Omegle, "an online video chat room," and asked for her username. He later connected with her on Snapchat. (Id. ¶¶ 69-70). On Snapchat, the perpetrator pretended to be a minor girl, requested CSAM from C.O., and sent her CSAM of other minors. C.O. eventually sent him nude photographs of herself. (Id. ¶¶ 72-73). "Since 2018 four or five additional perpetrators sought to connect with C.O.[ ] on Snapchat"—some through Quick Add—"and similarly coerced her to send nude photos and CSAM," as well as sent her sexually explicit photos and videos of themselves. (Id. ¶¶ 74-75, 77). "Upon information and belief," these perpetrators "downloaded Snapchat using Apple's and Google's App Stores." (Id. ¶¶ 38, 67, 79).
Plaintiffs allege ten claims against all three Defendants: (1) strict liability product design and defect; (2) strict product liability (failure to warn); (3) negligence and negligence per se product design and defect; (4) fraudulent misrepresentation and negligent misrepresentation; (5) unjust enrichment; (6) violation of California's Unfair Competition Law and False Advertising Law; (7) violation of the Colorado Consumer Protection Act; (8) violation of the Kentucky Consumer Protection Act; (9) injunctive relief; and (10) violation of the Trafficking Victims Protection Reauthorization Act ("TVPRA"), 18 U.S.C. §§ 1591, 1595.
II. MOTION TO DISMISS
A. Legal Standard
A motion brought under Federal Rule of Civil Procedure 12(b)(6) tests the sufficiency of a complaint. Navarro v. Block, 250 F.3d 729, 732 (9th Cir. 2001). "To survive a motion to dismiss, a complaint must contain sufficient factual matter, accepted as true, to 'state a claim to relief that is plausible on its face.' " Ashcroft v. Iqbal, 556 U.S. 662, 678, 129 S.Ct. 1937, 173 L.Ed.2d 868 (2009) (quoting Bell Atl. Corp. v. Twombly, 550 U.S. 544, 570, 127 S.Ct. 1955, 167 L.Ed.2d 929 (2007)). A claim is facially plausible when the factual allegations permit "the court to draw the reasonable inference that the defendant is liable for the misconduct alleged." Id. While a plaintiff need not give "detailed factual allegations," a plaintiff must plead sufficient facts that, if true, "raise a right to relief above the speculative level." Twombly, 550 U.S. at 555, 127 S.Ct. 1955. "The plausibility standard is not akin to a 'probability requirement,' but it asks for more than a sheer possibility that a defendant has acted unlawfully." Iqbal, 556 U.S. at 678, 129 S.Ct. 1937 (quoting Twombly, 550 U.S. at 556, 127 S.Ct. 1955). The Court need not accept legal conclusions couched as factual allegations. See Twombly, 550 U.S. at 555, 127 S.Ct. 1955.
B. Section 230 of the Communications Decency Act
Defendants first argue that they are immune from suit under Section 230(c)(1) of the Communications Decency Act of 1996 ("CDA"), 47 U.S.C. § 230(c)(1), which "protects certain internet-based actors from certain kinds of lawsuits." Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1099 (9th Cir. 2009). The statute provides, in relevant part, that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." 47 U.S.C. § 230(c)(1). The statute also provides that "[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section." 47 U.S.C. § 230(e)(3). "The majority of federal circuits have interpreted the CDA to establish broad federal immunity to any cause of action that would make service providers liable for information originating with a third-party user of the service." Perfect 10, Inc. v. CCBill LLC, 488 F.3d 1102, 1118 (9th Cir. 2007) (citations and internal quotation marks omitted).
CDA immunity under Section 230(c)(1) "applies only if the interactive computer service provider is not also an 'information content provider,' which is defined as someone who is 'responsible, in whole or in part, for the creation or development of' the offending content." Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157, 1162 (9th Cir. 2008) (en banc) (quoting 47 U.S.C. § 230(f)(3)). The "prototypical service qualifying for [CDA] immunity is an online messaging board (or bulletin board) on which Internet subscribers post comments and respond to comments posted by others." Kimzey v. Yelp! Inc., 836 F.3d 1263, 1266 (9th Cir. 2016) (quoting FTC v. Accusearch Inc., 570 F.3d 1187, 1195 (10th Cir. 2009)). Under the Ninth Circuit's three-prong test, immunity from liability exists for "(1) a provider or user of an interactive computer service (2) whom a plaintiff seeks to treat, under a state law cause of action, as a publisher or speaker (3) of information provided by another information content provider." Barnes, 570 F.3d at 1101. "When a plaintiff cannot allege enough facts to overcome Section 230 immunity, a plaintiff's claims should be dismissed." Dyroff v. Ultimate Software Grp., Inc., 934 F.3d 1093, 1097 (9th Cir. 2019) (citation omitted).
1. Interactive Computer Services
Plaintiffs don't contest Defendants' status as providers of interactive computer services within the meaning of Section 230. (Dkt. 63 at 3). Under the statute, "[t]he term 'interactive computer service' means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions." 47 U.S.C. § 230(f)(2). Courts have noted that providers of interactive computer services include entities, like Defendants, that create, own, and operate applications that enable users to share messages over their internet-based servers. See, e.g., Coffee v. Google, LLC, No. 20-CV-03901-BLF, 2022 WL 94986, at *5 (N.D. Cal. Jan. 10, 2022) (acknowledging that Google, whose Play Store "contains more than 2.9 million third-party apps on its virtual shelves" for user purchase and download, qualifies as an interactive computer service provider); Lemmon v. Snap, Inc., 995 F.3d 1085, 1091 (9th Cir. 2021) (holding Snap, the creator, owner, and operator of an application that "permits its users to share photos and videos through [its] servers and the internet," qualifies as a "provider of an interactive computer service") (citations and internal quotation marks omitted). The Court agrees that Snap, Google, and Apple each qualify as interactive computer service providers.
2. Publisher or Speaker
The next consideration is "whether the claims 'inherently require[ ] the court to treat the defendant as the 'publisher or speaker' of content provided by another.' " Dyroff, 934 F.3d at 1098 (alteration in original) (quoting Barnes, 570 F.3d at 1102). Plaintiffs argue their claims do not treat Defendants as publishers or speakers, but rather "focus[ ] on the inherently dangerous design of Defendants' products that can be made safer without altering third-party content." (Dkt. 63 at 4). They maintain that by enabling the transmission of ephemeral content on the application, Defendants facilitate the exchange of CSAM, and that Snap's design of the application assists users in "evad[ing] supervision by legal guardians or law enforcement." (Id.). Defendants respond that, regardless of how Plaintiffs' claims are styled, Plaintiffs are seeking to hold them liable for content published by third parties.
A website acts as a publisher when it decides whether to post online material submitted by a third party. See Roommates.com, 521 F.3d at 1170. Publication "involves reviewing, editing, and deciding whether to publish or to withdraw from publication third-party content." Barnes, 570 F.3d at 1102 (citation omitted) ("[A] publisher reviews material submitted for publication, perhaps edits it for style or technical fluency, and then decides whether to publish it."). "[A]ny activity that can be boiled down to deciding whether to exclude material that third parties seek to post online is perforce immune under section 230." Id. at 1102. "[W]hat matters is not the name of the cause of action—defamation versus negligence versus intentional infliction of emotional distress—what matters is whether the cause of action inherently requires the court to treat the defendant as the 'publisher or speaker' of content provided by another." Id. at 1101-02.
Applying these definitions, the Court must treat Defendants as publishers or speakers, regardless of how Plaintiffs frame their claims, because Plaintiffs' theories of liability plainly turn on Defendants' alleged failure to monitor and remove third-party content. By definition, Snap's failure to remove CSAM distributed on Snapchat by third parties, and Apple's and Google's choice to allow Snapchat to remain available for download in their online stores, involve "reviewing . . . and deciding whether to publish or to withdraw from publication third-party content." Barnes, 570 F.3d at 1102-03 (holding the "removal of the indecent profiles that [plaintiff's] former boyfriend posted on Yahoo's website . . . is something publishers do, and to impose liability on the basis of such conduct necessarily involves treating the liable party as a publisher of the content it failed to remove"); Ginsberg v. Google Inc., 586 F. Supp. 3d 998, 1004-05 (N.D. Cal. 2022) (holding the alleged conduct—failure to remove an application from Google's Play Store—"boils down to deciding whether to exclude material . . . that a third party seeks to place in the online Play Store"). Whether they style their allegations as claims for product liability, fraud, or negligence, Plaintiffs can't sue Defendants "for third-party content simply by changing the name of the theory." Barnes, 570 F.3d at 1102; see also Bride v. Snap Inc., No. 221CV06680FWSMRW, 2023 WL 2016927, at *5 (C.D. Cal. Jan. 10, 2023) ("Ultimately, although Plaintiffs frame user anonymity as a defective design feature of Defendants' applications, Plaintiffs fundamentally seek to hold Defendants liable based on content published by anonymous third parties on their applications."); Ginsberg, 586 F. Supp. 3d at 1006 (finding the second prong satisfied where "the undertaking that Google allegedly failed to perform with due care was removing offending content from the Play Store").
Attempting to end-run these definitional barriers, Plaintiffs argue that Defendants aren't entitled to Section 230 immunity because they materially contributed to the wrongful behavior that caused harm to Plaintiffs. (Dkt. 63 at 7). They urge the Court to view this case as falling within the ambit of Lemmon, which rejected arguments that Section 230 immunity protected Snap against claims arising from use of its Snapchat app. Lemmon involved Snapchat's "Speed Filter," an "interactive system" that "encouraged [Snapchat's] users to pursue certain unknown achievements and rewards" and "entice[d] young Snapchat users to drive at speeds exceeding 100 MPH." 995 F.3d at 1091-92. The Lemmon plaintiffs alleged that these functions didn't involve Snap's "editing, monitoring, or removing of the content that its users generate through Snapchat." Id. at 1092. The Ninth Circuit agreed, concluding that because the plaintiffs' claims turned on Snap's own design feature, Snap wasn't being treated as a publisher and wasn't entitled to Section 230 immunity. Id. at 1093. The Ninth Circuit reached a similar conclusion in Roommates.com, holding that an online platform for matching roommates to one another was not entitled to Section 230 immunity because the platform was specifically designed to match potential roommates based on criteria such as sex, family status, and sexual orientation, thereby encouraging users to post content that violated fair housing laws. 521 F.3d at 1167-68. The court held that "[b]y requiring subscribers to provide the [discriminatory] information as a condition of accessing its service," including the requirement to choose between "a limited set of pre-populated answers," the website became "much more than a passive transmitter," and instead became "the developer, at least in part, of that information." Id. at 1166. According to the court, "Roommate's own acts—posting the questionnaire and requiring answers to it—are entirely its doing and thus section 230 of the CDA does not apply to them." Id. at 1165.
Plaintiffs make a similar argument here: that Snapchat's ephemeral design features, specifically the disappearing messages and the Quick Add function, combined with users' ability to create multiple accounts, are inherently dangerous, and that the resulting harm flows from Defendants' own design choices rather than from third-party content. These design features, in Plaintiffs' view, make Snapchat the platform of choice for those seeking to engage in "exploitation and predatory behavior" "without the fear of getting caught." (FAC ¶¶ 86-90). And despite Snap's policy against child exploitation and the distribution of CSAM, the platform's existing "CSAM detection technology like PhotoDNA and CSAI Match is . . . a poor fit to prevent sexual grooming on Snapchat." (Id. ¶¶ 102, 110).
After thoughtfully considering Plaintiffs' arguments, the Court disagrees. Unlike in Lemmon and Roommates.com, the harm Plaintiffs allege here doesn't flow from a design defect. Rather, the harm animating Plaintiffs' claims "is directly related to the posting of third-party content on [Snapchat]." Doe v. Twitter, Inc., 555 F. Supp. 3d 889, 930 (N.D. Cal. 2021), aff'd in part, rev'd in part and remanded sub nom. Doe #1 v. Twitter, Inc., No. 22-15103, 2023 WL 3220912 (9th Cir. May 3, 2023), abrogated by Does 1-6 v. Reddit, Inc., 51 F.4th 1137 (9th Cir. 2022). As other courts have observed, the alleged flaws in Snapchat's design, "in essence, seek to impose liability on [Snap] based on how well [Snap] has designed its platform to prevent the posting of third-party content containing child pornography and to remove that content after it is posted." Twitter, Inc., 555 F. Supp. 3d at 930. "[T]o meet the obligation Plaintiffs seek to impose on [Defendant] on this claim, [Defendant] would have to alter the content posted by its users, in contrast to the design defect alleged in Lemmon." Id. Here, similarly, Plaintiffs' arguments more closely implicate a publication function than a design or development function.
Plaintiffs' product liability claims against Google and Apple fare no better. As with Snap, Plaintiffs maintain that their claims against Google and Apple don't turn on the publication of dangerous applications, but rather "on the Defendants' facilitation of sale and recommendation of inherently dangerous app[lication]s (namely Snapchat and Chitter), for which they receive a commission from those app[lication]s and users." (Dkt. 63 at 3). Citing HomeAway.com, Inc. v. City of Santa Monica, 918 F.3d 676, 684-85 (9th Cir. 2019), Plaintiffs argue that Apple and Google can be held liable for their "facilitation of downloading the app[lications], recommendation of the app[lications], and pocketing a commission" when those applications are downloaded from their respective platforms. (Dkt. 63 at 3-4, 6). But as Defendants point out, the Ninth Circuit has consistently held that an online platform's use of neutral algorithms to recommend content to users does not forfeit the platform's entitlement to Section 230 immunity. See Gonzalez v. Google LLC, 2 F.4th 871, 894 (9th Cir. 2021), rev'd on other grounds by Twitter, Inc. v. Taamneh, 598 U.S. 471, 143 S.Ct. 1206, 1226-27, 215 L.Ed.2d 444 (2023) ("Though we accept as true the [complaint]'s allegation that Google's algorithms recommend ISIS content to users, the algorithms do not treat ISIS-created content differently than any other third-party created content, and thus are entitled to § 230 immunity."); Dyroff, 934 F.3d at 1098 (distinguishing HomeAway.com and holding that "[b]y recommending user groups and sending email notifications, [Defendant] . . . was acting as a publisher of others' content. These functions—recommendations and notifications—are tools meant to facilitate the communication and content of others. They are not content in and of themselves."). But see Dangaard v. Instagram, LLC, No. C 22-01101 WHA, 2022 WL 17342198, at *4 (N.D. Cal. Nov. 30, 2022) ("While providing neutral tools to carry out what may be unlawful or illicit [conduct] does not amount to development, Meta defendants are not alleged to have filtered pornographic content in a neutral manner . . . . [W]hen automated content-moderation tools are allegedly designed to facilitate unlawful conduct, the claims survive CDA defenses.") (emphasis in original) (internal quotation marks omitted). Because Plaintiffs' complaint doesn't allege that either Google or Apple did anything more than create neutral tools by which users could download and access Snapchat, the argument fails.
Plaintiffs' claims, premised on Defendants' publishing activity, satisfy the second prong of the Barnes analysis.
3. Third-Party Content
The last prong of the Barnes analysis requires the Court to determine whether Plaintiffs' allegations demonstrate that the published material was provided by a third-party content provider. An exception to Section 230 immunity applies only if the defendants are "responsible, in whole or in part, for the creation or development of the offending content on the internet." Lemmon, 995 F.3d at 1093 (cleaned up). In other words, the third Barnes prong focuses solely on who created the content at issue.
Here, because the offending CSAM was unquestionably created and distributed by third-party individuals, this prong is satisfied. The FAC makes no suggestion that the harmful content was created or developed by Snap, or that either Google or Apple was involved in the development of the Snapchat application, let alone the sexually explicit content distributed on the application. "Defendants did not create or develop the [sexually] explicit messages that led to the harm suffered by Plaintiffs; the sending users did." Bride, 2023 WL 2016927, at *6. And although Plaintiffs don't appear to dispute that Defendants meet the last prong of the Barnes test, the Court nevertheless finds that Plaintiffs' claims, whether based in product liability or false advertising, are predicated on content developed by third parties—not by Defendants.
4. Application to Plaintiffs' Claims
On behalf of a national class, as well as Kentucky and Colorado subclasses, Plaintiffs assert ten claims for product design and defect, failure to warn, misrepresentation, unjust enrichment, violation of state consumer protection laws, and violation of the TVPRA. Each of these claims is predicated on the theory that Defendants violated various state laws by failing to adequately monitor and regulate end-users' harmful messages, and each is therefore barred by Section 230. See, e.g., Kimzey, 836 F.3d at 1270 (affirming dismissal of state law consumer protection claims premised on the defendant's role as a publisher of third-party content); Beckman v. Match.com, LLC, 668 F. App'x 759, 759-60 (9th Cir. 2016) (affirming dismissal of negligence and misrepresentation claims where the "basis for each of those claims is [Defendant]'s role as a publisher of third-party information"); Barnes, 570 F.3d at 1102-03 (holding negligence claim under state law that "derive[d] from [the defendant's] role as a publisher" was subject to CDA immunity); Bride, 2023 WL 2016927, at *6 (dismissing claims for product liability, state consumer protection law violations, negligence, and unjust enrichment where the "Plaintiffs fundamentally seek to hold Defendants liable based on content published by anonymous third parties on their applications").
Plaintiffs argue that even if the Court finds that Section 230 applies to the present suit, the immunity the statute affords doesn't bar Plaintiffs' TVPRA claim. In 2018, Congress passed the Allow States and Victims to Fight Online Sex Trafficking Act ("FOSTA"), which amended Section 230 and ensured that "[n]othing in [Section 230] . . . shall be construed to impair or limit . . . any claim in a civil action brought under section 1595 of title 18, if the conduct underlying the claim constitutes a violation of section 1591 of that title." 47 U.S.C. § 230(e)(5)(A). Section 1595 of the TVPRA, in turn, provides a civil cause of action for violations of federal trafficking laws, while Section 1591 of the TVPRA creates a direct liability claim for "[w]hoever knowingly (1) . . . recruits, entices, harbors, transports, provides, obtains, advertises, maintains, patronizes, or solicits by any means a person," or "(2) benefits, financially or by receiving anything of value, from participation in a venture which has engaged in an act described in violation of paragraph (1)." 18 U.S.C. § 1591(a). The statute defines "participation in a venture" as "knowingly assisting, supporting, or facilitating" sex trafficking activities. Id. § 1591(e)(4). Both Sections 1591 and 1595 cover perpetrators and "beneficiaries" of trafficking. Does 1-6 v. Reddit, Inc., 51 F.4th 1137, 1141 (9th Cir. 2022). However, while Section 1595 incorporates a constructive knowledge standard (i.e., the "person knew or should have known" of the trafficking conduct), "the standard for beneficiary liability pursuant to section 1591 is higher: to be held criminally liable as a beneficiary, a defendant must have actual knowledge of the trafficking and must 'assist[ ], support[ ], or facilitat[e]' the trafficking venture." Id. (quoting § 1591(e)(4)) (alterations in original).
Plaintiffs dispute whether the "actual knowledge" standard of Section 1591 applies here. (Dkt. 63 at 14). They maintain that "the majority of courts in this district have refused to import Section 1591's actual knowledge requirement," and instead rely on Section 1595 for its application to online platforms. (Id.). Recent case law suggests the exact opposite. In Reddit, the Ninth Circuit analyzed the statutory language and context of Section 230(e)(5)(A) and concluded that FOSTA creates an immunity exception "only when a website violates 18 U.S.C. § 1591." 51 F.4th at 1143 (emphasis added). The court held that "FOSTA requires that a defendant-website violate the criminal statute by [1] directly sex trafficking or, [2] with actual knowledge, 'assisting, supporting, or facilitating' trafficking, for the immunity exception to apply." Id. at 1145. Since Reddit, the Ninth Circuit has consistently upheld this interpretation of Section 230(e)(5)(A). See Doe #1 v. Twitter, Inc., No. 22-15103, 2023 WL 3220912, at *2 (9th Cir. May 3, 2023) (affirming Reddit's holding that FOSTA's immunity exception applies only where a plaintiff "plausibly allege[s] that the website's own conduct violated section 1591") (citation omitted); J.B. v. Craigslist, Inc., No. 22-15290, 2023 WL 3220913, at *1 (9th Cir. May 3, 2023) (same).
Applying Section 1591's heightened standard here, Plaintiffs' TVPRA claim is plainly deficient. The FAC alleges that Apple and Google derived financial benefit by recommending Snapchat to users for download, and that user reviews on the Apple Store and Google Play generally discussing underage sexual misconduct occurring on Snapchat were sufficient to confer actual knowledge on Defendants. But Reddit is explicit that attenuated allegations like these are insufficient to plausibly suggest that Defendants knowingly participated in or benefited from a sex trafficking venture:
Mere association with sex traffickers is insufficient absent some knowing participation in the form of assistance, support, or facilitation. The statute does not target those that merely turn a blind eye to the source of their revenue. And knowingly benefitting from participation in such a venture requires actual knowledge and a causal relationship between affirmative conduct furthering the sex-trafficking venture and receipt of a benefit.

Reddit, 51 F.4th at 1145 (citations and internal quotation marks omitted).
Plaintiffs' allegation that Snap knew of the trafficking conduct because it regularly collects "troves of [user] data and information" is equally unpersuasive. (FAC ¶ 107). Even accepting this allegation as true, the Court can't conclude that Snap's knowledge of illicit activity is tantamount to participation in the activity. The most that can be gleaned from Plaintiffs' allegations is that Snap should have done a better job monitoring and policing content distributed by its users. But failing to efficiently monitor or police user-generated content is not punishable under Section 1591. See Reddit, 51 F.4th at 1145 ("Plaintiffs who have successfully alleged beneficiary liability for sex trafficking have charged defendants with far more active forms of participation than the plaintiffs allege here.").
This case is likewise different from Doe v. Internet Brands, Inc., 824 F.3d 846 (9th Cir. 2016), cited by Plaintiffs. In Internet Brands, the defendant model networking website was informed by an outside source that its website was being targeted by third parties who were using fake identities to lure models to supposed modeling auditions and then sexually assault them. Id. at 848. The perpetrators identified the potential targets of their assaults by browsing models' user profiles on the website. Id. Among those misusing the website were two rapists who sexually assaulted the plaintiff. Id. Plaintiffs argue that, as in Internet Brands, Defendants failed to warn users "about the crimes of sexual grooming on their respective platforms." (Dkt. 63 at 10). Their argument ignores the distinction drawn by the Internet Brands court between monitoring and policing website content and the duty imposed by California law to warn website users of a known harm or danger. Internet Brands pointed out that "[t]he duty to warn allegedly imposed by California law would not require [Defendant] to remove any user content or otherwise affect how it publishes or monitors such content." Id. at 851. To the contrary, "[a]ny alleged obligation to warn could have been satisfied without changes to the content posted by the website's users and without conducting a detailed investigation," such as by posting a warning to users on the website or informing the users by email. Id.
In contrast to Internet Brands, all of Plaintiffs' theories of liability here are intertwined with Defendants' publishing activities and related allegations that they failed to monitor and edit third-party content. Whereas the duty to warn requires only a "self-produced warning," Internet Brands, 824 F.3d at 851, Plaintiffs argue for an expansion of the duty to include editing and/or removal of user-generated content. Internet Brands rejected that proposal: lawsuits brought against interactive computer service providers based solely on a failure to adequately monitor and regulate end-users' harmful messages fall squarely within the protections of Section 230.
Defendants are entitled to Section 230 immunity on each of Plaintiffs' claims, and the FAC is DISMISSED in its entirety. Because the FAC has now been dismissed, the Court need not consider Defendant Apple's Motion to Strike Class Allegations, which the Court DENIES AS MOOT. (Dkt. 52).
C. Leave to Amend
Under Rule 15(a) of the Federal Rules of Civil Procedure, leave to amend "shall be freely given when justice so requires," because "the court must remain guided by the underlying purpose of Rule 15 . . . to facilitate decisions on the merits, rather than on the pleadings or technicalities." Lopez v. Smith, 203 F.3d 1122, 1127 (9th Cir. 2000) (en banc) (citation and internal quotation marks omitted). "The decision of whether to grant leave to amend nevertheless remains within the discretion of the district court," which may deny leave to amend if allowing amendment would unduly prejudice the opposing party, cause undue delay, be futile, or if the party seeking amendment has acted in bad faith. Leadsinger, Inc. v. BMG Music Publ'g, 512 F.3d 522, 532 (9th Cir. 2008) (citing Foman v. Davis, 371 U.S. 178, 182, 83 S.Ct. 227, 9 L.Ed.2d 222 (1962)).
Neither undue delay nor bad faith is implicated here. Nor has the Court issued a previous order addressing Plaintiffs' claims. And given that this litigation is at an early stage, granting Plaintiffs a further opportunity to amend the complaint wouldn't unduly prejudice Defendants. However, although Rule 15 provides that leave to amend should be "freely" given, "that liberality does not apply when amendment would be futile." Ebner v. Fresh, Inc., 838 F.3d 958, 968 (9th Cir. 2016).
It is clear from the FAC that all of Plaintiffs' claims are premised on treating Defendants as publishers or speakers of third-party content sent on Snapchat, and that all such claims are barred by Section 230. Bride, 2023 WL 2016927, at *8 (citing Sikhs for Just., Inc. v. Facebook, Inc., 697 F. App'x 526, 526 (9th Cir. 2017)) ("Because the court finds the core theory underlying Plaintiffs' claims seeks to treat Defendants as a 'publisher or speaker' of the posts of third parties utilizing their applications, the court finds amendment to be futile."). Additionally, Plaintiffs have given no indication, either in their opposition brief or during oral argument, that they can allege additional facts to cure these deficiencies. To the contrary, Plaintiffs have exhaustively set forth all of the facts upon which their claims are based.
Having thoroughly reviewed the record and the applicable law, the Court concludes that Defendants enjoy immunity from liability under Section 230, rendering any attempt to further amend futile. The FAC is therefore DISMISSED WITH PREJUDICE.
III. MOTION TO SEAL
In support of its motion for sanctions, Snap seeks to file limited portions of its motion and accompanying papers under seal—namely specific usernames and display names captured in various screenshots included in the motion and the Declaration of David Boyle. (Dkt. 64 at 1). Snap explains that the "information was included because demonstrating how Snapchat works . . . required taking screenshots of a real person's Snapchat account"—that of Boyle—and the screenshots "consequently display his friends' usernames and display names." (Id.). As a matter of policy, Snap doesn't publicly disclose its users' friend lists, nor would this information otherwise be available to the public.
There is a presumption of public access to judicial records and documents. Nixon v. Warner Commc'ns, Inc., 435 U.S. 589, 597, 98 S.Ct. 1306, 55 L.Ed.2d 570 (1978). Courts generally apply a "compelling reasons" standard when considering motions to seal, recognizing that "a strong presumption in favor of access is the starting point." Kamakana v. City & County of Honolulu, 447 F.3d 1172, 1178 (9th Cir. 2006) (citation and internal quotation marks omitted). But "[s]imply mentioning a general category of privilege, without any further elaboration or any specific linkage with the documents, does not satisfy the burden." Id. at 1184.
Here, Snap's request is narrowly tailored. It asks that only the text in the moving papers displaying specific usernames and display names be sealed. None of the associated users whose screen identities Snap seeks to protect are connected to the present action. Nor do the usernames bear any real importance to the lawsuit, except to demonstrate generally how an individual's username and display name would appear on another user's Quick Add list. Considering all of these circumstances, the Court finds that the "public has a minimal interest" (if any at all) in knowing the specific names displayed in these exemplar screenshots. There are, on the other hand, respectable concerns relating to user privacy and safety that the Court finds are compelling and that tip the balance in favor of sealing the specific portions of the moving papers Snap has identified. See Icon-IP Ltd. v. Specialized Bicycle Components, Inc., No. 12-cv-03844-JST, 2015 WL 984121, at *3 (N.D. Cal. Mar. 4, 2015) (noting that "invasion of [a] third party's privacy" constitutes a compelling reason to file an exhibit under seal). These minimal redactions protect the individuals' privacy, an interest that outweighs the public's interest in disclosure of this particular information.
Snap's motion to seal is GRANTED.
IV. MOTION FOR RULE 11 SANCTIONS
Defendant Snap also filed a Motion for Rule 11 Sanctions against Plaintiffs for making various allegedly false allegations in the FAC. (Dkt. 67). Snap maintains that the sanctionable statements include those related to how Snapchat's Quick Add feature works, whether Snap was aware of the specific instances of misconduct alleged in the FAC, what procedures Snap has in place to guard against users creating multiple accounts, and whether one of the perpetrators mentioned in the FAC indeed had multiple user accounts. Snap requests that the Court dismiss the FAC and order Plaintiffs to file an amended complaint omitting the offending allegations, as well as award attorney fees incurred by Snap in connection with filing its Rule 11 motion.
A. Legal Standard
Federal Rule of Civil Procedure 11 outlines the procedural and substantive requirements that guide a court's decision whether to sanction an attorney. "[T]he central purpose of Rule 11 is to deter baseless filings in district court and . . . streamline the administration and procedure of the federal courts." Cooter & Gell v. Hartmarx Corp., 496 U.S. 384, 393, 110 S.Ct. 2447, 110 L.Ed.2d 359 (1990). Rule 11(c) permits a court to sanction a party and/or its attorney "[i]f, after notice and a reasonable opportunity to respond, the court determines that Rule 11(b) has been violated." Fed. R. Civ. P. 11(c)(1). When sanctions are sought on the basis of a complaint, the court must determine: "(1) whether the complaint is legally or factually 'baseless' from an objective perspective, and (2) if the attorney has conducted a 'reasonable and competent inquiry' before signing and filing it." Christian v. Mattel, Inc., 286 F.3d 1118, 1127 (9th Cir. 2002) (citing Buster v. Greisen, 104 F.3d 1186, 1190 (9th Cir. 1997)).
"A sanction imposed under this rule must be limited to what suffices to deter repetition of the conduct or comparable conduct by others similarly situated." Fed. R. Civ. P. 11(c)(4). If sanctions are imposed based on a motion and are "warranted for effective deterrence," a court may order the payment to the moving party of "part or all of the reasonable attorney's fees and other expenses directly resulting from the violation," in addition to other monetary or nonmonetary remedies. Id. "Rule 11 is an extraordinary remedy, one to be exercised with extreme caution." Operating Eng'rs Pension Tr. v. A-C Co., 859 F.2d 1336, 1345 (9th Cir. 1988).
B. Frivolous
The Court may sanction an attorney or party under Rule 11 for filing a pleading or other paper that is "frivolous." Est. of Blue v. County of Los Angeles, 120 F.3d 982, 985 (9th Cir. 1997). "Frivolous" filings are those that are "both baseless and made without a reasonable and competent inquiry." Townsend v. Holman Consulting Corp., 929 F.2d 1358, 1362 (9th Cir. 1990) (en banc); see Holgate v. Baldwin, 425 F.3d 671, 676 (9th Cir. 2005). After reviewing the factual allegations in the FAC, the parties' moving papers, and the applicable law, the Court concludes that the contested allegations are not entirely baseless, nor were they made without a reasonable and competent inquiry by Plaintiffs.
Regarding Snap's complaint that Plaintiffs mischaracterized how Snapchat's Quick Add feature works—namely that it can be used by perpetrators to "find minor aged children" by geographic location or similar topics of interest, (Dkt. 67 at 7 (citing FAC ¶ 90))—Snap explains that for two users to be recommended to one another through Quick Add, either (1) they must have mutual friends on Snapchat, or (2) one must have the other's phone number or email address in their phone's contact list, (id. at 10-11). Snap has attached the declaration of David Boyle, Director of Product Management at Snapchat, who attests: "Quick Add does not, and never has, enabled users to find and add other users based on geographic location proximity or shared topic interests." (Dkt. 67-2 ¶ 15). Plaintiffs respond by pointing out that Snapchat's privacy policy openly states that it collects user data to determine whether two users are likely to know each other. (Dkt. 72 at 5). But that policy statement doesn't speak to how users of the platform are recommended to one another based on location and shared interests. Plaintiffs also assert that they relied on a "reputable, well-read blog" in making the allegation, but they curiously fail to provide the blog cite or attach the specific blog post that supposedly led them to their belief. (Id.). Instead, they posit a hypothetical scenario in which a sexual predator may be recommended to the friends of a child who is already "friends" with the predator. (Id. at 4). But again, this doesn't explain how Plaintiffs concluded that users may be recommended to one another based on location and shared interests. Plaintiffs' explanation for making this allegation is tenuous and confounding.
Snap also disputes Plaintiffs' allegation that at least one of the adult perpetrators first connected with Plaintiff C.O. using the Quick Add function. The FAC explains that although one of the perpetrators connected with C.O. outside the Snapchat application (on Omegle), at least one other perpetrator first reached out to her using the Quick Add function. (See FAC ¶¶ 74-75). The Court can't assess the veracity of this allegation because it's unclear at this stage of the litigation how or why the perpetrator was recommended to C.O. through Quick Add. And as for the allegations that Snap knew of the specific crimes committed against Plaintiffs and whether one of the perpetrators employed multiple Snapchat accounts to target victims, the Court lacks sufficient facts—one way or the other—to determine the veracity of the allegations.
Finally, Snap contests Plaintiffs' allegation that Snapchat has no procedures in place to identify and expel known and repeat wrongdoers from the platform. (Dkt. 67 at 15 (quoting FAC at 3)). Snap points to various measures it is taking to combat known and potential abuse on its platform, including using CSAM detection tools, removing offending content, banning accounts, blocking users from setting up new accounts, and working with law enforcement by diligently responding to subpoenas and search warrants. (Dkt. 67-3 ¶¶ 9-10, 14-15). Plaintiffs acknowledge some of these measures in their complaint. (See FAC ¶¶ 97-104, 107). And yet despite Plaintiffs' acknowledgement that at least some measures are in place—whether effective or not—the preliminary statement in their FAC states in absolute terms that "Snapchat has no procedures in place to keep known offenders and repeat-abusers away from its platform." (Id. at 3 (emphasis added)). Curiously, in a different section of the FAC, Plaintiffs acknowledge that Snap has attempted to implement safety measures in response to concerns regarding sexual grooming, but "those safety measures are not effective to prevent the foreseeable and known harms of sexual grooming." (Id. ¶¶ 205-06). Plaintiffs' explanation for this apparent gaffe is that their preliminary statement was made in connection with the allegation that users can create multiple accounts—an allegation not totally disputed by Snap. (See Dkt. 67 at 16). The Court regards Plaintiffs' proffered explanation for the discrepancy as a stretch, suggestive of sloppy drafting and demonstrative of the perils of inattentive editing (or perhaps no editing at all). Regardless, though some of Plaintiffs' allegations lack apparent basis in fact, the Court finds that they don't rise to the level of frivolousness sufficient to warrant imposing sanctions at the pleading stage.
Given the Court's ruling on Defendants' respective motions to dismiss and the dismissal of Plaintiffs' claims in their entirety, conducting a further hearing to determine whether any of Plaintiffs' allegations were made with knowing falsity isn't likely to result in a net benefit to any party or to promote future deterrence. While the Court has expressed its concerns with some of the sources of information in the FAC and with the questionable wording of some of the claims, Plaintiffs' allegations aren't so baseless as to qualify as "frivolous" within the meaning of Rule 11.
Snap's motion for sanctions is DENIED.
V. CONCLUSION
Defendants' motions to dismiss the FAC are GRANTED and the FAC is DISMISSED WITH PREJUDICE. (Dkt. 51, 53, 54). Defendant Apple's Motion to Strike Class Allegations is DENIED AS MOOT. (Dkt. 52). The Court GRANTS Defendant Snap's motion to file documents under seal, (Dkt. 64), and DENIES its Rule 11 motion for sanctions, (Dkt. 67).
IT IS SO ORDERED.