
NetChoice, LLC v. Griffin

United States District Court, Western District of Arkansas
Aug 31, 2023
5:23-CV-05105 (W.D. Ark. Aug. 31, 2023)


Opinion


NETCHOICE, LLC PLAINTIFF v. TIM GRIFFIN, in his Official Capacity as Attorney General of Arkansas DEFENDANT


MEMORANDUM OPINION AND ORDER

TIMOTHY L. BROOKS UNITED STATES DISTRICT JUDGE

TABLE OF CONTENTS

I. INTRODUCTION
II. BACKGROUND
   A. The Social Media Safety Act: Objectives and Requirements
   B. Social Media Use Among Minors
   C. Types of Speech Available on NetChoice Members' Platforms
   D. Existing Parental Controls
III. LEGAL STANDARD
IV. DISCUSSION
   A. Standing
      1. Constitutional Standing
      2. Prudential Standing
   B. Likelihood of Success on the Merits
      1. Void for Vagueness: NetChoice Members' Claim
      2. Burdens on First Amendment Rights: Platform Users' Rights
         a. Level of Scrutiny
         b. Burdens on Adults' Access to Speech
         c. Burdens on Minors' Access to Speech
         d. Narrow Tailoring
   C. Irreparable Harm
   D. Balance of the Equities and the Public Interest
V. CONCLUSION

I. INTRODUCTION

This case presents a constitutional challenge to Arkansas Act 689 of 2023, the “Social Media Safety Act” (“Act 689”), a new law that aims to protect minors from harms associated with the use of social media platforms. Act 689, which becomes effective tomorrow, September 1, 2023, requires social media companies to verify the age of all account holders who reside in Arkansas. Self-reporting one's age (a common industry practice) is not sufficient; Arkansans must submit age-verifying documentation before accessing a social media platform.

Under Act 689, a “social media company,” as defined in the Act, must outsource the age-verification process to a third-party vendor. A prospective user of social media must first prove their age by uploading a specified form of identification, such as a driver's license, to the third-party vendor's website. A verified adult may obtain a social media account. Minors, however, will be denied an account and prohibited from accessing social media platforms, unless a parent provides express consent, which will require more proof to confirm the parent's age, identity, and relationship to the minor.

The Plaintiff, NetChoice, LLC, is an Internet trade association whose members include Facebook, Instagram, Twitter, TikTok, Snapchat, Pinterest, and Nextdoor. NetChoice asks the Court to preliminarily enjoin Act 689 from taking effect. NetChoice does not dispute that social media usage poses risks to minors' physical and mental wellbeing. Rather, NetChoice claims the Social Media Safety Act does not provide a constitutional way to address the dangers that minors face online. According to NetChoice, Act 689 is unconstitutionally vague because it is impossible to determine which social media companies and platforms fall within its purview. In addition, NetChoice contends Act 689 violates Arkansans' First Amendment rights. NetChoice argues that Act 689's age-verification requirements are not narrowly tailored to address the harms that minors may face on social media, while at the same time placing an undue burden on both adults' and minors' access to constitutionally protected speech.

The Defendant (the “State”) is Arkansas Attorney General Tim Griffin, who is sued in his official capacity, because his office is tasked to enforce Act 689 on behalf of the State of Arkansas. The State maintains that Act 689 is a constitutional way to curtail minors' access to social media platforms. The State contends that Act 689 is narrowly tailored to address the harms posed by social media, while the alleged burdens are neither too costly for NetChoice members nor too intrusive for Arkansans who wish to open social media accounts. The State concedes that social media platforms host a wide range of protected free speech, but it contends that the slight burden to protected speech is justified by the important goal of protecting minors.

NetChoice's Motion for Preliminary Injunction was fully briefed by the parties, see Docs. 17, 18, 34, 38, and the ACLU submitted an amicus brief in support of NetChoice's position, see Doc. 31. In its response, the State led with an argument that NetChoice lacks standing to assert the First Amendment rights of Arkansas social media users. The Court ordered additional briefing on this issue. See Docs. 39-41.

On August 15, the Court held an evidentiary hearing. The parties introduced documentary evidence, witness declarations, and stipulations of fact. The State also presented the live testimony of its expert witness, Tony Allen. Afterwards, the Court engaged counsel in a lengthy period of oral argument.

Having taken these matters under advisement, the Court now concludes that NetChoice has standing to assert a constitutional challenge to Act 689 on behalf of its members and its members' users. Therefore, for the reasons explained below, the Court finds that NetChoice's arguments are likely to succeed on the merits and its request for a Preliminary Injunction is GRANTED.

II. BACKGROUND

A. The Social Media Safety Act: Objectives and Requirements

According to Act 689, “social media compan[ies]” will be required to “verify the age[s] of . . . account holder[s]” using the age-verification methods sanctioned by the State. See Ark. Code Ann. § 4-88-1102(b)(1). Further, the regulated companies “shall not permit an Arkansas user who is a minor to be an account holder . . . unless the minor has the express consent of a parent or legal guardian.” Id. at § 1102(a).

All citations to Act 689 in this Opinion refer to particular subsections of Chapter 88, Subchapter 11 of the Arkansas Code. For brevity's sake, the Court will cite only to the subsection, e.g., “Act 689 at § 1102(b)(1).”

Not every online company or platform will be subject to the State's new age-verification requirements. Under Act 689, a “social media company” is defined in terms of what account holders may do on the company's platform. A “social media company” is one that permits its account holders to: (1) create a public profile “for the primary purpose of interacting socially with other profiles and accounts”; (2) upload or post content; (3) view content of other account holders; and (4) interact with other account holders “through request and acceptance.” Id. at § 1101(7)(A) (emphasis added). The State offers no guidance on how it will assess the “primary purpose” of account holders. Furthermore, Act 689 specifically exempts any company that “[d]erives less than twenty-five percent (25%) of [its] revenue from operating a social media platform” and “[o]ffers cloud storage services.” Id. at § 1101(7)(B)(iv)(a)-(b). This exemption shields Google (a subsidiary of Alphabet, Inc.) from compliance. Neither Google Hangouts nor Google's video-sharing platform, YouTube, will be required to verify the ages of their account holders.

Act 689 defines “social media platform” as a “public or semipublic internet-based service or application” of which the “substantial function . . . is to allow users to interact socially with each other within the service or application.” Id. at § 1101(8)(A)(ii)(a) (emphasis added). Once again, the State does not identify the criteria it will rely on to determine the “substantial function” of an online platform. Act 689 exempts platforms controlled by businesses that generate less than $100 million in annual revenue. Id. at § 1101(8)(C). As a result, platforms like Parler, Gab, and Truth Social will fall outside the scope of Act 689, even though they may host the same potentially harmful content with which the State is concerned.

Act 689 is littered with other exemptions. For example, companies that exclusively offer interactive online gaming, cloud storage services, cybersecurity services, professional networking, career development, or educational tools need not comply. See id. at § 1101(7)(B)(iii)-(v). Platforms that predominantly or exclusively provide users with email or direct-messaging services are entirely exempt. See id. at § 1101(8)(B)(i)-(ii). Act 689 also gives a free pass to streaming services (for licensed movies or music); platforms that pre-select news, sports, entertainment, or other content for account holders to view; and online shopping or e-commerce sites, provided that the type of user interaction on these sites is limited to posting and commenting on product reviews or displaying lists of goods. See id. at § 1101(8)(B)(iii)-(v).

During the preliminary-injunction hearing, the State called Tony Allen to testify in support of Act 689. Mr. Allen is an expert in age-verification trade standards for the United Kingdom. He serves as the technical editor of the international standard for age assurance systems used in the UK and has global oversight over the operation of the UK's age-assurance standardization program. See State's Hearing Exhibit 1-A. Mr. Allen testified that he was familiar with the sort of robust age-verification requirements Act 689 would likely require, in part due to his work on the UK's Online Safety Bill (“OSB”), which is expected to pass the Houses of Parliament sometime next month. He noted that the OSB is similar to Act 689 in that both laws are likely to require social media companies to tighten their age-verification procedures. However, unlike Act 689, the OSB's age-verification requirements will be triggered by particular content, called “primary priority content,” which the UK has determined is damaging or harmful to minors. Arkansas, in contrast, will require age verification for particular companies at the time of account creation. Examples of the UK's “primary priority content” include adult pornography and information about suicide, self-harm, and dieting.

Mr. Allen analogized a social media platform, like Facebook, to a shopping mall consisting of various “stores” full of content. For example, a Facebook account holder may use the platform to read the news, interact with a favorite actor or author, share family photographs, watch videos of people dancing or singing, review books, order products, or comment on important political events. None of these topics appear to be obscene, illegal, immoral, or otherwise concerning for minors to view. According to Mr. Allen, the OSB will only require rigorous age-verification methods “when the primary priority content risk is triggered”-in other words, when a user approaches the door of a harmful “store” within the “mall.” Act 689, in contrast, requires age verification at the “front door” of the “mall” of online platforms, regardless of the content within.

Mr. Allen also testified about the technology used to perform age verification and the commercial entities that provide age-assurance services in the UK and the European Union.

Act 689 generally permits a company to utilize “[a]ny commercially reasonable age verification method.” Act 689 at § 1102(c)(2)(C). Mr. Allen explained the current state-of-the-art capabilities in online age verification. He testified that a typical scenario would involve a user being asked to verify his or her age online. The user would then be shunted to a third-party servicer that collects official documents, such as digital identification cards or digital driver's licenses. The user would upload documents to prove his or her age. Mr. Allen also explained that artificial intelligence programs could be used to verify age as an alternative (or in addition) to the user providing an identification card. For example, a user could be asked to upload a selfie of his or her face to prove that the user was the same person pictured in the official identity document. In addition, selfies or voice recordings could be required for the servicer to estimate the user's age using artificial intelligence. Mr. Allen opined that uploading and scanning a digital driver's license would take less than a minute, while age estimation using biometric scanning would likely take even less time.

The Act authorizes the use of digitized identification cards or driver's licenses, as well as other digitized forms of government-issued identification. Act 689 at § 1102(c)(2). A “digitized identification card” is defined as “a data file available on a mobile device that has connectivity to the internet through a state-approved application that allows the mobile device to download the data file from the Office of Driver Services that contains all of the data elements visible on the face and back of a driver's license or identification card and displays the current status of the driver's license or identification card, including valid, expired, cancelled, suspended, revoked, active, or inactive.” Id. at § 1101(4). Mr. Allen testified during the hearing that Arkansas was still in the process of developing its “state-approved” online app for downloading data files from the Office of Driver Services, which means that this technology is unlikely to be available when Act 689 takes effect on September 1.

Mr. Allen further explained that once an age-verification servicer gathered enough proof to know that a user was either an adult or a minor, the servicer would generate an encrypted “token” that answered “yes” or “no” to the question of whether the user was an adult. After the “token” was sent electronically to the social media company or platform requesting it, the third-party servicer would then delete the user's documents, images, and other data used to verify age and retain a record of the transaction for billing purposes only.
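To make the mechanics of that token exchange concrete, the flow can be expressed as a minimal sketch in Python. The sketch assumes a hypothetical servicer and platform; the names (AgeVerificationServicer, platform_accepts) and the shared HMAC signing key are invented for illustration and do not describe any actual vendor's system. It models only the properties in the testimony: the servicer discloses a signed yes-or-no answer, the platform can confirm the answer came from the servicer, and the servicer discards the underlying documents, retaining only a billing record.

# Hypothetical sketch of the age-verification "token" flow described in
# Mr. Allen's testimony; names and details are invented, not a vendor API.
import hashlib
import hmac
import json
import secrets
from datetime import datetime, timezone

# Assumed shared secret between the servicer and the requesting platform.
SIGNING_KEY = secrets.token_bytes(32)

class AgeVerificationServicer:
    """Third-party servicer: inspects proof of age, answers only yes or no."""

    def __init__(self) -> None:
        self.billing_log = []  # the only record retained after verification

    def verify(self, documents: dict, platform_id: str) -> str:
        # A real servicer would validate an uploaded ID or run biometric age
        # estimation; this stand-in reads a birth year (ignoring month/day).
        current_year = datetime.now(timezone.utc).year
        is_adult = current_year - documents["birth_year"] >= 18

        body = json.dumps({
            "platform": platform_id,
            "adult": is_adult,  # the only fact disclosed to the platform
            "issued": datetime.now(timezone.utc).isoformat(),
        })
        signature = hmac.new(SIGNING_KEY, body.encode(), hashlib.sha256).hexdigest()

        # Model the deletion step: discard the documents and keep only a
        # billing record of the transaction.
        documents.clear()
        self.billing_log.append({"platform": platform_id, "billed": True})
        return body + "." + signature

def platform_accepts(token: str) -> bool:
    """Requesting platform: verifies the signature, then reads the answer."""
    body, _, signature = token.rpartition(".")
    expected = hmac.new(SIGNING_KEY, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(signature, expected):
        return False  # reject tokens not issued by the trusted servicer
    return json.loads(body)["adult"]

servicer = AgeVerificationServicer()
token = servicer.verify({"birth_year": 1990}, platform_id="example-platform")
print(platform_accepts(token))  # True for a verified adult, False for a minor

Under this model, the platform never handles the user's identity documents at all; it receives only the signed yes-or-no token, consistent with Mr. Allen's description of the servicer deleting the user's data and retaining only a billing record.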

Mr. Allen identified at least one critical gap in Act 689's regulatory structure: How will a regulated company prove that it obtained parental consent for a minor to open a social media account? He testified that in the UK, online parental consent is only required when a minor seeks to perform some action online that the law forbids, such as entering into a contract for the sale or purchase of goods. By contrast, Act 689 will require social media companies to obtain “express consent of a parent or legal guardian” much more frequently (whenever an Arkansas minor seeks to open a social media account) and to use procedures reliable enough to ensure that these companies avoid incurring civil and criminal penalties. See Act 689 at § 1102(a). Implementing these parental-consent procedures will not be an easy task, according to Mr. Allen:

I think the biggest challenge you have with parental consent is actually establishing the relationship, the parental relationship. It's easy to say that this person who is giving the consent is, let's say, in their 40s, versus the person that's asking for the consent being under 18. But actually establishing that that is a parent or a legal guardian, that's the challenge with those processes.

B. Social Media Use Among Minors

“There is broad agreement among the scientific community that social media has the potential to both benefit and harm children and adolescents.” U.S. Surgeon General, Social Media and Youth Mental Health 5 (2023), https://www.hhs.gov/sites/default/files /sg-youth-mental-health-social-media-advisory.pdf (last accessed Aug. 19, 2023) (State's Hearing Exhibit 5). Moreover, “different children and adolescents are affected by social media in different ways, based on their individual strengths and vulnerabilities, and based on cultural, historical, and socio-economic factors.” Id. Still, experts agree that social media use carries significant risk to minors' physical and mental well-being.

The State's medical expert, Dr. Karen Farst, notes in her Affidavit that in her practice, she has encountered numerous examples of children “active on social media” who “thought they were communicating with a same-aged peer, and instead it was someone posing in that role in order to gain trust of the child.” (State's Hearing Exhibit 2, p. 3, ¶ 10). Adult predators frequent the internet, and children “from homes where there has been abuse, neglect, or family discord . . . look to social media for attachments and relationships they do not have within their family.” Id. at ¶ 12. This “makes them more susceptible to being lured into a situation they think is supportive.” Id. Dr. Farst also cautions that youth experience “cyberbullying” while “on social media” and “can become so consumed in their online image that it can lead into unlawful and criminal behavior.” Id. at p. 4, ¶ 14.

Dr. Farst practices in Arkansas and is a licensed pediatrician and member of the American Academy of Pediatrics' Council on Child Abuse and Neglect.

The State observes in its Brief that “[a]dult predators often create fake accounts, posing as minors, and then take advantage of real minors' comfort in online environments, coercing them into sending explicit images of themselves.” (Doc. 34, p. 17) (quotation marks and citation omitted). Moreover, an FBI report the State relies on found:

Financial sextortion schemes occur in online environments where young people feel most comfortable-using common social media sites, gaming sites, or video chat applications that feel familiar and safe. On these platforms, online predators often use fake female accounts and target minor males between 14 to 17 years old, but the FBI has interviewed victims as young as 10 years old.
FBI Nat'l Press Off., FBI and Partners Issue National Public Safety Alert on Financial Sextortion Schemes (Dec. 19, 2022), https://www.fbi.gov/news/press-releases/fbi-and-partners-issue-national-public-safety-alert-on-financial-sextortion-schemes (last accessed Aug. 19, 2023) (State's Hearing Exhibit 6).

In addition, several recent studies have highlighted a possible link between social media use by young people and negative effects on youth mental health. According to an article published by the American Academy of Child & Adolescent Psychiatry, “On average, teens are online almost nine hours a day, not including time for homework.” See Am. Academy of Child & Adolescent Psychiatry, Social Media and Teens (updated Mar. 2018), https://www.aacap.org/AACAP/FamiliesandYouth/FactsforFamilies/FFF-Guide/Social-Media-and-Teens-100.aspx (last accessed Aug. 19, 2023) (State's Hearing Exhibit 7). And according to an advisory report issued this year by the U.S. Surgeon General, a longitudinal cohort study of U.S. adolescents aged 12-15 found “that adolescents who spent more than 3 hours per day on social media faced double the risk of experiencing poor mental health outcomes including symptoms of depression and anxiety.” U.S. Surgeon General, Social Media and Youth Mental Health 4 (2023) https://www.hhs.gov/sites/default/files/sg-youth-mental-health-social-media-advisory.pdf (last accessed Aug. 19, 2023) (State's Hearing Exhibit 5). Yet another study conducted among 14-year-olds “found that greater social media use predicted poor sleep, online harassment, poor body image, low self-esteem, and higher depressive symptom scores with a larger association for girls than boys.” Id. at 7.

The amount of time minors spend online appears indicative of negative mental-health outcomes, and studies indicate that the content minors find online may be to blame. The Surgeon General's report observed:

Extreme, inappropriate, and harmful content continues to be easily and widely accessible by children and adolescents. This can be spread through direct pushes, unwanted content exchanges, and algorithmic designs. In certain tragic cases, childhood deaths have been linked to suicide- and self-harm-related content and risk-taking challenges on social media platforms. This content may be especially risky for children and adolescents who are already experiencing mental health difficulties. Despite social media providing a sense of community for some, a systematic review of more than two dozen studies found that some social media platforms show live depictions of self-harm acts like partial asphyxiation, leading to seizures, and cutting, leading to significant bleeding. Further, these studies found that discussing or showing this content can normalize such behaviors, including through the formation of suicide pacts and posting of self-harm models for others to follow.
Id. at 8.

C. Types of Speech Available on NetChoice Members' Platforms

The parties jointly stipulate that adults and minors use NetChoice members' online services to engage in an array of expressive activity that is protected by the First Amendment. See Court's Exhibit 1. Social media companies and platforms “allow[] users to gain access to information and communicate with one another about it on any subject that might come to mind.” Packingham v. North Carolina, 582 U.S. 98, 107 (2017). “[U]sers employ these websites to engage in a wide array of protected First Amendment activity on topics ‘as diverse as human thought.'” Id. at 105 (quoting Reno v. Am. Civil Liberties Union, 521 U.S. 844, 870 (1997)). “On Facebook, for example, users can debate religion and politics with their friends and neighbors or share vacation photos.” Id. at 104. On “Twitter, users can petition their elected representatives and otherwise engage with them in a direct manner.” Id. at 104-05.

As previously stated, NetChoice, LLC, is an internet trade association consisting of members such as Facebook, Instagram, Twitter, TikTok, Snapchat, Pinterest, and Nextdoor.

According to the Declaration of Carl Szabo, Vice President and General Counsel of NetChoice, minors routinely use the services of NetChoice members “to read the news, connect with friends, explore new interests, and follow their favorite sports teams and their dream colleges,” as well as “showcase their creative talents to others,” “raise awareness about social causes,” and “participate in public discussion on the hottest topics of the day.” (Plaintiff's Hearing Exhibit 1, ¶ 6). Adults use Facebook for various reasons, including taking part in religious services. Id. at ¶ 9. Users of Pinterest share ideas for recipes, style, and home decor. Id.

According to the Declaration of Antigone Davis, Vice President and Global Head of Safety at Meta Platforms, Inc., Facebook and Instagram provide online platforms for users to engage in speech for the purpose of making social connections, showcasing creative talents, gathering information and news about the world, receiving education, and participating in the democratic process. (Plaintiff's Hearing Exhibit 2, ¶¶ 11-20). Her Declaration cites to a number of surveys that indicate social media may promote connectedness, reduce social isolation, establish online-only friendships, assist individuals in finding support and inspiration during times of depression, stress, and anxiety, and allow users of all ages to engage with differing viewpoints on civic issues. Id. at ¶¶ 22-25.

With all that said, it cannot be assumed that NetChoice members host only constitutionally protected speech. Despite these entities' efforts to self-regulate, it is undoubtedly true that social media users of any age may still encounter some speech online that is not entitled to constitutional protection, including true threats, child pornography, obscenity, defamation, fighting words, or speech integral to criminal conduct. See United States v. Alvarez, 567 U.S. 709, 717 (2012). In addition, minors may encounter speech online that is constitutionally protected as to adults, but not as to minors.

D. Existing Parental Controls

Of course, parents may rightly decide to regulate their children's use of social media, including restricting the amount of time they spend on it, the content they may access, or even whom they chat with. And many tools exist to help parents with this.

Cell carriers and broadband providers offer parents tools to block certain applications and websites from their children's devices, ensure that their children are texting and chatting with trusted contacts only, and restrict their children's access to screen time during certain hours of the day. See, e.g., Verizon, Verizon Smart Family, https://tinyurl.com/ycyxy6x6 (Plaintiff's Hearing Exhibit 17); AT&T, Parental Controls, https://tinyurl.com/3ypvj7bv (Plaintiff's Hearing Exhibit 6); T-Mobile, Family Controls and Privacy, https://tinyurl.com/57run7ac (Plaintiff's Hearing Exhibit 16); Comcast Xfinity, Set Up Parental Controls for the Internet, https://tinyurl.com/5acdsnat (Plaintiff's Hearing Exhibit 7).

Wireless routers, which provide internet connectivity, also offer parental control settings. See Molly Price & Ry Crist, How to Set Up and Use Your Wi-Fi Router's Parental Controls, CNET (Feb. 11, 2021), https://archive.ph/wip/uGaN2 (Plaintiff's Hearing Exhibit 15). Parents can use these settings to block certain websites or online services that they deem inappropriate, set individualized content filters for their children, and monitor the websites their children visit and the services they use. See Netgear, Circle Smart Parental Controls, https://archive.ph/wip/0GbB5 (Plaintiff's Hearing Exhibit 14). Parents can also use router settings to turn off their home internet at particular times of day, pause internet access for a particular device or user, or limit the amount of time that a child can spend on a particular website or online service. Id.

Additional parental controls are available at the device level. For example, iPhones and iPads empower parents to limit the amount of time their children can spend on the device, choose which applications (e.g., YouTube, Facebook, Snapchat, or Instagram) their children can use, set age-related content restrictions for those applications, filter online content, and control privacy settings. See Apple, Use Parental Controls on Your Child's iPhone, iPad, and iPod Touch, https://archive.ph/T68VI (Plaintiff's Hearing Exhibit 5). Google and Microsoft similarly offer parental controls for their devices. See Google Family Link, Help Keep Your Family Safer Online, https://tinyurl.com/mr4bnwpy (Plaintiff's Hearing Exhibit 8); Microsoft, Getting Started with Microsoft Family Safety, https://tinyurl.com/yc6kyruh (Plaintiff's Hearing Exhibit 10). In addition, numerous third-party applications allow parents to control and monitor their children's use of Internet-connected devices and online services. See Ben Moore & Kim Key, The Best Parental Control Apps for Your Phone, PCMag (Mar. 29, 2022), https://archive.ph/HzzfH (Plaintiff's Hearing Exhibit 12).

Parental controls on internet browsers offer another layer of protection. Apple Safari, Google Chrome, Microsoft Edge, and Mozilla Firefox offer parents tools to control which websites their children can access. See, e.g., Mozilla, Block and Unblock Websites with Parental Controls on Firefox, https://tinyurl.com/6u6trm5y (Plaintiff's Hearing Exhibit 13). Microsoft offers “Kids Mode,” which allows children to access only a pre-approved list of websites. See Microsoft, Learn More About Kids Mode in Microsoft Edge, https://tinyurl.com/59wsev2k (Plaintiff's Hearing Exhibit 11). Google has a similar feature. It also provides parents with “activity reports,” allowing them to see what apps and websites their children access most frequently. See Google, Safety Center, https://tinyurl.com/kwkeej9z (Plaintiff's Hearing Exhibit 9).

To be sure, parents may or may not use these tools. Mr. Allen's Declaration explains that even though these filtering controls can be “applied in the home, on the router or on laptops, tablets, and smartphones through family cellular plans,” research indicates “that many parents are unaware of this technology” or “do not know how to use it, or discover their children also know how to use it or have circumvented it some other way.” (State's Hearing Exhibit 1, p. 5). Furthermore, he attests, “Children can be very persuasive, and parents might release the controls to allow them to play a game designed for 18+ within a social media platform, unaware the game or platform may be a portal to pornographic or other unsuitable content and dangerous functionality.” Id.

NetChoice members have developed their own policies and practices designed to protect minors who use social media. Facebook, Instagram, Twitter, Pinterest, Snapchat, and Nextdoor require users in the United States to be at least 13 years old before they can create an account, though account holders are asked only to self-report their ages. See Declaration of Justyn Harriman, Plaintiff's Hearing Exhibit 4, ¶ 13. TikTok offers a limited app experience for users under 13 called “TikTok for Younger Users” where children are provided a viewing experience that does not permit them to share personal information and puts extensive limitations on content and user interaction. (Doc. 2, ¶ 17). TikTok also partners with Common Sense Networks to try to ensure that content is both age-appropriate and safe for an audience under 13. Id. Facebook, Instagram, TikTok, and Pinterest default to private settings for teenage users when they sign up for accounts, and these platforms claim they encourage their teenage users to choose more private settings through prompts and suggestions. See Doc. 2, ¶ 17; Declaration of Antigone Davis, Plaintiff's Hearing Exhibit 2, ¶ 31.

NetChoice members also attempt to curate the content that users post on their platforms. See, e.g., Declaration of Justyn Harriman, Plaintiff's Hearing Exhibit 4, ¶ 9. Members attempt to restrict the uploading of violent and sexual content, bullying, and harassment. See Declaration of Carl Szabo, Plaintiff's Hearing Exhibit 1, ¶ 7. Several NetChoice members use age-verification technology to try to keep minors from seeing certain content visible to adults, or to keep younger teens from seeing content visible to older teens. Id. NetChoice members implement these policies through algorithms, automated editing tools, and human review. See Declaration of Antigone Davis, Plaintiff's Hearing Exhibit 2, ¶¶ 27, 36. If a platform decides that certain content violates its policies, it may remove the content, restrict it, or add a warning label or a disclaimer to accompany it. See Declaration of Justyn Harriman, Plaintiff's Hearing Exhibit 4, ¶ 10.

Evidence received by the Court demonstrates that NetChoice members also provide users with tools to curate the content they wish to see. Facebook users can control the content that Facebook recommends to them by hiding a post or opting to see fewer posts from a specific person or group. See Declaration of Antigone Davis, Plaintiff's Hearing Exhibit 2, ¶ 41. Instagram users can use the “not interested” button or keyword filters (for example, “fitness” or “recipes” or “fashion”) to filter out content they do not wish to see. Id. Parents can use Instagram's “supervision tools” to see how much time their teens spend on Instagram, set time limits and scheduled breaks, receive updates on which accounts their teens follow and the accounts that follow their teens, and receive notifications if a change is made to their child's settings. Id. at ¶ 28. Instagram also uses online prompts and safety notices to encourage teens to be cautious in their conversations with adults, even those they may already know. Id. at ¶ 30. Further, Instagram informs young people when an adult who has been exhibiting potentially suspicious behavior tries to interact with them. Id. Instagram claims that if an adult is sending a large number of friend or message requests to people under age 18, or if the adult has recently been blocked by people under age 18, the platform alerts the recipients and gives them an option to end the conversation and block, report, or restrict the adult. Id. TikTok also has a “family pairing” feature that allows parents to, among other things, set a screen-time limit, restrict exposure to certain content, decide whether their teen's account is private or public, turn off direct messaging, and decide who can comment on their teen's videos. See Declaration of Carl Szabo, Plaintiff's Hearing Exhibit 1, ¶ 7.

Mr. Allen offered helpful testimony about his impressions of NetChoice members' internal parental-control features. He agreed that “the vast majority [of member platforms] have . . . family control centers” and similar features that allow parents to “set [their] preferences and controls [they] want to have in place for [their children].” These controls are available “on the individual platform” or “can be programmed as part of the device that [is being used] to access them,” i.e., through the user's phone or laptop computer. Mr. Allen believes the key to keeping children safe online is to age-gate harmful content through rigorous methods and to dramatically increase parents' use of filtering and other control methods to curate and monitor minors' activities online.

III. LEGAL STANDARD

In determining whether to grant a motion for preliminary injunction to a plaintiff with standing, the Court must weigh the following four considerations: (1) the threat of irreparable harm to the moving party; (2) the movant's likelihood of success on the merits; (3) the balance between the harm to the movant if the injunction is denied and the harm to the other party if the injunction is granted; and (4) the public interest. Dataphase Sys., Inc. v. CL Sys., Inc., 640 F.2d 109, 114 (8th Cir. 1981). “While no single factor is determinative, the probability of success factor is the most significant.” Kodiak Oil & Gas (USA) Inc. v. Burr, 932 F.3d 1125, 1133 (8th Cir. 2019) (citation and quotation omitted). In particular, “[w]hen a Plaintiff has shown a likely violation of his or her First Amendment rights, the other requirements for obtaining a preliminary injunction are generally deemed to have been satisfied.” Phelps-Roper v. Troutman, 662 F.3d 485, 488 (8th Cir. 2011) (per curiam), vacated on reh'g on other grounds, 705 F.3d 845 (8th Cir. 2012).

IV. DISCUSSION

A. Standing

To bring a cause of action in federal court, the plaintiff must establish standing to sue. City of Clarkson Valley v. Mineta, 495 F.3d 567, 569 (8th Cir. 2007). “In essence the question of standing is whether the litigant is entitled to have the court decide the merits of the dispute or of particular issues.” Warth v. Seldin, 422 U.S. 490, 498 (1975). The “inquiry involves both constitutional limitations on federal-court jurisdiction and prudential limitations on its exercise.” Id. Constitutional standing addresses who has the right to invoke the power of a court (e.g., by filing a lawsuit), while prudential standing addresses what arguments a party may raise as a claim or defense. See Curtis A. Bradley & Ernest A. Young, Unpacking Third-Party Standing, 131 Yale L.J. 1, 26 (2021).

For a plaintiff to prove it has constitutional standing to sue under Article III, it must demonstrate it has suffered, or will suffer, an injury-in-fact that is concrete and particularized, actual or imminent, fairly traceable to the defendant's actions, and likely to be redressed by a favorable decision. Lujan v. Defenders of Wildlife, 504 U.S. 555, 560-61 (1992). Requiring a plaintiff to establish constitutional standing to sue “ensures that the Federal Judiciary confines itself to its constitutionally limited role of adjudicating actual and concrete disputes, the resolutions of which have direct consequences on the parties involved.” United States v. Sanchez-Gomez, 138 S.Ct. 1532, 1537 (2018).

NetChoice contends that its constitutional standing enables it to bring two separate claims. First is a due process claim made on behalf of NetChoice's members. NetChoice argues that certain pivotal terms in Act 689 are too vague to be understood by the regulated parties and uniformly enforced by the State, which makes Act 689 unconstitutional. The State does not dispute that NetChoice has constitutional standing to assert this claim, as it clearly arises from economic injuries that are fairly traceable to Act 689's regulatory requirements.

Second, NetChoice asserts a constitutional claim on behalf of Arkansans. It argues that Act 689's regulatory requirements unconstitutionally burden Arkansans' First Amendment rights. The State maintains that NetChoice lacks prudential standing to assert this claim.

1. Constitutional Standing

As the Court previously noted, NetChoice is an internet trade association with members who are subject to Act 689's requirements. Though an entity like NetChoice is not directly injured by a law, it may nevertheless assert associational standing on behalf of its injured members. See Higgins Elec., Inc. v. O'Fallon Fire Prot. Dist., 813 F.3d 1124, 1128 (8th Cir. 2016). To establish associational standing, the entity must show: (1) its members would have standing to sue in their own right; (2) the suit seeks to protect interests germane to the association's purpose; and (3) neither the claim asserted nor the relief requested requires the individual members of the association to participate in the lawsuit. See Hunt v. Wash. State Apple Advert. Comm'n, 432 U.S. 333, 343 (1977).

NetChoice establishes associational standing on behalf of its members due to the non-speculative economic injury members must incur to comply with Act 689. Economic injury associated with state regulatory requirements forms a sufficient basis for first-party standing. See Virginia v. Am. Booksellers Ass'n, Inc., 484 U.S. 383, 393 (1988) (finding that a booksellers' association had constitutional standing to challenge a state law “aimed directly” at it, since the association would “have to take significant and costly compliance measures or risk criminal prosecution”).

If Act 689 goes into effect, the member entities will have three choices: incur expenses to implement an age-verification system in compliance with the Act; bar Arkansans from opening accounts on all regulated platforms; or face criminal penalties and civil enforcement actions brought by the Arkansas Attorney General. See Doc. 18, p. 45 (arguing that those entities “covered by the Act will face a perilous choice between exposing themselves to massive liability for disseminating speech to minors or taking costly and burdensome steps that will drastically curtail access to their online services, all before a court decides the merits of their claims.”); Doc. 17-2, p. 21, Declaration of Antigone Davis, Vice President, Global Head of Safety, Meta (explaining that Act 689 requires “substantial and burdensome changes to the design and operation of the Facebook and Instagram services”); Doc. 17-4, p. 11, Declaration of Justyn Harriman, Senior Engineering Manager, Trust & Safety and Verification, Nextdoor (explaining that it would take Nextdoor “at least six months” to implement Act 689's requirements and would increase costs “by up to 3000%”).

While the State quibbles with precisely how burdensome Act 689 will prove in practice, it does not deny that compliance will impose some costs. The injuries here are sufficient to establish that NetChoice members would have standing to sue in their own right, and thereby satisfy the first prong of the associational-standing test. See Dakota Energy Coop., Inc. v. E. River Elec. Power Coop., Inc., 2023 WL 4834598, at *2 (8th Cir. July 28, 2023) (finding a “risk of direct financial harm establishes injury in fact for standing purposes” (brackets and quotations omitted)).

As NetChoice has standing under an economic theory of injury, it is not necessary for the Court to evaluate its non-economic theory of injury at this time.

As for the second prong of associational standing, the relief sought by NetChoice is central to its organizational purpose of “mak[ing] the Internet safe for free enterprise and free expression.” (Declaration of Carl Szabo, Plaintiff's Hearing Exhibit 1). And as to the third prong, the resolution of NetChoice's claims does not require the “individual participation of each injured party.” United Food & Com. Workers Union Loc. 751 v. Brown Grp., Inc., 517 U.S. 544, 552 (1996) (brackets omitted) (quoting Warth, 422 U.S. at 511). NetChoice's “claims can be proven by evidence from representative injured members, without fact-intensive individual inquiry,” and, under these circumstances, “the participation of [certain] individual members does not thwart associational standing.” Ass'n of Am. Physicians & Surgeons, Inc. v. Texas Med. Bd., 627 F.3d 547, 552 (5th Cir. 2010). In sum, NetChoice possesses constitutional standing to challenge Act 689.

Many courts have found the third prong of the associational standing test to be prudential. See Housatonic River Initiative v. United States Env't Prot. Agency, New England Region, 75 F.4th 248, 265 (1st Cir. 2023) (“The first two prongs of this test have constitutional dimensions; the third prong is prudential.”) (citing United Food, 517 U.S. at 554-58). However, the prudential nature of NetChoice's associational standing is not at issue.

2. Prudential Standing

Prudential standing asks, “who, according to the governing substantive law, is entitled to enforce the right.” Abraugh v. Altimus, 26 F.4th 298, 304 (5th Cir. 2022). “Even when Article III permits the exercise of federal jurisdiction, prudential considerations demand that the Court insist upon ‘that concrete adverseness which sharpens the presentation of issues upon which the court so largely depends for illumination of difficult constitutional questions.'” United States v. Windsor, 570 U.S. 744, 760 (2013) (quoting Baker v. Carr, 369 U.S. 186, 204 (1962)). Accordingly, the “prudential standing rule . . . normally bars litigants from asserting the rights or legal interests of others in order to obtain relief from injury to themselves.” Warth, 422 U.S. at 509. However, there are exceptions to the rule, which, the Court concludes, enable NetChoice to properly assert both a due process challenge based on direct injury to its members and a First Amendment challenge based on indirect injury to Arkansans.

“There is no prudential standing bar when member-based organizations advocate for the rights of their members.” Memphis A. Philip Randolph Inst. v. Hargett, 2 F.4th 548, 557 (6th Cir. 2021). By establishing associational standing, NetChoice also establishes itself as the appropriate party to raise a due process challenge to Act 689 based on direct injury to the due process rights of its members.

The Supreme Court has also held that a litigant may assert the rights of a third party “when enforcement of the challenged restriction against the litigant would result indirectly in the violation of third parties' rights.” Warth, 422 U.S. at 510. Here, the State contends no such exception applies because NetChoice members cannot adequately advocate in favor of their users' First Amendment rights. According to the State:

Social-media companies are businesses that seek a profit, and they do not have the same concerns as parents and children. This is a “substantial conflict” of interests that would make NetChoice an ineffective proponent of the rights of the users and thus defeats any potentially close relationship.
(Doc. 41, p. 3) (cleaned up).

The Court disagrees. The relationship between NetChoice members and their users is analogous to the relationship between vendors of goods and their customers, and the Supreme Court has held that vendors have prudential standing to advocate in favor of their customers' constitutional rights when those rights are burdened by the state's regulation of the vendor. For example, in Craig v. Boren, a licensed vendor of beer and her underage male customer challenged the constitutionality of gender-based distinctions in Oklahoma's liquor laws. 429 U.S. 190, 193 (1976). During the pendency of the lawsuit, the customer, Craig, turned 21, so his claim became moot. Nevertheless, the Court held that the vendor, Whitener, had standing to assert constitutional equal protection claims on behalf of Craig and other underage male customers. The Court explained:

As a vendor with standing to challenge the lawfulness of [Oklahoma's liquor laws], appellant Whitener is entitled to assert those concomitant rights of third parties that would be ‘diluted or adversely affected' should her constitutional challenge fail and the statutes remain in force. Otherwise, the threatened imposition of governmental sanctions might deter appellant Whitener and other similarly situated vendors from selling 3.2% beer to young males, thereby ensuring that “enforcement of the challenged restriction against the (vendor) would result indirectly in the violation of third parties' rights.” Warth v. Seldin, 422 U.S. 490, 510 (1975). Accordingly, vendors and those in like positions have been uniformly permitted to resist efforts at restricting their operations by acting as advocates of the rights of third parties who seek access to their market or function.
Id. at 195.

Just a year after the decision in Craig, the Court took up a similar prudential standing question in Carey v. Population Services International, 431 U.S. 678, 682-84 (1977). There, the Court decided that a vendor of contraceptive devices had standing to challenge the constitutionality of a New York law that restricted the sale of such devices. The vendor “ha[d] standing to challenge [state law], not only in its own right but also on behalf of its potential customers,” as was “settled in Craig v. Boren.” Id. at 683.

Since Carey, many circuit courts have found prudential standing to exist in the context of vendor-customer relationships. See, e.g., Postscript Enters., Inc. v. Whaley, 658 F.2d 1249, 1252 (8th Cir. 1981) (finding that a vendor had standing, not only in its individual capacity, but also with respect to its ability to assert the rights of its present and potential customers in challenging a municipal ordinance that banned the sale of contraceptives and related products except by certain entities); Md. Shall Issue, Inc. v. Hogan, 971 F.3d 199 (4th Cir. 2020) (holding that firearms dealer had third-party standing to pursue claim that Maryland's handgun qualification license violated its potential customers' Second Amendment rights); Kaahumanu v. Hawaii, 682 F.3d 789, 797 (9th Cir. 2012) (agreeing that a wedding planner had standing to challenge permitting regulations on behalf of those seeking to marry); Ezell v. City of Chi., 651 F.3d 684, 696 (7th Cir. 2011) (allowing a vendor to challenge city ordinance banning firing-range facilities for third parties who sought access to those facilities); United States v. Extreme Assocs., Inc., 431 F.3d 150, 155 (3d Cir. 2005) (holding that a vendor of obscene materials had standing to challenge a federal obscenity statute on behalf of its customers).

Another Supreme Court case, Virginia v. American Booksellers Association, Inc., is particularly compelling. 484 U.S. 383, 393 (1988). In Virginia, a booksellers' association challenged the constitutionality of a state law on the ground that it infringed on the First Amendment rights of book buyers. The trial court dismissed the book buyer plaintiffs after finding their claims were too speculative. Id. at 392. When the case finally made its way to the Supreme Court, the state argued that the booksellers' association lacked standing to bring a First Amendment challenge on behalf of its book buying customers. The Court rejected that argument, reasoning:

Even if an injury in fact is demonstrated, the usual rule is that a party may assert only a violation of its own rights. However, in the First Amendment context, “‘[l]itigants . . . are permitted to challenge a statute not because their own rights of free expression are violated, but because of a judicial prediction or assumption that the statute's very existence may cause others not before the court to refrain from constitutionally protected speech or expression.'” Secretary of State of Maryland v. J.H. Munson Co., 467 U.S. 947, 956-957, 104 S.Ct. 2839, 2846-2847, 81 L.Ed.2d 786 (1984), quoting Broadrick v. Oklahoma, 413 U.S. 601, 612, 93 S.Ct. 2908, 2916, 37 L.Ed.2d 830 (1973). This exception applies here, as plaintiffs have alleged an infringement of the First Amendment rights of bookbuyers.
Id. at 392-93.

Turning now to the instant case, the State argues that the vendor line of cases cited above is inapposite because those cases involved existing customers, while Act 689 seeks to regulate hypothetical future users of social media platforms. During the hearing, the State pointed the Court to Kowalski v. Tesmer, a case in which the Supreme Court found that criminal defense attorneys lacked third-party standing to challenge the constitutionality of a Michigan statute on behalf of hypothetical future clients. 543 U.S. 125, 130-31 (2004). The Court finds Kowalski to be clearly distinguishable from the case at bar because the contested issue there was Article III standing, not prudential standing.

The Supreme Court dismissed Kowalski upon finding that the attorneys who filed suit had no injury-in-fact and, thus, no constitutional standing in their own right. Here, NetChoice asserts a cognizable economic injury-in-fact that directly arises from compliance with Act 689. In addition, NetChoice has asserted the constitutional rights of its users and the injuries that users are likely to suffer as a direct result of the State's regulation of NetChoice's members. These concerns are not speculative. Moreover, the Court finds that NetChoice members are well positioned to raise these concerns. They have a thorough understanding of the content hosted on their platforms and the ways in which their customers exercise their First Amendment rights on those platforms. The Court therefore concludes that NetChoice, like the booksellers' association in the Virginia case, is in a unique position to advocate for the rights of Arkansas users and may appropriately do so here.

B. Likelihood of Success on the Merits

1. Void for Vagueness: NetChoice Members' Claim

NetChoice argues that Act 689 violates the due process rights of its members because pivotal terms are unconstitutionally vague. As the Supreme Court explained in Grayned v. City of Rockford:

It is a basic principle of due process that an enactment is void for vagueness if its prohibitions are not clearly defined. Vague laws offend several important values. First, because we assume that man is free to steer between lawful and unlawful conduct, we insist that laws give the person of ordinary intelligence a reasonable opportunity to know what is prohibited, so that he may act accordingly. Vague laws may trap the innocent by not providing fair warning. Second, if arbitrary and discriminatory enforcement is to be prevented, laws must provide explicit standards for those who apply them.
408 U.S. 104, 108 (1972) (footnotes omitted).

A regulation “violates the first essential of due process of law” by failing to provide adequate notice of prohibited conduct. Connally v. General Constr. Co., 269 U.S. 385, 391 (1926) (citations omitted). A court should find a regulation unconstitutional if it “forbids or requires the doing of an act in terms so vague that [persons] of common intelligence must necessarily guess at its meaning and differ as to its application . . . .” Id.

Although the “void for vagueness” doctrine often applies to criminal laws enacted under a state's penal code, see Vill. of Hoffman Estates v. Flipside, Hoffman Estates, Inc., 455 U.S. 489, 498-99 (1982), the doctrine is applicable here because Act 689 not only imposes possible criminal and civil penalties on companies that fail to comply with its requirements, but also interferes with their customers' access to constitutionally protected speech. The void-for-vagueness doctrine provides that “regulated parties should know what is required of them so they may act accordingly . . . [and that] precision and guidance are necessary so that those enforcing the law do not act in an arbitrary or discriminatory way.” F.C.C. v. Fox TV Stations, Inc., 567 U.S. 239, 253 (2012) (citing Grayned, 408 U.S. at 108-109). The stakes are even higher, however, “[w]hen speech is involved.” Id. at 253-54. It is critical “to ensure that ambiguity does not chill protected speech.” Id.

Act 689 states that a regulated “social media company” is to be held strictly liable for “fail[ing] to perform a reasonable age verification.” Act 689 at § 1103(a)(1). The Act contemplates the imposition of a Class A misdemeanor penalty for non-compliance, see § 1103(b)(1) (cross-referencing Ark. Code Ann. § 4-88-103), a possible civil enforcement action by the Attorney General, see § 1103(b)(2) (cross-referencing Ark. Code Ann. § 4-88-104), and civil lawsuits brought by aggrieved citizens, see § 1103(c)(1).

“It is essential that legislation aimed at protecting children from allegedly harmful expression-no less than legislation enacted with respect to adults-be clearly drawn and that the standards adopted be reasonably precise so that those who are governed by the law and those that administer it will understand its meaning and application.” Interstate Circuit, Inc. v. City of Dallas, 390 U.S. 676, 689 (1968) (striking down city ordinance imposing misdemeanor penalty on movie theaters for showing films “unsuitable for minors” as impermissibly vague); see also Joseph Burstyn, Inc. v. Wilson, 343 U.S. 495, 497 (1952) (invalidating state law banning motion picture distributors from distributing “sacrilegious” movies due to vague standards); Winters v. New York, 333 U.S. 507, 518-19 (1948) (finding unconstitutionally vague a state law regulating the distribution of certain commercial publications).

Here, Act 689 is unconstitutionally vague because it fails to adequately define which entities are subject to its requirements. A “social media company” is defined as “an online forum that a company makes available for an account holder” to “[c]reate a public profile, establish an account, or register as a user for the primary purpose of interacting socially with other profiles and accounts,” “[u]pload or create posts or content,” “[v]iew posts or content of other account holders,” and “[i]nteract with other account holders or users, including without limitation establishing mutual connections through request and acceptance.” Act 689 at § 1101(7)(A) (emphasis added). But the statute neither defines “primary purpose” (a term critical to determining which entities fall within Act 689's scope) nor provides any guidelines about how to determine a forum's “primary purpose,” leaving companies to choose between risking unpredictable and arbitrary enforcement (backed by civil penalties, attorneys' fees, and potential criminal sanctions) and trying to implement the Act's costly age-verification requirements. Such ambiguity renders a law unconstitutional.

Even when speech is not at issue, the void for vagueness doctrine addresses at least two connected but discrete due process concerns: first, that regulated parties should know what is required of them so they may act accordingly; second, precision and guidance are necessary so that those enforcing the law do not act in an arbitrary or discriminatory way. When speech is involved, rigorous adherence to those requirements is necessary to ensure that ambiguity does not chill protected speech.
See Fox, 567 U.S. at 253-54. “[A] regulation is not vague because it may at times be difficult to prove an incriminating fact but rather because it is unclear as to what fact must be proved.” Id. (emphasis added) (citing United States v. Williams, 553 U.S. 285, 306 (2008)). Here, NetChoice argues that the actions and intentions of platform users drive whether a platform is subject to regulation, and because the motivations of platform users are varied, it is impossible for companies to know whether they are subject to regulation.

The State argues that Act 689's definitions are clear and that “any person of ordinary intelligence can tell that [Act 689] regulates Meta, Twitter[,] and TikTok.” (Doc. 34, p. 20). But what about other platforms, like Snapchat? David Boyle, Snapchat's Senior Director of Products, stated in his Declaration that he was not sure whether his company would be regulated by Act 689. He initially suspected that Snapchat would be exempt until he read a news report quoting one of Act 689's co-sponsors who claimed Snapchat was specifically targeted for regulation. See Plaintiff's Hearing Exhibit 3, ¶ 8 (citing Brian Fung, Arkansas Governor Signs Sweeping Bill Imposing a Minimum Age Limit for Social Media Usage, CNN.com (Apr. 12, 2023), https://www.cnn.com/2023/04/12/tech/arkansas-social-media-age-limit/index.html).

During the evidentiary hearing, the Court asked the State's expert, Mr. Allen, whether he believed Snapchat met Act 689's definition of a regulated “social media company.” He responded in the affirmative, explaining that Snapchat's “primary purpose” matched Act 689's definition of a “social media company” (provided it was true that Snapchat also met the Act's profitability requirements). When the Court later asked the State's attorney the same question, he gave a contrary answer-which illustrates the ambiguous nature of key terms in Act 689. The State's attorney disagreed with Mr. Allen-his own witness-and said the State's official position was that Snapchat was not subject to regulation because of its “primary purpose.”

Other provisions of Act 689 are similarly vague. The Act defines the phrase “social media platform” as an “internet-based service or application . . . [o]n which a substantial function of the service or application is to connect users in order to allow users to interact socially with each other within the service or application”; but the Act excludes services in which “the predominant or exclusive function is” “[d]irect messaging consisting of messages, photos, or videos” that are “[o]nly visible to the sender and the recipient or recipients” and “[a]re not posted publicly.” Act 689 at § 1101(8)(A)-(B) (emphasis added). Again, the statute defines neither “substantial function” nor “predominant . . . function.” Many services allow users to send direct, private messages consisting of texts, photos, or videos, but also offer other features that allow users to create content that anyone can view. Act 689 does not explain how such services are to determine which function is “predominant,” leaving them to guess whether they are regulated.

Act 689 also fails to define what type of proof will be sufficient to demonstrate that a platform has obtained the “express consent of a parent or legal guardian.” Id. at § 1102(a). If a parent wants to give her child permission to create an account, but the parent and the child have different last names, it is not clear what, if anything, the social media company or third-party servicer must do to prove a parental relationship exists. And if a child's parents are divorced and disagree about granting permission, proof of express consent will be that much trickier to establish-especially without guidance from the State.

These ambiguities were highlighted by the State's own expert, who testified that “the biggest challenge . . . with parental consent is actually establishing the relationship, the parental relationship.” Since the State offers no guidance about the sort of proof that will be required to show parental consent, it is likely that once Act 689 goes into effect, the companies will err on the side of caution and require detailed proof of the parental relationship. As a result, parents and guardians who otherwise would have freely given consent to open an account will be dissuaded by the red tape and refuse consent-which will unnecessarily burden minors' access to constitutionally protected speech.

For all these reasons, the Court finds that NetChoice is likely to succeed on the merits of its vagueness claim, and the law is likely to be unconstitutional on that basis alone.

2. Burdens on First Amendment Rights: Platform Users' Rights

a. Level of Scrutiny

NetChoice contends that Act 689's age-verification requirements target speech on social media websites and platforms based on content, speaker, and viewpoint, so the Act is subject to strict scrutiny. The State disagrees and argues that Act 689's age-verification requirements are merely a content-neutral regulation on access to speech at particular “locations,” so intermediate scrutiny should apply. According to the Supreme Court's seminal opinion in Ward v. Rock Against Racism:

The principal inquiry in determining content neutrality, in speech cases generally and in time, place, or manner cases in particular, is whether the government has adopted a regulation of speech because of disagreement with the message it conveys. The government's purpose is the controlling consideration. A regulation that serves purposes unrelated to the content of expression is deemed neutral, even if it has an incidental effect on some speakers or messages but not others.
491 U.S. 781, 791-92 (1989).

Deciding whether Act 689 is content-based or content-neutral turns on the reasons the State gives for adopting the Act. First, the State argues that the more time a minor spends on social media, the more likely it is that the minor will suffer negative mental-health outcomes, including depression and anxiety. Second, the State points out that adult sexual predators on social media seek out minors and victimize them in various ways. Therefore, to the State, a law limiting access to social media platforms based on the user's age would be content-neutral and require only intermediate scrutiny.

On the other hand, the State points to certain speech-related content on social media that it maintains is harmful for children to view. Some of this content is not constitutionally protected speech, while other content, though potentially damaging or distressing, especially to younger minors, is likely protected nonetheless. Examples of this type of speech include depictions and discussions of violence or self-harm, information about dieting, so-called “bullying” speech, or speech targeting a speaker's physical appearance, race or ethnicity, sexual orientation, or gender. If the State's purpose is to restrict access to constitutionally protected speech based on the State's belief that such speech is harmful to minors, then arguably Act 689 would be subject to strict scrutiny.

During the hearing, the State advocated for intermediate scrutiny and framed Act 689 as “a restriction on where minors can be,” emphasizing it was “not a speech restriction” but “a location restriction.” The State's briefing analogized Act 689 to a restriction on minors entering a bar or a casino. But this analogy is weak. After all, minors have no constitutional right to consume alcohol, and the primary purpose of a bar is to serve alcohol. By contrast, the primary purpose of a social media platform is to engage in speech, and the State stipulated that social media platforms contain vast amounts of constitutionally protected speech for both adults and minors. Furthermore, Act 689 imposes much broader “location restrictions” than a bar does. The Court inquired of the State why minors should be barred from accessing entire social media platforms, even though only some of the content was potentially harmful to them, and the following colloquy ensued:

THE COURT: Well, to pick up on Mr. Allen's analogy of the mall, I haven't been to the Northwest Arkansas mall in a while, but it used to be that there was a restaurant inside the mall that had a bar. And so certainly minors could not go sit at the bar and order up a drink, but they could go to the Barnes & Noble bookstore or the clothing store or the athletic store. Again, borrowing Mr. Allen's analogy, the gatekeeping that Act 689 imposes is at the front door of the mall, not the bar inside the mall; yes?
THE STATE: The state's position is that the whole mall is a bar, if you want to continue to use the analogy.
THE COURT: The whole mall is a bar?
THE STATE: Correct.

Clearly, the State's analogy is not persuasive.

NetChoice argues that Act 689 is not a content-neutral restriction on minors' ability to access particular spaces online, and the fact that there are so many exemptions to the definitions of “social media company” and “social media platform” proves that the State is targeting certain companies based either on a platform's content or its viewpoint. Indeed, Act 689's definitions and exemptions do seem to indicate that the State has selected a few platforms for regulation while ignoring all the rest. The fact that the State fails to acknowledge this causes the Court to suspect that the regulation may not be content-neutral. “If there is evidence that an impermissible purpose or justification underpins a facially content-neutral restriction, for instance, that restriction may be content-based.” City of Austin v. Reagan Nat'l Advertising of Austin, LLC, 142 S.Ct. 1464, 1475 (2022).

Having considered both sides' positions on the level of constitutional scrutiny to be applied, the Court tends to agree with NetChoice that the restrictions in Act 689 are subject to strict scrutiny. However, the Court will not reach that conclusion definitively at this early stage in the proceedings and instead will apply intermediate scrutiny, as the State suggests. Under intermediate scrutiny, a law must be “narrowly tailored to serve a significant governmental interest,” Ward, 491 U.S. at 796, which means it must advance that interest without “sweep[ing] too broadly” or chilling more constitutionally protected speech than is necessary, and it must not “raise serious doubts about whether the statute actually serves the state's purported interest” by “leav[ing] [out]” and failing to regulate “significant influences bearing on the interest.” Republican Party of Minn. v. White, 416 F.3d 738, 752 (8th Cir. 2005) (citations omitted and cleaned up).

Since Act 689 clearly serves an important governmental interest, the Court will address whether the Act burdens adults' and/or minors' access to protected speech and whether the Act is narrowly tailored to burden as little speech as possible while effectively serving the State's interest in protecting minors online.

b. Burdens on Adults' Access to Speech

“The right of freedom of speech . . . includes not only the right to utter or to print, but the right to distribute, the right to receive, the right to read and freedom of thought ....” Griswold v. Connecticut, 381 U.S. 479, 482 (1965) (emphasis added and citation omitted). An individual has the “right to read or observe what he pleases,” and that right is “fundamental to our scheme of liberty” and cannot be restricted. Stanley v. Georgia, 394 U.S. 557, 568 (1969). “[T]he State may not, consistently with the spirit of the First Amendment, contract the spectrum of available knowledge.” Griswold, 381 U.S. at 482.

Social media sites are, “for many . . . the principal sources for knowing current events, checking ads for employment, speaking and listening in the modern public square, and otherwise exploring the vast realms of human thought and knowledge.” See Packingham, 582 U.S. at 107. Requiring adult users to produce state-approved documentation to prove their age and/or submit to biometric age-verification testing imposes significant burdens on adult access to constitutionally protected speech and “discourage[s] users from accessing [the regulated] sites.” Reno v. American Civil Liberties Union, 521 U.S. 844, 856 (1997). Age-verification schemes like those contemplated by Act 689 “are not only an additional hassle,” but “they also require that website visitors forgo the anonymity otherwise available on the internet.” Am. Booksellers Found. v. Dean, 342 F.3d 96, 99 (2d Cir. 2003); see also ACLU v. Mukasey, 534 F.3d 181, 197 (3d Cir. 2008) (finding age-verification requirements force users to “relinquish their anonymity to access protected speech”).

Other courts examining similar regulations have found that “[r]equiring Internet users to provide . . . personally identifiable information to access a Web site would significantly deter many users from entering the site, because Internet users are concerned about security on the Internet and because Internet users are afraid of fraud and identity theft on the Internet.” ACLU v. Gonzales, 478 F.Supp.2d 775, 806 (E.D. Pa. 2007); see also PSINET, Inc. v. Chapman, 167 F.Supp.2d 878, 889 (W.D. Va. 2001), aff'd, 362 F.3d 227 (4th Cir. 2004) (“Fear that cyber-criminals may access their [identifying information] . . . may chill the willingness of some adults to participate in the ‘marketplace of ideas' which adult Web site operators provide.”). The Court agrees. It is likely that many adults who otherwise would be interested in becoming account holders on regulated social media platforms will be deterred-and their speech chilled-as a result of the age-verification requirements, which, as Mr. Allen testified, will likely require them to upload official government documents and submit to biometric scans.

c. Burdens on Minors' Access to Speech

Act 689 bars minors from opening accounts on a variety of social media platforms even though those same platforms contain vast quantities of constitutionally protected speech, even as to minors. It follows that Act 689 obviously burdens minors' First Amendment rights. The Supreme Court instructs:

[M]inors are entitled to a significant measure of First Amendment protection, and only in relatively narrow and well-defined circumstances may government bar public dissemination of protected materials to them. No doubt a State possesses legitimate power to protect children from harm, but that does not include a free-floating power to restrict the ideas to which children may be exposed. Speech that is neither obscene as to youths nor subject to some other legitimate proscription cannot be suppressed solely to protect the young from ideas or images that a legislative body thinks unsuitable for them.
Brown v. Entertainment Merchants Ass'n, 564 U.S. 786, 794-95 (2011) (internal quotation marks and citations omitted).

Neither the State's experts nor its secondary sources claim that the majority of content available on the social media platforms regulated by Act 689 is damaging, harmful, or obscene as to minors. And even though the State's goal of internet safety for minors is admirable, “the governmental interest in protecting children does not justify an unnecessarily broad suppression of speech addressed to adults.” Reno, 521 U.S. at 875; see also Brown, 564 U.S. at 804-05 (“Even where the protection of children is the object, the constitutional limits on governmental action apply.”).

d. Act 689 is Not Narrowly Tailored

As described above, Act 689 burdens both adults' and minors' access to constitutionally protected speech. The State asserts an important governmental objective for doing so. To withstand challenge under intermediate scrutiny, then, Act 689 must be narrowly tailored to avoid unduly burdening Arkansans' First Amendment rights.

The Court first considers the Supreme Court's narrow-tailoring analysis in Brown v. Entertainment Merchants Association, which involved a California law prohibiting the sale or rental of violent video games to minors. 564 U.S. at 802. The state “claim[ed] that the Act [was] justified in aid of parental authority: By requiring that the purchase of violent video games [could] be made only by adults, the Act ensure[d] that parents [could] decide what games [were] appropriate.” Id. The Brown Court recognized that the state legislature's goal of “addressing a serious social problem,” namely, minors' exposure to violent images, was “legitimate,” but where First Amendment rights were involved, the Court cautioned that the state's objectives “must be pursued by means that are neither seriously underinclusive nor seriously overinclusive.” Id. at 805. “As a means of protecting children from portrayals of violence, the legislation [was] seriously underinclusive, not only because it exclude[d] portrayals other than video games, but also because it permit[ted] a parental . . . veto.” Id. If the material was indeed “dangerous [and] mind-altering,” the Court explained, it did not make sense to “leave [it] in the hands of children so long as one parent . . . says it's OK.” Id. at 802. Equally, “as a means of assisting concerned parents,” the Court held that the regulation was “seriously overinclusive because it abridge[d] the First Amendment rights of young people whose parents . . . think violent video games are a harmless pastime.” Id. at 805. Put simply, the legislation was not narrowly tailored.

In the end, the Brown Court rejected the argument “that the state has the power to prevent children from hearing or saying anything without their parents' prior consent,” for “[s]uch laws do not enforce parental authority over children's speech and religion; they impose governmental authority, subject only to a parental veto.” 564 U.S. at 795, n.3. “This is not the narrow tailoring to ‘assisting parents' that restriction of First Amendment rights requires.” Id. at 804. The Court also expressed “doubts that punishing third parties for conveying protected speech to children just in case their parents disapprove of that speech is a proper governmental means of aiding parental authority.” Id. at 802. “Accepting that position would largely vitiate the rule that ‘only in relatively narrow and well-defined circumstances may government bar public dissemination of protected materials to [minors].'” Id. (quoting Erznoznik v. City of Jacksonville, 422 U.S. 205, 212-13 (1975)).

The State regulation here, like the one in Brown, is not narrowly tailored to address the harms that the State contends are encountered by minors on social media. The State maintains that Act 689's exemptions are meant to precisely target the platforms that pose the greatest danger to minors online, but the data do not support that claim.

To begin with, the connection between these harms and “social media” is ill-defined by the data. It bears mentioning that the State's secondary sources refer to “social media” in a broad sense, though Act 689 regulates only some social media platforms and exempts many others. For example, YouTube is not regulated by Act 689, yet one of the State's exhibits discussing the dangers minors face on “social media” specifically cites YouTube as being “the most popular online activity among children aged 3-17” and notes that “[a]mong all types of online platforms, YouTube was the most widely used by children ....” See Ofcom, Children and parents: media use and attitudes report 2022, Mar. 30, 2022, https://www.ofcom.org.uk/__data/assets/pdf_file/0024/234609/childrens-media-use-and-attitudes-report-2022.pdf (cited in Declaration of Tony Allen, State's Hearing Exhibit 1, p. 23 n.10).

Likewise, another State exhibit published by the FBI noted that “gaming sites or video chat applications that feel familiar and safe [to minors]” are common places where adult predators engage in financial “sextortion” of minors. See State's Hearing Exhibit 6. However, Act 689 exempts these platforms from compliance. Mr. Allen, the State's expert, criticized the Act for being “very limited in terms of the numbers of organizations that are likely to be caught by it, possibly to the point where you can count them on your fingers....” He then stated that he did not “want to be unkind to the people who drafted [Act 689],” but at least some exempt platforms are ones that adult sexual predators commonly use to communicate with children, including Kik and Kik Messenger, Google Hangouts, and interactive gaming websites and platforms.

The Court asked the State's attorney why Act 689 targets only certain social media companies and not others, and he responded that the General Assembly crafted the Act's definitions and exemptions using the data reported in an article published by the National Center for Missing and Exploited Children (“NCMEC”). See 2022 CyberTipline Reports by Electronic Service Providers (ESP) 1, National Center for Missing & Exploited Children (2023), https://www.missingkids.org/content/dam/missingkids/pdfs/2022-reports-by-esp.pdf (State's Hearing Exhibit 9). This article lists the names of dozens of popular platforms and notes the number of suspected incidents of child sexual exploitation that each self-reported over the past year. The State selected what it considered the most dangerous platforms for children-based on the NCMEC data-and listed those platforms in a table in its brief. See Doc. 34, p. 16.

During the hearing, the Court observed that the data in the NCMEC article lacked context; the article listed raw numbers but did not account for the amount of online traffic and number of users present on each platform. The State's attorney readily agreed, noting that “Facebook probably has the most people on it, so it's going to have the most reports.” But he still opined that the NCMEC data was a sound way to target the most dangerous social media platforms, so “the highest volume [of reports] is probably where the law would be concentrated.”

Frankly, if the State claims Act 689's inclusions and exemptions come from the data in the NCMEC article, it appears the drafters of the Act did not read the article carefully. Act 689 regulates Facebook and Instagram, the platforms with the two highest numbers of reports. But the Act exempts Google, WhatsApp, Omegle, and Snapchat-the sites with the third-, fourth-, fifth-, and sixth-highest numbers of reports. Nextdoor is at the very bottom of NCMEC's list, with only one report of suspected child sexual exploitation all year, yet the State's attorney noted during the hearing that Nextdoor would be subject to regulation under Act 689.

None of the experts and sources cited by the State indicate that risks to minors are greater on platforms that generate more than $100 million annually. Instead, the research suggests that it is the amount of time that a minor spends unsupervised online and the content that he or she encounters there that matters. However, Act 689 does not address time spent on social media; it only deals with account creation. In other words, once a minor receives parental consent to have an account, Act 689 has no bearing on how much time the minor spends online. Using the State's analogy, if a social media platform is like a bar, Act 689 contemplates parents dropping their children off at the bar without ever having to pick them up again. The Act only requires parents to give express permission to create an account on a regulated social media platform once. After that, it does not require parents to utilize content filters or other controls or monitor their children's online experiences-something Mr. Allen believes is the real key to keeping minors safe and mentally well on social media.

The State's brief argues that “requiring a minor to have parental authorization to make a profile on a social media site .... means that many minors will be protected from the well-documented mental health harms present on social media because their parents will have to be involved in their profile creation” and are therefore “more likely to be involved in their minor's online experience.” (Doc. 34, p. 19). But this is just an assumption on the State's part, and there is no evidence of record to show that a parent's involvement in account creation signals an intent to be involved in the child's online experiences thereafter. Mr. Allen testified to that effect in the following colloquy with the Court:

THE COURT: Okay. Let's say that the parental consent is legitimate. The 17-year-old goes to mom or dad and says, “All my friends are on Facebook, I want to be able to communicate with them on Facebook, will you sign this consent, will you provide your driver's license, will you sit down for 10 minutes,” or however long it takes, and mom or dad says, “yes.” Does that automatically mean, just because the parent has given their consent, that the 17-year-old won't surf to content that is harmful to them?
MR. ALLEN: No. It will answer the question that they were asked: “Can I have . . . consent to have an account.” But . . .
it's then down to the company's policies of how it treats users that it knows are under 18 and what material it makes available to them.
THE COURT: So, you are saying that the parents will still have to stay involved in overseeing the content that their minor child views?
MR. ALLEN: They may do .... Those controls can either be web based or device-based and they can be tailored and they can be-some of them are quite advanced in terms of what they will and won't allow you to access. And they can be updated as well by the parents.
THE COURT: Parental controls?
THE WITNESS: Yes.

Finally, the Court concludes that Act 689 is not narrowly tailored to target content harmful to minors. It simply impedes access to content writ large. Consider the differences between Act 689 and the UK's Online Safety Bill. Mr. Allen, who worked on the UK legislation, testified that the UK's main concern was preventing minors from accessing particular content, whereas Arkansas will require age verification at the time of account creation, regardless of the content. It appears the UK's approach is more consistent with Supreme Court precedent than Arkansas's approach. In Packingham, the Court observed that it was possible for a state to “enact specific, narrowly tailored laws” targeted to “conduct that often presages a sexual crime, like contacting a minor or using a website to gather information about a minor”; but it would be unconstitutional for a state to unduly burden adult access to social media. 582 U.S. at 106-07.

Age-verification requirements are more restrictive than policies enabling or encouraging users (or their parents) to control their own access to information, whether through user-installed devices and filters or affirmative requests to third-party companies. “Filters impose selective restrictions on speech at the receiving end, not universal restrictions at the source.” Ashcroft v. ACLU, 542 U.S. 656, 657 (2004). And “[u]nder a filtering regime, adults . . . may gain access to speech they have a right to see without having to identify themselves[.]” Id. Similarly, the State could always “act to encourage the use of filters . . . by parents” to protect minors. Id.; see also United States v. Playboy Entertainment Group, 529 U.S. 803, 809-10, 815 (2000) (finding that voluntary, “targeted blocking” of certain content by viewers “is less restrictive than banning” the same content).

In sum, NetChoice is likely to succeed on the merits of the First Amendment claim it raises on behalf of Arkansas users of member platforms. The State's solution to the very real problems associated with minors' time spent online and access to harmful content on social media is not narrowly tailored. Act 689 is likely to unduly burden adult and minor access to constitutionally protected speech. If the legislature's goal in passing Act 689 was to protect minors from materials or interactions that could harm them online, there is no compelling evidence that the Act will be effective in achieving that goal.

C. Irreparable Harm

Because Act 689 contains terms too vague to be reasonably understood, NetChoice members are likely to suffer irreparable harm if the Act goes into effect. It is unclear which NetChoice members will be subject to regulation, and several terms that are pivotal to NetChoice members' compliance with Act 689 are undefined or subject to multiple interpretations. Separately, Act 689 is likely to abridge the First Amendment rights of users of NetChoice's members' platforms, which will cause those users to suffer irreparable harm. No legal remedy exists to compensate Arkansans for the loss of their First Amendment rights. Nat'l People's Action v. Vill. of Wilmette, 914 F.2d 1008, 1013 (7th Cir. 1990). “Loss of First Amendment freedoms, even for minimal periods of time, constitute[s] irreparable injury.” Ingebretsen v. Jackson Public Sch. Dist., 88 F.3d 274, 280 (5th Cir. 1996) (citing Elrod v. Burns, 427 U.S. 347, 373 (1976)).

D. Balance of the Equities and the Public Interest

When the government opposes the issuance of a preliminary injunction, the final two factors-the balance of the equities and the public interest-merge. See Nken v. Holder, 556 U.S. 418, 435 (2009). The balance of the equities and public interest decidedly favor NetChoice, given the likelihood that Act 689 will infringe the public's First Amendment rights. Act 689 is not targeted to address the harms the State has identified, and further research is necessary before the State may begin to construct a regulation that is narrowly tailored to address the harms that minors face due to prolonged use of certain social media. Age-gating social media platforms for adults and minors does not appear to be an effective approach when, in reality, it is the content on particular platforms that is driving the State's true concerns. The many exemptions in Act 689 all but nullify the State's purposes in passing the Act and ignore the State's expert's view that parental oversight is what is really needed to insulate children from potential harms that lurk on the internet.

V. CONCLUSION

For the reasons explained herein, Plaintiff's Motion for Preliminary Injunction (Doc. 17) is GRANTED. Act 689 of 2023, the “Social Media Safety Act,” is PRELIMINARILY ENJOINED under Federal Rule of Civil Procedure 65(a), pending final disposition of the issues on the merits.

IT IS SO ORDERED.

