Indiana Law Journal
Volume 99 Issue 4 Article 2
2024
Content Moderation Regulation as Legal Role-Scripting
Sari Mazzurco
SMU Dedman School of Law
Follow this and additional works at: https://www.repository.law.indiana.edu/ilj
Part of the Internet Law Commons
Recommended Citation
Mazzurco, Sari (2024) "Content Moderation Regulation as Legal Role-Scripting," Indiana Law Journal: Vol. 99: Iss. 4, Article 2.
Available at: https://www.repository.law.indiana.edu/ilj/vol99/iss4/2
This Article is brought to you for free and open access by
the Maurer Law Journals at Digital Repository @ Maurer
Law. It has been accepted for inclusion in Indiana Law
Journal by an authorized editor of Digital Repository @
Maurer Law. For more information, please contact
Content Moderation Regulation as Legal Role-Scripting
SARI MAZZURCO*
Lawmakers and scholars concerned with content moderation regulation typically
appeal to “analogies” to justify or undermine different forms of regulation. The logic
goes: law should afford individuals due process rights against speech platforms
because speech platforms are “like” speech governors as a matter of objective
reality. Other common analogies include common carriers, publishers, distributors,
shopping malls, and bookstores.
Commentators attempt to invoke social roles to understand what the content
moderation relationship is, what behaviors are “right” and “wrong” within it, and
how law should police behavioral deviations. But they do so without relying on
foundational sociology theory that explains what social roles are, what they do, and
how they come to be. Without this theoretical foundation, the discourse incompletely
portrays the project of content moderation regulation. Content moderation
regulations do not simply “take” speech platforms’ role as it currently exists––they
will also “make” speech platforms’ role by expressing that speech platforms should
be speech governors, common carriers, publishers, or something else, based on how
lawmakers choose to regulate.
This Article is the first to introduce role theory into the content moderation
discourse. Content moderation regulations are poised to define the basic contours
of what it means to be a “speech platform” because the role remains unsettled.
Earlier, the Communications Decency Act failed to articulate coherent roles within
the content moderation relationship. But current content moderation regulatory
reforms––including the PACT Act in Congress as well as state platform common
carriage laws and their judicial review––have a renewed opportunity to script social
roles for speech platforms and individuals. Foregrounding these reforms’ role
scripts directs attention to urgent questions about whether they are likely to produce
a desirable content moderation relationship and an online speech ecosystem that
meets the public’s needs.
* Assistant Professor of Law, SMU Dedman School of Law. The author thanks for their
generous commentary Robert Post, Jack Balkin, Amy Kapczynski, Mark Lemley, Al
Klevorick, Vincent Mazzurco, Judge Thomas Ambro, and Spencer Livingstone. This Article
has benefitted from helpful feedback from participants in Cardozo Law School’s Intellectual
Property Scholars Conference, the Media Law & Policy Scholars Conference at The
University of Texas at Austin School of Law, the Works in Progress for Intellectual Property
Scholars Colloquium hosted by the University of Missouri School of Law and Saint Louis
University School of Law, the University of Michigan Law School Junior Scholars
Conference, the Yale Law School Freedom of Expression Scholars Conference, and the Yale
Information Society Project Writing Workshop. The views expressed in this Article are the
author’s own.
INTRODUCTION ..................................................................................................... 1132
I. SOCIAL ROLE AND CONTENT MODERATION REGULATION: THE PRESUMPTION OF
LINEARITY .................................................................................................... 1137
A. PRIMER ON CONTENT MODERATION ...................................................... 1137
B. SCHOLARLY DISCOURSE'S QUASI-ROLE LENS ....................... 1138
C. THE MUTUAL CONSTITUTION OF LAW AND SOCIAL ROLES ................... 1141
II. TURNING THE LINE INTO A LOOP: LAW AND SOCIAL ROLES ON THE PLATFORM
INTERNET ...................................................................................................... 1142
A. PRIMER ON SOCIAL ROLES ..................................................................... 1143
B. LAW'S ROLE-SCRIPTING FUNCTION ....................................... 1145
III. CDA SECTION 230 AS A LEGAL ROLE-SCRIPTING FAILURE .......................... 1150
A. SECTION 230 AS WRITTEN AND INTENDED ............................................ 1151
B. WHOM DOES SECTION 230 REGULATE AND WHOM DOES IT SERVE? .... 1152
C. ROLE EQUIVOCATION, ROLE CONFUSION, AND CONTENT MODERATION AS
"BUSINESS PRACTICE" ......................................................................... 1155
D. GONZALEZ V. GOOGLE'S RESIDUAL UNCERTAINTY ON SPEECH PLATFORMS AS
PUBLISHERS .................................................................................... 1159
IV. CONTENT MODERATION REGULATIONS' ROLE SCRIPTS ................................ 1162
A. THE PACT ACT & THE PROTECT SPEECH ACT: SPEECH PLATFORMS AS
SPEECH GOVERNORS ........................................................................... 1163
B. TEXAS & FLORIDA PLATFORM COMMON CARRIAGE LAWS: SPEECH
PLATFORMS AS COMMON CARRIERS (MAYBE?) .................................. 1167
C. THE NETCHOICE CASES: SPEECH PLATFORMS AS CABLE OPERATORS .. 1172
CONCLUSION ........................................................................................................ 1175
INTRODUCTION
Months before Elon Musk purchased Twitter, he made his intentions clear. He
declared that he was a "free-speech absolutist" who would "'set Twitter free.'"1 He
regarded Twitter as “the digital town square where matters vital to the future of
humanity are debated” and stated he was “against censorship that goes far beyond
the law."2 But beginning shortly after the deal closed, Musk began making Twitter
content moderation decisions that went far beyond the law. He suspended Kanye
West’s account after West posted an image of a swastika combined with a Star of
David;3 he suspended a number of journalists from the Washington Post, New York
1. Khalid Albaih, Elon Musk, the Social Media Autocrat, AL JAZEERA (Jan. 1, 2023),
https://www.aljazeera.com/opinions/2023/1/1/2022-gave-us-a-new-social-media-autocrat
[https://perma.cc/TE9S-4ERC].
2. Maria Pasquini, Twitter Faces User Exodus After Elon Musk Deal––But Some High-
Profile Accounts Are Rapidly Gaining Followers, PEOPLE (Apr. 27, 2022, 2:43 PM),
https://people.com/human-interest/twitter-faces-user-exodus-after-elon-musk-deal-but-some-
accounts-are-gaining-followers/ [https://perma.cc/XQG3-62RT].
3. Rachel Lerman, Cat Zakrzewski & Ellen Francis, Elon Musk Says Kanye West
Suspended from Twitter After Swastika Tweet, WASH. POST (Dec. 2, 2022, 1:19 PM),
https://www.washingtonpost.com/technology/2022/12/02/kanye-west-twitter-suspended-
elon-musk/ [https://perma.cc/FX4D-LYGW].
Times, and CNN for disclosing the location of Musk’s private jet (which is otherwise
publicly available information);4 and he announced he would take down accounts
"created solely" to promote competing social networks.5 More recently, he stated,
“New Twitter policy is to follow the science, which necessarily includes reasoned
questioning of the science."6 At many of these junctures, Musk took informal "polls"
of his followers, asking whether he should reinstate banned accounts or make other
changes.7
On December 18, 2022, Musk again polled users, this time to ask whether
he should step down as CEO, and pledged to abide by the outcome.8 Fifty-seven
percent of votes cast favored Musk stepping down,9 but days later Musk said, "No
one wants the job who can actually keep Twitter alive. There is no successor."10
From the time Musk took Twitter over, he seemed to regard his role as a sort of
speech governor: one who exerts unilateral control over the public discourse that
flows through the speech platform, one who would be as responsive to Twitter users’
will as he pleased.11 Users and journalists also seemed to relate to Musk's content
moderation as a form of governance. Since his takeover, Insider Intelligence
predicted an “exodus” from Twitter by 2024,
12
and MIT Technology Review
reported Twitter might have lost a million users and counting.
13
Newsweek puzzled
4. Kurt Wagner, Davey Alba & Vlad Savov, Twitter Suspends Journalists Who Musk
Says Imperiled His Safety, BLOOMBERG (Dec. 16, 2022, 12:30 AM),
https://www.bloomberg.com/news/articles/2022-12-16/twitter-suspends-accounts-of-
mastodon-journalists-covering-musk [https://perma.cc/P6P4-YN9E].
5. Susanne Barton, Twitter Will Remove Accounts That Link to Other Social Media,
BLOOMBERG (Dec. 18, 2022, 12:56 PM), https://www.bloomberg.com/news/articles/2022-12-
18/twitter-will-remove-accounts-that-link-to-other-social-media [https://perma.cc/ZT8T-
WKT8].
6. Heather Hamilton, ‘Reasoned Questioning’ of Science: Musk’s New Twitter Policy
Met with Praise and Disdain, WASH. EXAM'R (Dec. 29, 2022, 10:41 AM),
https://www.washingtonexaminer.com/news/reasoned-questioning-science-elon-musk-new-
twitter-policy [https://perma.cc/T27U-S9HA].
7. Lerman et al., supra note 3.
8. Chantal Da Silva, Twitter Users Vote for Elon Musk to Step Down as CEO in Poll He
Launched, NBC NEWS (Dec. 19, 2022, 7:53 AM), https://www.nbcnews.com/tech/social-
media/twitter-users-vote-elon-musk-quit-ceo-poll-rcna62332 [https://perma.cc/LR7H-J7Q4].
9. Id.
10. Edward Ludlow, Elon Musk Says ‘No One Wants’ Top Twitter Job, But Some People
Raise Their Hands, BLOOMBERG (Dec. 19, 2022, 9:57 AM),
https://www.bloomberg.com/news/articles/2022-12-19/musk-says-no-one-wants-twitter-top-
job-but-some-people-pipe-up [https://perma.cc/RR2U-NSMY].
11. See supra notes 1–10 and accompanying text.
12. See Rob Pegoraro, Analysts Predict 'Exodus' of Twitter Users by 2024, PCMAG. (Dec. 13, 2022), https://www.pcmag.com/news/analysts-predict-exodus-of-twitter-users-by-
2024 [https://perma.cc/7L3B-WUMX].
13. See Chris Stokel-Walker, Twitter May Have Lost More Than a Million Users Since
Elon Musk Took Over, MIT TECH. REV. (Nov. 3, 2022),
https://www.technologyreview.com/2022/11/03/1062752/twitter-may-have-lost-more-than-
a-million-users-since-elon-musk-took-over/ [https://perma.cc/2P4Z-6XD7] (“The firm Bot
Sentinel, which tracks inauthentic behavior on Twitter by analyzing more than 3.1 million
accounts and their activity daily, believes that around 877,000 accounts were deactivated and
a further 497,000 were suspended between October 27 and November 1.”).
over whether Musk was a “savior come to rescue free speech . . . or [an] authoritarian
figure . . . who shuts down the free speech of his critics."14 Market researchers
suggested the #TwitterMigration was an act of “protest” in response to Musk’s
content moderation decisions.15
Twitter users are not alone in relating to content moderation as a form of
governance. Digital ethnographers studying Twitch (a speech platform concerned
primarily with video game commentary) and Reddit also document banned users’
experience as one of governance.16 In 2009, Facebook conducted its own experiment
with public participation in platform rulemaking, inviting users to vote on certain
policy changes.17 Ultimately, it declined to follow the vote's outcome.18
Though speech governance has become a centerpiece in discourse on content
moderation,19 it is one of many narratives percolating among the public, the press,
legal scholars, and lawmakers. Others have examined whether speech platforms
might be like common carriers,20 publishers,21 distributors,22 shopping malls,23 and
other entities,24 or whether they are simply businesses, "profit-driven entities that
14. Katherine Brodsky, Elon Musk Is the New Trump: You Either Love Him or Hate Him,
NEWSWEEK (Dec. 27, 2022, 5:37 PM), https://www.newsweek.com/elon-musk-new-trump-
you-either-love-him-hate-him-opinion-1769808 [https://perma.cc/6ZTQ-EV36].
15. Tim Chambers, A Snapshot of the X (Ex-Twitter) Migration, at 12, 1014, Dewey
Square Group (2023), http://www.deweysquare.com/wp-content/uploads/2023/10/DSG-
Snapshot-of-the-XTwitter-Migration-2023-Q3.pdf [https://perma.cc/M2TE-4XSC]
(describing the link between Musk’s tweet-based content decisions and “#twittermigration”
from Twitter to other platforms); see also Stokel-Walker, supra note 13.
16. See Hibby Thach, Samuel Mayworm, Daniel Delmonaco & Oliver Haimson,
(In)visible Moderation: A Digital Ethnography of Marginalized Users and Content
Moderation on Twitch and Reddit, NEW MEDIA & SOC'Y 1, 16 (July 18, 2022),
https://doi.org/10.1177/14614448221109804 [https://perma.cc/JNE8-P2TL].
17. See Facebook Opens Governance of Service and Policy Process to Users, FACEBOOK
(Feb. 26, 2009), https://about.fb.com/news/2009/02/facebook-opens-governance-of-
service-and-policy-process-to-users [https://perma.cc/EV54-VXZQ]; see also Sari Mazzurco,
Democratizing Platform Privacy, 31 FORDHAM INTELL. PROP. MEDIA & ENT. L.J. 792, 819–20 (2021).
18. See Mazzurco, supra note 17, at 820.
19. See Evelyn Douek, Content Moderation as Systems Thinking, 136 HARV. L. REV. 526, 535–39 (2022); Evelyn Douek, Governing Online Speech: From "Posts-As-Trumps" to Proportionality and Probability, 121 COLUM. L. REV. 759, 769–76 (2021); Aileen Nielsen, The Rights and Wrongs of Folk Beliefs About Speech: Implications for Content Moderation, 27 UCLA J.L. & TECH. 118, 121–30 (2022); Hannah Bloch-Wehba, Global Platform Governance: Private Power in the Shadow of the State, 72 SMU L. REV. 27, 33–42 (2019).
20. See Adam Candeub, Reading Section 230 As Written, 1 J. FREE SPEECH L. 139, 146–47 (2021); Christopher S. Yoo, The First Amendment, Common Carriers, and Public Accommodations: Net Neutrality, Digital Platforms, and Privacy, 1 J. FREE SPEECH L. 463, 500–04 (2021); Eugene Volokh, Treating Social Media Platforms Like Common Carriers?, 1 J. FREE SPEECH L. 377, 390–94 (2021).
21. See Volokh, supra note 20, at 403–07.
22. See id.
23. See id. at 415.
24. See id. at 415–27; Olivier Sylvain, Intermediary Design Duties, 50 CONN. L. REV. 203, 213, 234–35 (2018) (examining a "Good Samaritan" analogy).
moderate because it is in their business interests."25 Commentators assert they are
drawing analogies to figure out how law should regulate speech platforms, based on
whether platforms satisfy criteria that make them sufficiently “like” these other
entities.
Without acknowledging as much, commentators are in fact invoking social roles
of governor, common carrier, and publisher to understand what the content
moderation relationship is, what behaviors are “right” and “wrong” within it, and
how law should police deviations from behavioral expectations. But they are doing
so without building on foundational sociology theory on what social roles are, the
function they serve, and how they come to be. This Article contends that, without this
theoretical foundation, the current discourse incompletely represents the project of
content moderation regulation. Content moderation regulations express that speech
platforms should be speech governors, common carriers, publishers, or something
else, based on how lawmakers choose to regulate. In other words, content moderation
regulation scripts speech platforms’ social roles.
This Article builds this essential theoretical architecture by introducing sociology
literature on social roles and legal scholarship on law and roles to the content
moderation discourse. Social roles typically help people navigate uncertain
interactions. Individuals’ roles, like priest or superintendent, and organizations’
roles, like school and military, make interactions meaningful by communicating
societal expectations about participants’ appropriate behaviors, interests,
characteristics, and values.26 In emerging relationships––such as the content
moderation relationship between speech platforms and the public––roles can be
uncertain, causing confusion about how participants should treat one another.27 To
decry that Twitter banned certain journalists (per Elon Musk’s orders) implies it
moderated content inappropriately. But whether it acted inappropriately depends
largely on how one understands Twitter’s social role, whether it’s a business, speech
governor, common carrier, or something else. Current regulatory reforms will supply
crucial content here. Law will help define what it means to be a “good” speech
platform through the rights, duties, behavioral constraints, and entitlements it assigns
to those who meet particular attributes.
Content moderation regulation is at a critical juncture. Texas and Florida
promulgated laws purporting to regulate speech platforms as common carriers.28 The
Fifth Circuit and Eleventh Circuit have reached competing holdings about the
constitutionality of regulating speech platforms as common carriers, based on
divergent perspectives on whether speech platforms exercise “editorial judgment”
the First Amendment protects.29 Congress is also considering whether to hold speech
25. Douek, Content Moderation as Systems Thinking, supra note 19, at 559 (emphasis
omitted).
26. RALF DAHRENDORF, ESSAYS IN THE THEORY OF SOCIETY 36 (1968); NEAL GROSS ET AL., EXPLORATIONS IN ROLE ANALYSIS 63 (1958).
27. See infra Part II.A.
28. H.B. 20, 87th Leg., 1st Spec. Sess. (Tex. 2021); S.B. 7072, 2021 Leg., Reg. Sess. (Fla.
2021).
29. See NetChoice, LLC v. Paxton, 49 F.4th 439, 445 (5th Cir. 2022) (reasoning “editorial
discretion” is not itself First Amendment-protected speech and disregarding the platforms’
claims they engage in protected “editorial discretion”); NetChoice, LLC v. Att’y Gen. Fla., 34
platforms accountable as governors when they take down user content.30 That is to
say, at least five distinct legal authorities are poised to define speech platforms’
social role––potentially in tension with one another.
This Article draws insights from social role theory to help characterize the current
disordered socio-legal landscape and illuminate some repercussions likely to flow
from lawmakers’ regulatory decisions. It identifies Section 230 of the
Communications Decency Act (CDA)31 as a legal role-scripting failure that led
courts to immunize an ever-widening scope of platforms’ conduct and left scholars
to puzzle over what exactly speech platforms are for the purpose of law. The sheer
number of social roles courts and scholars invoke to define “good” and “bad” content
moderation exemplifies the CDA’s failure to define the role-relationship.
Current content moderation regulatory reforms are another exercise in legal role-
scripting. Lawmakers’ choices to regulate speech platforms as speech governors,
common carriers, or publishers will guide platforms’ and individuals’ behavior,
address or neglect social harm, and support or undermine possible reforms in quite
different ways. For instance, the Eleventh Circuit’s characterization of speech
platforms like cable operators suggests that when Musk banned journalists from
Twitter, he did nothing wrong.32 It is reasonable for publishers to make unilateral
decisions about what content they will and won’t publish. Directing attention to these
regulations’ role constructions channels debate to urgent questions about whether
they establish content moderation relationships and an online speech ecosystem that
meet the public’s needs.
This Article proceeds in four parts. Part I delves into the discourse on content
moderation, placing special focus on how legal scholars and courts commonly draw
role-based analogies to justify or undermine different forms of content moderation
regulation. Part II introduces sociological theory on social roles and presents law’s
role-scripting function. Part III relies on the theoretical architecture Part II builds to
identify Section 230 of the CDA as a failure of legal role-scripting that led to current
uncertainty about what role speech platforms play when they moderate content. Part
IV presents two current content moderation regulatory reforms as projects of legal
role-scripting. The Platform Accountability and Consumer Transparency (PACT)
Act and Protect Speech Act in Congress and Texas and Florida common carriage
laws (as well as courts reviewing those state laws) each construct different role-
relationships for platforms and people. Part IV enriches discourse on content
moderation regulation by examining the social meaning and legal pathways these
role constructions sustain.
F.4th 1196, 1209–10 (11th Cir. 2022) (holding platforms' content moderation is "editorial
discretion” protected under the First Amendment).
30. See Protect Speech Act, H.R. 3827, 117th Cong. (2021); PACT Act, S. 4066, 116th
Cong. (2020).
31. 47 U.S.C. § 230 (1996).
32. NetChoice v. Att’y Gen. Fla., 34 F.4th at 1203.
I. SOCIAL ROLE AND CONTENT MODERATION REGULATION: THE PRESUMPTION OF
LINEARITY
In the current discourse on content moderation regulation, scholars and courts ask
what speech platforms are “like” when they moderate content: Speech governors?
Common carriers? Publishers? Or something else entirely? They attempt to fit
speech platforms into these categories to justify or undermine the imposition of
different sorts of legal rules to regulate content moderation. They do so typically by
examining whether speech platforms meet a set of objective and fixed criteria that
make them sufficiently “like” one of these other entities. When speech platforms fail
to meet enough of these criteria, they are deemed too different to merit that particular
form of regulation.
This Part presents the current discourse on content moderation, with a particular
focus on this typical, but incomplete way scholars and courts have engaged with
social roles. Commentators suggest the relationship between roles and law is linear,
with speech platforms’ role (as it exists in a sort of empirical reality) either
supporting or undermining the application of a particular form of law. But this is
only one side of the story: roles and law are in a feedback relationship, with law also
shaping what it means to be a speech platform. This Article asserts that side of the
feedback relationship is the core project of content moderation regulation––legal
role-scripting––though it has, so far, slipped under the radar.
A. Primer on Content Moderation
Professor Evelyn Douek gives a comprehensive account of content moderation.
At its most succinct, content moderation is “platforms’ systems and rules that
determine how they treat user-generated content on their services."33 Just below the
surface lies a web of complexity. She explains content moderation involves far more
than speech platforms’ decisions to take down particular pieces of content users
already posted. Content moderation also includes:
[I]ncreased reliance on automated moderation; sticking labels on posts;
partnerships with fact-checkers; . . . adding friction to how users share
content; giving users affordances to control their own online experience;
looking . . . to how users behave online to determine what should be
removed; and tinkering with the underlying dynamics of the very
platforms themselves.34
Professor Olivier Sylvain explains platforms “design [their] application[s] to
elicit or shape user content or, conversely, ensure that certain kinds of content never
see the light of day."35 Professor Danielle Citron and Professor Neil Richards add
that often, platforms “tailor people’s online experiences,” including the content they
see, “based on fine-grained surveillance” about them.
36
An expansive range of
33. Douek, Content Moderation as Systems Thinking, supra note 19, at 528 n.2.
34. Id. at 531 (emphasis omitted).
35. Sylvain, supra note 24, at 224.
36. Danielle Keats Citron & Neil M. Richards, Four Principles for Digital Expression
speech platforms engage in content moderation. Beyond Facebook, Google, and
Twitter, platforms like Tinder,37 Reddit,38 Wikipedia,39 Dropbox,40 Amazon,41
Yelp,42 and even the New York Times43 and the Washington Post44 in their comments
sections engage in content moderation.45
Citron and Richards explain, on the whole, content moderation entails “[p]rivate
power over digital expression” that “should be paired with responsibility to the
public."46 But what, exactly, should those responsibilities be? As it turns out, that
question is at the center of debates on content moderation regulation, with
commentators typically justifying or undermining particular responsibilities based
on whether speech platforms are sufficiently “like” certain other entities.
B. Scholarly Discourse’s Quasi-Role Lens
As in the "Muskification" of Twitter, speech governance has become a dominant
narrative in discourse on content moderation regulation. Professor Hannah Bloch-
Wehba puts the logical connection succinctly: “When Internet platforms engage in
policing and moderating content online, they engage in a form of private
governance,” albeit one that “suffer[s] from a ‘democratic deficit.’”
47
“Like
administrative agencies, platforms set rules by promulgating internal regulations
about user speech. . . . [A]s the number of disputes . . . grows . . . platforms
increasingly create adjudicatory and appellate mechanisms,"48 like Facebook's
(You Won’t Believe #3!), 95 WASH. U. L. REV. 1353, 1356 (2018).
37. Community Guidelines, TINDER, https://policies.tinder.com/community-
guidelines/intl/en/ [https://perma.cc/QR3X-8EYD].
38. Reddit Content Policy, REDDIT, https://www.redditinc.com/policies/content-policy
[https://perma.cc/TQ2T-VUPQ].
39. Wikipedia: Automated Moderation, WIKIPEDIA,
https://en.wikipedia.org/wiki/Wikipedia:Automated_moderation [https://perma.cc/K9UL-
DVHY].
40. Dropbox Acceptable Use Policy, DROPBOX,
https://www.dropbox.com/acceptable_use [https://perma.cc/5GB8-L4PH].
41. Community Guidelines, AMAZON.COM,
https://www.amazon.com/gp/help/customer/display.html?nodeId=GLHXEX85MENUE4XF
[https://perma.cc/HE97-S2CL].
42. Content Guidelines, YELP, https://www.yelp.com/guidelines
[https://perma.cc/6LXW-6VXP].
43. The Comments Section, N.Y. TIMES, https://help.nytimes.com/hc/en-
us/articles/115014792387-The-Comments-Section [https://perma.cc/39VS-3ZQ6].
44. Discussion and Submission Guidelines, WASH. POST (Nov. 23, 2021, 12:00 PM),
https://www.washingtonpost.com/discussions/2021/11/23/discussion-submission-guidelines/
[https://perma.cc/WTD4-U6EN].
45. See Enrique Armijo, Reasonableness as Censorship: Section 230 Reform, Content
Moderation, and The First Amendment, 73 FLA. L. REV. 1199, 1212 (2021) (explaining that
numerous entities not commonly regarded as platforms also host user content and engage in
content moderation to varying degrees).
46. Citron et al., supra note 36, at 1383.
47. Bloch-Wehba, supra note 19, at 33.
48. Id. at 29–30.
Oversight Board.49 Law should then "ensur[e] democratic accountability and
legitimacy," through "the application of administrative law principles and values to
hold platforms to account."50
A number of other scholars draw similar analogies to
justify the promulgation of public-law-type content moderation regulations. Douek
similarly makes a connection to administrative law.51 Professor Rory Van Loo
suggests a likeness to the judiciary, calling for legally mandated platform
procedure.52
On the other side of the coin, numerous scholars (including some in favor of these
reforms) point out where these analogies break down. Professor Kyle Langvardt
argues the governance analogy is inappropriate because speech platforms aren’t
chosen through democratic processes.53 Douek raises that the scale and speed of
content moderation make the implementation of additional individual procedural
rights infeasible.54 Douek also asserts that "[p]latforms are businesses [that] will
prioritize their bottom line"55 and content moderation is the "commodity [they]
offer" because it is in their "business interest."56 Douek seems to suggest platforms
are not governors and should not be required to satisfy individual rights because they
are actually, as a matter of empirical reality, businesses. If content moderation
regulation must take speech platforms as businesses, any perception of a democratic
deficit is mistaken and an inappropriate driver for law. By extension, both individual
rights and administrative law remedies would be misplaced.
The next most prominent narrative is common carriage. Professor Christopher
Yoo provides the most comprehensive account of what it means to be a common
carrier and whether speech platforms meet those criteria.57 He examines Justice
Thomas’s opinion in Biden v. Knight First Amendment Institute,
58
which represents
“the types of considerations that have historically been used to define common
carriers."59 These include: [M]arket power, whether an industry is "affected with the
public interest," whether the entity regulated is part of the transportation or
communications industry, whether it receives countervailing benefits from the
government, and whether the actor holds itself out as providing service to all.60 He
concludes that each of these criteria has been applied inconsistently, fails to
determine an entity’s status as a common carrier, or is easily evaded. Ultimately, he
49. See Rory Van Loo, Federal Rules of Platform Procedure, 88 U. CHI. L. REV. 829, 870
(2021) (recommending that Congress require large platforms to have external oversight
boards, like Facebook’s Oversight Board).
50. Bloch-Wehba, supra note 19, at 33.
51. See Douek, Content Moderation as Systems Thinking, supra note 19, at 532–33.
52. See Van Loo, supra note 49, at 833.
53. See Kyle Langvardt, Regulating Online Content Moderation, 106 GEO. L.J. 1353,
1358 (2018).
54. See Douek, Content Moderation as Systems Thinking, supra note 19, at 532.
55. Id. at 589.
56. Douek, Governing Online Speech: From “Posts-As-Trumps” to Proportionality and
Probability, supra note 19, at 768, 806.
57. See generally Yoo, supra note 20.
58. Id. at 465 (citing 141 S. Ct. 1220, 1222–23 (2021) (Thomas, J., concurring)).
59. Id.
60. Id. at 465–66.
finds speech platforms engage in too much editorial discretion when it comes to
content moderation, such that they don’t hold themselves out as serving all members
of the public.61 Professor Eugene Volokh, on the other hand, proposes platforms
might be close enough to common carriers when they host user content.62 He argues
advocacy groups depend on speech platforms to communicate with the public, and
“denying a group a vastly important means of public communication is a serious
burden."63 He also suggests, much like e-mail services, users neither want nor expect
platforms to screen what a user posts to their own page based on the viewpoint
expressed.64
Part IV.B discusses, in greater detail, the Eleventh Circuit and Fifth Circuit
decisions on Florida and Texas’s platform common carriage laws. The courts’
opinions also wrestle with whether platforms are sufficiently “like” common carriers
to justify their regulation as such. The Eleventh Circuit found platforms are not like
common carriers because they aren’t “dumb pipes” that simply transmit data from
point A to point B.65 Rather, it reasoned platforms exercise "editorial judgment"
protected by the First Amendment.66 The Fifth Circuit, by contrast, found speech
platforms' content moderation practices implicate the public interest.67 It reasoned
that the platforms host public discourse and they hold themselves out to serve the
public by “permit[ting] any adult to make an account and transmit expression after
agreeing to the same boilerplate terms of service."68
The third most prominent narrative in content moderation discourse concerns
whether speech platforms engage in “editorial discretion” (also called “editorial
judgment”)
69
such that content moderation is itself a form of expression the First
Amendment protects from government regulation. Rather than focus on a single
analogy like democratic government or common carriage, scholars and courts
instead assess whether speech platforms exercise editorial discretion by analogizing
to and distinguishing from a number of differently regulated entities, including
newspapers, broadcasters, editors, publishers, book stores, law schools, malls, and
parades.70 This is, in a sense, an outgrowth of the case method all law students
learn in their first year. Legal reasoning and argument typically rely on compelling
analogies or disanalogies to prior precedent.
What’s notable, here, is scholars’ and courts’ analogies center on speech
platforms’ likeness to a particular set of entities, as opposed to other facts. For
example, Professor Volokh engages in a rigorous analysis of whether platforms are
sufficiently “like” any of these entities to warrant similar First Amendment treatment
by examining whether they share distinctive features.71 He concludes, for instance,
61. See id. at 465–75.
62. See Volokh, supra note 20, at 408.
63. Id. at 390.
64. See id. at 386.
65. NetChoice, LLC v. Att’y Gen., Fla., 34 F.4th 1196, 1204 (11th Cir. 2022).
66. See id. at 1209.
67. NetChoice, LLC v. Paxton, 49 F.4th 439, 473 (5th Cir. 2022).
68. Id. at 474.
69. See NetChoice v. Att’y Gen., Fla., 34 F.4th at 1203.
70. See supra notes 19–25 and accompanying text.
71. See Volokh, supra note 20, at 403, 415–24.
that platforms’ hosting function makes them unlike newspapers or broadcasters
because: (1) platforms don’t have the same time or space constraints when it comes
to the content they make available; (2) users don’t rely on platforms (as they do
newspapers and broadcasters) to exclude material they find offensive or useless; and
(3) platforms aren't a coherent and consistent speech product.72 Using a similar form
of reasoning, the Eleventh Circuit found:
Just as the parade organizer exercises editorial judgment when it refuses
to include in its lineup groups with whose messages it disagrees, and just
as a cable operator might refuse to carry a channel that produces content
it prefers not to disseminate, social-media platforms regularly make
choices ‘not to propound a particular point of view.’
73
In effect, the court held platforms merit the same First Amendment protection as
parade organizers and cable operators because they are sufficiently “like” those
entities.
C. The Mutual Constitution of Law and Social Roles
Bloch-Wehba, arguing content moderation regulation should draw from
administrative law, asserts “rather than debating about the nature of platforms’
power, the right question is whether the exercise of that power is legitimate . . . ."74
But, whether the power platforms exert when they moderate content is legitimate
depends quite a lot on the role one perceives the platform is playing. The scholarly
discourse suggests that platforms’ role is unsettled, both as a matter of social
understanding and for the purposes of law.
This Part presents the scholarly debate on content moderation regulation to
highlight a key, common oversight: commentators in the academy and the courts
presume the relationship between speech platforms’ social role and law is static and
linear. The argument goes, speech platforms are like X, and so law is justified
regulating them as though they are X. (Or, inversely, speech platforms are not like
X, and so law isn’t justified regulating them like X.)
Much effort is placed on determining whether speech platforms are like governors,
common carriers, newspapers, etc. by connecting the "empirical realities"75 of how
platforms behave to objective, fixed criteria that make any entity a governor,
common carrier, newspaper, etc.
72. See id. at 404–05.
73. NetChoice v. Att’y Gen., Fla., 34 F.4th at 1213 (quoting Hurley v. Irish-American
Gay, Lesbian and Bisexual Group of Boston, 515 U.S. 557, 575 (1995)).
74. Bloch-Wehba, supra note 19, at 68.
75. Douek, Governing Online Speech: From “Posts-As-Trumps” to Proportionality and
Probability, supra note 19, at 767.
[Figure: a one-way arrow from "Social roles" to "Law"]
Commentators are attempting to approach content moderation regulation from a
social role lens, but without relying on a theoretical architecture that lays out what
roles are, the function they serve in social life, and their mutually constitutive
relationship with law.76
As the next Part explains in greater detail, social roles are
not lists of essential physical, technical, or economic criteria law responds to when
making regulatory decisions. Rather, roles are a shorthand for the expectations
society holds for entities’ appropriate behavior, values, interests, and attributes in
various contexts. Roles are constantly in flux, open to contest, and partly constituted
by law. That is to say, law will influence what it means to be a speech platform based
on how lawmakers decide to regulate content moderation, just as law continually
shapes what it means to be a governor, common carrier, newspaper, etc. by how it
characterizes and regulates those roles.
This theoretical reorientation suggests that, rather than focus on crafting the perfect
analogy for law as a matter of observing objective reality, scholars and lawmakers
ought to pay more attention to whether content moderation regulation sets a
desirable expectation of what speech platforms should be like.
II. TURNING THE LINE INTO A LOOP: LAW AND SOCIAL ROLES ON THE PLATFORM INTERNET
This Part constructs the theoretical architecture on which the remainder of the
Article rests. It draws insights from sociology theory on social roleswhat social
roles are, the function they serve, and how they come to beto fill a gap in the
current discourse on content moderation regulation. Social roles supply societal
expectations about an entity’s appropriate attributes, interests, values, and behaviors
76. Cf. Sylvain, supra note 24, at 277. Professor Olivier Sylvain provides an exceptional account of how the CDA failed to deliver on its ostensible aim to motivate interactive computer services to act as Good Samaritans––and receive concomitant immunity from liability for any harm done in the course of rendering aid. Id. at 213. Professor Sylvain recommends that amending Section 230(c)(1)'s liability shield to cover only providers' "voluntar[y] act[ions] in good faith as Good Samaritans" might motivate speech platforms to act as Good Samaritans. Id. at 277, 214. Without drawing an overt connection to role theory, Professor Sylvain recognizes that law's treatment of its covered entity (i.e., interactive computer services) has the capacity to shape speech platforms' role, as a Good Samaritan or something else entirely.
[Figure: a feedback loop between "Social roles" and "Law"]
in a particular relationship context. Law often contributes to these expectations based
on how it defines roles and regulates entities that meet definitional parameters. Law
is especially influential when social roles are otherwise unsettled or uncertain, as is
the case for speech platforms. Placing focus on this side of the feedback looplaw’s
role-scripting functionsuggests content moderation regulations do not simply
“take” speech platforms’ role (and regulate on that basis) but they “make” speech
platforms’ role based on how they regulate content moderation.
A. Primer on Social Roles
As individuals navigate their daily lives, they interact with others who occupy any
number of social roles: for instance, the teachers who educate their children, the
landlords who collect their rent, or the baristas who make their lattes. For
sociologists, social roles are a “building block” of social reality because they make
everyday social interactions meaningful and relatively predictable.77 Social roles
serve as a kind of shorthand for a set of expectations society holds for actors’
appropriate behavior, values, interests, and attributes in various contexts.78
Though the concept of social roles may seem abstract, in practice, social roles are
the lenses through which individuals see the world.79 One might expect teachers to
help students learn different subjects, but not give them medical advice. It’s
reasonable to expect teachers to have achieved a high level of education in the field
of education, and not be registered sex offenders. These expectations––also called
norms––help individuals evaluate the meaning of others' actions based on whether
they conform or diverge.80 If a teacher does not meet these expectations, it's
reasonable to regard his conduct as “inappropriate” or consider him a “bad” teacher.
Roles also help individuals figure out how to treat others.81 A teacher should know,
because of his social role and that of his students, that he should not give his students
medical advice.
77. See Meir Dan-Cohen, Between Selves and Collectivities: Toward a Jurisprudence of Identity, 61 U. CHI. L. REV. 1213, 1218–19, 1228–29 (1994); see also PETER L. BERGER & THOMAS LUCKMANN, THE SOCIAL CONSTRUCTION OF REALITY 60–61, 74–76 (1967).
78. See Frank Dobbin, Economic Sociology, in TWENTY-FIRST CENTURY SOCIOLOGY: A REFERENCE HANDBOOK 320 (Clifton D. Bryant & Dennis L. Peck eds., 2007) (explaining Émile Durkheim's view that social role shapes economic behavior); DAHRENDORF, supra note 26, at 35–37 (explaining social roles carry expected modes of behavior, characteristics, beliefs, and interests); B.J. Biddle, Recent Development in Role Theory, 12 ANN. REV. SOCIO. 67, 70–71 (1986) (cataloguing sociology scholarship on functional and symbolic interactionist role theory); GROSS ET AL., supra note 26, at 59–60, 63 (explaining that a role is a set of expectations about someone's attributes and behaviors); John Scott, Status and Role: Structural Aspects, in INT'L ENCYCLOPEDIA SOC. & BEHAV. SCIS. (Neil Smelser & Paul Baltes eds., 2001) (synthesizing sociology literature on role-taking and role-making).
79. See GEORGE A. AKERLOF & RACHEL E. KRANTON, IDENTITY ECONOMICS: HOW OUR IDENTITIES SHAPE OUR WORK, WAGES, AND WELL-BEING 11 (2010).
80. See Peter M. Hall, A Symbolic Interactionist Analysis of Politics, 42 SOCIO. INQUIRY 35, 38–40 (1972); DAHRENDORF, supra note 26, at 44; Ralph H. Turner, Role Theory, in HANDBOOK OF SOCIOLOGICAL THEORY 233, 235 (Jonathan H. Turner ed., 2001).
81. See Hall, supra note 80, at 39–40, 55; Turner, supra note 80, at 235.
Organizations of people can also have social roles on an organization-wide level.
For instance, one can talk about the ways it’s appropriate for a school to discipline a
student, regulate student or teacher speech, or facilitate religious practice.82 It would
be reasonable to expect a school, the military, and a church to regulate speech quite
differently, on the basis of their different social roles.
Social roles typically arise through a process of continuous interaction in
society––between and among individuals, organizations, governments, and others.83
The process is dialectical, meaning it involves persistent conflict between opposing
ideas about what social roles are and the expectations we should associate with
them.84
Still, in normal times, social roles and their associated norms are well known
and form the assumed, background rules against which individuals operate.85
Individuals can understand what it means to be a school before ever interacting with
one because they have access to cultural knowledge about a school’s appropriate
attributes and behaviors.
People learn social roles in society through the process of socialization; that is,
by learning their culture from the time they are born.86 Socialization also helps
perpetuate social roles. When someone internalizes a social role––when they accept
it as a frame for their own behavior––they are likely to conform to it and carry it
forward in their culture.87
Viviana Zelizer gives the example of payments for
household chores within a family: if children accept that, by virtue of their role, they
are expected to participate in household chores without payment, they are likely to
comply and carry the same expectations for their future children.88 Beyond
socialization, forms of sanctions, including social shaming and ostracism, rewards,
and legal penalties, help sustain social roles in society.89
When social roles like “speech platform” are uncertain, individuals are limited in
their ability to understand what constitutes a “normal” relationship with a speech
platform. How is it appropriate for a speech platform to behave? What individual
interests are reasonable within this relationship? How should individuals tailor their
behavior to match the relationship’s context? For instance, when Elon Musk took
82. See, e.g., Kennedy v. Bremerton Sch. Dist., 597 U.S. 507, 525–26 (2022).
83. See JEFFREY K. HASS, ECONOMIC SOCIOLOGY: AN INTRODUCTION 9 (2006); Mark Granovetter, Economic Action and Social Structure: The Problem of Embeddedness, 91 AM. J. SOCIO. 481, 486 (1985); NEIL FLIGSTEIN, THE ARCHITECTURE OF MARKETS 27–28 (2001).
84. See BERGER ET AL., supra note 77, at 61. Ideas about social roles may also differ
dramatically between different communities. One could imagine, for instance, LGBTQ ideas
of what it means to be a parent, father, or mother diverging from heteronormative ideas about
these roles.
85. See FLIGSTEIN, supra note 83, at 27.
86. See Hall, supra note 80, at 37–38; DAHRENDORF, supra note 26, at 26–27, 56; Scott, supra note 78; BERGER ET AL., supra note 77, at 74.
87. See Hall, supra note 80, at 38; BERGER ET AL., supra note 77, at 74; DAHRENDORF, supra note 26, at 56.
88. See Elizabeth Anderson, Beyond Homo Economicus: New Developments in Theories of Social Norms, 29 PHIL. & PUB. AFFS. 170, 191 (2000) (citing Viviana Zelizer, How Do We Know Whether a Monetary Transaction is a Gift, an Entitlement, or Compensation?, in ECONOMICS, VALUES, AND ORGANIZATION 329–31 (Avner Ben-Ner & Louis Putterman eds., 1998)).
89. See DAHRENDORF, supra note 26, at 38, 42–43.
down Kanye West’s antisemitic Twitter post,
90
it’s unclear what the take down
meant, let alone whether it was legitimate. Was it a despotic act of censorship,
discrimination on the basis of West’s viewpoint, or an editorial judgment? Should
West petition the platform for redress, sue for discrimination, or accept the
platform’s judgment? Without an understanding of Twitter’s social role when it
moderates content, individuals lack the cultural knowledge that would enable them
to ascribe a particular meaning to Musk’s act and react accordingly.
B. Law’s Role-Scripting Function
Citron and Richards write, “How courts talk about the Internet matters, not only
in how they decide individual cases, but also in how they frame similar issues for
future courts."91 When it comes to content moderation regulation, the statements law
makes about speech platforms matter quite a lot, not only for lawmakers’ purposes,
but also to establish baseline expectations of what it means to be a speech platform
and how one behaves appropriately within that role.
Though this is a new lens through which to view the project of content moderation,
it is a deep-rooted function of law.92 Law helps script social roles––it engages in "legal
role-scripting"––when it makes statements about the characteristics and appropriate
behaviors of the entities it regulates and the public it serves. Law is especially able
to exert influence over social roles that are unsettled or uncertain––like "speech
platform."93
Content moderation regulations’ role-scripting can provide individuals
the cultural knowledge to evaluate whether speech platforms are treating them the
way they should, to know how they should act toward speech platforms, and to
modify the relationship’s social obligations as it continues to unfold. Regulations’
role choices are also likely to set further lawmaking down particular paths and close
off others.
Legal role-scripting is both a key function and a common regulatory mechanism
of modern liberal democracies.94 Some laws, such as much of criminal law, proscribe
90. Twitter, Instagram Block Kanye West over Antisemitic Posts, ASSOCIATED PRESS
(Oct. 10, 2022, 5:39 PM), https://apnews.com/article/twitter-inc-entertainment-music-
ba5c710ec59d195fe4d83cb2c9343589 [https://perma.cc/BV48-9PCH].
91. Citron et al., supra note 36, at 1383.
92. See Cass R. Sunstein, Social Norms and Social Roles, 96 COLUM. L. REV. 903, 921, 923 (1996) (describing the manner in which "law is often self-consciously concerned with social roles"); Dan-Cohen, supra note 77, at 1228–29 (describing the ways in which law "scripts" social roles); Eric J. Mitnick, Law, Cognition, and Identity, 67 LA. L. REV. 823, 824, 831 (2007) (explaining how "legal rules . . . categorize persons"); Bert I. Huang, Law and Moral Dilemmas, 130 HARV. L. REV. 659, 678 (2016) ("Another, more subtle way that the law might influence moral intuitions is by defining official or social roles, which in turn set our expectations about correct or blameworthy behavior.").
93. The widespread use of the term "platform" to refer to entities that intermediate information and relationships is itself significant. See Julie E. Cohen, Law for the Platform Economy, 51 U.C. DAVIS L. REV. 133, 145 (2017) ("In Tarleton Gillespie's formulation, the term 'platform' appears to offer users a 'raised, level surface' on which to present themselves, but at the same time it elides the necessary work of defining and policing the platform's edges.") (internal citation omitted).
94. See Mitnick, supra note 92, at 824 (“[M]odern liberal democratic legal institutions
certain acts regardless of the actor’s social role. Whether someone is a cashier or an
uncle, they cannot intentionally kill another person (without justification).95 But,
more often than not, the law treats entities categorically through its choice of legal
subject. The rights, responsibilities, behavioral constraints, and entitlements law
gives its legal subject form the basic outline of a social role.96
Some scholars suggest that law can only––imperfectly––reflect settled social
roles,97 but others acknowledge law historically has created a number of social roles
altogether or substantially redefined the behavioral norms associated with particular
roles.98
Harlan Fiske Stone has noted, for example, family law’s extensive
elaboration of “husband” and “wife” roles through its allocations of rights and
responsibilities from the time of the nation's founding.99
Legislation gradually
ushered in wives’ legal independence and new, progressive norms within the
husband-wife relationship.100
The origin of the corporate form and its jurisprudential development exemplify
the law’s ability to create social roles wholesale and continually reshape them.
101
When a government granted a corporate charter, the new entity that emerged––the
corporation––had a set of privileges and responsibilities granted by the state, not only
to those who occupied the new social role of "shareholder," but to the public.102
Corporate law both defined the social role of “corporation” and constructed a suite
persist unreservedly in structuring the categories through which individuals perceive social
life and status.”); Manfred Rehbinder, Status, Contract, and the Welfare State, 23 S
TAN. L.
REV. 941, 955 (1971) (describing modern law as “a law of roles preformed and safeguarded
by the state, yet open and subject to constant change”).
95. See, e.g., N.Y. PENAL LAW § 125.27 (2024),
https://www.nysenate.gov/legislation/laws/PEN/125.27 [https://perma.cc/AML8-7BAY].
96. See Mitnick, supra note 92, at 824, 828, 831, 833, 868–69 (examining the "constitutive nature of law" in terms of how it supplies social meaning to roles based on the manner it regulates people occupying those roles); Rehbinder, supra note 94, at 951–52 ("The kind of law that differentiates according to social position of the individual is a law that involves social roles."); Dan-Cohen, supra note 77, at 1228–30 (describing the numerous ways law helps script social roles); Huang, supra note 92, at 678 (examining how legally afforded rights and responsibilities influence people's moral understanding of role-based social responsibilities).
97. See HARLAN F. STONE, LAW AND ITS ADMINISTRATION 34 (1915) (suggesting the social significance of legal changes to wives' status); Paul Bohannan, The Differing Realms of the Law, 67 AM. ANTHROPOLOGIST 33, 35–37 (1965) (contending law is always out-of-phase with society and a step behind role-based social norms).
98. See Sunstein, supra note 92, at 923; Dan-Cohen, supra note 77, at 1229–30; Mitnick, supra note 92, at 824, 865; STONE, supra note 97, at 78–79, 82; Neil Fligstein, From the Transformation of Corporate Control, in THE NEW ECONOMIC SOCIOLOGY: A READER 408–09 (Frank Dobbin ed., 2004) (describing corporations as legal constructs comprised of many, new roles); William G. Roy, Socializing Capital: The Rise of the Large Industrial Corporation in America, in THE NEW ECONOMIC SOCIOLOGY: A READER 438–39, 450–51 (Frank Dobbin ed., 2004) (describing the historical elaboration of corporations' social role).
99. STONE, supra note 97, at 78–79.
100. Id.
101. See id. at 82; Fligstein, supra note 98, at 408–09; Roy, supra note 98, at 438–39, 450–51.
102. See Fligstein, supra note 98, at 408–09; Roy, supra note 98, at 450–51.
of social roles within the corporation, each with a set of legally scripted behavioral
obligations: shareholder, officer, director, chair, etc.103
Since then, the Supreme
Court has recognized corporations’ rights to speak, fund electioneering
communications, and practice religion, suggesting these are all normal social
behaviors for corporations.104
Even when law supplies roles’ initial social meaning, those roles are continually
redefined as actors interact in the roles and with the roles. One need only look, for
example, at the syllabus of the Harvard Business School course “Corporate
Governance and Boards of Directors.” The legal rights and obligations that attach to
the many social roles within the corporate structure are only a small component of
the course. Rather, the course focuses on “the complex dynamics among boards,
executives, and shareholders,” and the “managerial[] and behavioral issues that
directors must contend with . . . ."105
Society and law interact in a constant feedback loop. Contest over the meaning of
social roles and actual social behaviors continually reshape the expectations
associated with legally constructed social roles. When societal expectations
eventually diverge from law’s formulation of particular roles, social movements
often call for law to catch up. For instance, the Environmental, Social, and
Governance (ESG) movement is pressing lawmakers to impose a range of new
regulations on corporations – from increasing board diversity to reducing carbon emissions – that reflect emerging societal expectations of how corporations should behave.106
Different actors in society may have different ideas of what roles are relevant and
desirable in any new relationship. The state of public discourse on content
moderation, and scholarly discourse on its regulation, reflect that notions of speech platforms’ role are unsettled, uncertain, and deeply contested. Survey evidence reveals, for instance, that Democrats broadly approve of speech platforms’ political fact-checking, whereas Republicans broadly disapprove of the practice.107
Whereas
some of the largest speech platforms have begun to represent themselves as
103. See, e.g., N.Y. BUS. CORP. LAW art. 6 (2014),
https://www.nysenate.gov/legislation/laws/BSC/A6 [https://perma.cc/27U2-GZ8Z]; id. at art.
7 (2017), https://www.nysenate.gov/legislation/laws/BSC/A7 [https://perma.cc/Q2BG-
RXA4].
104. See Citizens United v. FEC, 558 U.S. 310, 365 (2010); Burwell v. Hobby Lobby
Stores, Inc., 573 U.S. 682, 707 (2014).
105. See Corporate Governance and Boards of Directors, HARV. BUS. SCH.: COURSE CATALOG, https://www.hbs.edu/coursecatalog/2010.html [https://perma.cc/R38Z-2GBQ].
106. See Leah Malone, Emily Holland & Carolyn Houston, ESG Battlegrounds: How the States Are Shaping the Regulatory Landscape in the U.S., HARV. L. SCH. F. ON CORP. GOVERNANCE (Mar. 11, 2023), https://corpgov.law.harvard.edu/2023/03/11/esg-battlegrounds-how-the-states-are-shaping-the-regulatory-landscape-in-the-u-s/ [https://perma.cc/A476-XDDW].
107. Most Americans Think Social Media Sites Censor Political Viewpoints, PEW RSCH. CTR. (Aug. 19, 2020), https://www.pewresearch.org/internet/2020/08/19/most-americans-think-social-media-sites-censor-political-viewpoints/ [https://perma.cc/9BKD-UMGF].
“governors,”108 conservative politicians insist they are “common carriers,”109 and numerous legal scholars argue they are actually “publishers.”110
Whether law creates social roles or crystallizes those already present in society, law’s involvement in role-scripting carries a special force. First, the public readily internalizes legally constructed roles and norms, such that individuals are rarely conscious of the law’s influence on their perceptions.111 For example, in an experimental study examining reactions to different iterations of tort law’s classic trolley problem, Bert Huang found that respondents’ views of various roles’ moral responsibilities changed depending on the duty the law assigned to each role.112 Individuals often instinctively translate legal definitions into social norms, especially when norms are not settled.113 Second, law has a special ability to enforce social roles and norms through the threat of sanctions for noncompliance.114 Feminist activists who fought for Congress to pass the Civil Rights Act of 1991 sought to leverage the force of legal sanctions to drive changes to workplace behavioral norms toward women.115
Content moderation regulation is poised to mediate among the many conceptions
of speech platforms’ role. It may provide the role’s preliminary contours through the
duties it imposes, the entitlements it affords, and the statements it makes about the
legal attributes of speech platforms’ content moderation relationship with
individuals. The unsettled status of speech platforms’ social role suggests law may
be especially powerful in institutionalizing those social roles. Society may more
readily take up legal definitions in the absence of well-established alternatives. The
potential for legal sanctions is also crucial in this context. For instance, Google would
108. DAVID KIRKPATRICK, THE FACEBOOK EFFECT: THE INSIDE STORY OF THE COMPANY
THAT IS CONNECTING THE WORLD 254 (2010) (quoting Mark Zuckerberg stating, “In a lot of
ways Facebook is more like a government than a traditional company. We have this large
community of people, and more than other technology companies we’re really setting
policies.”).
109. See, e.g., Press Release, Dave Yost, Ohio Att’y Gen., AG Yost Files Landmark
Lawsuit to Declare Google a Public Utility (June 8, 2021),
https://www.ohioattorneygeneral.gov/Media/News-Releases/June-2021/AG-Yost-Files-
Landmark-Lawsuit-to-Declare-Google-a [https://perma.cc/NSW6-2TTZ].
110. See Brief of Internet Law Scholars as Amici Curiae Supporting Respondents,
Gonzalez v. Google LLC, 598 U.S. 617 (2023) (No. 21-1333) (signed by 19 Internet law
scholars).
111. See Mitnick, supra note 92, at 826; see also Huang, supra note 92, at 694–95.
112. Huang, supra note 92, at 694–95. Elizabeth Anderson also writes about a study of a
Swiss town that was offered compensation to serve as a potential site for a nuclear waste
facility. She suggests that when the government offered compensation, it regarded the
residents as property owners rather than citizens, with the consequence that residents were less
likely to accept the facility. Anderson, supra note 88, at 197.
113. See Lawrence Lessig, The Regulation of Social Meaning, 62 U. CHI. L. REV. 943, 1030 (1995).
114. See DAHRENDORF, supra note 26, at 42–43; Rehbinder, supra note 94, at 953; Dan-Cohen, supra note 77, at 1233; Anderson, supra note 88, at 193.
115. See Frank Fagan, Systemic Social Media Regulation, 16 DUKE L. & TECH. REV. 393, 394 (2018).
no doubt resist being designated a common carrier (indeed, it has),116 and it is unlikely to comply voluntarily with its associated behavioral norms (e.g., nondiscrimination) because those norms would impair Google’s profits. That is, Google refuses to internalize its role as a common carrier. Legal sanctions become crucial – they make the role mandatory by imposing social norms on Google with the threat of profit-reducing monetary penalties for noncompliance.117
Legal definitions of social roles serve an additional function for law: they define the scope of possible lawmaking and legal claims on a particular subject.118 How law regulates the sale of sex, for instance, would differ dramatically depending on whether it regards people who sell sex as “workers” or “prostitutes.” If law adopts the perspective that those who sell sex are workers, labor rights – like fair wages, safe working conditions, and, potentially, protected opportunities to organize – are likely to follow.119 But if law perceives those who sell sex as prostitutes, it would condemn the very same acts, prohibit individuals from selling sex, and impose penalties for violating the prohibition.120
Associated Press v. United States provides another helpful example.121 In that case, the Supreme Court carefully reasoned that when Associated Press (AP) members entered an agreement not to share their stories with nonmembers, they acted as businesses engaged in an unreasonable restraint of trade, not as the press disseminating news for the benefit of democratic society.122 Legal sanctions against AP members’ agreement to withhold news stories from nonmembers flowed from the Court’s perception of AP members’ role as “businesses.”
Law and social roles are mutually constitutive. That is to say, legal definitions of social roles require public buy-in to be salient and to drive future reform. A group of political scientists at the University of Zurich who studied Twitter discourse around Trump’s ban write:
To rise to the political agenda, a given issue must first be construed as
politically salient and specific arguments put forward as to how and why
it might warrant policy intervention. Therefore, how political actors
frame content moderation may impact the kinds of solutions proposed.
For example, if content moderation is primarily framed as a violation of
free speech policy-makers might be more hesitant to implement strict
116. See State v. Google LLC, No. 21-CV-H-06-0274, 2022 WL 1818648, at *8–9 (Ohio C.P. May 24, 2022).
117. See DAHRENDORF, supra note 26, at 42–43.
118. See J.M. Balkin, Understanding Legal Understanding: The Legal Subject and the Problem of Legal Coherence, 103 YALE L.J. 105, 121–23 (1993).
119. See LaLa B Holston-Zannell, Sex Work is Real Work, and It’s Time to Treat It That Way, ACLU (June 10, 2020), https://www.aclu.org/news/lgbtq-rights/sex-work-is-real-work-and-its-time-to-treat-it-that-way/ [https://perma.cc/JL8T-MEDT]; Kate Andrias, The New Labor Law, 126 YALE L.J. 2 (2016).
120. See, e.g., N.Y. PENAL LAW § 230.00 (2014), https://www.nysenate.gov/legislation/laws/PEN/230.00 [https://perma.cc/S6BE-LF5X]. But see NEV. REV. STAT. 201.354 (2022), https://www.leg.state.nv.us/nrs/nrs-201.html [https://perma.cc/LW73-SUA4] (“It is unlawful for a customer to engage in prostitution or solicitation therefor, except in a licensed house of prostitution.”).
121. 326 U.S. 1 (1945).
122. Id. at 19–20.
regulation on platforms’ rules around hate speech, misinformation and
sensitive content.123
Aileen Nielsen explains that it is important to understand whether platform users perceive platforms as governing their speech, because their perception “can be particularly influential as to what is prioritized in policymaking.”124 In other words, if the public sees speech platforms as speech governors and users as speakers and listeners, they might resist laws that reflect a divergent perception of the speech platform-user relationship, like common carriage regulation. As Citron and Richards put it, “Legal rules and policies affecting free expression must take into account the structures upon which they operate. Legal rules do not operate in a vacuum, and different rules will operate differently in different structures.”125
Content moderation
regulation’s legal role-scripting is poised to influence societal expectations about
speech platforms’ appropriate behaviors. If the public accepts, rejects, or alters legal
role-scripts, it will generate headwinds or tailwinds for possible future reform.
Some may feel uneasy at the prospect of law constructing social roles because
social roles typically stem from interpersonal interactions. Douek asks, “Should rules
be expressive and aspirational, especially as rules themselves can set and shape social
norms, or should they reflect the bounds of what is possible in a particular moment . . . [?]”126 Top-down prescription of speech platforms’ social role might seem paternalistic, but at this moment it is unavoidable. Regardless of whether law should be expressive and aspirational, law in practice shapes social roles whenever it categorizes those it regulates – whether it redefines existing social roles or creates new ones altogether. As Part IV examines in greater depth, current efforts to regulate
content moderation operate on this level. In the face of deep disagreement as to
whether speech platforms are speech governors, common carriers, or publishers,
legal decisions that assert speech platforms should be one of those roles will carry
special force.
III. CDA SECTION 230 AS A LEGAL ROLE-SCRIPTING FAILURE
Public discourse on content moderation and scholarly discourse on its regulation
reflect that the role of “speech platform” is unsettled and deeply contested. The
previous Part asserted that societal uncertainty as to speech platforms’ role positions current content moderation regulatory reforms to mediate among multiple, competing accounts of what it means to be a “speech platform” and to behave well
in that capacity.
123. Meysam Alizadeh, Fabrizio Gilardi, Emma Hoes, K. Jonathan Klüser, Mael Kubli &
Nahema Marchal, Content Moderation as a Political Issue: The Twitter Discourse Around
Trump’s Ban 6 (Dep’t Pol. Sci., Univ. Zurich, Working Paper, 2021),
https://fabriziogilardi.org/resources/papers/content-moderation-twitter.pdf
[https://perma.cc/VTF4-9ECM].
124. Aileen Nielsen, The Rights and Wrongs of Folk Beliefs About Speech: Implications for Content Moderation, 27 UCLA J.L. & TECH. 118, 130 (2022).
125. Citron et al., supra note 36, at 1381.
126. Douek, Content Moderation as Systems Thinking, supra note 19, at 551.
Drawing on the role theory architecture Part II built, this Part examines how law contributed to current discord over speech platforms’ social role. In short, Section 230 of the Communications Decency Act was a legal role-scripting failure. Lawmakers were not attuned to how the CDA constructed social roles, and the statute, as written, equivocates on its characterization of whom it regulates and whom it serves. As a
result, courts have been left to apply the statute’s liability shields without a firm
understanding of what content moderation norms ought to merit immunity, as
scholars puzzle over what exactly speech platforms are for the purpose of content
moderation regulation.
A. Section 230 as Written and Intended
There are many excellent, extensive accounts of Section 230 and its history, and so this Article presents only a truncated version.127 Professor Adam Candeub writes that Congress passed Section 230 in an effort to control pornography and other non-family-friendly content online.128 Section 230 aimed to empower parents to control their children’s access to online content, in part by overruling the New York case, Stratton Oakmont, Inc. v. Prodigy Services Co.129 In that case, the court held Prodigy (which managed online bulletin boards) was liable as a “publisher” of statements made on those bulletin boards.130 The case “created a Hobson’s choice for . . . content moderation: either moderate content and face liability for all posts . . . , or don’t moderate and have posts filled with obscenity or naked images.”131
To encourage
providers to clean up the internet, Congress passed two liability shields:
Protection for “Good Samaritan” blocking and screening of offensive
material
(1) Treatment of publisher or speaker
No provider or user of an interactive computer service shall be treated as
the publisher or speaker of any information provided by another
information content provider.
(2) Civil liability
No provider or user of an interactive computer service shall be held liable
on account of
(A) any action voluntarily taken in good faith to restrict access to or
availability of material that the provider or user considers to be
obscene, lewd, lascivious, filthy, excessively violent, harassing, or
otherwise objectionable, whether or not such material is
constitutionally protected; or
(B) any action taken to enable or make available to information
content providers or others the technical means to restrict access to
material described in paragraph (1).132
127. See, e.g., Candeub, supra note 20, at 142–60; Sylvain, supra note 24, at 231–58.
128. See Candeub, supra note 20, at 142.
129. 1995 WL 323710 (N.Y. Sup. Ct. May 24, 1995); see Candeub, supra note 20, at 142.
130. See 1995 WL 323710, at *1 (N.Y. Sup. Ct. May 24, 1995).
131. Candeub, supra note 20, at 142.
132. 47 U.S.C. § 230(c) (1996).
Professor Olivier Sylvain, examining the legislative history, explains the bill’s drafters “proposed the bill as an alternative to direct government restrictions on speech, believing that ‘parents and families are better suited to guard the portals of cyberspace and protect our children than our Government bureaucrats.’”133 The congressional report accompanying the bill underscores this motivation “to overrule the Prodigy opinion . . . to further the ‘important federal policy of empowering parents to determine the content of communications their children receive.’”134
The core of Section 230(c)(1) is to protect providers from being held liable as “publishers,” as the New York court previously ruled.135 Sylvain explains that Congress’s reference to “publishers” invokes defamation law, in which a publisher is as liable for publishing defamatory material as its author.136 Defamation law treats “distributors” a shade differently. Newsstands, libraries, bookstores, and the like may face liability only if they actually know the original statement they disseminate is defamatory.137 “The [distributor’s] affirmative decision to disseminate the defamatory statement . . . is a direct cause of . . . harm to the injured party’s reputation.”138 In Zeran v. AOL, Inc., the first appellate decision interpreting this liability shield, the Fourth Circuit reasoned that distributor liability effectively treats distributors as publishers and that, by extension, Section 230 must immunize providers from distributor liability as well.139
B. Whom Does Section 230 Regulate and Whom Does It Serve?
Though Section 230(c)(1) aimed to counteract publisher liability, it is not clear from the face of the liability shield whether the law conceived of interactive service providers as publishers – in other words, whether it viewed providers as playing the role of “publisher.” The statute’s role choices become even less clear when one considers its prefatory language as well. Before the liability shields, the statute provides:
(a) Findings
The Congress finds the following:
(1) The rapidly developing array of Internet and other interactive
computer services available to individual Americans represent an
extraordinary advance in the availability of educational and
informational resources to our citizens.
(2) These services offer users a great degree of control over the
information that they receive, as well as the potential for even greater
control in the future as technology develops.
133. Sylvain, supra note 24, at 238 (quoting 141 CONG. REC. H8460 (1995) (statement of
Rep. Wyden)).
134. Id. at 239.
135. But see id. (“[T]he history only suggests that Congress did not want the Prodigy
court’s unforgiving interpretation of the republication rule in the online setting to stand.”).
136. Id. at 23233.
137. Armijo, supra note 45, at 1216.
138. Id. at 1217.
139. 129 F.3d 327, 332 (4th Cir. 1997); see Candeub, supra note 20, at 148–49.
(3) The Internet and other interactive computer services offer a forum for
a true diversity of political discourse, unique opportunities for cultural
development, and myriad avenues for intellectual activity.
(4) The Internet and other interactive computer services have flourished,
to the benefit of all Americans, with a minimum of government
regulation.
(5) Increasingly Americans are relying on interactive media for a variety
of political, educational, cultural, and entertainment services.
(b) Policy
It is the policy of the United States
(1) to promote the continued development of the Internet and other
interactive computer services and other interactive media;
(2) to preserve the vibrant and competitive free market that presently
exists for the Internet and other interactive computer services, unfettered
by Federal or State regulation;
(3) to encourage the development of technologies which maximize user
control over what information is received by individuals, families, and
schools who use the Internet and other interactive computer services;
(4) to remove disincentives for the development and utilization of
blocking and filtering technologies that empower parents to restrict their
children’s access to objectionable or inappropriate online material; and
(5) to ensure vigorous enforcement of Federal criminal laws to deter and
punish trafficking in obscenity, stalking, and harassment by means of
computer.140
With the exception of Section 230(a)(5), it is difficult to see how any of Congress’s findings or policy articulations relate to providers as “publishers.” Rather, these provisions point to a number of different social roles the law might support or resist, without committing to any in particular.
For the most part, the findings section seems to conceive of service providers as
potential collateral government censors and Internet users as citizens. Its emphasis
on providers offering a “forum for a true diversity of political discourse”141 that flourishes “to the benefit of all Americans, with a minimum of government regulation,”142
suggests Congress took note that Internet users engage online as
citizens in public discourse. It seems Congress was keen to prevent service providers
with the technical ability to control that discourse from becoming censors, to serve
users’ interests as citizens. So, the CDA likely sought to resist a collateral censor-
citizen relationship for service providers and users.
The policy section, on the other hand, seems much more focused on service
providers as businesses and Internet users as consumers. With goals like promoting
the Internet’s development, preserving a free market, encouraging new technologies,
and maximizing user controls,143 Congress portrays service providers as businesses
and Internet users as consumers in a free market burgeoning with technological
development. Internet users as consumers are primarily interested in choice: “control
140. 47 U.S.C. § 230(a)–(b).
141. Id. § 230(a)(3).
142. Id. § 230(a)(4).
143. Id. § 230(b)(1)–(3).
over [what] information . . . they receive.”144 Service providers, as businesses, pursue profit motives, but they are hampered by the specter of liability – the liability shields that follow thus remove financial disincentives to develop blocking and filtering
technologies. Perhaps the CDA enshrines a business-consumer relationship.
The CDA’s since-stricken prohibition on transmitting offensive material to a minor145 adds another potential set of roles into the mix: service providers as broadcasters and Internet users as democratic listeners. That provision criminalized the “knowing” transmission of “obscene or indecent” messages to any recipient under 18 years of age.146 Communications law has a long history of regulating broadcasters based on an understanding of their relationship to listeners as citizens of a democracy.147 Though broadcasters are often private businesses, they are duty-bound to serve listeners’ interests, which include ensuring their programs comport with important social mores, such as limitations on minors’ access to obscene content.148 The CDA as drafted would have transposed this role-relationship onto service providers and Internet users. But, in Reno v. American Civil Liberties Union, the Supreme Court not only held the provision was unconstitutional, but also reasoned the Internet is characteristically unlike broadcasting,149 implying a broadcaster-listener relationship might be categorically inapplicable to content moderation.
The liability shields that follow are disjointed from these prefatory sections.
Sylvain notes that the liability shields are auspiciously labeled as
144. Id. § 230(b)(3).
145. 47 U.S.C. § 223(a)(1)(B)(ii) (“§ 223. Obscene or harassing telephone calls in the District of Columbia or in interstate or foreign communications. (a) . . . Whoever – (1) in interstate or foreign communications . . . (B) by means of a telecommunications device knowingly . . . (ii) initiates the transmission of, any comment, request, suggestion, proposal, image, or other communication which is obscene or indecent, knowing that the recipient of the communication is under 18 years of age, regardless of whether the maker of such communication placed the call or initiated the communication; . . . shall be fined under title 18 or imprisoned not more than two years, or both.”). The Supreme Court held 47 U.S.C. § 223(a)(1)(B)(ii) unconstitutional in Reno v. ACLU because it violated the First Amendment. 521 U.S. 844, 849 (1997). The Court found the law’s terms “indecent” and “patently offensive” overly vague. Id. at 871–74.
146. 47 U.S.C. § 223(a)(1)(B)(ii).
147. See Syracuse Peace Council v. FCC, 867 F.2d 654, 658 (D.C. Cir. 1989) (explaining
that both the fairness doctrine and its retraction encompass the First Amendment goals of
“stimulating fair, balanced, and diverse treatment of controversial issues (either by single
stations or by the media in the aggregate); of minimizing any chilling effect that may flow
from governmental requirements that one sort of presentation be matched by another; and of
minimizing the risks of abuse and other adverse effects that may flow from having
governmental officials sit in judgment on editorial decisions”); see generally Red Lion Broad.
Co. v. FCC, 395 U.S. 367 (1969) (describing how the fairness doctrine serves listeners’
interests in hearing a broad range of competing viewpoints); Turner Broad. Sys., Inc. v. FCC,
512 U.S. 622 (1994) (reasoning that FCC “must-carry” rules serve Congress’s objective of
providing free television programming to the 40 percent of Americans who lack cable
television).
148. 18 U.S.C. § 1464; 47 U.S.C. § 303.
149. 521 U.S. 844, 868–70 (1997).
protections for “Good Samaritans.” The label calls up the biblical role of the Good Samaritan, as well as state statutes that have historically protected individuals from liability when they attempt to render others aid.150 Yet, neither shield seems to have this role in mind.
The first liability shield, Section 230(c)(1), prohibits providers from being treated as publishers of third-party content, based on an understanding of publishers’ liability for defamation.151 Defamation (as well as a number of other dignitary torts) makes certain deviations from societal standards of respectful behavior legally actionable.152 It compels publishers to treat people with respect by not disseminating false and damaging statements about them.153 The statute’s bar on providers “be[ing] treated” as publishers thus suggests providers might act like publishers, but it would be unreasonable to hold them to the behavioral expectations society normally applies to publishers.154
The second liability shield makes no mention of publishers. It insulates providers from liability for their good faith actions to restrict access to material the provider deems obscene or objectionable.155 This seems to imagine service providers as administrative speech governors, like the Federal Communications Commission (FCC). The FCC is charged by Congress to regulate the wires and waves in the public interest, and it is accountable to citizens through administrative procedures.156 It aims to serve citizens’ interests as democratic listeners, which have historically included limits on access to obscenity.157 As with the first shield, the statute seems to invite service providers to act in this capacity, but it resists societal expectations of public accountability.
C. Role Equivocation, Role Confusion, and Content Moderation as “Business” Practice
Citron and Richards assert Section 230 was meant to encourage speech platforms to monitor and filter the content they host.158 Sylvain contends Congress sought to encourage speech platforms “to be passive conduits that facilitate end users’ communications.”159 The Fifth Circuit, in NetChoice, L.L.C. v. Paxton, found
150. See Sylvain, supra note 24, at 213.
151. 47 U.S.C. § 230(c)(1); see also supra notes 135–36 and accompanying text.
152. Robert Post, The Social Foundations of Defamation Law: Reputation and the Constitution, 74 CALIF. L. REV. 691, 707–19 (1986) (explaining that logically and normatively defamation causes of action penalize behaviors that disrespect certain social norms).
153. Id.
154. See 47 U.S.C. § 230(c)(1).
155. 47 U.S.C. § 230(c)(2).
156. 47 U.S.C. §§ 307(a), 309(a), (h).
157. See FCC v. Pacifica Found., 438 U.S. 726, 727 (1978) (explaining Congress intended
to give the FCC the power to regulate obscene, indecent, or profane language in exercising its
licensing authority in the public interest).
158. See Citron et al., supra note 36, at 1380.
159. Sylvain, supra note 24, at 219; see also Candeub, supra note 20, at 146 (“[Section
230(c)(1)] treats internet platforms as conduits, such as the telephone or telegraph
companies.”).
Section 230 instructs courts not to treat platforms as publishers of user content based
on “Congress’s factual judgment about the role of online platforms”; that is, that
“[they] are not acting as speakers or publishers when they host user-submitted
content.”160 Who is correct?
The previous section examined in depth Congress’s handling – perhaps, more aptly, mishandling – of Section 230’s role constructions. Congress passed Section 230 mainly as a vehicle for more parental control over the material children could access online, yet it did not have a clear image of whom it was regulating (then called “interactive computer services,” now speech platforms) or the public it served. The result is a statute that seems to recognize a range of possible roles platforms and their users might play, without binding platforms to any role-based duties or endowing users with any role-based rights.
Different perspectives on appropriate content moderation practices talk past each other because they place speech platforms in different social roles that are often at cross-purposes. Policymakers who support a common carriage approach, and others who would impose liability on platforms for failing to take down certain kinds of user content,161 cannot see the merits of one another’s position. Each presupposes a different role-relationship – in this example, common carrier-subscriber and speech governor-listener.
Section 230’s role equivocation put courts in the unenviable position of interpreting when to apply Section 230’s liability shields without a firm understanding of the behavioral norms immunity intends to encourage. In the years since the statute’s passage, courts have immunized an ever-widening scope of speech platforms’ conduct from suit.162 As Professor Citron and Benjamin Wittes write: “Platforms have been protected from liability even though they republished content knowing it might violate the law, encouraged users to post illegal content, changed their design and policies for the purpose of enabling illegal activity, or sold dangerous products.”163
Sylvain adds:
[U]nder current law, a social media company cannot be held responsible
for allowing a user to post compromising private photographs of his ex-
girlfriend publicly. A search engine cannot be called to task under law
for displaying the advertisements of third parties that sell copyrighted
ringtones. An online advertising service is under no legal obligation to
remove posts that encourage the sex trafficking of minors.164
Most recently, Section 230 came for New York’s privacy statute. New York law
prohibits the use of a person’s name or likeness for advertising or trade purposes
160. 49 F.4th 439, 468 (5th Cir. 2022).
161. Compare infra Part IV.A, with Part IV.B (revealing how different underlying views
about the character of content moderation support dramatically different regulations of content
moderation).
162. See infra notes 163–64 and accompanying text.
163. Danielle Keats Citron & Benjamin Wittes, The Internet Will Not Break: Denying Bad Samaritans § 230 Immunity, 86 FORDHAM L. REV. 401, 408 (2017) (footnote call numbers omitted).
164. Sylvain, supra note 24, at 217.
without their consent.165 Patty Ratermann, a model, sued Amazon, Walmart, and other online retailers for publishing her photo to advertise a third party’s skincare product without first obtaining her consent. The Southern District of New York, in Ratermann v. Pierre Fabre USA, Inc., held that Section 230 immunizes the retailers’ publication of her photo because it was the skincare company, a third-party seller on those sites, that produced the offending content.166 Though it is troubling for Section 230 to chip away at New York’s only privacy protection, this development is unsurprising given Section 230(c)(1)’s dissonant role construction. As Part III.B explains, Section 230(c)(1) suggests speech platforms might act like publishers, but they should not be held to publishers’ behavioral norms, which include privacy standards.167
The notion that plaintiffs might attempt to hold speech platforms accountable as “publishers” has led courts and litigants to invest tremendous effort in ascertaining what it means to be a publisher. The Fourth Circuit in Zeran initiated this focus by holding “lawsuits seeking to hold a service provider liable for its exercise of a publisher’s traditional editorial functions – such as deciding whether to publish, withdraw, postpone or alter content – are barred.”168 Since then, courts deciding whether to immunize speech platforms’ content moderation practices from suit typically frame the issue as whether the content moderation practice amounts to a publisher’s “traditional editorial functions.” For instance, in Levitt v. Yelp!, Inc., the Northern District of California held that Yelp’s manipulation of review pages, in violation of its representations to users, was immunized from state unfair or fraudulent business practices law because the act was “editorial,” even if it was malicious.169
As Part III.D examines in greater depth, the Supreme Court in Gonzalez passed on the chance to decide whether platforms’ algorithmic content recommendation systems are part of a publisher’s traditional editorial functions or are the platforms’ own speech.170 If the Court had held that algorithmic content recommendation constitutes a publisher’s editorial function, it would have perpetuated Section 230’s dissonant treatment of platforms as publishers (i.e., that they may act like publishers, but the public should not expect them to comply with publishers’ behavioral norms). Alternatively, holding that algorithmic content recommendation falls outside a publisher’s editorial function would have disclaimed speech platforms’ likeness to publishers, but otherwise left their role uncertain.
Scholars and lawmakers attempt to resolve Section 230’s role indeterminacy by debating who has the correct perspective on what speech platforms are as a matter of empirical reality.171 But, in the background, speech platforms put into practice their self-understanding that they moderate content as “businesses.” Professor Kate
165. N.Y. CIV. RIGHTS LAW §§ 50–51 (2023), https://www.nysenate.gov/legislation/laws/CVR/A5 [https://perma.cc/LW3G-6J9W].
166. 651 F. Supp. 3d 657, 665-68 (S.D.N.Y. Jan. 17, 2023).
167. See supra notes 151-154 and accompanying text.
168. Zeran v. AOL, Inc., 129 F.3d 327, 330 (4th Cir. 1997).
169. Nos. C10-1321, C10-2351, 2011 WL 5079526, at *6–7 (N.D. Cal. Oct. 26, 2011),
aff’d, 765 F.3d 1123 (9th Cir. 2014).
170. Gonzalez v. Google LLC, 598 U.S. 617 (2023); see infra notes 173-185 and
accompanying text.
171. See supra notes 47-73 and accompanying text.
Klonick writes, “[T]he primary reason companies take down obscene and violent material is the threat that allowing such material poses to potential profits based in advertising revenue. Platforms’ ‘sense of the bottom-line benefits of addressing hate speech can be shaped by consumers’ . . . expectations.’”172 That is to say, speech platforms typically engage in content moderation to the extent it coincides with their profit motives and, simultaneously, they regard their users as consumers who reveal their preferences in the amount of time they engage with the platform.173
If speech platforms are just “businesses” that moderate content if and when it serves their profit interests, the walls close in on the possibility of any content moderation regulation. That is because the commercial speech doctrine, as it has developed since the 1970s, affords increasing First Amendment protection to business-to-consumer speech.174 If framed as business-to-consumer speech, the practice of content moderation would be insulated from governmental regulation.175 In Virginia State Board of Pharmacy v. Virginia Citizens Consumer Council, the Court held that commercial speech warrants some protection from governmental regulation, on the grounds that “[i]t is a matter of public interest that [purchasing] decisions, in the aggregate, be intelligent and well informed. To this end, the free flow of commercial information is indispensable.”176 “[I]f it is indispensable to the proper allocation of resources in a free enterprise system, it is also indispensable to the formation of intelligent opinions as to how that system ought to be regulated or altered.”177
The Court applied this constitutionalized business-consumer relationship to digital information transactions in Sorrell v. IMS Health Inc.178 In that case, the Court held that a Vermont law forbidding the sale or use of records containing doctors’ prescribing practices for marketing purposes violated the First Amendment.179 It reasoned, “The State may not burden the speech of others in order to tilt public debate in a preferred direction. ‘The commercial marketplace, like other spheres of our social and cultural life, provides a forum where ideas and information flourish.’”180 Whereas earlier the Court understood commercial speech to serve consumers’ interest in the free flow of information, the Court now seems to conceive that businesses have a constitutionally protected interest in speaking – and, by extension, in engaging in commercial transactions that support their ability to speak.181
172. Kate Klonick, The New Governors: The People, Rules, and Processes Governing Online Speech, 131 HARV. L. REV. 1598, 1627 (2018) (quoting Danielle Keats Citron & Helen Norton, Intermediaries and Hate Speech: Fostering Digital Citizenship for Our Information Age, 91 B.U. L. REV. 1435, 1454 n.113 (2011)).
173. See id.
174. See infra notes 176–81 and accompanying text.
175. See Armijo, supra note 45, at 1228.
176. Va. State Bd. of Pharmacy v. Va. Citizens Consumer Council, 425 U.S. 748, 765
(1976).
177. Id.
178. 564 U.S. 552 (2011).
179. Id. at 557.
180. Id. at 578–79 (quoting Edenfield v. Fane, 507 U.S. 761, 767 (1993)).
181. See Amy Kapczynski, The Lochnerized First Amendment and the FDA: Toward a More Democratic Political Economy, 118 COLUM. L. REV. ONLINE 179, 200–01 (2018);
Some scholars and courts assert “content moderation policies are protected speech”182 based on an implied business-consumer framing, but that is not a foregone
conclusion. Though this Part described the CDA as a legal role-scripting failure,
there is some upside to the role indeterminacy it engendered. Whether speech
platforms ought to be regarded as “businesses” is not settled as a matter of law or
societal expectation. The range of roles scholars and courts invoke when they
evaluate platforms’ liability for content moderation reveals as much. The First
Amendment is an obstacle to content moderation regulation, but that obstacle is more
like a hurdle than a brick wall. The next Part of this Article examines two current
content moderation regulatory reforms – law’s second take at role-scripting – to direct scholars’ and lawmakers’ attention to important questions about whether they
establish desirable social roles for speech platforms and the public.
D. Gonzalez v. Google’s Residual Uncertainty on Speech Platforms as Publishers
In Gonzalez v. Google, the Supreme Court had the opportunity to consider whether speech platforms act as publishers when they recommend (algorithmically or otherwise) one user’s content to another.183 The family of Nohemi Gonzalez, an American killed in an ISIS terrorist attack in Paris, sued Google, Twitter, and Facebook for violating the Anti-Terrorism Act (ATA).184 They alleged that the speech platforms aided and abetted terrorism by recommending terrorism recruitment content to their users.185
The Ninth Circuit held that the conduct was immune from suit under Section 230 because “the duty that the plaintiff[s] allege[] the defendant violated derives from the defendant’s status or conduct as a ‘publisher or speaker’” of another user’s
Jedediah Purdy, Neoliberal Constitutionalism: Lochnerism for a New Economy, 77 L. & CONTEMP. PROBS. 195, 200–01 (2014).
182. Armijo, supra note 45, at 1228 (emphasis omitted). See also Eric Goldman & Jess Miers, Online Account Terminations/Content Removals and the Benefits of Internet Services Enforcing their House Rules, 1 J. OF FREE SPEECH L. 191, 192 (2021) (explaining that court
decisions reflect that platforms have the right to terminate user accounts); Johnson v. Twitter,
No. 18cecg00078 (Cal. Sup. Ct. June 6, 2018) (dismissing plaintiff’s claims Twitter’s
suspension of his account due to his political viewpoint violated the California Constitution,
the Unruh Act, and the UCL, based on application of Section 230 immunity and the First
Amendment); Taylor v. Twitter, A154973 (Cal. Ct. App. Aug. 17, 2018) (same); Federal
Agency of News LLC v. Facebook, Inc., 395 F. Supp. 3d 1295, 1309 (N.D. Cal. 2019)
(“Courts have rejected the notion that private corporations providing services via the internet
are public fora for purposes of the First Amendment.”); Tulsi Now, Inc. v. Google, LLC, No.
2:19-cv-06444, 2020 WL 4353686 (C.D. Cal. Mar. 3, 2020) (rejecting plaintiff’s claim Google
violated her First Amendment rights partly on the ground that its company self-regulation is in no way equivalent to government regulation); Wilson v. Twitter, No. 3:20-cv-00054, 2020
WL 3410349 (S.D.W.V. May 1, 2020) (dismissing plaintiff’s claim Twitter violated his First
Amendment rights based partly on the premise “Twitter is a publicly traded, multi-national
corporation and not an arm of federal or state government”).
183. Petition for Writ of Certiorari, at *i, Gonzalez v. Google LLC, No. 21-1333, 2022 WL
1050223 (Apr. 4, 2022).
184. See Gonzalez v. Google LLC, 2 F.4th 871, 880 (9th Cir. 2021); 18 U.S.C. § 2333.
185. Gonzalez, 2 F.4th at 881.
content.186 Two of the three judges on the panel (one concurring and the other partially concurring and partially dissenting) asserted that, if not bound by precedent, they would hold “that the term ‘publisher’ under section 230 reaches only traditional activities of publication and distribution – such as deciding whether to publish, withdraw, or alter content – and does not include activities that promote or recommend content or connect content users to each other.”187 The effect of the dissent’s preferred course would be to deny speech platforms Section 230 immunity for their content recommendation.188
The Supreme Court granted Gonzalez’s petition for certiorari but ultimately declined to opine on Section 230’s application to the platforms’ content recommendation.189 In its three-page per curiam opinion, it instead vacated the Ninth Circuit’s judgment and remanded to the lower court because Petitioners likely failed to state a claim under the ATA.190
The merits of Gonzalez hinged on an understanding of the societal expectations placed on publishers – their “traditional editorial functions” – and whether a speech platform’s content recommendations fall within them. Petitioners asserted “it strains the English language to say that in . . . recommending . . . writings to users . . . [one] is acting as the publisher of . . . information provided by another information content provider.”191 Recommending books is not inherently a publisher’s function because anyone can recommend a book. Respondents, on the other hand, equated user content recommendation to a decision to “display[] . . . information” traditionally associated with publishers.192 Amici added a range of interpretations of what publishers do and whether those expectations include content recommendation.193 The social role of “publisher” was at the very center of the Gonzalez case.
186. Id. at 891 (quoting Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1102 (9th Cir. 2009)).
187. Id. at 913 (Berzon, J., concurring); see also id. at 92121 (Gould, J., concurring in
part and dissenting in part) (“In short, I do not believe that Section 230 wholly immunizes a
social media company’s role as a channel of communication for terrorists in their recruiting
campaigns and as an intensifier of the violent and hatred-filled messages they convey.”).
188. Id. at 925 (Gould, J., concurring in part and dissenting in part) (“I would hold that
Section 230 does not immunize Google from liability for its content-generating algorithms
insofar as they develop a message to ISIS-interested users. The same reason supports lack of
immunity for the other Defendant social media companies’ use of their own algorithms,
procedures, users, friends, or other means to deliver similar content from ISIS to the users of
the social media.”).
189. Gonzalez v. Google LLC, 598 U.S. 617, 622 (2023) (per curiam). The Court reached
this holding in reliance on Twitter v. Taamneh, 598 U.S. 471 (2023), which it decided on the
same day. Id. at 478.
190. Id.
191. Petition for Writ of Certiorari, at *27, Gonzalez v. Google LLC, No. 21-1333, 2022
WL 1050223 (Apr. 4, 2022) (quoting Force v. Facebook, 934 F.3d 53, 76–77 (2d Cir. 2019)
(Katzmann, J., dissenting)) (internal quotations omitted) (alteration in original) (emphasis in
original).
192. See Brief in Opposition at *20, Gonzalez v. Google LLC, No. 21-1333, 2022 WL
2533118 (Jul. 5, 2022) (alteration omitted) (quoting id. at *31a).
193. Brief of Amici Curiae Common Sense Media and Frances Haugen in Support of
Petitioners, at 3, Gonzalez v. Google LLC, No. 21-1333 (arguing Google isn’t a publisher
because it “also monitors and tracks users’ online activities across websites and computer
There are two ways the Court could have decided, each scripting speech platforms’ social role differently. First, the Court could have decided content recommendation is not a behavioral norm attributed to “publishers” – and so speech platforms could face liability for the recommendations they make. As Judge Berzon put it in her concurrence, “The actions of the social network algorithms – assessing a user’s prior posts, friends, or viewing habits to recommend new content and connections – are more analogous to the actions of a direct marketer, matchmaker, or recruiter than to those of a publisher.”194 Second, the Court could have decided content recommendation falls within a “publisher’s” behavioral norms and, on that basis, immunized speech platforms from most claims that their recommendations violate the law. This would accord with the Chamber of Commerce’s view that publishers inherently recommend whatever content they choose to publish.195
Either way, the Court would not have rectified Section 230’s role-scripting failure. The first possible decision would have disclaimed any affinity between speech platforms and publishers (much in the way the Court in Reno v. ACLU reasoned interactive computer services are characteristically unlike broadcasters).196 But that decision would otherwise leave speech platforms’ role indeterminate. Other laws, perhaps the PACT Act or state-level common carriage laws, might then have supplied role-based behavioral expectations (as well as sanctions for deviations) to fill the gap.197 The second possible decision – finding speech platforms act as publishers – would have entrenched the role dissonance Section 230(c)(1) first generated. Part III.B explains that the CDA’s first liability shield suggests speech platforms might act like publishers, but it would be unreasonable for individuals to hold them to the behavioral norms that typically apply to publishers. There is not a clear path forward for law that follows this role construction, whether it is the application of existing law or newly promulgated laws that do not amend or repeal Section 230(c)(1).
devices to make predictions about the content users want to view.”); Brief of Fairplay as Amicus Curiae in Support of Petitioner Reynaldo Gonzalez et al., at 26–27, Gonzalez v. Google LLC, No. 21-1333 (explaining courts misconceive the term “publisher” when they “immuniz[e] companies for their own conduct in designing social media algorithms, products, and environments that affirmatively harm children.”) (emphasis omitted); Amicus Brief of Senator Josh Hawley in Support of Petitioners, at 4–8, Gonzalez v. Google LLC, No. 21-1333 (explaining distributor liability is distinct from publisher liability and it is not subject to Section 230 immunity); Brief for the States of Tennessee, et al. as Amici Curiae in Support of Petitioners, at 10–15, Gonzalez v. Google LLC, No. 21-1333 (arguing in favor of a narrow view of “publisher” immunity); Brief of Amicus Curiae of Professor Eric Goldman in Support of Respondent, passim, Gonzalez v. Google LLC, No. 21-1333 (construing speech platforms as “online publishers”); Brief of Washington Legal Foundation as Amicus Curiae Supporting Respondent, at 15–16, Gonzalez v. Google LLC, No. 21-1333 (contending Congress intended “publisher” to have its broad, everyday meaning); Brief of TechFreedom as Amicus Curiae in Support of Respondent, at 10, Gonzalez v. Google LLC, No. 21-1333 (“A platform, as it disseminates third-party content, is a publisher.”).
194. Gonzalez, 2 F.4th at 915 (Berzon, J., concurring).
195. Brief of Amicus Curiae Chamber of Commerce of the United States of America in
Support of Respondent, at 14, Gonzalez v. Google, 598 U.S. 617 (2023).
196. 521 U.S. 844, 868–70 (1997).
197. See supra Parts IV.A and IV.B.
In the wake of the Supreme Court’s (in)decision in Gonzalez v. Google, federal
and state lawmakers are left to articulate their own visions of speech platforms’ social
roles and regulate them accordingly.198 The Part that follows evaluates two current
content moderation regulatory reforms in terms of the social roles they construct for
speech platforms and the public.
IV. CONTENT MODERATION REGULATIONS’ ROLE SCRIPTS
How law talks about speech platforms matters, now more than ever. Public and
scholarly discourse on content moderation and its regulation reflect that there is no
settled understanding of what the role of “speech platform” entails, what constitutes
“good” or “normal” content moderation, and what interests individuals might
reasonably have in a content moderation relationship. Deep divisions and heated
arguments over who has the correct perspective owe in part to Section 230’s failure
to articulate a cohesive vision of whom it regulates and the public it serves. Section
230 fostered uncertainty as to whether speech platforms should act as publishers,
speech governors, or common carriers and serve public interests associated with each
of these roles. Section 230 was a role-scripting failure, but lawmakers – especially the Supreme Court – now have another opportunity to script speech platforms’ social
role as they moderate users’ content. The roles law chooses today may help the public
and legal authorities understand whether Elon Musk’s content moderation decisions
are despotic acts of censorship, discrimination on the basis of users’ viewpoints, or
editorial judgments.
This Part examines two current content moderation regulatory reforms in terms
of the social roles they would construct for speech platforms and individuals: the
PACT Act and Protect Speech Act in Congress199 and Texas and Florida common carriage laws,200 as well as the state laws’ judicial review. It surfaces their role
constructions by interrogating the messages they express about whom they regulate
and whom they protect. These include statements about the governed relationship’s
legal attributes (e.g., control, dependency) and parties’ relevant attributes and
interests (e.g., participation, respect), as well as role characteristics implied from the
rights, duties, behavioral constraints, and entitlements that attach to those who meet
expressed attributes. It then examines how each reform’s social roles would guide
behavior beyond explicit legal requirements and direct future legal reform.
This Part does not aim to settle debate over whether speech platforms, as they
exist today, are more “like” speech governors, common carriers, or cable operators,
etc. Rather, it starts from the position that the decisions lawmakers make today about
how to regulate content moderation articulate a vision of what speech platforms
should be. Content moderation regulations are expressive and norm-setting. This Part
instead raises questions lawmakers, scholars, and the public ought to answer
198. See supra note 189.
199. Protect Speech Act, H.R. 3827, 117th Cong. (2021); PACT Act, S. 4066, 116th Cong.
(2020).
200. H.B. 20, 87th Leg., 2d Called Sess. (Tex. 2021); S.B. 7072, 123d Leg., Reg. Sess.
(Fla. 2021).
regarding whether these reforms support desirable role-based behavioral norms
within the content moderation relationship.201
A. The PACT Act & The Protect Speech Act: Speech Platforms as Speech Governors
The speech governance narrative that features prominently in scholarly discourse
and public accounts of Musk’s Twitter takeover is perhaps best reflected in two bills
before Congress: the PACT Act and the Protect Speech Act. These bills cast speech
platforms as administrative speech governors and individuals as speakers and
listeners in a democratic society.202
The Protect Speech Act, for one, would require speech platforms that wish to use
Section 230 as a defense to state publicly, in detail, their content moderation rules,
comply with those rules, and provide a rationale and an opportunity for the person to
respond when they restrict access to a person’s content.203 The PACT Act requires the same and also requires speech platforms to create a complaint-filing system that allows individuals to file and track complaints and appeals of platforms’ content moderation decisions.204 These proposals would afford what Citron calls “technological due process,” which provides individuals with processes analogous to those required of a government whenever it deprives a person of life, liberty, or property.205 Those processes include providing individuals with notice, the opportunity to be heard, and an impartial tribunal.206
201. A word of caution is due. It is not possible for law to predict with great clarity the
direction social meaning will take as law seeks to influence norms, nor is it possible for any
human to do the same. This Part evaluates the societal expectations each reform’s choice of social roles supports if society comes to accept and apply these roles to the content moderation
relationship. This hypothetical posture is necessary because society may reject law’s choice
of social roles or redefine them substantially in unpredictable ways. Still, it is helpful to
consider the societal expectations each role-relationship supports because of law’s greater
likelihood to influence social roles for relationships in which they are not yet settled.
202. These bills seem to draw from scholars’ work construing content moderation as a form of speech governance. See Klonick, supra note 172, at 1603 (arguing content moderation governs users’ speech); Jack M. Balkin, The First Amendment in the Second Gilded Age, 66 BUFFALO L. REV. 979, 997 (2018) (agreeing with Kate Klonick); Jack M. Balkin, Free Speech Is a Triangle, 118 COLUM. L. REV. 2011, 2021–22 (2019) (explaining online platforms govern their spaces in multiple ways, including content moderation) [hereinafter Balkin, Triangle]; Alina Tugend, Barred from Facebook, and Wondering Why, N.Y. TIMES, Sept. 20, 2014, at B5 (describing the effect Facebook account termination has on people’s lives); TARLETON GILLESPIE, CUSTODIANS OF THE INTERNET (2018) (describing online intermediaries as exercising a governance function on the internet); Rebecca Tushnet, Content Moderation in an Age of Extremes, 10 CASE W. RES. J.L. TECH. & INTERNET 1, 14 (2019) (acknowledging speech platforms’ likeness to governors).
203. See Protect Speech Act, H.R. 3827, 117th Cong. (2021).
204. See PACT Act, S. 4066, 116th Cong. (2020).
205. See Danielle Keats Citron, Technological Due Process, 85 WASH. U. L. REV. 1249, 1305–13 (2008).
206. See id. at 1278–88 (citing Mullane v. Cent. Hanover Bank, 339 U.S. 306, 313 (1950)).
Not all speech governors are treated, or are expected to act, like governments. Churches, private universities, and the military govern speech in different ways and to different ends. Expectations about how each should be responsive (or non-responsive) to those they govern depend on the source of their legitimacy.207 Speech
is governed by churches to serve a higher power; by private universities in pursuit of
knowledge creation and, more recently, a welcoming environment for individuals of
diverse backgrounds; and by the military to maintain good order and discipline in
service of national security.208 Law does not require due process of these organizations, nor could it reasonably, because they regulate speech within relationships that do not require democratic legitimacy.209
The PACT Act and the Protect Speech Act thus seem to construe speech platforms
and their users to be in a quasi-governmental relationship, a relationship compatible with democratic processes. That is because they react to speech platforms’ governance of speech in the digital public sphere.210
Consider, for example,
Facebook’s update to its content moderation policy to ban misinformation about
COVID-19 and the consequent removal of twenty million pieces of rule-violating
content, or its long-standing, highly specific restrictions of certain forms of nudity
and sexual content.211
When Facebook promulgates and enforces these rules, it
defines how individuals may engage in public discourse on political matters and in cultural development more broadly.
207. See generally ROBERT C. POST, DEMOCRACY, EXPERTISE, AND ACADEMIC FREEDOM
(2012) (explaining that different institutions, such as religious organizations, universities, and
the military, regulate speech differently because of their different underlying purposes and
objectives).
208. See generally id.
209. See generally id.
210. The kinds of harm these bills address, such as a lack of accountability and procedure,
reflect their roots in academic discourse on the digital public sphere. See Balkin, Triangle,
supra note 202, at 2012, 2041 (“Social media companies and search engines have social and
moral obligations to the public because they perform three connected public services. First,
they facilitate public participation in art, politics, and culture. Second, they organize public
conversation so that people can easily find and communicate with each other. Third,
they curate public opinion by providing individualized feeds and search results, and by
enforcing civility norms through their terms-of-service obligations and community
guidelines.”); Jack M. Balkin, Free Speech in the Algorithmic Society: Big Data, Private
Governance, and New School Speech Regulation, 51 U.C. DAVIS L. REV. 1149, 1197 (2018) (“The more that end-users view businesses as governors, or as special-purpose sovereigns, the more end-users will expect – and demand – that these companies should conform to the basic obligations of governors towards those they govern. These obligations include procedural fairness in handling complaints and applying sanctions, notice, transparency, reasoned explanations, consistency, and conformity to rule of law values – the ‘law’ in this case being the publicly stated norms and policies of the company.”).
211. COVID-19 and Vaccine Policy Updates & Protections, FACEBOOK HELP CTR., https://www.facebook.com/help/230764881494641 [https://perma.cc/2NBH-D7MQ?type=image]; Monika Bickert, How We’re Taking Action Against Vaccine Misinformation Superspreaders, META (Aug. 18, 2021), https://about.fb.com/news/2021/08/taking-action-against-vaccine-misinformation-superspreaders/ [perma.cc/B6MM-TFXR]; Adult Nudity and Sexual Activity, Objectionable Content, Community Standards, META, https://transparency.fb.com/policies/community-standards/adult-nudity-sexual-activity/ [https://perma.cc/BDJ2-W9JU].
Its content moderation policies and practices
shape the digital public sphere much like public law would.
The proposals thus cast platforms as administrative speech governors, which are
expected to govern public speech democratically in the public interest. Accordingly,
they should make decisions about permissible and impermissible speech, but those
decisions must meet citizens’ needs as speakers and listeners in a democratic society.
Speakers’ and listeners’ interests differ. Speakers are expected to be interested primarily in individual and collective autonomy, i.e., the ability to express their own views in an overall effort to shape law that is responsive to the collective public will.212
Listeners, on the other hand, desire access to speech reflecting a wide array of diverse viewpoints, but also a body of common facts to ground the public debate, and, historically, limits on public speech that defies public morals, like obscenity.213
The role-relationship between speech governors and democratic
speakers/listeners derives from the historic part the FCC played regulating speech in
the public sphere. Congress mandated that the FCC promulgate broadcasting rules and regulations “specifically directed to consider the demands of the public.”214
Historically, the FCC took up this mandate by promulgating the “fairness doctrine,”
which required broadcasters to cover controversial public issues and afford speakers the opportunity to present contrasting viewpoints,215 by requiring broadcasters to afford a certain amount of airtime to educational programming,216 and by limiting when indecent and profane content could air.217 Moreover, the FCC, like other administrative agencies, must follow procedural requirements, including public notice and comment on proposed rules.218
These obligations ensure the FCC, as a speech governor, fulfills the requirements of
democratic accountability.
Placing speech platforms and individuals in a speech governance relationship
helps explain a number of contemporary problems, like platforms’ circulation of misinformation, promotion of politically divisive content, and filter bubbles.
212. See generally DANIELLE KEATS CITRON, HATE CRIMES IN CYBERSPACE (2014)
(discussing speakers’ autonomy interests in the context of how law can regulate online hate
crimes); ROBERT C. POST, CONSTITUTIONAL DOMAINS: DEMOCRACY, COMMUNITY, MANAGEMENT (1995) (explaining the normative foundations of First Amendment protections for speakers); POST, supra note 207 (describing the importance of individual and collective
autonomy for democratic functioning).
213. See Jane Bambauer, Is Data Speech?, 66 STAN. L. REV. 57, 59 (2014) (describing listeners’ interests in receiving “raw facts”); C. EDWIN BAKER, MEDIA, MARKETS, AND DEMOCRACY (2001) (examining liberal and republican democratic theories on the public’s need to access viewpoints that accord with their own and information that establishes common ground among diverse individuals); POST, supra note 207 (explaining the importance of expertise in providing the public with facts that allow them to pursue a common good); FCC v. Pacifica Found., 438 U.S. 726, 727 (1978) (situating the FCC’s obscenity restrictions in its authority to regulate the airwaves in the public interest).
214. See Red Lion Broad. Co. v. FCC, 395 U.S. 367, 380 (1969) (citing 47 U.S.C. §§
307(a), 309(a)).
215. See Report Concerning General Fairness Doctrine Obligations of Broadcast
Licensees, 102 F.C.C. 2d 143, 146 (1985); Syracuse Peace Council v. FCC, 867 F.2d 654, 655
(D.C. Cir. 1989).
216. See 61 Fed. Reg. 43981 (Aug. 27, 1996) (codified at 47 CFR §§ 73.671, 73.673).
217. 47 CFR § 73.3999.
218. See Administrative Procedure Act, 5 U.S.C. §§ 551–559.
Each is
inappropriate because it undermines listeners’ interests in accessing diverse
viewpoints and common facts. Platforms as speech governors also act
inappropriately when they make “black box” content moderation decisions, either by
relying on algorithmic decision-making that obscures the basis for a content
moderation decision or by following “secret” internal rules that diverge from those
made public.219
Overall, the speech governance relationship directs speech platforms
to moderate content in the public interest (as opposed to platforms’ profit interest) in
a manner that comports with public morals and is responsive to the will of those
governed. It also guides individuals to treat speech platforms as rightful governors: to accept the legitimacy of their content moderation authority and to voice their dissent to platforms’ policies and decisions.
The specific platform behavioral expectations this relationship could support may
fluctuate dramatically as society evolves and politics unfold. First, speakers’ and
listeners’ interests point in different directions, and they can easily come into
conflict. One can imagine a Facebook user who wants to post her belief that 2020
presidential election workers stuffed fraudulent ballots into suitcases and another
user who wants that post to be removed as misinformation. The first user acts as a
speaker, interested in expressing her political views. The second acts as a listener,
concerned with their access to shared facts. Platforms as speech governors are
expected to act in the public interest. That means they will have to make difficult
decisions about what the public interest requires that some will undoubtedly disagree
with. Second, behavioral expectations down the line might follow a number of tracks.
Perhaps individuals will expect platforms to follow an updated “fairness rule,” such
that they must include differing viewpoints on controversial matters. In a sense,
common carriage proposals already point in that direction.220
Or they might expect
platforms to be interoperable with smaller rivals, to support public access to diverse
moderation regimes. Though substantive expectations might vary widely, they share
a baseline understanding that platforms have the authority to regulate speech in
general.
Reforms that place platforms and individuals into a speech-governance
relationship may start out fairly limited in scope, along the lines of the Protect Speech
Act and the PACT Act. These laws would only tweak content moderation at the
margins. They largely morph what prominent speech platforms already do into a set
of legal requirements.221
However, law that adopts this relationship could justify both
broader and more extensive regulation down the line. As Professor Rebecca Tushnet
puts it, “if what we are talking about are truly governance structures, why should citizens of a democracy accept anything other than actual democracy, either representative or otherwise, in the regulation of these spaces?”222
219. Klonick, supra note 172, at 1631–34.
220. See infra Part IV.B.
221. Compare Klonick, supra note 172, at 1630–49 (demonstrating that large speech
platforms, including but not limited to Facebook, already engage in numerous, bureaucratic
internal and external governance mechanisms when moderating content, likely largely
compliant with the Acts’ requirements), with supra Part I.A (describing the Acts’
requirements).
222. Tushnet, supra note 202, at 16.
A speech governance relationship could justify an explicit legal obligation on platforms to
moderate content in the public interest. It could also support reforms that would
impose a wider range of democratic governance features, such as an Administrative Procedure Act with which platforms must comply if they seek Section 230’s coverage,223 or even more dramatic forms of public control, like binding user-initiated ballots on content moderation policies.224
The speech governance relationship departs significantly from the business-
consumer relationship that frames platforms’ approach to content moderation. It
engenders the expectation that platforms regulate speech as a public service and,
accordingly, they should make content moderation decisions in the public interest,
rather than in their profit interest, and be subject to some form of public control. It
would be unsurprising for public-interest content moderation decisions to require
platforms to sacrifice some profit; decisions about politically divisive issues will
alienate some users and advertisers. But the cost to platforms (even if ultimately
passed on to consumers) is not a primary concern of regulation that responds to a
speech governance relationship. Instead, it aims to secure a public sphere that can
sustain democratic government.
On the surface, additional due process might sound inherently good. It seems to
empower individuals, in a general sense, by affording them rights against an
otherwise more powerful relationship counterpart. But due process rights are not a
given for all power-asymmetrical relationships: they must be justified based on the
roles participants play within that relationship. The PACT Act and the Protect Speech
Act thus depend on casting speech platforms as speech governors and individuals as
democratic speakers and listeners. They send a message that speech platforms ought
to be speech governors and individuals should regard them as such. Whether these
social roles are good is far from inherent. Rather, lawmakers and the public should
consider whether it’s desirable for platforms to engage in governance and whether
they should be charged with difficult decisions about how to moderate content in the
public interest. Likewise, they should scrutinize whether it’s desirable for individuals
to respect platforms’ sovereignty over public discourse or to participate in content
moderation rulemaking.
B. Texas & Florida Platform Common Carriage Laws: Speech Platforms as
Common Carriers (Maybe?)
Concern over speech platforms’ power to limit individuals’ ability to speak on
their platforms has animated conservative lawmakers, legal scholars, and even
Justice Clarence Thomas to suggest law should regulate speech platforms as
common carriers.225
Recently promulgated laws in Texas and Florida attempt to do as much.
223. See Van Loo, supra note 49, at 875–86 (proposing in detail a set of “Platform Federal
Rules” akin to the Administrative Procedure Act).
224. See Bloch-Wehba, supra note 19, at 75–76.
225. See Matthew Feeney, Opinion, Are Social Media Companies Common Carriers?, CATO INST. (May 24, 2021), https://www.cato.org/blog/are-social-media-companies-common-carriers [https://perma.cc/G5RY-HGAS] (“There is a widespread mistrust of household-name social media companies among many American conservatives, who allege that Big Tech is institutionally biased against conservatives and seeking to stifle their content.”); David Yost, Let’s Make Google a Public Good, N.Y. TIMES (July 7, 2021), https://www.nytimes.com/2021/07/07/opinion/google-utility-antitrust-technology.html [https://perma.cc/B5J9-FDF8] (describing Attorney General Yost’s reasoning for declaring Google a public utility); Biden v. Knight First Am. Inst. at Columbia Univ., 141 S. Ct. 1220, 1222–23 (2021) (Thomas, J., concurring) (describing how law may set limits on social media companies’ right to exclude users by implementing common carriage regulation).
Texas’s H.B. 20 prohibits social media companies from banning users
based on their “viewpoint,” whereas Florida’s S.B. 7072 penalizes social media
platforms for removing or obstructing access to content posted by or about political
candidates.226
But, as written, the statutes are murky on whether they cast speech
platforms as discriminatory common carriers or censorial speech governors.
The two statutes differ on finer points, but H.B. 20 demonstrates the role
confusion that plagues both. The Texas legislature writes, in its findings, “social
media platforms function as common carriers, are affected with a public interest, are
central public forums for public debate, and have enjoyed governmental support in
the United States.”227 These findings generally accord with the criteria Yoo identifies as historically relevant in designating an entity as a common carrier.228
But the
statute’s core conduct proscription bears little connection to common carriage. It
provides: “A social media platform may not censor a user . . . based on: (1) the viewpoint of the user or another person; (2) the viewpoint represented in the user’s expression or another person’s expression; or (3) a user’s geographic location in this state . . . .”229 The statute also requires platforms to maintain complaint-and-appeal systems for aggrieved users.230
These behavioral prohibitions and mandates speak
far more to a relationship of governance demanding democratic legitimacy.
Censorship is generally problematic when done by a state,231 and a complaint-and-appeal system mirrors the due process rights the PACT Act provides.
A common carriage relationship, by contrast, is concerned chiefly with
nondiscriminatory service on fair terms. Common carriage duties trace to the
emergence of railroads as a dominant mode of transportation of both people and
goods.232 At the common law, railroads were bound by a “duty to serve at just and reasonable prices,” not just in the case of a monopoly, but because railroads “were affected by [the] public interest – in that all members of the public needed to rely on their availability.”233
226. H.B. 20, 87th Leg., 2d Called Sess. (Tex. 2021); S.B. 7072, 2021 Leg., Reg. Sess.
(Fla. 2021). Other efforts to regulate speech platforms as common carriers include Ohio
Attorney General Dave Yost’s declaration, via lawsuit, that Google is a common carrier, see
Press Release, Dave Yost, Ohio Att’y Gen., AG Yost Files Landmark Lawsuit to Declare
Google a Public Utility (June 8, 2021), https://www.ohioattorneygeneral.gov/Media/News-
Releases/June-2021/AG-Yost-Files-Landmark-Lawsuit-to-Declare-Google-a
[https://perma.cc/5TPR-VR8H], and the 21st Century Free Speech Act in Congress, see 21st
Century FREE Speech Act, S. 1384, 117th Cong. (2021). The bill would require platforms, as
common carriers, to provide their service to anyone without discriminating against individual
users or classes of users, or on the basis of political or religious affiliation or region. Id.
227. Tex. H.B. 20 § 1(3).
228. See Yoo, supra note 20 and accompanying text.
229. Tex. H.B. 20 § 143A.002.
230. See id. §§ 120.101–04.
231. See generally CENSORSHIP AND SILENCING: PRACTICES OF CULTURAL REGULATION
(Robert C. Post ed., 1998) (describing various cultural practices of expressive control).
232. See James B. Speta, A Common Carrier Approach to Internet Interconnection, 54 FED. COMM. L.J. 225, 253–58 (2002).
Public interest concerns also animated the Interstate Commerce
Act of 1887 (ICA), the first federal legislation imposing common carrier duties,
which imposed requirements of just and reasonable rates and conditions of service,
nondiscrimination, and tariff filing, as well as a mild prohibition on price discrimination.234 Later amendments to the ICA required railroads to interconnect their lines as well.235
The Communications Act of 1934 extended common carriage duties to companies
engaged in interstate communication by wire, such as telephone and telegraph
companies.236 If these companies “hold[] [themselves] out indiscriminately to a class of persons for service,” they have a duty to serve the public.237 Much like the railroads before them, telephone companies’ duty to serve first required just and reasonable rates and nondiscrimination and later expanded to interconnection with other carriers.238
The common carriage components of the Texas and Florida statutes attempt to
translate the historic relationship between telecommunications carriers and
subscribers to the platform content moderation context. They cast speech platforms as
common carriers that principally serve to transport information between individuals,
much like telephones and, by some accounts, the internet’s backbone. Consider, for
instance, Facebook’s display of a user’s Friends’ posts in the user’s Newsfeed, or
Google’s transmission of search results that respond to a user query. A common
carriage relationship construes a platform’s information transport services as
essential because the platform is either the sole provider in its industry or one of
few.239
Though Facebook’s and Google’s respective market shares are currently in
dispute,240 it would be uncontroversial to say there are relatively few alternatives to each.241
The logic of common carriage regulation suggests that because platforms provide
essential information transport services, they are bound by a duty to serve.242 That is, they are expected to serve all who seek their service, on just and reasonable terms, without discrimination.
233. Id. at 256–57.
234. Id. at 258–59.
235. Id. at 260.
236. The ICA brought telecommunications into its purview prior to the Communications Act’s promulgation. Id. at 261–63.
237. Id. at 264.
238. Id. at 263–64.
239. See Nikolas Guggenberger, Essential Platforms, 24 STAN. TECH. L. REV. 237, 252–87 (2021).
240. See Order Granting Motion to Dismiss, FTC v. Facebook, Inc., 560 F. Supp. 3d 1
(D.D.C. 2021).
241. Popular estimates suggest Facebook, Twitter, Pinterest, and YouTube account for
roughly 98% of worldwide social media use and Google, Bing, Baidu, and Yahoo! comprise
roughly 98% of worldwide search engine use. See Social Media Stats Worldwide,
STATCOUNTER (Nov. 2021), https://gs.statcounter.com/social-media-stats
[https://perma.cc/CM5U-ABVV]; Search Engine Market Share Worldwide,
STATCOUNTER
(Nov. 2021), https://gs.statcounter.com/search-engine-market-share [https://perma.cc/WZV8-
3CEH].
242. Feeney, supra note 225.
Moreover, they are expected to not regulate the morals or
conduct of those they serve: “Public utilities and common carriers [such as telephone
companies] are not the censors of public or private morals, nor are they authorized
or required to . . . regulate the public or private conduct of those who seek service at
their hands.”243
Some may push back on the notion that “common carrier” is a social role at all
and not just a legal category. But, as Volokh points out, email systems are not treated
as common carriers as a matter of law, and they are technically able to screen
messages based on the viewpoints they contain.244
Even so, it would surprise people
if their email provider screened emails in this way, because “an e-mail system’s ‘role
in transmitting e-mail is akin to that of a telephone company, which one neither wants
nor expects to superintend the content of its subscribers’ conversations.’”245 Individuals commonly regard email providers as playing the role of “common carrier” – that is to say, they expect providers to fulfill a common carrier’s behavior norms – though law does not require it.
A common carriage relationship casts individuals as subscribers interested in
receiving platforms’ information transport services (i.e., having their messages and
desired content faithfully transmitted) for fair prices and other fair terms of service. Subscribers’ interest is subtly different from a typical consumer’s interest in price and quality: the former is concerned with fairness. Fair prices and terms of service do not hinge on market prices and quality levels, but rather on the notion that essential
transportation providers must serve the public interest. Fair prices and terms are thus
tied to the particular public interests at stake given the provider’s service offering.
Whereas it might be unfair for a railroad to provide preferential transport to corn
over wheat, in the case of speech platforms, it might be unfair to promote user content
that supports military withdrawal from the Middle East over user content that
supports military expansion.
The common carriage relationship also depicts individuals as dependent on
platforms because they provide information transport services that are necessary for
daily life.246
One can imagine how difficult it would be to connect with others online
without social media platforms like Facebook or find information across webpages
without a search engine like Google. The law’s expectation that individuals are
dependent on platforms helps normalize this dependency.
A common carrier-subscriber relationship helps explain the harm individuals
perceive when they are kicked off a speech platform or their post is removed because
of its content. That is because platforms act inappropriately if they refuse to serve a
particular person or group for reasons that are unfair or discriminatory. What counts
as unfair or discriminatory is open to interpretation, contest, and further elaboration
by society. Evidently, conservatives believe it is unfair for platforms to treat content
differently on the basis of the political or religious viewpoint it expresses.247 In their view, notions of nondiscrimination derived from First Amendment doctrine that condemns governmental restrictions of public discourse248 should apply to speech platforms’ transmission of user content.249
243. Volokh, supra note 20, at 386 (quoting Pa. Pubs. Inc. v. Pa. Pub. Util. Comm’n, 36 A.2d 777, 781 (Pa. 1944)).
244. Id. at 386.
245. Id. (quoting Lunney v. Prodigy Servs. Co., 723 N.E.2d 539, 542 (N.Y. 1999)).
246. Guggenberger, supra note 239, at 252–76.
247. See 21st Century FREE Speech Act, S. 1384, 117th Cong. (2021).
On the other hand, LGBT activists assert
it is unfair for platforms to take down content that expresses LGBT viewpoints on the basis that it violates rules that restrict sexual material.250 Their conception of nondiscrimination derives from platforms’ intermediation of cultural exchange and keys off civil rights laws that protect progressive norms from orthodox repression.251
Whether either of these forms of content moderation is unfair or discriminatory will
depend on the rough consensus that emerges from political debate.
Whatever the result, grounding the content moderation relationship in
expectations of fairness, nondiscrimination, and a duty to serve allows concrete
platform behavioral expectations to fluctuate along with society, and expand or
contract accordingly. It also limits platforms’ discretion to determine what qualifies
as fair or discriminatory; platforms are bound by public political will.
If law responds to a common-carriage relationship, a range of reforms may
proceed that support subscribers’ interest in fairness or derive from platforms’ duty
to serve the general public. Law may scrutinize platforms’ terms of service beyond
their content moderation policies. Perhaps it is unfair to monetize individuals’
personal information in exchange for providing information transport services, to
bind them to mandatory arbitration, or to prohibit them from using “web spiders” to
automatically gather information from the platform.252 Platforms’ duty to serve could also support mandatory interconnection with other platforms, to the extent interconnection would afford more of the public access to the platform.253
248. See Robert Post, Meiklejohn’s Mistake: Individual Autonomy and the Reform of
Public Discourse, 64 UNIV. COLO. L. REV. 1109, 1115–16 (1993) (examining the normative
basis for the First Amendment to prohibit governmental viewpoint-based restrictions on
speech).
249. See Richard Epstein, Should Platforms Be Treated as Common Carriers? It Depends.,
American Enterprise Institute (July 2022), https://platforms.aei.org/should-platforms-be-
treated-as-common-carriers-it-depends/ [https://perma.cc/E33T-MZAU]; Philip Hamburger,
Setting the Record Straight on Reining in Big Tech, New Civil Liberties Alliance (Aug. 5,
2021), https://nclalegal.org/2021/08/setting-the-record-straight-on-reining-in-big-tech/
[https://perma.cc/YE7J-39QS]. That’s to say, conservatives don’t derive their claim from
existing nondiscrimination law, which (for the most part) doesn’t consider political ideology
a “protected class.” See Craig R. Senn, Ending Political Discrimination in the Workplace, 87 MO. L. REV. 365, 373–405 (2022)
(cataloguing disparities at the federal and state level, and between public and private sector
employees, in terms of whether employment laws protect them from discrimination on the
basis of their political viewpoints).
250. Greg Bensinger & Reed Albergotti, YouTube Discriminates Against LGBT Content
by Unfairly Culling It, Suit Alleges, WASH. POST (Aug. 14, 2019, 4:21 PM),
https://www.washingtonpost.com/technology/2019/08/14/youtube-discriminates-against-
lgbt-content-by-unfairly-culling-it-suit-alleges/ [https://perma.cc/GJU2-33XF]; Oliver L. Haimson, Daniel Delmonaco, Peipei Nie & Andrea Wegner, Disproportionate Removals and
Differing Content Moderation Experiences for Conservative, Transgender, and Black Social
Media Users: Marginalization and Moderation Gray Areas, at 5–6, Proc. ACM Hum.-
Comput. Interact. 5, CSCW2, Article 466 (Oct. 2021),
https://dl.acm.org/doi/pdf/10.1145/3479610 [https://perma.cc/48RZ-N95P].
251. Fagan, supra note 115, at 394.
252. See, e.g., Thomas E. Kadri, Platforms as Blackacres, 68 UCLA L. REV. 1184, 1186–87 & nn.7, 368, 369 (2022) (explaining how the Computer Fraud and Abuse Act may endow websites with too much authority to impose civil and criminal liability on entities that use automated tools, like web spiders, to gather information from them).
The overriding concerns of a law that responds to a common carriage relationship are
service to all on equal terms coupled with fair prices and conditions of carriage.
The state common carriage laws exhibit confusion about the roles they intend to
apply to speech platforms and individuals. This risks replicating the CDA’s legal
role-scripting failure and perpetuating public confusion about what speech platforms
are and how they should moderate content. Even taken in isolation, the common
carriage aspects of the state laws raise important questions about whether they
support a positive content moderation relationship.
Much like the speech governance relationship the PACT Act would establish, on the surface common carriage requirements sound appealing. Platforms’ duty to serve all, on fair terms, and without engaging in discrimination seems to advance the values of equity and public participation.
these duties rely for justification might be less appealing. Should individuals accept
dependence on a few platforms (rather than reject dependence and demand
government action that supports new competition)? What constitutes fair content
moderation terms and who should be empowered to decide? Does a duty to serve
mean platforms must interconnect with one another, even if it comes at a cost to
individuals’ privacy? These are some questions about common carriage’s role-based
behavioral expectations lawmakers and the public ought to consider when deciding
whether to support this type of content moderation reform.
C. The NetChoice Cases: Speech Platforms as Cable Operators
Shortly after the Texas and Florida common carriage laws were promulgated,
NetChoice, an industry group representing social media platforms, brought suit.254 It argued the laws were unconstitutional on the ground that they interfere with the
platforms’ exercise of “editorial judgment” through content moderation, which it
contended is speech protected by the First Amendment.255 The Fifth Circuit upheld Texas’s law, in part based on the court’s conclusion that platforms’ content moderation isn’t an exercise of “editorial judgment”256 protected by the First Amendment.257
253. See, e.g., Fiona Scott Morton, Opinion, Why ‘Breaking Up’ Big Tech Probably Won’t
Work, WASH. POST (July 16, 2019, 2:41 PM),
https://www.washingtonpost.com/opinions/2019/07/16/break-up-facebook-there-are-
smarter-ways-rein-big-tech/ [https://perma.cc/XV3Z-3VC7] (arguing law should mandate
platform interconnection rather than break up big technology companies).
254. See NetChoice, LLC v. Moody, 546 F. Supp. 3d 1082, 1084 (N.D. Fla. 2021), aff’d
in part, vacated in part, remanded sub nom. NetChoice, LLC v. Att’y Gen., Fla., 34 F.4th
1196 (11th Cir. 2022); NetChoice, LLC v. Paxton, 573 F. Supp. 3d 1092, 1101 (W.D. Tex.
2021), vacated and remanded sub nom. NetChoice, LLC v. Paxton, 49 F.4th 439 (5th Cir.
2022).
255. NetChoice v. Paxton, 573 F. Supp. 3d at 1103; NetChoice v. Moody, 546 F. Supp. 3d
at 1085.
256. The two opinions use the phrases “editorial judgment” and “editorial discretion”
interchangeably.
By contrast, the Eleventh Circuit rejected most of Florida’s law
based on its conclusion that speech platforms exercise “editorial judgment” much
like cable operators.258
Both courts addressed the pivotal question of whether platforms’ content moderation involves protected “editorial judgment” by evaluating platforms’ similarity to other roles, like newspapers, parades, and cable operators.259 The mode of reasoning went: the Supreme Court established that X role(s) engage in “editorial judgment” when they transmit speech, so if platforms are sufficiently like X role(s), their content moderation must also be a form of “editorial judgment.”260
The Fifth
Circuit dispensed with the three roles, finding they share common characteristics
(i.e., “carefully curat[ing]” their “choice of material”; and “select[ing] . . . content
before that content is hosted, published or disseminated”)261 and common behavioral expectations (i.e., “accept[ing] reputational and legal responsibility for the content [they] edit[]”)262 that platforms don’t fulfill. The Eleventh Circuit also analyzed whether platforms have a protectable speech interest by evaluating their likeness to those social roles, but it reached a different conclusion: “[S]ocial-media platforms should be treated more like cable operators, which retain their First Amendment right to exercise editorial discretion . . . .”263
The Eleventh Circuit asserted that the First Amendment protects a “private
entity’s choices about whether, to what extent, and in what manner it will disseminate
speech” because they are “editorial judgments.”264 The court seems to suggest that the First Amendment protects the dissemination of speech regardless of an entity’s social role. However, its reasoning conflicts with that assertion: to determine whether a particular act constitutes protected “editorial judgment,” the court distinguished between different entities on the basis of their roles. For example, law schools’ and shopping malls’ “choices about whether, to what extent, and in what
manner [they] will disseminate speech” do not qualify as “editorial judgments,” but
when those decisions are made by newspapers, parade organizers, or cable operators,
they do.265 Ultimately, the court regarded social media platforms like cable operators, because both are technically unlike broadcast media,266 and both refuse to carry certain content “not to propound a particular point of view.”267
257. NetChoice v. Paxton, 49 F.4th at 464–65.
258. NetChoice v. Att’y Gen., Fla., 34 F.4th at 1220.
259. NetChoice v. Paxton, 49 F.4th at 459–65; NetChoice v. Att’y Gen., Fla., 34 F.4th at 1203–23.
260. See NetChoice v. Paxton, 49 F.4th at 459–65; NetChoice v. Att’y Gen., Fla., 34 F.4th at 1203–23.
261. NetChoice v. Paxton, 49 F.4th at 459, 461, 464 (emphasis in original).
262. Id. at 464.
263. NetChoice v. Att’y Gen., Fla., 34 F.4th at 1220.
264. Id. at 1210.
265. Id. at 1210–21.
Though the Eleventh Circuit’s conclusion that law should treat platforms like
cable operators saved platforms from much of the Florida law (i.e., the provisions
the court held were “content-based”),268 its decision to equate platforms to cable
operators is hardly a boon for platforms. The court’s opinion elides the fact that cable
operators are regulated quite heavily on the basis of their social role. In fact, however paradoxical to the Eleventh Circuit’s decision, the FCC requires cable operators that allow a “legally qualified candidate for public office . . . to use its facilities” to afford “equal opportunities to all other candidates for that office to use its facilities,” without censoring that candidate’s material in any way.269
The FCC regulates cable operators in myriad ways in line with its view of their
social role; that is, its view that if cable operators hold “undue market power” they
would imperil “the availability of diverse views and information.”270 On that basis,
the FCC authorizes state governments to engage in subscription fee regulation in
some cases; requires cable operators to carry at least one noncommercial educational
station; requires cable operators to carry all content of the broadcast stations they
transmit (without alteration or deletion); requires cable operators to provide
subscribers certain parental control; specifies subscriber privacy protections; restricts
cable operators’ ownership of programming producers; and caps each cable
operator’s nationwide reach, among other things.271
The Copyright Act also requires
cable operators to remit compulsory license payments to content providers.272
In Turner Broadcasting System v. FCC, the Supreme Court upheld FCC “must
carry” rules, which required cable operators to set aside a certain number of channels
for over-the-air broadcasters who elect mandatory carriage.273
The Court refused to
hold the rules unconstitutional, on the grounds that the rules were content neutral and Congress found that cable operators had “undue market power” that threatened the viability of free broadcast television.274
It reached this holding despite its judgment
that “cable operators engage in and transmit speech, and they are entitled to the
protection of . . . the First Amendment.”275
266. NetChoice v. Att’y Gen., Fla., 34 F.4th at 1220. The Court relies on the Supreme
Court’s decisions in Turner Broad. Sys., Inc. v. FCC, 512 U.S. 622 (1994), and Reno v.
American Civil Liberties Union, 521 U.S. 844 (1997). Turner held cable operators shouldn’t
be subject to the less stringent First Amendment scrutiny that applies to broadcasters because cable lacks the “unique physical limitations” (i.e., the scarcity of frequencies) that characterize broadcast television. 512 U.S. at 637–39. The Supreme Court similarly distinguished the Internet from broadcast media in Reno. 521 U.S. at 844, 868–69.
267. NetChoice v. Att’y Gen., Fla., 34 F.4th at 1213 (quoting Hurley v. Irish-American Gay, Lesbian & Bisexual Group of Boston, 515 U.S. 557, 575 (1995)).
268. See id. at 1226.
269. 47 C.F.R. § 76.205 (2019).
270. Cable Television, FCC (June 15, 2021),
https://www.fcc.gov/media/engineering/cable-television [https://perma.cc/2AGQ-7H8R].
271. Id.
272. 17 U.S.C. § 111 (2018).
273. Turner Broadcasting System, Inc. v. FCC, 512 U.S. 622, 628–32 (1994).
274. See id. at 632–34.
275. Id. at 636–38.
The Eleventh Circuit’s decision that speech platforms are “like” cable operators
invites policymakers to query whether speech platforms’ “undue market power”
likewise risks the availability of diverse content the public requires. Likeness to cable
operators supports the expectation that speech platforms have a responsibility to
carry diverse content for the listening public’s benefit, and when they accumulate an
excess of market power, law is authorized to intervene to the extent it pursues that public interest.
That characterization of speech platforms’ social role enables a range of regulation
affecting content moderation and beyond. It could justify regulation not terribly
different from the Florida “common carriage” law: as for cable operators, a requirement that if a platform allows a political candidate to transmit over its services, it must afford other candidates for the same office equal opportunity. It
could support law that restricts content moderation to initial decisions to carry or not
carry a particular user and prohibits later deletion or alteration of that user’s content.
It could also support more stringent privacy protections and potentially compulsory
license payments to the users whose content the platforms profit from.
CONCLUSION
This Article argues the core project of content moderation regulation is legal role-
scripting. Whereas scholarly discourse commonly portrays law as role “taking” in
this space,276 this Article characterizes law as chiefly role “making.”277 The decisions
lawmakers make about how to regulate content moderation will likely form the basic
contours of what it means to be a speech platform, how one moderates content
appropriately, and the interests individuals reasonably have when they engage with
speech platforms.278 The relationship between social roles and law is not a line, with law merely reflecting existing roles, but a loop. When social roles are unsettled or uncertain, as in the content moderation relationship, law is especially poised to influence their preliminary, basic contours.279
Lawmakers can attempt to script social roles well for the content moderation
relationship by being conscientious about behavioral norms and legal pathways that
flow from their regulatory decisions. The final Part of this Article aimed to assist that
effort by surfacing difficult questions about the desirability of different reforms’ role
scripts and the online speech ecosystem they produce.280 Examining these contours
of content moderation regulations demands time and attention. This Article asserts it
is time well spent.
It is not a given that current regulatory reforms will rectify the CDA’s failure to
articulate cohesive and consistent roles within the content moderation relationship.281
Rather, if lawmakers are not conscientious about the social roles content moderation
regulations construct, they might replicate, entrench, or worsen the role uncertainty
the CDA first established.
276. See supra Part I.B.
277. See supra Parts I.C & II.B.
278. See supra Parts I.C & II.B.
279. See supra Part II.B.
280. See supra Part IV.
281. See supra Part III.C.
It is equally important to acknowledge that at least five distinct legal authorities
are in the process of articulating competing views of speech platforms’ role.
Congress is considering whether to hold speech platforms accountable as
administrative speech governors;282 Texas and Florida have attempted to categorize speech platforms as common carriers;283 the Eleventh Circuit espoused its view that speech platforms engage in “editorial judgment” akin to publishers;284 and the Supreme Court may wade into the debate on whether speech platforms act as publishers when it resolves the Fifth and Eleventh Circuits’ split over the constitutionality of the Texas and Florida laws.285
There is a strong prospect these legal authorities will simultaneously establish
conflicting roles for speech platforms and the public, and that is a problem. Indeed,
the Fifth and Eleventh Circuits have already clashed on whether speech platforms
should behave like “common carriers,” or like cable operators that exercise “editorial
judgment.”286 Texas’s common carrier relationship indicates platforms have a duty to serve all who seek their service, on just and reasonable terms, without discrimination.287 The Eleventh Circuit’s publisher relationship authorizes platforms to make unilateral decisions about how they will moderate content, regardless of whether they serve listeners’ interests.288 The PACT Act’s speech governance relationship would exacerbate those competing expectations by proposing that platforms will and should moderate content, but in speakers’ and listeners’ interests and in keeping with due process-like requirements.289
Sometimes, a common carrier’s reasonable, nondiscriminatory content
moderation might resemble a speech governor’s content moderation in speakers’ and
listeners’ interests and a cable operator’s editorial judgment. For instance, a
moderation rule that allows users to post breastfeeding photos, despite a general
prohibition on nudity, might be reasonable and nondiscriminatory (in that it comports with prevailing social norms and refrains from prejudicing mothers) and
serve speakers’ interest in expressing their views and listeners’ interest in accessing
a rich and diverse public discourse.
These role-relationships diverge on a deeper level. First, the speech governor
relationship binds platforms to moderate content in the public interest, whereas the
common carriage relationship presumes platforms are bound by a duty to serve all,
and a cable operator’s unchecked editorial judgment shirks any normative binds
whatsoever.290
Platforms’ content moderation decisions will likely differ on that
basis. The public interest may warrant robust content moderation that doesn’t afford
service to all people or all content; a duty to serve suggests only minimal moderation
will be appropriate; and unchecked editorial judgment could support maximalist or minimalist moderation or (as Elon Musk has exhibited)291 unpredictable vacillation between the two.
282. See supra Part IV.A.
283. See supra Part IV.B.
284. See NetChoice v. Att’y Gen., Fla., 34 F.4th at 1220.
285. NetChoice, LLC v. Paxton, No. 22-555 (argued Feb. 26, 2024); Moody v. NetChoice,
LLC, No. 22-277 (argued Feb. 26, 2024).
286. See supra notes 254–63 and accompanying text.
287. See supra Part IV.B.
288. See supra Part IV.C.
289. See supra Part IV.A.
290. See supra Part IV.
Second, the PACT Act affords individuals due process rights when platforms
moderate content because they serve the interests of speakers and listeners in public
discourse.292 A common carriage regulation wouldn’t make that allowance because due process doesn’t serve subscribers’ interest in receiving information transport services on fair terms.293 Individual due process makes even less sense against a cable operator: if someone disagrees with an editorial decision, they ought to write a letter to the editor.294
The role conflict on the horizon presages further uncertainty about speech
platforms’ role within a content moderation relationship. It is difficult to tell how the
mix of federal statute, federal case law, and state statute would direct speech
platforms’ content moderation practices in general, let alone in a way that serves
important social values. Moreover, legally induced role conflict would likely
entrench public disagreement about what speech platforms are and how they should
moderate appropriately. Each side could point to a source of law to bolster the
reasonableness of their position.
It will likely be up to the Supreme Court to resolve this role conflict. It declined
to do so in the Gonzalez case,295 but the Court still might articulate its perception of speech platforms’ social role when it settles the Fifth and Eleventh Circuits’ split on the constitutionality of platform-common carriage laws.296 Those cases examined a wide range of roles speech platforms might play: as speech governors, common carriers, publishers, and others.297 Whereas in Reno the Court asserted the Internet is characteristically unlike broadcasting,298 the Court could now put its
weight behind one cohesive and consistent vision of speech platforms’ social role.
291. See supra notes 1–10 and accompanying text.
292. See supra Part IV.A.
293. See supra Part IV.B.
294. See supra Part IV.C.
295. Gonzalez v. Google LLC, 598 U.S. 617, 622 (2023) (per curiam).
296. See NetChoice, LLC v. Paxton, No. 22-555 (argued Feb. 26, 2024); Moody v.
NetChoice, LLC, No. 22-277 (argued Feb. 26, 2024); NetChoice, LLC v. Paxton, 49 F.4th
439, 445 (5th Cir. 2022); NetChoice, LLC v. Att’y Gen., Fla., 34 F.4th 1196, 1209–10 (11th Cir. 2022).
297. See NetChoice v. Paxton, 49 F.4th at 455–59; NetChoice v. Att’y Gen., Fla., 34 F.4th at 1215–18.
298. Reno v. American Civil Liberties Union, 521 U.S. 844, 868–70 (1997).