{"id":843,"date":"2021-11-02T17:00:40","date_gmt":"2021-11-02T23:00:40","guid":{"rendered":"https:\/\/ctlj.colorado.edu\/?p=843"},"modified":"2021-11-02T17:04:18","modified_gmt":"2021-11-02T23:04:18","slug":"regulating-the-social-puppeteers-%c2%a7-230-marginalized-speech","status":"publish","type":"post","link":"https:\/\/ctlj.colorado.edu\/?p=843","title":{"rendered":"Regulating the Social Puppeteers: \u00a7 230 &#038; Marginalized Speech"},"content":{"rendered":"<h1 style=\"text-align: center;\"><a id=\"post-843-_Toc54013918\"><\/a> Regulating the Social Puppeteers:\u00a0\u00a7 230 &amp; Marginalized Speech<\/h1>\n<h2 style=\"text-align: center;\">Kylie Thompson<sup><a id=\"post-843-footnote-ref-2\" href=\"#post-843-footnote-2\">[1]<\/a><\/sup>*<\/h2>\n<h3 style=\"text-align: center;\">Print Version: <a href=\"https:\/\/ctlj.colorado.edu\/wp-content\/uploads\/2021\/11\/Regulating-the-Social-Puppeteers-\u00a7-230-Marginalized-Speech.pdf\">Regulating the Social Puppeteers- \u00a7 230 &amp; Marginalized Speech<\/a><\/h3>\n<p>&nbsp;<\/p>\n<p>Introduction<\/p>\n<p>I. What is \u00a7 230?<\/p>\n<p>A. Legislative History<\/p>\n<p>B. Expansive Scope<\/p>\n<p>C. Threading the \u00a7 230 needle<\/p>\n<p>II. The Power of Social Platforms<\/p>\n<p>A. Harassment &amp; Disinformation<\/p>\n<p>B. Content Moderation Practices<\/p>\n<p>C. Whose Content is it Anyway?<\/p>\n<p>III. Reforming \u00a7 230<\/p>\n<p>Conclusion<\/p>\n<p><em>This paper addresses social platforms\u2019 immunity under \u00a7 230 of the Communications Decency Act (CDA) in light of recent scholarship on content moderation and curation practices. I first became interested in this topic when I heard Danielle Citron speak about it at the Silicon Flatirons\u2019 Internet Platforms conference in the spring of 2019. Since then, \u00a7 230 has been increasingly scrutinized by both Democrats and Republicans. 
What was intended to encourage platforms to moderate content has since been employed to grant platforms broad immunity, allowing them to refrain from any moderation despite the legislature\u2019s intent. Many see \u00a7 230 as the lifeblood of the Internet, and fear that collateral censorship will generate more harm without it. But others see it as an unnecessary shield for harms perpetrated online under the cloak of anonymity, disproportionately impacting marginalized speech. Views of \u00a7 230 on all sides are entangled with the complexities of First Amendment doctrine. This note seeks to highlight how the modern internet enshrines hierarchies of power and control, and argues that platforms should not be afforded broad, unfettered, and unprecedented immunity while they obscure machinations that perpetuate inequality and reap the benefits of chaos on the platform. It assesses paths to reforming \u00a7 230 in light of platforms\u2019 content moderation and curation practices.<\/em><\/p>\n<p><a id=\"post-843-_Toc81056763\"><\/a> Introduction<\/p>\n<p>The Internet is different from any technology we\u2019ve encountered before: different in its reach and speed of dissemination; different in its pervasive and ubiquitous presence; different in that it is not the Internet of 1996 anymore. 
Since 1996, intermediaries have evolved from the bulletin boards of service providers into innovative platforms that allow users to connect and share with each other for \u201cfree\u201d and on a global scale.<sup><a id=\"post-843-footnote-ref-3\" href=\"#post-843-footnote-3\">[2]<\/a><\/sup> The market therefore changed from one in which revenues were generated by traditional models of profit with the end user as the customer, to an attention market in which the advertiser is the customer and the end user becomes the product.<sup><a id=\"post-843-footnote-ref-4\" href=\"#post-843-footnote-4\">[3]<\/a><\/sup><\/p>\n<p>While the Internet has changed global interactions, discrimination on the basis of gender, sexual identity, and race remains embedded in cultural values as a tool of division and control.<a id=\"post-843-_Ref66967901\"><\/a><sup><a id=\"post-843-footnote-ref-5\" href=\"#post-843-footnote-5\">[4]<\/a><\/sup> Women, BIPOC, and gender non-conforming individuals are disproportionately subject to the worst online abuses like impersonation, doxing, stalking,<sup><a id=\"post-843-footnote-ref-6\" href=\"#post-843-footnote-6\">[5]<\/a> <\/sup>revenge porn,<sup><a id=\"post-843-footnote-ref-7\" href=\"#post-843-footnote-7\">[6]<\/a><\/sup> sexual assault threats, and blackmail, to name a few.<sup><a id=\"post-843-footnote-ref-8\" href=\"#post-843-footnote-8\">[7]<\/a><\/sup> While it\u2019s no shock that gender-based, sexuality-based, and race-based discrimination are expressed online,<sup><a id=\"post-843-footnote-ref-9\" href=\"#post-843-footnote-9\">[8]<\/a><\/sup> it is shocking to realize how these harms are transformed by the ecosystem of cyberspace. 
The immediate, widespread, and permanent nature of the Internet exacerbates identity-based harms, making them more difficult to remedy.<sup><a id=\"post-843-footnote-ref-10\" href=\"#post-843-footnote-10\">[9]<\/a><\/sup> On top of that, platforms might be incentivized to ignore online abuse that otherwise generates traffic and engagement on the platform.<a id=\"post-843-_Ref66904922\"><\/a><sup><a id=\"post-843-footnote-ref-11\" href=\"#post-843-footnote-11\">[10]<\/a><\/sup><\/p>\n<p>This note raises concerns with social media platforms in particular because of their unique influence over end users and the platforms\u2019 economic incentives to maintain user attention. These social platforms, such as Facebook and Twitter, have employed algorithmic designs that tap into human psychology in order to better tailor content to particular users and generate more interest in the platform\u2014essentially, to make users addicted to the site.<a id=\"post-843-_Ref66968287\"><\/a><sup><a id=\"post-843-footnote-ref-12\" href=\"#post-843-footnote-12\">[11]<\/a><\/sup> These websites\u2019 algorithmic designs exacerbate discrimination harms and can chill expression because social values that perpetuate inequities are embedded in the design.<a id=\"post-843-_Ref66904805\"><\/a><sup><a id=\"post-843-footnote-ref-13\" href=\"#post-843-footnote-13\">[12]<\/a><\/sup> If destructive and harmful content gets attention, then platforms have an interest in keeping that content.<sup><a id=\"post-843-footnote-ref-14\" href=\"#post-843-footnote-14\">[13]<\/a><\/sup><\/p>\n<p>Social platforms have begun to create complex content moderation rule sets enforced by human moderators and artificial intelligence (AI) in response to external pressures.<sup><a id=\"post-843-footnote-ref-15\" href=\"#post-843-footnote-15\">[14]<\/a><\/sup> Outside influence in moderation decisions further underscores the undesirable power imbalance that exists between individual users and powerful state and private
actors. While content moderation decisions are to some extent grounded in First Amendment principles,<a id=\"post-843-_Ref66967671\"><\/a><sup><a id=\"post-843-footnote-ref-16\" href=\"#post-843-footnote-16\">[15]<\/a><\/sup> social platforms are not bound by the First Amendment and thus have ultimate say over content on their sites.<sup><a id=\"post-843-footnote-ref-17\" href=\"#post-843-footnote-17\">[16]<\/a><\/sup> This note accepts that platforms are in the best position to address harms given the complexity of the cyberspace ecosystem and because platforms control the means and method of communication on their service.<\/p>\n<p>However, platforms cannot be relied upon to make the best decisions for consumers because of economic incentives to maximize profits. While platforms should not be liable for any and all content as a newspaper would be, they also should not receive sweeping immunity for all content provided by third parties on the platform, especially when the platforms have the power to make a difference in the lives of the vulnerable individuals from whom they profit.<\/p>\n<p>This note draws on recent literature to highlight how social platforms and the immunity \u00a7 230 provides them enshrine hierarchies of power that disproportionately impact marginalized speech. It assesses why platforms\u2019 opaque machinations that perpetuate harassment and inequality should lead to the conclusion that \u00a7 230 needs reform. Section I provides necessary background: it maps out CDA \u00a7 230, explaining the legislative history and the scope of the Act. Following is a discussion of why reform is necessary to preserve the intent of the legislation. Section II explains the problems of harassment and disinformation online and assesses cyber civil rights in light of recent scholarship on platforms\u2019 content moderation and content curation practices. 
Section III argues for a potential avenue to reform the language of \u00a7 230, drawing on and augmenting Citron and Wittes\u2019 proposal to condition immunity on reasonableness.<a id=\"post-843-_Ref71041224\"><\/a><sup><a id=\"post-843-footnote-ref-18\" href=\"#post-843-footnote-18\">[17]<\/a><\/sup><\/p>\n<p><a id=\"post-843-_Toc62051607\"><\/a><a id=\"post-843-_Toc81056764\"><\/a><a id=\"post-843-_Toc54013919\"><\/a> What is \u00a7 230?<\/p>\n<p>To make sense of the debate surrounding \u00a7 230, it is first important to understand what the law does and why it was enacted. Section A provides some background on the legislative history surrounding \u00a7 230, and Section B addresses the broad scope given to the provision through court interpretation. Section C describes the important strands in the debate and threads the needle between repeal and preservation. It recognizes that \u00a7 230 reflects imbalances in the First Amendment and thus should be reformed.<\/p>\n<p><a id=\"post-843-_Toc62051608\"><\/a><a id=\"post-843-_Toc81056765\"><\/a> Legislative History<\/p>\n<p>Section 230 is the only remaining provision of the CDA, which was passed as part of an effort to curtail pornography on the Internet.<a id=\"post-843-_Ref66905454\"><\/a><sup><a id=\"post-843-footnote-ref-19\" href=\"#post-843-footnote-19\">[18]<\/a><\/sup> The other provisions of the Act were struck down as violating the First Amendment,<sup><a id=\"post-843-footnote-ref-20\" href=\"#post-843-footnote-20\">[19]<\/a><\/sup> but \u00a7 230 remains codified in Title V of the Telecommunications Act of 1996.<sup><a id=\"post-843-footnote-ref-21\" href=\"#post-843-footnote-21\">[20]<\/a><\/sup> This note concerns section (c) of the provision, which is the meat and potatoes of the legislation.<sup><a id=\"post-843-footnote-ref-22\" href=\"#post-843-footnote-22\">[21]<\/a><\/sup> It grants safe harbor to providers and users of \u201cinteractive computer services\u201d from publisher liability for
information provided by a third party.<sup><a id=\"post-843-footnote-ref-23\" href=\"#post-843-footnote-23\">[22]<\/a><\/sup> It also immunizes platforms from liability for taking actions to moderate in \u201cgood faith.\u201d<sup><a id=\"post-843-footnote-ref-24\" href=\"#post-843-footnote-24\">[23]<\/a><\/sup><\/p>\n<p>A few months after Senator Exon\u2019s CDA proposal, Stratton Oakmont, a securities investment banking firm, successfully sued Prodigy, an online bulletin board, for libel.<sup><a id=\"post-843-footnote-ref-25\" href=\"#post-843-footnote-25\">[24]<\/a><\/sup> The court treated Prodigy as a publisher rather than a distributor and held that because Prodigy had made efforts to censor some third-party content, it was strictly liable as a publisher for all content.<sup><a id=\"post-843-footnote-ref-26\" href=\"#post-843-footnote-26\">[25]<\/a><\/sup> This decision stood in contrast to an earlier case, <em>Cubby, Inc. v. CompuServe Inc.<\/em>, 776 F. Supp. 135 (S.D.N.Y. 1991), where the court found that CompuServe, an internet service provider (ISP) that refrained from moderating its bulletin board, was like a public library, and therefore more like a distributor than a publisher.<sup><a id=\"post-843-footnote-ref-27\" href=\"#post-843-footnote-27\">[26]<\/a><\/sup> The <em>Cubby <\/em>court\u2019s treatment of CompuServe as a distributor meant liability turned on evidence that the intermediary had knowledge of the content at issue.<a id=\"post-843-_Ref66905566\"><\/a><sup><a id=\"post-843-footnote-ref-28\" href=\"#post-843-footnote-28\">[27]<\/a><\/sup> Thus, under <em>Cubby<\/em>, a platform that takes a hands-off approach to moderation would not be held liable under a distributor standard so long as it didn\u2019t have knowledge of the specific content that created the harm;<sup><a id=\"post-843-footnote-ref-29\" href=\"#post-843-footnote-29\">[28]<\/a><\/sup> however, under <em>Prodigy<\/em>, any content moderation would give rise to strict publisher liability.<sup><a 
id=\"post-843-footnote-ref-30\" href=\"#post-843-footnote-30\">[29]<\/a><\/sup> As a result of the <em>Cubby<\/em> and <em>Stratton Oakmont<\/em> cases, online platforms were incentivized to turn a blind eye and refrain from moderating content, rather than walk a tightrope between distributor and publisher standards, in order to avoid potential liability.<sup><a id=\"post-843-footnote-ref-31\" href=\"#post-843-footnote-31\">[30]<\/a><\/sup><\/p>\n<p>Representatives Chris Cox and Ron Wyden saw the <em>Cubby<\/em> and <em>Stratton Oakmont<\/em> rulings as nonsensical because the intermediaries essentially provided the same services, but Prodigy had attempted to moderate third-party content.<sup><a id=\"post-843-footnote-ref-32\" href=\"#post-843-footnote-32\">[31]<\/a><\/sup> Representatives Cox and Wyden thought that the <em>Stratton Oakmont<\/em> ruling would disincentivize investment in the tech sector for fear of being held liable for the content that others put online.<sup><a id=\"post-843-footnote-ref-33\" href=\"#post-843-footnote-33\">[32]<\/a><\/sup> So they teamed up to write an amendment to the CDA\u2014\u00a7 230\u2014that was intended to foster growth in the technology sector and to encourage \u201cinteractive computer services\u201d to engage in content moderation without fear of liability.<sup><a id=\"post-843-footnote-ref-34\" href=\"#post-843-footnote-34\">[33]<\/a><\/sup><\/p>\n<p><a id=\"post-843-_Toc62051609\"><\/a><a id=\"post-843-_Toc81056766\"><\/a> Expansive Scope<\/p>\n<p>Several cases that followed the adoption of \u00a7 230 defined the scope of the law and carved out broad immunity for platforms. 
Both <em>Zeran<\/em> and<em> Blumenthal<\/em> involved defamation suits against AOL in which AOL received summary judgment; however, each court dealt with the \u00a7 230 immunity claim in different ways.<sup><a id=\"post-843-footnote-ref-35\" href=\"#post-843-footnote-35\">[34]<\/a><\/sup> The Fourth Circuit in <em>Zeran<\/em> framed its decision in terms of policy, claiming it would be an \u201cimpossible burden\u201d for online service providers to look into every defamation claim and make an accurate determination as to the merits of the claim or risk being held liable as a distributor.<sup><a id=\"post-843-footnote-ref-36\" href=\"#post-843-footnote-36\">[35]<\/a><\/sup> In contrast, the district court for D.C. in <em>Blumenthal<\/em> felt bound to the text of \u00a7 230 despite skepticism over the practical implications of the statute in context.<sup><a id=\"post-843-footnote-ref-37\" href=\"#post-843-footnote-37\">[36]<\/a><\/sup> These cases illustrate how the text of \u00a7 230 has been interpreted and applied by courts to protect intermediaries in the face of defamation claims. This is despite factual circumstances in both cases showing the intermediaries\u2019 potential role in the harm, i.e., delayed removal and promotion of harmful content, respectively.<\/p>\n<p>In contrast, the<em> Roommates.com<\/em> decision signaled some limit on the breadth of \u00a7 230.<sup><a id=\"post-843-footnote-ref-38\" href=\"#post-843-footnote-38\">[37]<\/a><\/sup> Roommates.com provided a platform service that connected potential roommates through individual profiles. 
The court held that Roommates.com was not immune from liability under \u00a7 230 because it had a hand in creating the discriminatory content at issue\u2014it developed a discriminatory questionnaire as a condition of service that was subsequently used to conduct a filtering process based on subscribers\u2019 answers.<sup><a id=\"post-843-footnote-ref-39\" href=\"#post-843-footnote-39\">[38]<\/a><\/sup> The court remarked that \u00a7 230 \u201cwas not meant to create a lawless no-man\u2019s-land on the Internet.\u201d<sup><a id=\"post-843-footnote-ref-40\" href=\"#post-843-footnote-40\">[39]<\/a><\/sup><\/p>\n<p>More recently, the Supreme Court likened the Internet to the modern public square for purposes of First Amendment doctrine in <em>Packingham<\/em>.<sup><a id=\"post-843-footnote-ref-41\" href=\"#post-843-footnote-41\">[40]<\/a><\/sup> Although that case concerned government regulation of speech, it raised alarm that the Court could potentially apply that analogy to cases concerning platform liability in the future, which would mean that platforms could be likened to public utilities and thus subject to First Amendment constraints that are traditionally reserved to government actors.<sup><a id=\"post-843-footnote-ref-42\" href=\"#post-843-footnote-42\">[41]<\/a><\/sup> This decision underscores society\u2019s changing conception of the pivotal role that platforms assume in modern communication and engagement.<\/p>\n<p><a id=\"post-843-_Toc62051610\"><\/a><a id=\"post-843-_Toc81056767\"><\/a> Threading the \u00a7 230 needle<\/p>\n<p>Many supporters of \u00a7 230 assert that the Internet would not have been able to develop into what it is, or continue to exist as we know it today, without \u00a7 230.<sup><a id=\"post-843-footnote-ref-43\" href=\"#post-843-footnote-43\">[42]<\/a><\/sup> This narrative thrives on the notion that we should be afraid of what the Internet will look like or become if \u00a7 230 is revised, or worse, repealed in its entirety. 
Some who contend that \u00a7 230 should be repealed argue that tort law will provide enough protection for platforms in the absence of immunity.<a id=\"post-843-_Ref66905727\"><\/a><sup><a id=\"post-843-footnote-ref-44\" href=\"#post-843-footnote-44\">[43]<\/a><\/sup> A traditional publication tort requires that fault amounting to negligence be shown, meaning the plaintiff must prove causation to recover.<sup><a id=\"post-843-footnote-ref-45\" href=\"#post-843-footnote-45\">[44]<\/a><\/sup> This means platforms will not automatically be held liable for third-party tort harms without \u00a7 230.<sup><a id=\"post-843-footnote-ref-46\" href=\"#post-843-footnote-46\">[45]<\/a><\/sup> But \u00a7 230 singles out the Internet as special and provides it more protections than other traditional media sources, and more protections than any other industry in our history.<sup><a id=\"post-843-footnote-ref-47\" href=\"#post-843-footnote-47\">[46]<\/a><\/sup><\/p>\n<p>The argument for returned reliance on tort has intuitive appeal; however, a total repeal of \u00a7 230 could lead to negative consequences. 
Without \u00a7 230, the risk of returning to the messy distributor\/publisher distinction at work in both the <em>Cubby<\/em> and <em>Stratton Oakmont<\/em> cases will likely incentivize platforms to either refrain from content moderation altogether or to over-moderate and engage in collateral censorship.<sup><a id=\"post-843-footnote-ref-48\" href=\"#post-843-footnote-48\">[47]<\/a><\/sup> Rather than risk being held strictly liable as a publisher, platforms might step back and wash their hands of moderating any content without the protections of \u00a7 230.<sup><a id=\"post-843-footnote-ref-49\" href=\"#post-843-footnote-49\">[48]<\/a><\/sup> This is not desirable because, as victims\u2019 rights lawyers, cyber civil rights activists, and others have made clear, platforms are in the best position to address and moderate harmful content.<sup><a id=\"post-843-footnote-ref-50\" href=\"#post-843-footnote-50\">[49]<\/a><\/sup> Alternatively, platforms might begin to censor more content than desirable or stop allowing users to contribute content altogether because of the risk that a platform will be held liable under a distributor standard for the inevitable mistakes made in choosing which content to take down.<sup><a id=\"post-843-footnote-ref-51\" href=\"#post-843-footnote-51\">[50]<\/a><\/sup> Admittedly, this publisher\/distributor distinction is only relevant in cases that pertain directly to speech and involve publication tort suits.<sup><a id=\"post-843-footnote-ref-52\" href=\"#post-843-footnote-52\">[51]<\/a><\/sup> The problem underlying calls for repeal and reform is that \u00a7 230 jurisprudence has expanded protection beyond this narrow realm.<sup><a id=\"post-843-footnote-ref-53\" href=\"#post-843-footnote-53\">[52]<\/a><\/sup> Still, it is better that the legislature speak to how courts should proceed than to leave it up to courts to decide the publication status of the most powerful corporations of our age.<\/p>\n<p>Supporters often contend that \u00a7 
230 should be preserved in its entirety because it appropriately balances the desire for platforms to moderate and the reality that platforms make filtering mistakes, against the concern that collateral censorship will occur without immunity.<sup><a id=\"post-843-footnote-ref-54\" href=\"#post-843-footnote-54\">[53]<\/a><\/sup> As the argument goes, Congress made the following judgment in enacting \u00a7 230: \u201c[t]he mistakes caused by liability are worse than the mistakes caused by immunity\u201d<sup><a id=\"post-843-footnote-ref-55\" href=\"#post-843-footnote-55\">[54]<\/a><\/sup> and thus platforms should be allowed the freedom to adopt moderation schemes at will without fear of liability. This argument succumbs to \u201cthe gravitational pull of the First Amendment\u201d and asserts that all censorship is necessarily bad and will result in less speech.<a id=\"post-843-_Ref66905868\"><\/a><sup><a id=\"post-843-footnote-ref-56\" href=\"#post-843-footnote-56\">[55]<\/a><\/sup> However, the First Amendment is not absolute and regulation might even serve to produce more speech\u2014a greater diversity of speech\u2014not less speech.<sup><a id=\"post-843-footnote-ref-57\" href=\"#post-843-footnote-57\">[56]<\/a><\/sup> Franks argues that the First Amendment \u201chas created a free speech dystopia in which only the powerful are truly at liberty to speak and the pursuit of truth has been rendered virtually impossible.\u201d<sup><a id=\"post-843-footnote-ref-58\" href=\"#post-843-footnote-58\">[57]<\/a><\/sup> Thus, the concern that liability will result in collateral censorship is contorted and made somewhat perverse with the understanding that only some speech is currently valued in our society: the speech of the powerful, white, male majority.<sup><a id=\"post-843-footnote-ref-59\" href=\"#post-843-footnote-59\">[58]<\/a><\/sup> The next section explores in greater depth how harassment online leads to the suppression of marginalized speech.<\/p>\n<p>As Citron 
and Wittes\u2019 aptly named article proposes, I too believe that \u201cthe Internet will not break\u201d from reforming \u00a7 230 to protect the victims of online abuse and harassment.<a id=\"post-843-_Ref66906050\"><\/a><sup><a id=\"post-843-footnote-ref-60\" href=\"#post-843-footnote-60\">[59]<\/a><\/sup> The benefits of \u00a7 230\u2019s immunity \u201ccould have been secured at a slightly lesser price.\u201d<sup><a id=\"post-843-footnote-ref-61\" href=\"#post-843-footnote-61\">[60]<\/a><\/sup> The question of how to reform \u00a7 230 to address abuse against individual users and larger societal harms is admittedly difficult because of misinformation about the law and opaqueness concerning what role platforms have in developing or creating harmful content. However, that should not hinder efforts to recraft immunity in such a way that balances the burdens of plaintiffs against the burdens of platforms.<a id=\"post-843-_Ref66967732\"><\/a><sup><a id=\"post-843-footnote-ref-62\" href=\"#post-843-footnote-62\">[61]<\/a><\/sup> This middle ground would be more desirable than letting current harms persist because of the Internet\u2019s central role in facilitating communication and civic engagement and the need to protect marginalized speech.<\/p>\n<p>The question then becomes how to reform \u00a7 230 to address inequities while mitigating the risk that platforms might begin to over-moderate marginalized speech in the face of new regulation. It is a hard question because there is no perfect way to quantify both the harms to individuals and the benefits of \u00a7 230, and then to balance them against each other.<sup><a id=\"post-843-footnote-ref-63\" href=\"#post-843-footnote-63\">[62]<\/a><\/sup> While it may be true that we cannot quantify the harms and benefits, and that the balancing judgment is based on personal values, I think we can, and should, do better. 
Section 230 should not be read so broadly, and because it has already been read broadly by many courts, it should be amended by Congress to provide greater protection to vulnerable consumers.<\/p>\n<p><a id=\"post-843-_Toc62051611\"><\/a><a id=\"post-843-_Toc81056768\"><\/a> The Power of Social Platforms<\/p>\n<p>Recent scholarship has produced useful insights into what platforms are currently doing to address harmful content on their platforms.<sup><a id=\"post-843-footnote-ref-64\" href=\"#post-843-footnote-64\">[63]<\/a><\/sup> These scholars urge that any suggestions for reforming \u00a7 230 take into account the current practices that platforms follow.<sup><a id=\"post-843-footnote-ref-65\" href=\"#post-843-footnote-65\">[64]<\/a><\/sup> But before looking to the practices of platforms, it is important to first have a better sense of the harms that occur online against largely marginalized groups. Thus, Section A describes online harassment and disinformation as a steadfast reprise of old harms that are amplified in the internet ecosystem. Sections B and C draw on recent scholarship to understand platforms\u2019 current content moderation and curation practices and attempt to situate that understanding within the context of cyber civil rights activists\u2019 calls for \u00a7 230 reform.<\/p>\n<p><a id=\"post-843-_Toc62051612\"><\/a><a id=\"post-843-_Toc81056769\"><\/a> Harassment &amp; Disinformation<\/p>\n<p>The debate over \u00a7 230 produced some useful comparisons to historical events, such as the women\u2019s rights movement of the 1960s and 1970s. 
In the mid-1980s, radical feminist and lawyer Catharine MacKinnon argued that pornography should not be constitutionally protected as speech because it legitimizes abusive acts and suppresses the speech rights of women.<sup><a id=\"post-843-footnote-ref-66\" href=\"#post-843-footnote-66\">[65]<\/a><\/sup> In particular, MacKinnon argued that pornography suppressed women\u2019s expression and their ability to speak out against abuse because it degrades and subordinates women as a class, effectively silencing them.<sup><a id=\"post-843-footnote-ref-67\" href=\"#post-843-footnote-67\">[66]<\/a><\/sup> Even though the Supreme Court left only obscenity and child pornography outside the protections of the First Amendment, MacKinnon\u2019s argument is reprised in the debate surrounding harms perpetrated against women online.<sup><a id=\"post-843-footnote-ref-68\" href=\"#post-843-footnote-68\">[67]<\/a><\/sup><\/p>\n<p>Harassers and abusers drive their victims offline by instilling legitimate fears of continued harassment, which leave victims effectively silenced.<sup><a id=\"post-843-footnote-ref-69\" href=\"#post-843-footnote-69\">[68]<\/a><\/sup> And women and racial minorities are disproportionately the targets of some of the most egregious cyberattacks.<sup><a id=\"post-843-footnote-ref-70\" href=\"#post-843-footnote-70\">[69]<\/a><\/sup> Thus, already marginalized speech is quashed by bad actors, impacting civic engagement both on- and offline. Section 230 enables platforms to turn a blind eye to this sort of censorship in that it immunizes even platforms that refuse to moderate illegal acts facilitated on the platform. Yet, at the same time, \u00a7 230 is somehow argued to protect against censorship\u2014namely, the collateral censorship of voices that were predominantly privileged to begin with. 
What\u2019s more, big tech is allowed to reap the benefits of these harms while being presumed an innocent bystander, without a second thought given to any sort of accomplice liability.<\/p>\n<p>In the 1960s and 1970s, women collectively protested domestic violence and sexual harassment: practices entrenched in social norms of the day.<sup><a id=\"post-843-footnote-ref-71\" href=\"#post-843-footnote-71\">[70]<\/a><\/sup> These norms followed narratives of victim-blaming and maintaining the status quo: what happens in the home stays in the home and \u2018boys will be boys.\u2019<sup><a id=\"post-843-footnote-ref-72\" href=\"#post-843-footnote-72\">[71]<\/a><\/sup> Women began to debunk these social beliefs through systematic and organized movements, calling attention to the inequities and harms produced.<sup><a id=\"post-843-footnote-ref-73\" href=\"#post-843-footnote-73\">[72]<\/a><\/sup> Citron posits that the next frontier for attaining women\u2019s equality is online.<sup><a id=\"post-843-footnote-ref-74\" href=\"#post-843-footnote-74\">[73]<\/a><\/sup> The entrenchment and normalization of revenge pornography online serves as compelling evidence. 
In order to secure equity online, we must continue to change social attitudes and dispel victim-blaming.<sup><a id=\"post-843-footnote-ref-75\" href=\"#post-843-footnote-75\">[74]<\/a><\/sup> Just as \u201cstay home\u201d was not an acceptable response to workplace harassment, \u201cstay offline\u201d should not be an acceptable response to violence against women and minorities online.<sup><a id=\"post-843-footnote-ref-76\" href=\"#post-843-footnote-76\">[75]<\/a><\/sup><\/p>\n<p>Importantly, the nature of harassment online shifted from random bad actors\u2019 attacks on individual users to systematic attacks on communities or groups of people as a tool of disinformation campaigns.<sup><a id=\"post-843-footnote-ref-77\" href=\"#post-843-footnote-77\">[76]<\/a><\/sup> These systemic, cross-platform attacks are often carried out by state actors who use tactics like trolling to spread disinformation and enlist \u201cuseful idiots\u201d or average citizens to spread the message.<sup><a id=\"post-843-footnote-ref-78\" href=\"#post-843-footnote-78\">[77]<\/a><\/sup> The targeting of specific communities aims to exacerbate social divisions and further polarize people on political issues. Thus, a dynamic persists where disinformation is rampant and harassment tactics are used to further divide people along socioeconomic lines, illustrating a larger societal problem beyond isolated harms to individuals.<\/p>\n<p><a id=\"post-843-OLE_LINK118\"><\/a><a id=\"post-843-OLE_LINK119\"><\/a><a id=\"post-843-OLE_LINK120\"><\/a> Additionally, the ability for users to remain anonymous amplifies harassing and abusive behavior.<sup><a id=\"post-843-footnote-ref-79\" href=\"#post-843-footnote-79\">[78]<\/a><\/sup> <a id=\"post-843-OLE_LINK121\"><\/a><a id=\"post-843-OLE_LINK122\"><\/a>Identifying anonymous posters of harmful content is often difficult and unsuccessful. 
<sup><a id=\"post-843-footnote-ref-80\" href=\"#post-843-footnote-80\">[79]<\/a><\/sup> <a id=\"post-843-OLE_LINK123\"><\/a><a id=\"post-843-OLE_LINK124\"><\/a>In order for a plaintiff to unmask their anonymous attacker, the plaintiff must file suit against the anonymous defendant, subpoena the website to turn over data about the user\u2014such as an IP address\u2014and if the IP address is obtained, the plaintiff must request the name of the subscriber from the internet service provider that assigned that IP address.<sup><a id=\"post-843-footnote-ref-81\" href=\"#post-843-footnote-81\">[80]<\/a><\/sup> <a id=\"post-843-OLE_LINK125\"><\/a><a id=\"post-843-OLE_LINK126\"><\/a>This process poses several difficulties: not all websites require real names or email addresses, or keep track of IP addresses; the subpoena can be challenged, in which case complex First Amendment balancing tests are used to determine enforceability; and, even if the subpoena is enforced, the information often leads to a dead end.<sup><a id=\"post-843-footnote-ref-82\" href=\"#post-843-footnote-82\">[81]<\/a><\/sup> Because platforms have far more power and ability to control what happens on the platform\u2014who uses the platform, the terms and conditions of use, and what information they keep on users\u2014platforms should be incentivized to take greater responsibility in protecting vulnerable consumers.<\/p>\n<p><a id=\"post-843-_Toc62051613\"><\/a><a id=\"post-843-_Toc81056770\"><\/a> Content Moderation Practices<\/p>\n<p><a id=\"post-843-OLE_LINK127\"><\/a><a id=\"post-843-OLE_LINK128\"><\/a> Platforms enjoy unprecedented immunity from publisher liability under \u00a7 230, even though they maintain and regulate what <em>Packingham<\/em> likened to the modern-day public square for purposes of First Amendment law.<sup><a id=\"post-843-footnote-ref-83\" href=\"#post-843-footnote-83\">[82]<\/a><\/sup> <a id=\"post-843-OLE_LINK129\"><\/a><a id=\"post-843-OLE_LINK130\"><\/a>Cyber civil 
rights activists have called to rein in this immunity for over a decade.<sup><a id=\"post-843-footnote-ref-84\" href=\"#post-843-footnote-84\">[83]<\/a><\/sup> In a recent article, Klonick uncovered how and why social media platforms moderate content, and proposed that we understand and treat platforms as new types of governance, separate and apart from traditional First Amendment categorical analogies:<\/p>\n<p><a id=\"post-843-OLE_LINK131\"><\/a><a id=\"post-843-OLE_LINK132\"><\/a> [P]latforms should be thought of as operating as the New Governors of online speech. These New Governors are part of a new triadic model of speech that sits between the state and speakers-publishers. They are private, self-regulating entities that are economically and normatively motivated to reflect the democratic culture and free speech expectations of their users.<sup><a id=\"post-843-footnote-ref-85\" href=\"#post-843-footnote-85\">[84]<\/a><\/sup><\/p>\n<p><a id=\"post-843-OLE_LINK133\"><\/a><a id=\"post-843-OLE_LINK134\"><\/a> Klonick frames the \u201cgovernance\u201d that platforms engage in as an iterative process reflecting the \u201cinterplay between user and platform.\u201d<sup><a id=\"post-843-footnote-ref-86\" href=\"#post-843-footnote-86\">[85]<\/a><\/sup> However, \u201cgovernance\u201d can be boiled down to something much simpler, something we ought not lose sight of. \u201cGovernance,\u201d in its simplest form, implies control and authority over a group backed by the threat of punishment. 
<a id=\"post-843-OLE_LINK135\"><\/a><a id=\"post-843-OLE_LINK136\"><\/a>Facebook asserts control over its users by setting the terms and conditions of engagement on the app\u2014more specifically, through the Abuse Standards.<sup><a id=\"post-843-footnote-ref-87\" href=\"#post-843-footnote-87\">[86]<\/a><\/sup> <a id=\"post-843-OLE_LINK137\"><\/a><a id=\"post-843-OLE_LINK138\"><\/a>The standards are non-negotiable for entrance and participation, and the punishment for violating those standards can extend to indefinite removal from the platform\u2014a serious punishment resulting in the loss of access to a scarce medium for speech.<sup><a id=\"post-843-footnote-ref-88\" href=\"#post-843-footnote-88\">[87]<\/a><\/sup> While platforms have not been and should not be treated as government speakers, platforms should also not be allowed to control a scarce venue for modern communication and engagement completely free from traditional tort liability.<\/p>\n<p><a id=\"post-843-OLE_LINK139\"><\/a><a id=\"post-843-OLE_LINK140\"><\/a> The moderation or \u201cgovernance\u201d process evolves constantly behind the scenes as platforms attempt the impossible\u2014to keep pace with rapidly changing expectations about speech.<sup><a id=\"post-843-footnote-ref-89\" href=\"#post-843-footnote-89\">[88]<\/a><\/sup> <a id=\"post-843-OLE_LINK141\"><\/a><a id=\"post-843-OLE_LINK142\"><\/a>Platforms could not possibly take into account every user\u2019s changing expectations, so what or who has the greatest influence on policy iterations? 
Klonick \u201cdiscusses four major ways platforms\u2019 content-moderation policies are subject to outside influence: (1) government request, (2) media coverage, (3) third-party civil society groups, and (4) individual users\u2019 use of the moderation process.\u201d<sup><a id=\"post-843-footnote-ref-90\" href=\"#post-843-footnote-90\">[89]<\/a><\/sup> All four of these categories reflect the embeddedness of power and privilege and victims\u2019 lack of access to justice. Governments influence content decisions of platforms by threatening to regulate platforms or cut off access to the platform entirely.<sup><a id=\"post-843-footnote-ref-91\" href=\"#post-843-footnote-91\">[90]<\/a><\/sup> The media exerts influence over content decisions by evoking public outcry and collective action.<sup><a id=\"post-843-footnote-ref-92\" href=\"#post-843-footnote-92\">[91]<\/a><\/sup> Third-party groups exert influence over content decisions by advocating for the interests of those they represent and meeting collectively with industry players to discuss content guidelines.<sup><a id=\"post-843-footnote-ref-93\" href=\"#post-843-footnote-93\">[92]<\/a><\/sup> While this category initially appears to be a win for individual users, third-party groups operate at a level removed from users and cannot be said to adequately represent every user who has been harmed. <a id=\"post-843-OLE_LINK1\"><\/a><a id=\"post-843-OLE_LINK2\"><\/a>The moderation process itself is also problematic because not all users will have access to this process;<sup><a id=\"post-843-footnote-ref-94\" href=\"#post-843-footnote-94\">[93]<\/a><\/sup> not everyone is afforded technological due process. Therefore, the voices of victims are still unheard, and they have little to no recourse by which to impact the policy decisions of platforms.<\/p>\n<p>Moreover, individual users who exhaust all formal avenues to complain about moderation decisions often turn to informal tactics. 
Rory Van Loo discusses the limits of users\u2019 informal tactics to shape moderation decisions, like taking to social media to complain: \u201cAn assault victim should not have to take to social media and reveal a very private and painful event to the world to get a response. Moreover, users with few followers have less social media influence. Appealing to the CEO may go nowhere.\u201d<a id=\"post-843-_Ref66968028\"><\/a><sup><a id=\"post-843-footnote-ref-95\" href=\"#post-843-footnote-95\">[94]<\/a><\/sup> Without power and influence, or collective public outcry, individual users may have no recourse for the injustices suffered on social platforms when the platform refuses to help. And even with public outcry and collective action, those with power and privilege in society may succeed in keeping up content that would otherwise be removed, or vice versa. Thus, fundamental problems of control and power continue to shape the moderation process outside the public view and without legal teeth mandating the transparency of moderation policy decisions.<\/p>\n<p>It\u2019s important to keep in mind that platforms are first and foremost companies, not public utilities, which means that fundamental to their existence\u2014and thus any scheme of \u201cgovernance\u201d\u2014is the drive to maximize profits. It makes sense, then, to examine the financial incentives of social platforms in moderating content, particularly when \u00a7 230 does not currently require that platforms do any moderating in order to receive immunity from tort liability. 
Klonick argues that platforms have financial incentive to moderate content according to the expectations of users: \u201c[p]latforms have created a voluntary system of self-regulation because they are economically motivated to create a hospitable environment for their users in order to incentivize engagement.\u201d<sup><a id=\"post-843-footnote-ref-96\" href=\"#post-843-footnote-96\">[95]<\/a><\/sup> Certainly, if engagement suffers because users are made uncomfortable by particular content on the platform, then so too will advertising revenues.<sup><a id=\"post-843-footnote-ref-97\" href=\"#post-843-footnote-97\">[96]<\/a><\/sup> But these economic incentives are complicated because users are the commodity and advertisers are the customers in the revenue models of social platforms.<sup><a id=\"post-843-footnote-ref-98\" href=\"#post-843-footnote-98\">[97]<\/a><\/sup> Platforms like Facebook generate revenue from advertisers, not users, and thus are incentivized to protect advertisers before users, and potentially at the expense of users.<sup><a id=\"post-843-footnote-ref-99\" href=\"#post-843-footnote-99\">[98]<\/a><\/sup> Additionally, the fear of users leaving because of inhospitable conditions is countered by strong network effects and concentration of power in the market, as well as the argument that abusive material actually generates traffic and attention.<sup><a id=\"post-843-footnote-ref-100\" href=\"#post-843-footnote-100\">[99]<\/a><\/sup> So again it becomes evident that the powerful are protected at the expense of the vulnerable, who do not have adequate access or power to influence content moderation policy.<\/p>\n<p><a id=\"post-843-_Toc62051614\"><\/a><a id=\"post-843-_Toc81056771\"><\/a> Whose Content is it Anyway?<\/p>\n<p>Beyond content removal decisions, social platforms make curation decisions about how and where content gets placed on the platform. 
Douek discusses this shift in content moderation policy in the context of Facebook:<\/p>\n<p>Facebook is increasingly relying not on the blunt content moderation tools of removing posts or pages, but on the subtle tools of limiting their reach and exposure. For \u2018borderline\u2019 content in each of its harmful categories, Facebook works to \u2018distribute that content less\u2019 to reduce the incentive to post such content.<sup><a id=\"post-843-footnote-ref-101\" href=\"#post-843-footnote-101\">[100]<\/a><\/sup><\/p>\n<p>If transparency is a problem in the more concrete decisions to take down or leave up content, it is even more of a problem in the context of curation algorithms that determine placement. These decisions, while meant to increase revenue by increasing engagement, also end up \u201c[shaping] the form and substance of their users\u2019 content\u201d in several notable and problematic ways.<sup><a id=\"post-843-footnote-ref-102\" href=\"#post-843-footnote-102\">[101]<\/a><\/sup> Platforms\u2019 designs on user data under the new attention markets have been described as manipulative because the algorithms used deploy principles of human psychology to alter human behavior by getting users to visit the platform more often.<sup><a id=\"post-843-footnote-ref-103\" href=\"#post-843-footnote-103\">[102]<\/a><\/sup> Sophisticated users who recognize and understand this process may purposefully change their behavior while participating on the platform in order to influence the algorithms curating their content in one way or another.<sup><a id=\"post-843-footnote-ref-104\" href=\"#post-843-footnote-104\">[103]<\/a><\/sup> Additionally, the curation of targeted content gives rise to filter bubbles or echo chambers that reinforce particular viewpoints and keep users isolated from content outside their comfort zone.<sup><a id=\"post-843-footnote-ref-105\" href=\"#post-843-footnote-105\">[104]<\/a><\/sup> Taking all this into account, it becomes increasingly 
difficult to separate the tortious or otherwise illegal content of third parties from the platform that promotes, filters, and profits from user engagement with such content.<\/p>\n<p>While the development of AI technologies holds the potential to solve many complex problems pertaining to online participation and interaction, such as moderating discriminatory or illegal content, its use is not without intense controversy. The nascent technologies are still developing and make many mistakes without human oversight.<sup><a id=\"post-843-footnote-ref-106\" href=\"#post-843-footnote-106\">[105]<\/a><\/sup> Even so, the algorithmic designs on user data deployed by big tech illustrate that platforms are capable of sifting through large volumes of content, archiving user data, and filtering and curating content for particular users. But the impetus behind these capabilities is the capitalistic incentive to increase profits; thus, the artificial intelligence technologies developed by platforms are deployed in service of profit maximization. If we allow AI technology to continue developing for the single-minded goal of maximizing profit, then we might miss opportunities to apply this technology to a different set of problems.<sup><a id=\"post-843-footnote-ref-107\" href=\"#post-843-footnote-107\">[106]<\/a><\/sup><\/p>\n<p>If the attention revenue model and resulting data practices shape the substance of content on the platform that creates subsequent harms to users, then broad immunity under \u00a7 230 seems unreasonable because the platform did contribute to the harm. At the very least, a plaintiff should be able to bring a civil claim against a platform and receive a response from the platform before the lawsuit is dismissed on \u00a7 230 grounds. 
The influence of algorithms alone, however, is probably not enough to overcome a \u00a7 230 defense in court, as in <em>Roommates.com<\/em> or <em>Backpage<\/em>, because the judiciary is not well situated to understand the intricacies of algorithms and their psychological effects.<\/p>\n<p><a id=\"post-843-_Toc62051615\"><\/a><a id=\"post-843-_Toc81056772\"><\/a> Reforming \u00a7 230<\/p>\n<p>Society won\u2019t fall apart without blanket immunity for platforms: the Internet will adapt.<sup><a id=\"post-843-footnote-ref-108\" href=\"#post-843-footnote-108\">[107]<\/a><\/sup> Platforms are more than mere conduits for user content; they are active players in shaping communication online.<sup><a id=\"post-843-footnote-ref-109\" href=\"#post-843-footnote-109\">[108]<\/a><\/sup> Whether or not they have a hand in a particular harm to a particular user is a more difficult question. But we ought to be wary of giving platforms a free pass on liability before ever reaching the merits. Citron and Wittes have proposed altering the language of \u00a7 230(c)(1) in the following way:<\/p>\n<p>No provider or user of an interactive computer service that\u00a0<em>takes reasonable steps to prevent or address unlawful uses of its services<\/em>\u00a0shall be treated as the publisher or speaker of any information provided by another information content provider\u00a0<em>in any action arising out of the publication of content provided by that information content provider.<\/em><sup><a id=\"post-843-footnote-ref-110\" href=\"#post-843-footnote-110\">[109]<\/a><\/sup><\/p>\n<p>These changes would effectively make platform immunity contingent upon platforms having a reasonable policy or process to prevent or address unlawful uses of the platform. 
Citron and Franks later expanded on how this might play out in the context of a platform\u2019s motion to dismiss on \u00a7 230 grounds: \u201cThe question would not be whether a platform acted reasonably with regard to a specific use .\u00a0.\u00a0. [but rather] whether the [platform] engaged in reasonable content moderation practices writ large with regard to unlawful uses that clearly create serious harm to others.\u201d<sup><a id=\"post-843-footnote-ref-111\" href=\"#post-843-footnote-111\">[110]<\/a><\/sup> Further, Citron and Wittes suggest that an analysis of what constitutes \u201creasonable\u201d would take into account factors such as volume of content, whether unlawful actions were encouraged, and whether requests to remove content were addressed in order to account for the differences between ISPs, social media platforms, and other interactive computer services.<sup><a id=\"post-843-footnote-ref-112\" href=\"#post-843-footnote-112\">[111]<\/a><\/sup><\/p>\n<p>This would be a careful and well-balanced first step to addressing the challenges created by \u00a7 230 because it would (1) require that all social platforms, and other types of platforms, adopt reasonable moderation capabilities and policies, eliminating the problems elucidated by <em>Herrick v. Grindr<\/em>;<sup><a id=\"post-843-footnote-ref-113\" href=\"#post-843-footnote-113\">[112]<\/a><\/sup> and (2) not necessarily prompt over-moderation because the question focuses on reasonable efforts to moderate writ large rather than in a particular instance. However, given that a few firms hold vast market power in the tech sector and that many of these firms have already developed content moderation policies, platforms will likely still be able to dismiss cases relatively easily under this \u00a7 230 framework. 
It also does not get at the problem of platform transparency and accountability because individual moderation and curation decisions could continue to play out behind closed doors at the behest of powerful actors.<\/p>\n<p>To incentivize greater transparency and accountability, I also propose changing \u00a7 230(c)(2)(A) to read that \u201cno provider or user of an interactive computer service shall be held liable on account of any action <em>reasonably taken, and that is made in accordance with a reasonably transparent process,<\/em> to restrict access to or availability of material\u2026\u201d (italics are proposed changes). What constitutes \u201creasonable\u201d action here might include consideration of the degree of deliberation given to the content and whether the action conformed to a written public policy of the platform. I suggest writing into the statute a requirement for platforms to make aspects of the moderation process more transparent to quell concerns that platforms may be censoring certain voices disproportionately.<sup><a id=\"post-843-footnote-ref-114\" href=\"#post-843-footnote-114\">[113]<\/a><\/sup> The goal is to create a more legitimate system of accountability without recreating harms.<\/p>\n<p>A reasonableness standard combined with transparency would force platforms that wish to continue benefitting from \u00a7 230 to develop clear public-facing policies and to explain decisions about content moderation, which might help to expose any effort to censor on the basis of race, gender, class, or political affiliation. However, public outcry would still be necessary to shame platforms into curtailing censorship on discriminatory grounds because the First Amendment does not bind private parties, who remain free to suppress others\u2019 speech and to engage in hate speech. Transparency would at least pave the way to greater accountability on the part of platforms.<\/p>\n<p>Of course, these reforms will likely prompt First Amendment challenges. 
Citron and Wittes contend that conditioning \u00a7 230(c)(1) immunity upon reasonable efforts to moderate does not burden free speech interests because it merely rolls back an immunity that is not required by the First Amendment.<sup><a id=\"post-843-footnote-ref-115\" href=\"#post-843-footnote-115\">[114]<\/a><\/sup> On the other hand, one could make the creative argument that this change would be a type of compelled speech in that the government is dictating that platforms must engage in some level of moderation. I don\u2019t think this counterargument is likely to win out because it seems well settled that systems of liability necessarily encourage and discourage certain behaviors. Thus, it appears plausible that Congress can constitutionally encourage moderation by dangling the carrot of immunity under \u00a7 230. Conditional immunity would force platforms to think carefully about how to moderate content and build the architecture of the platform with safeguards in place if they want to benefit from immunity.<\/p>\n<p>Adding a reasonableness standard in both \u00a7\u00a7 230(c)(1) and (c)(2) might make it less likely that a platform could get away with disproportionately moderating the content of vulnerable groups. However, a transparency requirement might also be challenged on First Amendment grounds. I think the transparency requirement would withstand scrutiny because, like the recommended change to (c)(1), it only modifies or conditions an immunity that is not guaranteed by the First Amendment. Like the reasonableness requirement, a transparency requirement seeks to encourage transparent behavior in order to receive the carrot of immunity. 
Intermediaries would not be forced to reveal proprietary information but would have to implement and produce some evidence of a reasonable process by which users could inquire about moderation decisions that impact them, and potentially reverse the decision.<sup><a id=\"post-843-footnote-ref-116\" href=\"#post-843-footnote-116\">[115]<\/a><\/sup><\/p>\n<p><a id=\"post-843-_Toc81056773\"><\/a> Conclusion<\/p>\n<p>If \u00a7 230 remains as is, then the victims of cyberattacks\u2014largely, women and minorities\u2014will continue to be driven offline by harassment and abuse with little to no recourse for justice in the likely event that they cannot identify the individual perpetrator. And if victims do bring a claim against a platform for some theory of harm, then the broad reading that courts have given \u00a7 230 may end the claim\u2019s life at the pleading stage. The threat doesn\u2019t dissipate; it persists, driving victims away from an essential medium of communication and civic involvement today and for the foreseeable future. Platforms have little incentive to provide help when they are immune from expansive tort liability, financially benefitting from the attention created by harassment and abuse, and powerful enough to withstand backlash from the minority of users affected. Therefore, to incentivize platforms to exclude bad actors and protect vulnerable populations\u2019 ability to engage in democracy, \u00a7 230 should be revised. First, immunity under \u00a7 230 should be conditioned on the platform taking reasonable steps to moderate online content. This means that (c)(1) should be revised to make exemption from speaker\/publisher treatment contingent upon reasonable efforts to moderate, and (c)(2) should be revised to make immunity from liability for actions taken to moderate content contingent upon reasonableness and transparency.<\/p>\n<ol>\n<li id=\"post-843-footnote-2\">* J.D. University of Colorado, Class of 2021; B.A. 
University of Dayton, Class of 2018. Special thanks to the CTLJ members for their work on this note, and to my professors for their support and feedback along the way. <a href=\"#post-843-footnote-ref-2\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-3\"><em> . See generally<\/em> Tim Wu, The Attention Merchants: The Epic Scramble to Get Inside Our Heads 5\u20136 (2016) (describing the evolution of rapid commercialization, advertisement, and social media and its effects on daily lives and the economy). <a href=\"#post-843-footnote-ref-3\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-4\"><em> . Id. <\/em>at 335\u201336. <a href=\"#post-843-footnote-ref-4\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-5\"><em> . See, e.g.<\/em>, Danielle Keats Citron, Hate Crimes in Cyberspace 73\u201380 (Harvard U. Press, 2014). <a href=\"#post-843-footnote-ref-5\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-6\"><em> . See <\/em>Soraya Chemaly, <em>There\u2019s No Comparing Male and Female Harassment Online, <\/em>Time (Sept. 9, 2014, 10:55 AM), https:\/\/time.com\/3305466\/male-female-harassment-online\/ [https:\/\/perma.cc\/WWA7-P2KG] (\u201c70 percent of those stalked online are women. More than 80 percent of cyber-stalking defendants are male.\u201d). <a href=\"#post-843-footnote-ref-6\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-7\"><em> . See id. <\/em>(\u201c[A] study of 1,606 revenge porn cases showed that 90% of those whose photos were shared were women, targeted by men.\u201d). <a href=\"#post-843-footnote-ref-7\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-8\"><em> . See id.<\/em>; <em>see also <\/em>Ruha Benjamin, Race After Technology 23 (Polity Press, 2019) (providing an example of racial targeting online, \u201c[White nationalists] are especially fond of Twitter and use it to spread their message, grow their network, disguise themselves online, and generate harassment campaigns that target people of color, especially Black women.\u201d). 
<a href=\"#post-843-footnote-ref-8\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-9\"><em> . See generally Online violence: Just because it\u2019s virtual doesn\u2019t make it any less real<\/em>, Global Fund for Women, https:\/\/www.globalfundforwomen.org\/online-violence-just-because-its-virtual-doesnt-make-it-any-less-real\/ [https:\/\/perma.cc\/LR4U-Y232]. <a href=\"#post-843-footnote-ref-9\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-10\"><em> . See generally WMC Speech Project: Online Abuse 101, <\/em>Women\u2019s Media Ctr., https:\/\/www.womensmediacenter.com\/speech-project\/online-abuse-101\/ [https:\/\/perma.cc\/C2SF-6K4G] (explaining kinds of online abuse from a civil rights perspective). <a href=\"#post-843-footnote-ref-10\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-11\"><em> . See <\/em>Danielle Keats Citron &amp; Mary Anne Franks, <em>The Internet as a Speech Machine and Other Myths Confounding Section 230,<\/em> 2020 U. of Chi. Legal F. 45, 53\u201354 (2020) (\u201cYet the online advertising business model continues to incentivize revenue-generating content that causes significant harm to the most vulnerable among us. Online abuse generates traffic, clicks, and shares because it is salacious and negative. Deep fake pornography sites as well as revenge porn and gossip sites thrive thanks to advertising revenue.\u201d). <a href=\"#post-843-footnote-ref-11\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-12\"><em> . See generally <\/em>The Social Dilemma (Argent Pictures 2020) (the documentary-drama hybrid explores the dangerous human impact of social networking); <em>see also,<\/em> Olivier Sylvain, <em>Discriminatory Designs on User Data<\/em>, Knight First Amendment Inst. (Apr. 
1, 2018), https:\/\/knightcolumbia.org\/content\/discriminatory-designs-user-data [https:\/\/perma.cc\/BS2Z-8SGH] (\u201cThe third part of the paper turns to the designs that intermediaries employ to structure and enhance their users\u2019 experience, and how these designs themselves can further discrimination.\u201d). <a href=\"#post-843-footnote-ref-12\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-13\"><em> . See <\/em>Sylvain, <em>supra <\/em>note 10<em>;<\/em> <em>see also<\/em> Safiya Umoja Noble, Algorithms of Oppression 1 (N.Y. U. Press 2018) (arguing that search engines reinforce racism through algorithms). <a href=\"#post-843-footnote-ref-13\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-14\"><em> . See <\/em>Citron &amp; Franks, <em>supra <\/em>note 9, at 46. <a href=\"#post-843-footnote-ref-14\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-15\"><em> . Id<\/em>. at 53 (\u201cWhat often motivates [banning, filtering, and blocking decisions] is pressure from the European Commission to remove hate speech and terrorist activity. The same companies have banned certain forms of online abuse\u2026in response to pressure from users, advocacy groups, and advertisers. They have expended resources to stem abuse when it has threatened their bottom line.\u201d). <a href=\"#post-843-footnote-ref-15\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-16\">. Kate Klonick, <em>The New Governors: The People, Rules, and Processes Governing Online Speech<\/em>, 131 Harv. L. Rev. 1598, 1621 (2018) (\u201cAmerican lawyers trained and acculturated in American free speech norms and First Amendment law oversaw the development of company content-moderation policy. Though they might not have \u2018directly imported First Amendment doctrine,\u2019 the normative background in free speech had a direct impact on how they structured their policies.\u201d). <a href=\"#post-843-footnote-ref-16\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-17\"><em> . 
See<\/em> <em>State Action Requirement<\/em>, Legal Info. Inst., https:\/\/www.law.cornell.edu\/wex\/state_action_requirement [https:\/\/perma.cc\/5JU5-CGEQ]. <a href=\"#post-843-footnote-ref-17\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-18\">. <em>See generally<\/em> Danielle Keats Citron &amp; Benjamin Wittes, <em>The Internet Will Not Break: Denying Bad Samaritans \u00a7 230 Immunity<\/em>, 86 Fordham L. Rev. 401 (2017). <a href=\"#post-843-footnote-ref-18\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-19\"><a id=\"post-843-_Hlk71041101\"><\/a> . John Bergmayer, <em>What Section 230 Is and Does \u2013 Yet Another Explanation of One of the Internet\u2019s Most Important Laws<\/em>, Pub. Knowledge (May 14, 2019), https:\/\/www.publicknowledge.org\/blog\/what-section-230-is-and-does-yet-another-explanation-of-one-of-the-internets-most-important-laws\/ [https:\/\/perma.cc\/ZPC3-AL4X] (\u201cAfter all, this is why it was enacted as part of the Communications Decency Act, most of the rest of which was struck down as unconstitutional, but which was broadly aimed at scrubbing the internet of porn.\u201d). <a href=\"#post-843-footnote-ref-19\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-20\"><em> . Id<\/em>. <a href=\"#post-843-footnote-ref-20\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-21\">. 47 U.S.C. \u00a7 230 (2018); <em>see also<\/em> Blake Reid, <em>Section 230 of\u2026what?<\/em>, blake.e.reid (Sept. 4, 2020), https:\/\/blakereid.org\/section-230-of-what\/ [https:\/\/perma.cc\/XK5B-SQCD]. <a href=\"#post-843-footnote-ref-21\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-22\"><em> . See <\/em>47 U.S.C. \u00a7 230(c) (2018). <a href=\"#post-843-footnote-ref-22\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-23\"><em> . See id<\/em>. \u00a7 230(c)(1). <a href=\"#post-843-footnote-ref-23\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-24\"><em> . See id<\/em>. \u00a7 230(c)(2). 
<a href=\"#post-843-footnote-ref-24\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-25\"><em> . See<\/em> Stratton Oakmont, Inc. v. Prodigy Servs. Co., 1995 WL 323710 (N.Y. Sup. Ct. May 24, 1995). <a href=\"#post-843-footnote-ref-25\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-26\"><em> . Id<\/em>. at *5. <a href=\"#post-843-footnote-ref-26\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-27\"><em> . See<\/em> Bergmayer, <em>supra<\/em> note 16 (quoting the court in <em>Cubby<\/em>, \u201c[a] lower standard of liability to an electronic news distributor such as CompuServe than that which is applied to a public library, bookstore, or newsstand would impose an undue burden on the free flow of information.\u201d). <a href=\"#post-843-footnote-ref-27\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-28\"><em> . See<\/em> <em>id<\/em>.; <em>see also <\/em>Jeff Kosseff, The Twenty-Six Words that Created the Internet 42 (2019). <a href=\"#post-843-footnote-ref-28\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-29\"><em> . See <\/em>Kosseff, <em>supra<\/em> note 25, at 42\u201343. <a href=\"#post-843-footnote-ref-29\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-30\"><em> . Id<\/em>. at 52. <a href=\"#post-843-footnote-ref-30\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-31\"><em> . See id.<\/em> at 52\u201356. <a href=\"#post-843-footnote-ref-31\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-32\"><em> . Id. <\/em>at 59. <a href=\"#post-843-footnote-ref-32\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-33\"><em> . Id<\/em>. at 60. <a href=\"#post-843-footnote-ref-33\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-34\"><em> . Id<\/em>. at 61. <a href=\"#post-843-footnote-ref-34\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-35\"><em> . See<\/em> Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997); <em>but see<\/em> Blumenthal v. Drudge, 992 F.Supp. 44 (D.D.C. 1998). <a href=\"#post-843-footnote-ref-35\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-36\"><em> . 
Zeran<\/em>, 129 F.3d at 332\u201333. <a href=\"#post-843-footnote-ref-36\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-37\"><em> . Blumenthal<\/em>, 992 F.Supp. at 51. <a href=\"#post-843-footnote-ref-37\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-38\"><em> . See<\/em> Fair Hous. Council of San Fernando Valley v. Roommates.com, L.L.C., 521 F.3d 1157 (9th Cir. 2008). <a href=\"#post-843-footnote-ref-38\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-39\"><em> . Id.<\/em> at 1164 (\u201cBy requiring subscribers to provide the information as a condition of accessing its service, and by providing a limited set of pre-populated answers, Roommate becomes much more than a passive transmitter of information provided by others; it becomes the developer, at least in part, of that information.\u201d). <a href=\"#post-843-footnote-ref-39\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-40\"><em> . Id<\/em>. <a href=\"#post-843-footnote-ref-40\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-41\">. Packingham v. North Carolina, 137 S. Ct. 1730, 1737 (2017). <a href=\"#post-843-footnote-ref-41\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-42\"><em> . Id.<\/em>; <em>see, e.g<\/em>., <em>Packingham v. North Carolina<\/em>, 131 Harv. L. Rev. 233, 233 (Nov. 10, 2017), https:\/\/harvardlawreview.org\/2017\/11\/packingham-v-north-carolina\/ [https:\/\/perma.cc\/QD7R-X4C3]. <a href=\"#post-843-footnote-ref-42\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-43\">. Kosseff, <em>supra<\/em> note 25, at 9; <em>but see<\/em> Carrie Goldberg (@cagoldberglaw), Twitter (Dec. 30, 2020, 1:43 PM), https:\/\/twitter.com\/cagoldberglaw\/status\/1344383688507879426?s=20 [https:\/\/perma.cc\/J7F2-DUMD] (\u201cSection 230 did not create the internet as we know it. The shift from subscription based profit models to \u2018free\u2019 user data-mining and advertising profit models is what created the internet. 
It\u2019s when the user stopped being the customer and started being the commodity.\u201d). <a href=\"#post-843-footnote-ref-43\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-44\"><em> . Recode Decode: CDA 230<\/em>, Decoder with Nilay Patel (Aug. 23, 2019), https:\/\/www.podchaser.com\/podcasts\/decoder-with-nilay-patel-100800\/episodes\/recode-decode-cda-230-43792630 [https:\/\/perma.cc\/UC2H-NM9A]. <a href=\"#post-843-footnote-ref-44\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-45\"><em> . Id.<\/em> <a href=\"#post-843-footnote-ref-45\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-46\"><em> . Id.<\/em> <a href=\"#post-843-footnote-ref-46\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-47\"><em> . Id.<\/em> <a href=\"#post-843-footnote-ref-47\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-48\"><em> . See <\/em>Bergmayer, <em>supra <\/em>note 16 (arguing that case law was underdeveloped before Section 230, as evidenced by <em>Cubby <\/em>and <em>Stratton Oakmont<\/em> and that \u201c[s]imple repeal could lead to unmoderated cesspools on the one hand, and responsible platforms beset by lawsuits and crippled by damages on the other\u201d). <a href=\"#post-843-footnote-ref-48\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-49\"><em> . See id.<\/em> <a href=\"#post-843-footnote-ref-49\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-50\"><em> . See Recode Decode: CDA 230<\/em>, <em>supra<\/em> note 41. <a href=\"#post-843-footnote-ref-50\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-51\"><em> . But see<\/em> Carrie Goldberg (@cagoldberglaw), Twitter (May 12, 2018, 3:26 PM), https:\/\/twitter.com\/cagoldberglaw\/status\/995415010678624257 [https:\/\/perma.cc\/E2L3-SE9E] (arguing that platforms could buy insurance to avoid the projected financial burdens from increased tort liability). <a href=\"#post-843-footnote-ref-51\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-52\"><em> . 
See generally <\/em>Bergmayer, <em>supra<\/em> note 16 (describing the previously discussed leading cases against online platforms before Section 230, one treating the platform as more akin to a publisher of speech and the other treating it as a distributor of speech). <a href=\"#post-843-footnote-ref-52\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-53\">. Citron &amp; Franks,<em> supra<\/em> note 9, at 59 (\u201cWhen \u2018courts routinely interpret Section 230 to immunize all claims based on third-party content,\u2019 \u2013including civil rights violations; \u2018negligence; deceptive trade practices, unfair competition, and false advertising; the common law privacy torts; tortious interference with contract or business relations; intentional infliction of emotional distress; and dozens of other legal doctrines\u2019 \u2013they go far beyond existing First Amendment doctrine, and grant online intermediaries an unearned advantage over offline intermediaries.\u201d). <a href=\"#post-843-footnote-ref-53\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-54\"><em> . See, e.g.,<\/em> James Grimmelmann, <em>To Err Is Platform<\/em>, Knight First Amendment Inst. (Apr. 6, 2018), https:\/\/knightcolumbia.org\/content\/err-platform [https:\/\/perma.cc\/Q9QU-SPPM]. <a href=\"#post-843-footnote-ref-54\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-55\"><em> . Id.<\/em> <a href=\"#post-843-footnote-ref-55\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-56\">. Mary Anne Franks, <em>The Free Speech Black Hole:<\/em> <em>Can the Internet Escape the Gravitational Pull of the First Amendment<\/em>? Knight First Amendment Inst. (Aug.
21, 2019), https:\/\/knightcolumbia.org\/content\/the-free-speech-black-hole-can-the-internet-escape-the-gravitational-pull-of-the-first-amendment [<a href=\"https:\/\/perma.cc\/EJC2-ZP36\">https:\/\/perma.cc\/EJC2-ZP36<\/a>] (\u201cThe assertion that regulating speech inevitably chills speech is false: given that some forms of speech themselves inflict chilling effects, regulating those forms of speech may actually serve free speech interests.\u201d). <a href=\"#post-843-footnote-ref-56\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-57\">. Citron &amp; Franks, <em>supra<\/em> note 9, at 68. <a href=\"#post-843-footnote-ref-57\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-58\">. Franks, <em>supra<\/em> note 53. <a href=\"#post-843-footnote-ref-58\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-59\"><em> . Id.<\/em>;<em> but see<\/em> Daphne Keller, <em>Toward a Clearer Conversation About Platform Liability<\/em>, Knight First Amendment Inst. (Apr. 6, 2018), https:\/\/knightcolumbia.org\/content\/toward-clearer-conversation-about-platform-liability [https:\/\/perma.cc\/9RL3-GVGY] (\u201c[W]hile\u2026vulnerable groups suffer disproportionately when platforms take down too little content, they also suffer disproportionately when platforms take down too much.\u201d); Citron &amp; Franks, <em>supra<\/em> note 9, at 67 (\u201cSection 230 already has a mechanism to address the unwarranted silencing of viewpoints. Under Section 230(c)(2), users or providers of interactive computer services enjoy immunity from liability for over-filtering or over-blocking speech only if they acted in \u2018good faith.\u2019\u201d). <a href=\"#post-843-footnote-ref-59\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-60\"><em> . 
See generally<\/em> Citron &amp; Wittes<em>, supra <\/em>note 16. <a href=\"#post-843-footnote-ref-60\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-61\"><em> . Id. <\/em>at 410. <a href=\"#post-843-footnote-ref-61\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-62\"><em> . See generally <\/em>Evelyn Douek, <em>Governing Online Speech: From \u201cPosts-As-Trumps\u201d to Proportionality &amp; Probability<\/em>, 121 Colum. L. Rev. 759 (2021) (advocating for content limitations proportionate to societal interests). <a href=\"#post-843-footnote-ref-62\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-63\"><em> . Id.<\/em> at 42\u201343. <a href=\"#post-843-footnote-ref-63\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-64\"><em> . See, e.g.<\/em>,<em> id.<\/em> at 44\u201345<em>; <\/em>Klonick, <em>supra<\/em> note 14. <a href=\"#post-843-footnote-ref-64\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-65\"><em> . See<\/em> Klonick, <em>supra<\/em> note 14, at 1603 (\u201cIf this fails and regulation is needed, it should be designed to strike a balance between preserving the democratizing forces of the internet and protecting the generative power of our New Governors, with a full and accurate understanding of how and why these platforms operate, as presented here.\u201d); <em>see also<\/em> Douek, <em>supra<\/em> note 59, at 7 (\u201cBut changing the regulatory environment without a proper understanding of content moderation in practice will make the laws ineffective or, worse, create unintended consequences. Regulators need to understand the inherent characteristics of the systems they seek to reform.\u201d); Sylvain <em>supra<\/em> note 10 (arguing that \u201c[j]udges, lawyers, and legislators should\u2026start looking carefully at how intermediaries\u2019 designs on user content do or do not result in actionable injuries.\u201d). <a href=\"#post-843-footnote-ref-65\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-66\">. 
Kosseff, <em>supra <\/em>note 25, at 210\u201311. <a href=\"#post-843-footnote-ref-66\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-67\"><em> . Id<\/em>. <a href=\"#post-843-footnote-ref-67\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-68\"><em> . Id.<\/em> <a href=\"#post-843-footnote-ref-68\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-69\"><em> . See, e.g.<\/em>, Citron &amp; Wittes <em>supra<\/em> note 57, at 420; <em>see<\/em> <em>also<\/em> Citron &amp; Franks, <em>supra<\/em> note 9, at 55. <a href=\"#post-843-footnote-ref-69\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-70\"><em> . See generally<\/em> Carrie Goldberg, Nobody\u2019s Victim: Fighting Psychos, Stalkers, Pervs, and Trolls (Plume 2019); <em>see also<\/em> Kosseff, <em>supra<\/em> note 25, at 209. <a href=\"#post-843-footnote-ref-70\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-71\">. Citron, <em>supra<\/em> note 3, at 95. <a href=\"#post-843-footnote-ref-71\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-72\"><em> . Id.<\/em> <a href=\"#post-843-footnote-ref-72\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-73\"><em> . Id.<\/em> at 96\u201399. <a href=\"#post-843-footnote-ref-73\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-74\"><em> . Id.<\/em> at 100. <a href=\"#post-843-footnote-ref-74\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-75\"><em> . Id.<\/em> <a href=\"#post-843-footnote-ref-75\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-76\"><em> . Id.<\/em> <a href=\"#post-843-footnote-ref-76\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-77\">. Brittan Heller, <em>Enlisting Useful Idiots: The Ties Between Online Harassment and Disinformation<\/em>, 19.1 Colo. Tech. L.J. 19, 20 (2021). <a href=\"#post-843-footnote-ref-77\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-78\"><em> . Id. <\/em>at 26. <a href=\"#post-843-footnote-ref-78\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-79\"><a id=\"post-843-OLE_LINK143\"><\/a><a id=\"post-843-OLE_LINK144\"><\/a>. 
Citron &amp; Franks, <em>supra<\/em> note 9, at 68 (\u201cThe Internet lowers the costs of engaging in abuse by providing abusers with anonymity and social validation, while providing new ways to increase the range and impact of that abuse. The online abuse of women in particular amplifies sexist stereotyping and discrimination, compromising gender equality online and off.\u201d). <a href=\"#post-843-footnote-ref-79\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-80\"><a id=\"post-843-OLE_LINK145\"><\/a><a id=\"post-843-OLE_LINK146\"><\/a>. Kosseff, <em>supra <\/em>note 25, at 221. <a href=\"#post-843-footnote-ref-80\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-81\"><a id=\"post-843-OLE_LINK147\"><\/a><a id=\"post-843-OLE_LINK148\"><\/a><em>. Id.<\/em> <a href=\"#post-843-footnote-ref-81\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-82\"><a id=\"post-843-OLE_LINK149\"><\/a><a id=\"post-843-OLE_LINK150\"><\/a><em>. Id.<\/em> at 221\u201322. <a href=\"#post-843-footnote-ref-82\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-83\"><a id=\"post-843-OLE_LINK151\"><\/a><a id=\"post-843-OLE_LINK152\"><\/a><em> . See<\/em> Packingham v. North Carolina, 137 S. Ct. 1730, 1732 (2017). <a href=\"#post-843-footnote-ref-83\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-84\"><a id=\"post-843-OLE_LINK153\"><\/a><a id=\"post-843-OLE_LINK154\"><\/a><em>. See generally,<\/em> <em>e.g.<\/em>, Citron, <em>supra<\/em> note 3. <a href=\"#post-843-footnote-ref-84\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-85\"><a id=\"post-843-OLE_LINK155\"><\/a><a id=\"post-843-OLE_LINK156\"><\/a>. Klonick, <em>supra<\/em> note 14, at 1603. <a href=\"#post-843-footnote-ref-85\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-86\"><a id=\"post-843-OLE_LINK157\"><\/a><a id=\"post-843-OLE_LINK158\"><\/a><em>. Id<\/em>. at 1617. <a href=\"#post-843-footnote-ref-86\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-87\"><em> . 
See<\/em> <a id=\"post-843-OLE_LINK159\"><\/a><a id=\"post-843-OLE_LINK160\"><\/a><em>id<\/em>. at 1644. <a href=\"#post-843-footnote-ref-87\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-88\"><a id=\"post-843-OLE_LINK161\"><\/a><a id=\"post-843-OLE_LINK162\"><\/a><em>. See generally<\/em> <em>id<\/em>. at 1661 (\u201cIn the years since Reno, the hold of certain platforms has arguably created scarcity\u2014if not of speech generally, undoubtedly of certain mediums of speech that these platforms provide.\u201d). <a href=\"#post-843-footnote-ref-88\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-89\"><a id=\"post-843-OLE_LINK163\"><\/a><a id=\"post-843-OLE_LINK164\"><\/a><em>. See generally<\/em> <em>id<\/em>. at 1629. <a href=\"#post-843-footnote-ref-89\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-90\"><em> . Id<\/em>. at 1649. <a href=\"#post-843-footnote-ref-90\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-91\"><em> . See <\/em><a id=\"post-843-OLE_LINK165\"><\/a><a id=\"post-843-OLE_LINK166\"><\/a><em>id<\/em>. at 1650\u201352. <a href=\"#post-843-footnote-ref-91\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-92\"><em> . See id<\/em>. at 1652\u201353. <a href=\"#post-843-footnote-ref-92\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-93\"><em> . See id<\/em>. at 1655\u201356. <a href=\"#post-843-footnote-ref-93\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-94\"><em> . See id.<\/em> at 1657. <a href=\"#post-843-footnote-ref-94\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-95\">. Rory Van Loo, <em>Federal Rules of Platform Procedure<\/em>, U. of Chi. L. Rev. (forthcoming) (manuscript at 31), https:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=3576562 [https:\/\/perma.cc\/U7QC-6TTY]. <a href=\"#post-843-footnote-ref-95\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-96\">. Klonick, <em>supra<\/em> note 14, at 1615. <a href=\"#post-843-footnote-ref-96\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-97\"><em> . See id.<\/em> at 1627. 
<a href=\"#post-843-footnote-ref-97\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-98\"><em> . See <\/em>Citron &amp; Franks, <em>supra <\/em>note 9, at 52. <a href=\"#post-843-footnote-ref-98\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-99\"><em> . See<\/em> Van Loo, <em>supra<\/em> note 92, at 30\u201331 (giving an example of TripAdvisor taking down bad reviews in order to protect advertisers, which resulted in harms to individuals who relied on good reviews). <a href=\"#post-843-footnote-ref-99\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-100\"><em> . See<\/em> Citron &amp; Franks, <em>supra<\/em> note 9, at 53. <a href=\"#post-843-footnote-ref-100\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-101\">. Evelyn Douek, <em>Facebook\u2019s \u2018Oversight Board:\u2019 Move Fast with Stable Infrastructure and Humility<\/em>, 21 N.C.L.J &amp; Tech. 1, 42\u201343 (2019);<em> see also<\/em> Klonick, <em>supra<\/em> note 14, at 1660 (\u201cFor the content that stays up\u2014like a newspaper determining what space to allot certain issues\u2014platforms also have intricate algorithms to determine what material a user wants to see and what material should be minimized within a newsfeed, homepage, or stream.\u201d). <a href=\"#post-843-footnote-ref-101\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-102\">. Sylvain, <em>supra<\/em> note 10, at 2. <a href=\"#post-843-footnote-ref-102\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-103\">. The Social Dilemma (Argent Pictures 2020). <a href=\"#post-843-footnote-ref-103\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-104\"><em> . See generally <\/em>Jillian Warren, <em>This is How the Instagram Algorithm Works in 2021<\/em>, Later (Jan. 4, 2021), https:\/\/later.com\/blog\/how-instagram-algorithm-works\/ [https:\/\/perma.cc\/M5ZK-DJCX]. <a href=\"#post-843-footnote-ref-104\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-105\"><em> . See,<\/em> <em>e.g<\/em>., Klonick <em>supra<\/em> note, 14 at 1667. 
<a href=\"#post-843-footnote-ref-105\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-106\">. Elizabeth Dwoskin &amp; Nitasha Tiku, <em>Facebook sent home thousands of human moderators due to the coronavirus. Now the algorithms are in charge<\/em>, Wash. Post (Mar. 24, 2020, 3:55 PM), https:\/\/www.washingtonpost.com\/technology\/2020\/03\/23\/facebook-moderators-coronavirus\/ [https:\/\/perma.cc\/83HT-YE4C]. <a href=\"#post-843-footnote-ref-106\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-107\">. Note that Ruha Benjamin expresses the genuine concern that the deployment of new technologies in the tech world perpetuates racial inequalities through what she terms \u201cthe New Jim Code,\u201d or \u201cthe employment of new technologies that reflect and reproduce existing inequities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era.\u201d Benjamin, <em>supra <\/em>note 6, at 5\u20136. <a href=\"#post-843-footnote-ref-107\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-108\"><em> . See<\/em> <em>generally<\/em> Citron &amp; Wittes,\u00a0<em>supra<\/em> note 57. <a href=\"#post-843-footnote-ref-108\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-109\">. Sylvain, <em>supra<\/em> note 10. <a href=\"#post-843-footnote-ref-109\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-110\">. Citron &amp; Wittes, <em>supra<\/em> note 57, at 419. <a href=\"#post-843-footnote-ref-110\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-111\">. Citron &amp; Franks, <em>supra<\/em> note 9, at 22. <a href=\"#post-843-footnote-ref-111\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-112\"><em> . See <\/em>Citron &amp; Wittes, <em>supra<\/em> note 57, at 419. <a href=\"#post-843-footnote-ref-112\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-113\">. Herrick v. Grindr L.L.C., 765 Fed. Appx. 586 (2019); Carrie Goldberg, <em>Herrick v. Grindr: Why Section 230 of the Communications Decency Act Must Be Fixed<\/em>, Lawfare (Aug. 
14, 2019), https:\/\/www.lawfareblog.com\/herrick-v-grindr-why-section-230-communications-decency-act-must-be-fixed [https:\/\/perma.cc\/QJT7-M3WN] (Section 230 immunity granted to Grindr in a products liability tort suit alleging that Grindr harmed Herrick by not taking down a profile impersonating him, and by not even having the capability to do so built into the architecture of the platform). <a href=\"#post-843-footnote-ref-113\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-114\"><em> . See <\/em>David Kaye, Speech Police: The Global Struggle to Govern the Internet 10\u201311 (2019) (Kaye argues, as an alternative to regulation, that platforms should be more transparent about how they arrive at policy choices, how they make decisions when moderating content, and how their algorithms make decisions). <a href=\"#post-843-footnote-ref-114\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-115\">. Citron &amp; Wittes, <em>supra<\/em> note 57, at 419\u201320. <a href=\"#post-843-footnote-ref-115\">\u2191<\/a><\/li>\n<li id=\"post-843-footnote-116\">. For a more in-depth process discussion, <em>see generally<\/em> Rory Van Loo, <em>Federal Rules of Platform Procedure<\/em>, U. of Chi. L. Rev. (forthcoming) (manuscript at 31), https:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=3576562 [https:\/\/perma.cc\/VW4B-M9H3] (arguing that today\u2019s platforms need mandated procedures and legal standards for dispute resolution to foster transparency and accountability, much as financial institutions did before them.) <a href=\"#post-843-footnote-ref-116\">\u2191<\/a><\/li>\n<\/ol>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Regulating the Social Puppeteers:\u00a0\u00a7 230 &amp; Marginalized Speech Kylie Thompson[1]* Print Version: Regulating the Social Puppeteers- \u00a7 230 &amp; Marginalized Speech &nbsp; Introduction 462 I. What is \u00a7 230? 465 A. Legislative History 465 B. Expansive Scope 466 C. Threading the \u00a7 230 needle 468 II.
The Power of Social Platforms 471 A. Harassment &amp; [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_uag_custom_page_level_css":"","site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"categories":[13,9,8],"tags":[],"class_list":["post-843","post","type-post","status-publish","format-standard","hentry","category-13","category-printed","category-volume19"],"uagb_featured_image_src":{"full":false,"thumbnail":false,"medium":false,"medium_large":false,"large":false,"1536x1536":false,"2048x2048":false,"portfolio_item-thumbnail":false,"portfolio_item-thumbnail@2x":false,"portfolio_item-masonry":false,"portfolio_item-masonry@2x":false,"portfolio_item-thumbnail_cinema":false,"portfolio_item-thumbnail_portrait":false,"portfolio_item-thumbnail_portrait@2x":false,"portfolio_item-thumbnail_square":false},"uagb_author_info":{"display_name":"Kylie Thompson","author_link":""},"uagb_comment_info":0,"uagb_excerpt":"Regulating the Social Puppeteers:\u00a0\u00a7 
230 &amp; Marginalized Speech Kylie Thompson[1]* Print Version: Regulating the Social Puppeteers- \u00a7 230 &amp; Marginalized Speech &nbsp; Introduction 462 I. What is \u00a7 230? 465 A. Legislative History 465 B. Expansive Scope 466 C. Threading the \u00a7 230 needle 468 II. The Power of Social Platforms 471 A. Harassment &amp;&hellip;","featured_media_urls":[],"_links":{"self":[{"href":"https:\/\/ctlj.colorado.edu\/index.php?rest_route=\/wp\/v2\/posts\/843","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ctlj.colorado.edu\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/ctlj.colorado.edu\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/ctlj.colorado.edu\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/ctlj.colorado.edu\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=843"}],"version-history":[{"count":4,"href":"https:\/\/ctlj.colorado.edu\/index.php?rest_route=\/wp\/v2\/posts\/843\/revisions"}],"predecessor-version":[{"id":853,"href":"https:\/\/ctlj.colorado.edu\/index.php?rest_route=\/wp\/v2\/posts\/843\/revisions\/853"}],"wp:attachment":[{"href":"https:\/\/ctlj.colorado.edu\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=843"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/ctlj.colorado.edu\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=843"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/ctlj.colorado.edu\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=843"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}