{"id":6401,"date":"2025-03-05T20:30:04","date_gmt":"2025-03-05T20:30:04","guid":{"rendered":"https:\/\/blogs.luc.edu\/compliance\/?p=6401"},"modified":"2025-04-02T19:10:17","modified_gmt":"2025-04-02T19:10:17","slug":"curbing-censorship-the-constitutional-challenges-of-addressing-social-media-moderation","status":"publish","type":"post","link":"https:\/\/blogs.luc.edu\/compliance\/?p=6401","title":{"rendered":"Curbing Censorship: The Constitutional Challenges of Addressing Social Media Moderation"},"content":{"rendered":"<p><em>Kate Rice <\/em><\/p>\n<p><em>Associate Editor<\/em><\/p>\n<p><em>Loyola University Chicago School of Law, JD 2026<\/em><\/p>\n<p>At a time when online spaces have become central for news, connection, and the exchange of ideas, the balance between free speech and content moderation is more important than ever. In recent months, there have been rising concerns over potential government censorship and the proliferation of misinformation, especially on social media. The lack of transparency in the tech industry makes this issue uniquely tricky, as each platform\u2019s distinct algorithms are largely <a href=\"https:\/\/www.bbc.com\/news\/articles\/cp8e4p4z97eo\">proprietary<\/a>. However, many users feel that their voices are being silenced based on the nature of the content they are releasing. 
The possibilities for remedying these concerns are limited, as the First Amendment protects private companies from government censorship (including any <a href=\"https:\/\/www.freedomforum.org\/free-speech-on-social-media\/\">requirement<\/a> that they host specific content), but there are several potential paths forward that could have far-reaching implications for the future of social media content moderation.<!--more--><\/p>\n<p><strong>Allegations of censorship<\/strong><\/p>\n<p>TikTok users in the United States have <a href=\"https:\/\/www.reuters.com\/technology\/am-i-being-censored-some-us-tiktok-users-say-app-feels-different-after-ban-2025-01-25\/\">reported<\/a> noticeable signs of censorship on the app, particularly following its brief <a href=\"https:\/\/www.reuters.com\/technology\/tiktok-awaits-trump-reprieve-china-signals-open-deal-2025-01-20\/\">ban<\/a> in January. Due to national security concerns over the Chinese-owned platform, Congress passed the <a href=\"https:\/\/www.congress.gov\/bill\/118th-congress\/house-bill\/7521\">Protecting Americans from Foreign Adversary Controlled Applications Act<\/a> (PAFACA) to restrict TikTok\u2019s operation in the U.S., raising First Amendment <a href=\"https:\/\/www.aclu.org\/news\/national-security\/banning-tiktok-is-unconstitutional-the-supreme-court-must-step-in\">questions<\/a> about the government\u2019s ability to control and limit digital platforms. Whether bans such as these constitute government censorship is still up for debate. Some TikTok users and other content creators <a href=\"https:\/\/freedom.press\/issues\/tiktok-ban-weakens-first-amendment\/\">argue<\/a> that PAFACA attempts to unlawfully control and limit their content. Others, particularly government officials, are <a href=\"https:\/\/www.american.edu\/sis\/news\/20250123-national-security-and-the-tik-tok-ban.cfm\">concerned<\/a> about foreign adversaries interfering in U.S. 
social discourse.<\/p>\n<p>After President Trump brought back TikTok via <a href=\"https:\/\/www.whitehouse.gov\/presidential-actions\/2025\/01\/application-of-protecting-americans-from-foreign-adversary-controlled-applications-act-to-tiktok\/\">executive order<\/a>, many users <a href=\"https:\/\/www.reuters.com\/technology\/am-i-being-censored-some-us-tiktok-users-say-app-feels-different-after-ban-2025-01-25\/\">flagged<\/a> that comments and tags containing certain words or phrases, such as \u201cFree Palestine\u201d or \u201cFree Luigi,&#8221; had been hidden. Practices like <a href=\"https:\/\/www.washingtonpost.com\/technology\/2024\/10\/16\/shadowban-social-media-algorithms-twitter-tiktok\/\">shadowbanning<\/a>, also known as \u201calgorithmic suppression,\u201d are one of the primary ways platforms silence specific creators and content by excluding them from other users\u2019 feeds. However, because the algorithms behind social media feeds are not <a href=\"https:\/\/www.bbc.com\/news\/articles\/cp8e4p4z97eo\">public<\/a>, it is impossible to prove these occurrences.<\/p>\n<p>TikTok is not the only platform with this problem; <a href=\"https:\/\/themarkup.org\/automated-censorship\/2024\/02\/25\/demoted-deleted-and-denied-theres-more-than-just-shadowbanning-on-instagram\">Instagram<\/a>, <a href=\"https:\/\/www.forbes.com\/sites\/antoniopequenoiv\/2025\/01\/29\/meta-settles-trumps-lawsuit-will-pay-25-million-to-end-censorship-claims\/\">Facebook<\/a>, and <a href=\"https:\/\/www.nbcnews.com\/tech\/social-media\/elon-musk-accused-censoring-laura-loomer-maga-republicans-x-rcna185569\">X<\/a> are among others that have faced claims of censorship by users. These accusations come at a time when hate speech is on the <a href=\"https:\/\/news.berkeley.edu\/2025\/02\/13\/study-finds-persistent-spike-in-hate-speech-on-x\/\">rise<\/a>, complicating the situation even further. 
Most social media platforms have <a href=\"https:\/\/carnegieendowment.org\/research\/2021\/04\/how-social-media-platforms-community-standards-address-influence-operations?lang=en#platform-policies\/?lang=en\">community guidelines<\/a> that users must agree to when creating an account, such as prohibitions on content promoting violence or hateful sentiment on the platform. Because it is unclear exactly how content is assessed, these guidelines leave ample room for abuse, depending on how broadly they are interpreted and who is making those calls.<\/p>\n<p><strong>Enforcement<\/strong><\/p>\n<p>The <a href=\"https:\/\/www.ftc.gov\/\">Federal Trade Commission<\/a> (FTC) is the agency responsible for regulating online platforms. On February 20, 2025, the FTC used its investigative authority to release a Request for Information (RFI) entitled, &#8220;<a href=\"https:\/\/www.ftc.gov\/system\/files\/ftc_gov\/pdf\/P251203CensorshipRFI.pdf\">Request for Public Comment Regarding Technology Platform Censorship<\/a>.&#8221; The RFI calls for technology platform users and employees to provide input on platforms\u2019 alleged practices of demonetizing and shadowbanning users due to their speech or affiliations. It <a href=\"https:\/\/www.jdsupra.com\/legalnews\/ftc-requests-public-comments-on-8664741\/#:~:text=In%20one%20of%20its%20first,the%20content%20of%20users'%20speech\">argues<\/a> that these practices can violate platforms\u2019 terms of service, run the risk of anti-competitive practices, and effectively result in censorship or violations of deceptive business practice laws. Alongside the release of the RFI, FTC Chairman Andrew N. Ferguson <a href=\"https:\/\/www.ftc.gov\/news-events\/news\/press-releases\/2025\/02\/federal-trade-commission-launches-inquiry-tech-censorship#:~:text=Today%2C%20the%20Federal%20Trade%20Commission,in%20response%20to%20the%20RFI.\">said<\/a>, \u201cTech firms should not be bullying their users . . . 
This inquiry will help the FTC better understand how these firms may have violated the law by silencing and intimidating Americans for speaking their minds.\u201d The FTC is <a href=\"https:\/\/www.hoganlovells.com\/en\/publications\/ftc-opens-discussion-on-technology-platform-censorship-1\">expected<\/a> to use this input to craft future rulemaking and enforcement actions on the matter.<\/p>\n<p>The FTC does have some power to act in these situations. If the agency identifies fraud, deception, or unfair business practices, it can <a href=\"https:\/\/www.ftc.gov\/enforcement#:~:text=The%20FTC%20enforces%20federal%20consumer,deception%20and%20unfair%20business%20practices.\">enforce<\/a> federal consumer protection laws to remedy the violations. If a platform is not transparent about its content moderation policies, this may constitute deception, which the FTC could use to justify regulatory action.<\/p>\n<p>However, tech firms are likely to aggressively defend their right to expression from the FTC. 
The <a href=\"https:\/\/constitution.congress.gov\/constitution\/amendment-1\/#:~:text=Congress%20shall%20make%20no%20law,for%20a%20redress%20of%20grievances.\">First Amendment<\/a> states that the federal government &#8220;shall make no law \u2026 abridging the freedom of speech or of the press.&#8221; If the FTC does act on this issue, companies will undoubtedly lean on this language to argue that a government agency cannot directly regulate content, or in effect, require that a platform host certain content\u2014regardless of materiality.<\/p>\n<p>Further, in 2024, the Supreme Court <a href=\"https:\/\/www.cato.org\/blog\/scotus-confirms-social-media-platforms-have-first-amendment-editorial-rights?gad_source=1&amp;gclid=CjwKCAiAzvC9BhADEiwAEhtlNz6c3H6VhwzxG1AAzrPcrl4OvHyiwgM-ddkAvr6HVxoz7Be8z6H5txoC7z0QAvD_BwE\">clarified<\/a> that social media platforms possess editorial rights under the First Amendment, which newspapers, magazines, and other forms of media also enjoy. Justice Elena Kagan <a href=\"https:\/\/www.supremecourt.gov\/opinions\/23pdf\/22-277_d18f.pdf\">wrote<\/a>, \u201cThe editorial judgments influencing the content of those feeds are, contrary to the Fifth Circuit\u2019s view, protected expressive activity . . . [platforms] include and exclude, organize and prioritize\u2014and in making millions of those decisions each day, produce their own distinctive compilations of expression. And while much about social media is new, the essence of that project is something this Court has seen before.\u201d<\/p>\n<p>It is also important to note that on February 18, 2025, President Trump signed an <a href=\"https:\/\/www.whitehouse.gov\/presidential-actions\/2025\/02\/ensuring-accountability-for-all-agencies\/\">executive order<\/a> mandating that all significant regulatory actions be submitted for review to the Office of Information and Regulatory Affairs (OIRA), strengthening presidential oversight of executive agencies. 
The <a href=\"https:\/\/www.hoganlovells.com\/en\/publications\/ftc-opens-discussion-on-technology-platform-censorship-1\">implementation<\/a> of this order will likely affect how the FTC approaches this issue and what it can realistically achieve.<\/p>\n<p><strong>An uncertain path forward<\/strong><\/p>\n<p>While users demand transparency and fairness in social media content moderation, platforms must weigh these demands against the need to combat misinformation, hate speech, and other harmful or offensive content. Similarly, lawmakers, courts, and agencies must balance the competing priorities of preventing undue censorship and ensuring that platforms operate fairly and transparently. However, the Supreme Court\u2019s recognition of social media platforms\u2019 editorial rights further complicates efforts to regulate online content moderation, underscoring the constitutional challenges of government intervention in this area.<\/p>\n<p>Current moderation and censorship practices are increasingly edging toward outright suppression, which is particularly concerning as online spaces continue to shape public discourse. When certain content is restricted, so is that viewpoint; this becomes extremely problematic when certain voices are continually silenced based on a given perspective or stance. This also has serious implications for content creators and consumers alike, creating a less pluralistic digital landscape where users must exercise skepticism about the information that they are being fed.<\/p>\n<p>Moreover, with increased executive oversight through OIRA, the FTC will likely face added scrutiny in its enforcement efforts. 
With so many unknowns, aggressive public pushback will be crucial to hold the many decision-makers and stakeholders accountable as they navigate this increasingly complex and consequential issue.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>At a time when online spaces have become central for news, connection, and the exchange of ideas, the balance between free speech and content moderation is more important than ever. In recent months, there have been rising concerns over potential government censorship and the proliferation of misinformation, especially on social media. The lack of transparency in the tech industry makes this issue uniquely tricky, as each platform\u2019s distinct algorithms are largely proprietary. However, many users feel that their voices are being silenced based on the nature of the content they are releasing. The possibilities for remedying these concerns are limited, as the First Amendment expressly protects private companies from government censorship (including the requirement that they host specific content), but there are several potential paths forward that could have far-reaching implications for the future of social media content 
moderation.<\/p>\n","protected":false},"author":166,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[30,46],"tags":[283,2324,492,731,774,780,783,869,2323,914,1141,2325,1852,1916,2016,2023],"class_list":["post-6401","post","type-post","status-publish","format-standard","hentry","category-ftc","category-regulation","tag-big-tech","tag-censorship","tag-constitutionality","tag-enforcement","tag-executive-order","tag-expression","tag-facebook","tag-first-amendment","tag-free-speech","tag-ftc","tag-instagram","tag-oira","tag-social-media","tag-supreme-court","tag-trump","tag-twitter"],"_links":{"self":[{"href":"https:\/\/blogs.luc.edu\/compliance\/index.php?rest_route=\/wp\/v2\/posts\/6401","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blogs.luc.edu\/compliance\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs.luc.edu\/compliance\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.luc.edu\/compliance\/index.php?rest_route=\/wp\/v2\/users\/166"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs.luc.edu\/compliance\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=6401"}],"version-history":[{"count":6,"href":"https:\/\/blogs.luc.edu\/compliance\/index.php?rest_route=\/wp\/v2\/posts\/6401\/revisions"}],"predecessor-version":[{"id":6487,"href":"https:\/\/blogs.luc.edu\/compliance\/index.php?rest_route=\/wp\/v2\/posts\/6401\/revisions\/6487"}],"wp:attachment":[{"href":"https:\/\/blogs.luc.edu\/compliance\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=6401"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blogs.luc.edu\/compliance\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=6401"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blogs.luc.edu\/compliance\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=6401"}],"curies":[{"name":"wp","href":"
https:\/\/api.w.org\/{rel}","templated":true}]}}