Five Upcoming Supreme Court Cases Could Affect Tech Policy in 2024

Lisa Battaglia
5 min read · Dec 4, 2023

Five upcoming Supreme Court cases could change trust & safety practices and tech policy in the next year. Entering the 2023–2024 term, the Supreme Court agreed to hear five cases addressing content moderation, misinformation, free speech, and the right to access public officials’ social media pages. Below I break down each of the five cases, what each could mean for tech companies and for users, and how to analyze the possible outcomes.

Overview of the cases

Two of the cases will examine whether a state official violates the First Amendment by blocking constituents from the official’s social media page when the official uses that page to communicate with the public about job-related matters.

Another two of the cases will resolve a disagreement between the Fifth and Eleventh Circuits about whether a state law can restrict social media companies’ content moderation practices.

The final case will determine whether the federal government violated users’ First Amendment rights by requesting that private social media companies take steps to prevent the dissemination of misinformation during the COVID-19 pandemic.

History of the Supreme Court on content moderation

The U.S. Supreme Court in the 2022–2023 term declined to address Section 230 after hearing Gonzalez v. Google and Twitter v. Taamneh. The decisions kept the status quo, protecting platforms’ moderation decisions under Section 230. The question remains whether the Supreme Court will continue siding with platforms this term.

Two Cases on State Officials’ Social Media Activity:

Lindke v. Freed

  • Brief overview on the facts of the case: A Michigan city official (Freed) used his social media page to post about policies he initiated in his official capacity. A citizen (Lindke) disapproved of how Freed handled the pandemic and posted criticism on Freed’s Facebook page. Freed blocked Lindke and deleted Lindke’s comments. Lindke sued Freed for violating his First Amendment rights.
  • The main issue the court will address: Did Freed violate Lindke’s First Amendment rights by blocking Lindke from his social media page?
  • How it could impact tech policy: If the Court decides that public officials’ pages cannot restrict public opinions and comments, platforms might consider creating a distinct type of page for public officials that limits the officials’ ability to moderate content themselves.

O’Connor-Ratcliff v. Garnier

  • Brief overview on the facts of the case: The Court heard a very similar case the same day in O’Connor-Ratcliff v. Garnier. The Garniers frequently posted criticism of the school district’s Board of Trustees (including O’Connor-Ratcliff) on the Trustees’ social media pages. The Trustees began hiding or deleting the Garniers’ critical comments and then blocked the Garniers from their social media pages.
  • The main issue the court will address: If the public official uses the social media account to communicate about job-related matters with the public, does that official engage in state action subject to the First Amendment by blocking an individual from the official’s account?
  • How it could impact tech policy: Similar to Lindke, platforms might implement specific restrictions for a state official’s social media page in accordance with the ultimate decision.

Two Cases Resolving the 5th & 11th Circuit Split:

NetChoice, LLC v. Paxton (5th Circuit)

  • Brief overview on the facts of the case: Texas enacted legislation to regulate large social media platforms (such as Facebook, X, and YouTube). The law purports to prohibit large social media platforms from censoring speech based on the viewpoint of the speaker. The Fifth Circuit rejected the idea that platforms have a “freewheeling” right to censor what people say and upheld Texas’ legislation.
  • The main issue the court will address: Does Texas’ legislation prohibiting social media platforms from censoring users violate the First Amendment?
  • How it could impact tech policy: If the Court allows Texas to implement this law regulating content moderation, platforms might consider reexamining their policies regarding political speech.

Moody v. NetChoice, LLC (11th Circuit)

  • Brief overview on the facts of the case: Florida enacted similar legislation imposing various restrictions on social media platforms, such as prohibiting the deplatforming of political candidates and requiring detailed disclosures about content moderation policies. Florida’s legislation attempts to address what the state perceives as bias and censorship by large social media platforms against conservative voices. Unlike the Fifth Circuit, the Eleventh Circuit struck down Florida’s legislation, declaring that it violates the First Amendment.
  • The main issue the court will address: Does Florida’s content-moderation legislation comply with the First Amendment?
  • How it could impact tech policy: The Supreme Court must resolve this inconsistency between the Fifth and Eleventh Circuits. Through these two cases, the Court will also examine whether social media platforms are common carriers. If the Court decides that state legislation can regulate censorship, social media platforms might need to fine-tune their policies regarding political speech.

Final Case on Misinformation:

Murthy v. Missouri

  • Brief overview on the facts of the case: Multiple plaintiffs, including epidemiologists, consumer and human rights advocates, academics, and media operators, claimed that federal agencies and officials engaged in censorship by requesting that social media companies prevent the dissemination of purported misinformation on topics such as COVID-19 and the integrity of the 2020 presidential election.
  • The main issue the court will address: Did the federal government’s request that private social media companies take steps to prevent the dissemination of misinformation violate users’ First Amendment rights?
  • How it could impact tech policy: If the Court ultimately decides that the federal government did violate users’ First Amendment rights by requesting that social media companies prevent the dissemination of misinformation, companies should be prepared to adjust misinformation policies.

These five cases could further complicate trust & safety practices and tech policy in the next year. For tech and social media companies, it is important to stay ahead of these cases and well-researched on their potential outcomes. If you are looking to expand your legal research, need writing, blogging, or creative help in this field, or need a policy expert on your team, reach out to me at lisa@lisabtag.com.
