One of the Supreme Court’s biggest cases this term involves the content moderation rights of websites—including YouTube, Facebook, and X (formerly Twitter)—and two states that want them regulated as “common carriers,” a decision that could have a major impact on their business models.
On Feb. 26, 2024, the Court will hear arguments in Moody v. NetChoice and NetChoice v. Paxton, cases from Florida and Texas related to laws passed by those states seeking to regulate privately held digital companies with huge numbers of online users. The Florida case, Moody, is an appeal concerning a Florida law, S.B. 7072 (“the Stop Social Media Censorship Act”), which applies to an internet platform that does “business in the state” and has either “annual gross revenues in excess of $100 million” or “at least 100 million monthly individual platform participants globally,” and imposes on them “three types of transparency and speech-promoting protections: neutrality provisions, hosting provisions, and disclosure obligations,” according to Florida’s brief.
The Texas case, Paxton, concerns Texas’ state law H.B. 20. The statute seeks to regulate internet platforms with more than 50 million domestic monthly users, barring them from moderating content in a way that discriminates against viewpoints and requiring the companies to justify moderation decisions. Two trade associations representing the internet platforms, NetChoice LLC and the Computer & Communications Industry Association, challenged both laws. The 11th Circuit ruled that the Florida law violated the social media companies’ First Amendment free speech rights, while the Fifth Circuit upheld the Texas law. Both cases were appealed to the U.S. Supreme Court.
The Supreme Court accepted the cases on Sept. 29, 2023, and it limited arguments to two questions presented by Elizabeth B. Prelogar, the U.S. Solicitor General: Whether the laws’ content-moderation restrictions comply with the First Amendment, and whether the laws’ individualized-explanation requirements comply with the First Amendment.
The internet platforms targeted by these laws are owned by private companies, and usually the First Amendment protects private free speech rights by forbidding government control and censorship of their content. However, some privately owned companies are considered common carriers, a special class of business that dominates a market and provides a public service. For example, telecommunications companies that provide communications services are considered common carriers and are regulated by state and local governments. Both Florida and Texas are seeking to categorize the social media platforms as common carriers so that these private entities can be regulated in a similar way consistent with the First Amendment.
The Florida and Texas Cases
Both cases involve the broader question of how to classify large social media platforms. In the 11th Circuit’s Moody decision, issued on May 23, 2022, Judge Kevin C. Newsom, writing for the unanimous three-judge panel, said it was “substantially likely that social-media companies—even the biggest ones—are ‘private actors’ whose rights the First Amendment protects.” The court also held that content-moderation decisions made by social media operators were editorial judgments, and that the Florida law unconstitutionally burdened the social media platforms, especially in light of the law’s requirement for a “thorough rationale” for all content-moderation decisions.
The 11th Circuit further disagreed with Florida’s claim that large social media platforms fell under the common carrier umbrella. “Social-media platforms exercise—and have historically exercised—inherently expressive editorial judgment, they aren’t common carriers, and a state law can’t force them to act as such unless it survives First Amendment scrutiny,” Newsom concluded.
The court also disagreed with claims that these laws were needed to prevent these platforms from censoring users based on their political or philosophical viewpoints. “The provisions that prohibit deplatforming candidates, deprioritizing and ‘shadow-banning’ content by or about candidates . . . or shadow-banning ‘journalistic enterprises’ all clearly restrict platforms’ editorial judgment,” the court held.
The Fifth Circuit decision concerning the Texas law was issued on Sept. 16, 2022, and it squarely conflicted with the 11th Circuit’s reasoning on several key points. “We reject the [social media] Platforms’ efforts to reframe their censorship as speech. It is undisputed that the Platforms want to eliminate speech—not promote or protect it. And no amount of doctrinal gymnastics can turn the First Amendment’s protections for free speech into protections for free censoring,” wrote Judge Andrew S. Oldham in the majority opinion.
Oldham’s opinion advanced the idea that large social media companies could be regulated as common carriers. He concluded that “the Platforms argue that because they host and transmit speech, the First Amendment also gives them an unqualified license to invalidate laws that hinder them from censoring speech they don’t like. . . . The Platforms are not newspapers. Their censorship is not speech.”
Judge Edith Jones concurred but did not join the part of Oldham’s opinion about common carriers. Oldham and Jones agreed that the states were subject only to an intermediate scrutiny test under the First Amendment in regulating the social media companies. Judge Leslie Southwick dismissed the importance of the common carrier argument and believed the 11th Circuit’s ruling protecting the social media platforms’ First Amendment rights was correct.
The Solicitor General’s Brief
Before accepting the cases for arguments, the Court invited Solicitor General Prelogar to file a brief, and it accepted two of the four questions she presented: those concerning the social media companies’ ability to moderate content and the requirement that they individually explain moderation decisions.
In her brief, Prelogar argued that the Court should accept the cases, and that the 11th Circuit’s ruling was correct. “The platforms’ content-moderation activities are protected by the First Amendment, and the content-moderation and individualized-explanation requirements impermissibly burden those protected activities,” she concluded.
One point the Solicitor General raised was that content moderation, while protected free speech, is not immune to regulation; the burden these laws impose on the social media companies, however, is unduly restrictive. “The States have not articulated interests that justify the burdens imposed by the content-moderation restrictions under any potentially applicable form of First Amendment scrutiny,” she said. Especially troublesome, Prelogar wrote, was “the government requirement that [platforms] display different content—for example, by including content they wish to exclude or organizing content in a different way,” a task that “plainly implicates the First Amendment.”
The individualized-explanation requirements in both state laws presented the same problem. “As the Eleventh Circuit explained, the sheer volume of content removal that the platforms undertake makes it impracticable for the businesses to comply with those mandates,” Prelogar said. In one example, YouTube would need to explain under the new laws why it removed more than 1 billion comments in a three-month period.
Since the Supreme Court accepted the cases, more than 80 briefs have been filed on the docket. However, some earlier opinions in the cases offer clues as to how the upcoming arguments may play out at the Court. For example, in a May 2022 stay order in the Texas case, Justice Samuel Alito wrote that “social media platforms have transformed the way people communicate with each other and obtain news.” Alito was joined by Justices Clarence Thomas and Neil Gorsuch. “It is not at all obvious how our existing precedents, which predate the age of the internet, should apply to large social media companies,” Alito noted.
Scott Bomboy is the editor in chief of the National Constitution Center.