We The People

Can Texas and Florida Ban Viewpoint Discrimination on Social Media Platforms?

February 29, 2024


This week, the Supreme Court heard oral arguments in NetChoice v. Paxton and Moody v. NetChoice, which involved challenges to attempts by Texas and Florida to prevent social media platforms from discriminating against users and posts on the basis of viewpoint. The challenges were brought by NetChoice, which argues that the laws’ content-moderation restrictions and must-carry provisions violate the First Amendment. The cases could determine the future of our most important platforms, from Facebook to X to YouTube. Alex Abdo of the Knight First Amendment Institute and Larry Lessig of Harvard Law School recap the key issues in both cases; discuss the ideas raised in oral arguments; and preview the wide-ranging impacts these cases may bring.

Please subscribe to We the People and Live at the National Constitution Center on Apple Podcasts, Spotify, or your favorite podcast app. 

Today’s episode was produced by Lana Ulrich, Tanaya Tauber, and Bill Pollock. It was engineered by Bill Pollock. Research was provided by Samson Mostashari, Cooper Smith, Yara Daraiseh, and Lana Ulrich. 

 

Participants  

Alex Abdo is the inaugural litigation director of the Knight First Amendment Institute at Columbia University, where he has been involved in the conception and litigation of most of the Institute’s legal challenges, including challenges to the legal threats issued by social media platforms to researchers investigating the influence that the platforms are having on society. Prior to joining the Institute, Abdo worked for eight years at the ACLU, where he was at the forefront of litigation relating to NSA surveillance, encryption, anonymous speech online, government transparency, and the post-9/11 abuse of detainees in U.S. custody.

Larry Lessig is the Roy L. Furman Professor of Law and Leadership at Harvard Law School. Prior to returning to Harvard, he taught at Stanford Law School, where he founded the Center for Internet and Society, and at the University of Chicago. Lessig is the founder of Equal Citizens and a founding board member of Creative Commons, and serves on the Scientific Board of AXA Research Fund. He is the author of numerous books, including most recently, They Don’t Represent Us: Reclaiming Our Democracy (2019).

Jeffrey Rosen is the president and CEO of the National Constitution Center, a nonpartisan nonprofit organization devoted to educating the public about the U.S. Constitution. Rosen is also a professor of law at The George Washington University Law School and a contributing editor of The Atlantic. 

 


Excerpt from Interview: Abdo discusses applying non-discrimination laws to digital platforms and highlights the operational challenges posed by Florida's broad transparency mandate compared to Texas' more feasible requirements.

Alex Abdo: I think context matters in analyzing non-discrimination laws, even as applied to the platforms. And I think you saw some of the concerns about this in the questions from the justices during the oral argument, where some of them were wondering about the constitutionality of more traditional non-discrimination laws, laws that forbade the platforms from discriminating on the basis of race, gender, national origin, religion, et cetera, in deciding which users to allow on their sites. My instinct is that those laws would very likely be constitutional, because I don't think the expressive interest of the platforms lies in the race, gender, religion, et cetera, of their users, but instead in the viewpoints that they're allowing their users to engage in. But I do want to respond to what I think is the very powerful point made by Professor Lessig about desiring a world, or a public sphere, in which people are forced to confront the views that they don't like.

I absolutely agree with that, and I would treat the platforms like the sidewalks in Skokie, Illinois, to follow the analogy, if the platforms were, again, like I said earlier, unbreakable monopolies, if their control over the public square, the digital public square, were durable in the way that Skokie's is over its sidewalks.

I think I would support laws of the sort that Professor Lessig thinks are constitutional, or at least I would think they're constitutional. One of the important differences, though, between platforms and sidewalks is that platforms are always going to have to have a curated feed for users to find them useful. There are just too many posts made per day, per hour, per second on these platforms for them to be useful without some kind of sorting of the feed, whatever their conception of the feed is.

Let me turn to your second question, though, which was about the transparency provisions in the law. So the laws have a lot of different transparency provisions. The Supreme Court accepted review just of what it called the individual explanation provisions of the laws, and they're a little bit different. The Florida law requires a thorough rationale explaining every decision to suppress speech, and it defines "suppress" very broadly to include not just deleting a post or an account, but deprioritizing a post or account. That would impose a pretty enormous burden, because the platforms do that all the time. Every single decision to put a post in the second spot in a news feed is a decision to deprioritize it from the top. And explaining every single one of those decisions with a thorough rationale just seems, to me, entirely unworkable and would make it impossible for the platforms to operate.

Excerpt from Interview: Lessig argues that platforms like Facebook and Twitter shouldn't be treated like publishers such as the New York Times, and emphasizes the need to preserve the ability of state and federal governments to regulate in the digital realm without being blocked by the First Amendment.

Larry Lessig: Well, let me first of all take up Alex's invitation, because I think it's a helpful and constructive bridge between our positions. I think what drove our argument was actually how Facebook and Twitter and these other platforms are right now. They have opened themselves up and insist they are platforms for anybody to comment, to speak, and in that context, I don't think it's appropriate to analogize the platform to a Christian website. If a Christian website wants to set itself up as the Christian website and say, "We're going to have Christian conversations here on this website," I don't think it's appropriate for the state to come in and say, "No, you have to have Jewish conversations on your website." Just like if a church opens itself up to the public and says, "We invite people in to come to worship," it would be perfectly appropriate for them to say, "I'm sorry, Devil Worshiper, you're not allowed to be in our church during the time that we are worshiping Jesus Christ."

That's the character of the site. And the point about Facebook or Twitter (I refuse to call it any other name), the point about those sites, is that they define themselves as open public spaces where anybody can come and say what they want, which means that when somebody goes onto Twitter and says something crazy, I don't think that it's Elon Musk that's saying that. I think it's the crazy person who just tweeted something that's saying that. Whereas if the New York Times publishes something, I think the New York Times is responsible for what they publish. They make a judgment and it's associated with them, and it's their brand which is being affected, and that's why it's important to protect the First Amendment interest in the context of the New York Times, and not in the context of PruneYard, and not in the context of Facebook or Twitter.

As to how the court is receiving it, Jeff, I think I just am too skeptical and too burned by my misunderstanding of what the court is actually thinking as you see them answering and asking questions. Kavanaugh's intervention, I think, was quite significant. Not surprising, but I really am eager to see how they talk it through when they sit down at conference and actually try to wrestle with what's at stake here. I hope they hear a point that it doesn't seem Alex is disagreeing with: the importance of recognizing that we have to preserve the sovereign ability of the states and the federal government to regulate in this digital environment, because so much of our life is in this digital environment. And if we have no capacity, because of the strictures of the First Amendment, to do anything significant in this context, then we've basically decided that private corporations have the right to regulate us unconstrained by the state.

That's why Texas referred to our brief and said that Lessig, Wu, and Teachout have written on our side, and it's not typical that they write to support Texas. That's true, we don't typically write to support Texas. But just before that, he made what I think is the really important point: if you say that anytime you attempt to regulate what is in effect an algorithm you trigger the First Amendment, that is Lochner on steroids. People listening to this podcast, I'm sure, will know what Lochner is, but Lochner refers to this period in American constitutional history, which both sides, left and right, agree was a terrible mistake in the evolution of American constitutional doctrine, where the courts forced ordinary regulation through an extremely heavy set of requirements before it could be upheld.

And the consequence of that is to shift enormous power both to the judges and to the private industries that get to basically construct the world that they want without any effective regulation by the state. That's a disaster in the context of the internet, and it's not just in the context of neutrality laws on Facebook. It's going to be the wide range of areas where you already see NetChoice raising this issue, in the context of privacy regulations in California. When we start seeing efforts to regulate AI, the same issue is going to be raised. We have to be able to regulate in this space. And the mere analogy between what an algorithm does and what the editors at the New York Times do is not enough to establish that James Madison meant this to be off the regulatory table.

Full Transcript

View Transcript (PDF)

This transcript may not be in its final form, accuracy may vary, and it may be updated or revised in the future.

Stay Connected and Learn More

Questions or comments about the show? Email us at [email protected].

Continue today’s conversation on social media @ConstitutionCtr and #WeThePeoplePodcast.

Sign up to receive Constitution Weekly, our email roundup of constitutional news and debate, at bit.ly/constitutionweekly.
