We The People

Trump and the Facebook Oversight Board

May 06, 2021


The Facebook Oversight Board—a recently developed court of sorts that independently reviews Facebook's decisions and policies—issued a major ruling this week, upholding the company's initial decision to suspend President Trump but finding the indefinite nature of the suspension inappropriate, and calling on the company to reach a final, clearly reasoned decision in Trump's case and similar ones. The board also requested that Facebook clarify its policies on political leaders, do some additional fact-finding, and report back with more on its decision and rationale within six months, when Facebook must revisit the penalty. Host Jeffrey Rosen considered the impact of the decision for the future of digital speech with two experts who have done pathbreaking work on the Facebook Oversight Board: Kate Klonick, assistant professor of law at St. John's Law School who spent a year embedded with the Oversight Board as it was being developed, and Nate Persily, professor of law at Stanford Law School and co-director of the Stanford Program on Democracy and the Internet.


PARTICIPANTS

Kate Klonick is Assistant Professor of Law at St. John’s Law School. She spent a year embedded with the Facebook Oversight Board as it was being developed, and subsequently wrote articles in the Yale Law Journal and The New Yorker.

Nate Persily is the James B. McClatchy Professor of Law at Stanford Law School and the co-director of the Stanford Program on Democracy and the Internet. He has written extensively on the law of democracy, the First Amendment, and more.


This show was produced by Jackie McDermott and engineered by Greg Scheckler and David Stotz. Research was provided by Alexandra "Mac" Taylor, Anna Salvatore, and Lana Ulrich.

Stay Connected and Learn More

Questions or comments about the show? Email us at [email protected].

Continue today’s conversation on Facebook and Twitter using @ConstitutionCtr.

Sign up to receive Constitution Weekly, our email roundup of constitutional news and debate, at bit.ly/constitutionweekly.

Please subscribe to We the People and Live at the National Constitution Center on Apple Podcasts, Stitcher, or your favorite podcast app.

TRANSCRIPT

This transcript may not be in its final form, accuracy may vary, and it may be updated or revised in the future.

Jeffrey Rosen: [00:00:00] I'm Jeffrey Rosen, president and CEO of the National Constitution Center. And welcome to We the People, a weekly show of constitutional debate. The National Constitution Center is a nonpartisan nonprofit chartered by Congress to increase awareness and understanding of the Constitution among the American people. The Facebook Oversight Board has just upheld the banning of President Trump from Facebook, but the board found that Facebook's indefinite suspension of President Trump was inappropriate and said that Facebook now has six months to review the standards for the ban. Here to explain this pathbreaking decision and its potential impact on the future of free speech online are two of America's leading experts on online free speech and on the Facebook Oversight Board.

Kate Klonick is assistant professor of law at St. John's Law School. She spent a year embedded with Facebook's Oversight Board, as it was being developed, and subsequently has written articles about it in the Yale Law Journal and the New Yorker. Kate, it is wonderful to have you back on the show.

Kate Klonick: [00:01:11] Thank you for having me.

Jeffrey Rosen: [00:01:12] And Nate Persily is the James B. McClatchy Professor of Law at Stanford Law School and the co-director of the Stanford Program on Democracy and the Internet. He has written extensively on the law of democracy, the First Amendment and online free speech. Nate, it is wonderful to have you joining us.

Nate Persily: [00:01:29] Thanks for having me.

Jeffrey Rosen: [00:01:30] Nate, you have called this decision the Marbury versus Madison of online free speech law. Tell us why, and why it is important.

Nate Persily: [00:01:39] Well, I think that the decision is important in its own right, but it's also important in delineating the responsibilities and powers of the board and its relationship to Facebook. And that's really what Marbury did also. As visitors to the Constitution Center can learn, when the Supreme Court decided Marbury versus Madison, the actual decision itself about a commission for a judge was really not the most important issue. The question was whether the court had the power of judicial review and under what circumstances it would be exercised.

So too here: what you learn from the Facebook Oversight Board's decision is how it's going to consider cases like this, how it's going to flex its muscles, what it's going to require from Facebook, what it's going to ask of Facebook, how it might embarrass Facebook by telling the public what Facebook didn't provide, as well as a series of other measures that it took in terms of advice and policy recommendations going forward. So these early decisions, and particularly this one, since it's the most high profile, are important in establishing the board as an institution. And I think that's really what happened today: the board solidified its position as an institution that will review content moderation decisions by Facebook.

Jeffrey Rosen: [00:02:54] Kate, just before the show started, you called the decision the McCulloch versus Maryland of online free speech law, and also called it "the jam." Tell us why you think it is important. [laughs]

Kate Klonick: [00:03:07] [laughs]. I don't know that people usually say those two things in one sentence. I was mostly being facetious about McCulloch versus Maryland, but I am very optimistic about this decision, and very pleased. I think that for a long time there has been a question of what exactly the board was going to do for the public, what the board was going to do for Facebook, and how all of those lines were going to fall out and get drawn.

And what I think this decision shows us is that Facebook wanted the board to be some type of expert body to solve the intractable problem of online speech and the intractable problem of content moderation, of good and bad speech, of free speech and not free speech, and that the board's not going to do that for them. They are basically not going to carry water for Facebook.

And so what this decision tells us is that, to an extent, it's almost like an administrative law decision, in which they are sending it back to Facebook and basically saying, "Get your standards in order. Get your process in order. Get your rules in order. Let us know how you're applying them. And then we'll tell you if you're applying them in keeping with the rule of law, international human rights standards, and your own stated values. But we're not going to make these hard decisions for you."

And I think for that reason alone, it is a breath of fresh air and takes us into a much more sophisticated space of thinking about online speech, instead of it just being about taking down a world leader or taking down Nazis or taking down white supremacists.

Jeffrey Rosen: [00:04:59] All right, let us dig into the decision. And dear We the People friends, I would like you to read the text of the decision with me, the full case decision, which you can easily find online and which we'll link at the podcast page. The full case decision begins with the decision summary. It then has a case description of the posts from President Trump that are being challenged as violating Facebook's policies. It then cites the relevant standards that Facebook applied: in particular, its Dangerous Individuals and Organizations Community Standard, a separate Community Standard on Violence and Incitement, and Facebook's Terms of Service.

It then goes on to state Facebook's values, including voice, safety, and dignity. It then notes human rights standards, including the UN Guiding Principles on Business and Human Rights, and says that the board analyzed Facebook's responsibilities by considering human rights standards, including the right to freedom of expression in the International Covenant on Civil and Political Rights and other international standards, as well as the international rights to non-discrimination and remedy.

It then has the content creator statement, Facebook's explanation of why it did what it did, and President Trump's response. With that wind-up, Nate, tell our listeners what struck you about the substantive finding of the board. In particular, the board found that President Trump violated the Community Standard on Dangerous Individuals and Organizations, while only a minority of the board found that he violated Facebook's Community Standard on Violence and Incitement.

Nate Persily: [00:06:31] So I think that this decision flows from the other decisions that the Oversight Board has issued before, which is that they are relying heavily on International Human Rights Law, as you suggested, and that in many respects, they are thinking about Facebook like a government and the newsfeed like a public square. I personally disagree with that; we can talk about that later. But the point still holds, which is that they feel that the Facebook rules at issue here, on glorification of violence and dangerous individuals, were violated by President Trump when, in a video, he basically said to the protestors, "Go home, but we love you, we praise you."

And, you know, other words of affection and admiration. And he had also said earlier, in text, "This is what happens when an election is unceremoniously taken away from a legitimate elected president." Now, just to be clear, President Trump's words were not much different than the words of thousands, perhaps hundreds of thousands, of Facebook users on the same day. What is interesting, and I think insightful, about the board's position here is they say, "Look, the ability to do offline harm is going to be affected by the significance of the leader who is issuing these words."

And so when President Trump praises violence or praises those who were going to commit insurrection at the Capitol, that is different, and Facebook can respond to an emergency like that. However, it is an emergency just like any other incitement case. And so you cannot ban someone for life, they say, for that kind of speech. And you've got to come back in six months to basically say whether this emergency situation has subsided.

Jeffrey Rosen: [00:08:17] Kate, as Nate mentioned, the board found that the Dangerous Individuals and Organizations Standard was violated, but did not by majority find that the Violence and Incitement Standard was violated. The Dangerous Individuals and Organizations Standard is linked at the bottom of the decision. It says, "In an effort to prevent and disrupt real-world harm, we do not allow any organizations or individuals that proclaim a violent mission or are engaged in violence to have a presence on Facebook. This includes organizations or individuals involved in...." And then there's a list that culminates in organized violence or criminal activity.

"And we also remove content that expresses support or praise for groups, leaders, or individuals involved in these activities." What do you make of the fact that this was the standard that the board coalesced around, it projected the eminence standard. And it also rejected the arguments of President Trump's advocates, that the First Amendment eminence standards should apply.

Kate Klonick: [00:09:09] I think that that is absolutely fascinating, but I'm going to dodge the question a little bit and go to something that I think is related to this about their analysis, which is how they, as a board, did their fact-finding, and how they went back to Facebook to create a record of what had happened internally and how the decision was made.

And I think one of the key things that is going to come out of this decision going forward, and that is going to be picked up in the media and generally talked about, is the fact that the board went back to Facebook and asked them 46 questions about how they had made this decision and how they applied, as Nate kind of puts it, the newsworthiness standard, the public figure standards, and the exceptions around this.

And Facebook denied that in this case there was any application of newsworthiness standards to Trump, which seems implausible, I will add, and denied that he had ever, in fact, been considered and kept up because of newsworthiness considerations, which is also frankly implausible. And so there are some really huge fact-findings that are going to inform the public and, I think, this debate going forward. And it's also really questionable that Facebook spent $130 million starting this board, and all of this time and energy for two years getting it going.

And then they came back to the Oversight Board and said, "We're not going to answer six or seven of the questions that you asked us. We're not going to give you the facts that you want, even though you're supposed to be holding us accountable." If the government did that in a case, it would be insane. And so I think that there is something really valuable about this process and the transparency that the board is providing us with this decision, into the back end of Facebook's content moderation decisions and its inconsistencies, that is just going to be absolutely invaluable going forward.

Jeffrey Rosen: [00:11:27] Well, Nate, what do you make of the questions that Facebook refused to answer? The board asked Facebook 46 questions; Facebook declined to answer seven entirely, and two partially. The questions Facebook did not answer included questions about how Facebook's news feed and other features impacted the visibility of Mr. Trump's content, and whether Facebook has researched or plans to research these design decisions. I won't read them all, but they're enumerated in the decision. What do you make of that?

Nate Persily: [00:11:51] So to some extent, this grew out of a brief that the Knight First Amendment Institute provided, and also an op-ed that Jameel Jaffer, the head of it, put out, where they said that the Facebook Oversight Board should sort of decline to even answer this question and should put it back to Facebook to ask what its role was in the insurrection and the like. And so the Oversight Board naturally asked the questions: "Give us all the information here." The fact that Facebook refused is itself interesting, but also consequential is the fact that the Oversight Board noted in its decision what Facebook refused to provide. Right? So now, again, you've got to think sort of meta about this decision.

What does this mean going forward? It means that the Oversight Board will say what kinds of requests for information it has made of Facebook, and it will note times when Facebook falls short or doesn't respond. And, like I said, Facebook has to be consistent, and its inconsistencies will be revealed if it's trying to say that privacy prevents it from revealing something in this circumstance but not in that circumstance, or that these questions about the algorithm are not relevant. Who is Facebook to decide whether it's relevant or not, right? Obviously the Oversight Board wanted to factor it into the decision.

Maybe they would have said something about it, but Facebook ultimately is going to decide what is relevant. And if it doesn't think that the board needs to hear it, then it won't. Now, the tea leaves that I read into this are that Facebook is worried about creeping authority of the board when it comes to the algorithm itself, right? Because there are a lot of us on the outside who say, "Look, the leave-up, take-down decisions are not the most important decisions that Facebook engages in; it's the questions of demotion and the hierarchy of information in the newsfeed."

And so if they draw a bright line there, about the Oversight Board's oversight over the algorithm, that's a pretty important bright line. It is consistent, I should say, with the way that they envisioned the Oversight Board. But frankly, as I said the last time you and I and all of us spoke about this, the legitimacy of the Oversight Board was not given from its birth; it's legitimacy that needs to be earned over time. And we don't know what this institution is going to look like in five years.

And so it is developing its powers as it is interpreting them. So, for example, the idea that it's basically saying to Facebook, "We're going to uphold the ban right now, but you've got to come back in six months to re-justify it": that was not a foreordained outcome, that that's the kind of power this Oversight Board would have. But now that's the kind of thing that they can do. They will ask Facebook to do more than just leave up or take down speech.

Jeffrey Rosen: [00:14:37] Kate, what do you make of Facebook's unresponsiveness, in terms of the future relation between the Oversight Board and Facebook? And you've studied this relation more than anyone. Before the decision, you had expressed concern that, had the board ordered Trump immediately reinstated, Facebook might not obey the decision, in a real anti-Marbury moment. What does this unusual back-and-forth say about the relation between the two bodies in the future?

Kate Klonick: [00:15:02] It says the horse is out of the barn, and it's too late to undo it. And this has always been my argument, and Nate and Jeff, you both know this. People criticized the board and its creation as it was being developed: that it was about legitimation of the governance of Facebook, legitimation of Facebook's power, that it was not going to have any real teeth, that it was going to be a Potemkin village, all of these kinds of critiques. And my answer to all of it was: yes, maybe. But it will certainly be a crack in the door that we can potentially drive a wedge through. Right?

And I think that, as Nate said, I don't know if it's Marbury, and it's probably definitely not McCulloch, but [laughs], maybe. I just think that there is so much in this decision to be excited about, in terms of what this institution has the potential to do. And maybe we're not all going to agree on the standards that it selects, especially if it hews to international human rights more than I think either Nate or I would probably prefer.

But at the very least, it is a consistent, reasoned level of accountability from very serious people. And I find it to be just astonishingly refreshing after years of only having panicked media reports and newsroom talking points from the companies. So, in that sense, I'm just very excited about it.

Jeffrey Rosen: [00:16:53] Thank you for that. Nate, tell our listeners about the invocation of International Human Rights Law. The board notes Facebook says its decision was informed by Article 19 of the ICCPR and the UN's General Comment No. 34 on freedom of expression, which permits necessary and proportionate restrictions of freedom of expression in situations of public emergency that threaten the life of the nation. It also took into account six contextual factors from the Rabat Plan of Action on the prohibition of advocacy of national, racial or religious hatred, developed by experts with the support of the United Nations. This invocation of International Human Rights Law was surprising to those of us who haven't been following this field. Was it surprising to you? And what does it say about the relevance of International Human Rights Law moving forward?

Nate Persily: [00:17:42] It wasn't surprising to me, since they had sort of presaged this with the earlier decisions, where they cited International Human Rights Law as well. And this was in some ways foreordained by the charter and the lead-up to the creation of the board, where International Human Rights Law is specifically mentioned. I continue to believe that this is a misguided approach for the following reason, which is that Facebook is not a government and the newsfeed is not a public square. And so there are all kinds of things that a company, and a social media company, should be able to do that a government cannot, and we should hold them to different standards.

Now, in their defense, they also cite a set of principles dealing with the human rights standards that apply to companies, but that's a whole separate area of International Law; bracket that for a second. The point is that when it comes to regulating political speech, or really any speech on the platform, what the Oversight Board is suggesting here is that decisions by international courts will be relevant in informing whether this speech should be allowed or not.

And so, for example, whether it's the case they dealt with on nudity and whether a breast cancer video could be allowed up, or whether borderline coronavirus disinformation would be allowed on the platform, they also cited International Human Rights Law. Each one of the texts that you just cited has a kind of familiar First Amendment feel to it, where it's about, you know, you have a fundamental right to speech, but it can be limited in certain circumstances. One of those circumstances is actual violence.

And then how do you know when the violence is imminent enough? And then you have a few factors that you look at, right? Okay. That's basically a familiar kind of analysis that anybody who analyzes free speech goes through. The problem, it seems to me, is that if you believe this, you can never justify a lifetime ban, right? There's no way. You couldn't ban a speaker in the United States or anywhere from speaking for their entire life because of speech that they engaged in, you know, a few times. Right? And so does this now mean that Facebook has to revisit all of its account bans every six months to see whether they continue to present a danger under the community standards?

And while they might say, "Well, yeah, damn it, they should do that," well, they do take down 4 billion accounts per year. Okay? So that's not an easy thing. Now, most of that is commercial spam, so it's probably not a big deal. But, you know, takedowns are a very familiar part of enforcement for social media companies.

Jeffrey Rosen: [00:20:23] Kate, as Nate suggests, a lifetime ban on any user is difficult to reconcile with either First Amendment Law or International Human Rights Law. But I want to ask you to really dig in on this question of why International Human Rights Law is the correct standard. Nate just told us it was part of the Oversight Board's charter, but Facebook is free to apply its own community standards, or International Human Rights Law, or traditional First Amendment standards, which are far more protective of free expression than the international standards that the board invoked, including the three-part test about legality, legitimate aim, and necessity and proportionality.

So just to put the question squarely: for a civil libertarian, why wouldn't the right answer be that Facebook should have chosen to apply First Amendment standards rather than international human rights standards?

Kate Klonick: [00:21:14] Well, this actually leads into my comparison of this decision to administrative law and an administrative body. There are two things happening. The reason that International Human Rights Law plays a massive role in the charter of the board and in the board's decision-making is that, in the course of Facebook setting this up, people told Facebook and told the team that was setting it up that that's what they wanted, that they wanted International Human Rights Law to be the standard.

And there were a lot of people who made that argument. And the other part of it was that if you didn't make international human rights, or some body of law, the standard, the board was going to have even less to hold on to, and you were going to create this kind of rule-of-law, conflict-of-laws problem for the board to constantly grapple with.

And I think what is going to be fascinating here is, if Facebook, as its own separate administrative body, decides that its rules are going to directly conflict with notions of International Human Rights Law, what is the board going to do about that? Because on the one hand, this board has been created to have jurisdiction over Facebook. But is that even possible if it's also going to have an obligation to human rights standards? Is that even a sensible question? And so, Nate, do you have thoughts on that?

Jeffrey Rosen: [00:23:08] Yes, Nate, we'd love to hear your thoughts on that, and on the fact that the board divided over the applicability of Human Rights Law. A majority of the board concluded that the violation in this case was severe in terms of its human rights harms, and it applied the six factors from the Rabat Plan of Action. There are many multi-part tests here, but the Rabat Plan asks about context, status of the speaker, intent, content and form, extent and reach, and imminence of harm. But a minority of the board held that the proportionality analysis should be informed by Mr. Trump's use of the platforms before the November 2020 presidential election, including a May post, "when the looting starts, the shooting starts," and pre-election comments.

So obviously these standards are quite vague and malleable. What do you make of the fact that the board disagreed about them? And what do you say in response to Kate, who wonders whether the board is ultimately responsible to Facebook's standards or to international human rights standards?

Nate Persily: [00:24:06] So let me say two things. First, it is extremely difficult to come up with an incitement standard in advance that will deal with particular instances of incitement. That's why you can't just say, well, whenever a clear and present danger is present, then you can act to suppress speech, right? And so the Rabat standard is just like any other, which is that it's basically looking at factors in the context of the speech which would then justify its censorship or suppression. And so the division among the Oversight Board, it seems to me, focuses on whether you look at the speech itself that was the final straw that broke the camel's back, or whether you look at the behavior of the speaker over the previous year.

And reasonable people can disagree on that. Frankly, it seems inconceivable to me that Facebook wasn't looking at the whole history of Trump's speech when they made the decision to do what they did, right? Because they had been urged to take down his account for months based on some of those other violations of community standards. And there was a strike that we now learn was against him under Facebook's system. So there's that. Now, let me be clear about the point about international human rights, which is that, as Kate says, the Oversight Board needed to provide some content for its decisions.

Why is it basing its decisions on speech on these principles? How does it interpret them? Now, one option could have been to treat the charter for the Oversight Board like a constitution, and over time the Facebook Oversight Board develops its own common law around it. Perhaps it gets informed by the international human rights standards in some loose way, like persuasive authority. But they are treating them almost like pretty strong precedent here, and that for me is a problem.

I mean, just so that listeners understand what the problem is, take a nudity ban, for example. It seems to me perfectly plausible and okay for Facebook to say: no nudity at all on the platform. Yes, that is going to prevent some kinds of breast cancer education videos and all kinds of other things. But if you want to find nudity on the internet, there are plenty of places you can find it, right? It's not as if Facebook is the only platform on which you can see this stuff. And it is beneficial for a social media platform to have clear standards that are sometimes over-inclusive, that can be applied in an automated way by the algorithms.

And yes, there's going to be some loss of speech in that context, but the harm is not similar to a government completely banning this category of speech, because there are other places on the internet where you're going to be able to see the speech as well.

Jeffrey Rosen: [00:26:47] Thank you for that. Kate, Nate mentioned the strike system, and the board noted that in addition to the two posts on January 6th, Facebook had previously found five violations of its community standards. In response to the board's question on whether strikes had been applied, Facebook said the page received one strike for a post in August, but did not explain why the other violating content it removed did not result in strikes. And the board then called on Facebook to explain its strikes and penalties policies for restricting profiles more clearly. It's confusing to read in the decision. Can you explain for our listeners what the strike policy is, what we know about how it was applied here, and what clarifications the board is asking for?

Kate Klonick: [00:27:29] No, I can't. And the reason I can't is that no one knows what the strike policy is, because they make it up as they go along. So this is another really central point. I mentioned the implausibility of the claim that newsworthiness had not factored into Trump's staying up on the platform. And maybe they didn't use the term newsworthiness when they were considering this behind the scenes at Facebook, making these choices. But in some capacity he was put into a bucket of political figures or newsworthiness or special consideration that was separate from other users.

And this is one of the big problems with Facebook's current system: they don't really have a system. When the water gets hot enough, they decide to ban someone, or they decide that they're going to make up strikes and start taking things down. The sense that I have is that this very much was an ad hoc decision.

And the board, seeking more information, couldn't do much more than establish that it was an ad hoc decision. And the board, I think, is basically saying that that's not good enough. If you're going to have a system like this, you have to have set rules for what you're going to do. And then you have to show us how they are applied, and how they are applied consistently.

Jeffrey Rosen: [00:28:56] So let's now talk about the indefinite ban. The second part of the decision, after upholding Facebook's decision to suspend Mr. Trump's access to post content on January 7th, says, "However, as Facebook suspended Mr. Trump's accounts indefinitely, the company must reassess the policy. Within six months Facebook must re-examine the arbitrary penalty it imposed, and decide the appropriate penalty. This penalty must be based on the gravity of the violation and the prospect of future harms. It must also be consistent with Facebook's rules for severe violations, which must in turn be clear, necessary, and proportionate."

Nate, this was what everyone was waiting for. Would the indefinite ban be lifted? What do you make of the fact that the board refused to come up with standards on its own, but sent it back to Facebook? And what does this say about the future of indefinite bans?

Nate Persily: [00:29:47] Well, I do think that after this decision it is hard to justify indefinite bans for community standards violations, whether for hate speech or incitement or nudity or anything else. If you believe that the international human rights standards that are articulated in this decision are the right ones, then it's hard to see how an indefinite ban would apply going forward. So that puts a lot of burden on Facebook if they are going to implement this decision beyond the confines of this decision, which they don't have to.

Let's be clear that, at least the way this is organized, Facebook's obligation is just to deal with the Trump controversy in this context; it doesn't necessarily have to deal with all other cases like this. But what exactly does the Oversight Board expect of Facebook as a general rule? Is it that all suspensions now have to be revisited every six months? That's going to be a pretty expensive proposition for them.

But more importantly, what kind of information goes into that recalibration? When it comes to Trump, it's going to be whether a kind of clear and present danger of violence, or whatever, continues to exist, such that he should be kept off the platform. At the risk of being too snarky, what ends up happening in this decision is that the Oversight Board basically tells Facebook that it should be more like YouTube, which is actually the way that YouTube has come out: he is suspended until the situation is such that he doesn't pose a threat.

But these are not mathematical calculations. What would you have to see in the environment right now in order to know whether President Trump would use Facebook to incite violence again, or to engage in problematic behavior? Because frankly, look, no one predicted the January 6th insurrection, right? And once it happened, it was too late. And so it's not clear to me what conditions on the ground you would have to see before you can say, all right, the emergency has passed.

Jeffrey Rosen: [00:31:53] Kate, a minority of the board also was not satisfied with simply lifting the ban, and noted that Facebook's rules should ensure that users who seek reinstatement after suspension recognize their wrongdoing and commit to observing the rules in the future. In this case, the minority suggests that before Mr. Trump's account can be restored, Facebook must aim to ensure the withdrawal of praise or support for those involved in the riots. A very strong suggestion. What do you make of this division between the majority and the minority about when and how to lift the ban, and what this says about indefinite bans moving forward?

Kate Klonick: [00:32:31] Yeah, I think that's a great question. I think that the inclusion of that as the minority opinion is going to be important in what Facebook decides to do in terms of re-establishing or better establishing its standards and how they're going to structure things. I think right now inside the company there are a lot of people, not in the C-suite but on the factory floor, who are very happy about this decision, because they have felt a need for this type of rigor and procedure for a long time. And the board giving voice to this gives them a level of legitimacy to move and agitate for some of these standards becoming better.

I really am curious to see what happens to the idea of newsworthiness coming out of this. In my opinion, it has been an undefinable or circularly defined thing, even in First Amendment Law, for a long time, as a concept in general. And I feel as if Facebook adopted the idea of it because it was a word that they could hide behind to do certain things that were otherwise unpalatable, that people didn't want them to do. And so it's going to be interesting how they instrumentalize this idea of public figures and newsworthiness coming out of this decision.

Jeffrey Rosen: [00:34:06] Nate, as Kate notes, this question of whether influential users should be treated differently than other users is important, and it's flagged by the concluding portion of the decision, the policy advisory statement, where the board asserts that it's not always useful to draw a firm distinction between political leaders and other influential users, and says that Facebook should publicly explain the rules it uses when it imposes account-level sanctions against influential users.

These rules should ensure that when Facebook imposes a time-limited suspension on the account of an influential user to reduce the risk of significant harm, it will assess whether the risk has receded before the suspension term expires. So what do you make of this policy statement, which raises all sorts of questions about how to treat public figures and influential users? And what does it say about how ordinary users will be treated moving forward?

Nate Persily: [00:34:58] So, you know, the casual observer will look at this decision and naturally think that this is all about President Trump and the particular circumstances surrounding the insurrection. But this is a decision as much about Bolsonaro and Modi and Duterte as it is about President Trump. And you need only spend a few hours with Facebook employees to quickly hear about the international implications of everything they do, and that everybody is paying too much attention to what happens in the US.

So you are right that that last advisory portion of the opinion, which was in response to a question that Facebook sought advice on, is quite important. And I think that the Oversight Board is right that you shouldn't think that elected leaders are the only people whose influence matters. But as Kate was saying, there is some question about the newsworthiness exception, and whether a political leader, elected or otherwise, should be held to different standards, either more strict or less strict, frankly.

Because you might think... The argument for why Facebook applied a newsworthiness exception to political leaders is that they're amenable to counter-speech; they have lots of people arguing with them, and they're sort of subject to the marketplace of ideas. The long and the short of it here, for me, is simply that it is really difficult to define a class of individuals whose speech requires different rules than other people's, right? Because, like with all these things, it is going to be dependent on context and the power of a speaker to have an impact on a particular audience.

Jeffrey Rosen: [00:36:30] Thank you for that. Kate, there are other policy recommendations at the end of this decision. In particular, the board finds that Facebook's penalty system is not sufficiently clear to users and does not provide adequate guidance to regulate Facebook's exercise of discretion. Facebook should explain, in its community standards and guidelines, its strikes and penalties process for restricting profiles. It should give users sufficient information. And finally, the board urges Facebook to develop and publish a policy that governs its response to crises or novel situations where its regular processes would not prevent or avoid imminent harm.

So this is an awful lot of requests of Facebook to clarify policies that the board finds are not sufficiently clear. Do you, having studied the board, believe that Facebook will respond by answering the board's request to provide the clarity that its current policies do not provide?

Kate Klonick: [00:37:22] I think that they have to, and there's no reason to think yet, given how they have basically complied with the board so far, that they won't do that going forward. That being said, maybe this is why this part is the thing that I'm most excited about. At an anthropological level, having studied the system inside of this, and having had to reverse engineer and piece together through qualitative conversations with former employees and others how these decisions were made, this speaks very deeply to me and to a lot of my frustration.

One of the biggest problems that I've always seen with how Facebook and these platforms adjudicate their speech standards is that they are opaque. There is a lack of equity, there is a lack of equality, there is a lack of fairness, and, most importantly, there's a lack of consistency. Consider the ad hoc nature of this decision, and the strike system, which I did actually know was this kind of weird justification they have internally, but which most people have no idea even exists. This is crucial to people understanding what their rights are.

And so, you never know what they're going to do. I certainly think that Facebook could try to dodge in some way. And there's of course always the chance that they say that they'll do all of these things and then they don't do them. But I think we're getting closer to having a clear answer and some meaningful transparency around the process than we were ever going to get from them just posting a list of their rules. Instead, we're kind of getting to have an open court, and it's a world of difference.

Jeffrey Rosen: [00:39:26] Nate, how do you think Facebook will respond? And could you imagine a policy as detailed as the one that the board is asking for? You began with Marbury versus Madison; Marshall declined to order President Jefferson to deliver Marbury's commission because, as he said, he was not fond of butting his head against a wall in sport. Here, by contrast, the board is asking for a very detailed series of distinctions: how public figures should be treated differently than ordinary users, indefinite bans versus time-limited bans, the relevance of newsworthiness, and the strike system. Do you think a detailed policy will in fact emerge, and what do you think such a policy should look like?

Nate Persily: [00:40:09] Well, to take that Marbury analogy further, yes, you're right that they declined to order the commission, but they struck down the relevant provision of the Judiciary Act as unconstitutional, right? So it's, again, taking with one hand and giving with the other. And I think that that's what happened here. I actually think the folks in the C-suite, to use Kate's terminology, are perfectly happy with this decision and the way it came out, because now they have some direction from the Oversight Board that is implementable, at least in this case.

Now, I think you are right. Everybody always says the community standards are too vague. And as you know, if you teach First Amendment Law, this is always what happens in a First Amendment case: you say it's too over-inclusive, or it's too vague, or it's too over-broad. That's just the nature of any speech code. You say, well, why didn't they have a policy on speech that might lead to the insurrection at the Capitol on January 6th, right? You cannot come up with, in advance, a detailed incitement standard that will account for all of the possibilities.

Now, I do think that they are able to specify, and have greater transparency over, as Kate suggests, the strike system that they employ, as well as the calibration of penalties and what you will get if you engage in one thing or another. The problem is that if they give users notice like that, then if people know they have three strikes, they will reserve their two strikes for when it's going to be most injurious. Right? So there's a reason they have not been transparent about these things, which is that it leads to a kind of calibration of user behavior to exploit the system.

One graph that Facebook folks will always show you is that wherever they put the line on a speech regulation, the amount of speech increases sort of logarithmically as it approaches the line, so that they will end up with people pushing the envelope as much as possible, no matter where they put that line.

Jeffrey Rosen: [00:42:12] Thanks so much for that. Kate, given the pressures you've described within the company to ban unpopular users like President Trump, and given the minority's view in this decision that President Trump should have to apologize before he's allowed back on the platform, do you imagine Facebook will come up with some standards that ultimately make it hard for him to get back on the platform?

Kate Klonick: [00:42:36] To Nate's point, I think he's completely correct that maybe everyone at Facebook, and maybe everyone at the platforms, is a little bit happy around this decision today, not just the people who wanted to beat up Facebook. But the idea that Facebook now has permission to come up with a process for people like Modi, or for people like Bolsonaro, and people like Trump, I mean, for world leaders generally, and that it'll then have a backstop with something like the Oversight Board, I think is a clear signal out of a lot of noise.

And I think the bigger question will start to turn to the board: are they sufficient to check this power, and whether or not they should start being seen as kind of a legitimate international court or body. We shouldn't be totally surprised, given the quality of the people that are on the board, but the language of this decision is very serious. It is very rigorous, and it's very clear and well-reasoned. And I think it bodes well for this being a new voice in international circles.

Jeffrey Rosen: [00:43:58] Nate, is this decision a model for oversight of free speech decisions on the platforms moving forward? Can you imagine pressures for there to be a Twitter Oversight Board and a Snapchat Oversight Board? And is this a good model? The decision was anonymous; we did not have individual judges signing majority opinions and dissents. There was not a clear distinction between the policy recommendations and the holdings. So if you were to advise Twitter and other platforms about how to set up Oversight Boards, what tweaks or advice would you offer?

Nate Persily: [00:44:38] Well, first, the Oversight Board was set up with its trust and other institutional characteristics so that other firms, if they wanted to, could actually join in and refer stuff to it. I don't think that's ever going to happen, because Facebook is toxic and no other firm really wants to join in any of its dispute resolution mechanisms, but it's there. I think that each platform has sort of different issues that it's dealing with, and Twitter is, I think, taking a different approach. Dorsey has been investigating the idea of sort of distributed content moderation, where users can opt in to different types of content moderation systems.

I think that holds a lot of promise, actually. It's a complicated idea, but the idea is that you choose the rules for your Twitter feed. YouTube, I think, has been the most opaque, and has been more than happy to hide behind Facebook as it takes the slings and arrows of everybody who is complaining about these content moderation decisions. Look, I think that the Facebook Oversight Board is an incomplete, sixth-best solution, where no one is willing to put forth the first five solutions, because governments, which everybody thinks should be the ones making these important decisions, have either dropped the ball or have completely screwed this up.

And so you need only look at India, which is actually where the content moderation fight is going to be fought, and you see the takedowns that the Modi government has ordered against people who are criticizing the government for its coronavirus response. Sometimes those are the rules that a democratic government will come down with. And so the Oversight Board is, you know, a sixth-best solution here. And I think that the other platforms are waiting to see how this works, and then whether they can improve upon it.

Jeffrey Rosen: [00:46:29] Kate, your closing thoughts on this wonderful discussion. We began by calling this the Marbury of free speech law and ended with Nate's thought that it was a sixth-best solution. What do you think the model of this decision is for other platforms moving forward, and what will it say about the future of oversight and online free speech?

Kate Klonick: [00:46:47] I think that we're in a moment where we're giving everything and the kitchen sink a shot at trying to fix the problem of online speech. I think that this is a model of governance that is going to have purchase. And there is a real chance, as Nate says, that other platforms have different types of considerations and different types of problems that would not be perfectly solved by an oversight board. But I think that this is going to be the start of a serious conversation around how to put governance into all of these other platforms.

Jeffrey Rosen: [00:47:26] Thank you so much, Nate Persily and Kate Klonick, for an illuminating, rigorous, and timely discussion of this important decision by the Facebook Oversight Board. Dear We the People listeners, please follow up by reading the decision yourself. And if you have further thoughts, email me and let me know what they are. Nate, Kate, thank you so much for joining.

Nate Persily: [00:47:51] Bye-bye. Thank you.

Jeffrey Rosen: [00:47:56] Today's show was engineered by David Stotz and produced by Jackie McDermott. Research was provided by Mac Taylor and Jackie McDermott. The homework of the week: read the Facebook Oversight Board decision and write to me to tell me whether or not you find the standards it applied persuasive. Please rate, review, and subscribe to We the People on Apple Podcasts, and recommend the show to friends, colleagues, or anyone anywhere who's hungry for a weekly dose of timely, on-the-news constitutional debate.

And always remember that the National Constitution Center is a private nonprofit. We rely on the generosity, the passion, the engagement, and the willingness to read in real time the important decisions affecting the Constitution and the law, of people from around the country who are inspired by our nonpartisan mission of constitutional education and debate. You can support the mission by becoming a member at constitutioncenter.org/membership, giving a donation of any amount to support our work, including this podcast, at constitutioncenter.org/donate, or by engaging with the text, reading the decision, and making up your own mind about how to apply constitutional values in a world of changing technologies. On behalf of the National Constitution Center, I'm Jeffrey Rosen.
