May it Please the Court: Exploring Facebook’s Oversight Board Formation and Decisions

Sarah Ryan

Associate Editor

Loyola University Chicago School of Law, JD 2022

Last Friday, Facebook’s Oversight Board (“the Board”) issued its latest verdict, overturning the company’s decision to remove a post that moderators alleged violated Facebook’s Violence and Incitement Community Standard. This judgment brings the Board’s total number of decisions to seven, with the Board overturning Facebook’s own decision in five of the six substantive rulings it has issued. The Board’s cases have covered several topics so far, including nudity and hate speech. Because Facebook’s Oversight Board has no modern equivalent, it is worth exploring what went into this experiment’s formation.

Developing Facebook’s “Supreme Court”

While fashioning itself as a haven of free expression on the Internet, Facebook has long been the target of criticism that the conspiracy theories, hate speech, and disinformation that run rampant on the platform threaten democracy. Because the First Amendment limits the U.S. government’s ability to regulate speech, Facebook, like other technology platforms, has exercised nearly unlimited control over content removal. Over time, Facebook developed its own set of Community Standards to ensure that its two billion global users “feel empowered to communicate” but are protected against the risk of harm or abuse that certain content presents.

In 2019, in response to mounting concerns about Facebook’s threat to democracy and calls for transparency, accountability, and due process, the company decided to create an independent organization to review its moderators’ decisions, much like the U.S. Supreme Court reviews judicial decisions by lower courts. Previously, users could object to content removal by appealing the removal to Facebook’s community moderation team, but that was the extent of any dispute. However, Mark Zuckerberg, Facebook’s co-founder and CEO, has said he now believes that he and the company should not have complete control over decisions about free expression, political discourse, and safety.

In creating the new group, Facebook sought input from both the company’s critics and supporters, consulting more than 650 people in 88 countries. This global consultation process resulted in the Oversight Board’s charter and governance design, as well as the establishment of an independent Oversight Board Trust. Each entity has separate roles and responsibilities, which is meant to help “ensure mutual accountability.” Facebook began the member selection process with four co-chairs, who then selected the Board’s first sixteen additional members, experts representing twenty-seven countries and a variety of disciplines, including law and human rights.

Board members are not Facebook employees, but instead contract directly with the Oversight Board, and their six-figure salaries are funded by Facebook through the “independent” Oversight Board Trust. Anyone can recommend oneself or another candidate for board membership, through a recommendations portal operated by the firm Baker McKenzie.

How the Decisions Work

Users of Facebook and Instagram (and Facebook itself) can now appeal the Facebook community moderation team’s decisions regarding “takedowns” to the independent Oversight Board, which renders decisions that are supposedly binding, even if they overrule Zuckerberg himself. According to Facebook, around two hundred thousand posts become eligible for appeal every day, but the Board selects only a small number of “highly emblematic cases” to review. Each case is reviewed by a panel of five randomly selected members, who remain anonymous. The Board’s charter confirms that the expert panels should generally defer to past decisions in an attempt to set some sort of precedent with the tiny fraction of cases that the Board will actually be able to review.

While many have labelled the Board as Facebook’s private “Supreme Court,” there are important differences between the bodies. First, unlike at the U.S. Supreme Court, there are no oral arguments; instead, users submit a written brief arguing their case. In turn, a Facebook representative files a response brief explaining the company’s reasoning for removing the content. After the panel’s decision is approved by the entire board, it becomes binding for Facebook and is implemented promptly “unless implementation of a resolution could violate the law.” Also deviating from the U.S. Supreme Court, the Board’s rulings do not become official Facebook policy in the way Supreme Court rulings do under the tradition of stare decisis. The Board’s rulings apply only to the individual case at issue, and while the Board may issue policy recommendations to Facebook with its decisions, these recommendations are only advisory. This means that the impact of the Board’s decisions on Facebook’s policies and practices remains to be seen.

So far, the independent Oversight Board has been met with mixed reviews. Its future success seems to depend on the degree to which it demonstrates independence, organizational legitimacy, and transparency. It remains to be seen what relationship might develop between this new body and existing laws and regulations. Some posit that this quasi-legal system could cite judicial opinions from different countries, and that real courts might even eventually cite the Board’s opinions in return. Regardless, the world will continue to monitor the Board in the upcoming months, as it resolves the issue of Donald Trump’s removal from the platform.