Facebook Provides New Overview of Oversight Board Decisions and Actions


The big question around Facebook’s independent Oversight Board project has always been how much impact it can actually have – and whether it will be able to make Facebook evolve its more controversial policies.

The core concept makes sense – Facebook has established a group of experts, from a range of fields, to provide another avenue for reviewing its content decisions, giving Facebook users a means to seek more fairness and impartiality in those rulings, while also providing guidance for Facebook’s policy decisions.

Given the various challenges Facebook has faced on this front – from allowing antisemitic speech to ‘censoring’ the former President – many would agree that this is sorely needed. But technically, Facebook doesn’t have to act on any of the Oversight Board’s recommendations.

So are they? Has Facebook been implementing the Board’s recommendations – and is that helping to improve its approach?

Facebook’s new quarterly Oversight Board update provides a new level of transparency on this front, outlining the full scope of the Board’s actions thus far, and how Facebook has responded to its recommendations.

And it does seem, on the face of it, that the Oversight Board is helping to improve Facebook’s systems.

[Chart: Facebook Oversight Board stats]

As you can see in this chart, in the first quarter of 2021, the Oversight Board issued 18 recommendations based on six cases, and Facebook is implementing – fully or in part – 14 of them.

“[We’re] still assessing the feasibility of implementing three, and taking no action on one. The size and scope of the board’s recommendations go beyond the policy guidance that we first anticipated when we set up the board, and several require multi-month or multi-year investments.”

In addition to these individual case notes, Facebook has also called upon the Board to assess 26 of its content decision cases – those relating to its platform rules – from which the Board has selected three.

[Chart: Facebook Oversight Board cases]

Those three cases relate to:

  • A case about supposed COVID-19 cures
  • A case of a veiled threat based on religious beliefs
  • A case about the decision to indefinitely suspend former US President Donald Trump’s account

You’ve likely heard about the last of these, while the other decisions have provided Facebook with more guidance for its overall platform rules and regulations – which will, ideally, lead to Facebook creating more balanced, nuanced rules for what it will and won’t accept on its platforms.

Of course, that will never please everybody. Some users will see such rules as censorship, while others will say that Facebook needs to do more to protect users.

It can’t get everything right all of the time, but the addition of these independent expert insights will ideally help Facebook better align with societal expectations, and lessen the platform’s potential negative impact in amplifying some of the more controversial and divisive elements.

Facebook says that it’s already implemented various upgrades as a result of this guidance, including improved explanations for policy violations, and new tests to assess the impact of telling people whether automation was involved in enforcement. Facebook says that it’s also updated its Dangerous Organizations and Individuals policy, “creating three tiers of content enforcement for different designations of severity and adding definitions of key terms”.

So the Oversight Board is having an impact, with its outside perspective helping to better shape the platform’s policy approach.

That could lead to more platforms, or regulatory bodies, looking to implement similar systems – which, really, is what Facebook’s Oversight Board project is all about. Facebook doesn’t want to be the one making the rules on what is and isn’t allowed on its platform; it wants all digital platforms to come under the same rules and enforcement, which can only happen via independent assessment like this.

The Oversight Board is really Facebook’s experiment to show how this could work, and it could eventually provide a framework for both regulation and improving online discourse.

The results here provide some insight into that process, with a view to that future.

And they do show some promise, at least based on these early findings.

You can check out Facebook’s full Oversight Board Q1 review here.