Two and a half years after Mark Zuckerberg floated the idea on a podcast, and several months after organizers said it would be ready to hear cases, Facebook’s independent Oversight Board is now up and running. In a call with reporters today, the board’s co-chairs said that they are now prepared to hear appeals from the billions of people who use Facebook and Instagram each day. The ability to appeal to the board will roll out gradually around the world over the next several weeks. And when it’s complete, one of the bolder experiments in platform governance will have finally begun.
Let’s begin by taking a step back. One of the more striking things about our social networks, I have argued, is how terrible they are at customer service. If your post is taken down in error, or your account is suspended for unclear reasons, or you are banned and lose access to all of your data, historically you have had almost no recourse whatsoever. You fill out a form, you send it in, you pray. Maybe your prayer will be answered; more likely it will not be. Tech journalists might be more sensitive to this problem than almost anyone else: each day, our inboxes fill up with anguished requests from users of YouTube and Twitter and Instagram who find their posts blocked or their accounts banned. We can almost never help them.
In the early days of social networks, this did not seem to be of particular concern to the tech companies themselves. Good customer service eats away at profit margins, and executives were convinced that most of it could be effectively automated anyway. But as the platforms grew into monoliths, to the point that even the slightest change to the user interface draws angry Congressional inquiries, the question of tech power has come to feel more urgent.
These questions of power feel particularly acute when it comes to speech. This is especially true at Facebook, where the founder CEO has majority control of voting shares and lacks any real check on his power. When a high-profile and thorny issue of speech arises — as it did several times this summer — it lands at Zuckerberg’s feet, for him to make the final call. Given Facebook’s vast size, this setup has ensured that at any given time, millions of people are angry at him and at Facebook for deciding against them. It has also ensured that, aside from complaining, they can’t do anything about it.
What the Oversight Board now promises is — well, what if they could?
When it’s fully operational, you will have the ability to appeal a Facebook content moderation decision to an independent body, with the company honor-bound (if not legally required) to accept its decisions. For starters, you will only be able to appeal when you believe your post has been wrongfully removed; eventually, you’ll be able to appeal when you believe a post has wrongfully been allowed to stand. And starting now — ahead of the November 3 US presidential election — Facebook will also be able to refer policy issues to the board and receive advisory opinions on what to do.
All of this has been a significant undertaking. Facebook placed $130 million into an irrevocable trust to fund the board’s operations, and for its initial members recruited a former prime minister, a Nobel Peace Prize laureate, constitutional law scholars and human rights advocates.
There has also been the technical work. Facebook built software that will let it transfer cases to the board in a way that protects users’ privacy, and a case management tool that lets board members choose cases to review, examine outside opinions and other supplementary materials, and deliberate with their peers. (The board will eventually have 40 members, but individual cases will be heard by a small panel.)
It has taken more time than Facebook once hoped. And the pandemic hasn’t helped — Facebook had to scrap plans to get board members together for in-person training on the new system. But Brent Harris, Facebook’s head of governance, told me in an interview yesterday that the company had moved as quickly as it could.
“From January 1, we have been in institution-building mode,” he said. “I’m not sure how many institutions in 10 months have actually gotten to a spot where they were then ready to take on a responsibility like this — to take appeals from 2 billion people around the world. So we actually think that we’ve moved pretty fast on this one.”
At the same time, Facebook and the board have come in for criticism for failing to get to work before voting began in the US election. In September, a group of vocal Facebook critics announced they were forming a rival organization, confusingly titled the Real Facebook Oversight Board, to begin issuing immediate opinions on what Facebook ought to do. (The gist was that Facebook should take a lot of things down.)
In any case: the board is here, and I’m glad. But it can all feel a little anticlimactic, mostly because so many of my big questions about the board remain unanswered.
On a call with the board’s co-chairs this morning, I asked how many cases it expects to hear. Will it select a handful each year, like the US Supreme Court, or will it be set up to process more? (It has more than four times as many “justices” as the Supreme Court, but is serving more than 10 times as many “citizens.”)
“It’s a good question — and one that I think we will be developing over time as we see what the volume of cases that are appealed might be, and as we further develop and refine our case selection procedures,” said Jamal Greene, the board’s co-chairman.
Board members made it clear that, while the board can move quickly should it choose to do so, for the most part it won’t. Facebook will continue to moderate the vast majority of all content on its platform, and to hear the first round of user appeals itself. The Oversight Board will hear only a small fraction of cases beyond that. But if the company or the board has any idea how many appeals to expect, they’re not saying. (I asked people at both.)
“Facebook was always criticized for moving fast and breaking things,” said Helle Thorning-Schmidt, the former prime minister of Denmark and board co-chairwoman. “I think we are looking at the opposite — we want to look at quality, and look at how we are here for the long term, rather than to move quickly and be under a lot of time pressure.”
The idea is that the board will pick representative cases — ones that will set precedents and cause Facebook to update its policies. If the board had existed when photos of breastfeeding were a big controversy, you can imagine it taking one case in which a user’s photo had been removed and making the case that Facebook should be more permissive, causing it to relax its policies around the world.
Embedded in the Oversight Board is the idea that an entity as large as Facebook ought to have something resembling a justice system. I’m pressing it on how many cases it might hear because I want to know how much justice we can expect from it.
I’m confident that Facebook will use it as a kind of release valve for some of the trickiest policy decisions that it faces. And I’m sure the board will provide thoughtful guidance on a wide range of issues and individual cases where Facebook has erred. I’m less confident that the board can make Facebook feel more just to the average person — the one who logs on to find that their business’s page has been removed, their account has been suspended, or their post has been put behind a warning screen. Customer service issues are on some level about justice, but that’s not the kind of justice that the board is set up to provide.
Still: I’m optimistic. For all its faults, the board still represents an unprecedented move to devolve some of a tech giant’s power back to the people that, on some level, it represents. Yes, it will serve to give Facebook public-relations cover during controversies. But it also enshrines the principle that citizens of a platform have a right to redress their grievances. However much justice the board offers them in the future will likely be more than they are getting today.
This column was co-published with Platformer, a daily newsletter about big tech and democracy.