Why Facebook's idea for a ‘Supreme Court’ can’t handle all disputes

Oversight board is a worthy step but a tiny solution for an unimaginably vast problem

Facebook appears to be moving ahead with the Supreme Court-like content oversight board it has been discussing for a year. It’s a worthy step but also a 1 per cent solution for an unimaginably vast problem.

Mark Zuckerberg, Facebook’s co-founder and chief executive officer, has been talking for more than a year about an independent authority that would serve as the final arbiter of whether a social network post should stay online or be removed for breaching the company’s rules against hate speech, calls to violence or other abuses. People would also be able to appeal to the independent body if they believe one of their posts has been unfairly flagged or removed.

Facebook has solicited feedback on the structure for this Supreme Court-like body.

This is a promising idea, and I’m glad that Facebook, Google’s YouTube, Twitter and other internet gathering places are all (belatedly) thinking hard about how to deal with the inevitable and sometimes deadly downsides of giving billions of people a public megaphone. Sensible principles, however, must be tested and revised against reality, and I hope that once the board begins its work, the public will have opportunities to assess how well it is working.

But no one can pretend that this board of perhaps dozens of people will be able to tackle more than a minuscule fraction of disputed posts each year. That makes it useful for high-profile judgement calls, such as the doctored video of US House Speaker Nancy Pelosi that recently circulated on Facebook and drew criticism over the company’s handling of it. Indeed, a Facebook executive suggested recently that the oversight board would be helpful for the “dozens” of cases each year in which there is internal debate about the right approach to a post or video.

Dozens of cases are immaterial at Facebook’s scale. The company says it receives millions of reports every week from people worldwide who believe posts contain nudity, graphic violence, hate speech or other potentially inappropriate material. Many of the judgement calls are made by relatively low-wage contractors, who are left scarred by sifting through the worst of humanity to make split-second decisions on whether a post violates Facebook’s rules. It is the hidden horror show behind the internet’s most popular hangouts.

A Supreme Court would be the opposite end of this: a high-gloss, highly selective, presumably well-paid group that would deliberate on how best to balance free expression against protecting people from harassment, violence or manipulation. Facebook likes ideas that “scale”, and a Supreme Court cannot possibly scale to the 2.7 billion people who use Facebook’s internet hangouts. That doesn’t mean it’s not worth doing, but let’s also not pretend an oversight board is anything close to a silver bullet.

I also can’t help thinking that there is too much focus on Facebook’s policies and procedures and not enough on what the company does in real life. Facebook’s favourite excuse is that terrorism, calls to violence, promotion of illegal drugs, child exploitation and other abuses “are not allowed” on its internet hangouts. That is true on paper: Facebook has written rules, and those rules specifically prohibit all manner of misdeeds. And yet all those abuses are rampant, because Facebook exists in the real world and not on paper.

Facebook is a reflection of the world, with the best and worst of humanity amplified and exposed to a wide audience. That means Facebook’s good intentions don’t matter. Its purported diligence and earnestness do not matter. What matters is what the company does when its good intentions meet reality, and too often Facebook has failed in that regard.

Groups in Myanmar complained repeatedly that people there were systematically harnessing Facebook to sow hatred and violence against the Rohingya ethnic minority, and yet Facebook could not or would not stop it. Would the Rohingya’s supporters have been served by a Facebook Supreme Court? Would it matter if they appealed to an oversight board about genocide after the fact? Again and again, people have broadcast acts of violence on Facebook in real time, yet the company continues to allow live video that is difficult or impossible to police until after the harm has been done.

Setting and enforcing rules for 2.7 billion people is not simple. Policing an open space for billions is, by necessity, often reactive. Facebook too often ignores systemic problems until someone important complains or until they are too glaringly obvious to ignore. Each country also has its own norms about the appropriate balance between free expression and harmful speech. And a single post or photo may be innocuous on its own but become dangerous as part of a pattern to encourage violence or sow division in an electorate.

An oversight board for Facebook’s high-profile content disputes is a good step, assuming it is transparent about its work. But the public and regulators should keep pressing Facebook and its peers on the bigger, more pernicious problems.