Why Facebook liked the idea of asking for help in deciding what to publish

Exclusive: Social media giant's director for global affairs and governance explains its plans to develop an independent oversight board with participation from the region

Facebook is changing. And it is listening.

At least that is the message the company has been working to make loud and clear in recent months as it strives to minimise the fallout from nearly three years of scandal.

There have been revelations of how Facebook was used to manipulate voters in elections in the United States and elsewhere, criticism over the lack of oversight of how its users’ data is protected, and pressure to stem the proliferation of hate-fuelled content on its platforms. Taken together, these have created a global atmosphere of suspicion and distrust over how the company is run.

Facebook has responded in the past year, and among its responses is the creation of an independent oversight board of experts to review its content decisions.

“The company is embracing a wider set of approaches for how it operates. Our CEO Mark [Zuckerberg] had a comment on the earnings call recently where he talked about how, for when we launch products now that touch societal issues, we are going to go out and consult on them and think in advance about how to build them,” says Brent C Harris, Director for Global Affairs and Governance at Facebook.

He was speaking exclusively to The National during his visit to Dubai last month, part of global consultations, meetings and workshops about the idea of an oversight board of experts. The UAE meetings included representatives of academia and researchers of digital content and rights and internet governance, from across the Middle East and North Africa.

“I believe you will see the company do this more and more around big changes and big decisions and I think that is something new for how we operate,” says Mr Harris of the consultations.

The board idea came out of wide-ranging talks within the company that began at least as far back as the early part of last year.

“We had discussions pretty much every week internally, and one of the ideas that was proposed was that we should create some board to do a review of really difficult content decisions. I think there was an emerging consensus that it was something worth trying and worth building.”

Facebook users produce a vast amount of content. Every minute, hundreds of thousands of comments are posted, and about 350 million photos are uploaded each day. Data from 2013 (the most recent figures available) also showed 4.75 billion pieces of content were shared daily – a 94 per cent increase on August 2012 figures.

The timing and the nature of the internal conversations predate some of the more recent issues around hate speech that Facebook has been grappling with. These discussions also started long before the controversy in March, when the Christchurch mosque attacker was able to live stream his deadly assault on the platform for 17 minutes before it was blocked.

“There was a growing sense that the [content] decisions we were taking are ones that we shouldn’t make alone and I don’t think that speaks to any single issue. It is about a growing belief that we don’t believe the decisions should sit solely inside Facebook,” Mr Harris says.

Despite criticism that Mr Zuckerberg and chief operating officer Sheryl Sandberg have appeared reluctant to be involved in decisions over the thornier content issues, the idea of an external content advisory board does not mean Facebook is looking to shirk any of its responsibility.

It says it already makes thousands of difficult decisions every day about what content should stay up and what should come down. While the board would deliberate on the way Facebook handles specific pieces of content, the company will continue to set policy.

“It is a question that if you step back and think, given the responsibility that Facebook has today, what is the best way to actually be able to engage in that, the best way to exercise that responsibility,” Mr Harris says.

Facebook has often seemed to set itself up as a champion of free speech above all other things.

However, it has taken steps recently to stop the more extreme voices from having a platform, most recently when it banned half a dozen high-profile extremists and one conspiracy theorist with massive followings from its platform and from Instagram, which Facebook also owns.

Still, it remains keen to strike a balance between online freedom and safety.

“A lot of the matters that will go before the board are the hard questions of trade-offs between those principles and trying to figure out for a specific piece of content, where do you set that line? That line is a hard one at times to figure out,” according to Mr Harris.

Apart from Dubai, consultations about the board have been held around the world, including in France, Mexico, Singapore and India.

“There has also been [a] fairly consistent set of feedback that the people who should serve [on the board] should be folks who are deeply deliberative and who are impartial,” he says.

Nashwa Aly, Facebook’s Head of Public Policy for the Middle East and North Africa, says that “it is on us, as Facebook, to improve our outreach in order to make sure people understand our community standards and content policy better”.

The board will further show, she says, that “Facebook is not unilaterally making all of the decisions on content pieces, and they are willing to learn and they are willing to open up and they are willing to diversify their understanding across the world because it is a truly global platform”.

Much is still up in the air until the consultations conclude and the process for developing the board is completed by the end of this year. The number of members could range from 20 to 100 experts, and it will be a diverse group of people from across the world.

The Middle East is expected to be represented on the oversight board.

“The fact that they were keen to make sure the Middle East is represented in global consultations” is indicative of this being the case, Ms Aly says.

It is ironic that a company so representative of the digital age is putting so much effort into building a physical institution.

Mr Harris says that “part of how we moderate content today is very technology focused, very product focused. At the same time what animates the concept is, I think, an increasing belief that, as more of society becomes digital that we also need some of the institutions that we find in traditional society. I don’t think it’s an accident that it may be a low-tech idea but it’s also a time in which, I think, some of the low-tech ideas are increasingly needed in a digital world.”

Ms Aly says that the reason “why people approve of us doing this and [are] very encouraging of it, is the fact that it is very human based, it is not relying on a machine, it is not relying on an algorithm. Humans still trust other humans to make those kinds of decisions.”

It is tempting to dismiss this initiative as a global publicity roadshow aimed at a quick boost for Facebook's recently battered image – look no further than the cover of Wired magazine's April issue: an illustration of a visibly beaten-up Mark Zuckerberg.

But Mr Harris insists that “it is too hard to be PR. If we were looking to do PR, we would have picked something that is much easier to figure out”.

“We’ve gone around the world and really tried to listen to feedback.”

Updated: May 11, 2019, 9:36 AM