Facebook reveals secret guidelines used to police extremist content

Executive overseeing the Mideast, Europe and Africa admits staff aren’t ‘perfect’ at enforcing the company’s policies

Facebook released guidelines content managers use to decide whether posts should be removed. Dado Ruvic / Reuters

Facebook has published the secret rules its 7,500 content monitors use to remove posts likely to promote terrorism, incite violence or breach company policies covering everything from hate speech to child exploitation, sex, bullying and drugs.

The 27-page Facebook rule book released Tuesday offers an unprecedented insight into how the company decides what its two billion users may or may not share, and how the social media giant navigates the line between censorship and free speech. The rules update the short "community standards" guidelines Facebook has previously allowed users to see.

"You should, when you come to Facebook, understand where we draw these lines and what is okay and what's not okay," Facebook's vice president of product policy and counter-terrorism Monika Bickert, a former US federal prosecutor, told reporters on Tuesday.

In its “Graphic Violence” section, for example, Facebook explains that it removes content that “glorifies violence or celebrates the suffering or humiliation of others” but allows graphic content, with some limitations, to help people raise awareness about issues. In its “Hate Speech” section, Facebook says it does not allow speech that “creates an environment of intimidation and exclusion and in some cases may promote real-world violence”.

The rule book does not address controversial issues that have dogged Facebook for months, however, including the publication of fake news, the Cambridge Analytica data harvesting scandal, or questions about whether Facebook is doing enough to protect the welfare of children online. On Monday, Facebook took another hit when it was sued for defamation by Martin Lewis, a British financial expert who claims his image has been used in 50 fake Facebook adverts to scam millions from vulnerable people.


Siobhan Cummiskey, Facebook's head of policy for Europe, the Middle East and Africa, admitted the company's enforcement of its policies isn’t perfect, but insisted Facebook had the interests of its users at heart and planned to hire additional content reviewers to beef up its 7,500-strong team worldwide.

In an interview with Sky News, Ms Cummiskey said the company uses a combination of technology, human reviewers and the flagging of problem content to remove texts, pictures and video posts that violate standards.

"In the context of child exploitation imagery, we use technology in order to stop the re-upload of known child exploitation images,” she said.

"Technology is also helping to counter terrorism. Ninety-nine percent of terrorist content is removed before it is ever flagged by our community of users."

She insisted that Facebook considers the safety of its users to be paramount “and that's really why we are publishing this new set of community standards".

Facebook told reporters that it considers changes to its content policy every two weeks at a meeting called the “Content Standards Forum,” led by Ms Bickert. Its standards are based, in part, on feedback from more than 100 outside organisations and experts in counter-terrorism, child exploitation and other areas.

For the first time, Facebook said it would also introduce a mechanism that will allow users to appeal decisions to take down content. Previously, users could only appeal the removal of accounts, Groups and Pages.

The new standards highlight Facebook’s determination to act on unacceptable content, but they are also an admission by Facebook that it needs to improve.

"Our policies are only as good as the strength and accuracy of our enforcement and our enforcement isn't perfect. We make mistakes because our processes involve people and people are not infallible," Ms Bickert said in a blog post.

Technology writers were divided about why Facebook chose to release the guidelines on Tuesday.

“Coming soon after CEO Mark Zuckerberg's testimony in two lengthy Congressional hearings earlier this month, the release is well-timed if not overdue,” Wired reported.

Quartz noted that some of the new rules were “clearly developed in response to a backlash Facebook received in the past.”

TechCrunch commended Facebook for making some “significant improvements” but added that the company was effectively shifting criticism to its underlying policies and away from individual enforcement mistakes, such as the 2016 controversy in which Facebook took down posts of the newsworthy “Napalm Girl” historical photograph because it contained child nudity, then restored them.

The 27-page Facebook guide is filled with minute detail, from definitions to explanations of why Facebook removes specific content, including violent or sexual images.

Under the “Graphic Violence” section, for example, Facebook says users should not share images of violence against people or animals accompanied by comments or captions that express enjoyment of suffering or humiliation, an erotic response to suffering, approval of the violence, or an intent to share the footage for sensational viewing pleasure.