Facebook released guidelines content managers use to decide whether posts should be removed. Dado Ruvic / Reuters

Facebook reveals secret guidelines used to police extremist content


Facebook has published the previously secret rules its 7,500 content monitors use to remove posts likely to promote terrorism, incite violence or breach company policies covering everything from hate speech to child exploitation, sex, bullying and drugs.

The 27-page Facebook rule book released Tuesday offers an unprecedented insight into how the company decides what its two billion users may or may not share, and how the social media giant navigates the line between censorship and free speech. The rules expand on the short "community standards" guidelines Facebook had previously made available to users.

"You should, when you come to Facebook, understand where we draw these lines and what is okay and what's not okay," Facebook's vice president of product policy and counter-terrorism Monika Bickert, a former US federal prosecutor, told reporters on Tuesday.

In Facebook's “Graphic Violence” guidelines section, for example, Facebook explains that it removes content that “glorifies violence or celebrates the suffering or humiliation of others” but allows graphic content, with some limitations, to help people raise awareness about issues. In its “Hate Speech” section, Facebook said it doesn’t allow speech that “creates an environment of intimidation and exclusion and in some cases may promote real-world violence”.

The rule book does not address controversial issues that have dogged Facebook for months, however, including the publication of fake news, the Cambridge Analytica data harvesting scandal, or questions about whether Facebook is doing enough to protect the welfare of children online. On Monday, Facebook took another hit when it was sued for defamation by Martin Lewis, a British financial expert who claims his image has been used in 50 fake Facebook adverts to scam millions from vulnerable people.


Siobhan Cummiskey, Facebook's head of policy for Europe, the Middle East and Africa, admitted the company's enforcement of policy violations isn’t perfect but insisted Facebook had the interests of its users at heart and plans to hire additional content reviewers to beef up its 7,500-strong team worldwide.

In an interview with Sky News, Ms Cummiskey said the company uses a combination of technology, human reviewers and the flagging of problem content to remove texts, pictures and video posts that violate standards.

"In the context of child exploitation imagery, we use technology in order to stop the re-upload of known child exploitation images,” she said.

"Technology is also helping to counter terrorism. Ninety-nine percent of terrorist content is removed before it is ever flagged by our community of users."

She insisted that Facebook considers the safety of its users to be paramount “and that's really why we are publishing this new set of community standards".

Facebook told reporters that it considers changes to its content policy every two weeks at a meeting called the “Content Standards Forum,” led by Ms Bickert. Its standards are based, in part, on feedback from more than 100 outside organisations and experts in counter-terrorism, child exploitation and other areas.

For the first time, Facebook said it would also introduce a mechanism that will allow users to appeal decisions to take down content. Previously, users could only appeal the removal of accounts, Groups and Pages.

The new standards highlight Facebook’s determination to act on unacceptable content, but they are also an admission by Facebook that it needs to improve.

"Our policies are only as good as the strength and accuracy of our enforcement and our enforcement isn't perfect. We make mistakes because our processes involve people and people are not infallible," Ms Bickert said in a blog post.

Technology writers were divided about why Facebook chose to release the guidelines on Tuesday.

“Coming soon after CEO Mark Zuckerberg's testimony in two lengthy Congressional hearings earlier this month, the release is well-timed if not overdue,” Wired reported.

Quartz noted that some of the new rules were “clearly developed in response to a backlash Facebook received in the past.”

TechCrunch commended Facebook for making some “significant improvements” but added that the company was effectively shifting criticism to its underlying policy instead of individual incidents of enforcement mistakes, such as the 2016 controversy in which Facebook took down posts of the newsworthy "Napalm Girl" historical photo because it contained child nudity, then restored them.

The 27-page Facebook guide is filled with minute detail from definitions to explanations about why Facebook removes specific content, including violent or sexual images.

Under the “Graphic Violence” section, for example, Facebook says users should not share images of violence against people or animals accompanied by comments or captions that express enjoyment of suffering or humiliation, an erotic response to suffering, remarks that speak positively of the violence, or an indication that the poster is sharing footage for sensational viewing pleasure.
