Facebook’s Oversight Board has recommended an independent reviewer conduct a “thorough examination” into its content moderation of posts relating to Israel and Palestinians.
The recommendation came after a review of a case in which Facebook removed a user’s repost of an Arabic-language news item about a militant group. It later reversed the decision.
Facebook originally removed the content under the Dangerous Individuals and Organisations Community Standard Policy, and restored it after the Oversight Board selected the case for review.
Facebook was unable to explain why two human reviewers originally judged the content to breach the policy, noting that moderators are not required to record their reasoning for individual content decisions.
The online platform has been criticised for allegedly acting against Palestinian activists during and after an outbreak of violence in April and May this year.
The board said Facebook declined to answer all of its questions about whether the social media platform had censored Palestinian content in response to Israeli government demands.
While Facebook said it had not had any official requests, it did not reveal whether it had received any unofficial ones.
Public comments submitted about the case included allegations that Facebook disproportionately removed or demoted content from Palestinian users and content in Arabic during the conflict.
There were also criticisms of the company’s treatment of posts threatening anti-Arab or anti-Palestinian violence within Israel, as well as reproaches for not doing enough to remove content that incited violence against Israeli civilians.
Social media platforms have been dogged by issues with their moderation of content in non-English speaking conflict areas such as Palestine. As these sites tinker with their moderation systems, Palestinians are increasingly reporting that their digital rights are being violated by these platforms.
Among the oversight board’s recommendations to Facebook was the formalisation of a transparent process on how it receives and responds to all government requests for content removal.