Facebook admits it could have done more to stop Myanmar violence incited on its platform

The social network said it hadn’t “done enough” to prevent the spread of violence-inducing posts on its site.

Rohingya refugees perform prayers at a ceremony marking the first anniversary of a military crackdown that prompted a mass exodus from Myanmar to Bangladesh, at the Kutupalong refugee camp in Ukhia, August 25, 2018. AFP / Dibyangshu Sarkar

Facebook has admitted it did not act swiftly or comprehensively enough to prevent its platform from being used to foment division and incite violence in Myanmar, saying it “should do more”.

The admission comes after an independent report, commissioned by Facebook and carried out by Business for Social Responsibility (BSR), found that the social network had failed in its responsibility to keep false information and hate speech off its platform, and that despite improvements there was still a “high likelihood” of hate speech being posted on Facebook in Myanmar today.

"The report concludes that, prior to this year, we weren't doing enough to help prevent our platform from being used to foment division and incite offline violence," Facebook's product policy manager Alex Warofka wrote in a blog post on the company's corporate website.

“We agree that we can and should do more.”

More than 700,000 Rohingya people have fled across the border from Myanmar into Bangladesh since last summer as violence, rape and murder were meted out against the Muslim minority by the country’s military in a campaign the UN has described as genocide.

Although the 62-page report made little mention of the Muslim Rohingya people specifically, it laid blame for inciting “offline violence” in Myanmar at Facebook’s door. Facebook’s lack of action on hate speech and misinformation, combined with Myanmar’s inadequate legal protection of human rights, had created an “enabling environment” for human rights abuses, it said.

Facebook grew exponentially in Myanmar after the country opened up to the outside world in 2011, reaching 20 million users by mid-2018. The social network has as many account holders as there are internet users in Myanmar, and many phones come with the platform’s app pre-loaded.

The swift and sudden rise in internet use has led to a crisis of internet literacy, BSR said, with citizens finding it difficult to distinguish fact from misinformation, something “bad actors” exploited to spread hate against the Rohingya people and others.

“There are indications that organised groups make use of multiple fake accounts and news pages to spread hate speech, fake news, and misinformation for political gain,” the report said.

“Rumours spread on social media have been associated with communal violence and mob justice.”

After the plight of the Rohingya people became the subject of international condemnation, and Facebook was singled out by the UN as “slow and ineffective” in taking action on the issue, the network made changes. In a blog post discussing the BSR report’s conclusions, Facebook said it had hired 99 native Myanmar-language speakers to review content on the platform and extended its use of artificial intelligence to detect and delete harmful posts.

It has also thrown several hard-line Buddhist monks and members of the Myanmar military off the platform.

The report’s conclusions and Facebook’s admission come as the platform is once again under the spotlight for being slow to act on misinformation and hate on its site. On Monday, it finally removed a campaign advert from Donald Trump that multiple television networks had refused to air. By the time it was removed, the video had been viewed millions of times.