Meta oversight board hits out at shortcomings in Facebook VIP moderation system

The cross-check program was designed to help Meta avoid a PR backlash from famous people whose posts are mistakenly taken down for breaking its rules


Meta's oversight board has criticised Facebook's cross-check program, which moderates its most powerful users.

The internal system, which exempts high-profile users, including former US president Donald Trump, from some or all of its content moderation rules, "is flawed in key areas which the company must address", the board said in a report.

The board accepted a request from the company to review cross-check and make recommendations on how the program could be improved.

It found that Meta was performing about 100 million enforcement attempts on content every day.

"At this volume, even if Meta were able to make content decisions with 99 per cent accuracy, it would still make one million mistakes a day," the report said.

"In this respect, while a content review system should treat all users fairly, the cross-check program responds to broader challenges in moderating immense volumes of content."

The oversight board's report said that the cross-check system resulted in users being treated unequally and that it led to delays in taking down content that broke the rules because there were up to five separate checks.

Decisions, on average, took more than five days, it found.

"When users on Meta’s cross-check lists post such content, it is not immediately removed as it would be for most people, but is left up, pending further human review," the report stated.

"In our review, we found several shortcomings in Meta’s cross-check program. While Meta told the board that cross-check aims to advance Meta’s human rights commitments, we found that the program appears more directly structured to satisfy business concerns."

The board said it understood that Meta is a business, but that by providing extra protection to certain users, selected largely according to business interests, cross-check allows content that would otherwise be removed quickly to remain up for longer, potentially causing harm.

For content posted by users in the US, the average decision took 12 days; for content from Afghanistan and Syria, it took 17 days. In some cases, it took far longer: one piece of content waited 222 days, more than seven months, for a decision, the report said, without providing further details.

The board noted that it first became aware of cross-check in 2021, when deciding its case on the suspension of Mr Trump's accounts.

Nick Clegg, Meta’s president for global affairs, tweeted that the company requested the review of the system “so that we can continue our work to improve the program”.

To fully address the board’s recommendations, “we’ve agreed to respond within 90 days”, he added.

Among its 32 recommendations, the board said Meta “should prioritise expression that is important for human rights, including expression which is of special public importance”.
