Facebook moderators take battle for better conditions to Irish parliament

It will be the first time a social media moderator has testified to a parliamentary committee


Facebook moderators are taking their fight for better conditions to the Irish parliament.

Experts have warned that more support is needed for the social media giant’s moderators, who are employed by outsourcing firms, and that a change in the model for handling graphic content is “desperately” needed.

It will be the first time a social media moderator has given evidence to a parliamentary committee.

Former moderators have told The National they had to assess up to 1,000 posts per shift, including disturbing images, against hundreds of criteria, and that they were expected to achieve 98 per cent accuracy.

The Oireachtas Committee on Enterprise, Trade and Employment is due to hear evidence on Wednesday from a Dublin-based moderator, the Communication Workers Union and the director of advocacy group Foxglove, Cori Crider.

They are expected to talk about the working conditions of moderators and the psychological support available to them.

More than 30 moderators are currently taking legal action against Facebook and its outsourcing firms after allegedly suffering post-traumatic stress disorder from viewing explicit content.

"I’m joining Facebook moderator Isabella at a meeting with an influential committee of the Irish parliament," Ms Crider said.

"It’s an opportunity to persuade Irish lawmakers to support our campaign for Facebook moderators to be treated fairly.

"Content moderators do essential work for big tech corporations like Facebook. They deal with some of the worst content imaginable - child abuse, terrorism - so the rest of us don’t have to see it.

"But the social media companies don’t want to treat their content moderators properly and use outsourcing to keep them at arm’s length."

Earlier this year, an investigation by The National revealed some moderators have been left traumatised by terrorist videos and feel they have not received adequate training or access to mental health professionals, unlike Facebook's in-house staff.

"At the moment, most of the people who do content moderation work for Facebook are not full employees with full employment rights," Ms Crider said.

"Instead they are 'outsourced' to different companies."

Facebook employs more than 15,000 moderators globally through outsourcing companies.

Hundreds of moderators currently employed by Facebook are also campaigning for more support, training and better working conditions.

The chairman of Ireland's justice committee, James Lawless, raised concerns in March that Facebook was not "urgently addressing" content issues after dozens of moderators highlighted workplace problems.

Mr Lawless met Facebook chief executive Mark Zuckerberg three years ago to discuss the company's use of agency staff to monitor graphic content.

"I am still not satisfied that Facebook is addressing the issue with sufficient urgency and the fact that such moderators are typically kept at a legal remove, by use of agency staff, suggests the long-term intention is not positive for them either," he previously told The National.

Last year, Facebook agreed to pay $52 million to 11,250 current and former US moderators to compensate them for mental health issues developed on the job.