Traumatised former Facebook moderator: I cried my eyes out in a cafe

Chris Gray reveals horrors he witnessed that led him to sue the social media giant

This former Facebook moderator is suing for psychological damage

Watching self-harming images and videos of killings several times a day became a way of life for Chris Gray.

The former Facebook moderator was responsible for viewing hundreds of pieces of inappropriate material every day and it was not until years later that the psychological trauma hit him.

"I was sat in a cafe in Dublin talking about what I had witnessed and I just started crying," he told The National.

“All the horrific things I’d seen. I never realised at the time how much it had affected me. We were under pressure to meet targets and it was just a continuous, never-ending stream of material showing the worst of society. Watching a child kill herself or others being shot and tortured, I became numb to it for a time.”

Mr Gray, 54, was employed full-time as a community operations analyst for Facebook by recruitment firm CPL and was based at its tech hub in Dublin, Ireland.

Facebook's European headquarters in Dublin. (Photo by Niall Carson/PA Images via Getty Images)

The strain of the work started taking its toll after six months, and he began suffering nightmares, anxiety and mood swings, followed later by a breakdown.

He received a post-traumatic stress disorder (PTSD) diagnosis and is leading a pioneering compensation case against Facebook and CPL.

“I was getting depressed,” he said.

“I was becoming very hardened to what I was seeing and I could see my mood was changing. I just soldiered on, I didn’t realise that myself and others were being harmed by it.”

He joined Facebook’s team in 2017 when the company had just launched an evening shift for moderators to cope with an increase in demand.

After just eight days of training, he and 50 others were given a copy of Facebook's guidelines and began the difficult task of sifting through thousands of reported posts.

“We had never seen the system before that moment,” he said.

“We were in a theoretical vacuum and when we started it was all a mess. I started off with viewing nudity complaints and German hate speech before moving on to bullying, graphic violence and suicides.

“We were not getting through as much as the day team, so they began auditing us.

Facebook moderators have been affected by monitoring atrocities in Myanmar. (AFP Photo / K.M. Asad)

“The problem was there was no one to ask for advice in the evenings."

He said they were given quizzes as a training device because there was no proper training system in place. "The people advising us had only been there one month longer.”

Many of the moderators were former language students, on low wages, with no experience of dealing with traumatic content.

In one instance, he had to view an image of a motionless baby lying on the ground with a person’s foot on it.

It led to an argument with an auditor over what the breach was: whether or not the baby was alive.

“You begin losing your humanity,” he said.

Another case showed footage of a lorryload of people being unloaded at gunpoint and lined up next to a trench in the ground. Mr Gray said as the shooting started he had to continue watching until the end to ensure he made the right decision according to Facebook's guidelines.

“It was like we were ambulance first responders, you just think you can distance yourself from it,” Mr Gray said.

“We never talked about it. You are not allowed to talk about it, even with your family, they drill it into you. At work there’s no opportunity either.

"I asked for a one-to-one session with the wellness team, they are people outsourced too. I was depressed and there was no real acknowledgement of PTSD and that it was a real thing. They just told us to take a 10-minute breather if we saw something disturbing. There was no systematic acceptance that this stuff can build up in your head.”

The moderators were expected to have a 98 per cent accuracy record on the content they assessed, Mr Gray said.

“The atmosphere was not very good. We had to get a score of 98 per cent accuracy but when you are dealing with up to 1,000 tickets [pieces of content] a night of constantly new content coming through, and them changing the rules, it’s impossible,” he said.

He left the job in 2018 and it was in March 2019 that he first spoke out about the inadequate training offered to moderators. As he began reliving the images he had seen he became overwhelmed.

“I talked about how the training was inadequate, how there was nobody there to talk to about the bad experiences and I began dredging things up and I broke down,” he said.

“I was in public and just crying my eyes out. I was totally shocked at what had happened. I was totally bewildered and I went to see my doctor and I was diagnosed with PTSD.”

Dublin's High Court is set to hear the Facebook cases later this year.

Since then he has had counselling and seen therapists.

A year ago he lodged a legal action in Ireland's High Court against Facebook and CPL and is seeking compensation for the trauma he suffered.

“I’m just really angry that this has been done to us, that we weren’t allowed to talk to anyone and that this problem did not exist,” he said.

“If they had just acknowledged and managed the problem and put systems in place I could have continued to do the work.

“These big companies will not accept PTSD is a real thing. Since I first spoke out I have been contacted by so many people thanking me for highlighting the problem.

“There are many others waiting in the wings. People are scared to take action, Dublin is a small place and Facebook and CPL are massive employers.

“Once these cases are settled the floodgates will open. At the moment people are afraid over their future employment prospects due to the stigma attached to a court case.”

He has now been joined by a further 30 moderators across Europe who are pursuing legal action against Facebook.

CPL told The National it does not comment on individual cases but said the health and safety of its employees is its "top priority".

“We have many measures in place to ensure employee well-being, including unrestricted access to counselling services as well as a 24/7 on-call service,” it said.

“We operate a professional, safe and rewarding work environment and are very proud of the great work carried out by our team.”

Facebook said it provides extensive training and is implementing technical solutions to limit people’s exposure to graphic material.

“This is an important issue, and we are committed to getting this right,” it said.