Viewing images of self-harm and videos of killings several times a day became a way of life for Chris Gray.
The former Facebook moderator was responsible for assessing hundreds of pieces of inappropriate material every day, and it was not until years later that the psychological trauma hit him.
"I was sat in a cafe in Dublin talking about what I had witnessed and I just started crying," he told The National.
“All the horrific things I’d seen. I never realised at the time how much it had affected me. We were under pressure to meet targets and it was just a continuous, never-ending stream of material showing the worst of society. Watching a child kill herself or others being shot and tortured, I became numb to it for a time.”
Mr Gray, 54, was employed full-time as a community operations analyst for Facebook by recruitment firm CPL and was based at its tech hub in Dublin, Ireland.
The strain of the challenging work started taking its toll on him after six months and he began suffering nightmares, anxiety, mood swings and later a breakdown.
He received a diagnosis of post-traumatic stress disorder (PTSD) and is leading a pioneering compensation case against Facebook and CPL.
“I was getting depressed,” he said.
“I was becoming very hardened to what I was seeing and I could see my mood was changing. I just soldiered on, I didn’t realise that myself and others were being harmed by it.”
He joined Facebook’s team in 2017 when the company had just launched an evening shift for moderators to cope with an increase in demand.
After just eight days of training, he and 50 others were given a copy of Facebook's guidelines and began the difficult task of sifting through the thousands of reported posts.
“We had never seen the system before that moment,” he said.
“We were in a theoretical vacuum and when we started it was all a mess. I started off with viewing nudity complaints and German hate speech before moving on to bullying, graphic violence and suicides.
“We were not getting through as much as the day team, so they began auditing us.
“The problem was there was no one to ask for advice in the evenings."
He said they were given quizzes as a training device because there was no proper training system in place. "The people advising us had only been there one month longer.”
Many of the moderators were former language students, on low wages, with no experience of dealing with traumatic content.
In one instance, he had to view an image of a motionless baby lying on the ground with a person’s foot on it.
It led to an argument with an auditor over what the breach was: whether the baby was alive or not.
“You begin losing your humanity,” he said.
Another case showed footage of a lorryload of people being unloaded at gunpoint and lined up next to a trench in the ground. Mr Gray said as the shooting started he had to continue watching until the end to ensure he made the right decision according to Facebook's guidelines.
“It was like we were ambulance first responders, you just think you can distance yourself from it,” Mr Gray said.
“We never talked about it. You are not allowed to talk about it, even with your family, they drill it into you. At work there’s no opportunity either.
"I asked for a one-to-one session with the wellness team, they are people outsourced too. I was depressed and there was no real acknowledgement of PTSD and that it was a real thing. They just told us to take a 10-minute breather if we saw something disturbing. There was no systematic acceptance that this stuff can build up in your head.”
The moderators were expected to have a 98 per cent accuracy record on the content they assessed, Mr Gray said.
“The atmosphere was not very good. We had to get a score of 98 per cent accuracy but when you are dealing with up to 1,000 tickets [pieces of content] a night of constantly new content coming through, and them changing the rules, it’s impossible,” he said.
He left the job in 2018, and in March 2019 he first spoke out about the inadequate training offered to moderators. As he began reliving the images he had seen, he became overwhelmed.
“I talked about how the training was inadequate, how there was nobody there to talk to about the bad experiences and I began dredging things up and I broke down,” he said.
“I was in public and just crying my eyes out. I was totally shocked at what had happened. I was totally bewildered and I went to see my doctor and I was diagnosed with PTSD.”
Since then he has had counselling and seen therapists.
A year ago he lodged a legal action in Ireland’s high court against Facebook and CPL and is seeking compensation for the trauma he suffered.
“I’m just really angry that this has been done to us, that we weren’t allowed to talk to anyone and that they acted as if this problem did not exist,” he said.
“If they had just acknowledged and managed the problem and put systems in place I could have continued to do the work.
“These big companies will not accept PTSD is a real thing. Since I first spoke out I have been contacted by so many people thanking me for highlighting the problem.
“There are many others waiting in the wings. People are scared to take action, Dublin is a small place and Facebook and CPL are massive employers.
“Once these cases are settled the floodgates will open. At the moment people are afraid over their future employment prospects due to the stigma attached to a court case.”
He has now been joined by a further 30 moderators across Europe seeking legal action against Facebook.
CPL told The National it does not comment on individual cases but said the health and safety of its employees is its "top priority".
“We have many measures in place to ensure employee well-being, including unrestricted access to counselling services as well as a 24/7 on-call service,” it said.
“We operate a professional, safe and rewarding work environment and are very proud of the great work carried out by our team.”
Facebook said it provides extensive training and is implementing technical solutions to limit people’s exposure to graphic material.
“This is an important issue, and we are committed to getting this right,” it said.