Posted: December 5th, 2019
A former Facebook content moderator today filed a legal action in the High Court over the psychological trauma he sustained from viewing disturbing and graphic content, in a case expected to be followed by compensation claims from other moderators.
Mr Chris Gray filed the workplace trauma compensation action, stating that he was expected to view a range of inappropriate content on a daily basis and filter out disturbing content with a 98% accuracy rating. Content labelled inappropriate included “various scenes of people dying in different accidents … set to a musical soundtrack. [Gray] had a long argument with the quality point of contact [a senior role] about whether the music meant that the person posting it was ‘celebrating’ or whether it just counted as disturbing content.”
The claimant said that he was traumatised and under an unacceptable amount of stress due to the nature of the content he viewed and his daily work targets. Mr Gray developed difficulty sleeping and said that he would regularly wake during the night due to nightmares. He stated: “It took me a year after I left to realise how much I’d been affected by the job. I don’t sleep well, I get in stupid arguments, have trouble focusing.”
Mr Gray, who is being represented by solicitor Diane Treanor of Coleman Legal Partners, is likely to be the first of a number of content moderators working with CPL Solutions and Facebook to file a compensation claim for trauma. Ms Treanor said that content moderators based in Berlin and Barcelona have also contacted her firm with an interest in joining a lawsuit. Mr Gray remarked: “If I can get them better working conditions, better care, then that also improves the quality of the content moderation decisions and the impact on society.”
Facebook released a statement which said: “We are committed to providing support for those that review content for Facebook as we recognise that reviewing certain types of content can sometimes be difficult. Everyone who reviews content for Facebook goes through an in-depth, multi-week training program on our Community Standards and has access to extensive psychological support to ensure their wellbeing. This includes 24/7 on-site support with trained practitioners, an on-call service, and access to private healthcare from the first day of employment. We are also employing technical solutions to limit their exposure to graphic material as much as possible. This is an important issue, and we are committed to getting this right.”
UK-based not-for-profit group Foxglove is supporting the court case and Director Cori Crider said: “The reason we’ve got involved is that we think that social media factory floors are unsafe and need to be cleared up. In a decade we’re going to look back on this as we did at meat packing plants at the turn of the century. Facebook’s only going to pay attention to things when they know that they’ve got a typhoon bearing down on them. What I’d like to see is the moderators realising how much power they have if they just organise. Because let’s face it, social media as we know it could not exist without the labour people like Chris provide.”