The article discusses the role of content moderators for Facebook and Instagram, focusing on those employed by subcontractors such as Majorel in Ghana. These workers are tasked with reviewing and removing disturbing content, a job that has proven psychologically taxing: many report symptoms such as depression and anxiety. Legal actions are underway against these companies over inadequate mental health support and claims of unfair dismissal. Although companies like Teleperformance say they provide mental health resources, many moderators feel unsupported in handling the distressing content they encounter daily.
The vast majority of Facebook and Instagram users never see disturbing content depicting scenes of torture, child abuse or murder on these platforms thanks to content moderators.
Content moderators working for Majorel in Ghana's capital Accra told The Guardian they have suffered from depression, anxiety, insomnia, and substance abuse.
According to The Guardian, lawyers are currently preparing a lawsuit against Majorel, a company contracted by Meta to carry out content moderation.
British NGO Foxglove is now preparing a lawsuit on behalf of content moderators, alleging unlawful dismissals and inadequate psychological support.