A group of content moderators has taken legal action against Meta, alleging that their work reviewing and removing distressing material on social media caused them significant psychological harm.
The moderators are employed by Majorel, a company based in Accra that was contracted by Meta to screen content considered to violate community standards.
The workers report suffering from conditions such as depression, anxiety, insomnia, and substance abuse, which they attribute directly to the nature of their roles.
They said the mental health support provided by their employer was insufficient and that their pleas for help were ignored.
Teleperformance, the parent company of Majorel, has disputed these allegations, according to The Guardian.
This legal action follows a similar situation in Kenya, where more than 100 Facebook content moderators were diagnosed with severe post-traumatic stress disorder after being exposed to graphic content on the platform.
Social media companies rely on human moderators to remove offensive or harmful material and to help train automated systems designed to perform similar tasks.