A former Facebook content moderator from Kenya has filed a lawsuit against Facebook’s parent company, Meta. The worker has accused Meta of poor working conditions, including irregular pay and insufficient mental health support.
Daniel Motaung, the complainant in this case, worked for the company via Sama, a local outsourcing agency. He claims that Sama fired him shortly after he tried to form an employee union.
Poor working conditions for Facebook moderators
Motaung’s lawsuit claims that Sama, Facebook’s local partner in Kenya, has subjected its employees to poor working conditions. The Guardian reports that the first video Motaung moderated was a beheading. Motaung claims he now has severe PTSD and that his pay isn’t enough to cover his mental healthcare.
Motaung’s lawyer added: “If in Dublin, people can’t look at harmful content for two hours, that should be the rule everywhere.” The statement implies that Meta’s partner company hasn’t held to the same standard.
However, a Meta spokesperson told Reuters, “We take our responsibility to the people who review content for Meta seriously and require our partners to provide industry-leading pay, benefits, and support.” Sama, Facebook’s Kenyan partner, did not comment.
Questioning Facebook’s content moderation
On platforms like Facebook, heavy moderation happens behind the scenes. Without it, horrifying and disturbing content would likely spread across the platform. However, this moderation requires human input, and Facebook’s content moderators have to see things that nobody should have to see.
While AI handles some moderation, it is simply not enough on its own. Facebook and its parent company Meta claim that moderators receive industry-leading pay and mental health care. However, lawsuits like this suggest otherwise.
Last year, Facebook agreed to pay $85 million in settlements to content moderators in California. While the company didn’t admit to any wrongdoing, the health hazards of manual moderation were staring it in the face.
Now, another ex-Facebook content moderator has sued the company for the same reasons. The case is still in the initial stage but could reportedly have far-reaching consequences for Facebook in Kenya.