Short Bytes: Two Microsoft content moderators have sued the company for making them view “toxic images and videos” of child pornography, murder, and bestiality, as a result of which they developed acute PTSD (post-traumatic stress disorder). They allege that the company failed to brief them about the psychological impacts of the job. Instead, the company ran a “Wellness Program” that advised employees to simply take “walks and smoke breaks.” In a statement sent to Fossbytes, a Microsoft spokesperson denied the claims.
The employees, Greg Blauert and Henry Soto, say that Microsoft neither warned them about the risks involved nor provided adequate psychological support. As a result, they have sued Microsoft “alleging negligence, disability discrimination and violations of the Consumer Protection Act.”
The filing states that Mr. Soto, who was “involuntarily transferred” to the team, and other moderators had “God Like” status that let them view any customer’s communications. In the course of this work, they were exposed to traumatic images and videos.
According to their allegations, Microsoft often simply advised them to take a walk, play video games, or have a smoke break to clear their heads. These measures were part of Microsoft’s Wellness Program.
Soto says that the job took a significant toll on him, and he developed nightmares and hallucinations. The other plaintiff, Blauert, claims to have suffered a breakdown in 2013. He is still being treated for “acute and debilitating PTSD.”
This case highlights that moderating such material isn’t a job suited to everyone. It also raises serious questions about the mental health of the people tasked with this work and the adequacy of the wellness programs companies offer them.
We have contacted Microsoft for further details, and we’ll update this article with the company’s response.
Update: Jan 14, 2017
In a statement issued to Fossbytes, a Microsoft spokesperson denied these claims. Here’s what Redmond had to say:
“We disagree with the plaintiffs’ claims. Microsoft takes seriously its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services, as well as the health and resiliency of the employees who do this important work.
Microsoft applies industry-leading technology to help detect and classify illegal imagery of child abuse and exploitation that are shared by users on Microsoft Services. Once verified by a specially trained employee, the company removes the imagery, reports it to the National Center for Missing & Exploited Children, and bans the users who shared the imagery from our services.
This work is difficult, but critically important to a safer and more trusted internet. The health and safety of our employees who do this difficult work is a top priority. Microsoft works with the input of our employees, mental health professionals, and the latest research on robust wellness and resilience programs to ensure those who handle this material have the resources and support they need, including an individual wellness plan. We view it as a process, always learning and applying the newest research about what we can do to help support our employees even more.”