
Microsoft sued by former employees who developed PTSD after reviewing violent content

Two of the company’s ex-employees sue over PTSD after reviewing child pornography

Two former Microsoft employees claim that the technology giant made them look at photos and videos “designed to entertain some of the most twisted and sick-minded people in the world.” The two say they developed post-traumatic stress disorder (PTSD) from reviewing the content as part of their jobs, and are now suing.


According to Courthouse News, Henry Soto and Greg Blauert were part of Microsoft’s online safety team, whose job was to determine which types of online content should be removed and when content should be reported to police. In that position, Soto and Blauert said they were required to view images of child pornography, murder, bestiality, and “indescribable sexual assault.” The work went on for years at Microsoft’s facility in King County, WA.

Last month, the two men filed a lawsuit against Microsoft, accusing the company of negligence, disability discrimination, and violation of the Consumer Protection Act. Both claim to be suffering from PTSD.

The suit alleges that Microsoft was negligent in its handling of the mental health of the employees on this team, even though the company extended programs and benefits to ensure the welfare of employees in its Digital Crimes Unit, which has similar responsibilities.

Such a case raises questions not only about the well-being and mental health of the people in these grueling roles, but also about how content platforms handle such material, both through the policies they enforce and the technology they develop to eliminate it.

For social networks, moderating media that may not be suitable for all audiences is essential to promising users a safe and positive environment, and policies that outline what’s okay to share are a good starting point.

Unfortunately, merely posting a list of rules at an entrance doesn’t ensure that users will always follow them. This is what makes technology that can automatically detect and flag such content necessary, and it’s already in use. In recent years, technology giants such as Facebook, Twitter, Google, and Microsoft have worked to develop automated cloud-based solutions to detect images of child sexual abuse distributed through their services.
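As an illustration of the general approach behind these systems, here is a minimal sketch of hash-based image matching, the broad technique used by tools such as Microsoft’s PhotoDNA. To be clear, this is not PhotoDNA or any vendor’s actual algorithm; it uses a simple “difference hash,” and the file paths and match threshold are hypothetical, chosen purely for illustration. The idea is the same, though: known abusive images are reduced to compact fingerprints, and new uploads are fingerprinted and compared against that list so that matches can be flagged automatically.

```python
# Illustrative sketch only: a simple "difference hash" matcher, not PhotoDNA.
# A real system would store hashes of known abusive images (never the images
# themselves) in a database, and use a far more robust perceptual hash.
from PIL import Image  # pip install Pillow


def dhash(path: str, size: int = 8) -> int:
    """Compute a 64-bit difference hash: shrink, grayscale, compare neighbors."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical file names for illustration.
known_hashes = {dhash("known_flagged_image.png")}

upload_hash = dhash("new_upload.png")
# Threshold of 10 bits (out of 64) is a guess; real systems tune this carefully.
if any(hamming(upload_hash, h) <= 10 for h in known_hashes):
    print("Flag for review")  # near-duplicate of known material
```

Because matching happens against fingerprints rather than the images themselves, a system like this can screen uploads at scale without storing illegal content, and without a human having to see every file.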

Of course, this kind of technology still requires human oversight, and that’s where people like Soto and Blauert come into play. But even with access to therapy and counseling, the work of closely examining hundreds of pieces of disturbing content will inevitably take a toll on a person’s psyche.

Essentially, it’s up to companies to fully understand what they’re asking of their employees and to empathize with their situations. Doing so can improve the conditions under which these employees provide a valuable service, and can spur innovative technology that minimizes our reliance on human involvement in content moderation.

Both plaintiffs and their wives seek damages for pain and suffering, economic damages, and treble damages under the Consumer Protection Act and the Washington Disability Discrimination Act.

Source: The Next Web, Courthouse News
