Facebook content moderators say they receive little support, despite company promises

Despite repeated pledges from Facebook to improve the poor workplace conditions of its content moderators, often contractors who spend their days reviewing graphic and violent images posted on the site, little has changed at the company, a former Facebook content moderator said in an interview that aired Monday on NBC Nightly News.

Josh Sklar, a former Accenture subcontractor based in Austin, Texas, who moderated content for Facebook and Instagram from September 2018 through March, said that working conditions for content moderators have barely improved and that they continue to review large volumes of frequently traumatic posts. Sklar was one of around 15,000 people who spend hours every day combing through the darker side of the social network, flagging posts that violate its content guidelines.

“They review, you know, maybe hundreds of pieces of content a day, and of that content, a really healthy percentage of that is bad stuff,” he said. “I mean, we talk about everything from hate speech to animal mutilation to videos of people committing suicide to child pornography.”

Sklar said he was expected to do this for more than six hours each day with few breaks as disturbing images poured into his queue. In one case, he said, a reconfigured algorithm that selects what moderators see resulted in him viewing “a lot more blood,” including a picture of the corpses of Palestinian women who had been killed in an explosion, “over and over” again.

“Sometimes you’ll catch yourself being desensitized to it and think, that doesn’t seem like a good thing,” he said. “I don’t really want to be numb to human suffering.”

Sklar is one of several content moderators who have spoken out in recent years. But Sklar said that speaking out has so far not led to much change. At one point, he said, he pushed back internally against a policy update that allowed images of animal mutilation to remain on the platform unflagged for months. Sklar said he repeatedly raised the issue with quality assurance staffers (QAs), who then brought it to Facebook.


Sklar said that even though a QA told him Facebook’s response was, “Oh, that’s not what’s supposed to be happening,” nothing changed in the short term.

For more, watch “NBC Nightly News with Lester Holt” tonight at 6:30 p.m. ET / 5:30 p.m. CT.

Sklar also said he was required to sign a nondisclosure agreement, which he said he never saw again, as well as a document warning that he could suffer from post-traumatic stress disorder. He said he was told he would be responsible for getting treatment for any such health problems.

Before leaving in March, Sklar wrote a memo about his experience and posted it on Workplace, the company’s internal communications tool. In it, he called the wellness program that is supposed to support moderators’ mental health “inadequate” and suggested that moderators be given more wellness time and access to therapy.

Facebook spokesman Drew Pusateri responded to Sklar’s account: “We appreciate the important work that content reviewers are doing to keep this content off our platform and often change and improve our policies based on their feedback.” Accenture said in a statement that the company makes the well-being of its employees “a top priority” and that “our employees have unrestricted access to 24/7 support that includes proactive, confidential, and needs-based advice.”

Repeated history

This isn’t the first time Facebook has been accused of abusing its content moderators.

In February 2019, Business Insider published an article in which moderators at the Austin facility where Sklar worked complained, in an internal letter posted on Workplace, that new job restrictions were costing them their “sense of humanity.” At the time, Facebook told Business Insider that the rules in question were not new and called the complaints a “misunderstanding” of policies that already existed. The company said it would address the employees’ concerns. Accenture referred Business Insider to Facebook for comment.

The following week, The Verge published an article reporting that Facebook content moderators in Phoenix, subcontracted through Cognizant – which has reportedly since exited the content moderation business – were suffering from mental health and trauma issues, received less than 10 minutes per day of “wellness time” to decompress after viewing hard content, and had “insufficient” coping resources, leading some of them to turn to drugs. A Facebook spokeswoman told The Verge that the claims “do not reflect the day-to-day experiences of most of its contractors, either at its Phoenix site or at its other sites around the world.”

In May 2020, Facebook settled a lawsuit, agreeing to pay $52 million to content moderators who claimed they had developed mental health conditions such as PTSD on the job, and to provide more mental health resources, such as monthly group therapy sessions and weekly one-on-one sessions.

Six months later, more than 200 content moderators, including Sklar, said in a letter to executives at Facebook, Accenture and CPL, another contracting firm, that the companies had “forced” them back into the office during the pandemic.

“Before the pandemic, content moderation was easily Facebook’s most brutal job. We waded through violence and child abuse for hours on end. Moderators working on child abuse content had targets increased during the pandemic, with no additional support,” the moderators wrote in the letter. “Now, on top of work that is psychologically toxic, holding onto the job means walking into a hot zone.”

At the time, Facebook told NPR that it had “exceeded health guidance on ensuring facilities are safe for in-office work” and was prioritizing the health and safety of its moderators. Accenture said it was gradually bringing employees back to its offices, but “only where it is urgently needed and only when we are confident we have put the right safety measures in place, in line with local regulations.” CPL told NPR that the workers’ roles were considered “essential” and “cannot be performed from home due to the nature of the work.”

But Cori Crider, co-founder of Foxglove, the advocacy organization for social media content moderators that published the letter, said Facebook could have done more.

“Facebook could absolutely afford to hire these people directly and treat them better,” Crider said. “You can’t have a healthy public space when the people you rely on to defend it are working in digital sweatshops.”
