Propaganda, Hate Speech, Violence: The Working Lives Of Facebook's Content Moderators

The Verge's Casey Newton reported on the high-pressure work of Facebook content moderators. "Almost everyone I spoke with could vividly describe for me at least one thing they saw that continues to haunt them," he tells NPR's Scott Simon. (Iain Masterton / Getty Images/Canopy)

Without the work of social media moderators, your timelines and news feeds would feel a lot darker.

"There are people in the world who spend a lot of time just sort of uploading the worst of humanity onto Facebook and Instagram," Casey Newton, the Silicon Valley editor for The Verge, said in an interview with NPR's Scott Simon.

And the moderators contracted by Facebook are on the front lines of this fight. (Facebook is a financial supporter of NPR.)

In a recent article for The Verge titled "The Trauma Floor: The secret lives of Facebook moderators in America," a dozen current and former employees of one of the company's contractors, Cognizant, talked to Newton about the mental health costs of spending hour after hour monitoring graphic content.

In Newton's story, the moderators describe a low-paying, high-stress job that leaves little room to process the emotional toll of the work. Non-disclosure agreements, drawn up to protect employees from users who might be angry about a content decision, prevent the employees from talking about their work, according to Newton.

Despite continued promises from Facebook leadership to prioritize safety on the platform, the company has come under scrutiny for its failure to curb the spread of propaganda, hate speech and other harmful content. At the same time, it has also been accused of wielding too heavy a hand in censoring free speech.

One way Facebook has responded is by hiring a small army of mostly contract workers to manage the flood of flagged content. Worldwide, 15,000 moderators contracted by the company spend their workdays wading through racism, conspiracy theories and violence. As Newton notes in his story, that number is just short of half of the 30,000-plus employees Facebook had hired by the end of 2018 to work on safety and security.

"Every piece of content that gets reported on Facebook needs to be evaluated by a moderator to see if it breaks the rules or not," Newton said. "And if a moderator makes the wrong call more than a handful of times during the week, their job could be at risk."

That's difficult when the rules are ever-changing. "They're just under tremendous pressure to try to get it right, even though Facebook is changing those guidelines on a near daily basis to account for some nuance," he said.

Newton found that some workers are so disturbed by the content that they don't finish the required training to become a full-time moderator. Some moderators, he noted, went on to develop PTSD-like symptoms after leaving the company.

In one chilling account described by Newton, a Cognizant trainee using the pseudonym "Chloe" is asked to moderate a Facebook post in front of her fellow trainees.

"The video depicts a man being murdered," Newton writes. "Someone is stabbing him, dozens of times, while he screams and begs for his life. Chloe's job is to tell the room whether this post should be removed."

"Almost everyone I spoke with could vividly describe for me at least one thing they saw that continues to haunt them," he said.

Yet according to Newton, employees were regularly dissatisfied with the counseling opportunities available to them. "A common complaint of the moderators I spoke with was that the on-site counselors were largely passive, relying on workers to recognize the signs of anxiety and depression and seek help," writes Newton.

Their time is also managed down to the second, Newton said, leaving moderators with little time to reflect on the disturbing content they might see.

To track their time, the moderators must click a browser extension every time they leave their desk. They get two 15-minute breaks to use the bathroom and a 30-minute lunch break. They're also allotted nine minutes of "wellness time" per day, reserved for when they encounter particularly traumatizing content.

Perhaps the most surprising finding from his investigation, the reporter said, was that a majority of the employees he talked to started to believe some of the conspiracy theories they reviewed.

"The longer they looked at the kind of fringe conspiracies that get posted onto Facebook, the more they found themselves sympathetic to those ideas," he said. "I spoke to one man who told me that he no longer believes that 9/11 was a terrorist attack. I talked to someone else who said they had begun to question the reality of the Holocaust."

Some moderators knew the beliefs they were drifting toward were false, Newton said, "but they just kept telling me these videos are so persuasive and we see them all the time."

In a statement to NPR about Newton's story, Facebook spokesperson Carolyn Glanville said the company is committed to getting "such an important issue" right.

"We work with our partners to ensure they provide competitive compensation starting at $15 per hour, benefits, and a high level of support for their employees. We know there are going to be incidents of employee challenges or dissatisfaction that call our commitment into question, which is why we are taking steps to be extremely transparent about the expectations we have of our partners," the statement said.

Those steps include working to regularly audit its partners. Facebook also plans to invite partner employees to a summit to discuss these issues, according to the statement.

Glanville also noted that Facebook invited Newton to visit Cognizant's Phoenix office — an offer he accepted and detailed in his story.

Bob Duncan, who heads Cognizant's content moderation operations in North America, told Newton in response to his story that recruiters describe to applicants the types of graphic content they should expect to see on the job, and provide them with examples of such content.

"The intention of all that is to ensure people understand it," Duncan told Newton. "And if they don't feel that work is potentially suited for them based on their situation, they can make those decisions as appropriate."

Duncan also told Newton the company would investigate the safety and management issues raised by moderators.

Newton said he's glad to hear that Facebook is taking these issues seriously, but he has suggestions for steps he thinks the company should take. Topping the list: Raise the salary of moderators.

Currently, Newton noted, moderators at Cognizant earn about $4 an hour above the state's minimum wage. Those in Phoenix make just $28,800 per year. By comparison, the average Facebook employee's total compensation is $240,000, according to Newton's reporting.

"When you think of other people in these similar first responder-type roles — a police officer, a firefighter, a social worker — those folks are in many cases going to be making something more like $60,000 a year," he said.

Moderators are assessing crucial questions about speech and security, he said. "They're policing the terms of our public debate."

Karina Pauletti and Lynn Kim edited and produced this story for broadcast. Emma Bowman produced this story for digital.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Scott Simon is one of America's most admired writers and broadcasters. He is the host of Weekend Edition Saturday and is one of the hosts of NPR's morning news podcast Up First. He has reported from all fifty states, five continents, and ten wars, from El Salvador to Sarajevo to Afghanistan and Iraq. His books have chronicled character and characters, in war and peace, sports and art, tragedy and comedy.