The invisible labor that keeps Facebook clean
Social networking sites like Facebook and YouTube have a problem. In a way, it’s a good one to have.
The grandma problem.
As social networking has gone mainstream – in other words, “even Grandma is on Facebook” – the seedier side of the web has become a bigger and bigger problem. What happens if Grandma logs in to check out new family photos or videos, only to be bombarded with everything from violent car crashes to the vilest kinds of pornography? Not a good user-retention strategy.
Enter the content moderator. She makes sure the really icky stuff the Internet has to offer doesn’t show up next to photos of the grandkids. She is part of a massive workforce, one that a single expert estimates at more than 100,000 people around the world – roughly 14 times the size of Facebook’s own staff.
Adrian Chen wrote about content moderators for the November issue of Wired. His reporting took him to the Philippines, where outsourcing firms pay content moderators as little as $300 per month.
“What the companies told me was that people in the Philippines, because of the cultural connection to the U.S., were better-equipped to screen content for American and Western audiences,” Chen said.
But no content moderator is well-equipped for the sheer volume of vile content the job entails.
“People get a darker view of humanity,” Chen said, adding, “Seeing all this abnormal stuff all day gives you a twisted view of what’s really going on out there.”
The full article, including accounts of some of the terrifying content that moderators see, is at Wired.com.