This link has been bookmarked by 98 people. It was first bookmarked on 23 Oct 2014, by someone privately.
-
13 Jul 16
-
So companies like Facebook and Twitter rely on an army of workers employed to soak up the worst of humanity in order to protect the rest of us.
-
[Hemanshu Nigam] estimates that the number of content moderators scrubbing the world’s social media sites, mobile apps, and cloud storage services runs to “well over 100,000”—that is, about twice the total head count of Google and nearly 14 times that of Facebook.
-
This work is increasingly done in the Philippines. A former US colony, the Philippines has maintained close cultural ties to the United States, which content moderation companies say helps Filipinos determine what Americans find offensive.
-
[Ryan Cardeno] was offered $312 per month by another firm to moderate content for Facebook, paltry even by industry standards.
-
It’s something of a surprise that Whisper would let a reporter in to see this process. When I asked Microsoft, Google, and Facebook for information about how they moderate their services, they offered vague statements about protecting users but declined to discuss specifics.
-
“I think if there’s not an explicit campaign to hide it, there’s certainly a tacit one.”
-
Companies would prefer not to acknowledge the hands-on effort required to curate our social media experiences, Roberts says. “It goes to our misunderstandings about the Internet and our view of technology as being somehow magically not human.”
-
While a large amount of content moderation takes place overseas, much is still done in the US, often by young college graduates like Swearingen was. Many companies employ a two-tiered moderation system, where the most basic moderation is outsourced abroad while more complex screening, which requires greater cultural familiarity, is done domestically.
-
“Everybody hits the wall, generally between three and five months,” says a former YouTube content moderator I’ll call Rob. “You just think, ‘Holy shit, what am I spending my day doing? This is awful.’”
-
For the first few months, Rob didn’t mind his job moderating videos at YouTube’s headquarters in San Bruno. His coworkers were mostly new graduates like himself, many of them liberal arts majors just happy to have found employment that didn’t require a hairnet.
-
“If someone was uploading animal abuse, a lot of the time it was the person who did it. He was proud of that,” Rob says. “And seeing it from the eyes of someone who was proud to do the fucked-up thing, rather than news reporting on the fucked-up thing—it just hurts you so much harder, for some reason. It just gives you a much darker view of humanity.”
-
YouTube employs counselors whom moderators can theoretically talk to, but Rob had no idea how to access them. He didn’t know anyone who had. Instead, he self-medicated. He began drinking more and gained weight.
-
Given that content moderators might very well comprise as much as half the total workforce for social media sites, it’s worth pondering just what the long-term psychological toll of this work can be.
-
“It’s like PTSD.”
-
“There is a memory trace in their mind.”
-
But even with the best counseling, staring into the heart of human darkness exacts a toll. Workers quit because they feel desensitized by the hours of pornography they watch each day and no longer want to be intimate with their spouses. Others report a supercharged sex drive. “How would you feel watching pornography for eight hours a day, every day?” Denise says. “How long can you take that?”
-
Constant exposure to videos like this has turned some of Maria’s coworkers intensely paranoid. Every day they see proof of the infinite variety of human depravity. They begin to suspect the worst of people they meet in real life, wondering what secrets their hard drives might hold.
-
-
12 Jul 16
-
21 Dec 15
-
15 Nov 15
-
28 Oct 15
-
12 Oct 15
-
02 Sep 15
-
22 Aug 15
bnewcomer
The campuses of the tech industry are famous for their lavish cafeterias, cushy shuttles, and on-site laundry services. via Pocket
-
25 Jun 15
-
11 Jun 15
-
09 May 15
-
20 Jan 15
-
[Grandparents] won’t continue to log on if they find their family photos sandwiched between a gruesome Russian highway accident and a hardcore porn video. Social media’s growth into a multibillion-dollar industry, and its lasting mainstream appeal, has depended in large part on companies’ ability to police the borders of their user-generated content—to ensure that Grandma never has to see images like the one Baybayan just nuked.
-
The number of content moderators scrubbing the world’s social media sites, mobile apps, and cloud storage services runs to “well over 100,000”—that is, about twice the total head count of Google and nearly 14 times that of Facebook.
-
Sarah Roberts, a media studies scholar at the University of Western Ontario, is one of the few academics who study commercial content moderation. Companies would prefer not to acknowledge the hands-on effort required to curate our social media experiences, she says.
-
Whisper practices “active moderation,” an especially labor-intensive process in which every single post is screened in real time.
-
His coworkers were mostly new graduates like himself, many of them liberal arts majors just happy to have found employment that didn’t require a hairnet
-
Moderators were instructed to leave such “newsworthy” videos up with a warning, even if they violated the content guidelines. But the close-ups of protesters’ corpses and street battles were tough for Rob and his coworkers to handle.
-
Given that content moderators might very well comprise as much as half the total workforce for social media sites, it’s worth pondering just what the long-term psychological toll of this work can be.
-
[Stevenson founded an] occupational health consultancy, Workplace Wellbeing, focused on high-pressure industries. She has since advised social media companies in the UK and found that the challenges facing their content moderators echo those of child-pornography and anti-terrorism investigators in law enforcement.
-
But where law enforcement has developed specialized programs and hires experienced mental health professionals, Stevenson says that many technology companies have yet to grasp the seriousness of the problem.
-
“It’s like PTSD,” she tells me as we sit in her office above one of the city’s perpetually snarled freeways. “There is a memory trace in their mind.”
-
-
16 Dec 14
-
25 Nov 14
-
08 Nov 14
-
05 Nov 14
-
Annika Pissin
Inside the soul-crushing world of content moderation, where low-wage laborers soak up the worst of humanity, and keep it off your Facebook feed.
-
04 Nov 14
-
02 Nov 14
-
30 Oct 14
Greg Linch
See @AdrianChen's article here: http://t.co/0SC8Ua4jTF #newtopics
-
29 Oct 14
Aurialie Jublin
"So companies like Facebook and Twitter rely on an army of workers employed to soak up the worst of humanity in order to protect the rest of us. And there are legions of them—a vast, invisible pool of human labor. Hemanshu Nigam, the former chief security officer of MySpace who now runs online safety consultancy SSP Blue, estimates that the number of content moderators scrubbing the world’s social media sites, mobile apps, and cloud storage services runs to “well over 100,000”—that is, about twice the total head count of Google and nearly 14 times that of Facebook."
-
-
So companies like Facebook and Twitter rely on an army of workers employed to soak up the worst of humanity in order to protect the rest of us. And there are legions of them—a vast, invisible pool of human labor. Hemanshu Nigam, the former chief security officer of MySpace who now runs online safety consultancy SSP Blue, estimates that the number of content moderators scrubbing the world’s social media sites, mobile apps, and cloud storage services runs to “well over 100,000”—that is, about twice the total head count of Google and nearly 14 times that of Facebook.
This work is increasingly done in the Philippines. A former US colony, the Philippines has maintained close cultural ties to the United States, which content moderation companies say helps Filipinos determine what Americans find offensive. And moderators in the Philippines can be hired for a fraction of American wages. Ryan Cardeno, a former contractor for Microsoft in the Philippines, told me that he made $500 per month by the end of his three-and-a-half-year tenure with outsourcing firm Sykes. Last year, Cardeno was offered $312 per month by another firm to moderate content for Facebook, paltry even by industry standards.
-
But as months dragged on, the rough stuff began to take a toll. The worst was the gore: brutal street fights, animal torture, suicide bombings, decapitations, and horrific traffic accidents. The Arab Spring was in full swing, and activists were using YouTube to show the world the government crackdowns that resulted. Moderators were instructed to leave such “newsworthy” videos up with a warning, even if they violated the content guidelines. But the close-ups of protesters’ corpses and street battles were tough for Rob and his coworkers to handle. So were the videos that documented misery just for the sick thrill of it.
“If someone was uploading animal abuse, a lot of the time it was the person who did it. He was proud of that,” Rob says. “And seeing it from the eyes of someone who was proud to do the fucked-up thing, rather than news reporting on the fucked-up thing—it just hurts you so much harder, for some reason. It just gives you a much darker view of humanity.”
-
Constant exposure to videos like this has turned some of Maria’s coworkers intensely paranoid. Every day they see proof of the infinite variety of human depravity. They begin to suspect the worst of people they meet in real life, wondering what secrets their hard drives might hold. Two of Maria’s female coworkers have become so suspicious that they no longer leave their children with babysitters. They sometimes miss work because they can’t find someone they trust to take care of their kids.
Maria is especially haunted by one video that came across her queue soon after she started the job. “There’s this lady,” she says, dropping her voice. “Probably in the age of 15 to 18, I don’t know. She looks like a minor. There’s this bald guy putting his head to the lady’s vagina. The lady is blindfolded, handcuffed, screaming and crying.”
The video was more than a half hour long. After watching just over a minute, Maria began to tremble with sadness and rage. Who would do something so cruel to another person? She examined the man on the screen. He was bald and appeared to be of Middle Eastern descent but was otherwise completely unremarkable. The face of evil was someone you might pass by in the mall without a second glance.
After two and a half years on the cloud storage moderation team, Maria plans to quit later this year and go to medical school. But she expects that video of the blindfolded girl to stick with her long after she’s gone. “I don’t know if I can forget it,” she says. “I watched that a long time ago, but it’s like I just watched it yesterday.”
-
-
-
-
“I think if there’s not an explicit campaign to hide it, there’s certainly a tacit one,” says Sarah Roberts, a media studies scholar at the University of Western Ontario and one of the few academics who study commercial content moderation. Companies would prefer not to acknowledge the hands-on effort required to curate our social media experiences, Roberts says. “It goes to our misunderstandings about the Internet and our view of technology as being somehow magically not human.”
-
While a large amount of content moderation takes place overseas, much is still done in the US, often by young college graduates like Swearingen was. Many companies employ a two-tiered moderation system, where the most basic moderation is outsourced abroad while more complex screening, which requires greater cultural familiarity, is done domestically. US-based moderators are much better compensated than their overseas counterparts: A brand-new American moderator for a large tech company in the US can make more in an hour than a veteran Filipino moderator makes in a day. But then a career in the outsourcing industry is something many young Filipinos aspire to, whereas American moderators often fall into the job as a last resort, and burnout is common.
-
Given that content moderators might very well comprise as much as half the total workforce for social media sites, it’s worth pondering just what the long-term psychological toll of this work can be.
-
Denise and her team set up extensive monitoring systems for their clients. Employees are given a battery of psychological tests to determine their mental baseline, then interviewed and counseled regularly to minimize the effect of disturbing images. But even with the best counseling, staring into the heart of human darkness exacts a toll. Workers quit because they feel desensitized by the hours of pornography they watch each day and no longer want to be intimate with their spouses. Others report a supercharged sex drive.
-
-
27 Oct 14
Wessel van Rensburg
Deeply disturbing: "content moderators might comprise as much as half the total workforce for social media sites". http://t.co/aCr3wlKeYQ
the number of content moderators scrubbing the world’s social media sites is twice the total head count of Google--->http://t.co/VSxlpbwyaF
@wildebees is it related to this? https://t.co/AMwOTQZhou
-
26 Oct 14
-
25 Oct 14
-
Daisy PhD
.@phdaisy Remember our conversation about PTSD and researchers? Content moderators, too: http://t.co/iFIQxNRlmk via @WIRED
-
24 Oct 14
-
Jorge Barba
The human toll of content moderation http://t.co/jiLSh4YcJj
-
Marcel Weiss
"So companies like Facebook and Twitter rely on an army of workers employed to soak up the worst of humanity in order to protect the rest of us. And there are legions of them—a vast, invisible pool of human labor. Hemanshu Nigam, the former chief ...
-
-
-
“It goes to our misunderstandings about the Internet and our view of technology as being somehow magically not human.”
-
-
Weiye Loh
Baybayan is part of a massive labor force that handles “content moderation”—the removal of offensive material—for US social-networking sites. As social media connects more people more intimately than ever before, companies have been confronted with the Grandma Problem: Now that grandparents routinely use services like Facebook to connect with their kids and grandkids, they are potentially exposed to the Internet’s panoply of jerks, racists, creeps, criminals, and bullies. They won’t continue to log on if they find their family photos sandwiched between a gruesome Russian highway accident and a hardcore porn video. Social media’s growth into a multibillion-dollar industry, and its lasting mainstream appeal, has depended in large part on companies’ ability to police the borders of their user-generated content—to ensure that Grandma never has to see images like the one Baybayan just nuked.
-
23 Oct 14
-
Javier Pastor
The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed | WIRED http://t.co/rc3W6xdUqg
— Javier Pastor (@javipas) October 23, 2014
-
Dave Ebpob
This work is increasingly done in the Philippines. A former US colony, the Philippines has maintained close cultural ties to the United States, which content moderation companies say helps Filipinos determine what Americans find offensive. And moderators in the Philippines can be hired for a fraction of American wages. Ryan Cardeno, a former contractor for Microsoft in the Philippines, told me that he made $500 per month by the end of his three-and-a-half-year tenure with outsourcing firm Sykes. Last year, Cardeno was offered $312 per month by another firm to moderate content for Facebook, paltry even by industry standards.
Here in the former elementary school, Baybayan and his coworkers are screening content for Whisper, an LA-based mobile startup—recently valued at $200 million by its VCs—that lets users post photos and share secrets anonymously. They work for a US-based outsourcing firm called TaskUs. It’s something of a surprise that Whisper would let a reporter in to see this process. When I asked Microsoft, Google, and Facebook for information about how they moderate their services, they offered vague statements about protecting users but declined to discuss specifics. Many tech companies make their moderators sign strict nondisclosure agreements, barring them from talking even to other employees of the same outsourcing firm about their work.
-
Tim McCormick
Wow. The psychological tolls of content moderation, with companies from the #Philippines http://t.co/ZTRRTOu2jZ
-
Beto Borbolla
RT @AdrianChen: An invisible army of outsourced moderators cleans the internet of porn and gore. My story for @WIRED http://t.co/K2h1K6UgTo