'The Cleaners' Looks At Who Cleans Up The Internet's Toxic Content

Nov 12, 2018
Originally published on November 12, 2018 9:05 pm

Thousands of content moderators work around the clock to ensure that Facebook, YouTube, Google and other online platforms remain free of toxic content. That can include trolling, sexually explicit photos or videos, violent threats and more.

Those efforts — run by both humans and algorithms — have been hotly contested in recent years. In April, Mark Zuckerberg testified before a congressional committee on how Facebook would work to reduce the prevalence of propaganda, hate speech and other harmful content on the platform.

"By the end of this year we're gonna have more than 20,000 people working on security and content review," Zuckerberg said.

The Cleaners, a documentary by filmmakers Hans Block and Moritz Riesewieck, seeks to get to the bottom of how, exactly, that work is done. The film follows five content moderators and uncovers what their jobs actually entail.

"I have seen hundreds of beheadings. Sometimes they're lucky that it's just a very sharp blade that's being used to them," one content moderator says in a clip from the film.

Block and Riesewieck explored more of the harsh realities that come along with being a content moderator in an interview with All Things Considered.


Interview Highlights

On a Facebook content moderator's typical day

They see all these things which we don't want to see online, on social media. That could be terror, that could be beheading videos like the ones the voice was talking about before. It could be pornography, it can be sexual abuse, it could be necrophilia, on one hand.

And on the other hand it could be content which could be useful for political debates, or to raise awareness about war crimes and so on. So they have to moderate thousands of pictures every day, and they need to be quick in order to reach the score for the day. ... It's sometimes up to 25,000 pictures a day. And then they need to decide whether to delete it or to let it stay up.

On Facebook's decision to remove the Pulitzer Prize-winning "napalm girl" photo

This content moderator decides that he would rather delete it because it depicts a young, naked child. So he applies the rule against child nudity, which is strictly prohibited.

So it is always necessary to distinguish between so many different cases. ... There are so many gray areas which remain, in which the content moderators sometimes told us they have to decide by their gut feelings.

On the weight of distinguishing harmful content from news images or art

It's an overwhelming job — it's so complex to distinguish between all these different kinds of rules. ... These young Filipino workers there receive three to five days of training, which is not enough to do a job like this.

On the impact of content moderators being exposed to toxic content daily

Many of the young people are highly traumatized because of the work.

The symptoms are very different. Sometimes people told us they are afraid to go into public places because they're reviewing terror attacks every day. Or they're afraid to have an intimate relationship with their boyfriend or girlfriend because they're seeing sexual abuse videos every day. So this is kind of the effect this work has. ...

Manila [the capital of the Philippines] was a place where analog toxic waste from the Western world was sent for years on container ships. And today the digital garbage is brought there. Now thousands of young content moderators in air-conditioned office towers are clicking through an infinite toxic sea of images and tons of intellectual junk.

Emily Kopp and Art Silverman edited and produced this story for broadcast. Cameron Jenkins produced this story for digital.

Copyright 2018 NPR. To see more, visit https://www.npr.org.

ARI SHAPIRO, HOST:

From violent threats to trolling, we are looking at toxic content on this month's All Tech Considered.

(SOUNDBITE OF MUSIC)

SHAPIRO: Companies like Google and Facebook often get asked how they decide what can stay up on their sites and what gets removed. Often they give answers like this.

(SOUNDBITE OF ARCHIVED RECORDING)

MARK ZUCKERBERG: By the end of this year, we're going to have more than 20,000 people working on security and content review.

(SOUNDBITE OF ARCHIVED RECORDING)

SUSAN WOJCICKI: There are content that we will remove if it violates our policy. And so we're in the process of having 10,000 people looking at controversial content.

SHAPIRO: That was YouTube CEO Susan Wojcicki speaking with Recode in February and, before that, Facebook CEO Mark Zuckerberg speaking before a congressional committee in April. Well, a documentary airing today on PBS looks at who those thousands of content reviewers are, deciding what we see and what we don't. The film is called "The Cleaners."

(SOUNDBITE OF FILM, "THE CLEANERS")

UNIDENTIFIED PERSON #1: I have seen hundreds of beheadings. Sometimes they are lucky that it's just a very sharp blade that's being used to them.

SHAPIRO: Directors Hans Block and Moritz Riesewieck join us now. Welcome.

HANS BLOCK: Hello.

MORITZ RIESEWIECK: Hello.

SHAPIRO: Just give us a typical job description. What's a day in the life of somebody who works as a moderator for one of these companies?

BLOCK: They see all these things which we don't want to see online on social media. That could be terror. That could be beheading videos like the ones the voice was talking about before. It could be pornography. It can be sexual abuse. It could be necrophilia on one hand. And on the other hand, it could be content which could be useful for political debates or to make awareness about war crimes and so on. So they have to moderate thousands of pictures every day. And they need to be quick in order to reach the score for the day.

SHAPIRO: Something like 25,000 images a day.

BLOCK: Exactly. It's sometimes up to so many pictures a day. And then they need to decide whether to delete it or to let it stay up. And that is called ignore.

(SOUNDBITE OF FILM, "THE CLEANERS")

UNIDENTIFIED PERSON #2: Ignore.

UNIDENTIFIED PERSON #3: Di-di (ph), di-di (ph)...

UNIDENTIFIED PERSON #4: Ignore.

UNIDENTIFIED PERSON #5: Ignore.

SHAPIRO: All of these tech companies use algorithms to weed out toxic content. And so why do they still need humans to do this job?

RIESEWIECK: Yeah. This is very interesting because when they talk about solutions, they sometimes offer that artificial intelligence will do the job in the future. And this is in a way not true because an algorithm can analyze what is in the picture, for example, a chair or a horse or a glass of water. But what is important to review content all over the world is to see the context of an image. For example, if you see an image with violence, it can be for several reasons - to document violence on social media. It can be for propaganda. So you need humans to see the context of an image.

SHAPIRO: But having a human doesn't solve the problem of controlling for context. You use an example of an image that many listeners will have seen from the Vietnam War that shows people fleeing a napalm strike, including a screaming naked girl. This image won a Pulitzer Prize. What do the content moderators do with it?

BLOCK: This content moderator, he decides that he would rather delete it because it depicts a young naked child. So he applies this rule against child nudity, which is strictly prohibited. So it is always necessary to distinguish between so many different cases, cases in which you should rather apply this rule or this rule. And there are so many gray areas which remain and which the content moderators sometimes told us they had to decide by their gut feelings.

SHAPIRO: Is it unfair to ask any human to try to distinguish between news images and terrorist propaganda, between fine art and pornography? I mean, these are debates that people have been having for as long as these things have existed.

RIESEWIECK: Absolutely. It's an overwhelming job. It's so complex to distinguish between all these different kinds of rules. But what we were facing when we researched in the Philippines is that these young Filipino workers there have a training from three to five days, which is not enough to do a job like this.

SHAPIRO: You explore the impact that looking at these appalling videos and images has on the people who spend all day doing that. And at one part of the film, a woman describes what happened after she viewed sexual content involving a child.

(SOUNDBITE OF FILM, "THE CLEANERS")

UNIDENTIFIED PERSON #6: I went straight to my team leader and told him that I can't do this. I really can't do this. I can't look at the child. But then he told me that I should do it because this is my job and I signed a contract for it.

SHAPIRO: What kind of impact does this work have on the people doing it?

RIESEWIECK: Many of the young people are highly traumatized because of the work. The symptoms are very different. Sometimes people told us they are afraid to go into public places because they're reviewing terror attacks every day or they're afraid to have an intimate relationship with his boy or girlfriend because they're seeing sexual abuse videos every day. So this is kind of the effect this work has.

SHAPIRO: The film raises a lot of provocative questions about free speech, about the impact of viewing these images on the people doing the work. What kinds of answers did the tech companies give you?

RIESEWIECK: Every interview request we sent to the companies, there was no answer at all. So that was really frustrating for us because there is something like a code of silence in the Silicon Valley. No one liked to talk about the insides of these companies.

SHAPIRO: Watching the movie, it felt almost dystopian to me. This idea of an underclass of people far, far away who spend all day handling the stuff that we literally cannot bring ourselves to look at, it just seemed so bleak.

BLOCK: Yeah, it is. And it's interesting because Manila was a place where the analog toxic waste was sent from the Western world, has been shipped there for years on container ships. And today, the digital garbage is brought there. Now thousands of young content moderators in air-conditioned office towers are clicking through the infinity toxic sea of images and tons of intellectual junk.

SHAPIRO: Hans Block and Moritz Riesewieck, thank you so much.

BLOCK: Thank you.

RIESEWIECK: Thank you.

SHAPIRO: They directed the documentary "The Cleaners," which airs on PBS tonight.

(SOUNDBITE OF MUSIC)

Transcript provided by NPR, Copyright NPR.