On Thursday, authorities in Canada announced the bust of an enormous international child pornography operation. It was the end of a three-year investigation into a website that trafficked in illicit videos of young boys. More than 300 people have been arrested in connection with the videos, 76 of them in the United States.

Although busts like this one end with press conferences and high-profile trials, they begin far away from the public eye, with one of the most difficult jobs in the world: content moderation.

The rise of Internet porn has created a shadow industry of people whose job it is to screen vast numbers of images for child pornography.

Richard Brown knows how difficult it can be to see this kind of content. He used to be in charge of the Internet Crimes Against Children task force for the New Jersey State Police. Part of his job was to look through the hard drives of suspects, image by image. And it was hard to forget what he saw.

"I have 2 boys," he says, "and I remember being ultra-protective of my boys during the time that I was involved in this type of work, and I think that's pretty common."

Now, Brown is a law enforcement liaison for the International Center for Missing and Exploited Children. That organization is developing a program that can help police departments automate the reviewing of images of child sexual abuse.

Internet search providers like Google and Microsoft are also investing in similar programs. Samantha Doerr, the director of public affairs and child protection at Microsoft, explains that automation is important because "unlike any other kind of offensive content online, the image itself is a crime scene, and every new viewing of that image is a re-victimization of that child."

The Microsoft system, known as PhotoDNA, was co-developed by a team at Dartmouth and has since been donated to the National Center for Missing and Exploited Children. Earlier this year, Twitter began using it to scan every photo that's uploaded to its site.

The program works by scanning known images of child pornography and giving them a unique signature that goes into a database. If that image appears on another site, it is instantly flagged and removed. Google has its own proprietary tagging system that works in a similar way.
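PhotoDNA's actual algorithm is proprietary, but the signature-and-database idea described above can be illustrated with a much simpler stand-in: an average hash, which reduces an image to a short bit string that survives small edits like recompression. All function names below are hypothetical; this is a minimal sketch of the general technique, not Microsoft's or Google's implementation.

```python
# Illustrative sketch only: PhotoDNA's real algorithm is proprietary.
# Here, a simple "average hash" turns a grayscale pixel grid into a
# bit-string signature; known signatures live in a database, and new
# uploads are flagged if their signature is close to a known one.

def average_hash(pixels):
    """Turn a grid of grayscale values (0-255) into a bit string.

    Each bit records whether a pixel is brighter than the image's
    mean, so minor changes flip only a few bits.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Count differing bits between two equal-length signatures."""
    return sum(x != y for x, y in zip(a, b))

# The database of signatures for known, human-reviewed images.
known_signatures = set()

def register_known_image(pixels):
    known_signatures.add(average_hash(pixels))

def is_flagged(pixels, max_distance=2):
    """Flag an upload whose signature is near any known signature."""
    sig = average_hash(pixels)
    return any(hamming(sig, known) <= max_distance
               for known in known_signatures)

# Usage: register a known image, then test a slightly altered copy.
original = [[10, 200], [220, 30]]
register_known_image(original)
recompressed = [[12, 198], [219, 33]]   # near-duplicate upload
print(is_flagged(recompressed))  # True: matches within tolerance
```

The key design point, which the article describes, is that matching happens against signatures rather than the images themselves, so previously identified material can be caught automatically without anyone viewing it again.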

But in order for an image to be identified as child pornography in the first place, a person has to review it. The people who do that work for tech companies are employed all over the world, and very little is known about them, says Sarah Roberts of Western University in Ontario, Canada.

Roberts studies the workers who are part of the new content moderation industry, and she explains that one reason so little is known about them is that most companies require their employees to sign nondisclosure agreements.

"They're precluded from speaking to the media, and it is difficult to reach out and find them," Roberts says. "I think there's an aspect of trauma that can often go along with this work and many workers would rather go home and tune out not talk about it. So I think the unknown aspect of this is by design. It's no mistake that it's difficult to find workers who will talk to you about this."

Many of the workers Roberts has spoken to anonymously have said they feel stigmatized because of the content they come in contact with through their jobs.

"It's exacting a toll on these workers, and because this industry is so new and the need for this work is so new, I think the jury is out as to what the real implications are going to be for these people later on in their life," she says.

But the demand for content moderators is only growing. In March, the eight tech companies that belong to the Technology Coalition, an industry group fighting online child exploitation, came out with guidelines for how to support employees who come in contact with child pornography as part of their jobs.

The guidelines suggest employees take their minds off traumatic content by, for example, taking a 15-minute walk or engaging in a hobby. The guidelines also say companies should have a counseling hotline for employees.

Roberts says providing resources may not be enough.

"If someone were to access some of these support services," she says, "there may be an implicit suggestion that they're not cut out for the kind of work they're trying to do for a living."

Copyright 2015 NPR. To see more, visit http://www.npr.org/.

Transcript

ARUN RATH, HOST:

It's ALL THINGS CONSIDERED from NPR News. I'm Arun Rath.

On Thursday, authorities in Canada announced they had busted an enormous international child pornography operation. It was the end of a three-year investigation into a website that trafficked in illicit videos of young boys. Three hundred and forty-eight people have been arrested in connection with the videos, 76 of them in the U.S.

Investigations like this end with press conferences and high-profile trials, but they begin far away from the public eye with one of the most difficult jobs in the world.

NPR's Rebecca Hersher reports. But first, a warning. This story contains content that some listeners may find disturbing.

REBECCA HERSHER, BYLINE: Rich Brown knows how to crack a child pornography case. For decades, he was a cop for the New Jersey State Police. He worked on the Internet crimes against children task force. And part of his job was to look through suspects' hard drives, image by image, which meant he saw some terrible, terrible things.

RICHARD BROWN: You know, there's - I think there's certain things that are more difficult. I do remember seeing the first molestation video. It was a husband that was molesting his two daughters.

HERSHER: When he went home at night, it was difficult for Brown to forget what he had seen.

BROWN: I have two boys, and I remember being ultra-protective of my boys during the time that I was involved with this type of work, and I think that's pretty common.

HERSHER: Brown says this kind of work can be traumatic. Some officers had trouble sleeping or marriage problems. But cops are no longer the only people getting paid to review images of child pornography. The rise of Internet porn has created a shadow industry of people working for tech companies like Google, Microsoft and Facebook. They spend eight hours a day, 40 hours a week looking at pictures and videos and asking themselves, is this child pornography?

SARAH ROBERTS: This is a shadow industry by design. So you're not going to find an association of commercial content moderators publishing statistics and necessarily making themselves known in that way. There's value to the invisibility.

HERSHER: That's Sarah Roberts of Western University. She's one of the few people who systematically studies those who do this work. She says even though the Internet couldn't function without content moderators, very few have heard of them, even within the tech industry.

ROBERTS: A lot of them are under nondisclosure agreements, so they're precluded from speaking to the media, and it is difficult to reach out and find them. And I think there's an aspect of trauma that can often go along with this work, and many workers would rather go home and tune out and not talk about it. So I think the unknown aspect of this work is certainly by design. It's no mistake that it's difficult to find workers who will talk to you about this.

HERSHER: Microsoft and Google both declined to put me in touch with any of the people who review images for their services. Samantha Doerr is the director of child protection at Microsoft.

SAMANTHA DOERR: It's a yucky job. In fact, Microsoft has to invest in, you know, wellness programs for people that work on this.

HERSHER: In March, Microsoft and eight other tech companies came out with guidelines for such wellness programs. They suggest employees take their minds off traumatic content by, for example, taking a 15-minute walk or engaging in a hobby. Many companies have a counseling hotline for employees. But Sarah Roberts says that may not be enough.

ROBERTS: We're also talking about a culture in which your success at your job is directly tied to your ability to stomach this imagery. So if someone were to access these kinds of support services, there may be an implicit suggestion that they're not cut out for the work that they're trying to do for a living.

HERSHER: Doerr says Internet service providers are investing in ways to decrease the number of people who need to do this kind of work, and with good reason.

DOERR: Unlike any other kind of, you know, offensive or illegal content online, the image itself is a crime scene, and every new viewing of that image is a re-victimization of that child.

HERSHER: Scott Rubin is a spokesman for Google.

SCOTT RUBIN: When we become aware of one of those images, we will create this digital signature. It's like a fingerprint. It's a series of ones and zeroes unique to that particular image.

HERSHER: In an effort to automate the process, Microsoft and Google have both created such software. Tagged pictures are sent to a database at the National Center for Missing and Exploited Children, which coordinates with law enforcement. And earlier this year, Twitter also began using the Microsoft program to screen every photo posted to that site.

But despite the software advances, the need for content moderators is still growing. And Sarah Roberts says that workers she's spoken to often feel stigmatized.

ROBERTS: They've said things to me like, you don't want an Internet without our interventions. But at the same time it's exacting a toll on these workers. And because this industry is so new and the need for this work is so new, I think the jury is out as to what the real implications are going to be for these people later on in their life.

HERSHER: And as demand grows, companies are looking to cheap foreign workers to do some of the most difficult work. Rebecca Hersher, NPR News.

