The Emotional Toll Of Content Moderation

MICHEL MARTIN, HOST:

In recent years, consumers have become more interested in the working conditions of the people who make the products they use. So we're going to spend a few minutes talking about the job of monitoring content on major websites. Now, when you watch videos on YouTube or Google, you might not be thinking about what you didn't see. But there's an entire workforce dedicated to catching disturbing content as it hits the Internet.

Casey Newton has been writing about this in a series of articles for The Verge. That's a news site that looks at how technology affects people's lives. He's been writing about the impact on these workers of having to digest a steady diet of extreme and other disturbing images, usually without any training on how to protect themselves. I started our conversation by asking him to describe the kinds of things moderators have to see.

CASEY NEWTON: Some of it is relatively benign. It's stuff like spam, for example, or like phony listings on Google Maps. But as you note, a lot of it is really disturbing. At Google, they have what they call a queue that is dedicated exclusively to violent and extremist content. So the moderators in that queue are looking at terrorist propaganda. That might include murders or beheadings. It could include rapes. The other queue that seems to be the most disturbing for the folks that I've talked to has to do with child exploitation. And again, there's sort of a whole queue where people will be looking at videos of child abuse for, you know, one or more hours a day. So it gets really grim.

MARTIN: And you were - you spoke to a young woman - who you call Daisy in the piece. She started working at Google when she was 23, monitoring content related to child abuse. Could you talk a little bit about that? And she had no training for this, if I get your - if I understand. You know, she's not a social worker. She had no training in dealing with emotional distress.

NEWTON: That's right. She was a paralegal. And she saw a job listing online. And the job was called legal removals associate. And as it was described to her, she was going to determine whether there was anything that was in Google's search that, you know, needed to be removed because of the law. And what she quickly found out was that, you know, while they had told her that she might see some disturbing content, maybe an hour or two a week, over time, it became more and more a part of her job. And she would have to enter this war room and spend five or six hours a week looking at images and videos of child abuse. And as a result, after doing that work for about a year, she was diagnosed with post-traumatic stress disorder and wound up having to take medical leave from Google. And she was one of several members of her team who had to do that.

MARTIN: And what about some of the people who do the violent extremism queue? You were saying - you were reporting that many of these workers are immigrants from the Middle East. And why do they get these jobs? I guess because of language skills, right?

NEWTON: That's exactly right. So the site that I wrote about is in Austin, Texas. It's operated by Accenture. And they have recruited folks who have the language skills so that they can, you know, listen to the videos and determine whether they're terrorist propaganda or whether they're something more benign. But the big difference between Daisy and those contractors is that Daisy was making about $75,000 a year when she started. These contractors are making $18.50 an hour. That's about $37,000 a year. And, of course, they don't get any of those Google benefits or the nice perks. And so, you know, while there's probably never enough money to make this job really easy or fun, there are definitely different standards of care that are given to these Google employees based on whether they're full time or whether they're a contractor.

MARTIN: Well, I have to assume that you've approached the companies about this and said - and asked them, like, what - you know, what are you doing? And the reason I - you know, I feel like I can speak about this is that news organizations like this one have become increasingly attentive to the impact on their employees of being exposed to violence. So when you approached them about this, what did they say?

NEWTON: You know, what they will say is that they try to take good care of all of their people. They try to make mental health care resources available to them. So, for example, they have counselors on site. They give them like a hotline they can call, you know, if they ever want to talk to someone. And they basically just say that they're trying to do as best they can.

But, you know, I think the bigger story beyond their response recently has been that some of these employees are talking about organizing, particularly in some of the foreign countries where this work is taking place. Some workers have started to file class-action lawsuits. And so I think there is a kind of dawning awareness that this work is difficult. And, you know, it - and these companies aren't going to be able to just kind of say we're trying anymore. They're going to have to take further action.

MARTIN: What response have you gotten to these pieces?

NEWTON: You know, I mean, it's really been amazing. I mean, first of all, so many people have written to me just to say that they didn't know that human beings were actually doing this work. They assumed it was all automated. And so a lot of people sort of expressed gratitude that they, you know, actually knew that these were human beings doing this, right? It's their friends and neighbors. And it's folks right here in America that are doing this job. And then, of course, as you might imagine, I've heard from lots of moderators. So, you know, this year, I've talked to more than a hundred moderators who worked for companies big and small, basically every, you know, tech company that you can think of. And I've heard a really wide range of experiences.

And, you know, I should say there are some folks who are doing this work, and they feel proud of it. They feel proud of the fact that they're removing disturbing material from the Internet. They feel like they have a good job. But at the same time, the more folks I talk to, the more I just came to feel like there was something fundamentally unjust in the way that these systems are set up and that, particularly at the biggest companies, where the folks have, you know, some of the best salaries in the world, the best offices in the world, that's only possible because of the work of these very low-paid people who are often suffering long after they leave these jobs. That doesn't seem right to me.

MARTIN: That's Casey Newton, Silicon Valley editor at The Verge. Casey, thanks so much for talking to us.

NEWTON: It was my pleasure.

Transcript provided by NPR, Copyright NPR.
