(dramatic music)

- Part of the job description was, "You will be part of a team that protects free speech online," which makes it seem very heroic. It felt like you were putting on a cape working at Google.

- Over the past year, I've been reporting on the lives of Facebook's content moderators in America, and they've told me about their low pay, their dire working conditions, and in some cases, the long-term mental health consequences of doing the work that they do. A content moderator is kind of like a police officer for the internet. If you ever see something that you think doesn't belong on a site and you report it, that report is gonna be reviewed by a human being. While a lot of what they see is really benign, like spam, for example, some of it's really disturbing. I'm talking about murder, terrorism, and child exploitation. Recently, I started seeking out people who did this kind of work for Google and YouTube. I wanted to see how their experiences compared to the ones I had heard about already. What I learned surprised me.

(dramatic music)

- Part of doing our job, and how they would make us feel better about it, was that, "You guys see this so other people don't have to see this."

(dramatic music)

- Over the course of my reporting, I talked to both people who worked at Google full time and people who had been hired on through third-party contractors. It became clear to me that no matter who hired you, doing this job over a long enough time period can cause significant mental health consequences. But it also became clear to me that there is a big difference in how Google employees get treated and how those third-party contractors get treated. Today, a former full-time Google employee named Daisy Soderberg-Rivkin is going on the record to talk about her experiences as a content moderator. She had access to all the perks and all the benefits that come with being a full-time Google employee. But at the end of the day, that didn't save her from the consequences of doing the job.

- I was a legal removals associate, which is a very fancy way of saying I was a content moderator at Google.

- Let's talk about what the job actually was. You show up, you have your orientation, you sit down at your computer, it's time to do your job. What is your job?

- You usually start your work by going through a queue. You're assigned to a queue based on either an issue area or a geographic area. I focused on the French market, because my first languages were French and English, and I also worked on our child sexual abuse imagery cases and our terrorism cases.

- And you were working primarily on web search, right?

- Yes. As in-house content moderators, we would usually handle more high-level, complex issues. Certain things that were very high volume, such as defamation and copyright, were typically sent over to contractors. They would then escalate to us if it was kind of a gray area, but if it was even a gray area for us, we would then escalate to our counsel. It was kind of levels of how specialized we were.

- At what point did you start to feel like you were seeing more disturbing stuff than you expected?

- Very early on. They said we would be analyzing child sexual abuse imagery, but I remember clearly, in parentheses, it said this kind of content would be limited to one to two hours per week. In reality, we were understaffed, so we would be in there sometimes five, six hours a week, which sounds like nothing, but it's actually...
- Oh, it sounds like a lot.

- It's a lot.

- Yeah, yeah. When did you first notice that doing this job was starting to affect your mental health?

- When I was walking around San Francisco, actually. I was with one of my friends and we saw a group of kids, toddlers, that were hanging on to one of those ropes so that they don't go far. I looked at them, and then I kind of blinked once, and suddenly I just had a flash of some of the images I had seen: children being tied up, children being raped, at that age. This is three, three years old. I kind of stopped, and I was blinking a lot, and my friend had to make sure I was okay, and I had to sit down for a second and I just exploded crying. She was like, "What just happened?" And I couldn't explain it to her. I just had these racing thoughts and then an instant panic attack. I was having nightmares, I wasn't sleeping, I had spent multiple days just crying in the bathroom. I was having all of these panic attacks. My work productivity just dipped. Finally, my manager was like, "Listen, we really need you to step up your productivity game." I just snapped, and I turned to him and I said, "Do you understand what we're looking at? We're not machines, we're humans. We have emotions, and those emotions are deeply scarred by looking at children being raped all the time and people getting their heads chopped off." It was like there was no escape. And yeah, I finally snapped, and they took that as, oh, she needs to take a second, she needs to breathe. And I said, "No, I need to leave." The free food, the nap pods, all these benefits, this doesn't mean anything if this is my day-to-day.

- Daisy helped me understand how hard this job is to do even when you work in the greatest office in the world. But the truth is that most people don't work in an office half that nice. One of the biggest properties Google has to moderate, of course, is YouTube. When it comes to YouTube, Google has decided to give most of the work of content moderation to third-party contractors. Recently, I went to Austin, Texas, to meet with a group of moderators who work for Accenture on the YouTube project. Specifically, they work on what is called the VE queue, VE standing for violent extremism. 120 times a day, they review YouTube videos that contain terrorism, graphic violence, and other disturbing content. You're about to hear from one of them, and we've altered the audio to protect their identity.

- [Moderator] So, at the beginning, they told you to watch some videos. You're going to take some actions. You will apply the YouTube policies. But you don't feel how this is going to impact you.

- In some ways, the content moderators who do this work for Google and YouTube are treated better than the ones who work for Facebook. Most prominently, they get two hours of break time each day. Basically, two hours of paid leave in which they can recover from the challenges of doing this work. But most of them aren't able to take a full two hours a day.

- [Moderator] They're forcing you, micromanaging you, to have to be sitting at the desk five hours and a half. And if you don't, there are going to be punishments. The schedules will be changed. You will be on night shift. And this is going to affect my wellness time. I will never take my three hours.

(dramatic music)

- [Casey] What kind of things do they do that make life hard?

- [Moderator] They always have complaints about everyone. You know, like, I have something on you.
If you make any problems, you know what? This is the reason that I can fire you.

- [Casey] Right, right.

- [Moderator] One of the things that they're always saying is, if we miss one agent tomorrow, we get another 10.

- [Casey] So they're constantly reminding you