  • Peter Kafka: I'm not going to do a long wind-up here, because I have a lot of questions

  • for my next guest.

  • I'm delighted she's here.

  • Please welcome Susan Wojcicki, CEO of YouTube.

  • They gave you a good hip-hop theme for your way in.

  • Susan Wojcicki: Thank you.

  • Thank you for coming.

  • Sure.

  • Thank you for having me.

  • I'm really glad we get to have this conversation.

  • I'm glad we get to do it in public, on a stage, on the record.

  • That's great.

  • Let's start here.

  • There was a bunch of news last week.

  • Some of it involved you.

  • Some of it involved vox.com, where I work.

  • There was a policy change.

  • I think they all sort of happened at the same time.

  • Can we just walk through what happened, and if they're parallel tracks, or if they were

  • connected?

  • Sure.

  • So, first of all, thank you.

  • A lot of things happened last week, and it's great to be here and talk about what happened.

  • But I do want to start, because I know that the decision that we made was very hurtful

  • to the LGBTQ community, and that was not our intention at all.

  • Should we just set context, for anyone who was not following this?

  • What decision this was?

  • Yeah.

  • So, let me ... I'll go into that.

  • But I thought it was really important to be upfront about that, and to say that was not

  • our intention, and we were really sorry about that.

  • But, I do want to explain why we made the decision that we did, as well as give information

  • about the other launch that we had going on.

  • Really, there were two different things that happened at the same time.

  • The first one I'll talk about is, we made a really significant change involving hate

  • speech.

  • This is something we had been working on for months, and we launched it on Wednesday of

  • last week.

  • And this is a series of policy changes you've been rolling out for years now.

  • So, just to be clear ... Yeah.

  • So, we've been making lots of different policy changes on YouTube.

  • We have made about 30 changes in the last 12 months, and this past week, we made a change

  • in how we handle hate speech.

  • That took months and months of work, and we had hundreds of people working on it.

  • That was a very significant launch, and a really important one.

  • What we did with that launch is we made a couple big changes.

  • One of them was to make it so that if there's a video that alleges that some race or religion

  • or gender or other protected group, is superior in some way, and uses that to justify discrimination

  • or exclusion, that would now no longer be allowed on our platform.

  • Similarly, if a video alleged that another group, defined by religion or race, was inferior,

  • and used that to justify discrimination in some way, that would no longer be allowed either.

  • Those were changes that we made.

  • So, examples would be like, “Race X is superior to Y, and therefore Y should be segregated.”

  • Is it weird to you that you had to make a rule that said, “This shouldn't be allowed”?

  • That this wasn't covered either by an existing rule?

  • That you had to tell your community, “Look.

  • This is not acceptable”?

  • Well, actually, a lot of this ... We're a global company, of course.

  • And so, if you look at European law, there are a number of countries that have a really

  • strong hate speech law.

  • And so, a lot of this content had never been allowed in those countries, but had actually

  • been allowed in the US and many other countries.

  • And so what we had done with that content a few years ago is we had limited its

  • features, meaning that it wasn't in the recommendations.

  • It wasn't monetized.

  • It had an interstitial in front of it to say that this was content that we found offensive.

  • And when we did that, we actually reduced the views to it by 80 percent.

  • So, we found that it was effective, but we really wanted to take this additional step,

  • and we made this step on Wednesday.

  • We also added, which is really important, a few other definitions to protected groups.

  • So, we added caste, because YouTube has become so significant in India.

  • Then, we also added victims of verified violent events.

  • So, like saying the Holocaust didn't happen, or Sandy Hook didn't happen, also became

  • violations of our policies.

  • And so, this was happening on Wednesday, and we launched it on Wednesday.

  • There were thousands of channels that were affected.

  • And again, this is something that we had been working on ...

  • This was coming already.

  • It was coming already.

  • We had started briefing reporters about it in Europe over the weekend, because they're

  • ahead.

  • You know, the train had left the station.

  • And then at the same time, on Friday, there was a video.

  • We heard the allegations from Mr. Carlos Maza, who uploaded a video on Twitter ...

  • Works at vox.com.

  • Who works at vox.com, yes.

  • With a compilation of different video pieces from Steven Crowder's channel, putting them

  • together, right?

  • And asked us to take action.

  • Each of these videos had harassment ...

  • Saying, “He's directing slurs at me, and the people who follow him are attacking me

  • outside of YouTube, as well.”

  • Yes.

  • So, he alleged that there was harassment associated with this, and we took a look at this.

  • You know, we tweeted back and we said, “We are looking at it.”

  • You know, Steven Crowder has a lot of videos, so it took some time for us to look at that

  • and to really understand what happened, and where these different snippets had come from

  • and see them in the context of the video.

  • Actually, one of the things I've learned is that whenever people say, “There's this video

  • and it's violative.

  • Take it down or keep it up,” you have to actually see the video, because context really,

  • really matters.

  • And so, we looked through a large number of these videos, and in the end we decided that

  • it was not violative of our policies for harassment.

  • So, were you looking at this yourself, personally?

  • Vox is a relatively big site.

  • It's a big creator.

  • Were you involved in this directly?

  • I mean, I am involved whenever we make a really important decision, because I want to be looking

  • at it.

  • So, you were looking at the videos.

  • Well, so we have many, many different reviewers.

  • Mm-hmm.

  • They will do a review.

  • Again, there are lots of different videos produced by Steven Crowder.

  • He's been a longtime YouTuber.

  • But in this case, did you weigh in personally?

  • Did you look at the stuff?

  • I mean, yes.

  • I do look at the videos, and I do look at the reports and the analysis.

  • Again, I want to say there were many videos, and I looked certainly at the compilation

  • video.

  • So, when the team said, “We believe this is non-violative.

  • This doesn't violate our rules,” you agreed with that?

  • Well, let me explain to you why.

  • Mm-hmm.

  • Why we said that.

  • But you agreed?

  • I agreed that that was the right decision, and let me explain to you why I agreed that

  • was the right decision.

  • Okay?

  • So, you know, when we got ... first of all, when we look at harassment and we think about

  • harassment, there are a number of things that we look at.

  • First of all, we look at the context.

  • Of, you know, “Was this video dedicated to harassment, or was it a one-hour political

  • video that had, say, a racial slur in it?”

  • Those are very different kinds of videos.

  • One that's dedicated to harassment, and one that's an hour long ... so, we certainly

  • looked at the context, and that's really important.

  • We also look and see, is this a public figure?

  • And then the third thing that we look at is, you know, is it malicious?

  • Right?

  • So, is it malicious with the intent to harass?

  • And for right or for wrong right now, malicious is a high bar for us.

  • So the challenge is, when we get an allegation like this, we take it incredibly seriously,

  • and I can tell you lots of people looked at it and weighed in.

  • We need to enforce those policies consistently.

  • Because if we were not to enforce it consistently, what would happen is there would be literally

  • millions of other people saying, “Well, what about this video?

  • What about this video?

  • What about this video?

  • And why aren't all of these videos coming down?”

  • And if you look at the content on the internet, and you look at rap songs, you look at late-night

  • talk shows, you look at a lot of humor, you can find a lot of racial slurs that are in

  • there, or sexist comments.

  • And if we were to take down every single one, that would be a very significant ...

  • So, let's stipulate that you take it seriously.

  • I want to come back to the idea that there's a ton of this stuff here.

  • Well, so what we did commit to ... and really, this is, I think, really important ... is we

  • committed, like, “We will take a look at this, and we will work to change the policies

  • here.”

  • We want to be able to ... when we change a policy, we don't want to be knee-jerk.

  • We don't want it to be like, “Hey, I don't like this video,” or, “This video is offensive.

  • Take it down.”