
  • How did you first find out that there were deepfakes of you?

  • My husband actually told me.

  • Because he is friends with Ashton Kutcher.

  • So he actually told him like, “Oh, by the way, there are these things called deepfakes

  • and your wife is one of them.”

  • Deepfakes use machine learning to fabricate events that never happened

  • like Bill Hader shape-shifting:

  • And I said, Seth Rogen was like, “It was amazing!

  • He has like, a bike track in his back yard!

  • It's phenomenal.”

  • And I did a Seth Rogen impression

  • And it was like I did a magic trick, Tom Cruise was like, “Ahoooh!”

  • And there's growing concern that in the wrong hands this technology

  • can pose a very real national security threat.

  • could impact the 2020 election

  • could become a real and present danger to our democracy, Dana.

  • But the most pressing, daily threat of deepfakes isn't politics.

  • It's porn.

  • People are hijacking women's faces to make porn videos they never consented to be in.

  • Which is why this kind of deepfake is harmful, even when people know it's not real.

  • I was just shocked.

  • Because this is my face, it belongs to me!

  • In a September 2019 survey, researchers at Deeptrace found that of the deepfake videos

  • they could identify online, 96% were pornographic.

  • And that more or less 100 percent of these videos are of women and are not consensual.

  • Pornhub and other big porn streaming sites have policies banning deepfakes,

  • though they don't seem to be enforcing them.

  • Mostly, these videos show up on separate sites dedicated to this kind of abuse.

  • It's not just celebrities anymore.

  • Not that celebrities feel any less pain from these videos.

  • But the phenomenon is evolving at a rate where we're seeing deepfake pornography increasing

  • in number, with an increasing number of victims as well.

  • I was at work and I got an email on my phone.

  • In fact, I can even bring up the email.

  • 25th of May, 4:18 p.m.

  • "F.Y.I. There is a deepfake video of you on some porn sites.

  • It looks real."

  • I remember sitting down receiving that email.

  • I think it was like you're frozen for that moment of time.

  • It was depicting me having sexual intercourse, and the title of the video had my full name.

  • And then I saw another video that was depicting me performing oral sex.

  • Noelle is an Australian law graduate.

  • She's not a celebrity.

  • Someone took photos she shared on social media and first photoshopped them into nude images

  • then graduated to deepfake videos.

  • What's happening in these videos is a specific kind of digital manipulation,

  • and it's unlike older face-swapping filters you might have used.

  • Those tools let you put your face onto your friend's head, but you still controlled it

  • -- a sort of video Photoshop --

  • transferring both your facial features and your expressions.

  • But deepfakes can take the facial features alone,

  • and animate that face with the expressions of someone else.

  • Tom Cruise was like, “Ahoooh!”

  • This is what makes deepfake porn videos so invasive.

  • The creator takes away a victim's control of her own face

  • and uses it for something she never wanted.

  • Transferring a mask of someone's facial features requires training a piece of software

  • called an “autoencoder” on hundreds of images of the same face from different angles

  • and in different lighting with different expressions until it learns what they all have in common.
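To make that mechanism concrete, here is a minimal PyTorch sketch of the shared-encoder, two-decoder autoencoder setup behind classic face-swap deepfakes. This is not the software discussed in the video; the layer sizes, training loop, and random stand-in data are all illustrative assumptions.

```python
# Minimal face-swap autoencoder sketch (illustrative; not the tool from the video).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a small latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face crop from the shared latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One encoder shared by both identities; one decoder per identity.
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

# Stand-in data: in practice you'd need hundreds of aligned face crops per
# person (the "different angles, lighting, expressions" from the video).
faces_a = torch.rand(8, 3, 64, 64)  # person A's face crops
faces_b = torch.rand(8, 3, 64, 64)  # person B's face crops

opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.L1Loss()

# Training: each person's faces are reconstructed through their own decoder,
# so the shared encoder learns what all the images have in common (pose,
# expression, lighting) while each decoder learns one identity's features.
for step in range(100):
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    opt.zero_grad()
    loss.backward()
    opt.step()

# The swap: encode person A's frame (their expression and pose), then
# decode with person B's decoder (B's facial features).
swapped = decoder_b(encoder(faces_a))
```

The key design choice in this sketch is that one encoder is shared across both people, so its latent code captures pose and expression, while each decoder memorizes a single identity's facial features; swapping decoders at inference time is what transfers the face.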

  • That volume of images has long been available of celebrities, but increasingly it exists of...

  • anyone.

  • Even if you don't have a very intense social media presence, just a presence online,

  • you have a few pictures.

  • Maybe there's a video of you from a party or something like this.

  • You have so many training images to take from that.

  • At the moment, you do still need a fair bit of data to make a convincing deepfake.

  • But as the technology is improving, we're needing less data and the tools are becoming increasingly

  • accessible and user-friendly.

  • And there's a growing infrastructure for deepfakes.

  • It's not just about posting videos, it's also about forums discussing how to make them,

  • how to target certain individuals.

  • In fact, the term “deepfake” originated as the name of a subreddit for swapping

  • celebrity faces onto porn stars.

  • Reddit banned the page, but the conversation just moved to other sites.

  • You have almost directories about, "Okay, you want to make a deepfake of a certain celebrity.

  • Here are adult performers that will best suit that."

  • There's a lot of money to be made from selling custom deepfakes.

  • Users pay for deepfakes of specific celebrities

  • or even women they know personally.

  • And they discuss whether all of this is legal.

  • Some think they can protect themselves by identifying the videos as fake.

  • But that's not true.

  • If you live in the US and someone makes porn with your face, you can sue the creator

  • whether or not they've marked it as a deepfake.

  • What is true is, it's very difficult to actually do that.

  • You'd need to pay to bring a lawsuit, with no certainty you'd be able to secure a punishment

  • or even find the creator, let alone stop the video from spreading.

  • Some people on these sites question the morality of what they're doing.

  • And disagree about how they'd feel if it happened to them.

  • But it probably won't happen to them.

  • That's the point.

  • Taking a woman's face and putting it into this context is part of a long history

  • of using sexual humiliation against women.

  • You know, we're having this gigantic conversation about consent and I don't consent.

  • So that's why it's not OK.

  • Even if it's labeled as, "this is not actually her."

  • It's hard to think about that.

  • This is probably one of the most difficult things because

  • fake porn and my name will be forever associated.

  • My children, my future children, will have to see things.

  • My future partner will have to see things.

  • And that's what makes me sad.

  • I just, I wish that the Internet were a little bit more responsible and a little bit kinder.
