
  • Hey, sisters, James Charles here.

  • And welcome back to my channel, you guys.

  • You guys.

  • You guys, okay.

  • Thank you so much for watching.

  • I love you so much and I'll see you next week.

  • Over the next 24 hours, we're going to be creating deep fakes of some of the most famous YouTubers in the world.

  • Now, if you've never heard of a deepfake before, it's essentially a type of identity swap where you can take somebody else's face, say a celebrity's, put it on your own, and then start acting and saying things that maybe they would never say. I'm so nervous. I don't know, like, making a deepfake sounds hard.

  • Their increasing realism, and the fact that average non-professionals now have access to this technology, has become a big concern for the general public.

  • Well, we already hit a roadblock. We tried a downloadable app and then found out immediately that it doesn't actually exist anymore.

  • But I think I found another one called DeepFaceLab, which is open source.

  • I think it'll be a better choice for us. Brainstorm session.

  • Yeah.

  • What?

  • We're trying to brainstorm the YouTubers, and yeah, I know.

  • I know the culture.

  • I know what we need.

  • Someone like David Dobrik. A deepfake of Liza Koshy and David Dobrik getting back together.

  • Oh, okay.

  • I think we should go trending.

  • I think that's, like, the thing to do, because, I don't know... maybe people we don't know... Don't tell anyone.

  • Go on and on.

  • This isn't even.

  • Oh, this guy who was always like, "Hey!" That's my name.

  • So there's Shane Dawson and Ryland Adams, his partner. Oh, Vsauce! One of us should be Vsauce.

  • Michael.

  • Oh, my God.

  • And you look like... and you kinda look like Link. But what about Casey Neistat?

  • That's good.

  • He has a very distinct face.

  • That's one I feel like... he has a great... That's actually a good one.

  • I moved to L.A.

  • It just seems like this isn't gonna be what makes it work.

  • That doesn't look right.

  • Are you sure that's not just like a full on virus?

  • The amateur technology that we're using today originated only in 2017, from a Redditor named "deepfakes."

  • Hence the common name, deepfake. The reason this technology was created was to put celebrities' faces onto pornography.

  • But the reason we have enjoyed deepfakes is when they are used to make really funny videos on the internet, like this.

  • My favorite is probably Lisa Vanderpump.

  • My least favorite.

  • I don't want to say, because who knows when you're going to run into these people, you know? Have you seen any of these?

  • So I want you to read this one first.

  • Read what he said.

  • Yeah, that's what he said.

  • Based on our research of deepfakes, it is really important that the person you choose has a similar face to you, which limited our scope of YouTubers.

  • But we still haven't decided.

  • And so now we're going to practice our voices and practice our sort-of impersonations.

  • So Mitch is gonna be Casey Neistat.

  • So we need to brainstorm how we pull it off.

  • Yeah, I don't know.

  • I'm trying to think of, like, an aesthetic that looks alike, because it'll be easier to match the face if we're in sort of similar settings.

  • What about, like, a techie setup in the back? He's a guy who likes technology a lot.

  • What?

  • We're all just a couple of big techies.

  • But I think that you and him will be a really convincing deepfake.

  • So right now I'm just taking a video of Shane Dawson that we had that has, like, decent lighting.

  • We're gonna try and match it.

  • Basically, the computer has figured out where Greg's features are.

  • It'll go through, figure out the jawline, figure out the rough perspective, and all the main features, so that it can then impose Shane Dawson's face.
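
In case you're curious what that face-mapping step looks like in code, here is a minimal sketch using dlib's 68-point landmark detector as a stand-in. The hosts used DeepFaceLab, whose extractor is more involved; the file names below are hypothetical.

```python
# A rough sketch of the face-mapping step: detect a face, then locate its
# jawline, eyes, nose, and mouth. dlib is a stand-in for DeepFaceLab here.
import cv2
import dlib

detector = dlib.get_frontal_face_detector()
# Pretrained landmark model, downloaded separately from dlib's model zoo.
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

frame = cv2.imread("shane_frame_0001.png")  # hypothetical frame from the source video
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

for face in detector(gray):
    landmarks = predictor(gray, face)
    # Points 0-16 trace the jawline mentioned above; the remaining points
    # cover the eyebrows, eyes, nose, and mouth.
    jawline = [(landmarks.part(i).x, landmarks.part(i).y) for i in range(17)]
    print("jawline points:", jawline)
```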

  • I'm here with these similar features.

  • And there's a lot of eyeshadow shades you have been wanting, from morning to night.

  • Like you guys you don't even understand.

  • I'm gonna take this brush, and you're going to make sure you put it over or under your eyes.

  • I absolutely love this brush. It's actually, you guys, it's actually insane, because of what it's made of: this, like, really interesting edible fur.

  • So when you're done with it, you just pop it up.

  • That's like it like it is a Prague or eternal, mixed with a human.

  • Okay, so this is where the magic happens.

  • So we're actually training the neural network right now.

  • So in column one and column two are actual photos of Greg and Shane that are real, and the columns right beside them...

  • So this one and this one: this is the computer slowly trying to learn how to recreate Shane's face and recreate Greg's.

  • It will become more clear as we go forward, but what it's doing is it's taking the shape of Greg's face and putting Shane Dawson's face on it.

  • So obviously, at the very beginning, it's not very good at it. It needs a lot of time to process.

  • It needs iterations.

  • Right now, it's only at 1,700 iterations, and we're gonna need...

  • We maybe want, like, 100,000 or 200,000, but I've taken, like, only a few thousand iterations just so we can see how it's tracking along the way.

  • But it gives us a sense of how it works, so it's gonna be changing.

  • It's gonna look really blurry because the computer hasn't done a lot of training.

  • But even this early, you can start to see its features.
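
For readers who want to see the idea behind those preview columns in code: the setup described here is a shared-encoder, two-decoder autoencoder. Below is a minimal sketch in PyTorch; the layer sizes, learning rate, and batches are illustrative assumptions, not DeepFaceLab's real network.

```python
# A minimal deepfake-style autoencoder: one shared encoder, one decoder per person.
import torch
import torch.nn as nn

def make_encoder() -> nn.Sequential:
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(64 * 64 * 3, 1024), nn.ReLU(),
        nn.Linear(1024, 256), nn.ReLU(),  # shared face "code"
    )

def make_decoder() -> nn.Sequential:
    return nn.Sequential(
        nn.Linear(256, 1024), nn.ReLU(),
        nn.Linear(1024, 64 * 64 * 3), nn.Sigmoid(),
        nn.Unflatten(1, (3, 64, 64)),
    )

enc, dec_greg, dec_shane = make_encoder(), make_decoder(), make_decoder()
opt = torch.optim.Adam(
    list(enc.parameters()) + list(dec_greg.parameters()) + list(dec_shane.parameters()),
    lr=5e-5,
)
loss_fn = nn.L1Loss()

for step in range(1_000):  # a real run needs roughly 100,000-200,000 iterations
    # Placeholder batches; in practice these are aligned face crops
    # extracted from each person's videos.
    greg = torch.rand(16, 3, 64, 64)
    shane = torch.rand(16, 3, 64, 64)
    # Each decoder learns to rebuild its own person from the shared code,
    # which is why the preview columns slowly sharpen as training runs.
    loss = loss_fn(dec_greg(enc(greg)), greg) + loss_fn(dec_shane(enc(shane)), shane)
    opt.zero_grad()
    loss.backward()
    opt.step()

# The actual swap: encode a frame of Greg, then decode it with Shane's decoder.
# fake_shane = dec_shane(enc(greg_frame))
```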

  • In the past, this kind of technology was reserved for high level experts or like visual effects artists.

  • But now we've seen that even we, people who have never dealt with this before, have downloaded a free program and within hours have a good sense of how it works.

  • Thankfully, there are now media forensics agencies that are creating software and their own technologies to tell if a video is real or fake, in particular ones that threaten national security.
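
As a rough illustration of how such detection software can work, here is a sketch of one common research approach, fine-tuning a pretrained image classifier on frames labeled real or fake. This is a toy example under stated assumptions, not the agencies' actual tooling.

```python
# Frame-level fake/real classification: fine-tune a pretrained ResNet.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # two classes: real, fake

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# Placeholder batch; in practice these would be face crops taken from
# labeled real and fake videos (e.g., a public benchmark dataset).
frames = torch.rand(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))

logits = model(frames)       # one training step on the batch
loss = loss_fn(logits, labels)
opt.zero_grad()
loss.backward()
opt.step()
```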

  • But already, deepfakes have been used to put people's faces into porn without their consent.

  • And there's a genuine concern with how this technology could be used to blackmail people with fake video.

  • Okay, the sun's going down.

  • I've got a new one going over here with Casey, and over here we have Shane still working away.

  • I'm going to start a third one on my laptop, which isn't that powerful, but I'd rather just have a bunch going at once, because it takes so long. The computer's trying its best.

  • Do you think that this is... Well, we do all know that technology advances a lot faster than laws and regulations, so it's sort of the Wild, Wild West for this type of thing.

  • In fact, Facebook said they're not going to take deepfakes off of their platform, and it led to someone making a deepfake of Mark Zuckerberg saying what we're about to show you and putting it on Facebook to see what they would do.

  • I wish I could keep telling you that our mission in life is connecting people, but it isn't.

  • Spectre showed me how to manipulate you into sharing intimate data about yourself and all those you love, for free.

  • There's a lot of scary future scenarios for where this technology is going.

  • We don't really know yet what it's gonna do, but in the meantime, it is kind of fun.

  • So let's get back to it. Okay, are you ready to see the deepfakes? The deepfakes are done.

  • We're gonna watch them.

  • We do have to admit, we did more than 24 hours, because we wanted to see them done.

  • We did, sort of.

  • It's actually pretty easy to do, but the training takes a while.

  • Take your seats, grab your popcorn.

  • Let's go watch it.

  • Last to leave their child on the side of the road goes to jail for seven years and wins $10,000 bail.

  • Last to leave their child on the side of the road goes to jail for seven years and wins $10,000 bail. So you're watching me on that camera right there?

  • Then how is this camera right here?

  • Because I ran around Manhattan twice this morning before 6 a.m. and I got bored, so I'm moving to L.A.

  • So you're watching me on that camera right there?

  • Then how is this camera right here?

  • It's because I ran around Manhattan twice this morning before 6 a.m. and I got bored, so I'm moving to L.A.

  • Hey, Vsauce, Michael here.

  • Now, some of you might be wondering: when I go get a Subway sandwich and I get extra dressing, do I ever ask for it to be V-saucy?

  • Yes.

  • And as always, thanks for watching.

  • Hey, Vsauce, Michael here.

  • Now, some of you might be wondering: when I go get a Subway sandwich and I get extra dressing, do I ever ask for it to be V-saucy?

  • Yes.

  • And as always, thanks for watching.

  • What's up?

  • Everybody?

  • Welcome back to my channel.

  • Hi.

  • How are you now?

  • You know you cannot buy Jeffree Star. See these doggy treats?

  • Authentic jaguar liver, $5,000 a pop.

  • This wig, girl? This ain't no basic wig. It's wool from the... And this toilet paper?

  • $5,000 a ply.

  • Now, that's the tea, honey.

  • What's up, everybody?

  • Welcome back to my channel.

  • Hi.

  • How are you now?

  • You know you cannot buy Jeffree Star. See these doggy treats?

  • Authentic jaguar liver.

  • $5,000.

  • This wig, girl.

  • This ain't no basic wig. It's wool from the... And this toilet paper?

  • $5,000 a ply.

  • Now, that's the tea, honey.

  • We're going to be meeting a ghost.

  • I have a whole lot of questions.

  • Wait, you're going to be meeting the ghost of Braganza, the woman who introduced the British to tea?

  • I have a lot of questions.

  • Bro.

  • I'm so disappointed, bro.

  • I'm mad, bro. Jake.

  • Jake, watch this, bro.

  • You always got my back, bro.

  • Think of this from my brother.

  • Bro.

  • So you are my bro, bro.

  • You see that, bro?

  • Science, Bro.

  • Bro Bro.

  • My bro Bro.

  • I'm so disappointed, bro.

  • I'm mad, bro.

  • No, Jake.

  • Jake, watch this, bro.

  • You always got my back.

  • Think of this, from my brother. Bro.

  • So you are my bro, bro.

  • You see that, bro?

  • Science, Bro.

  • Bro.

  • Bro.

  • Bro.

  • Bro.

  • I think the most interesting part was how fast we did that. We don't have the strongest computers.

  • And so it's just interesting to think of what someone else could do with a lot more time.

  • We hope you now have more of an understanding of how deep fakes are made, so you're not left in the dark.

  • Researchers have found that online hoaxes spread 10 times faster than accurate information, which sort of makes sense, because as humans we're hardwired to respond to extreme emotional information, which is easily made up and spread on social media.

  • And one last thing we can all do is pressure major news media to show fake video beside original video, so we can understand where those changes are being made.

  • We do live in a world where we ourselves, you yourself, are in charge of making sure that the information you are getting is trustworthy.

  • We do kind of all have to be our own editors.

  • And just because you're not a journalist doesn't mean you can't fight fake news or fight the misuse of deepfakes.

  • We actually teamed up with the Canadian Journalism Foundation for this video, with their important message that if you see information online and you're unsure about it, you should doubt it, check it, and challenge it.

  • It's an election year here in Canada.

  • There's gonna be one in America, and in democracies all over the world.

  • Those are important times for you to be thinking about this message.

  • Thanks again to the Canadian Journalism Foundation for supporting this video and helping us with lots of different resources.

  • And thank you so much for watching, sharing, liking and all that.

  • We will see you next time for another science video.

