Hello.
Today I'm going to be talking to you about a new technology that's affecting famous people.
- Remember when Obama called Trump a dipshit? -"Complete dipshit."
Or the time Kim Kardashian rapped, "Because I'm always half naked"?
Or when Arnold Schwarzenegger impersonated himself?
"Get out of there! There's a bomb in there! Get out!"
Deepfake. Deepfake. Deepfake.
You gotta be kidding!
This is a deepfake, too.
I'm not Adele.
But I am an expert in online manipulation.
So "deepfake" is a term used to describe video or audio files that have been created using artificial intelligence.
My favorite is probably Lisa Vanderpump.
It started as a very basic face-swapping technology.
And now it's turned into film-level CGI.
There's been this huge explosion of, "Oh my goodness, we can't trust anything."
Yes, deepfakes are eerily dystopian.
And they're only going to get more realistic and cheaper to make.
But the panic around them is overblown.
In fact, the alarmist hype is possibly more dangerous than the technology itself.
Let me break this down.
First, what everyone is freaking out about is actually not new.
It's a much older phenomenon that I like to call the weaponization of context, or "shallowfakes," made with Photoshop and video-editing software.
There are so many of these.
How about the time Nancy Pelosi appeared to be drunk while giving a speech?
"But you never know."
"But this president of the United States."
Turns out that video was just slowed down to 75 percent of its original speed.
"It was very, very strange."
You can have a really simplistic piece of misleading content that can do huge damage.
For example, in the lead-up to the midterms, we saw lots of imagery around this caravan of people who were moving towards the U.S.
This photo was shared with captions demonizing the so-called migrant caravan at the U.S.-Mexico border in 2018.
But a reverse image search showed it was actually Pakistani refugees in Greece.
You don't need deepfake A.I. technology to manipulate emotions or to spread misinformation.
This brings me to my second point.
What we should really be worried about is the "liar's dividend": the lies and actions people will get away with by exploiting widespread skepticism to their own advantage.
So, remember the "Access Hollywood" tape that emerged a few weeks before the 2016 election?
"When you're a star, they let you do it."
"You can do anything."
Around that time, Trump apologized, but more recently he's actually said, "I'm not sure if I actually said that."
When anything can be fake, it becomes much easier for the guilty to dismiss the truth as fake.
What really keeps me awake at night is less the technology.
It's how we as a society respond to the idea that we can't trust what we see or what we hear.
So if we are fearmongering, if we are hyperbolic, if we are waving our hands in the air, that itself can be part of the problem.
You can see where this road leads.
As public trust in institutions like the media, education and elections dwindles, democracy itself becomes unsustainable.
The way that we respond to this serious issue is critical.
Partly, it's the platforms thinking very seriously about what they do with this type of content and how they label it.
Partly, it's the public recognizing their own responsibility.
And if you don't know 100%, hand on heart, "This is true," please don't share, because it's not worth the risk.
Deepfakes: Is This Video Even Real? | NYT Opinion

Published by Mackenzie on August 21, 2019