  • I love that I can unlock my phone with my face,

  • and that Google can predict what I’m thinking.

  • And that Amazon knows exactly what I need.

  • It’s great that I don’t have to hail a cab

  • or go to the grocery store.

  • Actually, I hope I never have to drive again or navigate

  • or use cash or clean or cook or work or learn.

  • But what if all this technology

  • was trying to kill me?

  • The same technology that is making your life easier

  • is being weaponized.

  • That feature that unlocks your phone with your face,

  • here it is attached to a self-learning machine gun.

  • Its manufacturer, Kalashnikov,

  • made this video to show the gun using object-recognition

  • software to identify targets.

  • They say it gets more accurate the more you use it.

  • That drone advertised to get awesome snowboarding shots,

  • here’s one that doesn't require a pilot.

  • This ad shows it with a high-explosive warhead.

  • It hangs out in the sky until it

  • finds an enemy radar system, then

  • crashes headfirst into it.

  • Oh, and that driverless car you thought was so cool,

  • well, here it is in tank form at a Russian arms fair.

  • It’s called the T-14.

  • Dmitry, here, says he sells them

  • to the Russian government.

  • That contract is part of a trend that’s changing

  • the way wars are waged.

  • Like all good stories, this one

  • starts at a Russian arms fair.

  • We’re a few hours outside of Moscow.

  • Everyone from government officials to gun enthusiasts

  • has come here to see the latest weapons.

  • It’s a family affair.

  • Buyers want to know how the 21st-century technology

  • boom can give their armies a strategic advantage.

  • They want to know: Can technology make war safer?

  • But some fear giving weapons too much power

  • because it brings us closer to machines that could

  • go out and kill on their own.

  • They say, we might not be able to control weapons

  • like these, weapons loaded with artificial intelligence.

  • “So artificial intelligence is the study

  • of how to make machines behave intelligently,

  • which means acting in a way that

  • will achieve the objectives that they’ve been given.

  • And recently, I’ve become concerned about the use of A.I.

  • to kill people.”

  • Stuart Russell.

  • He was an early pioneer in artificial intelligence.

  • He’s also been warning people about its potential danger

  • for years.

  • “So a killer robot is something

  • that locates, selects and attacks human targets.”

  • Stuart isn’t so worried about robots like this.

  • “We’re still pretty far from the ‘Terminator.’”

  • But Stuart says we’re not as far from something

  • like this bee-sized drone.

  • He imagined one, and made a movie that he hopes

  • will freak you out.

  • In Stuart’s movie, we see swarms

  • of them armed with explosives set loose on their targets.

  • “The main issue is you’re creating a class of weapons

  • of mass destruction, which can kill millions of people, just

  • like a nuclear weapon.

  • But in fact, it’s much easier to build,

  • much cheaper, much more scalable,

  • in that you can use 1 or 10 or 100 or 1,000 or 10,000.

  • Whereas with a nuclear weapon, it’s sort of all or nothing.

  • It doesn’t destroy the city and the country

  • that you’re attacking.

  • It just kills all the people you want to kill,

  • all males between 12 and 60 or all males wearing

  • a yarmulke in Israel.”

  • The weapon Stuart is describing is terrifying,

  • if it works perfectly.

  • With the current state of tech,

  • many experts say it wouldn’t, but that

  • could be even scarier.

  • “The way we think about A.I. is we build a machine

  • and we put the objective into the machine.

  • And the machine pursues the objective.

  • So you put in the objective of ‘find a cure for cancer

  • as quickly as possible.’

  • Sounds great, right?

  • O.K. Well, probably the fastest way to do that

  • is to induce tumors in the entire human population,

  • and then try millions

  • of different treatments simultaneously.

  • Then, that’s the quickest way to find a cure.

  • That’s not what you meant, but that’s what you asked for.

  • So we call this the King Midas Problem.

  • King Midas said, ‘I want everything

  • I touch to turn to gold.’

  • And he got his wish.

  • And then his food turned to gold,

  • and his drink turned to gold and his family turned

  • to gold.

  • He died in misery and starvation.

  • You know, this is a very old story.

  • We are unable to correctly specify the objective.”

  • Machines will always be limited by the minds

  • of those who made them.

  • We aren’t perfect.

  • And neither is our A.I.

  • Facial recognition software has had trouble

  • recognizing people with darker skin.

  • Self-driving vehicles still need good weather and calm

  • streets to work safely.

  • We don’t know how long it will take for researchers

  • to create weapons with that kind of flexibility.

  • But behind closed doors, defense labs

  • are working on it and they’re not working alone.

  • “Militaries don’t have to invent A.I.

  • It’s already being built. It’s being

  • driven by major tech companies out

  • in the commercial sector.”

  • Paul Scharre, here, led a Department of Defense

  • working group that helped establish

  • D.O.D. policies on A.I. and weapons systems

  • for the U.S. military.

  • “The reality is all of the technology

  • to put this together, to build weapons that

  • can go out on the road, make their own decisions to kill

  • human beings, exists today.”

  • But it’s one thing to assemble a weapon in a lab, and another

  • to have it work in any environment.

  • And war is messy.

  • “Machines are not really at a point

  • today where they’re capable of flexibly

  • adapting to novel situations.

  • And that’s a major vulnerability in war.”

  • Governments around the world see potential advantages

  • in these weapons.

  • After all, human soldiers: they get tired, emotional.

  • They miss targets.

  • Humans get traumatized.

  • Machines do not.

  • They can react at machine speed.

  • If a missile was coming at you,

  • how quickly would you want to know?

  • Autonomous weapons could save lives.

  • “The same technology that will help self-driving cars avoid

  • pedestrians could be used to target civilians or avoid

  • them, intentionally.”

  • The problem is we’ve gotten this wrong before.

  • To really understand the trend toward automation

  • in weapons, one that has been growing for decades,

  • you have to go all the way back to the American Civil War,

  • to the Gatling gun.

  • “How do I describe a Gatling gun?

  • Do I have to describe it?

  • Could you guys show a picture of it?

  • Richard Gatling was looking at all

  • of the horrors that were coming back

  • from the Civil War.

  • And he wanted to find a way to make war more humane,

  • to reduce the number of people that are

  • needed on the battlefield.

  • Wouldn’t that be amazing?”

  • Four people operating Gatling’s gun

  • could fire the equivalent of 100 soldiers.

  • Far fewer people would be needed on the battlefield.

  • It was the precursor to the machine gun.

  • And it was born with the intention to save lives,

  • at least for the army that had the gun.

  • Of course …

  • “The reality was far, far different.

  • Gatling’s invention had the very opposite effect

  • of what he intended.

  • And then it magnified the killing and destruction

  • on the battlefield, by orders of magnitude.”

  • Gatling was wrong.