
  • The world is entering a new age of warfare. A digital revolution is sweeping through  

  • every military force on the planet. Leading the charge is artificial intelligence.  

  • A technology with the power to upend  everything about human conflict.  

  • Including whether humans are involved at all

  • And simmering beneath… is a global cyber-war that has already started and may never end.

  • Digital technology is transforming all our lives,  

  • so no wonder it's also changing how we  fight. It's making militaries smarter,  

  • faster, more efficient. But it's also opening up  the prospect of serious dangers in the future.

  • There's a third revolution of warfare after gunpowder and nuclear weapons.

  • There will be more unpredictability in  how we get to armed conflict, and that  

  • will make the whole world a more dangerous place.

  • Here in Berlin, Germany's foreign minister warns  us: a full-scale tech arms race is underway.

  • We're right in the middle of it. That's  the reality we have to deal with


  • In fact, critical technologies are developing so fast that SOCIETIES  

  • can barely keep up – and ask themselves the question: is this what we want? So  

  • in this video we're going to zero in on two  risks that are not getting enough attention.

  • First, we'll see how a cyber intrusion against the  command and control systems for nuclear weapons  

  • could set off a terrifying chain of events.

  • You have to worry that it's going to  escalate into something that's like  

  • truly apocalyptic -- civilization ending.

  • Then we'll examine how a recent  war in an obscure part of the world  

  • provided a taste of things to come…  accelerating a race for autonomous weapons.

  • And how the artificial intelligence behind them  

  • could lead to conflicts that  move at horrifying speed.

  • All of a sudden you have a war that no one really started,  

  • and which could spiral out of control.

  • We'll catch glimpses of a future where wars can start more easily…

  • where they can escalate faster… and where humans can't stop them.

  • Machines are dictating the  conduct on the battlefield.  

  • Machines are making the ultimate  decisions about life and death.

  • The good news is: it's not too late to make  critical choices. And in the final part  

  • we'll look at what political leaders could be doing NOW to prevent the worst from happening.

  • But first, we begin with a scenario that is not from the future. It could happen today.

  • We're going to England's North York Moors, near the coast and the windy North Sea. Here we find  

  • what could be one of the most important places in the world -- that you have probably never heard of.

  • Its name is Fylingdales, a British air force  base that's notable not for its planes, but for

  • this grey edifice, jutting out of the ground.  

  • They call it the pyramid. But  in fact, it's a giant radar.

  • It's not the only one.  

  • There's something similar on the other side of  the world, at Clear air force base in Alaska.

  • And there's another far to the south at Beale, in the heat of the California desert.

  • There's one hidden in the forest on Cape Cod,  

  • Massachusetts, where America nudges out into the West Atlantic.

  • And in the frozen north of Greenland, far above the Arctic Circle,  

  • you'll find that another pyramid looms.

  • These installations are all part of America's early warning system – powerful radars built to  

  • detect attacks on the US homeland or American allies. Above all: incoming nuclear missiles.

  • It's a system that reaches out into space, where  dedicated satellites keep watch from high orbit.  

  • Constantly feeding back to the  "command-and-control" apparatus  

  • in charge of America's own nuclear weapons.

  • This is the nervous system of the western  military alliance. It dates back to the Cold  

  • War but in today's geopolitical tensions, it's  as crucial as ever. Disrupting it could leave  

  • the alliance blind, prone to attack. That was made clear in America's latest  

  • nuclear posture review – essentially the instruction manual of its most powerful weapons.

  • This infrastructure is so  important, the review said  

  • that if it were attacked, the US might  respond by using nuclear weapons.

  • As we're going to find out, despite their critical position at the heart of Western  

  • security, these systems are vulnerable  – to new and unpredictable threats.

  • The first early warning systems were built decades  ago, at the height of the Cold War. Their job:  

  • detecting nuclear missiles coming in from Russia. As they've been updated over the decades,  

  • two crucial things have changed  that make them more exposed.

  • First -- many are no longer focussed only  on nuclear threats. They're multi-tasking.

  • "None of the big command and control  systems whose existence has been  

  • acknowledged by the US government are used  exclusively for non-nuclear operations."

  • James Acton is one of the world's  leading experts on nuclear security.

  • "That's one example of this phenomenon that I term  

  • the growing entanglement between the  nuclear and the non-nuclear domains."

  • This idea of "entanglement" is important. It  means that the incredibly sensitive area of  

  • nuclear weapons is no longer separated off in its  own bubble. It's become mixed in with matters of  

  • conventional warfare. And that multi-tasking  means they're more likely to be a target.  

  • In a crisis or a conflict, adversaries could have  a potential incentive to attack these dual-use  

  • command and control assets, these assets that are  used for both nuclear and non-nuclear operations.  

  • Potentially, they're doing that in order to disrupt US conventional war fighting – but that  

  • would have the effect of degrading the US  nuclear command and control architecture.

  • So there are more reasons to attack these targets. And on top of that comes the second big change:  

  • they've entered the digital age, opening  them up to the prospect of cyber-attack.

  • Systems are now relying on digital signals as opposed to analogue signals, increasingly  

  • relying on things like IP-based operating systems…

  • (IP-based operating systems: that means computer operating systems with Internet Protocol networking capabilities.)

  • …which creates vulnerabilities, for example, in the form of cyber-attacks.  

  • Very old-fashioned nuclear command and control systems that didn't use digital  

  • systems were invulnerable to cyber-attacks. There was no code there to do the attacking."

  • Today, cyber-attacks are an everyday event – we often hear about them on the news. In fact,  

  • some say we've entered a low-grade  cyber-war that will NEVER stop.

  • "You have a mix of state level and non-state  actors constantly probing and attacking  

  • networks around the world. That's just the reality  

  • of 21st century life and something  that we'll have to deal with."

  • Just about everything and  everyone can be a target.  

  • Recent attacks have hit the US  government, the German parliament,  

  • and Iran's nuclear programme. And  they're just the ones that we know about.

  • Some of the most serious cyber-attacks  have hit public infrastructure  

  • like those against Ukraine's power grid –  attacks blamed on Russia. That was so grave  

  • that the United States stepped in to press  charges against the alleged perpetrators.

  • "No country has weaponized its cyber capabilities as maliciously and irresponsibly as Russia."

  • Attacks like that on civilian infrastructure  have become a major public concern.  

  • But only a small circle of experts are  thinking about how a cyber-attack on  

  • nuclear command and control systems might play  out. Here, the stakes could not be higher.

  • To see what could happen, let's go back  to the English coast and Fylingdales,  

  • the early warning system peering  across the North Sea towards Russia.  

  • In a crisis situation with the  Kremlin, this could be a prime target.

  • "That's so significant because  that radar is the closest US  

  • radar to Russia's biggest concentration of  its nuclear forces. It's the one that would  

  • get the quickest warning of a Russian nuclear  attack. It's also the most entangled one."

  • Remember that idea of "entanglement" between the nuclear and the non-nuclear realms?

  • Fylingdales is a key example of this, watching  out not just for big nuclear missiles,  

  • but also for conventional weapons.

  • If Russia was firing short-range ballistic  missiles at Europe, Fylingdales could see  

  • those missiles in a way that other US radars  that are further from Europe couldn't. 

  • So of all the US early warning radars, Fylingdales  is the one that has the biggest Russian incentives  

  • to attack in a crisis or a conflict. And it's the  one that attacks could have the biggest effect  

  • on in terms of degrading US  strategic early warning capabilities.

  • And a scenario where exactly that  happens is all too easy to imagine.

  • It's the near future. We're in Latvia, a  former Soviet republic, now a member of NATO.

  • Protests have broken out among  ethnic minority Russians,  

  • who are accusing the government of discrimination.

  • As the protests turn violent, Russia  begins amassing troops along the border.

  • Western leaders accuse Moscow  of orchestrating the unrest  

  • as a pretext to invade this  tiny NATO member state.

  • Neighbouring Estonia and Lithuania  – also former Soviet republics  

  • and also now members of NATO – report a surge in cyber-attacks.

  • Fear spikes across the region.  

  • For the first time since the Cold War, NATO  and Russia are on the brink of direct conflict.

  • As the crisis deepens, the US detects  malicious computer code planted in its  

  • early warning networks at Fylingdales. In the  heart of a system that is on ultra-high alert.

  • James Acton explains what happens next.

  • "If you find malicious code in your networks, it's very hard to know what that code does. It  

  • takes a long time to analyse the code and  understand what the other side is doing.  

  • And this makes it very hard to know whether  this malicious code is just for espionage,  

  • or is also for offensive operations as well. And in this fast-moving crisis,  

  • the US doesn't know what the code does. It hasn't yet had a chance to analyze it. Even  

  • if that code is exclusively for espionage purposes, there is a danger that the US  

  • might conclude that it's preparations for an attack on an early warning system."

  • As the malware spreads, the US also has to work  out who planted it. That's a process called  

  • attribution. It takes time, and it is NOT easy – adding pressure to the fear and deep uncertainty.

  • There's various countries that could have  incentives to launch cyber espionage or prepare  

  • for cyber attacks by inserting malware  against the U.S. early warning system.  

  • You know, North Korea would have an  incentive for doing it. China would have  

  • an incentive for doing it. Russia would have an incentive for doing it, maybe others, too.

  • Amid all that uncertainty, with the Latvian crisis ongoing, Russia becomes the obvious suspect.

  • "I think there is potentially a tendency in that crisis to assume  

  • that Russia implanted the malware. Even if you don't know for certain who did it -- a Chinese  

  • implantation or a North Korean implantation -- again, in a fast-moving crisis in which you  

  • don't have time to do the attribution properly, it may be misinterpreted as a Russian intrusion."

  • So in the heat of this crisis,  

  • under intense pressure, the US has  some enormous decisions to make.

  • Its most sensitive nuclear weapons  infrastructure is under cyber-attack.  

  • It doesn't know what the code  is doing, or who planted it.  

  • But the circumstances suggest it's a Russian  attack. So the Americans decide to respond  

  • in kind -- with a cyber-attack of  their own against Russia's systems.

  • It then does the same thing against Russia – not necessarily for an attack at this point, but for  

  • espionage and for signalling purposes, and saying, you know, anything you could do, we can do better.  

  • The problem is that Russia is very worried  about the survivability of its nuclear forces.

  • Now Russia fears that the US is trying  to mess with ITS nuclear weapons.

  • Discovering cyber intrusions in your  command-and-control system can exacerbate  

  • those fears. You could believe the US  is preparing to attack, preparing to  

  • eliminate the nuclear forces pre-emptively.

  • The two sides are entering a spiral of escalation  

  • that leads towards disaster with a relentless  logic. Russia makes the first move.

  • A lot of their nuclear weapons or missiles  are based on trucks which they would have  

  • to disperse to make them survivable  so that the US couldn't destroy them.  

  • So they may do that because they're worried about a US nuclear attack.

  • But that kind of action could confirm the US fear  that they're preparing for nuclear weapon use…  

  • and that's the kind of scenario that I think could catalyze nuclear weapon use directly.  

  • The US then disperses its nuclear forces – that confirms Russian fears that the US  

  • is thinking about using nuclear weapons and  that leads to Russian limited nuclear use.

  • Limited nuclear use. We've gone from a piece  of mystery code in the wrong place to a nuclear  

  • missile launch. Let's do what the  governments can't in this situation - and  

  • slow right down to pick  apart what's just happened.  

  • Because this is how a regional crisis  can turn into a catastrophic war.

  • In the heat of a crisis with Russia, the US  detects malware in its early warning networks.

  • Fearing it could be Russian code  aimed at disabling its systems,  

  • it retaliates with a cyber intrusion  of its own into Russia's networks.

  • Russia now fears its nuclear  capabilities are being threatened,  

  • and scatters its land-based weapons  to avoid possible attack. When this is

  • picked up by the US, Washington disperses  its own nuclear forces for the same reason.

  • Fearing an imminent nuclear attack, Russia fires  the ultimate warning shot - a small nuclear  

  • missile against a target that would result in  minimal casualties, like a naval ship out at sea.

  • You can conceive of first use of nuclear weapons  that literally kills no civilians and only a small  

  • number of military personnel. The use of nuclear  weapons against a ship at sea, far from any land,  

  • a military vessel -- you might only kill the  sailors on board that vessel and no civilians."

  • While the immediate damage may be limited, this  crosses the threshold called "nuclear first  

  • use." Whichever side does this, they've made the situation deadly serious.

  • Once you've crossed that threshold -- once nuclear  first use has happened -- you have to worry that  

  • it's going to escalate into something that's like truly apocalyptic -- civilization ending.  

  • And so the real goal for me is  to prevent any first use at all.

  • We've just seen how a cyber intrusion can  escalate mercilessly into a nuclear conflict  

  • that nobody wanted. Where things could go from  THERE is the stuff of nightmares. But there  

  • are things the world could do now to prevent  such a disaster from happening in the future.  

  • We'll look at those later. But first, let's leave this realm of  

  • scenarios and return to the real world  – and a war that has already happened.

  • It's late 2020 and war has broken out  in a place the world had forgotten.  

  • A festering conflict has erupted  into full-scale fighting.

  • Ground zero is Nagorno Karabakh… a  disputed region in the Caucasus mountains,  

  • fought over by two former Soviet  republics: Armenia and Azerbaijan.

  • This looks like a textbook regional war: over territory, over ethnic and national pride. Fought  

  • while the rest of the world is consumed by the  pandemic, it doesn't get that much media coverage.  

  • But for those who are paying attention, it is a glimpse of future wars.

  • You can find it right here, in the propaganda  pumping out from the start of the war.

  • Azerbaijan's border patrol posts this video on  its YouTube account just as the conflict begins.

  • The lyrics are a rush of jingoistic fever, with a mantra: "hate" for the enemy.

  • But look carefully, and you'll see what makes  this conflict a watershed in modern war.

  • Watch out for these trucks in the background.

  • In this shot you can just about see what's inside.

  • Then a launch, in slow motion.

  • What emerges is not a rocket or a missile – it has wings that are beginning to unfold  

  • just before the video cuts away.

  • We can see enough to identify what this is.

  • It's what's called a "loitering munition" from  Israel's state-owned defence manufacturer,  

  • IAI. Its model name: the "Harop."

  • The company's promotional videos show  what "loitering munitions" can do.

  • Once launched, they fly – autonomously – to a target area, where they can wait,  

  • or "loiter" in the sky for hours, scanning for a target – typically, air defence systems.

  • Once they find a target, they don't drop a bomb, but fly into it, to destroy it on impact.

  • It's earned them the nickname "kamikaze drones."

  • In the war over Nagorno Karabakh,  

  • these weapons didn't just make for good  propaganda. They made a real difference.

  • Azerbaijan had spent years  investing in loitering munitions.  

  • Analysis by a US think tank  showed that they had more than  

  • 200 units across four different models –  all of them sophisticated Israeli designs.

  • Armenia only had a single, domestically  made model with a limited range.

  • "The really important aspect of the conflict  in Nagorno Karabakh, in my view, was the use of  

  • these loitering munitions, so-called kamikaze  drones, these pretty autonomous systems."

  • Ulrike Franke is one of Europe's  leading experts on military drones.

  • They also had been used in some  way or form before, but here,  

  • they really showed their usefulness – militarily speaking, of course. It  

  • was shown how difficult it is  to fight against these systems.

  • As Azerbaijan celebrated victory, you could even call Nagorno Karabakh  

  • the first war that was won -- in  part -- by autonomous weapons.

  • Little wonder the Harop was on show that day. And other militaries were paying attention.

  • Since Nagorno Karabakh, since the  conflict, you could definitely  

  • see a certain uptick in interest in loitering  munitions. We have seen more armed forces  

  • around the world acquiring or wanting  to acquire these loitering munitions.

  • The Nagorno Karabakh war amounted to a showcase  

  • for autonomous weapons technology. With a clear message: this is the future.

  • It's a future that is coming at us fast. Ever  more advanced models are coming onto the market…  

  • Designed to hit a wider range of targets…

  • The manufacturer IAI even markets one of its  models with the slogan… "fire and forget."

  • Fire and forget – think about that. Already today, autonomous weapons are being used to  

  • find a target over long distances and  destroy it without human intervention.  

  • And this revolution is just getting started  – turbocharged by artificial intelligence.

  • In the United States, a major report  from a "national security commission"  

  • on artificial intelligence talks  about AI enabling a "new paradigm  

  • in warfighting" – and urges massive  amounts of investment in the field.

  • This isn't all about autonomous weapons –  there are many other areas of the military  

  • which will be using artificial intelligence.

  • One area where we see a lot of AI-enabled  capabilities is in the realm of data analysis.  

  • We are gathering so much data in military  operations. Another area, which I think is  

  • quite promising, but also still relatively  removed from the battlefield is logistics.  

  • AI can definitely help to make this more  efficient, cheaper, better, easier, all of that.

  • And fuelling all of this is an intensifying  global competition, which spans all the  

  • way from these more prosaic fields to the autonomous weapons we're looking at today.

  • The Chinese and the Russians have  made it very clear that they intend  

  • to pursue the development of autonomous weapons…

  • Martijn Rasser, a former analyst at the CIA,  

  • covers emerging weapons technology at Washington's leading defence think tank…

  • and they're already investing heavily in the  research and development of those systems.

  • It's not just the superpowers piling in. Britain's  

  • new defence strategy also  puts AI front and centre.

  • And as we've already seen, Israel is a leader in the autonomous weapons field.

  • In fact, wherever you look, countries  of all sizes are jumping in.  

  • No wonder there's talk of  this becoming an arms race.

  • Germany's foreign minister Heiko Maas is  clear that that arms race is already underway.

  • We're right in the middle of it. That's  the reality we have to deal with.

  • If anything, this might go deeper than an arms race…

  • AI is here to stay. And there is a belief  among the major powers that this could make a  

  • difference on the battlefield in the future. So they are frenetically investing in it.

  • Indian diplomat Amandeep Singh Gill is the former chair of the UN government  

  • experts' group on lethal autonomous weapons.

  • And this is a race, in a sense, which cuts across  the military and the civilian fields, because  

  • there's also the sense that this  is a multitrillion dollar question.  

  • It's about the future of resilient economies.

  • That is what sets this new era apart from  arms races of the past. During the Cold War,  

  • the development of nuclear weapons was driven  purely by governments and the defence industry.  

  • Beyond power generation, there wasn't much  commercial use for nuclear technology.

  • Today, AI is rapidly entering our everyday lives. It might even unlock the phone in your  

  • pocket when you hold it up to your face. This emerging ubiquity of AI is important.  

  • Because it means that developments in AI cannot be contained – they  

  • are bound to bleed between civilian and military fields -- whether we like it or not.

  • AI is by definition dual-use or multi-use; it can be used in all kinds of ways. It really  

  • is an enabler more than a technology. There is a whole range of applications of artificial  

  • intelligence in the civilian realm, from health  care to self-driving cars to all kinds of things.

  • It means that something as innocuous as  a new year's celebration in Edinburgh  

  • or St Patrick's Day in Dublin – can be powered by similar swarming technology  

  • to what the Indian army showed  off on its national day. In fact,  

  • swarming is one of the hottest areas of  autonomous weapons development right now.

  • The US Navy has released  footage of early demonstrations.  

  • Here, fighter jets drop over  100 tiny drones in mid-flight.

  • Once they're out there, it's almost impossible  for the human eye to keep track of them.

  • The whine of their motors -- almost the only sign of the threat in the sky.

  • Experts say they will make  highly effective weapons.

  • You could take out an air defense system, for example, by -- you just  

  • throw so much mass at it and so many numbers that the system is overwhelmed.  

  • This, of course, has a lot of tactical  benefits on a battlefield. And no surprise,  

  • a lot of countries are very interested  in pursuing these types of capabilities.
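
The saturation logic in that quote can be sketched as a toy calculation. Every number below (engagements per salvo, number of salvos, swarm sizes) is a purely illustrative assumption, not a real system figure:

```python
# Toy saturation model: a defence battery can engage a fixed number of
# targets per salvo and gets a limited number of salvos before impact.
# Every drone beyond that total capacity simply gets through ("leakers").
def leakers(swarm_size, engagements_per_salvo=8, salvos=3):
    """Drones surviving a defence with a hard engagement capacity."""
    capacity = engagements_per_salvo * salvos
    return swarm_size - min(swarm_size, capacity)

# A defence that comfortably handles 10 drones is overwhelmed by 100:
for size in (10, 50, 100):
    print(size, "attackers ->", leakers(size), "get through")
```

However capable each interceptor is, once the attacker's numbers exceed the defender's total engagement capacity, the excess arrives untouched – which is the tactical appeal of mass described above.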

  • Not least the head of the body  advancing the US army's modernisation,  

  • as he explained in an online think tank forum.

  • Most likely drone swarms are something you're going to see on the battlefield – on a future  

  • battlefield. I don't think it's a matter of if – as a matter of fact, I think we're  

  • already seeing some of it – it's a matter of when we begin to see it.

  • And feeding the momentum of this potential  arms race - in order to fight these weapons,  

  • you need these weapons. Humans don't have a chance.

  • When you're defending against a drone swarm,  a human may be required to make that first  

  • decision. But I'm just not sure that any  human can keep up with a drone swarm.

  • This issue of speed gets us to a critical  emerging danger of autonomous weapons...

  • The weapons we've seen so far are  capable of a high degree of autonomy.  

  • But they wouldn't be impossible for humans  to control. Even a "fire and forget" weapon  

  • needs a human to fire it, and they're still  operating in a way that we can pretty much grasp.

  • Now let's think ahead, a decade or two into  the future. That's a decade or two of rampant  

  • technological development - and adoption  - of increasingly autonomous weapons.

  • I think what is very likely is that in 20 years' time we will have swarms  

  • of unmanned systems, not even necessarily just  airborne drones -- it can also be ground systems,  

  • surface vessels, etc. So different units operating  together and carrying out attacks together,  

  • which does indeed require quite a high level of AI-enabled autonomy.

  • To fight these systems, you will need these  systems. Because human beings are simply too slow.

  • This is what potentially may drive  an arms race that -- some actors may  

  • be forced to adopt a certain level  of autonomy, at least defensively,  

  • because human beings would not be  able to deal with autonomous attacks  

  • as fast as would be necessary. So  speed is definitely a big concern here.

  • And that could have fateful  consequences for how wars begin.

  • We could find ourselves in a situation where because of this problem of speed and  

  • autonomous systems having to be countered by  other autonomous systems, we could find ourselves  

  • in a situation where these systems basically react to each other in a way that's not wanted.

  • We've already seen something like  this on the financial markets.  

  • The "flash crash" of 2010 wiped more than  a trillion dollars off the US stock markets  

  • in just minutes. It was driven by  trading algorithms feeding off each other  

  • in a dizzying spiral. How it happened  is STILL not fully understood.
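
A minimal sketch of that kind of algorithmic feedback loop, with purely hypothetical numbers (an illustration of runaway feedback, not a model of the actual 2010 event):

```python
# Two hypothetical momentum algorithms: each sells in proportion to the
# last price move it observed. With sensitivity > 1, every reaction is
# larger than the move that triggered it, so a small shock amplifies.
def simulate(steps=12, shock=-1.0, sensitivity=1.2):
    price = 100.0
    last_move = shock              # an initial, modest sell-off
    history = [price]
    for _ in range(steps):
        reaction = sensitivity * last_move  # one algorithm reacts...
        price += reaction                   # ...moving the price further,
        last_move = reaction                # ...which the other reacts to
        history.append(round(price, 2))
    return history

print(simulate())  # each drop is bigger than the last
```

With damping (sensitivity below 1) the same loop fizzles out; the spiral only emerges when each automated response is larger than the move that triggered it.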

  • In a flash crash, trading can  be halted to prevent disaster.  

  • The risk with a "flash war" is that  there might be no pulling back.

  • If the beginning is bad enough, it may not  even matter any more that the original event  

  • wasn't supposed to be an attack  in the first place. You could have  

  • a situation where the counterattack  is so bad that you end up in a war.

  • Now, think back to Nagorno Karabakh -- a  regional war where autonomous weapons may  

  • have tipped the balance. In a future  world with the risk of "flash war,"  

  • places like this could face even  more instability, even more conflict.

  • We are moving into a world where  

  • systems will be more autonomous. But we need  to make sure that we minimize the risk of  

  • unwanted escalation, of lethality decided  by machines without any human control.

  • But how do we do that? How do we prevent the worst? As we're about to find out,  

  • the world is struggling to find a way.

  • We've just seen glimpses of a future that nobody could want…

  • Of war spinning out of control. Even erupting out of nowhere.

  • These are not the nightmares of science fiction. They're highly plausible outcomes of the rapid  

  • technological development that's happening now. There is no way to stop the technologies we've  

  • seen in this video. And we probably wouldn't want  to. There are many positive applications that  

  • will come out of them. The urgent challenge  is to find ways to keep them under control.

  • My fear is that there will  be more unpredictability  

  • in how we get to armed conflict, so the  pathways to getting to the battlefield  

  • won't be clear to policymakers. So they will  not understand fully the risks of certain  

  • actions or certain happenings, and that will  make the whole world a more dangerous place.

  • Amandeep Singh Gill was at the centre of  United Nations efforts to try to get a  

  • grip on autonomous weapons… a process that  critics say is on the brink of failure.

  • This is where it all happens: the UN buildings in Geneva. It's here that delegates from UN  

  • member states gather with experts and NGOs to  talk about the future of autonomous warfare.

  • This process is part of what's called the UN  Convention on Certain Conventional Weapons.  

  • A diplomatic tongue-twister launched in the  1980s to try to regulate non-nuclear weapons  

  • that were deemed so dangerous that they  need special attention. Things like land  

  • mines and blinding lasers. In 2014, lethal  autonomous weapons made it onto the agenda.

  • It has been very slow going. The process has yielded a set of "guiding principles" – saying  

  • that autonomous weapons should  be subject to human rights law,  

  • and that humans must have ultimate  responsibility for their use.

  • But these "guiding principles" have no force…  they're just a basis for more discussions.  

  • For campaigners calling for a ban, that's not good enough.

  • We do get frustrated by the  delays that have happened  

  • and the delay in moving from discussions  to actual negotiations of a new treaty.  

  • The main problem with this forum is that it operates by consensus, meaning any one state  

  • can block progress and block that shift from discussions to negotiations.

  • Bonnie Docherty lectures on human rights  at Harvard Law School - and is also a  

  • spokeswoman for the "Campaign to Stop Killer Robots" – a high-profile coalition of NGOs.  

  • She has mapped out principles  for an international treaty.

  • The overarching obligation of the treaty should be to maintain meaningful human control over the  

  • use of force, and it should be a treaty that governs all weapons operating with autonomy  

  • that choose targets and fire on them based on sensor inputs rather than human inputs.

  • That idea of keeping "meaningful human  control" is broadly echoed by many countries,  

  • but only 30 states support the campaign. They're mostly smaller nations but include  

  • one giant in the form of China. But  Beijing's true position is blurred.

  • China has called for a ban on, or expressed  support for a ban on USE, but has not,  

  • to my knowledge, expressed support for a ban  on development and production. We believe that  

  • you need to prohibit development as well as  use of these inherently problematic systems,  

  • because once things are developed, the genie is out of the bottle.

  • And the other great military powers aren't  keen at all on those sorts of limitations  

  • either. Russia is accused by  many of taking any opportunity  

  • to thwart the Geneva talks. But there  are plenty of other objectors too.

  • Russia has been particularly vehement in its objections. Some of the other states developing  

  • autonomous weapon systems such as  Israel, the US, UK and others have  

  • certainly been unsupportive of a new treaty and  have expressed varying degrees of support for  

  • actually continuing discussions. So those  are some of the roadblocks that we face.

  • As things stand, the US is  highly unlikely to support a ban.  

  • Rather, it has set out its own principles, which include human involvement.

  • A ban on autonomous weapons systems is  essentially infeasible just because the technology  

  • is out there. The Department of Defense has been  very clear about its commitment to ethical uses  

  • of these technologies, where right now the  position is that a human being has to be on  

  • the loop or in the loop when those weapons are  used so that it won't be fully autonomous in  

  • the sense that there won't be any human  interaction with these weapons systems.

  • But the reality is that the US, China and Russia  

  • are competing so intensely in all areas of AI  technology that it's questionable whether any  

  • of them would sign up to a treaty that  significantly limits what they can do.

  • The large powers will always have agendas. They want  

  • freedom of manoeuvre. They think that they need  to have agency over technology development.  

  • And sometimes they've been very sceptical  of the role of international organizations,  

  • multilateral forums in understanding  and regulating technology.

  • Aside from the lack of interest from crucial players, the challenge of tackling an intangible  

  • technology like AI with the traditional tools of "arms control" is genuinely difficult.

  • A lot of the old ways of arms  control and arms control treaties  

  • don't work anymore and don't apply anymore to these systems because, to put it bluntly,  

  • we're talking about software rather than hardware. So a lot of arms control systems in the past  

  • basically were about allocating a certain number of systems. You were allowed one hundred warheads  

  • of this type and one hundred warheads of that type. And we were basically counting.  

  • You can't do this with the AI-enabled weapon systems that we were talking about,  

  • because it doesn't matter what it looks like from the outside, but what's inside.

  • Germany has been quite active in trying  to navigate around these problems…  

  • its foreign minister says that  the world has to find a way

  • Just like we managed to do with nuclear weapons  over many decades, we have to forge international  

  • treaties on new weapons technologies. Heiko Maas is a member of Germany's  

  • Social Democrats and has been a vocal advocate of arms control.

  • They need to make clear that  we agree that some developments  

  • that are technically possible are not  acceptable and must be prohibited globally.

  • In fact the German government has laid  out its intention - in the document that  

  • underpins the current coalition. It says "We reject autonomous weapons systems  

  • that are outside human control. We  want to prohibit them worldwide."

  • That sounds pretty clear. But even this is  complicated. Germany for instance does not  

  • support the Campaign to Stop Killer  Robots. It says there's a better way.

  • We don't reject it in substance –  

  • we're just saying that we want others to be included in the global controls that we would need  

  • to ensure that autonomous weapons systems don't come into  

  • use. So military powers that are technologically in a position  

  • not just to develop autonomous weapons but also to use them. We need to include them.

  • So this isn't just a debate about the  rights and wrongs of autonomous weapons.  

  • It's also a debate about PROCESS.

  • On the one hand, Germany says an agreement is only worth anything if the big countries are  

  • on board - they want that elusive  consensus in the Geneva process.  

  • On the other, the Campaign to Stop Killer Robots  

  • says the matter is too urgent to wait. They say  there's just time for one more round in Geneva.

  • We feel that if states don't take action by that point, they should strongly consider  

  • moving outside of the Convention on Conventional Weapons and  

  • look at other options. So they could go to the UN  General Assembly to negotiate a treaty. They could  

  • start an independent process, basically a forum that is not bound by consensus, but is  

  • guided by states that actually  are serious about this issue  

  • and willing to develop strong standards  to regulate these weapon systems.

  • There's precedent for this, with land mines, for example. In the 1990s,  

  • the Geneva process couldn't find consensus.  

  • Instead, more than 100 countries broke away to create a ban called the "Ottawa Convention."

  • But the great powers didn't sign. And more than 20 years later,  

  • the US, Russia and China still  haven't joined the Ottawa Convention.

  • It's a dilemma, isn't it? So you can  do away with the rule of consensus  

  • and then you can have results  quickly, but they will not  

  • have near universal support; at the very least, they will not have support from the countries that  

  • are developing these capabilities. But through  the rule of consensus, you force those countries  

  • to engage. So I think it's a choice that the  international community makes in these forums.

  • So the world doesn't agree on what to do about  autonomous weapons. And it can't even agree on  

  • HOW to agree on what to do about them. In this  situation, is there any prospect of a solution?

  • In the end we may end up with rules or norms  or indeed agreements that are more focused  

  • on specific uses and use cases rather  than specific systems or technology. So  

  • where you basically agree, for example, to use  certain capabilities only in a defensive way,  

  • or only against machines rather  than humans or only in certain  

  • contexts. But as you can imagine, first agreeing to that and then  

  • implementing it is just much harder than some of the old arms control agreements.

  • Compounding this is the rock-bottom level  of trust between the major powers right now.  

  • US-Chinese talks in Alaska in early 2021  descended into a bitter round of accusations.

  • When there is lack of trust, you tend to attribute  all kinds of intentions to the other party  

  • and you tend to overestimate what they might  be doing and overshoot in your own response.  

  • Today, frankly, the developments on the technology  front are actually adding to the mistrust.

  • Preventing the kind of cyber disaster  we looked at earlier would REQUIRE the  

  • great powers to cooperate. But as a first step, there are things they could do independently.

  • I think states need to think very carefully  about how their cyber operations could be  

  • misinterpreted, in order to fully analyze  the benefits and risks before conducting  

  • them. I think countries should adopt a rule that  before launching any cyber intrusions against  

  • nuclear command and control, including dual  use assets, including the stuff that's both  

  • nuclear and conventional, that should have to  be signed off on by a Secretary of Defense or  

  • a head of state as a way of ensuring  these things are not routine.

  • Beyond that, the best we could  hope for might be a behavioral norm  

  • agreed between the US, Russia and China –  that they would not launch cyber intrusions  

  • against nuclear command and control systems.

  • The idea would be that if you detected  another state in your network,  

  • the deal was off and you could  go after their network. And so  

  • in this way, you'd hope to enforce this  agreement through mutual deterrence.

  • But remember that problem of entanglement, of  

  • systems involved in nuclear  and non-nuclear operations.

  • That's going to make it potentially very  difficult to define what command and control  

  • assets are included in this kind of pledge  and what command and control assets are  

  • excluded as part of this plan. I mean, you'd have to have some pretty  

  • difficult and sensitive negotiations that  I don't think states are ready for yet.

  • As the US, China and Russia slip deeper  into an era of "great power competition,"  

  • the challenge will be to carve out areas  like this -- where they can put mutual  

  • interest above the visceral drive to be on top. THAT is the spirit of "arms control."

  • You don't make arms control agreements with  your best friends and allies. You always,  

  • by definition, you know, try to negotiate them with your enemies. And this isn't  

  • exactly new… I don't think it's impossible that these players,  

  • which are already opponents and may eventually  become even more adversarial, can come together  

  • and agree on certain minimum requirements  simply because it is in everyone's interests.

  • For Germany's foreign minister, the  whole world has responsibility here.

  • The world must take an interest in preventing a situation with cyber or  

  • autonomous weapons where everyone can do as they please. We don't want that.

  • Climate change serves as an ominous  warning of what can happen when humanity  

  • sees a common threat on the horizon  but FAILS to act in time to stop it.

  • The Rio Summit kicked off the UN's process of talks to tackle climate change way back in 1992.  

  • It took 23 years to get to the Paris Agreement. And it's clear even THAT wasn't enough.  

  • It's already too late to  prevent much of the devastation  

  • that scientists predicted from the start.

  • With the scenarios we've just seen --  the warning signs are just as clear,  

  • and if anything even more urgent.
