Subtitles

  • I miss how my family used to gather

  • at the end of the day.

  • How we used to talk.

  • My home was like a normal home.

  • The simple, daily details that everyone has.

  • Heba lived here, in northern Gaza.

  • Her family evacuated on October 11th, 2023.

  • By February, she learned her home was no longer there.

  • She talked to me from a friend's home, in Rafah, in southern Gaza.

  • We received a picture of our house

  • and we were in shock.

  • We had down there, like,

  • a place where we have trees,

  • we have flowers planted.

  • Heba didn't know exactly why her home had been destroyed.

  • But over the past few months, Israeli journalists

  • have found that much of the destruction in Gaza

  • since the attacks of October 7th

  • has been enabled and often directed by an artificial intelligence system.

  • The promise of AI generally

  • is a promise in two respects.

  • One is swiftness and the second is accuracy.

  • The whole dream of AI is

  • that it would offer these precision strikes.

  • But with over 34,000 Palestinians killed,

  • compared to just over 1,400 in Israel's 2014 war in Gaza,

  • it's clear something different is happening.

  • So what does AI have to do with it?

  • To get some answers, we called a couple of AI experts, reporters

  • and investigative journalists.

  • The Israel Defense Forces’ use of AI is not new.

  • I think that the most famous use of AI

  • by the IDF is, of course, the Iron Dome,

  • which is a defensive system that aims to disrupt

  • the threat of missile attacks.

  • This system is partly what defended Israel

  • against Iran's drone and missile attacks in April 2024.

  • The other one is another homegrown weapon that they have, called

  • the SMASH from Smartshooter,

  • which is an AI precision assault rifle sight

  • that you add on to handheld weapons.

  • And what it does is it uses advanced

  • image-processing algorithms to home in on a target,

  • sort of like an auto-aim in Call of Duty.

  • Another way Israel uses AI is through surveillance

  • of Palestinians in the occupied territories.

  • Every time they pass through one of the hundreds

  • of checkpoints, their movements are being registered.

  • Their facial images and other biometrics

  • are being matched against a database.

  • But we're now learning more about the AI systems

  • that choose bombing targets in Gaza,

  • from two reports in the Israeli publications +972 and Local Call.

  • Gospel is a system that produces bombing targets

  • for specific buildings and structures in Gaza.

  • It does this by working in conjunction with other AI tools.

  • And like any AI system,

  • the first step is the large-scale collection of data.

  • In this case, surveillance and historical data

  • on Palestinian and militant locations in Gaza.

  • The most famous application

  • would be Alchemist,

  • which is a platform that collects data

  • and allows the transfer of data between different departments,

  • before being transferred to another platform, which is called the Fire Factory.

  • The Fire Factory analyzes the data and categorizes it.

  • The generated targets are generally put into one of four categories.

  • First, tactical targets,

  • which usually include armed militant cells, weapons warehouses,

  • launchers and militant headquarters.

  • Then there are underground targets,

  • primarily tunnels under civilian homes.

  • The third category includes the family homes

  • of Hamas or Islamic Jihad operatives.

  • And the last category includes targets that are not obviously military in nature,

  • particularly residential and high-rise buildings with dozens of civilians.

  • The IDF calls these “power targets.”

  • Once the data is organized,

  • it goes through a third layer called the Gospel.

  • The Gospel creates an output

  • which suggests specific possible targets,

  • possible munitions,

  • warnings of possible collateral damage, and so on.

  • This system produces targets in Gaza faster than a human can.

  • And within the first five days of the war,

  • half of all the targets identified were from the power targets category.

  • Multiple sources who spoke to +972 reported that the idea behind power targets

  • is to exert civil pressure on Hamas.

  • Heba’s home was most likely one of the power targets

  • picked up by the Gospel system.

  • Months after the Gospel investigation, +972

  • also surfaced a more opaque and secretive AI system,

  • built for targeting specific people,

  • known as Lavender.

  • As the Israel-Hamas war began,

  • Lavender used historic data and surveillance

  • to generate as many as 37,000 Hamas and Islamic Jihad targets.

  • Sources told +972 that about 10% of

  • those targets were wrong.

  • But even for the supposedly correct 90% of targets,

  • Israel had also expanded the definition

  • of a Hamas operative for the first time.

  • The thing is, Hamas ultimately runs the Gaza Strip.

  • So you have a lot of civil society that interacts with Hamas.

  • Police force, doctors, civil society in general.

  • And so these are the targets that we know that they're looking at.

  • After Lavender used its data to generate these targets,

  • AI would then link the target to a specific family home,

  • and then recommend a weapon for the IDF to use on the target,

  • mostly depending on the ranking of the operative.

  • What we were told is that for low-ranking Hamas militants,

  • the army preferred to use “dumb bombs,”

  • meaning bombs that are not guided, because they are cheaper.

  • So in a strange way, the less of a danger you posed,

  • the less sophisticated the bombs they used,

  • therefore maybe creating more collateral damage.

  • Sources told reporters that for every junior Hamas

  • operative that Lavender marked,

  • it was permissible to kill up to 15 or 20 civilians.

  • But also that for some targets,

  • the number of permissible civilian casualties

  • was as high as 300.

  • [Arabic] More than 50 displaced people were in the building.

  • More than 20 children were in it.

  • AI systems do not produce facts.

  • They only produce predictions,

  • just like a weather forecast or the stock market.

  • The “intelligence” that’s there

  • is completely dependent on the quality, the validity,

  • the understanding of the humans

  • who created the system.

  • In a statement to the Guardian, the IDF “outright rejected”

  • that they had “any policy to kill tens of thousands of people in their homes”

  • and stressed that human analysts must conduct

  • independent examinations before a target is selected.

  • Which brings us to the last step of both of these processes:

  • Human approval.

  • Sources told +972 that the only human supervision protocol in place

  • before bombing the houses of suspected junior militants marked by Lavender

  • was to conduct a single check:

  • Ensuring that the AI-selected target is male

  • rather than female.

  • Experts have been telling us that, essentially, Gaza

  • has become an unwilling test site for future AI technologies.

  • In November 2023,

  • the US released an international framework

  • for the responsible use of AI in war.

  • More than 50 countries have signed it.

  • Israel has not signed on to this framework.

  • So we're in sort of this space

  • where we lack sufficient oversight

  • and accountability for drone warfare,

  • let alone new systems being introduced like Gospel and Lavender.

  • And we're looking at a future, really, where

  • there is going to be more imprecise

  • and biased automation of targeting

  • that makes these civilian casualties much worse.

  • The fallacy is,

  • you know, the premise that faster war fighting is somehow

  • going to lead to global security and peace.

  • I mean, this is just not the path that's going to get us there.

  • And on the contrary,

  • I think a lot of the momentum of these technological initiatives

  • needs to be interrupted, in whatever ways we can.

  • It really breaks my heart that these moments are never going to come back.

  • It's not like I left home and like, for example,

  • I traveled and I know it's there.

  • No, it's not.

  • It's not there anymore.
