How AI tells Israel who to bomb

AI is supposed to help militaries make precise strikes. Is that the case in Gaza?

Israel's war with Hamas, launched in response to the attacks of October 7, 2023, has caused more fatalities than any previous Israeli war: at least 34,000 Palestinians killed as of May 7, 2024, compared with just over 1,400 in Israel's 2014 war in Gaza. One factor in that difference is the use of artificial intelligence.

Israel’s use of AI in warfare has been public for years, in both defensive and offensive weapons. But in this war, AI is being deployed differently: it’s generating bombing targets. The promise of AI in a military context is to make strikes more precise and accurate, but over the past few months the Israeli outlets +972 Magazine and Local Call have revealed that the multiple AI systems helping the IDF select targets in Gaza have contributed to an unprecedented toll of Palestinian civilian deaths and injuries.

In our video, we interview multiple experts to understand how two specific systems, Gospel and Lavender, operate, and we explore the broader implications of current and future AI use in warfare.
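
According to the +972 Magazine and Local Call reporting linked below, Lavender works by assigning each person a score for suspected militant affiliation, with a human-chosen threshold deciding who becomes a candidate target. The sketch below is a purely illustrative Python rendering of that threshold logic, not the actual system; every name, field, and number in it is hypothetical.

from dataclasses import dataclass

@dataclass
class Person:
    person_id: str
    score: float  # hypothetical model-assigned suspicion score, 0-100

def flag_targets(people: list[Person], threshold: float) -> list[Person]:
    # Everyone at or above the cutoff is flagged as a candidate target.
    # Per the reporting, analysts could move this cutoff to generate
    # more or fewer targets from the same underlying scores.
    return [p for p in people if p.score >= threshold]

population = [Person("A", 92.0), Person("B", 61.5), Person("C", 34.0)]
print([p.person_id for p in flag_targets(population, threshold=90)])  # ['A']
print([p.person_id for p in flag_targets(population, threshold=30)])  # ['A', 'B', 'C']

The point of the sketch: the model supplies the scores, but the size of the target list, and therefore the civilian exposure, is set entirely by where humans place the cutoff.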

Further reading:

‘A mass assassination factory’: Inside Israel’s calculated bombing of Gaza, by Yuval Abraham

‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza, by Yuval Abraham

‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets, by Bethan McKernan and Harry Davies

Israel under pressure to justify its use of AI in Gaza, by Joseph Gedeon and Maggie Miller

Israel is using an AI system to find targets in Gaza. Experts say it’s just the start, by Geoff Brumfiel

Israel’s AI Revolution: From Innovation to Occupation, by Anwar Mhajne

Comments

One of the most dystopian video titles I think I've ever read

zacky

If we allow "AI" to be an excuse here, we allow "a coin flip" to be an excuse. AI did not tell Israel who to bomb. Israel asked AI who to bomb, and told it how to decide who to bomb.

TrogdorBurninor

Using thousands of civilian lives as training data for war is the most messed-up thing.

saemstunes

How insanely irresponsible to report on the use of "power targets" and NOT mention that this is a blatant war crime. Targeting civilian housing with the intent to "put pressure" on a militant group that is not actively operating there is a war crime. It's also the definition of terrorism.

LithiumBlock

"civil pressure" so terrorism, that's literally what terrorism is.

abdelrahmana.abdelgawad

The irony of part of the AI bombing system being called "The Gospel" speaks volumes

liftgaming

People keep saying “this is so dystopian” when the reality is, we are already there. We just don’t realize it yet.

whatupinvaders

"civil pressure" I love how everything they do is sugar coated. When Russia does it it's a war crime, when they do it, it's "applying civil pressure" somehow.

mmk

"auto aim from call of duty", "permissable civilian casualities", "civil pressure" oh my god dude

machariawanjagi

This isn't the world anyone should have to live in.

crisp-waffle

So now they can commit genocide and blame it on AI?

evil_morty_

This feels more and more surreal. You think it can't possibly get more dystopian, and then it does.

yazzmaniac

This is clearly terrorism, not civil pressure. Isn't civil pressure a type of terrorism?

saadbhai

Ever wonder why they didn't ask the AI engine to exclude children and women?

korayem

It's so insanely dystopian how we're using AI to decide who dies and who doesn't. This is something out of a movie, not something we should be seeing in real life.

wingsofkuiper

Considering one of the AI tools used alongside Lavender is named "Where's Daddy?", I don't think their programmers are too concerned with preserving life.

wenerjy

You know it's bad when you hear the words "auto aim from Call of Duty."

goldking

Just Israel being Israel. It's been this blatant for a very long time, and I'm glad people are finally paying attention.

RedHair

Obviously the point of the AI isn't to be more accurate or faster, but to shield those in charge from individual culpability.

GallileoPaballa

“Permissible civilian casualties” is not a phrase any military should ever use, under any circumstances. Full stop.

Adrian-uqyb