AI Tells Israel who to kill?

#gospel #aithreats #lavender #artificialintelligence #educationalvideo #israelwarnewstoday #palestine #gaza #competitiveexams

How does AI tell Israel who to bomb?

A family lived here in northern Gaza. They evacuated on October 11, 2023; by February, this home was no longer there.

What happened?

Israeli journalists have found that much of the destruction in Gaza was directed by artificial intelligence systems.

With some 34,000 Palestinians killed, compared to just 1,400 in Israel's 2014 war in Gaza, it is clear something different is happening.

So what does AI have to do with it?

For the Israel Defence Forces, the use of AI is not new. They use IRON DOME, the system that partly defended Israel against Iran's missile attack in April 2024.

And also 'SMASH', an AI-assisted precision rifle system.

Another way Israel uses AI is through surveillance of Palestinians in the occupied territories, recording their movements, facial images and other biometrics to be matched against a database.

Now let's look more closely at the AI system that chooses bombing targets in Gaza.

'GOSPEL' is a system that produces bombing targets for specific buildings and structures in Gaza. It does this by working in association with other AI tools.

Like any AI system, the first step is large-scale data collection, in this case surveillance and historical data on Palestinians and military locations in Gaza.

This data is then transferred to another platform called 'Fire Factory', where it is reviewed and categorised.

Once the data is organised, it goes through a third layer, 'The Gospel', which produces an output suggesting specific targets, the weapons to be used, warnings of possible collateral damage and so on.

This system produces targets in Gaza faster than a human can.
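
To make the reported flow easier to follow, here is a purely illustrative Python sketch of the three stages described above, plus the human-approval step mentioned later. Every name, field and function in it is invented for illustration; nothing is publicly known about how these systems are actually implemented.

from dataclasses import dataclass

@dataclass
class Suggestion:
    target_id: str
    weapon: str
    collateral_warning: str

def collect_data() -> list[dict]:
    # Stage 1 (as described above): large-scale collection of surveillance
    # and historical records. Placeholder data only.
    return [{"id": "site-001", "kind": "structure"}]

def fire_factory_categorise(records: list[dict]) -> dict[str, list[dict]]:
    # Stage 2: the 'Fire Factory' layer is described as organising and
    # categorising the collected data before it reaches the targeting layer.
    categories: dict[str, list[dict]] = {}
    for rec in records:
        categories.setdefault(rec["kind"], []).append(rec)
    return categories

def gospel_suggest(categories: dict[str, list[dict]]) -> list[Suggestion]:
    # Stage 3: 'The Gospel' is described as outputting suggested targets,
    # a recommended weapon, and a collateral-damage warning for each one.
    return [
        Suggestion(rec["id"], weapon="unspecified", collateral_warning="possible")
        for rec in categories.get("structure", [])
    ]

def human_approval(suggestions: list[Suggestion]) -> list[Suggestion]:
    # Final step described later in the video: a human signs off on the output.
    # Here it is just a stub that passes everything through.
    return suggestions

if __name__ == "__main__":
    approved = human_approval(gospel_suggest(fire_factory_categorise(collect_data())))
    print(approved)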

And there is an even more opaque and secretive AI system, built for targeting specific people, known as 'LAVENDER'.

As the Israel-Hamas war began, Lavender used historical data and surveillance to generate as many as 37,000 suspected Hamas and Islamic Jihad targets.

Sources say that about 90% of those targets were accurate.

After Lavender generated these targets, the AI would link each target to a specific family home and then recommend a weapon for the Israel Defence Forces to use against it.

The last step in this process is human approval.