Some Israeli soldiers have alleged large numbers of civilian homes in Gaza were marked for destruction by a computer system. EPA


Israel, Gaza and AI machines - is this the automation of war crimes?



April 05, 2024

For the past decade, side rooms in international law conferences have hosted panel discussions on the introduction of AI software into military toolkits. The use of AI-powered drones in Afghanistan, Pakistan and elsewhere has led to campaigns to ban “killer robots”. All of this was premised on the idea that keeping human decision-making in the loop – even as technology makes warfare easier – allows a soldier with moral awareness to ensure that human ethics and international law are still observed.

An explosive investigation released on Wednesday by +972 Magazine, an Israeli publication, may come to upend those discussions for years to come. The report, based on interviews with six anonymous Israeli soldiers and intelligence officials, alleges the Israeli military has used AI software to carry out killings of not only suspected militants but also civilians in Gaza on a scale so grand, so purposeful, that it would throw any Israeli army claim of adherence to international law out the window.

Among the most shocking elements of the allegations is that the war has not been delegated entirely to AI. Instead there has been plenty of human decision-making involved. But the human decisions were to maximise killing and minimise the “bottleneck” of ethics and the law.

To summarise the allegations briefly, the Israeli army has reportedly made use of an in-house AI-based programme called Lavender to identify possible Hamas and Palestinian Islamic Jihad (PIJ) militants from within the Gazan population, and mark them as targets for Israeli air force bombers. In the early weeks of the war, when Palestinian casualties were at their highest, the military “almost completely relied on Lavender”, with the army giving “sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based”.

The raw intelligence data consisted of a number of parameters drawn from Israel’s vast surveillance system in Gaza – including a person’s age, sex, mobile phone usage patterns, patterns of movement, which WhatsApp groups they are in, known contacts and addresses, and others – to collate a rating from 1 to 100 determining the likelihood of the target being a militant. The characteristics of known Hamas and PIJ militants were fed into Lavender to train the software, which would then look for the same characteristics within Gaza’s general population to help build the rating. A high rating would render someone a target for assassination – with the threshold determined by senior officers.
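To make that mechanism concrete, below is a minimal, purely illustrative sketch of a feature-weighted score with an adjustable cut-off, which is the general technique the report describes. Every feature name, weight and profile in it is hypothetical; the report does not disclose how Lavender actually computes its rating.

```python
# Illustrative sketch only: a generic score-and-threshold classifier with
# entirely hypothetical features, weights and people. It is not Lavender or
# any real system; it only shows how a composite rating and an adjustable
# threshold determine who gets flagged.

from dataclasses import dataclass

@dataclass
class Profile:
    name: str
    features: dict  # hypothetical feature name -> signal strength (0.0-1.0)

# Hypothetical weights a model might assign after training on known examples.
WEIGHTS = {
    "contact_overlap": 40,   # shares contacts with known members
    "movement_pattern": 30,  # movement resembles known members
    "group_membership": 20,  # messaging-group overlap
    "device_behaviour": 10,  # phone-usage similarity
}

def rating(profile: Profile) -> float:
    """Collate the weighted signals into a single 0-100 score."""
    return sum(WEIGHTS[f] * profile.features.get(f, 0.0) for f in WEIGHTS)

def flagged(profiles: list[Profile], threshold: float) -> list[str]:
    """Return everyone whose score meets or exceeds the threshold."""
    return [p.name for p in profiles if rating(p) >= threshold]

population = [
    Profile("A", {"contact_overlap": 0.9, "movement_pattern": 0.8,
                  "group_membership": 0.9, "device_behaviour": 0.7}),  # score 85
    Profile("B", {"contact_overlap": 0.4, "group_membership": 0.6}),   # score 28
    Profile("C", {"device_behaviour": 0.5}),  # score 5: weak, incidental overlap
]

# A high threshold flags only the strongest match; lowering it sweeps in
# people who satisfy only a few of the criteria.
print(flagged(population, threshold=80))  # ['A']
print(flagged(population, threshold=25))  # ['A', 'B']
print(flagged(population, threshold=5))   # ['A', 'B', 'C']
```

The point of the sketch is the last three lines: nothing about the underlying data changes, yet each time the cut-off is lowered, people who match fewer of the criteria are swept into the flagged set. That dynamic sits at the heart of the allegations that follow.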

Four allegations, in particular, stand out because of their dire implications in international law.

First, Lavender was allegedly used primarily to target suspected “junior” (ie, low-ranking) militants.

Second, human checks were minimal, with one officer estimating them to last about 20 seconds per target, and mostly just to confirm whether the target was male (Hamas and PIJ do not have women in their ranks).

Third, a policy was apparently in place to try to bomb junior targets in their family homes, even if their civilian family members were present, using a system called “Where’s Daddy?” that would alert the military when the target reached the house. The name of the software is particularly malicious, as it implies the vulnerability of a target’s children as collateral damage. +972’s report notes that so-called dumb bombs, as opposed to precision weapons, were used in these strikes in spite of the fact that they cause more collateral damage, because precision weapons are too expensive to “waste” on such people.

And finally, the threshold for who was considered by the software to be a militant was toggled to cater to “a constant push to generate more targets for assassination”. In other words, if Lavender was not generating enough targets, the rating threshold was allegedly lowered to draw more Gazans – perhaps someone who fulfilled only a few of the criteria – into the kill net.

Every time an army seeks to kill someone, customary international law of armed conflict (that is, the established, legally binding practice of what is and is not acceptable in war) applies two tests. The first is distinction – that is, you have to discriminate between what is a civilian and a military target. The second is precaution – you have to take every feasible measure to avoid causing civilian death.

Israeli Air Force bombers allegedly dropped cheaper, less discriminate bombs on lower-ranking Hamas militants' homes. EPA

That does not mean armies are prohibited from ever killing civilians. They are allowed to do so where necessary and unavoidable, in accordance with a principle called “proportionality”.

The exact number of civilians who may be killed in a given military action has never been defined (and any military lawyer would tell you it would be naïve to attempt to do so). But the guiding principle has always, understandably, been to minimise casualties. The greatest number of justifiable civilian deaths is afforded to efforts to kill the highest-value targets, with the number decreasing as the target becomes less important. The general understanding – including within the Israeli military’s own stated procedures – is that killing a foot soldier is not worth a single civilian life.

But the Israeli military’s use of Lavender, allegedly, worked in many respects the other way around. In the first weeks of the war, the military’s international law department pre-authorised the deaths of up to 15 civilians, even children, to eliminate any target marked by the AI software – a number that would have been unprecedented in Israeli operational procedure. One officer says the number was toggled up and down over time – up when commanders felt that not enough targets were being hit, and down when there was pressure (presumably from the US) to minimise civilian casualties.


Again, the guiding principle of proportionality is to trend towards zero civilian deaths, based on target value – not to modulate the number of acceptable civilian deaths in order to hit a certain quantity of targets.

The notion that junior militants were targeted specifically in their homes with mass-casualty weapons (allegedly because this was the method most compatible with the way Israel’s surveillance system in Gaza operates) is particularly egregious. If true, it would be evidence that Israel’s military not only ignored the possibility of civilian casualties, but actually institutionalised killing civilians alongside junior militants in its standard operating procedures.

The way in which Lavender was allegedly used also fails the distinction test and international law’s ban on “indiscriminate attacks” on multiple fronts. An indiscriminate attack, as defined in customary law, includes any that is “not directed at a specific military objective” or employs a method or means of combat “of a nature to strike military objectives and civilians … without distinction”.

The +972 report paints a vivid picture of a programme that tramples over these rules. This includes not only the use of the “Where’s Daddy?” system to intentionally enmesh civilian homes into kill zones and subsequently drop dumb bombs on them, but also the occasional toggling down of the ratings threshold specifically to render the killing less discriminate. Two of the report’s sources allege that Lavender was partly trained on data collected from Gaza public sector employees – such as civil defence workers like police, fire and rescue personnel – increasing the likelihood of a civilian being given a higher rating.

On top of that, the sources allege that before Lavender was deployed, its accuracy in identifying anyone who actually matched the parameters given to it was only 90 per cent; one in 10 people marked did not fit the criteria at all. That was considered an acceptable margin of error.

The normal mitigation for that kind of margin goes back to human decision-making; you would expect humans to double-check the target list and ensure that the 10 per cent becomes 0 per cent, or at least as close to that as possible. But the allegation that soldiers routinely only conducted brief checks – mainly to ascertain whether the target was male – would show that not to have been the case.

If human soldiers can kill civilians, either intentionally or through error, and machines can kill civilians through margins of error, then does the distinction matter?

In theory, the use of AI software in targeting should be a valuable asset in minimising civilian loss of life. One of the soldiers +972 interviewed sums up the rationale neatly: “I have much more trust in a statistical mechanism than a soldier who lost a friend two days ago.” Human beings can kill for emotional reasons, potentially with a much higher margin of error as a result. The idea of a drone or radio operator directing an attack from an operations room after having verified the data ought to provide some comfort.

But one of the most alarming aspects of delegating so much of the target incrimination and selection process to machines, many would argue, is not the number of civilians who could be killed. It’s the questions of accountability afterwards and the incentives that derive from that. A soldier who fires indiscriminately can be investigated and tried, the motivation for his or her actions ascertained and the lessons of those actions learnt. Indiscriminate killing by humans is seen as a bug in the system, to be rooted out – even if the mission to do so at a time of war seems like a Sisyphean task.

A machine’s margin of error, on the other hand, is not ideal – but when it is perceived by operators as preferable to human mistakes, it isn’t treated as a bug. It becomes a feature. And that can create an incentive to trust the machine, and to abdicate human responsibility for error minimisation – precisely the opposite of what the laws of war intend. The testimonies of the Israeli officers to +972 provide a perfect illustration of an operational culture built on those perverse incentives.

That would be the charitable interpretation. The less charitable one is an operational culture in which the human decision-makers’ goal was to kill at scale, and parameters that superficially catered to ethics and the law were bent to fit the shape of that goal.

The question of which of those cultures is more terrifying is a subjective one. Less subjective would be the criminality that gives rise to both of them.

Updated: April 05, 2024, 8:08 AM