The aftermath of Israeli bombing in Rafah. The use by the military of AI systems to target Hamas has come under question. AFP

Israel's AI targeting of Gaza criticised for potential analytical errors


Thomas Harding


Technology experts have warned Israel's military of potential “extreme bias error” in relying on Big Data for targeting people in Gaza while using artificial intelligence programmes.

The Israeli military is reported to be using two AI systems, Gospel and Lavender, to track down Hamas operatives and speed up missile strikes. The systems are controversial, with some suggesting they have contributed to the high number of civilian casualties, with more than 34,800 Palestinians killed.

Big Data, defined as large, diverse sets of information that grow at ever-increasing rates, has become so widespread and powerful with the rise of AI that, if not already, “in the not-too-distant future no one will be able to escape digital surveillance”, Dr Miah Hammond-Errey told a webinar held by the Rusi think tank.

Israel’s use of powerful AI systems has taken its military into territory for advanced warfare between soldiers and machines not previously witnessed at such a scale.

The Lavender system is understood to have processed huge amounts of personal data from Gaza, allowing it to quickly generate profiles of suspected militants, with up to 37,000 Palestinian men linked by the system to Hamas or Palestinian Islamic Jihad.

It is also alleged that Israeli strike operators, using AI, are allowed to kill up to 20 civilians per attack if the target is deemed an appropriate rank.

Unverified reports say the AI systems had “extreme bias error, both in the targeting data that's being used, but then also in the kinetic action”, Dr Hammond-Errey said in response to a question from The National. Extreme bias error can occur when a device is calibrated incorrectly, so it miscalculates measurements.

The AI expert and director of emerging technology at the University of Sydney suggested broad data sets “that are highly personal and commercial” mean that armed forces “don't actually have the capacity to verify” targets and that was potentially “one contributing factor to such large errors”.

She said it would take “a long time for us to really get access to this information”, if ever, “to assess some of the technical realities of the situation”, as the fighting in Gaza continues.

The first phase of Israeli operations in Gaza in November

Prof Sir David Omand, former head of Britain’s GCHQ surveillance centre, urged against “jumping to conclusions” over Israel’s AI use, as its military had not given independent access to its system.

“We just have to be a bit careful before assuming these almost supernatural powers to large data sets on what has been going on in Gaza, and just remember that human beings are setting the rules of engagement,” he said.

“If things are going wrong, it’s because human beings have the wrong rules, not because the machines are malfunctioning.”

Israel’s use of Lavender and Gospel “would likely form a test case for how the international community and tech companies respond to the use of AI”, she said.

Dr Hammond-Errey, author of Big Data, Emerging Technologies and Intelligence, argues that for national security agencies the “Big Data landscape offers the potential for this invasive targeting and surveillance of individuals”, not only by states but others not governed by rules.

“If we aren't there already, in the not-too-distant future no one will be able to escape digital surveillance.”

Big Data could give armies “military dominance”, as it offers “imperfect global situational awareness but on a scale previously not considered”, especially when connected to space targeting systems.

Aligned with AI, Big Data can compile “comprehensive profiles” of people, institutions, political groups and nation states that “can be made remotely and very quickly”.

Dr Hammond-Errey also warned that Big Data had been used around the world to target individuals and specific groups, exploiting “individual psychological weaknesses” as well as interfering with elections.


Updated: May 12, 2024, 7:57 AM