It would be a staggering act of deception: an inflatable Iranian tank that tricks the US and Israel into wasting their expensive missiles on a cheap balloon.
Claims on social media that Iran has bought the dummies from China, or deployed them from a fleet of 900,000, are often accompanied by videos of the inflating tank.
The footage in question is an AI fake, a fairly obvious one.
But its wide reach shows how artificial intelligence is adding to the confusion, misinformation and social media theories that surround a war in 2026.
Another AI mock-up, depicting damage to a US naval base in Bahrain, turned out to show “pretty much exactly what happened”, said Jeremy Binnie of defence intelligence company Janes.
The incident “very early on flagged up the possibilities of what could be done here”, he said.
Inflatable tanks
Posts on the inflatable tanks have been shared thousands of times. “The trillion-dollar defence machine is firing multi-million-dollar missiles … at balloons,” said one post. Many claim the equipment was supplied by China.
The videos are clearly fakes. But inflatable tanks do exist. Some are believed to have been used in the war in Ukraine. Dummy tanks were used as long ago as the Second World War, to confuse Nazi Germany before D-Day.
And the rise of easy-to-use AI tools such as Grok, Nano Banana and Sora 2 means anyone with an internet connection can create a fake image or video to illustrate the concept.
Internet access is heavily restricted in Iran, yet AI fakes flourished there during a wave of anti-regime protests in January.
But government accounts that do have internet have been pushing fake narratives online, according to fact-checking website NewsGuard. The state-run Tehran Times posted an AI image of a destroyed American radar in Qatar.
“Even if some of the pictures are AI-manipulated, there are still a lot of very alarming pictures that are not AI-generated,” said David Jalilvand, a consultant who analyses Iran.
“At the global level, if the point is showing that the Iranians and Americans are not securing a quick and swift victory, the Iranians are winning that information battle,” he said.
Team Trump and missing Mojtaba
The Americans can play the AI game too.
Among the voids being filled with AI trickery is the absence from view of Iran's Ayatollah Mojtaba Khamenei. The new supreme leader has not been seen in public since his election.
The situation reminded some people of past occasions when cardboard cutouts of Ayatollah Ruhollah Khomeini, the founder of the Islamic Republic, were paraded around Iran along with effigies of other leaders. Some of those images are genuine.
But AI users have taken things a step further, imagining a cardboard Mojtaba Khamenei being displayed before crowds.
White House deputy chief of staff Dan Scavino clearly found that funny, sharing an AI video of cardboard Mojtabas rolling off a production line.
In addition, “the US government has effectively told satellite imagery providers not to put out imagery of the Middle East related to the conflict. That’s a key verification tool that has been taken away from journalists and analysts,” Mr Binnie said.
“Iranian sources are putting out some satellite imagery. Could this be AI? Could this be manipulated? Enough of that now seems to have been verified to begin to give it some credibility.”
The US is also putting AI to military use. Admiral Brad Cooper, the commander of American forces in the Middle East, said his troops use AI to “analyse large volumes of data within seconds”, while insisting that the final decision is always made by a human.
The National last year revealed how Iran has ambitions to use AI in its armed forces and security apparatus.
Social media sites have hinted at a crackdown. Nikita Bier, the head of product at X, said people who post unlabelled AI content could be blocked from earning money on the site for 90 days. An oversight board for Meta, which owns Facebook and Instagram, urged the company to set new rules on “deceptive AI” during conflicts. Meta said it agreed with the board's findings.
Gulf countries under attack from Iran have repeatedly warned residents to stick to official sources, and not share footage of missile interceptions.