‘Donald Trump’ deepfake addresses democracy conference in Copenhagen

The digitally altered video was played to warn the audience about the danger the AI tool poses to democracy

US President Donald Trump speaks during a luncheon with Republican members of Congress at the White House on June 26, 2018. / AFP / Nicholas Kamm

A deepfake of US President Donald Trump addressed a democracy conference in Copenhagen on Friday to warn about the dangers the artificial intelligence tool can pose to politics and elections.

At a panel event with Facebook’s vice president of global affairs, Nick Clegg, a Skype call with Mr Trump was played to the audience.

In the video, Mr Trump said: “No one loves democracy as much as I do. That’s why God elected me.”

Except it wasn’t the US President, who was at the G20 summit in Osaka on Friday, but a highly realistic, digitally altered video.

Deepfakes, a term coined in 2017, are created by combining and transposing existing images and videos onto source material using artificial intelligence.

Last year, news website BuzzFeed created a deepfake video in which former US President Barack Obama appeared to give a public address complaining about his successor, Mr Trump. In the video, actor Jordan Peele impersonates Mr Obama’s voice while machine learning is used to precisely model how Mr Obama moves his mouth when he speaks, allowing Mr Peele’s words to be put into the synthetic Mr Obama’s mouth.

BuzzFeed stated clearly that the video it created was a fake, intended to raise awareness about how misinformation can be spread using the artificial intelligence tool.

But politicians and the public have continued to be fooled by deepfake videos.

Last month, Mr Trump shared a doctored clip of US House of Representatives Speaker Nancy Pelosi appearing to drunkenly slur her words during a speech.

The video was exposed as a fake, but not before it had been viewed millions of times across Facebook, YouTube and Twitter.

Concerns about the technology were raised again this week after a programmer created an app that used neural networks to turn photographs of women into nude pictures.

The creator of “DeepNude” has since shut down the app after criticism that it could make any woman a potential victim of revenge porn.