Ever since film was invented, directors have used special effects to make the improbable seem real. As viewers, we can immerse ourselves in their stories and marvel at their ingenuity. But what happens when the film-maker's ability to distort reality becomes widely and cheaply available, to the extent that objects or people can be inserted or removed from videos in a completely plausible way?
Earlier this month, at an event run by software firm Adobe, a tool was unveiled that can remove objects from moving video footage without a trace. As Project Cloak spirited away a street lamp from the front of a cathedral to offer a better view, the crowd whooped in amazement – but while the footage was made more aesthetically pleasing, reality was subtly altered. The demonstration raised the question of whether advanced editing tools, CGI and artificial intelligence could lead us to a place where it is simply impossible to tell truth from fantasy.
It is already far from straightforward. As we have gained access to clever digital tools and the mass-distribution possibilities of the internet, cries of "fake" have become ever louder. They are levelled at everything from footage of magic tricks to claims of attendance at protests; from newspaper headlines to photos of terrorist atrocities – and that scepticism tends to fall in line with the sceptic's preconceived beliefs. Doubts over what is true or false have created a situation where the unexpected is routinely questioned, and the illicit manipulation of audio-visual material is often blamed (usually wrongly) for misleading people.
Image manipulation has been around since photo retouching was invented in the 19th century, but such processes tend to leave telltale signs in their wake. The new generation of AI-driven tools promises to leave none. Also unveiled at the Adobe event was Scene Stitch, a means by which parts of an image could be removed and the gaps realistically and seamlessly filled.
Photoshop's "content-aware fill" feature can already fill gaps with pixels taken from elsewhere in that same image, but Scene Stitch takes things a step further – it uses AI techniques to search whole libraries of images to select something appropriate and drop it into the scene. We no longer have to do the work of image manipulation; machine learning does it for us.
In a similar vein, Google has been working on ways to “zoom in” on low-resolution images – the “enhance” trick long derided as a sci-fi myth, now made possible by AI: if you want a closer, more detailed look at a photo, the computer can guess what that detail might look like, even though it was never captured in the first place. Famous photographic hoaxes of yesteryear, such as the Loch Ness Monster (1934) and the Cottingley Fairies (1917), used techniques that are laughably primitive in comparison, but as we have become more sceptical, ways of fooling the eye have become more sophisticated.
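How can a computer add detail that was never there? By training a neural network on vast numbers of matched low- and high-resolution photos, so that it learns what plausible detail usually looks like. The sketch below, in PyTorch, is a minimal and untrained version of one early academic architecture, SRCNN (Dong et al., 2014); it is purely illustrative and is not Google's actual system.

```python
# A minimal, untrained SRCNN-style super-resolution network in
# PyTorch. Illustrative of the general "zoom and enhance" idea only;
# production systems differ considerably in architecture and training.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SRCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.extract = nn.Conv2d(3, 64, kernel_size=9, padding=4)  # patch features
        self.map     = nn.Conv2d(64, 32, kernel_size=1)            # non-linear mapping
        self.rebuild = nn.Conv2d(32, 3, kernel_size=5, padding=2)  # reconstruction

    def forward(self, low_res, scale=4):
        # Upsample first, then let the network "guess" plausible detail
        # that was never in the original pixels.
        x = F.interpolate(low_res, scale_factor=scale, mode="bicubic",
                          align_corners=False)
        x = F.relu(self.extract(x))
        x = F.relu(self.map(x))
        return self.rebuild(x)

model = SRCNN()                            # weights would come from training
guessed = model(torch.rand(1, 3, 64, 64))  # a 64x64 crop becomes...
print(guessed.shape)                       # ...a 256x256 guess at the scene
```

The crucial point sits in that comment: the network does not recover the original detail, it invents a plausible substitute – which is precisely what makes the technique both useful and untrustworthy.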
The ear can be fooled, too. Years ago, laboriously edited tapes caused people to question whether Elvis Presley was really dead (he was), and whether secret recordings of phone conversations between Ronald Reagan and Margaret Thatcher concerning potential nuclear attacks were authentic (they weren’t). But last year, again at an Adobe event, a project called VoCo was unveiled; it could analyse a recording of a speech and reuse its syllables to synthesise new words, allowing whole speeches to be created from scratch and creating the impression that people have said things they never did. If this technique is coupled with video, the results will be even more disorientating.
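Even without VoCo, the old tape-splicing trick now takes a few lines of code. The sketch below uses the open-source pydub audio library to cut words out of a recording and stitch them into a sentence the speaker never said; the file name and word timings are invented for the example, and VoCo's actual method, which synthesises entirely new audio rather than splicing existing clips, goes far beyond this.

```python
# A crude illustration of "cut and splice" speech editing with pydub.
# The laborious tape editing of old, done in a few lines; nothing like
# VoCo's actual synthesis approach. File name and timings are invented.
from pydub import AudioSegment

speech = AudioSegment.from_wav("speech.wav")   # hypothetical recording

# Suppose we have located these word boundaries (in milliseconds).
words = {"i": (1200, 1450), "never": (3000, 3600),
         "said": (5100, 5500), "that": (7200, 7550)}

# Splice the words into a sentence the speaker never uttered.
fake = sum((speech[a:b] for a, b in words.values()), AudioSegment.empty())
fake.export("fabricated.wav", format="wav")
```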
This summer, researchers at the University of Washington used AI to create a fake video of Barack Obama convincingly lip-synching to words taken from an audio recording of one of his speeches; combined with VoCo, this kind of footage could allow the production of videos of public figures saying literally anything we might want them to.
Applied to the field of entertainment, these techniques open up whole worlds of possibility. The apparent "reanimation" of Peter Cushing in the film Rogue One: A Star Wars Story, more than 20 years after his death, raised a number of ethical questions, but gave a glimpse of the way that storytelling might be reinvented in magical new ways. In the context of the cruel world of social media, however, its impact could be damaging. Revenge-porn techniques, where the face of one person is convincingly superimposed onto the body of another in a still photo, would become even more damaging when extended to video. People who are predisposed to cyber-bullying will have new tools at their disposal, a problem that has already been acknowledged by British children's charity the NSPCC. And with advanced voice synthesis, new opportunities may open up to fraudsters, as scam phone calls become truly sophisticated and disorientating.
The effect these tools may have on trust in the media is hard to imagine. Politicians and their supporters already dismiss reports they disagree with as “fake news”, but in recent years there has been an upswing in genuinely fake news, fabricated purely for clicks – celebrity deaths, health scares and so on. The internet’s premier debunking resource, Snopes, already has a section devoted to such websites, but its workload will only grow as fake stories are backed up by fake video and audio. Such fabrication is also a gift for governments predisposed towards misinformation and propaganda – convincing fake audio-visual material could easily be deployed to sway election results and stoke unrest.
Optimists say that these technological developments will lead to a situation where everyone treats new information with healthy scepticism. But the famous adage that "a lie will go around the world while truth is pulling its boots on" is more applicable today than ever. In an interview with Business Insider, Gregory C Allen of the Center for a New American Security said that while our current technology allows us – just about – to determine truth, "we cannot rely on this technological balance of truth favouring truth forever". People and organisations may, in the future, try very hard to prove the authenticity of their audio-visual media, but at the point when hoaxes become impervious to forensic analysis, there may be no way of knowing what is fact and what is fiction.