By Michael Adesanya
There is much concern about deepfakes: artificial intelligence-generated images, video or audio of leading political figures designed to mislead viewers and listeners.
There has been a steady supply of deepfake examples from around the world, including those focused on the economy, agriculture and sports.
Here are the visual elements to look out for to tell whether the video you are watching is fake or not.
Oddness around the mouth or chin
In deepfake videos, the area around the mouth can be the biggest giveaway. There may be fewer wrinkles in the skin, less detail around the mouth, or a blurry, smudged chin. Poor synchronisation between a person’s voice and mouth can be another sign.
Dr Mhairi Aitken, an ethics fellow at the Alan Turing Institute, the UK’s national institute for AI, says the first giveaway for the Minecraft deepfakes is, of course, “the ridiculousness of the situation”. But another sign of AI-generated media or manipulation is imperfect sync between voice and mouth.
Another tell, says Aitken, is whether shadows fall in the right place or whether lines and wrinkles on a face move when you would expect them to move.
Ardi Janjeva, a research associate at the institute, adds that the low resolution throughout the video is another obvious sign that people should notice because it “immediately resembles something that is patched together”. He says people are familiar with this amateur approach because of the prevalence of “rudimentary, low-res scam email attempts”.
This lo-fi approach then manifests itself in obvious areas such as the mouth and jawline, he says. “It shows in facial features like the mouth, which viewers tend to focus their attention on, where there is excessive blurring and smudging.”
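The excessive blurring and smudging Janjeva describes can be quantified. One classic sharpness measure is the variance of the Laplacian: blurry regions score much lower than detailed ones. The sketch below compares a detailed patch with an artificially smudged one; the synthetic patches are illustrative assumptions, and a real check would crop the mouth region from actual video frames.

```python
import numpy as np

def laplacian_variance(gray):
    """Variance of the Laplacian: a classic sharpness score.
    Lower values suggest a blurrier (possibly smudged) region."""
    # 3x3 Laplacian computed via shifted-array sums (no OpenCV needed)
    lap = (-4 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(lap.var())

def box_blur(gray, k=5):
    """Crude box blur to simulate the smudging seen around deepfake mouths."""
    pad = k // 2
    padded = np.pad(gray, pad, mode="edge")
    out = np.zeros_like(gray, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + gray.shape[0], dx:dx + gray.shape[1]]
    return out / (k * k)

rng = np.random.default_rng(0)
sharp_patch = rng.random((64, 64))          # stand-in for a detailed skin region
smudged_patch = box_blur(sharp_patch, k=7)  # stand-in for a smudged mouth area

# The smudged region scores far lower than the sharp one.
print(laplacian_variance(sharp_patch) > laplacian_variance(smudged_patch))  # True
```

In practice there is no universal threshold: detectors compare the mouth region’s score against the rest of the face, since lighting and camera quality shift the absolute numbers.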
Strange elements of speech
In many deepfake videos, the voice and mouth are out of sync and the lower facial area is blurred. In one widely circulated clip, the use of “pounds” before a number indicates that a text-to-audio tool has probably been used to recreate the Labour leader Keir Starmer’s voice, she adds.
“This is likely an indication that a tool has been used to convert written words into speech, without checking this reflects typical spoken word patterns,” she says. “There are also some clues in the intonation. This maintains a fairly monotone rhythm and pattern throughout. To check the veracity of a video it’s a good idea to compare the voice, mannerisms and expressions with real recordings of the individual to see whether they are consistent.”
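The monotone rhythm Aitken describes can be checked roughly in code. Given a fundamental-frequency (pitch) contour from any pitch tracker, a very low spread relative to the mean suggests flat, TTS-like delivery. The 5% threshold and synthetic contours below are illustrative assumptions for a sketch, not calibrated values from a real detector.

```python
import numpy as np

def is_monotone(f0_hz, threshold=0.05):
    """Flag a pitch track as suspiciously flat.
    f0_hz: per-frame fundamental-frequency estimates (from any pitch tracker).
    A coefficient of variation below ~5% suggests monotone, TTS-like delivery;
    the threshold here is an illustrative assumption."""
    f0 = np.asarray(f0_hz, dtype=float)
    f0 = f0[f0 > 0]                      # drop unvoiced frames
    return f0.std() / f0.mean() < threshold

rng = np.random.default_rng(1)
t = np.linspace(0, 4, 400)
# Natural speech: pitch rises and falls across the utterance
natural = 130 + 30 * np.sin(2 * np.pi * 0.7 * t) + rng.normal(0, 5, t.size)
# Stand-in for monotone synthetic speech: almost no pitch movement
flat = 130 + rng.normal(0, 2, t.size)

print(is_monotone(natural))  # False
print(is_monotone(flat))     # True
```

As with the visual cues, comparing against genuine recordings of the same speaker is more reliable than any fixed cutoff.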
Consistency between the face and body
This deepfake video of the Ukrainian president, Volodymyr Zelenskiy, asking his soldiers to lay down their arms to the Russian military was circulated in March 2022. The head is disproportionately large relative to the body, and the skin tones of the neck and face do not match.
Hany Farid, a professor at the University of California, Berkeley, and a specialist in deepfake detection, says this is an “old-school deepfake”. The immobile body is a giveaway, he says. “The telltale sign in this so-called puppet-master deepfake is that the body below the neck doesn’t move.”
Discontinuity across the video clip
A video, circulated in May 2024, falsely shows the US state department spokesperson, Matthew Miller, justifying Ukrainian military strikes on the Russian city of Belgorod by telling a reporter “there are virtually no civilians left in Belgorod”. The video was tweeted by the Russian embassy in South Africa and then removed, according to a BBC journalist.
The fake video shows the spokesperson’s tie and shirt changing colour partway through the clip.
While this is a relatively conspicuous change, Farid notes that the generative AI landscape is changing rapidly and therefore so are the deepfake pointers. “We also have to always practise good information consumption habits that include a combination of good old common sense and a healthy amount of scepticism when presented with particularly outrageous or unlikely claims,” he says.
Extra fingers, hands, limbs
Look out for a surplus of fingers, legs, arms and odd-looking hands in still images that are AI-generated.
A picture purportedly showing the US president, Joe Biden, and the vice-president, Kamala Harris, celebrating Donald Trump’s indictment was circulated on Twitter in April 2023.
A manipulated photo of Kamala Harris and Joe Biden hugging in the Oval Office.
Signs that it could have been generated by AI include Kamala Harris’s right hand having six fingers. The top of the flag is distorted and the pattern on the floor is also awry.
The AI team at Reality Defender, a deepfake detection firm, says prompts typed into image-generating tools can focus on people – often the names of well-known individuals – which result in outputs emphasising faces. As a result the artifice is often revealed in other details such as hands or physical backgrounds, as with the Biden-Harris image.
“Usually the prompts to create such images place higher emphasis on the people featured in them – particularly the faces,” the Reality Defender team explains. “Thus, the outputs often create credible human faces with higher-frequency details, while deprioritising the physical consistency of the background (or, in many cases, other parts of bodies, like hands).”
However, Reality Defender, which uses deepfake detection tools, says the increasing complexity of generative AI programs means that manual scrutiny of deepfakes is becoming “decidedly less reliable”.
Mangled letters and numbers
AI image generators have difficulty reproducing numbers and text. This fake Trump mugshot, published in April 2023, was made with such a tool. In the background, instead of a height chart, there is a combination of nonsensical numbers and letters.
“The numbers and text in the background are a giveaway,” says Aitken. “AI image generators really struggle with producing text or numbers. They don’t have an understanding of the meaning of the symbols they produce so they typically produce garbled or illegible text and numbers. If there are any text or numbers in an image zooming into these can be a really good way of identifying if it might be AI-generated.”
A manipulated mug shot of Donald Trump standing in front of a height chart. There are yellow squares annotating scrambled text.
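Aitken’s observation suggests a crude automated check: once text has been extracted from an image (for example with an OCR tool), tokens that are neither recognisable words nor plausible numbers can be flagged as garbled. The tiny wordlist and patterns below are toy assumptions for illustration, not how professional detectors work.

```python
import re

# Tiny stand-in wordlist; a real check would use a full dictionary
# or the confidence scores an OCR engine reports per token.
KNOWN_WORDS = {"police", "department", "height", "name", "date", "county", "sheriff"}

def looks_garbled(token):
    """Heuristic: flag tokens that are neither known words nor plausible numbers."""
    t = token.lower().strip(".,:;")
    if t in KNOWN_WORDS:
        return False
    if re.fullmatch(r"\d+([.:']\d+)*", t):   # plausible number, time, or height like 6'2
        return False
    # Mixed letters/digits or long consonant runs are typical AI text artefacts
    has_mixed = bool(re.search(r"\d", t)) and bool(re.search(r"[a-z]", t))
    long_consonant_run = bool(re.search(r"[bcdfghjklmnpqrstvwxz]{5,}", t))
    return has_mixed or long_consonant_run

def garbled_ratio(text):
    """Fraction of tokens that look like AI-generated gibberish."""
    tokens = text.split()
    if not tokens:
        return 0.0
    return sum(looks_garbled(t) for t in tokens) / len(tokens)

print(garbled_ratio("POLICE DEPARTMENT HEIGHT 6'2"))   # → 0.0 (plausible booking-photo text)
print(garbled_ratio("PbLIC3 DEPRTMXNT H81GHT 6Z2q"))   # → 1.0 (nonsense letters and numbers)
```

Zooming into text regions by eye, as Aitken suggests, remains the simpler check; a heuristic like this only helps at scale.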
Jerky video edits
Some manipulated images are put together so amateurishly they are very easy to spot. Known as “cheapfakes”, these often use simple video-editing software and other lo-fi techniques.
Just before the Mexican elections, a video of the then presidential candidate Claudia Sheinbaum was edited to show her saying she would close churches if elected. The clip was patched together in a deliberately misleading manner from a video where she was in fact stating: “They are saying, imagine the lie, that we’re going to close down churches.” An alternative background showing satanist symbols was also added in an attempt to make the clip even more damaging.