There have been repeated warnings about the dangers of deepfakes. Now we have two concrete cases. The well-known ZDF presenter Christian Sievers and Tagesschau presenter André Schünke are victims of deepfake abuse.
In a promotional video on Facebook, the fake Sievers promises big profits with new AI-supported investment platforms. With “minimal effort”, German citizens can earn “large sums of money”.
This message and the clip’s otherwise amateurish presentation might make you skeptical, were it not for the deceptively real digital copy speaking with the voice of well-known ZDF presenter Christian Sievers.
Sievers complains on Twitter: “The guy looks like me, sounds like me (almost). But it’s not really me… at all.” He warns: “Beware, bad scam with AI on social media.” He has received no response to the complaint he filed with Facebook, where the video was apparently shared.
Deepfake clues get lost in the social media whirlwind
It takes a closer look to notice that Sievers’ lip movements are out of sync with his words and appear to be animated. But anyone watching the video on a small smartphone screen in Facebook’s low resolution is unlikely to notice such minor inconsistencies.
This is especially true when such videos deliberately target less tech-savvy audiences who are unaware of the new possibilities of AI-based forgery. Sievers’ cloned voice, meanwhile, is convincing: his work as a news anchor has put many hours of his speech online, making it easy to train a suitable voice model.
In mid-August, “Tagesschau” news anchor André Schünke also fell victim to an essentially identical deepfake scam. “It can be frightening what is possible with artificial intelligence these days. And I’m afraid this is just the beginning. It’s hard to tell the difference between original and fake,” Schünke told Bild.
News anchors are particularly well suited for training deepfake AI systems: they are widely known, enjoy a high level of public trust, and large amounts of high-quality audio and video material of them are available online.
The problem with these scams is not that they require AI — they were and are possible without such tools — but that AI dramatically lowers the barrier to entry. With services like ElevenLabs, for example, as little as five minutes of audio is enough to produce a cloned voice with recognizable expression and human-like intonation.