AI expert Dominic Lees said Morgan Neville’s decision to recreate Anthony Bourdain’s voice with the help of artificial intelligence in the documentary Roadrunner was “absolutely horrifying”.
Bourdain died by suicide in 2018.
“Neville wanted this film to be a tribute, almost a memorial. The film is full of good intentions,” Lees added. “He wanted it to look as if Bourdain were speaking from beyond the grave, like in ‘Sunset Boulevard.’”
In the film, the artist David Choe reads an email that the celebrity chef wrote to him shortly before his death; only then does the AI-generated “Bourdain” voice take over.
“Audiences began to sense something was wrong. Who would record themselves reading an email?! Neville then admitted, ‘Okay, I’ll own up to it now. I used a voice clone.’ But he never told the audience that in the film. He also told critics, ‘We used this technique in three other places in the film, but I won’t reveal where.’”
Speaking at the Jihlava International Documentary Film Festival’s conference on ethics in documentary filmmaking, Lees said the damage to Neville’s reputation was “very serious”.
“Personally, when I watch another Morgan Neville documentary, I can’t help but think, ‘They’ve deliberately fooled me.’ The trust between me, the audience, and you, the filmmaker, has been completely destroyed.”
He argued that a lack of transparency in the use of AI could “destroy” relationships with viewers. Another issue is consent.
“Amid the ‘Roadrunner’ controversy, Neville claimed, ‘I asked [Bourdain’s] ex-wife and she said it was cool.’ And she said, ‘No, I didn’t.’ That damaged his credibility even further.”
In his keynote at the Czech festival, Lees also discussed the AI-focused production company Particle 6. The company behind the controversial AI actress Tilly Norwood is also turning its attention to historical documentaries.
“They said Tilly Norwood would be picked up by a Hollywood agent and cast in a movie, but they also see documentaries, starting with historical reconstruction, as a major part of their future business,” he said, showing the company’s showreel and describing their efforts as “comical.”
“The ancient Egyptian slaves are wearing bright white towels, as if they had just come out of a sauna. This is how Particle 6 wants to appeal to documentary filmmakers working in television and historical documentaries.” Many such companies publish an ethics statement to “create confidence,” he said, citing Jenny, another company specializing in documentary and historical-reconstruction work.
“They have a ‘Generational Commandment,’ a public-facing code of ethics that ‘promotes historical accuracy, transparency, fairness of representation, and respect for authentic human likeness.’ But we should be very wary when companies bring ethics to the fore as part of their promotion.”
Returning to the topic of voice cloning, Lees mentioned Endurance, a National Geographic documentary about Ernest Shackleton’s 1914-1917 Antarctic expedition. Its makers decided to recreate his voice and added a disclaimer to the end credits.
“Producer Ruth Johnston had to think about her relationship with the audience: ‘I’ve used AI in my film. What do I do?’ She makes it clear that these voices were recreated using artificial intelligence,” Lees said, reiterating that “transparency is going to be one of the absolutely central ethical issues around the use of generative AI by documentary filmmakers.”
However, recreating voices can be difficult, especially for historical figures. Hitler appears in Jan Rewinkel’s short film “History Tells Us Nothing”, which was screened at the festival.
“Once a deepfake AI human avatar or voice clone is created, it becomes a permanent asset that will exist forever. And who will control it? Documentary filmmakers think they have nothing to worry about because they have finished their film. But now there is a fine-tuned model of Adolf Hitler’s voice out there. Anyone can use it. Ethics extends beyond the filmmakers themselves to the technology as a whole.”
Felix Moeller faced a similar dilemma in Jud Süss 2.0: From Nazi Propaganda to Online Anti-Semitism, and ultimately decided not to recreate Henry Ford’s voice.
“He was one of the richest men in the world in the 1920s and ’30s, and he was an anti-Semite, which has some parallels to today. He clung firmly to that fabricated document, the Protocols of the Elders of Zion,” Lees explained.
“They asked themselves, ‘Could we have Henry Ford read his own anti-Semitic pamphlet?’ It is absolutely vile, but as a filmmaker it’s important to understand how vile anti-Semitism was in America at the time. They showed it to me, but I had concerns for a number of reasons: obviously, anti-Semites could extract this and use it for anti-Semitic propaganda in the 21st century.
“This is an important thing for documentary makers to consider. I have the opportunity to use generative AI, which could be a beautiful way to supplement the creative impact of my film, but I can also decide at any time that this is not the right thing to do.”
He praised examples of AI in documentaries such as Welcome to Chechnya, where deepfakes obscured the identities of lesbian and gay activists who would otherwise have been murdered. “In the age of AI, audiences are increasingly aware that what they see on screen may not be real. Authenticity has been an issue from the beginning of documentary filmmaking, but now we have a much deeper problem.”
