AI Impersonation
Description
This week we talk about robo-Biden, fake Swift images, and ElevenLabs. We also discuss copyright, AI George Carlin, and deepfakes.

Recommended Book: Debt: The First 5,000 Years by David Graeber

Transcript

The hosts of a podcast called Dudesy are facing a lawsuit after they made a video that seems to show the late comedian George Carlin performing a new routine. The duo claimed they created the video using AI tools, training an algorithm on five decades' worth of Carlin's material in order to generate a likeness of his face and body and voice, and his jokes; they claimed everything in this video, which they called "George Carlin: I'm Glad I'm Dead," was the product of AI tools.

The lawsuit was filed by Carlin's estate, which alleges the hosts infringed on the estate's copyright in Carlin's works, and that the hosts illegally made use of and profited from his name and likeness. The estate asked that the judge force the Dudesy hosts to pull and destroy the video and its associated audio, and to prevent them from using Carlin's works, likeness, and name in the future.

After the lawsuit was announced, a spokesperson for Dudesy backtracked on prior claims, saying that the writing in the faux-Carlin routine wasn't written by AI, it was written by one of the human hosts, and thus the claim of copyright violation wasn't legit: while the jokes may have been inspired by Carlin's work, they weren't generated by software that used his work as raw training materials, as they originally claimed, which arguably could have represented an act of copyright violation.

This is an interesting case in part because if the podcasters who created this fake Carlin and fake Carlin routine were to be successfully sued for the use of Carlin's likeness and name, but not for copyright issues related to his work, that would suggest that the main danger faced by AI companies that are gobbling up intellectual property left and right, scraping books and the web and all sorts of video and audio services for raw training materials, is the way in which they're acquiring and using this media, not the use of the media itself. If they could somehow claim their models are inspired by these existing writings and recordings and such, they could then lean on the same argument that their work is basically the same as an author reading a bunch of other authors' books and then writing their own book, which is inspired by those other works, but not, typically anyway, infringing in any legal sense.

The caveat offered by the AI used to impersonate Carlin at the beginning of the show is interesting, too, as it said, outright, that it's not Carlin and that it's merely impersonating him, like a human comedian doing their best impression of Carlin. In practice, that means listening to all of Carlin's material and mimicking his voice and cadence and inflections and the way he tells stories and builds up to punchlines and everything else; if a human performer were doing an impression of Carlin, they would basically do the same thing, they just probably wouldn't do it as seamlessly as a modern AI system capable of producing jokes and generating images and videos and audio can manage.

This raises the question, then, of whether there would be an issue if this AI comedy set wasn't claiming to feature George Carlin: what if they had said it was a show featuring Porge Narlin, instead? Or Fred Robertson? Where is the line drawn, and to what degree does the legal concept of Fair Use, in the US at least, come into play here?
What I'd like to talk about today are a few other examples of AI-based imitation that have been in the news lately, and the implications they may have, legally and culturally, and in some cases psychologically, as well.

—

There's a tech startup called ElevenLabs that's generally considered to be one of the bigger players in the world of AI-based text-to-voice capabilities, including the capacity to mimic a real person's voice. What that means in practice is th