Joe Carlsmith on How We Change Our Minds About AI Risk
Description
Joe Carlsmith joins the podcast to discuss how we change our minds about AI risk, gut feelings versus abstract models, and what to do if transformative AI is coming soon. You can read more about Joe's work at https://joecarlsmith.com.

Timestamps:
00:00 Predictable updating on AI risk
07:27 Abstract models versus gut feelings
22:06 How Joe began believing in AI risk
29:06 Is AI risk falsifiable?
35:39 Types of skepticisms about AI risk
44:51 Are we fundamentally confused?
53:35 Becoming alienated from ourselves?
1:00:12 What will change people's minds?
1:12:34 Outline of different futures
1:20:43 Humanity losing touch with reality
1:27:14 Can we understand AI sentience?
1:36:31 Distinguishing real from fake sentience
1:39:54 AI doomer epistemology
1:45:23 AI benchmarks versus real-world AI
1:53:00 AI improving AI research and development
2:01:08 What if transformative AI comes soon?
2:07:21 AI safety if transformative AI comes soon
2:16:52 AI systems interpreting other AI systems
2:19:38 Philosophy and transformative AI

Social Media Links:
➡️ WEBSITE: https://futureoflife.org
➡️ TWITTER: https://twitter.com/FLIxrisk
➡️ INSTAGRAM: https://www.instagram.com/futureoflifeinstitute/
➡️ META: https://www.facebook.com/futureoflifeinstitute
➡️ LINKEDIN: https://www.linkedin.com/company/future-of-life-institute/
More Episodes
Connor Leahy joins the podcast to discuss the motivations of AGI corporations, how modern AI is "grown", the need for a science of intelligence, the effects of AI on work, the radical implications of superintelligence, open-source AI, and what you might be able to do about all of this. Here's...
Published 11/22/24
Suzy Shepherd joins the podcast to discuss her new short film "Writing Doom", which deals with AI risk. We discuss how to use humor in film, how to write concisely, how filmmaking is evolving, in what ways AI is useful for filmmakers, and how we will find meaning in an increasingly automated...
Published 11/08/24