SDS 565: AGI: The Apocalypse Machine
Description
In this episode, Jeremie Harris dives into the stirring topic of AI safety and the existential risks that Artificial General Intelligence poses to humankind.

In this episode you will learn:
• Why mentorship is crucial to data science career development [15:45]
• Canadian vs. American start-up ecosystems [24:18]
• What is Artificial General Intelligence (AGI)? [38:50]
• How Artificial Superintelligence could destroy the world [1:04:00]
• How AGI could prove to be a panacea for humankind and life on the planet [1:27:31]
• How to become an AI safety expert [1:30:07]
• Jeremie's day-to-day work life at Mercurius [1:35:39]

Additional materials: www.superdatascience.com/565
More Episodes
Generative AI is reshaping our world, and Bernard Marr, world-renowned futurist and best-selling author, joins Jon Krohn to guide us through this transformation. In this episode, Bernard shares his insights on how AI is transforming industries, revolutionizing daily life, and addressing global...
Published 04/23/24
What are the risks of AI progressing beyond a point of no return? What do we stand to gain? On this Five-Minute Friday, Jon Krohn talks ‘books’ as he outlines two nonfiction works on AI and futurism by Oxford philosopher Nick Bostrom. Listen to a breakdown of DEEP UTOPIA and SUPERINTELLIGENCE in...
Published 04/19/24