How are Canadians confronting the dangers posed by AI?
Description
We had the pleasure of speaking with Mario Gibney, who decided years ago that he couldn’t just wait for someone else to take action on AI safety. In 2022, Mario co-founded AI Governance and Safety (AIGS) Canada to push the national conversation forward. Through movement and policy advocacy, AIGS is helping Canada become a leader in AI governance and safety. I recommend reading their concise white papers to get a summary of the issues. We learn how Mario got into this line of work, what Canadians think about the state of AI safety these days, and things to get excited about in the Toronto scene. I leave you with a message about how to deal with the emotional toll of AI doomerism. Thanks to my amazing producer Chad Clarke for being essential in putting this show together. All mistakes are mine.

Links:
Artificial Intelligence Governance & Safety Canada: aigs.ca
LessWrong: https://www.lesswrong.com/
Slate Star Codex: https://slatestarcodex.com/
Astral Codex Ten: https://www.astralcodexten.com/
80,000 Hours: https://80000hours.org/
Center for AI Safety: https://www.safe.ai/
Future of Life Institute: https://futureoflife.org/
EAGxToronto applications are open until 31 July at 11:59 pm Eastern. Apply now! https://www.effectivealtruism.org/ea-global/events/eagxtoronto-2024
We’re a PFG! Profit4good.org
More Episodes
This is what EAs talk about at afterparties. Or rather, afterparties of afterparties. Tax deductibility is not the sexiest topic I could find an interviewee for, but if you want to take a stand for your favourite effective charities, why would you be a sucker and pay as much tax as the one who...
Published 11/12/24
I loved hearing how James’s experience in the aid sector has influenced his views on effectiveness and optimization. Having worked with refugees, aid organizations, and advocates for children, James brings valuable field experience into his advocacy work. After seeing how major aid organizations...
Published 10/25/24