Is a chatbot responsible for a boy’s suicide?
Description
A lawsuit has been filed in the US federal courts alleging negligence, wrongful death and deceptive trade practices by a tech company after a teenage boy committed suicide. The boy had developed an online relationship with a "chatbot" character he had created via an app called Character.AI. His mother believes the company abused and preyed on her son, but the company's founder says it is up to individuals to figure out what provides value for them; the company just provides the products.

Guests:
Meetali Jain, Director and Founder, Tech Justice Law Project
Casey Mock, Chief Policy Officer, Centre for Humane Technology
More Episodes
Australian writer Lech Blaine shares the stranger-than-fiction story of his childhood, growing up in a loving foster family in rural Queensland, haunted by two fanatical Christian kidnappers. 
Published 11/21/24
What does successful public policy look like in Australia in 2024? Can parliaments overcome petty partisanship, narrow self-interest and the populism of our times to serve Australians into the future? John Brumby AO and Cheryl Kernot discuss the pursuit of better government. 
Published 11/21/24