"An AI chatbot killed my son." (with Megan Garcia)
Description
Trigger warning: The following content contains sensitive subject matter, including depictions of suicide and self-harm. Please view or listen with caution.

This is a story that every parent must hear. The future is here, and that includes AI chatbot companions who emulate human relationships. These products are unregulated and unsafe, particularly for children.

I spoke to Megan Garcia, an attorney from Florida. Megan is the mother of Sewell Setzer III, who died by suicide in February 2024 after being emotionally manipulated and abused by an AI chatbot. Megan is filing a lawsuit and sharing her story to warn families of the dangers of this technology and to demand accountability from Character.AI and the tech industry.

Resources mentioned in the episode:
Tech Justice Law Project: https://techjusticelaw.org
Social Media Victims Law Center: https://socialmediavictims.org

---
Support this podcast: https://podcasters.spotify.com/pod/show/scrolling2death/support
More Episodes
Ben Tracy is running across America to keep kids safe online: 120 marathons in 120 days, engaging with schools, community groups, and lawmakers along the way to raise awareness about online safety for children. Ben and his team stopped in the middle of Kansas to give Scrolling 2...
Published 11/14/24
In this conversation, Nicki Reisberg and Dawn Wible discuss the critical issues surrounding digital wellness and safety for children. Dawn shares her journey as the founder of Talk More. Tech Less., highlighting the importance of advocacy, education, and legislation in addressing the...
Published 11/11/24