Episode 13: Beware The Robo-Therapist (feat. Hannah Zeavin), June 8 2023
Description
Emily and Alex talk to UC Berkeley scholar Hannah Zeavin about the case of the National Eating Disorders Association helpline, which tried to replace human volunteers with a chatbot, and why the datafication and automation of mental health services are an injustice that will disproportionately affect the already vulnerable.

Content note: This conversation touches on mental health, people in crisis, and exploitation.

This episode was originally recorded on June 8, 2023. Watch the video version on PeerTube.

Hannah Zeavin is a scholar, writer, and editor whose work centers on the history of the human sciences (psychoanalysis, psychology, and psychiatry), the history of technology and media, feminist science and technology studies, and media theory. Zeavin is an Assistant Professor of the History of Science in the Department of History and the Berkeley Center for New Media at UC Berkeley. She is the author of "The Distance Cure: A History of Teletherapy."

References:
VICE: Eating Disorder Helpline Fires Staff, Transitions to Chatbot After Unionization … and then pulls the chatbot
NPR: Can an AI chatbot help people with eating disorders as well as another human?
Psychiatrist.com: NEDA suspends AI chatbot for giving harmful eating disorder advice
Politico: Suicide hotline shares data with for-profit spinoff, raising ethical questions
Danah Boyd: Crisis Text Line from my perspective
Tech Workers Coalition: Chatbots can't care like we do
Slate: Who's listening when you call a crisis hotline? Helplines and the carceral system
Hannah Zeavin:

You can check out future livestreams at https://twitch.tv/DAIR_Institute.

Follow us!
Emily
Twitter: https://twitter.com/EmilyMBender
Mastodon: https://dair-community.social/@EmilyMBender
Bluesky: https://bsky.app/profile/emilymbender.bsky.social
Alex
Twitter: https://twitter.com/alexhanna
Mastodon: https://dair-community.social/@alex
Bluesky: https://bsky.app/profile/alexhanna.bsky.social

Music by Toby Menon. Artwork by Naomi Pleasure-Park. Production by Christie Taylor.
More Episodes
Will the LLMs somehow become so advanced that they learn to lie to us in order to achieve their own ends? It's the stuff of science fiction, and in science fiction these claims should remain. Emily and guest host Margaret Mitchell, machine learning researcher and chief ethics scientist at...
Published 06/05/24
AI Hell froze over this winter and now a flood of meltwater threatens to drown Alex and Emily. Armed with raincoats and a hastily written sea shanty*, they tour the realms, from spills of synthetic information to the special corner reserved for ShotSpotter. *Lyrics & video on...
Published 05/23/24