Shell Game


And how does that make you feel?

As AIs are increasingly used as counselors and confidants, the research struggles to keep up.

(A late happy new year, everyone. We’re back after some early-year housekeeping and Season 2 work around here. On the Season 1 front, we’re thrilled to report that Shell Game is a finalist in the “Emerging Podcast” category of the iHeart Podcast Awards. Meanwhile, if you want to hear AI Evan embarrass me with a truly cringeworthy collection of clichés about Ireland, we showed up together on Dublin’s RTÉ Radio. I also had a pair of great conversations with Ali Tocher on the What do Robots Sound Like show, and with Brendan O’Meara of the Creative Nonfiction Podcast, about the process and journalism side of things.)

Among the questions I now commonly get asked—as a perceived emissary from our AI agent future—are variations on the following: Can AI help us combat the loneliness epidemic, or will it just make us more lonely? In Episode 4 of the first season, we explored some of the issues around chatbots and mental health: I loaded up my own voice agent with my problems and sent it to AI therapists (and one human one). These therapist agents did often provide some useful talk therapy, if a bit rote. Occasionally they ran comically off the rails, swapping the roles of counselor and patient, repeating the same exercises ad nauseam, or offering up cold, canned responses that revealed they hadn’t actually “heard” the problems they were addressing.

But whatever issues might arise from people turning to large language models (LLMs) for empathy, aid, or full-on psychotherapy, they haven’t stopped companies from encouraging users to do so, and from monetizing the results. Tens of millions of people now regularly converse with chatbots through services like Replika and Character.ai—many treating them as at least a confidant, and often a therapist. Uncounted millions more are likely chatting in some therapy-seeking capacity with ChatGPT, Claude, and other LLM chatbots.

Meanwhile, the explicit deployment of bots as AI therapists has continued apace, with new startups launching all the time, some of them raising tens of millions of dollars to do it. In their marketing and public statements, these companies make the same set of arguments: that society is suffering from intertwined crises of mental health and loneliness, and that there’s a well-established shortage of professionals to address them. Chatbots and AI therapy agents, the argument goes, can fill that gap. The LLM creators themselves tend to offer up AI as a palliative, if not a cure, for society’s mental health ills. (One partner at Andreessen Horowitz, a venture capital firm heavily invested in AI character and AI therapy apps, took the hype in a different direction, suggesting to The Washington Post that “maybe the human part of human connection is overstated.”)

On the other side, each week it seems there is a story in the news about people getting addicted to chatbots, falling in love with chatbots, or allegedly being driven to self-harm by them.

But the stories you hear about the mental health benefits and harms of conversing with AIs—including my own experiences in Season 1—are just that: anecdotes in a vast sea of data. I wondered what the academic literature had to say about chatbots, AI agents, and mental health. I figured that surely, given how widespread their use has already become, researchers had begun to determine whether we’ll benefit from all this AI talk therapy.
