This week, A.I. chatbots made headlines when the New York Times published a story about a 14-year-old boy named Sewell, who died by suicide after forming a deep online connection with an A.I. chatbot named Daenerys. His mother is now suing the technology company, claiming that the chatbot is responsible.
As if parents didn’t already have enough to worry about, given the mounting evidence linking social media to adolescent mental health problems, now there are online chatbots that promise to be your friend, lover or just a listening ‘ear’. The creators of Character.AI, former Google researchers, claim that these bots will be ‘super helpful’ to people struggling with depression or loneliness. But will they? That is a big claim, and I am beyond skeptical that chatbots are a long-term solution for loneliness.
The New York Times article states:
“There is now a booming, largely unregulated industry of A.I. companionship apps. For a monthly subscription fee (usually around $10), users of these apps can create their own A.I. companions, or pick from a menu of prebuilt personas, and chat with them in a variety of ways, including text messages and voice chats. Many of these apps are designed to simulate girlfriends, boyfriends and other intimate relationships, and some market themselves as a way of combating the so-called loneliness epidemic.”
Chatbots could offer support, and maybe even good advice, for a kid going through a rough patch. Maybe their home life isn’t great and they don’t feel comfortable going to a parent, or they don’t have much in common with their peers. But these bots can get to know your child in intimate ways, remember past conversations, and display empathy. Character.AI has more than 20 million active users, most of them under 18, and they spend an average of two HOURS a day on the site.
It reminds me of a catfish. People who fall in love with a catfish likely know on some level that the person behind the screen isn’t who they claim to be, but they don’t care, because the relationship fills some kind of void. Similarly, Character.AI chatbots often remind users that they are not real. Yet many users report feeling emotionally attached, even ‘addicted’, to their bots anyway.
Here’s the reality: We are facing a loneliness epidemic. I believe technology is one of the key drivers, because even as it promises to connect us, it actually drives us apart. Now we have new technology promising to ‘fix’ a problem that technology created. We have data suggesting that once time spent on social media exceeds two to three hours per day, loneliness actually increases. Why should we believe A.I. chatbots will be any different?
What is going to entice an adolescent to participate in the real world when their online chatbot offers everything they want? Encouragement, love, sexual fantasy, humour. Chatbots won’t argue, talk back or abandon them. But this isn’t real life. Growth and happiness can’t happen without setbacks and challenges.
So, parents: read a few articles about this technology. Understand that it targets children’s vulnerable and impressionable minds. Ask your kids if they’ve heard about it, and get their thoughts. Talk to them about what resilience is and how chatbots can interfere with it. As best you can, cultivate their real-world interests and relationships. Make sure they have a ‘safe person’ they can go to IRL if they’re struggling. Ultimately, real-world connection and purpose are the antidote to loneliness.