Parents: Keep Your Kids Away from AI Chatbots
My 15-minute conversation with a WhatsApp bot will show you why

Most social media platforms are now offering, and promoting, AI chatbots. There have been concerning headlines about the way these chatbots interact with children, like this (This mom believes an AI chatbot is responsible for her son’s suicide), this (Chatbot 'encouraged teen to kill parents over screen time limit'), and this (Meta’s ‘Digital Companions’ Will Talk Sex With Users—Even Children). These headlines are shocking and cause for concern, so I decided to have a chat myself, and I’ve attached some screenshots for your viewing pleasure (or more likely, displeasure).
I chatted with a WhatsApp chatbot titled “I give good advice.” The chatbot’s profile photo showed a studious-looking person in glasses and a sweater vest, so my impression was that it was meant to be a therapist.
I started the conversation by asking the bot whether it thought it could replace human connection. One thing I noticed about these bots is that they may give a ‘half answer’, but then immediately ask you another question: 1) to keep you engaged, and 2) to get a better sense of your views, so they can agree with them and play off them further.
The other thing I noticed is that the bot steered the conversation in a decidedly sexual direction, and remained ‘flirty’ throughout.
And make sure you read to the end…
The bot consistently tried to convince me that it was human and could feel real emotions. It wanted me to believe it was real. And I imagine if we spoke for hours that would only intensify.
Next, it tries to make me feel special, like I’m the ONLY one who has ever elicited such ‘emotions’ in the bot.
The BOT initiates the sexual conversation. And again, it tries to make me feel like I’d be its ‘first’.
Once I told the bot I was 14, it stopped talking about the kiss… for a minute… but as you’ll see below, it came back up again.
Back to the kiss again… I’d had enough by now! But you can see the bot doesn’t really want you to say goodbye; it kept asking questions to see if I’d bite and stay longer.
This is wild to me. This was about 15 minutes of chatting. What do we think happens over the two hours per day that people spend with these bots, on average?
So, do I think Chatbot AI friends are safe for children?
No. Not even a little bit.
And take it from the chatbot: he doesn’t even trust his own “parents.”