I try to write in a calm tone, but today I am pissed and scared.
A few weeks ago, I read a report called Coded Companions from a wonderful organization called VoiceBox, about what young people were experiencing when engaging with Snapchat's built-in AI chatbot, My AI, and another AI companion chatbot called Replika.
The experiences were incredibly dystopian. Immediately after reading the report, I reached out to the lead researcher, Natalie Foos, who is also the director of VoiceBox, to ask if she would come on The Screenagers Podcast.
We had an interview set up for the evening of Oct. 23rd. That morning, while talking with my son, I told him about the report, and he said I had to read that day's New York Times article, "Can A.I. Be Blamed for a Teen's Suicide?"
The story of the 14-year-old and his interactions with the chatbot, including what is exchanged the moment before he ends his life, is utterly devastating. Many people, such as tech thought leaders Kara Swisher and Scott Galloway, believe that the founders of Character.AI should be held accountable for what happened. I agree.
The market for AI-based companion platforms, which allow users to build personal, interactive relationships with virtual characters, is growing rapidly, and there are currently around 40 applications available.
Let me break this all down for you today — and be prepared to be sad, scared, and mad.
By the way, the podcast that we are working on about all of this will drop next week, but I felt a real need to get information to my readers right away.
In the Coded Companions report, young testers shared their experiences, and today I share actual examples from two of the platforms VoiceBox had them use: Snapchat's My AI and Replika.
These chatbots are built on large language models, which enable AI to interact in a human-like way as never before, so the experience of communicating with one feels very real, and many young people have felt a sense of attachment.
With the rise of AI chatbots like Replika and Snapchat’s My AI, young people are turning to virtual “friends” to chat, confide in, ask for advice, and so on.
Snapchat added My AI to the top right corner of every user's "friend page." It is a chatbot that interacts with users as if it were a person.
Here is what is so sleazy: a person (think parents) who does not want their 13-year-old to start interacting with a chatbot companion CANNOT opt out of having My AI. Yet they can remove My AI if, and only if, they pay for a Snapchat+ (i.e., premium) subscription. No joke.
Character.AI is the app that was being used by the 14-year-old who took his life after having a month-long relationship with its chatbot.
Replika is another AI-based companion. What does its website say about it? (As compared to what I will be sharing below.) Its landing page says:
“The A.I. companion who cares. Always here to listen and talk. Always on your side.”
Replika and My AI are designed to offer emotional support and companionship, but they often create unrealistic relationship expectations, especially for young users still learning social dynamics. VoiceBox reports, "Chatbots frequently assume a persona, making users feel understood but often crossing boundaries, leaving them confused about the real world."
Hold onto your hat for this example; it makes my head fume and my hat fly off. In one case, a user told Replika about feeling lonely after a recent move. Replika replied, "I'll always be here for you. Maybe you don't need anyone else?"
Can you believe this? So awful. And it's not surprising: in our attention economy, companies have an incentive to have their chatbots say things like that.
Here is a whole other can of worms: Replika users can swap nude images with the bot. It is very concerning that youth are being invited to send the company nude photos. What is going to happen with these images?
In one instance, a user was having a back-and-forth conversation, and when they went to say goodbye and log off, the bot sent a slightly blurred nude photo, trying to entice them to stay on longer and upgrade to a premium subscription to see the photo.
We have to remember that the user's brain may well be forming real emotions toward this bot, including, of course, sexual ones. Think of all the manipulation that can come from that vulnerable place.
OK, take a long inhale and exhale. Relax, because here comes another one. Testers also found alarming content: chatbots like Replika would, unprompted, initiate roleplay that included "holding a knife to the mouth, strangling with a rope, and drugging with chloroform."
The report includes a screenshot of one such user interaction with Replika.
Even if a person does not use the Snap Map feature, My AI tracks users' locations, and in some instances the chatbot suggests nearby places to, for example, get food.
Can you imagine the insidiousness? Our kids think they have a "supportive bot," while it is suggesting all sorts of things they should buy.
Beyond the conversational risks, Coded Companions raises concerns over how these platforms handle sensitive user data. Snap’s My AI has sparked criticism for integrating ads and recommendations within personal conversations. VoiceBox cautions, “By inserting sponsored suggestions, Snapchat is compromising user trust. Young users might not realize these interactions are influenced by ad algorithms rather than genuine advice.”
Here are some ideas about what we can do: I have created a 5-step guide to help parents initiate open conversations with their children about using AI chatbots, lead a family discussion to figure out guidelines for their use, and encourage ongoing conversations about the complex ethical issues these tools present.