“The fundamental paradox of technology is that the more intimately it knows us, the better it can serve us and the better it can exploit us.”
Aza Raskin is a tech designer credited with inventing infinite scroll and known for his work on technology ethics. — from the Your Undivided Attention podcast
Hi Everyone,
Delaney here, not a chatbot but rather a human who is deeply concerned about AI companion bots.
I am so concerned, in fact, that I wrote a blog post on this topic last week, released a podcast episode yesterday, and am now writing this post.
Help is needed because Silicon Valley is in a race to create and monetize large language model platforms designed to act as human companions. And, as many of you know, a young boy recently took his life in the midst of a deeply disturbing “relationship” with such a “companion.”
Let me share another recent quote by Aza Raskin:
“...the question to be asking is, is intimacy more valuable to a company than just mere attention? And the answer is yes, that there is a race therefore to colonize, to commodify intimacy. And that's just not an abstract thing that we say, that's a real thing, which is now causing the death of human beings.”
I have created a step-by-step guide to help you discuss all this with your kids. This can be modified if you talk with young people who are not your kids, such as students, a scouting troop, extended family, etc.
Before the steps, let me offer this tip on how to have the discussion. Communication science tells us that when we ask kids and teens questions because we really want to understand their points of view, they feel respected and thus more motivated to participate in the conversations.
The steps
The more kids answer the questions themselves, rather than us supplying the reasons, the more their analytical thinking comes on board, and the greater the chance they will find themselves contemplating the questions after the discussion.
Let kids come up with as many as they can.
(Pro tip: repeat some of what they say back in your own words; it helps them feel heard. Remember, people want to be heard as much as they want what they think they want.)
I invite you to consult my blog post from last week for ideas (titled Super Scary Snapchat Chatbot AI And Other Chatbot Companions).
It is important to talk about what we want for our kids, along with our expectations and rules. Yes, kids can get around rules, but research shows they are less likely to do so when rules are in place. We explain our reasoning behind the rules and take into account our kids’ feelings and opinions.
I think it is very reasonable to say to a 13- to 16-year-old (and other ages where you feel it is appropriate) something like this:
Silicon Valley is racing to create these AI companions, but they often have no safeguards. For example, if a young person talks about self-harm or thoughts of suicide, a chatbot with responsible safeguards may offer resources for seeking help and say that it can’t have conversations about that topic. Yet many of these platforms have not built in such safeguards. For now, we want to have the rule that it is not okay to use AI chatbot companions.
Of course, you should think about this as the parent or parents and have discussions with your kids. I cannot set the rule for your family.
Let me add that Character AI, the platform the 14-year-old was immersed in when he took his life, is open to users 13 and older. I just think it is awful that 13-year-olds can access such a bot, and other companion platforms are likewise open to kids as young as 13. On top of that, there is no real age verification, so kids younger than 13 can gain access by lying about their birthdate.
These “empathy chatbots,” as some in Silicon Valley call these companions, are being touted as a reasonable approach to helping the lonely. I don’t buy it. Whatever your views on this, one thing we can all agree on is that people who are struggling and feeling lonely need support. We can also agree that we want kids to be helpers now and as they move through their lives.
So, I suggest saying to kids something like the following:
“Of course, one reason people get pulled into a connection with an AI companion is loneliness. Let’s all commit to being on the lookout for people we think might be feeling lonely, and we can talk about ways we might help them.”
Decide together some specific times that you will revisit these topics, explaining that this is all brand new and that, frankly, we need our youth to grapple with these complex ethical issues.
After all, their ideas are key now, and they will be needed as future leaders.
In terms of picking times to discuss AI companions, how about proposing the first night of each month for the next three months? At those check-ins, you can all revisit the four steps above, such as discussing people who seem lonely and ways you might help them.
I recommend putting the dates you suggest on a large, visible paper calendar. To add some lightness to this hard topic, maybe pair it with roasting a few marshmallows over a stovetop.
Questions to get the conversation started within your family or group: