


Smart speakers like Alexa and Google Home are becoming more human, but that brings real risks for kids. They can accidentally access adult content, pick up rudeness from how we speak to devices, and lose practice with patience when every answer is instant. There are also privacy concerns, since these assistants collect and use voice data. Practical fixes include unplugging them at night and trying simple, screen-free alternatives for music and stories.
A reader recently sent me a great question: “Should I be worried about my kid using Alexa or Google Home?”
It’s a great question, and one I’ve been thinking about more myself lately, especially as these devices become more conversational and, honestly, more human-sounding every day.
Google is gradually replacing Google Assistant with Gemini, a more advanced conversational system. Similarly, Amazon has launched Alexa+, which uses large language models (LLMs) to make the assistant more human-like.
It seems like the right moment to pause and consider what risks our kids face with these “smart” speakers.
Smart speakers, such as Alexa, are found in some kids’ bedrooms, where they’re used to play music, run a meditation, or read bedtime stories. Elsewhere in the house, they answer questions about the weather forecast, help with homework, and much more, of course.
Both settings, kids’ rooms and the rest of the house, pose some risks for kids, and I want to explore a few while offering solutions, including replacement options for when the devices are used at bedtime.
(And let me mention that our Screen-Free Sleep campaign calls for keeping Alexa and other smart speakers out of children’s bedrooms to support healthy sleep.)
Learn more about showing our movies in your school or community!
Join Screenagers filmmaker Delaney Ruston MD for our latest Podcast

Learn more about our Screen-Free Sleep campaign at the website!
Our movie made for parents and educators of younger kids
A study called "SkillBot: Identifying Risky Content for Children in Alexa Skills" tested thousands of child-directed Alexa apps (the apps are called “skills”) and found something unsettling: kids could accidentally trigger apps meant for adults, including some with inappropriate content. In some cases, the system even prioritized adult content!
The study also found that many parents assume Alexa’s kids category is thoroughly vetted and safe. Yet, these child-focused apps can still collect personal information or respond with content that is inappropriate for their intended age group.

How does one talk to these robots? At times I use the voice feature on ChatGPT to practice a foreign language, and I can feel bad when I get frustrated and raise my voice at it. Sometimes I do the reverse, finding myself thanking it as I would a person.
It’s confusing, because we don’t want to model for our kids that we treat machines like people, which goes against the goal of helping them avoid forming attachments to technology. But at the same time, we also don’t want to model being rude.
Kids are curious, and their questions are so sweet (and can drive us batty at times, like the endless “why? why? why?”).
Smart speakers can answer questions quickly. But what is the message if kids think they can get their questions answered instantly? Where is the grappling with the question? We need to teach them to practice patience.
And then there’s the privacy piece. Researchers found that Alexa can infer personal interests from what people say, and those insights feed into ad targeting. For example, if you ask enough questions about skincare, you will likely be filtered into a “Fashion and Style” persona, which is then used to target you with ads.

Okay, what can we do, especially if you have one of these smart speakers in your home?
Subscribe to our YouTube Channel! We add new videos regularly and you'll find over 100 videos covering parenting advice, guidance, podcasts, movie clips and more. Here's our most recent:
As we’re about to celebrate 10 years of Screenagers, we want to hear what’s been most helpful and what you’d like to see next.
Please click here to share your thoughts with us in our community survey. It only takes 5–10 minutes, and everyone who completes it will be entered to win one of five $50 Amazon vouchers.
Sign up here to receive the weekly Tech Talk Tuesdays newsletter from Screenagers filmmaker Delaney Ruston MD.
We respect your privacy.

Snapchat and Instagram both have AI chatbots built in by default, with no way to fully disable them. Meta's own internal documents revealed policy decisions that allowed minors to receive romantic and sexual content from its AI systems. Meanwhile, Snapchat's premium AI features are designed to increase engagement, and Meta is now using teens' AI conversations to target them with personalized ads.
READ MORE >
New research from the Rithm Project surveyed 2,383 teens and young adults to understand how AI is shaping their relationships. While most use AI for information and tasks, a notable group is turning to AI characters for emotional support, with over half of this group reporting they feel they have no one else to turn to. These findings offer an important window into how some teens are really using AI, and why parents need to be having these conversations.
READ MORE >
AI tools like ChatGPT can now complete many homework tasks for students, often in minutes. While these tools may be useful for skilled adults, research suggests they can undermine learning for children by bypassing effort, problem solving, and critical thinking. Homework that involves writing, calculations, or study materials is especially vulnerable to AI use, while memorization and hands-on creative work still require student effort. Clear household rules and ongoing conversations can help protect learning and set expectations around AI use for schoolwork.
READ MORE >
FOR MORE LIKE THIS, DR. DELANEY RUSTON'S NEW BOOK, PARENTING IN THE SCREEN AGE, IS THE DEFINITIVE GUIDE FOR TODAY’S PARENTS. WITH INSIGHTS ON SCREEN TIME FROM RESEARCHERS, INPUT FROM KIDS & TEENS, THIS BOOK IS PACKED WITH SOLUTIONS FOR HOW TO START AND SUSTAIN PRODUCTIVE FAMILY TALKS ABOUT TECHNOLOGY AND ITS IMPACT ON OUR MENTAL WELLBEING.
