As we’re about to celebrate 10 years of Screenagers, we want to hear what’s been most helpful and what you’d like to see next.
Please click here to share your thoughts with us in our community survey. It only takes 5–10 minutes, and everyone who completes it will be entered to win one of five $50 Amazon vouchers.

A reader recently sent me a great question: “Should I be worried about my kid using Alexa or Google Home?”
It’s a great question, and one I’ve been thinking about more myself lately, especially as these devices become more conversational and, honestly, more human-sounding every day.
Google is gradually replacing Google Assistant with Gemini, a more advanced conversational system. Similarly, Amazon has launched Alexa+, which uses large language models (LLMs) to make the assistant more human-like.
It seems like the right moment to pause and consider what risks our kids face with these “smart” speakers.
Smart speakers, such as Alexa, are found in some kids’ bedrooms, where they’re used to play music, listen to a meditation, or read bedtime stories. Throughout the house, they are used for things like finding out the weather forecast, helping with homework, and much more, of course.
Both situations, in kids’ rooms and in the house in general, pose some risks for kids, and I want to explore a few while offering solutions, including replacement options for when they are used at bedtime.
(And let me mention that our Screen-Free Sleep campaign includes keeping Alexa and other smart speakers out of children’s bedrooms to protect healthy sleep.)
Research shows how often the wrong or inappropriate apps can be activated
A study called "SkillBot: Identifying Risky Content for Children in Alexa Skills" tested thousands of child-directed Alexa apps (the apps are called “skills”) and found something unsettling: kids could accidentally trigger apps meant for adults, including some with inappropriate content. In some cases, the system even prioritized adult content!
The study also found that many parents assume Alexa’s kids category is thoroughly vetted and safe. Yet, these child-focused apps can still collect personal information or respond with content that is inappropriate for their intended age group.

Modeling rude interactions
How does one talk to these robots? At times I use the voice feature on ChatGPT to practice a foreign language, and I can feel bad when I get frustrated and raise my voice at it. Sometimes I do the reverse, finding myself thanking it the way I would a person.
It’s confusing, because we don’t want to model for our kids that we treat machines like people, which goes against the goal of helping them avoid forming attachments to technology. But at the same time, we also don’t want to model being rude.
Undermines patience
Kids are curious, and their questions are so sweet (and can drive us batty at times, like with the endless "why? why? why?").
Smart speakers can answer questions quickly. But what message does it send if kids learn they can get their questions answered instantly? Where is the grappling with the question? We need to teach them to practice patience.
Privacy issues and targeting with ads
And then there’s the privacy piece. Researchers found that Alexa can infer personal interests from what people say, and those insights feed into ad targeting. For example, if you ask enough questions about skincare, you will likely be filtered into a “Fashion and Style” persona, which is then used to target you with ads.

Okay, what can we do, especially if you have one of these smart speakers in your home?
Subscribe to our YouTube Channel! We add new videos regularly, and you'll find over 100 videos covering parenting advice, guidance, podcasts, movie clips, and more.
Sign up here to receive the weekly Tech Talk Tuesdays newsletter from Screenagers filmmaker Delaney Ruston MD.
We respect your privacy.


We want our kids to be motivated to learn, face challenges, and generate their own ideas. However, school often assigns work that doesn't inspire interest, and now AI provides an easy shortcut. Instead of struggling through it, students can simply ask a chatbot for answers or even complete assignments. In today’s blog, I share five ways parents can help kids stay engaged in learning.
READ MORE >
You might have heard about the tragic suicide of 16‑year‑old Adam Raine, who was talking with ChatGPT for up to four hours a day. His parents filed a wrongful‑death lawsuit against OpenAI and CEO Sam Altman on August 26, 2025, in San Francisco Superior Court. In this blog, we talk about the immediate safeguards needed to address these horrific risks of AI, and we offer parents suggestions for talking with their kids about these risks and about handling strong emotions.
READ MORE >
Our latest podcast features candid interviews with college students on how they’re navigating the rapid rise of AI tools like ChatGPT in their academic lives. In today’s blog, I explore the ethical lines students are trying to draw, what they will and won’t use ChatGPT for, the tools educators are using to detect AI-generated work, and one student’s experience of being wrongly flagged for cheating on a paper she wrote entirely on her own.
READ MORE >

For more like this, Dr. Delaney Ruston's new book, Parenting in the Screen Age, is the definitive guide for today’s parents. With insights on screen time from researchers and input from kids and teens, this book is packed with solutions for how to start and sustain productive family talks about technology and its impact on our mental wellbeing.
