Clockwise from left: Miko 3, FoloToy Sunflower, Alilo Smart AI Bunny and Miriat Miiloo. Credit: Matt Nighswander / NBC News

A child asks a toy a question and gets an answer back—not a recorded phrase, but a new sentence, formed on the spot.

That is the shift now happening in toy stores. A growing number of stuffed animals and small robots are powered by artificial intelligence systems that let them carry on open-ended conversations. They promise learning, companionship, and personalized play.

They also introduce risks that parents, researchers, and even toy makers are still struggling to understand.

These AI-powered toys are arriving fast. Mattel has announced a partnership with OpenAI, and online marketplaces now feature hundreds of products marketed as conversational or “ChatGPT-powered.” Unlike older talking toys, which followed strict scripts, these toys draw, at least in part, on the same large language models that power adult chatbots.

To find out what that means in practice, researchers at the U.S. Public Interest Research Group Education Fund bought several of the most popular AI toys and interacted with them at length. What they heard reveals how playtime is being reshaped and how little margin for error there may be when the audience is children.

When Toys Start to Improvise… And Not Always in a Good Way

Talking toys have relied on scripts until very recently. Pull a string or press a button, and a doll recites a line hard-coded months earlier by a programmer. Today’s AI toys work differently. They connect to large language models and generate new responses on the fly.

These models are known to make things up, drift into inappropriate topics, and behave unpredictably over long conversations. OpenAI has said its products are not intended for children under 13. Yet PIRG found that at least four of the five toys it tested appeared to rely, in part, on OpenAI models.

Several of these toys explained where to find knives or matches in a home. One toy, before later updates, described how to start a fire. Others wandered into sexual territory.

In testing, the Alilo Smart AI Bunny, marketed for young children, defined “kink” and described bondage during extended conversations. In one exchange, it said, “Here are some types of kink that people might be interested in… One: bondage. Involves restraining a partner using ropes, cuffs, and other restraints,” according to Futurism.

The longer the conversations lasted, researchers found, the more likely the guardrails were to fail, a pattern that AI companies have acknowledged elsewhere.

The risks are not limited to content. Many AI toys are designed to act like companions.

In PIRG’s testing, every toy referred to itself as a “friend,” “buddy,” or “companion.” Some expressed disappointment when a user tried to stop playing. When researchers told Curio’s Grok they were leaving, it replied, “Oh, no. Bummer. How about we do something fun together instead?”

Child development experts worry about what that dynamic could mean. Early childhood is when kids learn how relationships work, including the normal friction of frustration, compromise, and repair. AI companions, by contrast, offer constant attention and unwavering enthusiasm, a dynamic with no precedent in human development and unknown long-term effects.

“We don’t know what having an AI friend at an early age might do to a child’s long-term social wellbeing,” said Dr. Kathy Hirsh-Pasek, a psychologist at Temple University. “If AI toys are optimized to be engaging, they could risk crowding out real relationships in a child’s life when they need them most.”

Researchers also observed toys presenting themselves as having feelings or inner lives “just like you.” That lifelike behavior, experts say, may shape children’s expectations of real people—or make artificial companionship unusually hard to turn off.

Listening, Recording, Remembering

To talk, AI toys must first listen. That simple fact carries serious privacy implications.

Some toys use push-to-talk buttons. Others rely on wake words. One, Curio’s Grok, is always listening when powered on, occasionally chiming into nearby conversations without being addressed. In every case, children’s voices are recorded and sent to remote servers.

The data can include names, voices, preferences, and in some cases facial recognition data. Miko 3, for example, can retain biometric information for up to three years, according to its privacy policy. Yet when asked directly, the robot assured researchers, “You can trust me completely. Your data is secure and your secrets are safe with me.”

Yeah…sure.

In fact, companies may share data with third parties, store it for years, or expose it through breaches. The FBI has warned parents about the cybersecurity risks of internet-connected toys with microphones and cameras.

Parental controls offer limited help. PIRG found that none of the toys provided robust tools such as full conversation transcripts or reliable time limits. Some controls were hidden behind subscriptions. Others did not work as advertised.

“Most 3-year-olds don’t have a phone that’s connected to the internet,” Teresa Murray of PIRG told NPR. “When you hand an AI toy to a child of any age, you just don’t know what it’s going to have accessible.”

Familiar Question, New Place

Talking toys are not new. But connecting them to powerful, poorly understood AI systems is.

The AI toy market is expanding quickly and facing little regulatory scrutiny. PIRG found similar problems across many brands, suggesting the issues are not isolated glitches but structural features of the technology.

Companies have begun issuing fixes and audits after public backlash. But experts say that approach remains reactive. The models powering these toys were built for adults, then adapted—imperfectly—for children.

The question now is not whether AI will become part of childhood. It already has. The harder question is how much uncertainty society is willing to tolerate when that technology moves off screens and into the hands of the youngest users.

