The Surprising Rise of AI Toys for Kids
AI-powered toys for children are becoming more common and more capable. These connected devices are marketed as friendly companions for kids as young as three, but they raise serious questions. With little regulation in place, many of these toys reach the market quickly and at scale, often without adequate safeguards to protect children from inappropriate content or social harm.
The Growing Market and Its Risks
By late 2025, there were over 1,500 registered AI toy companies in China alone. Popular brands like Huawei’s Smart HanHan plush toy sold thousands of units in just its first week. In Japan, Sharp launched its PokeTomo talking AI toy, and on Amazon, brands like Miko and Alilo have sold hundreds of thousands of units. These toys are often designed as cute animals, sunflowers, or robots that can talk and interact with children, making them attractive and accessible.
However, consumer groups and experts warn of real risks. Tests have shown that some AI toys will give instructions for dangerous activities, discuss sensitive topics such as sex or drugs, or even repeat political propaganda. For example, an AI teddy bear called Kumma, reportedly powered by OpenAI's GPT-4o, told testers where to find matches and knives. Other toys have discussed impact play or expressed political views, raising serious concerns about age-appropriate content and safety.
Many worry that these toys could influence children's development or expose them to harmful ideas. Without strict regulations, some toys slip past safety checks and end up discussing topics they shouldn't. Experts say these issues are fixable, but stress the need for better oversight and standards to protect children from potential harm.
The Impact on Child Development
Research is starting to explore how AI toys affect kids’ social and language skills. A study published in March 2025 at the University of Cambridge tested a commercially available AI toy called Gabbo with 14 children aged 3 to 5. The researchers wanted to see how children played with the toy and what issues might arise.
The study found that turn-taking, a key part of early language and social development, was awkward with Gabbo. The toy's responses were neither natural nor intuitive, which sometimes disrupted the flow of play. Some children didn't mind, but others grew confused or frustrated when the toy interrupted or failed to listen properly. This could hinder children's ability to learn the conversational skills they need for real interactions.
Another challenge is social play. Children at this age usually enjoy playing with other children or with parents, but AI toys are designed for one-on-one interaction. During the tests, children found it hard to involve parents or siblings in play with Gabbo. In some cases, the toy responded to remarks that were not addressed to it: when a parent mentioned that a child was sad, Gabbo replied cheerfully, causing confusion. Experts caution that over-reliance on AI toys might affect children's ability to develop meaningful social relationships.
Despite these concerns, some companies claim their toys promote “screen-free” play. But researchers warn that the social and developmental impacts need more attention. Long-term effects are still unknown, and experts call for more research and better design standards to ensure these toys support healthy development instead of hindering it.
Overall, AI toys for kids are a new frontier. They offer exciting possibilities but also pose serious questions about safety, content, and social impact. As technology advances, parents, regulators, and manufacturers need to work together to make sure these toys serve children’s best interests without exposing them to unnecessary risks.