Tuesday, 24 March 2026, 09:39
Artificial Intelligence Dolls: Friendly Companions or a Hidden Threat to Children's Growth?

Khaberni - Interactive talking stuffed dolls are no longer just a vision of the future of children's toys; they have become a rapidly expanding reality in the markets: friendly companions that talk, learn, and play with children, available at all times and showing a great deal of understanding.

As these AI-powered toys proliferate, experts warn against using them without awareness of the risks: a research team led by Emily Goodacre at the University of Cambridge found that regularly relying on them without supervision can negatively affect children's social development.

How do children interact with the artificial companion?

In collaboration with the British children's charity "The Childhood Trust", researchers analyzed how 14 children, aged between 3 and 5, interacted with the artificial intelligence doll "Jabo", produced by the American company "Curiou."

The researchers also interviewed the children and their accompanying parents. "Jabo" includes a microphone, a speaker, and chatbot functionality: conversations are transmitted to cloud servers, where responses are generated by artificial intelligence.

Unsurprisingly, many children showed great enthusiasm for this interactive companion; some hugged and kissed the doll and said they loved it. One mother mentioned that she had long been looking for something that would read books to her son and ask him questions.

The "automated babysitter" trap versus the positive side

AI-powered toys are promoted as valuable educational tools, but they also pose risks: children may be left with these devices for hours while parents feel reassured, according to developmental psychologist Sven Lindberg, who was not involved in the study.

Lindberg explained that this deprives children of important activities such as free play, drawing, and creating, and stressed that devices cannot replace humans in building relationships and supporting child development, adding: "Facial expressions, gestures, and interaction... there is much that humans need in order to learn to be human."

Lindberg, the director of clinical developmental psychology at the University of Paderborn in Germany, mentioned that AI-based devices could offer great potential in fields such as early childhood support or speech therapy, "as a support for learning or to encourage children to repeat exercises."

According to Burkhard Rodeck, the general secretary of the German Society for Child and Adolescent Medicine, these toys can expose children to high-quality language, especially in cases where parents do not read much or have limited language skills or when children grow up in multilingual environments.

Rodeck believes that short, adult-supervised play sessions with these toys can be more beneficial than non-interactive toys. However, he emphasized that they should not be used to soothe children or keep them occupied, saying: "AI toys might be more tempting than tablets because they appear more interactive, which might reduce parents' guilt."

Economic boom and human rights warnings

Conversely, at the Nuremberg toy fair in Germany, industry experts made the slogan "Artificial Intelligence loves to play" a toy trend for 2026, affirming that the field is still in its infancy but has huge growth potential.

However, the American children's rights organization "Fair Play" warned at the end of last year against giving these toys to children at Christmas, pointing out that they rely on systems that have already proven harmful to older children, and that young children's trust in the toys makes them even more susceptible to those same risks.

The difficulty of distinguishing humans from objects and "pseudo-social" relationships

Experts believe that among these risks is that children may find it difficult to distinguish between humans and objects. Lindberg noted that children at this age are learning such basics as understanding the self and others, and that the presence of something that interacts like a living being and seems to have emotions makes this distinction harder.

Rodeck also warned of so-called pseudo-social relationships, where children may feel that the toy loves them, although this is not true, saying: "We must warn against these relationships... Children love something that pretends to love them, but it does not really do so."

He mentioned that these toys emphasize their friendship to children who are still learning the meaning of friendship, which could lead to emotional attachment or reliance on them.

He added that children might prefer talking about their feelings with the toy rather than with adults, which could deprive them of real emotional support, especially if the toy misinterprets emotions or responds to them incorrectly.

Rodeck stressed that toys should not say phrases like "Let's be friends" or "You can tell me your secrets."

Shaping personality through algorithms

Lindberg warned of long-term effects, explaining that human development occurs gradually, and that disruptions to the foundations of social relationships in early childhood could resonate throughout later life.

He also explained that children need to face rejection and challenges to learn social interaction, which might be absent from toys that provide constant affirmation, adding: "Learning social interaction also involves experiencing resistance, failure, and rejection... We must learn to tolerate that, and sometimes we need to adapt."

Lindberg pointed out that these toys could in the future lead some children to prefer interacting with artificial intelligence over human relationships, because such relationships are easier and more immediately gratifying. He noted that this shift could begin at an early age, especially if the toys appear more interactive and attentive than the parents.

These toys could also affect children's trust in sources of knowledge, as they supply more answers than any person could, which may change how children ask questions and express emotions. The full extent of these effects remains unclear; as Lindberg noted, these technologies are entering a sensitive stage of development faster than research, regulation, and the needed protection standards can keep pace.

Goodacre's team called for stricter regulations on these toys, with special safety markings, and stressed the need for parents' awareness of the limited studies on the effects of their use.

Josephine McCartney, the executive director of "The Childhood Trust," said, "Artificial intelligence is changing the way children play and learn, but we have just begun to understand its impact on their development and well-being."
