Different but complementary: Navigating AI’s role in children’s learning and development

Photo: a boy and a girl with a female caregiver looking at a tablet together (Ketut Subiyanto, Pexels)

As a researcher focusing on AI and child development (and also as a parent of two), I have seen many instances of kids talking to conversational AI agents like Siri, Alexa, or ChatGPT. Kids seem to turn to AI agents to satisfy their curiosity, asking things like what six plus six equals, how far away black holes are, or how to make an invisible potion. And sometimes kids engage in what feels like social chitchat: they share their favorite colors or princesses, or even ask if the AI has its own favorites. Very often, children are amazed, and a bit baffled, by how AI can understand and respond in ways that seem quite smart.

These observations have made me wonder whether the future might entail AI serving as a genuine conversational partner for children—something (someone?) kids can talk to, have fun with, and potentially learn from. We all understand how crucial it is for children to engage in conversations to learn about the world around them; if AI could offer similar conversations, children’s opportunities for learning could be significantly amplified, since kids might not always have an engaging companion available to answer their “why” and “how” questions. Yet I also share the common concern about the uncertain outlook for what could be called the “AI generation.” Is AI’s ability to provide quick answers stunting children’s learning? Is using the “Hey” command to wake up AI making kids forget about politeness? And perhaps most troubling, what if they become more attached to their AI than to the humans around them?

Both hopes and concerns about AI are valid, and it’s important to recognize that we are at a critical juncture in AI development, where its future trajectory is still being shaped. So, how should we approach AI? I believe we should view well-designed and child-centered AI as an additional source of support and learning opportunity for children, one that is different from but complementary to the interactions they have with their family, teachers, and peers.

In fact, my research, and that of many others, has already shown that children can effectively learn from AI, provided the AI is developed in alignment with learning principles. Over the past few years, I have designed and tested AI companions that engage children during book reading, television watching, and storytelling, by asking questions and responding to children based on their answers.

To give you a more concrete idea of what this type of AI-assisted conversation looks like—imagine a young child and their caregiver reading a picture book together. In my research studies, a smart speaker simulates the role of the caregiver, reading the story aloud and pausing intermittently to ask children questions—generally questions about the problems the characters are encountering, how the characters feel, or what the children think will happen next. The smart speaker listens to children’s responses and offers little hints, just as a caregiver would, if the child needs a bit of help answering.

We have repeatedly found that young children who engaged in this type of dialogue with AI comprehended the story better and picked up more vocabulary than those who merely listened to the story without it. What is even more intriguing is that, in some contexts, we found that an AI companion can lead to learning gains comparable to those from engaging in similar dialogue with a human.

If you’re expecting me to suggest that AI can replace human interactions, that’s not my argument. Even when studies show learning benefits, it does not mean that AI can replicate the unique benefits of authentic conversations children have with others. This is because conversation is not just about exchanging information; it is also about building relationships. Children thrive when they engage with someone they can relate to, and someone who can relate to them. So, the question comes down to whether children and AI can achieve this level of connection.

This is a very challenging question to answer, but we can gain some insight into children’s relationships with AI by examining how children talk to AI agents and comparing this with how they talk to other humans. While children were quite talkative with the AI agents in my studies, they were even more talkative with human conversation partners. Moreover, when talking with a human, children were more likely to steer the dialogue, adding their own thoughts or following up with question after question when something puzzled them. These “child-driven” aspects of conversation are the active ingredient that fuels children’s cognitive and social development, and AI still falls short in encouraging this type of engagement.

Why do children engage with AI differently? It boils down to how children perceive, or feel about, AI. My studies with children have made it clear that even children as young as four recognize that AI simply doesn’t look, talk, or act like a human. They also sense that AI doesn’t have the same experiences that they have and can’t genuinely empathize with them. These factors, consciously or unconsciously, affect children’s engagement, making their interactions with AI fundamentally different from their interactions with people.

These differences are not necessarily negative. In fact, we can take advantage of these inherently different experiences to maximize the benefits for children while minimizing undue influences. I’d like to offer two suggestions that might help achieve this:

  • First, we should help children maintain healthy boundaries with AI by being transparent about its nature. Ensuring they understand that they are interacting with a program, not a person, prevents confusion and strengthens their ability to differentiate between AI and humans. This is where the growing number of AI literacy initiatives—designed to teach children critical knowledge about AI—plays a crucial role, empowering them to approach AI interactions with awareness and confidence.
  • Second, we should design AI to encourage parental involvement. In a recent study, our team developed an AI that allowed a beloved Sesame Street character to engage with children during shared book reading. The AI provided discussion prompts to actively involve parents, and we found that this approach not only supported children’s language development but also fostered meaningful and enriching family interactions.

By thoughtfully adopting these approaches, I am hopeful that we can embrace the new learning possibilities AI brings while preserving the irreplaceable human connections that nurture our children’s growth.

Check out our full papers here:

Xu, Y., Aubele, J., Vigil, V., Bustamante, A. S., Kim, Y. S., & Warschauer, M. (2022). Dialogue with a conversational agent promotes children’s story comprehension via enhancing engagement. Child Development, 93(2), e149-e167. https://doi.org/10.1111/cdev.13708

Xu, Y., He, K., Levine, J., Ritchie, D., Pan, Z., Bustamante, A., & Warschauer, M. (2024). Artificial intelligence enhances children’s science learning from television shows. Journal of Educational Psychology. Advance online publication. https://doi.org/10.1037/edu0000889

 

Ying Xu

Ying Xu is an Assistant Professor of AI in Learning and Education at Harvard University. Her research focuses on designing AI technologies that promote language and literacy development, STEM learning, and wellbeing for children and families. 

 

 
