Reimagining AI Through the Lens of Children’s Well-Being

This blog post was originally published by foundry10 and appears here with permission. 

Researcher Stefania Druga, showcasing ChemBuddy, a multimodal AI chemistry assistant she developed.

On a late June morning at Reykjavík University in Iceland, researchers, educators, clinicians, and designers from all over the world convened for the 24th annual ACM Interaction Design and Children (IDC) Conference to share the latest research findings, innovative methodologies, and new technologies in the areas of inclusive child-centered design, learning, and interaction.

foundry10 Senior Researcher Jennifer Rubin and Technology, Media, and Information Literacy Team Lead Riddhi Divanji were excited to co-lead a hybrid workshop this year at the conference on “Designing AI for Children’s Well-Being.” Participants from a wide range of fields and industries applied to join the workshop to share their ideas and learn from others.

“Educators, designers, researchers, clinicians, and caregivers/parents have had the difficult task of needing to be reactive and responsive to a powerful technology that is ubiquitously available to youth, but not designed with safeguards for child development and well-being in mind. This workshop brought all of these stakeholders together to build a shared language around designing AI with children’s well-being at the center. The energy and openness in the room made it clear: people are ready to move from conversation to action,” said Divanji.

The workshop was co-organized alongside Rotem Landesman (University of Washington), Medha Tare (Joan Ganz Cooney Center at Sesame Workshop), and Azadeh Jamalian (The GIANT Room). Together, they guided participants through discussions and activities focused on The Responsible Innovation in Technology for Children (RITEC) Toolbox, a research-based guide and framework created by UNICEF and the LEGO Group for designers to build more thoughtful, child-centered digital experiences.

How to Design for Children’s Well-Being in Digital Play

The RITEC project was developed in response to conversations with children ages 8–12 from around the world. The framework maps how the design of children’s digital experiences affects their well-being and offers guidance on how informed design choices can promote positive outcomes. It also identifies eight dimensions of children’s subjective well-being that well-designed digital technologies should support.

Using this framework as a guide for their conversation, 15 IDC workshop participants discussed their ideas, projects, and proposals for how AI can be designed and used to promote children’s well-being.

“I was struck by how applicable the RITEC framework is to a wide variety of products and solutions. I had gone in wondering how easily we would be able to adapt, and I found that not only were people able to choose other products but also other applications (e.g., evaluation instead of design),” said IDC workshop participant Gillian Hayes, Professor at the University of California, Irvine.

Learn more about some of the proposals, ideas, and research projects workshop participants discussed below.

Opening AI Spaces for Youth Mental Health Support

Jocelyn Skillman, a licensed therapist who participated in the workshop, shared a prototype of ShadowBox—a trauma-informed AI companion designed to hold space for youth navigating violent ideation or overwhelming inner experiences.

“It is not a therapeutic substitute or crisis tool, but a proposed experiment in relational design: an attempt to create a slow, steady digital presence that can model containment, warmth, and emotional pacing,” said Skillman.

ShadowBox stems from Skillman’s clinical work with children and adolescents who experience intrusive or violent thoughts. Many individuals with homicidal or suicidal ideation fear the consequences of disclosure—believing they will be hospitalized, punished, or misunderstood.

“In traditional American care systems, these fears are often justified—violent ideation is treated as inherently dangerous, rather than as a signal of underlying distress,” said Skillman.

Though she sees the risks inherent in a relational AI tool, she can also imagine the benefits of an anonymous, low-stakes space for youth to “experiment with language, regulate affect, and feel witnessed—without fear of surveillance, diagnosis, or moral panic.”

Skillman says the most impactful part of the workshop for her was, “learning about all the resources and research from other participants. I will continue to think about the tension between AI as an ally and resource versus a threat to youth’s development.”

Insights on Teaching AI and Digital Literacy Skills in Middle School Classrooms

Workshop participant Anna Baldi, a middle school technology teacher at The Evergreen School in Shoreline, WA, integrates digital literacy skills with hands-on learning in video production, graphic design, robotics, and programming. This year, she partnered with a social studies teacher to introduce AI tools into the 8th grade curriculum, focusing on prompt generation as well as editing and fact-checking AI-generated content. These activities led to meaningful discussions about the pros and cons of the tools.

“Speaking with students, I have observed the consequences of technology in their lives, including plagiarism, cyberbullying, and social media addiction. However, I’ve also seen students create impressive multimedia projects and engage in thoughtful conversations about AI’s role in their education,” shared Baldi.

In Baldi’s experience, the classroom offers an ideal space for students to explore AI technologies under the guidance of informed educators and caregivers.

“In many ways, technology breaks down the barrier between home and school, so a three-way partnership between teachers, youth, and caregivers is essential for a unified and informed approach to AI tools,” said Baldi.

Baldi sees AI as a valuable tool when used to support personalized learning and foster students’ creativity, autonomy, and emotional growth. When asked how she would apply what she learned in the workshop to her own work in the classroom, Baldi said:

“As an educator I was really interested in the idea that children need to perceive their own learning. It’s made me consider how much my students’ skill levels and improvement are visible to them, and how I can help them see their personal growth throughout the learning process rather than just at the end.”

Designing AI Tools and Guidance to Prioritize Well-Being

AI is showing up in more and more everyday products, including those made for young children ages two to six. As these tools become part of family life, it’s important to make sure they support healthy child development and youth well-being. Right now, there’s a gap between what researchers know and what product designers need in order to build youth-friendly AI.

To help close that gap, the Digital Wellness Lab at Boston Children’s Hospital is leading a multi-phase research project reviewing current studies and gathering expert input to create a practical, easy-to-use guide for designers.

According to IDC workshop participant Brinleigh Murphy-Reuter, a Program Administrator at the Digital Wellness Lab, preschoolers may benefit from personalized learning experiences with familiar characters that can respond like real friends. But some young children are also forming bonds with AI itself—like smart speakers or robot toys—which we don’t fully understand yet. As AI technology rapidly evolves, it’s crucial to learn how to design products that support children’s well-being and development.

Research is still in progress, but preliminary findings show that understanding the emotional impact and formation of relationships between early learners and AI-driven characters is an under-explored research area.

“By studying the importance of AI as a tool, not a replacement for human interaction for young users, researchers can play an important role in bridging the gap between research and industry implementation,” said Murphy-Reuter.

Stefania Druga, an independent researcher focused on novel, multimodal AI applications, also highlighted the importance of prioritizing youth well-being when designing AI tools with her project, ChemBuddy. This tool acts as an AI tutor or “lab partner” for middle-school chemistry students. It incorporates a variety of tangible sensors and interfaces that students can interact with to get answers to questions or extra guidance in real-time.

ChemBuddy helps reduce cognitive load and common frustrations that can occur when students learn about abstract concepts. It makes learning concrete through hands-on activities and uses Socratic dialogue to guide students toward more accurate understandings of concepts if it detects a misconception, instead of just providing direct answers.

This personalized support encourages students to articulate their thoughts and build their own understandings, promoting deeper learning, intellectual agency, and enhanced student well-being in learning environments.

The Future of Designing AI for Children’s Well-Being

The “Designing AI for Children’s Well-Being” workshop highlighted just how essential it is to create space for collaboration across disciplines when developing technology for young people. By grounding conversations in the RITEC framework and surfacing insights from educators, clinicians, designers, and researchers alike, participants reaffirmed the importance of designing AI not just for profit, but with children’s emotions, safety, and autonomy in mind.

“I think the workshop did a brilliant job of drawing out how AI needs to be ethically aligned with how children grow, play, and make meaning. This mirrors wider ethical concerns around transparency, power asymmetries, and consent, but it’s even more urgent with child-centered AI, because children’s developmental trajectories can alter depending on the systems they interact with,” said IDC workshop participant Dr. Nomisha Kurian, Assistant Professor in the Department of Education Studies at the University of Warwick.

As AI continues to shape children’s day-to-day experiences, this workshop served as both a call to action and a reminder: centering children’s voices and well-being must remain at the heart of innovation.

“Our greatest responsibility is not to the algorithms we create, but to the children who will inherit the world we build with them.” —workshop participant Cristine Legare, Professor of Psychology at the University of Texas at Austin.

Learn more about the Digital Technologies and Education Lab and the Technology, Media, and Information Literacy Team at foundry10.