Into the Digital Future: Empowering Children’s Voices with Amanda Third
Laura and Jordan are joined by Amanda Third, Associate Professor at the University of Western Sydney and co-author of the UN’s Digital Rights of the Child. Amanda shares her participatory research approach that empowers children as active contributors to data on their digital experiences. As part of the Global Kids Online initiative, she highlights the digital challenges kids face and calls for adult support and effective policies. Tune in to understand how we can create a future where children can imaginatively harness digital technology.
This transcript of the Into the Digital Future podcast has been edited for clarity. Please listen to the full episode here, and learn more about the series.
Amanda Third: Hi. My name’s Amanda. I’m a professor at Western Sydney University, where I co-direct the Young and Resilient Research Centre.
Laura: So we’ve known each other for quite a long time, I think, at least eight years now, and in that time we’ve worked on some really massive projects. Just to name a few: the UN Digital Rights of the Child, which we’re definitely going to dig into, and the Global Kids Online research project. They’ve had such a massive impact on the well-being and safety of kids, teens, and adolescents online that we’re going to focus our discussion on them. Let’s start with a really big one.
So, the United Nations’ Digital Rights of the Child. That’s a really huge thing. Can you tell us a little bit of the background, and where your work fit into it? What was the goal?
Amanda: Yeah, so that was probably a real career highlight for me. This was a big project to try to ensure that the rights of children could be realized in online spaces as they are in offline spaces. So it was about taking the Convention on the Rights of the Child and reinterpreting it, updating the convention for the 21st century, and thinking about what we needed to do, whether as governments, as private enterprise, or as civil society, to ensure that children could not just be safe online, but really maximize the benefits of digital technology for realizing their rights. It was a really long process. It took us about five years from proposing that there should be one of these general comments to the moment we finalized it and it was formally adopted by the UN Committee on the Rights of the Child. It was a really great process, and the result is that we now have a general comment on children’s rights in relation to the digital environment, which guides states on how to apply the convention to ensure that children are safe in the digital age.
Jordan: Tell us some of the specific things in it, some things that might surprise our listeners.
Amanda: Well, it’s 10,700 words, precisely. It’s a nice, concise text for a legal instrument, right? And it talks about all the different kinds of rights that children have, and then how to ensure that those rights can be respected, protected, and fulfilled. Under the Convention on the Rights of the Child, children have, roughly speaking, three kinds of rights. The first category is around provision, which is all about the basic, fundamental things we need to make sure are in place so that a child can survive, develop, and grow. They also have protection rights, which are about protecting them from all different kinds of harm. And they have participation rights. And it’s the participation rights that get me most excited. The participation rights stipulated in the convention give children fundamental rights to participate in civic and political life. It grants them rights that are equivalent to those of full citizens, right? Those adults who are defined by law as the people who constitute the citizenry. And they stipulate that, to really fulfill children’s broader rights, you have to be speaking with, listening to, and responding to children’s own sense of what they need, what they desire, and what they aspire to.
Jordan: I love that because one of the things that really fascinates me about your work is the participatory approach to research, which obviously is connected to everything you’re saying. Well, maybe it’s not obvious to everyone who’s listening, so maybe you could tell us a bit about first, what participatory research is, how it shows up for you, and how that connects to digital wellness, or digital rights.
Amanda: Thanks for that. But before we do that, Jordan, can I just pick you up on one little thing which I think is kind of important? It’s this question of digital rights. I think we need to be really careful when we talk about what the general comment secures for children. A really important part of the general comment is securing rights for children in digital spaces, right? Thinking about what it means to be safe online, but also what it means to participate fully in the digital environment, in those online spaces. But I want us to push ourselves to think about the value of this general comment more broadly, because what the general comment is actually trying to set up is a way for us to think about how the digital might support children to realize their rights across online and offline spaces, to really open up a radical vision of what technology might offer for children. And that, to me, is really exciting. It’s not a question that we often think enough about. But back to your other question, which was... sorry, is that okay, that I just went rogue there?
But what is participatory research? And what’s the value of it? Participatory research with children is about taking seriously the idea that children should be agents in conducting research, generating insights, interpreting those insights, and then wielding them. So the idea is to think beyond doing research about or on children, and to really do research with children, thinking about them as partners in the research process. It sounds lovely in principle, but it’s actually quite a difficult thing to achieve in practice, because, of course, the minute you begin to do anything with children as an adult, you’re activating a whole set of power dynamics that get in the way of all kinds of good things. So it’s a constant challenge inside the participatory research process to stay attuned to those power dynamics, to think creatively about how to work with them or subvert them, and to really open up spaces for children to participate meaningfully.
Jordan: Yeah, what does that mean, if we were to get really specific about it? What do we get from doing research with children rather than on children that’s different? Do we get more accuracy? Where do we get more accuracy, and where do we get less? All those questions.
Amanda: Yeah, well, thanks, Jordan, you do need to pull me up to get me to stop talking abstractly and speak more concretely. So what does it give us? I think what participatory research really gives us is a way to make sure that the solutions we design for children really resonate with children, and it positions those solutions for the best possible uptake and impact. Working really closely with children can supercharge what we’re able to achieve in terms of not just protecting them, but actually leveling them up and enabling them to really maximize the value of being online and using digital technologies.
I think, too, and this is something we really underestimate: it makes research really fun. Because when you’re having fun, you’re in a much more creative space, you’re much more able to think about things from different angles and generate new ideas. Or, to put it in the words that a teenager recently used with me, it’s way more vibey, right? You can just do more stuff. So I think all of us adults who are involved in designing the digital world for children really need to make time to take our shoes off and stand in the grass and remember what it’s like to be a child. Come down to street level in order to level up, right? In a nutshell, that’s what I would say it offers us.
Laura: I love that, Amanda. You know, we’ve recently been working on a project together on a different theme, and one of the key findings that came out of it was that the thing we’re all missing, whether we’re platforms who work with young people all the time, academics, or NGOs in this space, is youth participation. Youth-led product design is so vital, and we all talk about it, we all want it, we’re willing to do it, but actually making it happen feels scary. But I feel hopeful we’re going to start seeing some progress in that space. I also just wanted to pick up on what you were saying about young people being given that platform, that opportunity to go and do the things that they want to do online. We see this political activism from young people which is absolutely incredible. And it’s the first time, I would say, particularly in the online space, that we’ve seen this real movement happening. It feels a little bit like the hippie revolution, which I’m quite a fan of. But yeah, Godspeed, kids. I think you’re going to be fine, and hopefully we will all enable you on that journey.
I would like to move on briefly to talk about the Global Kids Online project. In our previous series, one of your good friends, Dr. Sonia Livingstone, joined us, and we talked about EU Kids Online, her participation in that, and a little bit about Global Kids Online. Amanda, obviously you’re based in Australia; I think we’ve briefly touched on that. But what was your experience like working with the youth participants, and are there any findings or takeaways you would like to share with us?
Amanda: Yes, I’ve been really, really lucky to have a role in Global Kids Online as an expert adviser. I guess I’ve come into that entity as someone who brings some child participation expertise, and having talked to a lot of children over 15-plus years of doing this work, I have learned a lot. What’s really lovely about the Global Kids Online initiative is that it’s trying to generate evidence in places where we don’t have the evidence to drive policy and programming. This is a really big challenge for the sector around the world, especially when we reflect on the fact that one in three users of the internet is a child. That’s a huge number of users. A lot of those children are coming online as mobile-first users: they don’t connect via a desktop, they connect via a mobile phone, and that comes with different kinds of constraints and capabilities. But also, many children are coming online in communities where digital technology is a really new thing, and they don’t always have cultures of support around them, because the adults in their lives are rapidly getting up to speed, too. I mean, Australia is supposed to be a reasonably advanced, technology-adopting country, right? I’m sure there’s a technical term for that; we’re thought of as early adopters. And even I think about my own experiences as a parent trying to keep up to speed and be a good parent to a child who is growing up in the digital age, and I find it really challenging. So those kinds of challenges are accentuated in many parts of the world, and we really do need a body of evidence to help address that.
I think also, what is lovely about the Global Kids Online initiative is that it is trying to generate quantitative evidence, but also qualitative evidence. And I think this is super important, because numbers don’t tell us everything, and it’s really important to drill down into children’s experiences and really understand why they make the choices that they make, why they do things, what they’re able to do online and what they’re not able to do online. To really understand the why behind the numbers.
Jordan: So what do you understand so far?
Amanda: Oh, Jordan, you like to ask a big question, don’t you? Well, look, we’ve developed a bunch of methods that enable us to talk with children in a deep way at scale. We call this distributed data generation. It’s about developing a workshop, typically about five hours long, and then working with child-facing organizations in different countries to run those workshops with children, gather their insights, and then analyze that data in conjunction with those child-facing organizations. And through this process, we’ve learned a lot. If I think about the things that children say are the biggest challenges for them, number one are the access issues. We know that there are still really obstinate digital inclusion challenges impacting the ways children engage online. These range from disrupted internet connectivity, old devices or hand-me-down devices with batteries that don’t work, all those typical hardware issues, to things like parents making rules that restrict access and really limit children’s capacity to engage meaningfully, and the attitudes of the adults in their lives. One thing children highlight often is that teachers don’t really know enough about the digital world and how to bring it to life at school and value it, and all of these sorts of things. They also say things like, you know, adults (I love kids, they’re so wonderful when they get going), “Adults say, on the one hand, ‘Put your devices away. Do this, do that,’ and then, on the other hand, they just flagrantly break their own rules, and they’re not role-modeling in the ways that we would like them to.” So they’ve got a lot of what we might think of as quite valid critiques of the ways that digital technology is spoken about.
But they also, and this is the thing that I love the most, have this incredible enthusiasm and optimism about the ways that digital technology can benefit them. They really do see digital technology as an enabler, for the most part, and they’ve got these really wonderful visions of how technology can help to unite the world, how it can make it a more just and good place to be. But they’re also calling on governments, on companies and platforms, and on the adults in their lives in general to really make sure that these spaces can be those enablers, and that the risks of harm are minimized. Discrimination, hate speech, misinformation, privacy breaches: these are the things that really bother children, and they’re really looking for us to step up and find some solutions.
Laura: Amazing piece of work. I’m sure that everyone involved in it is incredibly proud. Certainly, every time the reports come out, I dig in, and I’m always really fascinated by how things have changed over time. Globally, it feels as though lots of people are still suffering a little from that lack of access you mentioned, but access has probably leveled out at a global level, whereas the countries that saw themselves as way ahead in the past may have now dropped down a little. So that’s quite interesting.
Amanda: And I think, Laura, that’s really true. Often it’s really easy for us to talk about young people in the global North as the most connected generation, etc. And we often forget actually, this incredible diversity in the ways that children engage with technology. Some have much more access than others, and really some of the divisions within countries are quite stark. And so digital inclusion is still a massive challenge for us, not just in low-income settings, but in high-income settings as well.
Laura: Thank you. So, Amanda. yeah, I mean amazing work that you’ve done already, and I’m so happy that I’ve worked with you on some of these projects.
Jordan: I wonder, with that in mind, do you think there are things that we (and I don’t know who “the we” is, whether it’s researchers, the tech industry, academics, or policy) are not focusing on that we really should be focusing on?
Amanda: My biggest gripe is that we’re not imaginative enough about the way we think about children’s relationship to technology. It’s very true that there are lots of risks of harm associated with children engaging online, and we have got to find ways to protect children online. But at the same time, we need to think in a kind of maximal way about what the possibilities really are. We want to be teaching children to grab hold of this technology, to think with it and work with it, to do things that we haven’t thought of yet. I think it’s really easy for us to get stuck in our thinking about what those possibilities are, and we need to give children a chance to think differently. So yeah, to my mind, we really need to open up those spaces.
Laura: I love that so much, Amanda. I can’t wait to see what that future might look like. And hopefully it’s something we can all get behind.
How Would Kids Design a Social Networking App That Supports Their Digital Citizenship?
In the tech world, kids’ ideas often go unnoticed and unheard. Even worse, when they voice their ideas, they might be dismissed or laughed at because we, as adults, may not know how to respond to their imagination. In partnership with the Cooney Center’s Sandbox initiative, we at The GIANT Room are working to give kids a seat at the design table of real products. We believe that this is beneficial not only for the kids themselves, but also for the designers of these products and for the quality of the product being developed. Through interactions with their intended users, designers have a chance to learn more about kids’ worlds, their interests, needs, and perspectives: insight that goes beyond “usability” concerns. As a result, they will be better equipped to design products that are actually engaging, effective, accessible, and inclusive of kids with diverse backgrounds and cultures.
To highlight how kids have influenced the design of a real product, we invite you to read about the co-design process of Mrs Wordsmith’s Planet Agora, a social networking app for kids 8 to 13 years old that was tested by a group of 30 GIANT kids over the course of two cycles of product development.
Pierre Lagrange, chairman at Mrs Wordsmith, wanted to create a product that enabled kids under 13 to engage with an age-appropriate social network. He said they wanted to “develop Planet Agora to empower kids with the social-emotional tools that help them champion positive values, while navigating the nuances of such concepts as diversity, body positivity, and gender equality.”
However, important questions remained about how to design a chatting app for kids that supports their social-emotional well-being. Could such an app help them learn about concepts such as gender equality, fake news, and indecency, while supporting them as they learn to act as moderators in their community? To answer these questions, designers from the Mrs Wordsmith team worked with the Cooney Center Sandbox and kids from the GIANT DesignLab to playtest, prototype, and co-design Planet Agora.
Involving Kids as Co-designers at the Very Early Stages of Product Development
To truly understand kids’ perspectives on Planet Agora, the Mrs Wordsmith team started playtesting with kids from very early on, before they even had a playable digital prototype of their app. Designers from The GIANT Room prepared a paper prototype based on wireframes. The paper prototype was designed to answer specific fundamental questions about the user experience and interactions in the app such as:
- Engagement: How might kids moderate the content of their chats with their friends?
- Rewards System: How might kids react to the rewards mechanism in the app? What changes would they like to make? How would they go about designing a reward system to motivate them to moderate the content AND stay positive in the app?
- Social Emotional Skill Building: How might we design the app to help kids build social emotional skills and improve their digital literacy?
Playtesting the Paper Prototype
For our first co-design session, we had 20 kids (9 girls and 11 boys, between 8 and 13 years old) join us for one of our playtesting and prototyping sessions in person in New York City.
Some of the parents who signed their children up for these sessions mentioned that their child had experienced “bullying” at school, or wanted to build their anti-bullying awareness. Others mentioned their kids spend a lot of time online, and they wanted to make sure their children are equipped with the digital literacy needed to navigate these experiences. Still others shared that their kids were interested in learning more about the thinking and decisions that go into making digital media such as movies, shows, and apps. These responses suggest that the product can address the real concerns of families.
One of the most important aspects of co-designing with kids is ensuring that all participants, kids and adults, feel comfortable sharing their thoughts and believe that their ideas and contributions have equal weight. We asked kid designers to examine the wireframe of the app and offer their impressions of the look and feel of the interface before playing with the paper prototype. As they played with the prototype, they showed a high level of engagement, chatting and moderating each other’s conversations. This was an important finding for the Mrs Wordsmith team, confirming that the core experience of the app is in fact engaging for kids.
We also paid attention to how kids moderate content. We observed that kids can easily sense when something is “not nice,” but classifying offensive content under specific categories proved to be more challenging for them. In most cases, they could identify when content was an example of body shaming or racism, but found it much harder to identify “hate speech,” for example. These findings suggest that the Mrs Wordsmith team could gradually introduce players to more challenging pillars as they build their social-emotional skills.
Kid designers were then invited to prototype certain features of the app or new features they would like to be added to the app. They made their prototypes with arts and crafts materials, or simply by sketching their designs and then sharing with the adult designers. They made prototypes of their avatars, creator tool kits they would like to see in the app to customize their avatars, new features such as an SOS button to call their friends when they need support or when they felt they were being bullied in the app, modes of communication, and more.
Data from the paper playtesting also helped us to identify nuances in kids’ experiences in the app — how might kids react when being targeted with a negative comment? How might we design to allow or encourage “apologies?” How might we discern objective versus subjective moderation? How might we support kids when they feel targeted or when they need help in identifying a pillar? We aggregated data from playtesting and prototyping exercises from all four sessions, outlined findings to these questions, and provided a series of recommendations for the Mrs Wordsmith team to consider implementing in their next prototype for the app.
Designers come in all shapes and sizes. At Mrs Wordsmith, we believe in co-designing with the children who use our products, so we make sure to have them on board at every step of the design process. It’s not just talk. It’s about creating something that truly resonates.
— Lady San Pedro, Director of Business Innovation
Inviting Kid Co-Designers Back
For the second round of designing with kids, the Mrs Wordsmith team traveled to New York City and joined a group of eight kids at The GIANT Room for three sessions of co-designing and prototyping. For this round, they were interested in kids’ preferences for learning methods and instructional media (so that the app can more effectively facilitate social-emotional learning), kids’ attitude towards conflict, their relationships with adult counselors, and their views on online privacy and security.
During these sessions, GIANT kid-designers playtested the digital prototype of the app, engaged in deep conversations and interactions with the adult designers, exchanged ideas, provided feedback, and recommended new features or updates in the app. A number of the kid participants had previously joined the paper-prototype playtesting sessions and were ecstatic to see the Mrs Wordsmith team’s progress and to play with the digital version.
Kid vs Adult-Designed Instructional Materials
During these sessions, we learned that kids enjoy comics, videos, and animations that explain challenging concepts using examples to which they can relate and connect. For example, one GIANT kid-designer created a comic about indecency: a chicken spray paints the White House.
Another child teamed up with one of Mrs Wordsmith’s designers to write a comic about fraud. Together, they came up with the story of a friend who sold a car to another friend, but when the buyer received the car, he realized that it was actually a toy car — for the price of a real car! Funny and accurate, right? That’s what happens when we collaborate with kids to explain ideas in ways that are kid-friendly, effective, and engaging.
Kid-designed instructional materials were relatable, humorous, and concrete. This was in contrast to adult-designed materials which were filled with “formal” definitions and examples that were sometimes too far removed from the kids’ world.
Co-Designing the “Counseling” Feature in the App
Next, adult and kid designers started discussing and co-designing the counseling feature in the app. We learned that kids care about who their counselors are; they want the choice to read about them, including their gender and method of counseling, and sometimes they are willing to wait for their preferred counselor. Alternatively, they are also willing to speak with an “AI counselor that’s not biased,” as long as they know they are talking with an AI bot. They also emphasized the importance of being able to explain themselves and having the chance to apologize. This is a pattern we also saw during the playtesting. Often, when a child posted something that was offensive to others and was flagged, they immediately wanted to explain why they posted what they did, and in many cases it was obvious that they didn’t mean to offend anyone. These moments of conflict were golden learning moments for them to learn about Planet Agora’s pillars, reflect, and empathize with other users.
On Privacy and Security
Another surprising finding was how conscious and opinionated kids are when asked about their online privacy and security. They discussed how they wish they had agency over what can be shared about them by others. They even thought AI technology could play a role in securing their privacy; for example, if someone posts a photo of them without their knowledge, AI can detect their face in the photo and notify them. They also came up with yet another “pillar” for Planet Agora: “Respecting each other’s privacy.” They wanted to flag messages that they thought might violate their privacy or someone else’s: “What if someone shares a photo of someone else’s private diary without them knowing? I want to be able to flag that post, and I want it to become ‘blurred’ right away so no one else can see it until we know if that person actually had the permission to post it.”
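The workflow the kid designers described is, in effect, a small moderation protocol: flagging a post blurs it immediately, and it stays hidden until someone verifies that the poster had permission. As a loose illustration only (Planet Agora's actual implementation is not public, and every class and function name here is hypothetical), that protocol can be sketched in a few lines:

```python
# Hypothetical sketch of the kids' proposed flag-and-blur privacy workflow.
# None of these names come from Planet Agora; they are illustrative only.

from dataclasses import dataclass


@dataclass
class Post:
    author: str
    content: str
    blurred: bool = False
    privacy_flagged: bool = False


def flag_for_privacy(post: Post) -> None:
    """Flagging blurs the post right away, before any review happens."""
    post.privacy_flagged = True
    post.blurred = True


def resolve_flag(post: Post, had_permission: bool) -> None:
    """After review: unblur only if permission is confirmed; else stay hidden."""
    post.privacy_flagged = False
    post.blurred = not had_permission
```

The key design choice the kids asked for is the ordering: the blur happens at flag time, not after review, so nobody else can see a potentially private photo while the question of permission is still open.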
Bringing Kids’ Voices into the Product Design Process
What we have shared in this post is just a small window into what we’ve learned about kids’ viewpoints and their ideas when it comes to designing a chatting app for them. When co-designing with kids, adult designers need to be conscious and aware enough to choose what to pay attention to and how to construct meaning from the experience. They need to be able to connect with kids, engage in meaningful conversations that actually elevate kids’ voices, and give them space to express themselves. They need to be able to ask questions, pause, and endure the silence without jumping to provide or suggest an answer. There is an art in co-designing with kids and if a product team succeeds in doing so, truly magical products can be developed – products that will be engaging, effective, and that are actually good for kids’ well-being.
We’re proud to be a part of the Sandbox Initiative from Joan Ganz Cooney Center; getting kids involved in the design process can make a huge difference in the industry and consequently, the lives of millions of kids around the world. The initiative makes it possible for product designers to connect with kids, make meaning of their interactions, and learn how they can apply their design findings into the product development process.
Azadeh Jamalian is founder and CEO of The GIANT Room, a creative STEM organization helping the next generation of creatives and inventors to think big and act on their most ambitious ideas. Jamalian is a TED speaker, former head of education strategy at littleBits, and co-founder of Tiggly. She has a proven record of designing learning products and programs that millions of children have engaged with and thousands of schools have implemented in their programs. She has a PhD in cognitive studies in education from Teachers College, and has publications on a broad range of topics such as designing learning platforms for children, emerging educational technology, game design, mathematics education, and cognition.
Into the Digital Future: Preventing Digital Harassment with Trisha Prabhu
Trisha Prabhu, an innovator and anti-cyberbullying advocate, joins Laura and Jordan for a powerful discussion on combating digital harassment. Trisha shares insights from her journey as the creator of ReThink, a technology solution designed to detect and deter offensive language before it’s posted online. Together, they explore the current landscape of online bullying and the proactive steps being taken by her company and the tech industry to combat it.
This transcript of the Into the Digital Future podcast has been edited for clarity. Please listen to the full episode here, and learn more about the series.
Trisha Prabhu: Hi, everyone. My name is Trisha Prabhu. I am the inventor of ReThink, a patented app that stops cyberbullying before it happens. I first got into this work as a teenager. I had my own experience with bullying and cyberbullying, and I saw the issue and how it was affecting youth in my community and around the world and really wanted to do something about it. I didn’t want to be a bystander. I wanted to be an upstander. And my vision was tackling cyberbullying at its root with the cyberbully, before they say something mean or hurtful, that in the moment, they may not realize it’s having that kind of impact, but then later might regret, and so I came up with this idea for ReThink. At the time, it was just an idea, but then for an 8th grade science fair project, I had the chance to test it, to validate it, and realized it was an incredibly effective way to tackle cyberbullying. And that launched a multi-year journey to take this idea and bring it to life and create a technology that could help cultivate, as we say it at ReThink, a new generation of responsible digital citizens. So that’s the mission I’m working towards. Yeah, that’s me in a nutshell.
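The general pattern Trisha describes, intercepting a message before it is sent and nudging the author to pause, can be sketched very simply. To be clear, ReThink's actual detection technology is patented and far more sophisticated than a word list; the filter, word list, and function names below are purely illustrative assumptions, not the product's method:

```python
# Illustrative sketch of a pre-send "rethink" nudge, NOT ReThink's actual
# (patented) detection model. The word list and names are placeholders.

OFFENSIVE_TERMS = {"loser", "ugly", "stupid"}  # placeholder list only


def looks_offensive(message: str) -> bool:
    """Return True if the draft message contains a flagged term."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return not OFFENSIVE_TERMS.isdisjoint(words)


def send_with_rethink(message: str, confirm_anyway) -> bool:
    """Decide whether a message goes out.

    `confirm_anyway` stands in for a UI prompt: it is called only when the
    message is flagged, and returns True if the author still wants to send
    after being asked to pause and rethink.
    """
    if looks_offensive(message):
        return confirm_anyway(message)  # give the author a moment to rethink
    return True  # nothing flagged, send immediately
```

The point of the pattern is the pause itself: rather than blocking or punishing after the fact, the intervention happens in the moment of composition, which is where Trisha's science fair research found it to be effective.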
Jordan Shapiro: Wow, I’m so excited about this. You said so many things there. I want to follow up on one. I’m going to ask you to really just explain so much about ReThink, but before that, I want to go back to one thing you said, which is that you had your own experience with bullying and cyberbullying. I mean, trigger warning for anyone listening, because I’m about to ask you to tell us about it. What was it like for you to be 13 years old? What did it feel like? What was the everyday experience of being a 13-year-old in a world of smartphones and social media?
Trisha: Yeah, I think the words that come to mind are overwhelming and lonely. I think, especially when you’re a teenager or a young teenager, so much of what you think and see about yourself is reflected in other people. I saw other people kind of as a mirror, and I really believed what other people had to say about me. I think with time, you kind of have the maturity to realize, okay, that person might be struggling with something, right? And so they’re projecting onto you because of their own insecurity or because of their own challenges. But as a young person, you don’t realize that. You believe what other people say about you. So, for me, being excluded or receiving texts that I thought were from one person but were actually from another person and then, you know, were used as a joke and spread to all the students in my school. That was telling me, you know, you’re not worthy, you’re not loved, you’re not important. And I really believed that. And so for the longest time, I just felt incredibly lonely, like I didn’t really know where I fit in the world and I wanted to find that place to fit and was struggling to find that place to fit. And so it was hard because, again, that’s all I knew. And it took a while for me to then rebuild and repair that confidence and realize and affirm that I was worthy, that I was important, that I was loved. So it is really difficult.
Jordan: Yeah. I think in your book, you used the term “girl drama” exactly to describe it. And you just did such a beautiful job of sort of explaining the feeling that happens when we see this sort of everyday girl drama that sadly is glamorized in our television shows. So it’s really fantastic. Do you imagine it feels similar to 13-year-olds right now? Do you think things have changed, or do you think it’s a similar thing with maybe better solutions, like the ReThink app?
Trisha: Yeah, first of all, I'll just say girl drama, it's tough. I always tell middle schoolers it gets better, because it's a tough period. In terms of how it's evolved, I think it's gotten better in some ways, because there was far less awareness and advocacy around cyberbullying when I was 13, which, believe it or not, is coming up on 10 years ago; I can't believe it's been that much time. But there was so much less conversation about it. And I think now, increasingly, there is literacy, education, advocacy, conversation. There's been an effort to try and get rid of the stigma, which is a huge part of the work that we do as well, along with technology solutions to try and prompt and teach youth to be smarter. On the other hand, I think that social media has also evolved and changed, right? And created new opportunities and new challenges. So it's always a balancing act, and it's certainly, I think, still a very important issue, one that we have to continue to work to address.
Laura Higgins: Definitely.
Jordan: Before I hand you the next question, Laura, I just want to say, that you use the word “girl drama” in the book, but it’s not unique to girls. Everybody’s experiencing it, and even the idea of girl drama just normalizes the notion that we think we should accept it and not fight.
Trisha: Absolutely. It's a rite of passage, right? That's how it's portrayed. And I think we've got to do more to try and rework that narrative. For sure, yeah.
Laura: I remember Rosalind Wiseman wrote the books around Mean Girls and Wingmen and those cliques. They still exist, but hopefully the education and the way that we talk about those things does help people to be more resilient and feel more able to manage the situations around them.

I'm going to dig into more of the technical stuff, so I want to know much more about the app itself. How does it work? What happens? If I was going to download it on my phone now, what would be the process? How would I get my hands on it, and what would it do?
Trisha: If you download the app onto your phone, it's very simple. You go to either the Google Play Store or the App Store. The ReThink app is free. So you get the app, and when you download it, the setup process, which is really simple, in essence consists of replacing your mobile device's default keyboard with our ReThink keyboard. So the keyboard on your phone that you type on, we just swap that out for a keyboard that has the power to detect offensive content in the moment and then prompt you to rethink. And what that means is that, in the end, we're platform agnostic, so we can work across every app on your phone, from email to social media, and instantaneously, in the moment, detect offensive content intelligently and then give you a chance to pause, review, and rethink.
Jordan: So you said already that you had tested it during your 8th grade science fair and had great results. Tell us how all of it works. I know you have numbers, statistics. It doesn’t have to be the 8th grade success, I assume you have even more numbers now.
Trisha: Yes. So, I mean, it's incredibly effective amongst the youth age group, 13 to 18. Over 93% of the time, when this age group receives the ReThink alert, "Whoa, hold on, are you sure you want to post this?", they change their mind and decide not to post the offensive message. And over time, we've innovated to try to keep that statistic really high. So, for instance, you'll get a different ReThink message each time you receive an alert, to keep the shock factor of receiving that alert high and really intervene, so you don't become desensitized to the ReThink alert. But the fact that we can proactively stop cyberbullying at the source so overwhelmingly, 93% of the time, I think is a testament to the fact that so much of what happens on the internet is in part because of a lack of friction, right? So much of what we don't like seeing on the internet happens just because it can feel very ephemeral. And reminding people of their humanity, bringing their conscience to the forefront, is a simple yet really powerful intervention.
Laura: And I think it’s one that a lot of adults could do with as well. This is not just for young people. I really want to say that.
Trisha: That was actually our biggest request during the pandemic, believe it or not, the pandemic started, and we started to get all these requests from companies like, is there ReThink for adults? Is there a ReThink for the workplace? Because everything transitioned online, and suddenly Slack was getting, like, a little snarky. So, believe it or not, we do have ReThink for Slack coming soon. One of our biggest product requests.
Jordan: Can I ask a behind-the-scenes question about ReThink? The kind of bullying we started talking about, the kind we see glamorized in the media, I can imagine that's not the hard stuff to program in, right? How to recognize when your language could be on the edge. But there's also so much that it takes to really do that in a way that is trauma-informed, in a way that's thinking about all the potential micro-aggressions. There are so many everyday statements we say that have never been examined. So how do you think about staying up to date, right? Being informed, learning all these things as scholars and people finally tell us, wow, it's been hurtful to hear that language all the time, when we didn't even realize it was bullying for 20 years or 50 years or 100 years. How do you think about staying on that edge? Or, I guess, making sure that the underlying code isn't just replicating existing social justice problems?
Trisha: Such a great question, and I think such an important issue, particularly where you have AI or any type of intelligent technology in the picture. We have to be really open, really transparent, and really, really critical and intentional, because perpetuating existing language that is harmful, that is creating harm, goes directly against the mission of what we're trying to do. So in our case, it's two things, I think. One is a mindset, right? The mindset is that we're never done; the product is never perfect. You know what I mean? It's never over; it's constantly being updated. And part of that is just because language, period, evolves, right? Slang evolves, kids get more creative. It's very common, for instance, on video game platforms for young people to try and get around filters: take an offensive word and just get rid of a letter, get rid of two letters, put in a star, right? Young people get creative, so we have to get creative too, right? And we have to be conscious of those changes and that trend and evolution. The other thing, to your point, tied to that mindset of we're never done, is also constantly thinking about what we are learning, right?
What are we learning about ourselves, about the world, about how people communicate with one another, and about what is and is not appropriate? So, always being in tune with that. And that leads into the second thing, which is practice. For all of our technology, we work with psychologists. We work with linguistic experts. When we're developing this technology in international languages, we're working with local, on-the-ground linguistic experts to think about the appropriate context and to think about how language can affect other people in ways that we may not understand or expect, right? If there's one takeaway I learned from this entire process, it's that, one, I'm not a language expert, and two, language is really hard, to your point. It's very easy to think this is simple when you think about the extreme cases, like "I hate you" or something that's super obscene. But there's a lot of gray. There's a lot of gray, right? And we have to be really conscious, too, to stay true to our values and our beliefs. For instance, some people might say that some language that's used by the LGBTQ community is offensive. We completely disagree, right? And so we have to stay true to our values, and we need to draw really clear lines and be careful not to get swept up in some of that. All that's to say, it's a process. It's a combination of a mindset (we're never done, we're always going to be intentional, we're always going to be critical) and the practice of bringing in experts who can offer the insight, the advice, and the expertise necessary to do this in a way that is rigorous and up to the standards that we think are absolutely needed.
Laura: Fantastic. That was really good, really helpful. And yeah, it resonates with me. Again, we see these things both on Roblox and more broadly in the teenage community: they will be very inventive about how they insult one another. And actually, a lot of it might not go against a platform's normal terms. It can be really subtle. And again, that cultural nuance and difference is really interesting, and it is a huge task. So well done. And it's great, actually, to hear that you work with child development professionals as well as other experts in the field. So, a lot of our audience are parents, and they are fed a lot of scary stuff from the media. What sorts of things do you think are the red flags? What do you think parents and educators, and actually kids and teens, should be worried about? And also, maybe, what shouldn't we worry about so much of the scary stuff we're seeing?
Trisha: Yeah. I mean, I think, first and foremost, the good news for parents and educators is that the most extreme consequences of cyberbullying do happen, but extremely rarely, right? So we see stories, and it's heartbreaking, and it is really horrifying, when we see, for instance, cases of, as you said, Laura, lives lost to cyberbullying, or self-harm or suicidal tendencies as a result. But again, those are very rare instances. So that's the good news. We should, of course, have that in mind and never want things to escalate to that point. But I don't think the mindset coming in should be, oh my gosh, if I give my child the device, they're going to explode, or if my child is on a social media platform, they're going to explode. My advice to parents and educators is more: how can you create open channels of communication with your children so that, if and when a problem does arise, your child can come to you? I think the number one thing I encourage parents to do is not to wait to have a conversation about their child's phone or their internet use until a problem arises, which is a very common strategy, right?
That's the first time you have a conversation about it, or the first time you're talking really seriously about it. How can you do that proactively so that, one, a child feels comfortable coming to you if they experience something negative, but also, two, you're always talking about and learning about their experiences and what they're seeing, and you can help shape and inform, right? You can be a part of that conversation, as opposed to it being something that's just happening on the side, very siloed. That's my big piece of advice. And then, in terms of what to look out for, I think conversations with your children about how they see themselves, about self-image, are really important, because these more subtle effects of social media and interaction online are, I think, more common. Certainly I've heard it in conversation with young people, right? So it's not the child exploding; it's more these little learned attitudes or perspectives. So how can you have conversations with your children? Again, this comes right back to that first point of keeping those channels of communication open. When those channels are open, then you can have conversations with them about their perspectives or what they're internalizing, and ensure that they're staying true to themselves and that they're remembering, and being reminded, that they are important and that they're perfect as they are, right?
I think those two things are super important. And then I think having conversations with children about what is and is not appropriate to say is also super important. It sounds super basic to say, but I think a lot of young people can sometimes feel, again, because of the ephemeral nature of the digital world, a little creative license, like, I can be a different person; I have a digital persona; that's not me. And I think it's worth having conversations with kids about the fact that what they say online has consequences for other people, but also for them. You'd be surprised how many young people I've met who don't realize that a future employer might be looking at their social media, right? So I think having conversations around that is super important. But yeah, I think overall, the big takeaway for parents and educators is the good news: the phone is not a ticking time bomb, right? That's not what it is. It's more a pathway to a new world where a child can learn and explore and develop different parts of their identity. And how can you be a part of that process so that that development is healthy and positive and not negative? Not so that they are saying or receiving bad content, and not so that they're internalizing or starting to buy into assumptions that are completely off the mark. I think that's more how we need to be thinking about tackling the challenges that come with devices.
Laura: So it's critical thinking skills again, as well as spotting the misinformation.
Jordan: Yeah, absolutely. And you’re starting to bring this mission through technology and programming to schools, right. What does that look like?
Trisha: Yes. So we do a lot of collaboration with schools, both in the US and globally, and the details depend on the context. Sometimes we'll work directly with schools; if we're working internationally, often we'll work with an organization. But we'll bring the ReThink technology to schools, and we'll also bring a digital literacy and wellness curriculum where we're teaching young people those digital literacy basics in an age-specific way, while also incorporating socio-emotional learning skills. So that's a big part of what we're doing now. And we think that's super important, because school really is the place we go to learn all the things that are crucial for life, right? Or, in theory, that's what school is supposed to be. Growing up, I always had to learn about substance abuse; we learned about a number of important issues. We never really talked about the internet. There was no thought given to it. It's not enough to just give young people a phone. We need to give them a framework with which to use that phone. We need to give them a toolkit. We need to give them the language. So a lot of what we do in schools is creating language around things that some young people or adults would think are obvious.
But a lot of young people, often especially at a young age, don't really have the tools to identify and say, oh, that's the problem, or, this is the specific issue, and this is how it's creating harm. That's a huge part of what we do. And then another huge part of what we do is just empowering young people to feel confident about issues that sometimes can feel taboo, right?
It can be hard, for instance, to talk with parents about sexting. I'll just put that right out there. That's a conversation parents don't want to have. It's a conversation children don't want to have. And the result is no one talks about it. But the truth is that, again, while it's rare, according to the most recent statistics from Pew, 7% of US teenagers, 13 to 17, have had images that they've sent forwarded on without their consent, right? So it is a problem, and we have to have these conversations. So part of what we try to do is use that platform in schools to convey crucial information that young people may not be receiving or talking about in other arenas, because it's too taboo, despite the fact that it might be a real issue that maybe they don't confront themselves, but a friend confronts. And again, being educated is an opportunity to help them.
Laura: Absolutely. Love that. Well, this has been amazing. I’m going to come back to my last question, I think. So, Trisha, what next for you? What are your big dreams? What are you doing with yourself now? And also thinking specifically about ReThink? What are the next steps? What are you incubating for that as well?
Trisha: Yeah, so, for me, I'm really blessed. I'm actually here in the UK with you, Laura, doing my postgraduate study at the University of Oxford as a Rhodes Scholar. And I'm here doing research on the internet, very fittingly: thinking about cyberbullying and hate speech, but also about misinformation, which is increasingly an important issue affecting the internet, and affecting youth and how they're receiving information online. So doing research in that space is keeping me busy. And as I think about the future, I hope that a lot of it is ReThink, but I also hope a lot of it is thinking about new, innovative solutions, right? Technical solutions, but also, hopefully, solutions that sit at the intersection of a number of stakeholders, you know, parents, policy, tech, everyone coming together to envision more broadly: what does a good internet look like? That's kind of what's motivating me, right?
Because I think the power of something like ReThink, right, is that it addresses an important issue. I think the limit is that it's one solution, and we have a vast array of problems, and so I'm hoping to get at that vast array of problems. At least that's my vision. In terms of ReThink and our vision as we look to the future, I think the biggest thing is expanding internationally and tackling cyberbullying in a global context. And that means also thinking about what online hate and harassment look like in different parts of the world, because it looks different in the US, it looks different in the UK, and it looks different in Lebanon, which is a country we're bringing ReThink and our educational curriculum to next month, actually, which is super exciting. But with the COVID-19 pandemic, there's been this huge acceleration of technology and internet use. So in parts of the world where cyberbullying was less of a conversation or less prevalent, it is starting to become an issue, particularly amongst youth populations. So how can we scale ReThink up to those populations? Make the technology available in new languages, develop a curriculum that is appropriate for the context, and hopefully try to address some of its worst effects before they happen, right?
In places where this is an up-and-coming issue, if we can really get to the root of the problem before it fully manifests, that's the ideal. So that's, I think, the big vision for ReThink, along with, hopefully, making ReThink the internet standard; that's my big vision for ReThink. I think it's super powerful, right, but one of its biggest limits, as an app at least, is that it has to be downloaded. That's part of the reason we work with schools, and a big part of the reason we work with communities: so that we can use them as a conduit through which to roll this technology out to young people. But in the long term, how great would it be if this could just be the internet standard? If this were the default, and we could work with big platforms and big companies to make this just a reality of what our experience of interacting online is, that would be the absolute best. So that's the longer-term vision.
Laura: So pretty big stuff. One to watch, I would suspect. And hopefully we’ll be having another conversation in a few years about all of these amazing things you’ve achieved, fixing the internet being one of them, by the sounds of things.
Into the Digital Future: Building Child Safety into Digital Platforms with Jennie Ito
Jennie Ito, a developmental psychologist and expert in children’s media and content appropriateness, joins Laura and Jordan in a compelling conversation on creating safer digital environments for kids. Drawing from her vast experience, which includes spearheading content policy development for kids’ experiences on YouTube, Google Play, and her current role at Roblox, Jennie provides a deep dive into the intricacies of age-appropriate digital policy-making. Together, they explore the critical challenges and innovative solutions in ensuring child safety in digital platforms.
This transcript of the Into the Digital Future podcast has been edited for clarity. Please listen to the full episode here, and learn more about the series.
Jennie: My name is Jennie Ito and I’m a senior product policy manager at Roblox. Before that, I was at YouTube and Google where I was a policy specialist on the Google Play Store. I was also a user experience researcher for a short period of time on the Google Kids and Family team and then I went back to working on policy at YouTube, where I led policy development for kids, tweens and teens for YouTube and YouTube Kids.
Jordan: So you’re at Roblox, which means the two of you work together. You know each other.
Jennie: Yes, that’s right. We do work together.
Jordan: So do you spend lots of time working together?
Laura: We do. In fact, I'm very, very lucky. Jennie's going to talk a little bit about the work she does at Roblox, which is amazing. But the work that I do at Roblox is all around civility and safety education, and it's amazing how many times Jennie's name comes up in a conversation where I'm like, you know who we need to talk to?
Jennie! And so we're really pleased that she's been able to support us with lots of different projects that we're working on, some of them cross-functional, working with lots of different academics, experts in the field, and other tech companies, and some that are much more internal projects. But yeah, I'd pretty much say there's not a day that goes by when we don't email or chat to each other, so it's such a pleasure to have you here, Jennie.
Jordan: You’re gonna have to explain to me, what do you actually do that makes your name come up so often in Laura’s conversations?
Jennie: Well, as Laura mentioned, she's really focused on civility, and my team is really focused on safety. We like to say we kind of write the rules of Roblox. And we're really focused on keeping all our users safe, actually, not just kids, because our platform is not just for kids. And so there's a lot of obvious overlap between civility and safety. So, as Laura mentioned, we work quite closely together.
Jordan: And so what does your day look like when you're working on safety? What does that mean for people who don't know anything about what goes on behind the app on their phone, or their computer, or their tablet, or their console? What does it mean for someone to work on safety?
Jennie: Yeah, that's a great question. So I would say no two of my days really look alike. There are lots of different things that I work on, which makes my job really fun and exciting. One of the first things when it comes to safety is that our team drafts policies, which, as I was mentioning, are kind of like the rules of Roblox, and then we work really closely with the moderation team, which actually helps enforce those rules and makes sure that everyone is following them on the platform. So a big part of my job is doing research and coming up with these policies, and then working with our cross-functional stakeholders, whether it's product or legal, to actually make sure that these policies make sense and that they truly are going to keep our users safe on the platform.
Jordan: But you come to this as a child psychologist, right?
Jennie: Yes, that’s right. Yes, my background’s in developmental psychology.
Jordan: So I'm curious. Can you tell me a bit about how that impacts all these everyday decisions? I just don't think most people imagine that there are child psychologists sitting there in meetings thinking about thinking. And I know there are, and Laura knows there are. So what is that like? How would you explain to parents how you get involved in that, and how you bring real developmental research to these questions about digital well-being?
Jennie: Yeah, that's a good question. So I will say that this is something that definitely is more common now. But when I started in tech back in 2015, it actually wasn't that common for child development or even education specialists to be in the room or have a seat at the table. Thankfully, this has changed over the years, both at my job at YouTube and also at Roblox. As I mentioned, our users are not just adults; a large proportion of our users are actually children and tweens and teens. And when it comes to working on policies around kids and tweens and teens, you not only have to think about kids, you also have to think about parents, which is an additional challenge. So that's definitely something I'm thinking of: when we talk about users in the context of kids' policies, we're thinking about kids as well as parents. And it's really important, when we're developing policies, to remember that children are not just mini-adults, so we have to take the developmental stages they're at into consideration.
So that's always kind of where I start. My policy development really focuses on those different developmental stages: what changes are happening in children's social development and cognitive development, how that can shape their media preferences, and also what types of content might be more risky for them, depending on their developmental stage. So we're really thinking through all of this as we're writing our policies.
Jordan: Yeah, that’s fascinating.
Laura: It is, absolutely. So, one of the things I'd really love to understand, and there are kind of two parts to this. One is, how do you tailor those community standards, the rules for a platform, so that they are understandable and digestible both by parents and the adults in kids' lives and by the kids and teens themselves? And the second part is, you kind of mentioned developmental milestones and things like that; how do you keep your finger on the pulse of what's happening out there that's going to affect these kids and teens, to build those community standards?
Jennie: Yes, it's definitely challenging. Well, I have two tweens myself; actually, I should say one tween and a teenager, because he's now 13. So I definitely am reminded every day what's cool and what's not cool by my children, and I'm in the club, yeah. So I have that. But to answer your second question first: definitely working with different academics and staying up to date on developmental research. This is actually one of my favorite parts of my job, when I do get to interact with other experts.

And when it comes to the community standards, it's definitely a challenge, especially at a platform like Roblox, and YouTube as well, where we have users of all different ages. We have to make sure that our community standards are understandable to our youngest users, and that they're written in a way that works for all the different age ranges, which can definitely be difficult.
Jordan: Yeah. So I'm curious about how you think about this. With the younger kids, of course, younger kids follow rules pretty well; they like to understand them. But as you start to get older, as you get into the tweens and teens, especially at a place like any digital platform that feels like their playground, they just see rules as obstacles. They're all mad about them. They don't understand that any of the rules are for them; the rules are just in the way of fun. Right?
So, what do you think about that as you're starting to think about policy? Because, obviously, we want people to understand it. We want them not to feel it as something that's a reason to be angry at my space, my playground.
Jennie: Absolutely. So there are a couple of things. One is really starting to think about how we give feedback. If someone does violate one of our policies, how can we tailor that feedback to different ages? And how can we focus more on education and deterrence versus kind of a punishment? Because we know, especially with tweens and teens, they're going to break the rules; they're going to push the boundaries. So how can we communicate with them: you know, this actually isn't allowed, but we're going to give you another chance to correct that behavior. And another thing is really working very closely with the user experience research team, doing research with parents and kids and tweens and teens, and really listening to what they want, what they think the rules should be, and incorporating a lot of that feedback where we can. And not just getting feedback after the fact, after we've made up the rules, but taking a child-centric approach from the beginning and listening to what they want.
Laura: As a parent of a teenager as well, there’s a common theme here, isn’t there, when we hear that we know it’s perfectly normal for tweens and teens to push the boundaries to go seek out inappropriate content and sort of to experiment when they’re online. I know that that’s kind of terrifying to a lot of parents who feel really out of their depth. Is there any advice that you could give to just help parents feel empowered in how to manage that situation?
Jennie: Yeah, I definitely can relate to how scary that can be. I think, having open conversations with your tweens and teens about what they could see online, and then also what they should do if they do see something that makes them feel uncomfortable. You know, if they engage with someone and it makes them feel uncomfortable, to report that behavior. If they see something and they’re not really sure how to make sense of it, talk to your parents. And also, as parents, not to overreact when our kids do stumble upon something that they probably shouldn’t be seeing. Because, as you mentioned, it is very developmentally typical for children to seek out this type of content. But there’s also, you know, Sonia Livingstone’s work about what bothers tweens and teens online. A lot of kids stumble upon things they didn’t even know existed, like maybe hyper-sexualized content, or something that’s really scary in the news. And so I think it’s really important to have those conversations with your kids.
Jordan: So when you look at the entire landscape of the platforms that kids and teenagers are using, you know, I don’t want to call any specific one out, because I’m gonna ask you a question here. But any of them, whether we’re talking about Roblox, or YouTube, or Facebook, or Instagram: where are the places where we really need to think carefully? And I mean as adults raising kids, not as companies.
Jennie: Yeah, I think when we look across all platforms, I would say probably in the tween and teen years, especially when you hit 13. I know for myself, as I mentioned, I have a 13-year-old. I had, you know, parental controls on a lot of these different platforms, and then suddenly, at 13, they disappeared completely. And so that’s a real shift. So I think it’s partly just really having that communication with your child, but also giving parents the tools to have those conversations. And on some of these platforms, maybe we need to extend more defaults and controls into the early teen years. Because I do think it’s quite a shift where there are all these really restrictive controls, and then all of a sudden at 13 they’re completely gone. And if a child has been very restricted, it can be very overwhelming to have everything open up to them with no limits.
Jordan: Yeah, I used to say all the time that everybody gives the kids the phones at 13, and I’m like, maybe you should start with some boundaries younger, right?
Jennie: Yeah, definitely. That was definitely something that you know—and it’s funny, it hadn’t really occurred to me till I got that email, saying, hey, by the way, parental controls are no longer on these accounts.
Laura: Amazing. And I’d love to say thank you so much for mentioning one of our dear friends, Sonia Livingstone. She was actually on series 1 of our podcast, so you’re in great company. Thank you for the plug. This has been such a lovely conversation, and I’m sure a lot of our audience who are parents will find it really reassuring that what they’re experiencing is just normal. That’s the first thing that makes everyone feel better. And you’ve definitely given some really fantastic advice as well. I’m going to take us a little bit into the future now, thinking about new innovations. Actually, one of the projects you and I are working on at the moment is about this kind of thing: what can we do in the next 5 to 10 years? So what innovations do you want to see? What new things do you feel should be coming, or might be coming, that are going to make a lot of this stuff easier for parents and kids to navigate?
Jennie: You know, I don’t know if this is a great answer, but one of the things I’ve been thinking a lot about, especially at Roblox with this whole metaverse, is: how can we have children and adults interacting in a safe way in the same space, a shared space? Because on a lot of our other platforms, we kind of have our children siloed, separate from adults, but we are moving towards this world where everyone is interacting together, and that has so many safety challenges.
Laura: But I also think it can be the opportunity. It is pretty amazing for shared experiences, though I do think it’s definitely a challenge. But actually, online spaces are a reflection of the real world, and we don’t just keep kids locked up in a house. Our tweens and teens, as we’ve already mentioned, are going out. We send them to the store. They’re going to the park by themselves. They’re riding their bikes to school. They do have to interact with all sorts of strangers. So yeah, for me, there’s certainly a responsibility on the platform to make sure that we’re thinking about that, putting tools in place, and helping to educate people around building resilience and recognizing risk and harm, and all of those things. I think it’s really important.
Jordan: Yeah, it’s great to hear that you’re really thinking about this, because you’re the person who needs to be thinking about that, and understanding the unique risks that are online, that are separate and different from the real world. And the best way, at least, that kids are gonna learn it is to be able to interact in these spaces with their parents. It’s where we are; it’s where we learn most of what we learn about etiquette. And so I wish there were more spaces for that to happen, younger and older. What do you think, Jennie? What are the blind spots, the places where the way we think about the digital world, the metaverse, online digital play, actually keeps us from making the progress that we really need?
Jennie: I am happy to see more developmental psychologists, people who really understand kids and tweens, developing the strategies behind these products and really helping with product development. Because I think that’s really going to help us understand and build these spaces with kids in mind, and make sure that they actually are appealing to them and that they want to be in these spaces. I think for a long time we didn’t have this expertise, and we weren’t involving children from the beginning and listening to their feedback, and so I hope that we continue to do that as we build out these products.
Laura: Yeah, same. I think that’s vital, isn’t it? And we know that being part of the product has been a thing with tech spaces and platforms for such a long time, and we have to be even more thoughtful and careful about our duty of care when there are young people involved as well. So, okay, a bit more personal. What’s next for you? What big dreams, what things are you working on, whether at Roblox or outside of that?
Jennie: Yeah. So I can mention something. I think I’m allowed to mention this. If not, we’ll have to edit it out, Laura. But right now I’m working on aging up the platform at Roblox, and one of our biggest growth areas is with our users from 17 to 24. So I’m really excited about that. I think in the past it’s been more focused on our younger users, but now we’re focusing on tweens and teens, making sure that we have appealing content for them, and obviously continuing to make this a safe space for them. I did something similar over at YouTube, and I really enjoyed that project, because, especially as a mom of a tween and teen, it’s exciting to be building a space I know my own kids are going to enjoy.
Into the Digital Future: Raising Teenagers in a Digital Age with Dr. Hina Talib
Dr. Hina Talib, a pediatrician and adolescent medicine specialist, joins Laura and Jordan for an enlightening discussion on raising teenagers in the digital age. They delve into the impact of technology on youth well-being, emphasizing the need for open conversations and listening. Dr. Talib highlights the importance of understanding individual experiences and fostering positive online communities. The conversation touches on age verification, content moderation, and the importance of sleep and movement. With valuable insights and practical advice, the discussion provides a thoughtful guide to navigating the complexities of parenting teens in the digital world.
This transcript of the Into the Digital Future podcast has been edited for clarity. Please listen to the full episode here, and learn more about the series.
Hina Talib: Thank you for having me. I am Dr. Hina Talib. I am a pediatrician and an adolescent medicine specialist, which means I take care of teenagers in that special age range from about 10 to 25 years old. I practice here in New York City. I’m also a writer and a spokesperson for the American Academy of Pediatrics.
Jordan Shapiro: Wow! That’s great, as someone who lives in a house with four teenagers whom I love dearly, but I can’t wait for them to be gone also. So why did you get into adolescent medicine? What were you thinking? Why are you so passionate about this?
Hina: You know, you are not the first person to ask me that. A lot of people say, oh, just bless you for taking care of this age group, and I’ve heard it my whole life, honestly. I just love working with teenagers. There is something about this age group that is so empowering and vibrant and creative and futuristic to me. I believe that being able to see them for who they are, and helping them reflect who they are to the world and engage in the world in a positive way, will change the world for the better. It’s just a privilege for me to be part of that. And yes, I know it’s dicey, and I know it’s hard, and I think all parents, and even pediatricians who take care of teenagers, we all just have to laugh about how hard it is sometimes. But I always try to remind people that it’s also really magical. And a lot of what looks really kind of striking from the outside is actually developmentally normal. It’s just a little turbulent at times.
Jordan: Yeah, I often think about it. Teenagers, they’re totally fantastic, you know. I hate the idea that we’re always like, Oh, my god teenagers!, like it’s hard, and I hope I didn’t come off that way. I mean it is hard, but it’s hard because they’re little tiny adults without a lot of practice. It’s like, if you had a bunch of roommates who never learned how to live with other people.
Hina: That’s a really funny analogy. I like that. I do think teenagers get a bad rap overall, but I think they’re very special, and I think that anybody who cares for teenagers or parents teenagers would agree with me. Of course, it’s just that we have to help, and sometimes helping means actually not saying anything.
Jordan: Yeah, and for any of our listeners who are like, how do you do this? How do you deal with teenagers? I was checking out your Instagram. It’s myteenhealth?
Hina: teenhealthdoc
Jordan: And you have not only a ton of awesome advice, but all of these infographics. They tell you great questions to ask, conversations to have or consider. And you have a ton of followers there, so my teens will think you’re important. They don’t think Miles Davis is important because he doesn’t have enough followers, but you are.
Hina: Oh, oh dear, yeah, it’s great. Teenagers do definitely keep me timely, which is great. But even they will tell me, you know, Instagram is now old. And it’s like, oh, okay, well, I joined Threads, so hopefully that’ll make this cool again. But yeah, I think adolescents and their relationship with social media is actually a super hot topic right now. And in fact, parents’ relationships with their own social media should be, too.
Laura Higgins: I think we’re going to come onto that in a little bit. So I’ve spent a bit of time on your channel, and I really get that passion that you have for young people and the positivity around it. You really focus on practical advice about how to talk to young people, how to relate to them, really bringing into focus themes around respect, privacy, and just advocating for kids to have their space in the world. And as we were saying at the beginning of the chat, it’s sometimes easier said than done. How can parents start these conversations with their teenagers if they haven’t already been having them?
Hina: You know, it’s always great to start a conversation with connection and with something positive. So if you’re already doing that because you are laughing at the same joke, or watching something on Netflix together that you both enjoy, or playing a sport or watching a sport that you are getting excited about and you’re on the same team, those feel-good moments are a good time to have a conversation: you know, “I was thinking about you, and I was thinking about how X, Y, or Z is challenging. And how are you doing? We just want to know how you’re doing.” Instead of making a conversation a whole big, scary moment: “We’re gonna have a conversation. Let’s have a talk. Let’s do it Sunday at 9 o’clock in the morning.” And then they’re worried about it. What is it my parents want to talk to me about? Why did they set a time? And why is it tomorrow, not today? So sometimes just going with the flow, when your family is already feeling good and happy and having a connected moment, is the right time. It might feel like you’re saying it out of nowhere, but I’ve found that it is actually the right time to say, “I was just thinking about you. I was thinking about this. I just want to know how you are.”
Jordan: Thanks, that’s great advice. I think certainly something that I might go away and practice, because I feel like one of the things that parents in general are just not very good at when it comes to teenagers, is listening. That’s myself included. I don’t want anyone to think I’m pointing fingers. You know, we think we know best, we think it’s our job to tell them. We still are in that mindset of, I have to scream at them to not cross the street while traffic is coming, because they’re little people who can make bad mistakes, but they’re starting to grow up.
Do you have advice for how we can do a better job of listening to the teens, to understand what their issues are, what they’re dealing with, because I think we’re often wrong as their parents.
Hina: Yeah, you hit the nail on the head. And the thing is it is the most important parenting skill for many different life stages, but particularly for adolescents who do have a lot of thoughts going on in their heads and aren’t always equipped to share them in the most helpful ways. But neither are some adults to be fair. I think that listening is super important.
My advice would be to say half or less than half of what you intended to say, so really to be conscious of listening more than talking. Open-ended questions are helpful. Not all teenagers like to, or feel comfortable, having face to face conversations. So sometimes, having them talk when they’re chatty, and then just be there to listen. So catch their chatty moments. Their chatty moments might be when you’re picking them up from an activity, and you’re in both. In the car. Our conversations are huge because you’re not looking at each other, and it’s less stressful and perhaps you’re listening to a song both like and and then you can talk, but listening is a skill so if you are not good at listening as a parent, that’s okay. Very many people are not. And it actually is something that you can learn about how to be a better listener. And it means creating the space, truly listening and not speaking over them, reflecting back what you hear to make sure that you heard it right, and that you’re kind of synthesizing it right? And then asking their permission if they want you to jump in with guidance or advice? Or are you just here to vent? That’s cool because I’m happy to just listen.
Laura: I love that, particularly that last bit of advice. I think that is so key. I think we all go into protect mode, especially if our kids are having a hard time. It’s like I want to do something to help you. And sometimes it can make things so much worse. And we do take away that agency and that resilience building that teens need to go through.
Jordan: Yeah, I also want to ask you, you know, we do a lot of talk about this sort of adolescent as a category, right? The teen as a category. And of course, this is what your work is, but there are also a lot of really specific cases, individual cases, where adolescents have unique challenges, whether we’re talking about socioeconomic differences, LGBTQ communities, the BIPOC community. There are very specific structural things that are shaping the experience of teens. So when can we generalize, universalize, and say, “this is the teen experience”? And when do we really need to look at the specific causes?
Hina: You know, it’s interesting, because I share this advice on my Instagram account and in other outlets when I’m asked to comment about adolescent health issues, and I always feel so nervous doing it, because I truly think that when you’re parenting an adolescent or young adult, there’s a lot of nuance, and you’re best suited to parent the kid that’s actually in front of you and not take the cookie-cutter advice that’s being shared with the masses through the media, through the parenting metaverse.
Jordan: Except for this podcast…
Hina: Yeah! Because we are speaking to you. You know, I think that is an area that I struggle with internally. And I’m actually trying more and more these days to say: if this doesn’t apply to you, don’t let it bother you, because this piece of advice may not apply to the child that you have. We actually have to look at the child that you have, the circumstances that they’re living in, and what else they have brought in.
You know, I often say, from zero to 10, a lot of great things happen in life. But for a lot of kids, also, a lot of not-so-great things happen in their lives. A lot of hard things may have happened, and they only process that, or begin to really process that, from 10 to 20. So some kids come into their adolescence having had complex health conditions, having had changes in their household structure, having had financial hardship, having experienced racism, having, you know, been othered in so many different ways. And they process that in adolescence. So you absolutely have to look at the whole picture when you’re giving and taking advice.
Laura: Putting a real human twist on it. And again, one of those things that sounds easier said than done is to really know your child. I think a lot of us feel that sort of disconnect starting to happen when they’re teenagers, but you do have to trust your instinct; you have shared your life with this person. Again, those listening skills, I think, would be really helpful there as well. Okay, I’m going to shift gears a little bit and bring it back into my sphere. So, kids and tech: what do you think are the main pros and cons for young people in online spaces?
Hina: So, echoing what I was saying before, I really strongly feel that it’s very hard to give cookie-cutter advice on this topic. I think you really do need to parent the teen in front of you. I do think that many of the adolescents I work with have positive relationships with their media. I have found that some of those with the healthiest and most positive relationships, and this is going to be controversial, actually started their use earlier, when they were very accepting of rules and were a little bit more rule followers, versus others who are just given a phone at a certain age without really having had the developmental lead-up to it, the conversations, and the graded exposures that parents who give and share devices at younger ages provide. So you have to look at the kid in front of you. I think the important part is the development of the relationship with the device: what they’re using it for, how it’s making them feel, what their intention is, and helping them to really conquer it. Now, for some kids, it’s gonna be a problem. They’re going to have problematic uses. For other people, it’s episodic, meaning that there are big feelings related to events that are happening online. There’s an entire world online for them. You can have drama in your in-person life, and you can have drama in your online life, and you need the tools to resolve the drama in both places, and the tools are very different. So I think it’s a lot about development, relationship, intention, support. And I mean, I can’t hit this home enough: it is parenting. And I’m going through this right now. I have a five- and seven-year-old, and I’m not sure that we’re having the best relationship with the Switch right now. And it’s hard.
It’s like the Switch is winning, so I’m living it. And I’m working with the parents in my school, having conversations as parents: are we going to unite as parents in second grade, third grade, fourth grade to talk about what we are comfortable with, device-wise, for our little kids? Because the middle school parents are telling us what they regret and what they’re going to do, and it’s really eye-opening, really shocking. And so here I am; I’m a pediatrician, a mom, and an adolescent medicine specialist, and I’m in this with everyone else. It’s not easy.
Laura: So the takeaways I got there were about setting the norms and the expectations and helping them to build healthy relationships where possible. But it’s also that positive role modeling, because increasingly all of us are online more and more. And so there is definitely a balance needed there.
Hina: Yeah, it is so much about the parents’ relationship, too. I hear a lot of parents coming to me, telling their kids that they can’t be on their phone, telling me I need to take the phone away. You know, it’s a consequence, right, that I can take the phone away. And it’s sort of like, well, what are you doing with your phone? If you’re telling them to put the phone away the hour before bedtime and charge it outside in the hallway, where’s your phone? When you’re speaking to them, and when you’re moving through your day in your home, in your car, are the phones out? And are there spaces where there are no phones? Because it’s much, much easier, and they’ll accept it much, much more, if you ask them to do it and you’re doing it too. And if you’re talking about their cell phone, talk to them about how your own relationship is, too. My children knew the word Instagram at a very young age, because I was a media person on Instagram, and I’m not proud of that. So I definitely have to be more intentional about when I’m taking pictures, and when I’m having my phone out and doing my work and calling it work: I am doing this for work, and now I’m going to put it down. Being very verbal with the people around you about when you have your phone out and why you have it out, I have found to be helpful. And it just makes you remember, too.
Jordan: Yeah, and of course you can extrapolate that to so many things, even beyond tech. I often think about the parents who go, “I can’t get my kid to read books,” and I’m like, well, when’s the last time they saw you reading a book? Right? And so I want to take that idea and ask you a question, moving back to tech, specifically for adolescents. And as you already pointed out, these are very different questions for younger kids versus adolescents. How do we figure out, and I mean this culturally, not necessarily as parents, which are the things we have to be really attentive to that are just part of the teen experience, whether we’re talking about social anxiety, sexual awakening, body image, recklessness, all these things that are a normal part of it but are now happening within a context of digital tech, and what’s actually because of the digital tech world?
Hina: Yeah, I think that in some ways there are strong parallels, because you can find yourself in tricky sexual and reproductive health situations in real life as well as online. You can find yourself in bullying situations in real life and online. You can find yourself with substance abuse, unfortunately, online as well; it’s being sold and talked about there, and in real life, obviously, as well. So there are ways to talk about it. I almost think that for everything we talk about in real life that is sort of a risk category of things adolescents might encounter, things that we as pediatricians sometimes end up playing a role in after they have happened, if you’re thinking ahead, all the counseling that we do about safety in real life should have a corollary: this could happen online, and this is how you handle it online. So you practice saying “No” to substances that you’re not wanting to experiment with in real life, and you practice how to say it online. You practice how to say “No” to certain sexual encounters that you’re not ready for in real life, and you practice a funny response that you can send to somebody if they’ve asked you for a sext. I think there’s a lot of learning and bi-directional conversation that should just become more fluid. We should all just be talking about: this could happen here, or it could happen in this place, and how do you prepare yourself in anticipation of these opportunities coming your way?
Laura: And being really transparent about it as well, I think, isn’t it? So what do you think tech companies and platforms should be thinking about when they’re designing with kids and teens in mind, particularly thinking about youth well-being?
Hina: Hmm. You know, I think that content moderation and algorithms are very important for us to keep talking about. And I think age verification is also very important and very dicey. I don’t know exactly the right answers on all of those, but I have actually heard adolescents themselves tell me, “I would just love a reset button for my page,” rather than having to completely close it and start a new page. An algorithm reset button for when they know: okay, I’ve gone down a rabbit hole. This was not healthy. I don’t want to be looking at this content, but all it’s showing me is this content. I’ve tried to use the tools to block the words, and this and that, but I’m still being shown the content.
And so I’ve heard them say they would love that, and I’ve heard other people talk about it too. And I ask adolescents sometimes: would you use that? Would that work? And it’s very clear that they would use that reset, if it meant that they didn’t have to start their entire account again.
Jordan: So I like that. I love that idea, especially when I think back to my own teen years. I can always remember, at the end of the summer, spending a lot of time thinking: how do I want to show up at school? Who am I going to be this year? Imagine if you could just reset your profile. I remember thinking, years ago, about how hard it must be, because with the Internet, everything’s remembered, and that’s bad for jobs. I always thought about this: teenagers need to reinvent themselves constantly until they figure out what works, and it’s really hard to do when there is a permanent visual record.
Laura: And if you have to go through and delete it photo by photo, caption by caption, it’s a lot of work. So hopefully some of our friends in the tech space are listening, because I think that’s a really great recommendation.
Jordan: And yeah, I think the age verification is important too, absolutely. So what are some of the things that we’re not talking about enough when it comes to teens, adolescents, and digital technology? What’s something you think about that really hasn’t entered the mainstream conversation yet, but you really wish it would?
Hina: I think that in clinical practice, some of what comes back to me more often than not is sleep and movement. If the conversations around the device and media use are sensitive and have already been heated, I sometimes shift to talking about sleep and movement, and I get to the same endpoint with a lot more buy-in, because it’s a value that the kid shares and the parent shares. Everybody agrees that sleep is important. You know, my favorite question is, “How satisfied are you with your sleep? How satisfied are you with your social media? How satisfied are you with your level of movement of your body?” And so I let them grade it themselves, and then sort of ease into those conversations. But I don’t think we talk enough about adolescent sleep.
The adolescents are chronically sleep deprived as an age group, and I think media does play a role there. And I think there’s skills to be taught on how to protect your sleep and movement too. You know, it’s a different conversation, but for some people it does make it less likely for them to move because they’re, you know, satisfied with, in the moment, spending their free time that they do have in the day doing a more sedentary activity. So those are a couple of issues, and there was something else that. Oh, you know, I think that I think that it by and large a lot of the media headlines on a lot of the articles, and a lot of the reaction to our Surgeon General has. It tends to be very negative, but even though the Surgeon General’s report actually does talk about the positive uses of media, but the headlines do tend to be very negative, and I think it’s as some as an adolescent medicine specialist. I do provide care for LGBTQ and for other youth who feel different in different ways, whether it’s that they have different abilities or they have different health issues. And they find community in these spaces, and they adore these communities, these communities of people who look like them or go through challenges like they do have been so very helpful. And so I think what doesn’t get enough attention is how amazing these supportive communities there are online for people who have a very niche health condition similar to yours, or who, you know, identify as LGBTQ and have gone through very similar challenges that you have, or very experiences. And it’s actually even more important for them to see that. And so yeah, and we need to protect those spaces, don’t we? The community within them, but also the spaces themselves. yeah, I think I feel that very strongly. And I know I know that those spaces can get tripped up, too. 
But I see their value, and, just as you said, we have to protect the spaces and access to those spaces, which sometimes involve children who are under 18, and that’s okay.
Laura: Absolutely. So, this has been such a wonderful conversation. Is there anything that you really would love to add or share that we’ve missed?
Hina: I think the work of parenting is really hard. I remind parents that if you got through those toddler years, you will get through the adolescent years too. There are a lot of parallels to how you supported your kiddos during that time that will carry you through this time. And I think that the technology is not insurmountable. We just have to take control, and it has to start in the home. I have talked to educators and to other people in the community, and it can’t be, “No, you do it.” “No, you do it.” “No, you do it.” It has to be all of us together: pediatricians and teachers, creators and the tech companies. We all have to do this together, and that is a big feat, but I feel like we’re on to something. I feel like people are really thinking about this, and your podcast talks about a lot of it. So I’m grateful that you guys are putting this out there.
Laura: Thank you. I’m off right now to go and tell my teenager that he’s like a toddler. I think that’ll go down really well.
Hina: Joking, of course!
Announcing Our Second Season of Into the Digital Future
We are thrilled to announce the second season of Into the Digital Future, our podcast that explores the complexities of parenting in the digital age from the perspectives of a wide range of experts in pediatrics, public health, public policy, technology, and more. Hosts Jordan Shapiro (Cooney Center Senior Fellow and author of Father Figure: How to be a Feminist Dad and The New Childhood: Raising Kids to Thrive in a Connected World) and Laura Higgins (Senior Director of Community Safety & Digital Civility at Roblox) dig into some of the many facets of kids’ well-being in a digital world.
Our guests this season offer deep expertise in their efforts to make the internet a safer and more welcoming place. We’re kicking it off this week with a conversation with Julie Inman Grant, Australia’s eSafety Commissioner, about the role governments, educators, and parents play in helping to protect youth from potential harms. Next week, Dr. Hina Talib, who specializes in adolescent medicine, offers a pediatrician’s perspective on the unique developmental paths of teenagers and the impact of tech on their well-being. Jennie Ito, a developmental psychologist and expert in children’s media and content appropriateness, talks about her role in creating safer digital environments for kids at Roblox. Trisha Prabhu discusses some of the social dynamics that can lead to digital harassment and how she was inspired to develop a tool to encourage users to pause before posting potentially damaging content.
In November, Professor Amanda Third from Western Sydney University shares highlights from the research she has conducted with children around the world that has informed the United Nations’ Digital Rights of the Child and the Global Kids Online Research Project. Melissa Mercado, a lead researcher at the Centers for Disease Control and Prevention, speaks with Jordan and Laura about online violence as a public health issue, and emphasizes the value of leveraging online spaces for positive connections and skill-building. Jaspal Sandhu, Executive Vice President at Hopelab, delves into the positive possibilities of technology and the importance of involving young people as active partners in shaping solutions. And we’ll wrap our season with a conversation with Lucia Sotomayor, from the Office of the Special Representative of the UN Secretary General on Violence Against Children, to discuss how the POP Project is working to protect youth through online participation and how we can make the internet a safer place for children.
Please join us as we delve into how our guests are working to protect and empower young people in the digital world. And please share with friends and colleagues!
You can find us on your favorite podcast platforms including Apple Podcasts, Spotify, iHeartRadio, and YouTube.
Into the Digital Future: Navigating the Cyber Frontier with Julie Inman Grant
Join Laura and Jordan as they discuss online safety with Julie Inman Grant, Australia’s esteemed eSafety Commissioner. With her wealth of experience from tech giants like Microsoft, Twitter, and Adobe, Julie brings a unique perspective on internet safety measures and regulations. Learn about her journey leading the eSafety Commission since 2017 and the transformative strides they’ve made under the newly created global online safety regulators network. Together, they discuss the proactive strategies being implemented by the commission to safeguard Australians in the digital realm.
This transcript of the Into the Digital Future podcast has been edited for clarity. Please listen to the full episode here, and learn more about the season.
Julie Inman Grant: I’m Julie Inman Grant, the Australian eSafety Commissioner. I’m sure you’re very confused because I have an American accent. You are not hearing wrong. I actually started my career in Washington, DC in the early 1990s. I had big ideals, and of course, even bigger hair at the time. I really landed in tech policy ground zero.
I worked in Congress for my hometown Congressman, and I was working on a range of social justice issues, and he popped his head over the cubicle one day and said, “Hey, Julie, you know you’re working on these hard issues. We’ve got this small little computer company in our electorate called Microsoft. Could you work on tech issues as well?” This was before there was even an internet. I ended up working in the education and nonprofit sector, going to graduate school at night, studying the intersection between technology and society. And then I was drafted by Microsoft to be one of their first lobbyists back in 1995, right before the US DOJ antitrust case.
Jordan Shapiro: That is back when Bill Gates still used to lie on the desk with the sweater on.
Julie: Well, I’ll have to tell you about my second day of work. I was so excited to meet my CEO. And he landed. He really didn’t like coming to Washington, DC at the time, you know, he just wanted to get on with it and, you know, create jobs and stimulate innovation. So he arrived in chinos and a polo shirt, and he was going to the White House and going to the Press Club, so we had to ask our outside counsel to change out of his suit in the hotel lobby so that Bill could have a suit. And then I took him over to the White House, and he forgot his wallet, and the security guard wouldn’t let him in. I said, “But don’t you know? This is Bill Gates.” He said, “I don’t care who you say you are. You’re not getting into the White House without an ID.” So it was very interesting. And then the next day the headline in the Washington Post was, “Oh, they’re nice. It’s unfortunate that we have to have a DC office. They’re nice people, but they’re just overhead.”
And I think that really describes the state of the broader tech sector today, where the starting point is, “We just want to be left alone to innovate and create, and sorry if what we exfiltrate into the wild might be harming society. You can’t stand in the way of innovation.” So I was part of that cabal. I spent time at Twitter; I was so excited to join after the Arab Spring, and the speaking of truth to power, but very quickly I saw how social media really surfaced the ills of society and of human nature, through misogyny and racism and targeted online harassment that was actually designed to silence voices, and effective at it. Then I spent some time at Adobe. And then I got recruited as a poacher turned gamekeeper, and I now regulate the tech sector.
Jordan: You have kids right?
Julie: I’ve got three. Right in the sweet spot.
Jordan: How do you manage being a big time government official and also having kids? I’m just an author, and I struggle with my kids.
Julie: Well, I am taking a three-week vacation. It’s the first time in 30 years that I’m not bringing my laptop. This was on doctors’ orders to just disconnect, which I think we all need to do. I’ve got a 17-year-old, and I remember when she was 3 years old, she was more fascinated with my phone and the lights and the Wii than she was with the doll. And I remember thinking, “Wow, these kids”— this is 14 years ago—”are going to grow up very differently.” I also have 11-year-old twins, who are in the upper end of primary school. Now, they claim that they’re the only two kids in sixth grade who don’t have a phone, and I’m actually inclined to believe them. They do have access in the house. They use my account when they’re trying to connect with friends, so I can see everything that’s being said. But it’s really hard not to bring home what I see and what my investigators see every day: child sexual abuse material, including self-generated content. Seeing that the age of cyberbullying has gone down from 14 to 8. We have 8- and 9-year-olds reporting serious cyberbullying to us. This is a sort of interesting hangover of the pandemic, where I think we were all struggling as parents to work and school remotely, and so we were a little bit more permissive with technology. But once you hand over a smartphone or put your kid on TikTok at 8 or 9 years old, you can’t wrest that back.
Jordan: So what do you do? As the Commissioner. How do you think about that? From a policy standpoint— what can governments do about that?
Julie: Well, listen. I think the approach we take is a unique one that we’ve developed over time, because there were no other online safety regulators, and we had to write the playbook as we went along. We have complaint schemes, take-down powers, and high levels of success, and now we have some systemic regulatory powers. But before we even use our regulatory powers, we have to start with prevention through education and fundamental research, building the evidence base. So everything I heard the Surgeon General say last month covers issues that we’ve been gathering evidence about, advising parents and educators, and even young people themselves through a co-design process, about the healthy use of technology. You know, there are so many mixed outcomes in the research. And I do worry about direct causal links being drawn, like, “social media is bad; it leads to mental health issues.” We had a terrible situation in 2018, where there was a tragic suicide of a 14-year-old girl, and it was all attributed to cyberbullying, when, of course, there were much more complex things going on in the background. What I worry about when adults in the media draw that direct causal link is that children won’t engage in help-seeking, and they will think that taking their own life or doing something more drastic is the way out.
I think we’re the only government in the world that has a youth-based cyberbullying service where we serve as that safety net. When a child, their parent, or an educator reports anything seriously threatening, intimidating, humiliating, or harassing, and it doesn’t come down, we’re that safety net, and we advocate on behalf of that child. We have a 90% success rate in terms of getting that content taken down, which goes a long way to relieving the mental distress that young people experience, from not just acute cyberbullying like death threats and rape threats, which we do see from teenagers, but just the garden-variety meanness, creating drama, starting rumors, which can have a very corrosive impact on children’s lives.
Laura Higgins: Julie, I’ve had the pleasure of working with you for many years in my previous work on helplines and so on. And, you know, Australia was the first country to have this regulatory responsibility. So what sort of powers, specifically, do you have? I know you just touched on a couple of things, so a couple of questions on that: how has the industry responded? And also, how is it received by general citizens and people in the country?
Julie: Well, thank you for those questions. So, countries around the world are considering legislation. For instance, the Online Safety Bill is actively being debated in the UK Parliament, in the House of Lords, I believe, right now. Ireland just set up an Online Safety Commissioner. Of course we’ll see a proliferation of regulators through the Digital Services Act. We’ve already reformed our legislation. As for the formation of the eSafety Commissioner, we actually started as the Children’s eSafety Commissioner, and it came about because a well-known female presenter from Australia’s Next Top Model, so the Tyra Banks of Australia, was very open about mental health issues. She had a nervous breakdown. She was getting terrible trolling on Twitter, and she came out of her recovery and got right back on Twitter. I was interviewing for Twitter at the time, so I remember seeing what was playing out. She ultimately ended up taking her life, and it was referred to as the Twitter suicide.
Now this spawned a petition that went to the government, saying the government needs to step in. This was 2014. But what the government of the day decided to do was start small and slow with the Children’s eSafety Commissioner, because they were concerned about things like freedom of expression and the specter of censorship. So they started with the hotline function that we already had in terms of taking reports of child sexual abuse material. We’ve had that function, like the National Center for Missing and Exploited Children, for more than 20 years. And then they created this new, youth-based cyberbullying scheme. So again, not to be proactive monitors of the internet, or to do the content moderation job for the companies, but to serve as that safety net. That’s how we started.
We have a very good relationship with the platforms. They get huge volumes of reports. Often the content moderation is outsourced, and a content moderator from another country who doesn’t understand the culture and the context may have 30 seconds to a minute to decide whether or not content violates the company’s terms of service, so they get it wrong. I mean, if you look at the latest Meta Oversight Board reports, something like 1.3 million requests for reviewing decisions were made. They only got to 50, and more than half of them were found to be wrong decisions. So we do that day in and day out as an objective third party. But we always prefer to use informal means, because that gets the content down more expeditiously. We know the more quickly we get that content down, the more relief we provide to the children.
We also work with schools and parents, because cyberbullying tends to be peer-to-peer. It’s an extension of conflict happening within the schoolyard, and it’s, as you know, very visible to young people and their peers, but often hidden from teachers and parents, particularly as things start moving to Snap and to DMs and that sort of thing. So we started with that, and then we moved to what the Government asked me to set up: a revenge porn portal. I know you did some pioneering work at the revenge porn hotline, but my first inclination was no, I’m not going to call it a revenge porn portal. Revenge for what? That can lead to victim blaming. Let’s call it what it is: image-based abuse. We set that up in 2017 and got much more potent powers in 2018. We’ve helped tens of thousands of Australians get intimate images taken down from all over the internet. We’re starting to get reports of deepfake pornography, so that’s starting to be democratized. But one huge change that’s of huge concern, I think, to hotlines all over the globe, is that we’ve seen a tripling of reports of sexual extortion. And the demographics of those we’re helping have been turned completely on their head. Relationship retribution, the garden-variety revenge porn or non-consensual sharing of intimate images and videos, tends to impact women and girls. But 90% of the reports we’re getting around sexual extortion, which is backed by organized crime, are from young men and boys, mostly between the ages of 18 and 24.
Jordan: Yeah. So you’ve talked about so many things that are really, really scary. So for our listeners, the parents, the educators, the everyday users of the internet, what would be some of the top advice you’d give in terms of thinking about it? I don’t want everyone listening to go, “Oh, no, there’s such horrors.”
Julie: It is tough. So we’ve done a lot of work around the pedagogy of online safety education, what works and what doesn’t. Scaring people or judging parents tends to lead to amygdala hijack, and people shut down. We also know that doing the same with kids, and just doing one-off presentations, is not going to help them. And I happen to believe that if we’re just banning devices rather than teaching what I call the four Rs of the digital age—Respect, Responsibility, building digital Resilience, and critical Reasoning skills—along with the self-regulation of technology use, then we’re not setting kids up for success in the future. I’d say the same thing about AI.
So I just say to parents that you are the front line of defense. We are the ones who tend to hand over the digital device. In Australia, 42% of two-year-olds have access to a digital device, and by the time they’re four years old it’s 94%. So on our website, esafety.gov.au, we’ve got a parents’ guide for under-fives, and the key message to be delivering to kids is: be safe, be kind, ask for help, and make good choices. Then, when they get into the primary years, it really is about those four Rs of the digital age, as I said, and sitting down and signing family tech agreements. We know that kids are much more likely to adhere to family technology use rules when they are part of the decision-making process. We’ve got a number of those for young families. All of this is free.
The key advice that I give to parents is to really speak early and often to your kids. Keep the lines of communication open. Let kids know they can come to you if anything goes wrong online. A lot of kids are worried about device denial, so they won’t confide in their parents, or they don’t think their parents can help them. But you can start by asking those questions at the dinner table. We ask our kids about school. We ask them about sport. Ask them what they’re experiencing online, and co-play and co-view. I could never keep up with my kids on Roblox. I’m like, how the heck are they doing that at six years old? How do they know how to do that? But I want to know who they are playing with when we set the parental controls. I make sure my kids use technology in open areas of the house, so I can see how they’re reacting to it. So often, when kids are being cyberbullied, you’ll see a visceral reaction. That also goes for when kids are playing Fortnite or gaming: you might have them wearing headphones, but if you really want to hear what they’re experiencing, make sure that’s in an open area of the house so you can hear it as well.
Laura: Yeah, I love that advice, Julie. As you know, both of us are parents, and Jordan and I talk about these topics all the time, as do many of our other guests. And I think that’s really good advice. It sounds so simple, but these things do become quite challenging for people: to be present, just as you say, to co-play, to co-participate, to role model as well. I think it’s really, really great advice to just be there with your kids on this journey, exactly as you would be for everything else that they do in life, and to try to make that a normal conversation, as opposed to sitting down to have “the talk.”
Jordan: Is there anything that we should have asked about that we didn’t?
Julie: No, I’m sorry if I went all over the place; I was getting excited about this again. I guess I’d just also mention that one of the things we announced in Washington, DC, in November last year was the creation of a global online safety regulators network. As a lone regulator, we kind of felt like we were climbing up a humongous hill at the front of the peloton, with no one drafting behind us. Now we have organizations around the globe joining us, and we believe that we need to work together, that we can learn from each other, that we can share intelligence and information, and that we can make sure there isn’t a splintered net of regulation, so that what we’re all doing is harnessing the benefits and the good. AI is a perfect example of a use case that could have tremendous power in helping humans with content moderation on the internet, to search out and sweep out illegal or seriously harmful content. So we want to see those positive use cases. But we also want to minimize the risks. And we need to do that as governments, together.
Laura: Huge congratulations on that initiative, Julie. I think it’s going to be really interesting to see where things go in the next couple of years. But I think we’re all really grateful for the work that you and your team do for leading the charge. Well done! Thank you.
Youth Design Team: Call for Applications
Apply to the…
Youth Design Team
Calling all high schoolers! Join the Youth Design Team to help shape the future of games, apps, and websites for young people around the world. Apply to join the Team if you want to…
Make a difference.
We want games, websites, media, and apps to be better for kids and teens. And we need your help!
Learn new skills.
Join online workshops with professional designers to share your ideas and insights about what the future of media should look like.
Earn money.
10 people will be selected to join the Team and receive a $250 stipend.
Join the Youth Design Team to help design interactive media for kids and youth, and connect with teens like you in new ways. Youth participants in this program will have opportunities to work with the Well-Being by Design Fellowship program for mid-career designers.
- Jan 16 – March 31, 2024
- $250 stipend
- About 2-3 hours/month
The Team is for 14-17 year-olds who live in the U.S. and…
Submit your video application to join the Youth Design Team by October 31 at 11:59pm PDT:
https://bit.ly/youthdesignteam
How does the Youth Design Team work?
The Team meets online to help professional designers at digital media companies come up with new and exciting ways to design apps, websites, and interactive content for kids. In addition to an orientation meeting and a concluding meeting for just the Youth Design Team, we will participate in 3-4 design workshops with professional designers. During the workshops, you might…
- Share your perspective and insights into current digital trends
- Help designers brainstorm and design elements of products
- Test digital products and offer your feedback
How much time do I need to commit?
From January 16 to March 31, Team participants will…
- Attend a Team orientation (1.5 hr)
- Participate in 3-4 interactive brainstorming and feedback workshops with 2-3 professional designers (1 hr each)
- Attend a Team conclusion session, including a chance to learn about the impact of your feedback on products and engage in a Q&A with design professionals (1.5 hr)
Do I need to have any special experiences to be able to join?
Nope. We’re hoping to have a diverse group of teenagers join the Team. If you are in high school, have access to technology to join online meetings, want to make a difference, and can commit to working with us from January to the end of March… we want you to apply!
Can I see a preview of the application questions?
Yes! Take a look at the application questions here.
What if I have more questions?
Email us at cooney.center@sesame.org and we’ll get back to you!
How can I apply?
Applying should take 10-15 minutes. Submit an application before October 31 at 11:59pm PDT: https://bit.ly/youthdesignteam
About the Joan Ganz Cooney Center at Sesame Workshop
Over 50 years ago, Joan Ganz Cooney came up with a crazy idea: what if we could use television to help children learn to read? At that time, families around the US were beginning to buy TVs for their homes, and kids were watching them. A lot. Kids were watching cartoons and commercials and whatever happened to be on.
But what if the things they were watching on TV could help them learn?
Inspired by this question, Joan Ganz Cooney wrote a research paper in 1966 called, “The Potential Uses of Television in Preschool Education.” Her ideas caught on, and a few years later the first episode of “Sesame Street” was broadcast on PBS.
Now it’s 2023, and technology is changing everything about how audiences watch, listen, and engage with media. We need your help to design the future of technology for kids.