Listen to this episode on iTunes, Spotify, or Libsyn.
Ron Dahl Please be advised – this episode contains adult language.
How do we learn to become contributing, collaborative, and connected adults?
Adolescence–the years from 10 to 25–is a crucial window when brain and social development create new capacities that we need to connect–not just with friends and romantic partners, but with ideas and goals and other members of our communities and the world.
As technology offers more and more opportunities to connect through digital means or with AI – what does this mean for the future of authentic human connection?
Alison Lee Why are you turning to AI? What do you like about it? And they’re like, it doesn’t judge me. It’s not going to give me consequences. It gives me really good advice without worrying about the repercussions.
Thao Ha This is all set up by our biology; we’re wired to learn from social relationships. And so adolescence is really the time to do that. So if we’re outsourcing some of this learning to a large language model, that is going to change how young people are learning about their relationships.
Nick Allen This relational technology fits hand-in-glove with exactly the most key developmental tasks of adolescence, both in terms of flourishing, but also in terms of creating risk for problems like depression, social anxiety, substance use, because they are also very much embedded in social cognition.
Ron Dahl I’m Ron Dahl, founding director of the Center for the Developing Adolescent, and this is Adaptivity, where we explore the science of adolescence, untangling misconceptions about the years between childhood and adulthood. We explore new insights into the rapid cognitive, emotional, and social changes that happen during these years, and how the developing adolescent brain is primed to promote healthy and adaptive learning.
On this special episode of Adaptivity, recorded live at SXSW EDU, we discuss the vulnerabilities–and potential opportunities–of young people engaging with artificial intelligence agents. We consider how this might interfere with (but also could support) formative learning about human connection in our adolescent years, in ways that could have an enduring impact on relationships throughout our lives.
Within the discussion, we’ll hear an excerpt of a conversation I had with Elle Liemandt, creator of AskElle.ai, [1] a relationship-advice app for young people. It helps adolescents develop skills to make social connections with other people in an increasingly digital world.
Elle Liemandt I do think that at our core, teens, my generation, we want to be in person. We don’t want to have relationships with AI. And I think that it really comes down to providing them with the skills and the ability, and how can we make sure that we protect each other, protect this generation? And that’s what I’m trying to build with my app.
Ron Dahl You can hear an extended version of my interview with Elle at the end of the panel discussion.
But first, let’s begin with my conversation with researchers Alison Lee, Nick Allen, and Thao Ha.
We recorded our discussion in front of an audience on March 9, 2026, at the SXSW EDU conference in Austin, Texas.
*****
Ron Dahl Okay. Welcome, everyone, including those people coming in. I’m excited about our conversation panel and podcast recording today. We’ll be talking about adolescents, AI, and the future of authentic human connection. And a major focus of this work is to what degree technology and AI in particular is creating new vulnerabilities, but also new opportunities for scaffolding young people in their development–their social development and their development of relationships.
And what we really want to highlight, I think, as the central issue across all of our perspectives, is not only the science that informs these issues, but also the recognition that human connection, the sending and receiving of thoughts, words, and feelings between individuals in ways that create bonds and feelings of connection, is so important to humanity, and that things that disrupt and undermine those processes can have very serious consequences for young people, and for society. And there are a lot of concerns and questions about how this rapidly changing world is causing fragmentation in these ways.
And we really want to be thinking about this particularly during the formative period of adolescence, when young people are learning, through trial and error and their own experiments, who they are: learning about self and other and their connections, forming new kinds of relationships, including romantic relationships, and dealing with emotionally charged relationships with high-intensity feelings.
Let’s have the panel introduce themselves.
Thao Ha I’m Thao Ha, I’m an associate professor of psychology at Arizona State University, where I direct the Heart Lab–Healthy Experiences Across Relationships and Transitions. And we study young teen love in the context of emerging technologies: so social media, virtual reality, and AI.
Nick Allen I’m Nick Allen, I’m a professor of clinical psychology at the University of Oregon. I’ve been working in the area of youth mental health my entire career. And I also run a startup company that produces software for behavioral health research and behavioral health services, where we’re particularly focused on use cases like youth suicide prevention.
Alison Lee Hi, I’m Alison. I am the chief R&D officer at the Rhythm Project, where I study human connection in the age of AI. And that’s in both directions. In what ways are young people’s social biomes leading them to use AI for different reasons and motivations? And in turn, how is AI reshaping their human relationships, their capacity, their skill sets, and the relationships themselves?
Ron Dahl I’d like to start, Alison, to have you share a little bit about, before we even dive completely into AI, what the role of technology has been in adolescents’ relationships in the recent past, and what’s rapidly changing?
Alison Lee I love that question. I think I have to do this through the storytelling that I hear from young people themselves. I’m going to bring two stories into the fold here. The first was a conversation that I had with a high schooler. She was a young woman, and she was telling me that, right now, her friendships felt more fragile than ever before. When she gets into an argument, for example, with a friend in her social circle, instead of being able to repair after that rupture, she feels hesitant. She doesn’t know how to approach that conversation. And instead, what she’ll start to see is that rupture manifest both in real life at school and in her digital environments.
So she was saying that she was starting to see, all of a sudden the group chat went silent. She was like, she knew her friends were still talking, but they had a separate group chat now that she wasn’t on anymore. And then she saw an IG story where all of her friends were hanging out and the caption was “real ones only.” So now she’s seeing the rupture of her friendships fall out in incredibly public ways, right?
And now, Regina Georges and Mean Girls and, like, difficult dynamics in friendships have always existed, right? They’ve existed for probably as long as people have been in relationship with one another. But I think the difference now is that this tech is really acting as a magnifying lens on how this plays out, incredibly publicly, which has real implications for the risk-taking, the messiness of human relationships being on display.
And then the second story that I’ll name here is one from one of my youth fellows, Cash. And Cash was telling me about growing up in a generation where everything is documented on social media. They’re like, “You got to do it for the gram,” right? You’ve got to snap it on Snapchat and share it to your friends. And so everything is filtered. Everything is curated and displayed in a very specific way that’s supposed to be your best self in the digital world. And now here comes AI, and I know we’re going to be talking about AI in a second, where they’re passing all of their communications through AI, right? All of their texts and DMs and emails. And so he calls this a mask machine. They’ve grown up in a world where tech has mediated every external representation of who they are, but he feels like, I don’t know if we actually know how to show up in a way that isn’t mediated in some curated form. It’s not that they don’t want to; it’s that they’ve grown up with the expectation that that’s how they’re supposed to show up.
Ron Dahl Thank you. Just one quick follow-up question before we move on. If you had to pick one area of relationship health, relationship interactions that you’re most concerned about, for us to consider today, what has emerged not only from these stories, but from your work?
Alison Lee Oh, man, I can’t wait to dig into this. When you hear those two stories taken together, the resounding theme that I’m hearing is one of vulnerability. In what ways have we eroded young people’s capacity to really show up in the hard moments, as fully yourself and fully in relationship with other people? That’s always been incredibly difficult, but I think it would be remiss to ignore the implications of the ways that tech has made it even harder, alongside lots of other things: the fact that schools don’t really center human connection anymore, and that there aren’t really third spaces young people can engage in to practice those skills. But tech is part of that.
And so we’re seeing that same concern about vulnerability show up in our data. In the interviews that we did almost a year and a half ago, we were sitting with young people in December of 2024 asking them, why are you turning to AI? What do you like about it? And they’re like, it doesn’t judge me. It’s not going to give me consequences. It’s not going to judge me for the questions that I ask. It gives me really good advice without worrying about the repercussions. Right?
And so that led to a study that I’m happy to talk about a little bit more, where we actually tried to start to document this. We asked young people the number of friendships that they have and the number of adults in their life that they can turn to for help. And we also asked them about their ability to be vulnerable. We asked them, “How often do you feel like you can be real with the people around you?” “Be real” was taken directly from these interviews with young people. And how often do you feel like you are a burden to other people when you ask them for help? How often do you feel like you know how to ask for help, or that you have people in your life that you can ask for help?
And then we asked a whole bunch of questions about AI. What are you using it for? How are you using it? And then we asked some questions around what we call risky AI behavior. So how often do you find yourself turning to AI before human beings when you’re upset or alone? We ask them about addictive tendencies. Do you find yourself having the urge to use AI more and more? And then we finally asked, do you feel an emotional attachment to your AI? Have you ever experienced that?
And it turns out we had this sort of hypothesis, right, that it was going to be the most isolated kids that would be experiencing the greatest rates of these, what we call problematic AI behaviors, right? Human displacement and addiction. But what we actually found was that it wasn’t about the number of connections that young people had. It was them feeling the inability to be vulnerable that was the greatest predictor of problematic AI use. So when they said they couldn’t be real with the people around them, or they couldn’t ask for help, or they felt like they were a burden to other people in life, that was the thing that really led to attachment and displacement of human relationships.
Ron Dahl Thank you very much. We’re going to come back to this issue of being vulnerable as part of connection. Nick, you have spent a great deal of your career not only working with mental health, not only working with the importance of relationships in both supporting and creating challenges to mental health, but how use of technology is interwoven in important ways in both of those. What can you share from all this research, as well as from being a clinician treating young people with mental health issues, that you think is important for us to be considering from the mental health perspective?
Nick Allen Well, I’m sure many people here are aware that there’s a very dramatic change that occurs in the epidemiology of mental disorders around early adolescence. And if you haven’t seen these graphs, they’re quite dramatic. Problems like depression, some forms of anxiety, particularly social anxiety, eating disorders, substance use, and, later in adolescence, psychoses like schizophrenia and bipolar disorder all typically emerge in this period of life. And so one of the things that we’ve got to understand from a clinical point of view is what are some of the reasons for this shift. Changes in relationships, and in what I’ll call relational cognition, the way you think about relationships and the way you represent them in your mind, seem to play a really important role in that.
So we know that during early adolescence, there’s obviously the transition of puberty, which is bringing young people into a new world of different kinds of relationships that they’re starting to negotiate. But there are also changes in the brain that are very important. Ironically, some of the higher cognitive skills that we value emerging during adolescence, which are associated with the development of the frontal lobe, like the ability to represent abstract outcomes and things that are a long way in the future, also bring burdens with them.
So for example, now you can think about something like reputation and what is my reputation now? Reputation is an abstract concept, but I can hold that online. I can give it value, and I can start to reason about where my reputation sits with respect to other people. And I can also start to ruminate about that. I can start asking questions. Why? Why is this happening in this particular way? So my point is, some of the higher cognitive abilities that we really think are very valuable that emerge during adolescence also bring with them burdens that are probably related to the emergence of some of these problems, like depression, for example, where a lot of it is about: am I loved, am I respected, am I valued in my peer and my family environment?
So, just to bring in the technology angle: what we have are these relational technologies. And of course, we’ve had social media with us for a while. But the big new one is artificial intelligence, particularly conversational agents based on large language models, because they are designed to talk to you, and they are good at it, in a way that I would have thought was entirely impossible four years ago. If you had told me then that the conversation I literally had this morning with a chatbot, which I’ll tell you about later, would be possible, I would have said you were crazy. So this relational technology fits hand in glove with exactly the most key developmental tasks of adolescence, both in terms of flourishing, but also in terms of creating risk for problems like depression, social anxiety, substance use, and probably even psychosis, because they are also very much embedded in social cognition.
Ron Dahl So I want to come back and hear about your conversation later. But I want to shift now to you, Thao. You’ve done so much research, when few people were trying to do direct research in these areas, about how young people develop relationships, and about the relational skills and processes that seem to be really important and that may be being altered by technology and AI.
Thao Ha Yeah. We’ve been doing this work for 20 years already. I started out observing how teens interact with their romantic partners. We would actually film them, and we would just see how they would talk about jealousy. And jealousy is a huge, huge topic when you’re 15 or 16 years old. It was amazing to see the variety in how couples resolve an issue like jealousy. Some of the teens are like, we don’t have any jealousy, or they’re dismissing it. Others are going right at it: I saw you talking to this girl, la la la la la. And then others are, I think, more mature; they’re like, yeah, sometimes you feel jealous, but then we talk about it. So you see a whole variety of young people trying to navigate this really intense feeling, the fear of losing someone because they love the other person so much, but they don’t quite know how to navigate that.
And it’s what Nick was saying about the brain: this is all set up by our biology. We’re wired to learn from social relationships, and adolescence is really the time to do that. So if we’re outsourcing some of this learning to a large language model, one that, like Alison said, does make a teenager feel heard and seen, and that seems more objective to some teens because it’s just a machine, that is going to change how young people are learning about their relationships. That relational learning is what we study in our lab. Do you want me to talk about what we study there, or…
Ron Dahl Yes. I was actually thinking, set up by what you’ve just said, about the idea that people have to negotiate setting boundaries about behavior and jealousy, and how different that is with a partner or a potential partner than it is with an AI agent.
Thao Ha Yes. Setting boundaries is such a huge skill. And here I’ll go back to my human-to-human observations of these young people who are in love. You see how they’re trying out different techniques. Some are very firm: I don’t want you to talk to so-and-so. You don’t use Instagram, you don’t use this, you don’t use that. And then the partner reacts, like, okay, I won’t do that anymore. Or they will say, what do you mean? I never do that. And then they show another response. So what you see is relationship learning in the moment, in action. By having all of these different micro-responses, and by practicing these micro-responses over time, you develop a menu of how you respond generally. And you carry that throughout life.
And we found in our data that how you learn to resolve conflict, how you learn to set boundaries, how you learn to negotiate intimacy, that is hugely impactful on relationship quality later in life, but also on mental health. And it goes back to what Nick was saying: if you’re not able to solve your conflicts in the moment with your partner, and that continues throughout life, it is predictive of feelings of depression later in life, and of loneliness and anxiety. And some of this is being outsourced to AI right now. That interferes with the core relational learning that needs to happen during adolescence.
Ron Dahl Thank you. I want to ask both of you, Nick and Thao, a follow-up question about the role of research. When we talk about young people’s relationships, a lot of people who are teachers or clinicians, you know, who interact with young people, have a sense of what we need to do. And I think part of the theme in our frame is how research and studies, whether they’re qualitative or clinical, can provide actionable insights, real insights. So I’d love to hear, Nick, maybe an example. And then Thao: beyond what you might intuit just from observing young people and interacting with them, how does research add value in ways that really could be actionable, thinking about the challenges and vulnerabilities moving forward?
Nick Allen Well, I’m not sure if it’s what you were looking for, but I’m going to make a meta comment about a thing that I think is very critical here, which is that we have a big problem, because the pace of research cannot keep up with the pace of change. In the case of social media, which has been with us for how long? 15, 20 years? We still don’t have good answers about the impact of social media on young people’s mental health. Now, that may surprise you, because one of the things that’s happened is that there’s been a kind of community consensus that social media is bad for kids’ mental health. That’s a message you hear a lot. And there are some places, like in Australia, where I’m from, as you can probably tell because of the funny way I speak, where they’ve actually just recently introduced legislation to restrict the use of social media for kids under 16. So it’s actually flowing right through to policy, and there are many other international contexts which are looking at those kinds of laws. But if you go to the researchers who work in this area, they will generally tell you, we’re not so clear about this. This is not a clear story. And in fact, we still don’t have the data to answer the most important questions about what we should do.
So I introduce that because research is a slow process; it’s about doing things properly and getting it right, and so forth. And sometimes it’s limited by technology, you know, the tools you have to understand something. When you have products that are changing year to year, it’s very challenging for research to keep up. Now, in the case of artificial intelligence, and particularly, you know, large-language-model-based products, things are changing month to month.
So I don’t really have an answer for this, Ron. But this is a challenge, right? We have things that are moving so quickly, at a pace that traditional research models, which are about doing things properly and getting them right, cannot match. And of course, you have to get your funding before that, and then you have to wait for the journal to publish the damn paper and all these sorts of things, you know. So it’s generally like a 6- to 7-year process from idea to outcome. Traditional research models are completely not up to the task right now. And in the meantime, we have young people going through an experiment. I think you were looking for some specific findings, but to me, that’s the most important issue.
Ron Dahl And I think that’s actually the most powerful framework for the discussion: in a rapidly changing world, things are moving so fast that by the time people have processed data from studies carefully enough, looking back, it’s no longer addressing the question. Hence the idea that we need interdisciplinary teams who have insights from research to inform what we’re going to try to do, because we can’t wait ten years into the future to figure out what we should have done in 2026.
And so these are compelling issues. And yet we can’t just let emotionally driven policymakers afraid of outcomes make consequential decisions. And so I think how to connect the science and areas of science and researchers and clinicians together to make informed judgments and quickly evaluate them becomes really, really important. But I want to come back to you, Thao, because again, you’ve been asking basic questions about learning relational skills. And so in some ways, we need to extrapolate as best we can to what are the, the big concerns and how might we approach having a positive impact?
Thao Ha Yeah. What you’re describing, Nick, is my day-to-day life. Every day I’m dealing with searching for funding, writing up research, analyzing data, recruiting couples. You know, it’s what we do. But I would really like to emphasize: yes, it’s moving so fast, but there are also basic principles from our developmental science that we can apply, and we need to do research in the meantime, because for some young people, AI can be a lifesaver.
I work in the context of Arizona, in the public schools. As some of you know, there is no social-emotional development really going on there. We don’t really value adolescents’ relationship development at all. And so, for young people who don’t have a secure support system, like Alison was also pointing out, AI can have really good answers. And some other young people, who are maybe struggling with mental health problems, who feel unseen and unheard, who feel like they don’t matter, might become really quickly dependent on AI. I think we really need research to disentangle when it can be beneficial, for whom, why, and in which contexts.
Because there are some studies coming out among adults, and adulthood is a very different developmental time than adolescence, that show there are some benefits in the short term. These are often 1- or 2-week experiments, and then there is some kind of decrease in loneliness. And I think that’s really important to know, but we also need to study how these relationships develop over time: relationships with machines and relationships with humans. Because if it temporarily reduces loneliness, that explains why people, and young people in particular, like using it. It’s like a Band-Aid. But then what are the consequences over time?
And again, I want to highlight: it doesn’t have to be bad for everybody, but it can be. And we can’t just rely on the media and the most extreme stories; unfortunately, the extreme stories are very, very sad, but we need really good data on this. So when I refer back to these real-life interactions of young people trying to navigate boundaries and love: they’re doing the same with an AI, and we can observe those interactions if we get that data.
Ron Dahl So I want to come back to the vulnerability dimension and the role of how we navigate friction and difficulty and vulnerability as a part of this. But first, I want to come back to Elle Liemandt’s description of exactly some of these issues that she sees in the millions of young people who have been interacting with her approach to this.[2]
Elle Liemandt Hi, I’m Elle, I’m 17, and two years ago I blew up on TikTok. It was just after Covid, and so I had experienced firsthand the diminishing of in-person relationships, whether it’s friendships or romantic relationships; it had all moved online. So I started posting on TikTok, and now I have like 65 million likes on my account. And it’s not because I’m some expert TikToker. It is because this advice is so needed by teens.
When I was interacting with AI and seeing my peers interact with AI, even in 2022, I was already seeing the honesty that would come out when they were talking to an AI. You can be entirely yourself, and that’s what I think people are really gravitating towards with AI, and I think it has positive and negative effects. The negative effect is venting to an AI: being emotionally vulnerable to an AI is not teaching the skill of being emotionally vulnerable to a human being. There’s an entirely different dynamic if you’re emotionally vulnerable to a human being. There are possible consequences, you know; it could change the way they view you. I wanted to shift that into a more positive light, where it was like, okay, I now have this ability for teens to be super, super honest with this thing, this AI.
How can I use that to promote the positive skills? So it’s scaffolding, where it’s like, if you can first be honest with an AI: oh, this is how I feel. I really like this girl, but I’m way too scared because of rejection. And then the AI can slowly coach you. It never tells the teen what to say or what to do, but it’s super important for teens to learn, you know, the spectrum of life skills, because it’s not just about what you’re saying, it’s about how you make the other person feel.
And I do think that at our core, teens, my generation, we want to be in person. We don’t want to have relationships with AI. And I think that it really comes down to providing them with the skills and the ability. How can we make sure that we, you know, protect each other, protect this generation? That’s what I’m trying to build with my app: use AI for good rather than using AI to further harm the generation.
Ron Dahl Thank you. I think that captured it; there’s actually a much longer interview, with really fascinating insights about helping peers understand and scaffold building these skills: to be vulnerable, to repair after conflict, to navigate uncertain boundaries, and to do that work of taking risks. We do have enough science about the value of these things, about repairing after a relationship has ended and the risk for depression. We know a lot about the positive side of building these feelings of connection, the social skills for managing and building up a relationship. How do we use what we do understand and apply it to the new challenge of taking the shortcut with AI, which can give you small bits of that, but without the authentic, real vulnerability?
How can we have enough clarity about the positive skill development that will have value for young people, even in a rapidly changing world, that will give them capacities and skills to better navigate the risks and dangers of AI?
Alison Lee In the survey that we ran back in November of 2025, if young people said that they were using AI for processing their emotions or asking for relationship advice, we asked them: tell me a little bit more about what you asked it for. So we actually coded all of those responses, like 1,500 responses. I have to say, I used AI to do that. So a little note on that: when we talk about cross-product research versus academic research, there’s an opportunity there.
There were three big buckets that started to come up. The first one was emotional processing: man, I have some big feelings right now and I just need to vent. I just really need to figure out how I’m feeling and how to do this productively, because I can feel myself ready to explode, right? And when we asked young people, like, what is the effect of this on you? A lot of it was like, wow, I feel so much clearer about my emotions. Or, I feel like I’m more prepared to talk about my emotions to the people in my life now. So I think that’s a promising skill set, right? When you think about the SEL domain, we want people to develop self-awareness and healthy ways of processing, so that they can be in relationship with other people in the world.
Um, and the second one was about navigating conflict, right? Um, man, I just had this fight with my dad and he was doing all this stuff and I said some mean things back to him. Am I the asshole? [3] Excuse my language, but like, it was about like, how do I navigate this? Right? Like, can I get an objective like set of eyes on this scenario so I can really understand what’s going on, right? Um, and in that I see both promise and peril, right? Because I am hearing young people seek out understanding of, of a scenario. And we know a lot of this technology right now is very good at being sycophantic. No, you were totally in the right. You know, they were such jerks and you were absolutely justified in how you feel. And so that’s another concern.
But the last one, and I'm looking at Thao here because we talked about this last night, was about reading relationship signals. It was: oh, I have this crush on a girl and she did this and this, or she sent this to me, or she made eye contact with me. Does this mean she likes me too, or does this mean she's not into me anymore? What does this signal mean? So a lot of it is about sitting in the uncertainty that comes with the developmental stage they're in, where they're developing an acuity for these signals. And now they're turning to a tool to help them reframe that. But is this a way that might erode their capacity to transfer those skills into real life? That's a question worth investigating, right? What are the long-term effects of this? Is this going to actually build our skill set to do this in real life, or is it going to offload the work and erode that capacity?
And I'll say one last note here before I'll be quiet: I spent some time in academia, really understanding the rigor of academic research and the value that brings. And then I also spent some time in big tech working as a UX researcher, where we would churn out studies in three months. The pace at which we conducted that research was required to keep up with the pace of the development of the technology. But I also had a data scientist, an engineer, and a product manager at my disposal who could pull this data.
So I think what I'm asking for in this moment is actually a new way of bridging interdisciplinary research. Imagine if, Nick, you had access to an engineer who could crawl through thousands of pages of chat transcripts to classify and detect particular moments where an interaction may be facilitating a negative thought, and you could count the instances of it across thousands of people, and then attach that to your body of expertise to say what the most important signals are. Because the engineers are going to know how to build it; they're not going to know what to build. What are they supposed to be detecting? So that's also what I'm hoping to see: a new shift in the way we talk about research and the collaborations that are required in this moment. That's really exciting.
Ron Dahl Thank you. So, Nick, you can either respond to that, and/or I want to go back to your mention of the amazing capabilities of these agents. That creates some positive opportunities, but also risks. I'd love to hear you say a little bit more about both of those.
Nick Allen Yeah. Well, my thought builds on what was just said, which is that anyone who has spent time with the new generation of chatbots will know, as I do from my own experience, that according to the chatbot I apparently have an endless supply of great ideas. And this is very enjoyable: a thing that responds to everything I put in with, that's a great idea, and let me take it further, and would you like me to do this for you, and so forth. I'm making it a bit humorous, but it's actually very compelling to be told that. It's lovely, but it's very different from a human relationship in that respect.
However, it's interesting to look at this issue of the personality of the chatbot, which is actually a big deal that gets a lot of attention within the frontier labs. Some of you may be aware that when GPT-4o was deprecated, there was an enormous reaction online, people saying things like: I've lost my best friend. I don't know who to talk to anymore. This new one's not as nice, and so forth. So my point is that the way we design the personalities, for want of a better term, of these chatbots is really critical.
And Anthropic, as some of you may be aware, has a document they call their constitution. Some people call it the soul document. I've heard interviews with some of the people responsible for this work, and I actually think Anthropic is probably, in my personal opinion, the most thoughtful company about this kind of stuff. They have published this document, and you can download it and read it if you go looking for it.
So this morning I downloaded it, having looked at it before, and I asked Claude what it had to say about healthy adolescent development. It said a lot, but here's the summary: "The Constitution takes a broadly sensible, values-grounded position on user well-being that implicitly offers some protection for teenagers, particularly around dependency, epistemic autonomy and emotional manipulation." First of all, that's an amazing sentence from a robot. "But it doesn't engage substantively with adolescent developmental science." Now this next bit is the bit that freaks me out, and it touches another important aspect of these products: the role of memory. This is going to be a big development soon; it's going to learn about you. So here's the next sentence, and "your perspective" means mine, as someone who studies digital mental health: "From your perspective, the document's framework is coherent but underspecified for this population. The tools are there: well-being, autonomy preservation, anti-sycophancy. But they're not calibrated to the distinctive vulnerabilities or capacities of adolescent users." So that's what Claude says about its own soul.
But my larger point is there's an opportunity here. I do think the large language model labs are thinking a lot about some of these aspects of personality. But based on that, and I'm fairly confident Anthropic is probably ahead of the other labs in their thinking on this, there's probably no one there who's thinking explicitly about youth, the developmental needs of youth, and how this very powerful and engaging relational technology could shape youth development, either beneficially or harmfully. And apparently Claude agrees with that very smart idea.
Ron Dahl Thank you. We have about five minutes left. Thao, do you want to offer a reflection on this set of issues, and/or highlight anything we haven't yet talked about that feels really important in this space?
Thao Ha Yeah. I would like to highlight the relational learning that happens between a young person and the machine, reflecting back on what Alison was saying about venting to it and, the second thing, resolving conflicts with it. Those are core adolescent developmental capacities that apparently Claude does not have in its soul. But I want us to realize that all of these large language models are trained on human data. As humans, we're also not that great at having healthy relationships, and we know exactly when and how intimate partner violence, for example, emerges. So if I'm even slightly coercive, or would like a bit more control over what my partner is doing, and this large language model is reinforcing that, you've got a classic coercive cycle that leads to digital and in-person intimate partner violence, right? These dynamics between a young person and a machine reflect our human dynamics, and we do know when those are harmful. So it's really not that hard for some of these companies to have better design principles and take into account the kinds of signals that predict teen flourishing, and when it's actually deteriorating in the context of using these chatbots. I think there's huge opportunity there. And I don't think it's that hard, because they have so much money and so many engineers and so much access.
Ron Dahl Thank you. Alison, any closing thoughts in this space?
Alison Lee My closing thought: I think this is a room of adults who really care about young people; I think that's why you're here. So I would say, yes, we need these big system solutions. We need Anthropic and OpenAI and all these other big tech companies to take this seriously. We need our governments to take this seriously. And we need research that can move at the pace of this.
I'd also say that we have a lot of agency in this room, right? We all have young people in our lives. We all potentially touch systems that young people exist in. So I would invite all of you in this audience, and people who might be listening, to really think about: how do I bring this conversation about the integrity of human connection in this new world to the young people I serve or the systems I touch? And I'll just say, not only young people; adults need this too, right? To our earlier point, adults don't really have this figured out either.
And so I think the most important piece of all of this is that if we want to get clear about what about human relationships is worth protecting, we have to talk about it and we have to be about it. We have to live it. So live vulnerability. Talk about why it's hard. Talk about why it's precious and worth protecting. And then you can back into the AI conversation: how could AI support our capacity to be vulnerable, and when might it erode that capacity? So I think that's the invitation here, to say we have individual agency alongside our collective agency to push for change.
Ron Dahl Thank you. Nick. Final comment?
Nick Allen Get ready. It's a big change. It's a big change for young people and for those of us who work with them and care about them, and it's something we cannot sleep on. It's moving fast, so let's get on top of it.
Ron Dahl Well, thank you, and thanks to everyone here. These are such compelling issues. I would like to add just one additional thought: one of the most fundamental things that young people are learning about is love. Giving and receiving love. And that's not simply a matter of saying the right words or even expressing ideas. It is given and received, and we really need to cultivate that, as I think others have said, in our own interactions with and support of young people, helping them build the skills, capacities, and feelings of being valued, mattering, and being loved, so that they can better find resilience in this very fast-moving, rapidly changing world. Thank you very much.
********
Ron Dahl Thanks again to our panelists, Thao Ha, Alison Lee, and Nick Allen for joining us at SXSW EDU 2026, and to everyone else who helped make this conversation possible. And special thanks to our wonderful audience.
Now, as promised, here’s a longer version of my conversation with Elle Liemandt, creator of the AskElle AI dating advice app.
Ron Dahl So first of all, I'd like you to just say a little bit about what AskElle AI is and how it works, and a little bit about yourself.
Elle Liemandt Hi, I'm Elle. I'm 17, and two years ago I blew up on TikTok. It was just after Covid, and so I had experienced firsthand the diminishing of in-person relationships, whether friendships or romantic relationships; it had all moved online. So I started posting on TikTok, not really as the expert on all of this, but just as a fellow teen going through this as well. And I blew up. I got 100,000 followers in a few weeks. Now I have, like, 65 million likes on my account. And it's not because I'm some expert TikToker. It is because this advice is so needed by teens.
And that's what I realized. At this time I was getting hundreds of DMs every single day from teens, which, for those of you who know, I have a lot of followers, but hundreds of DMs is a pretty crazy ratio. Generative AI had just come out and I was learning about it in my school, and I thought to myself: the way I was making my videos at the time, I'd never dated, I didn't really know what I was talking about. I was just summarizing psychology articles that I was reading online. Eli Finkel is a huge inspiration of mine; I would read his studies and basically translate them into Gen Z format. And I realized, why couldn't I just train an LLM to do the same thing, so that instead of teens coming to me and DMing me with their questions, they could go to this app?
And so I developed the app with the help of AI. I'm also not a great coder, so there was lots of use of AI around this. I released the app and it got almost 100,000 downloads instantly. What the app does is, a teen can log on and they are faced with an avatar. At the time, I always say this, it was very novel that it was an avatar. Now everyone knows you can talk to AI, like ChatGPT has an avatar, but two years ago that was actually pretty leading edge, and I felt pretty proud of my avatar. The avatar looks like me, and it's supposed to be me giving advice instead of through DMs. So teens can talk to the AI about any relationship question. And the number one thing the AI has been prompted on, and why it's different from ChatGPT or Claude or any of these others, even though it's the same model, is that the way I've prompt-tuned and fine-tuned it makes sure that it is always encouraging in-person relationships. It's not an AI bot that's just trying to talk to you forever. In fact, if you say, oh, I want to ask a girl out, it won't even say add her on Snapchat or talk to her over text. It'll be like, let's do rejection training so that you feel confident enough to show up in person. It's always trying to teach the actual life skills that the teen years used to be the training ground for, which have now fallen more online than in person.
Ron Dahl Thank you so much. I have to share that I'm feeling emotional about the degree to which you're tapping into the need, the value, how many young people want help and advice. They sense something they want more of, and they're getting skills to do this. And it's really exciting to hear how quickly this has resonated and been useful. One of the things you've already identified, and I want to ask you a little bit more about, is how you promote real person-to-person interactions in real life, even though you're using technology to scaffold this.
Elle Liemandt It is so important. And I think that's one of the differences between my app and anything else, even, I would say, peer-to-peer advice, which I love. If there's one thing about my AI, it's that it very much knows its scope. If you go outside that scope and you're asking about, say, this toxic relationship, it's going to say, talk to a trusted adult about this. My AI understands what it is good at and what it is not good at. But one of the things it is really good at is not just giving teens advice, but giving them the actual skills. Because it's one thing for an AI to just say, oh, go do x, y, z, and a teen is like, how am I possibly supposed to do that?
And the other thing is, it's trained on a lot of David Yeager. He is a psychologist that I worked with; luckily, I've been very fortunate to have him as a mentor. I read his book a few years back, and one chapter that really hit home for me was all about how teens basically care about social status. If you can't tie it to social status, you're not going to encourage that behavior.
And so I was very conscious of that. I'm a teenager myself; I understand that. So I was very careful that the AI isn't just giving the perfect, absolutely right thing to do; it's giving advice that is healthy, safe, and positive, but also something a teen would actually implement. You don't just tell a teen, oh, just don't date. That's not going to resonate. In fact, they're just going to disregard the advice entirely and do whatever their limbic system tells them to do in that exact moment.
Instead, my AI is very much: let's give you the skills, and let's make sure they're in line with your priorities and your values, like social status. So: you're really scared of getting rejected, that's your number one fear right now, so let's avoid that. How do we navigate that? It's also conscious of that when it's giving advice, in the sense that if a teen asks, "I'm scared to ask this girl out to homecoming because I'm scared of getting rejected," it won't just take that at face value. It's going to go a little deeper and ask: are you scared because you really, really like her? Are you scared of what people are going to think about you, that you're the guy who got rejected by this girl?
So it's trying to find the deeper, actual root cause of why you are not going in person, why you're defaulting to online, or not dating at all and not getting any in-person relation at all, which is obviously the worst possible situation. The bot is very conscious of teens, their environment today, what is going on. It is always trying to identify the root cause and then slowly scaffold skills that will get them back in person. And I think really the only way to do that right now, and a super advantage, is using AI, and using it carefully, not like Grok companion, for example, which is maybe going in the wrong direction, but using AI to promote good skills in teenagers, which is totally possible.
Ron Dahl Thank you so much. That was a great set of descriptions of key aspects of this, and I'm glad to hear that David Yeager was a part of it. He's a good friend; we work closely together. So yes, it's great to hear, and I want to touch on two things. You started out talking about the importance of giving skills, and then you unpacked that: it's about building skills in a gradual, step-by-step, scaffolded way, which is a great metaphor. And the last thing, which links to that, is that it's not just building a relationship with a bot or a bot-like avatar. It's how this scaffolds doing this in real life with someone, where the stakes feel higher because it's a person who could create both really strong negative feelings and strong positive feelings. How did you zoom in so quickly on the importance of putting that at the center of using AI?
Elle Liemandt I think for me, it was really early on, when AI came out, there were already studies being released. I mean, now three-quarters of teens have used AI, so it's skyrocketed. But even back then it was showing up. I was just at the SPSP [Society for Personality and Social Psychology] conference presenting this past weekend, and I came across a poster where the study was all about how teens feel the exact same emotional release venting to an AI that they do venting to a friend. And I was like, whoa. That's crazy: this is something so human, venting about all these issues, but they feel the same emotional release talking to an AI. It was a crazy revelation for me.
And I think that's when, as I was interacting with AI and seeing my peers interact with AI, even in 2022, I was already seeing the honesty that would come out when they were talking to an AI. You can be entirely yourself. That's what people, I think, are really gravitating towards with AI. And I think it has positive and negative effects. The negative effect is that venting to an AI, and being emotionally vulnerable to an AI, is not teaching the skill of being emotionally vulnerable to a human being. There's an entirely different dynamic. If you're emotionally vulnerable to a human being, there are possible consequences; it could change the way they view you.
There are all of these things you're thinking about. When you're talking to an AI, you can just delete the chat if it starts to, like, judge you or whatever. Oh no, this is too far, I'm just going to delete this chat and start over. And as I was seeing myself use AI and seeing my fellow peers use AI, I was like, it seems that they are being more honest with AI. I wanted to shift that into a more positive light: okay, I now have this ability for teens to be super, super honest with this AI. How can I use that to promote positive skills? If you can first be honest with an AI, oh, this is how I feel, I really like this girl, but I'm way too scared because of rejection, then the AI can slowly coach you. It never tells the teen what to say or what to do. No, no, no, we are self-determination theory over here; that is not the way we believe you actively effect change long term. But it's also changing the thinking, where giving the teens facts is a huge part of it, because the AI knows the social context that I've trained it on.
So it knows the teen landscape that we're working in. It also knows the facts; it's trained on a ton of psychology papers, like Eli's and David's and all of the best people's. So it can effectively give advice that is relevant to the teen but also has the facts behind it. And because teens are super honest with this AI bot, what it says in response can be either super helpful or super negative.
And so I wanted to build a bot that wasn't just reassurance, because that's what I see with a lot of people. It's just like, oh, that's amazing, whatever you vent, whatever you say. Like, I had a friend who'd gotten into this weird situation with a boy, and she'd explained it to the AI, and the response was like, you got this, that's amazing. And I was like, that's not the real advice that you would want your daughter or your friend to have. That's not actually helpful in them getting into a long-term relationship or having in-person connection. So that was super crucial when I was developing the app: I was recognizing how I was using it, how my peers were using it, and how I could flip that into something more positive.
Ron Dahl Again, a really wonderful description. I want to weave in a little of the science of relationships that I think resonates with exactly what you're doing. One set of skills for young people exploring relationships is figuring out the right words to be honest but communicate effectively, and practicing that. But the second part is that developing a trusting relationship requires vulnerability, shared vulnerability. Just interacting with an agent that tells you what you want to hear, that always keeps things frictionless or without any feeling of real stakes, isn't the full skill set; it's missing that sending and receiving of feelings that develops trust. And that's what we want in our relationships, correct?
Elle Liemandt Definitely. I think that's the reason I'm so scared of what I call the "AI girlfriend epidemic," people reverting to that, because when we look at the long-term trends, it does seem to be going that way. At first it was just in person. Then it was, oh, you can call them from your home phone for an hour, but you're spending the majority of the time in person. And it's slowly descended to where now there are relationships that are entirely online.
I use this anecdote all the time, and this is a common teen experience. If you like a girl in your science class, you don't go up to her in science class. You wait till after school, add her on Snapchat, talk all night. The next day in science class, you don't go up to her. At night, you talk. It's entirely text, online. I would say this is the average relationship: it's very online. And there's this theory of relationship depth with stage one, stage two, and stage three, where stage one, I would call this the talking stage, is when you're talking with someone online.
And this is where the fantasy for girls really develops, where they just envision that everything they don't know about the person is perfect. So they're talking online: oh, he's going to take me on the best dates, he's going to open every door. Then stage two is the first in-person stage: okay, we're going to hang out in person, we're going to try to go an emotional level deeper. This is where girls get the ick, where they see, okay, this person is not perfect. They have flaws they were maybe trying to conceal online, but in person those just bleed through; it's human nature. And so then they're like, oh, ick, go back, and they go back to stage one. Teens are really, really struggling to get to stage three, which is where the positive, safe, emotionally healthy relationships are: I trust you, emotionally, vulnerably. You trust me. I know you're not a perfect person, you know I'm not a perfect person, you actually have a full understanding of who you're talking to, and they know who you are. That's where you want to be.
But teens are really getting stuck in the stage 1 to 2 cycle. And my fear is that at stage one, is there really such a difference between talking online, texting with someone, and texting an AI bot? I don't think so. In fact, I think texting an AI bot might even be a level above, because it's never going to judge you. It's literally programmed to ask, how happy can I make this user? That is its sole goal in life, versus humans; I hope you're not in a relationship where your sole goal in life is to make the other person happy. So that's where my worry comes from with AI: as we've seen this trend go this way, how far is it going to go? That's why, with my app, I want to bring them back to the top. It's fine to talk to someone online for a few days, but teens are in this stage one phase for months and months and months. That's not productive, that's not healthy, that's not teaching good skills. So my app is trying to get them through that loop to stage three as quickly as possible.
Ron Dahl I just want to zoom in on something that is sort of obvious, and yet I think we don't think about enough, in the contrast: when we are interacting just through text, just online, there's a lot of information we are not experiencing, subtle cues, facial expressions, tone of voice, that in real-time synchrony can generate really important positive and negative feelings. It's like doing a sport versus just drilling one aspect of the sport. If you don't practice all those pieces together, it's hard to be fluent, hard to be competent, and most of those little pieces we're not even consciously aware of, like when we're getting a feeling that someone's uncomfortable, or that an idea landed with that person more deeply. So this framework that's really been the foundation for you is to help support the full real-life skills of interacting with other humans. You send and receive feelings, not just words.
Elle Leimandt 100%, because I mean, before it was it was just you would be put in situations where you would have to do it. Your mom would send you to the dance, and you would have to ask a girl, and you would like that. That was the social pressure where it was like, okay, there were lots of situations where teens were put in proximity to people of the opposite gender and their same gender, where they would have to talk all together in person for an extended period of time. And that has really diminished.
I read an article, oh gosh, I wish I remembered who wrote it, but it was all about the decline of the school dance. And I actually think it's such a great metaphor for the broader picture of what has happened. If you go to a school dance today, the guys are all there, huddled together by the chairs, like, oh, we're not going to do that. And all the girls are on the dance floor just waiting for a guy. But the guys are so worried about rejection, about their fellow boy peers, what are they going to think of me? All of those things are holding them back.
And because we've lost those reps that are so important, because they've moved online. It was happening before Covid, and I don't like blaming Covid for everything, but Covid really did expedite it, where suddenly even in-person relationships got moved online. I think we still really have not gotten back in person for teens, because we lost those super key reps. And we're seeing it not just in the teen years; there's a study that says 45 percent of men ages 18 to 25 have never asked a girl out. So it's not just 14 to 18. Now it's 18 to 25, and suddenly it's actually pushing the marriage age and the birth rate.
So it actually is causing a domino effect. I get this a lot from parents: okay, my teen didn't date in the teen years; isn't that kind of positive? I'm just going to wait for them to get to college. But clearly it's still not happening even when they get to that second phase. So that's why I think it's crucial.
Ron Dahl Thank you. I want to come back to a comment from earlier and unpack it a little more. You were talking about how it really helped you to understand, through work like David Yeager's and others', how sensitive adolescents, even early adolescents, are to what you called social status. And I want to unpack that, because I think sometimes people think that's just purely popularity or getting higher in