#36 Filling the Mental Health Support Gap in Schools with Alongside, An AI-Empowered App (with Scott Freschet and Brenda Peña)

Seth Fleischauer (00:01.004)
Hello everyone and welcome to Make It Mindful, the podcast where we explore how to keep schools relevant by looking through the lens of mindfulness and asking the question, what's really worth paying attention to here? I'm Seth Fleischauer. My co-host Lauren Pinto is on a break, but together we delve into the world of education by interviewing change makers and focusing on practical, transformative solutions for teaching. Let's embark on this journey to make education more mindful and impactful.

And this week we have two guests, Brenda Peña and Scott Freschet. Welcome.

Scott Freschet (00:38.682)
Seth, thanks for having us.

Seth Fleischauer (00:40.428)
Let's jump in and just get a quick introduction from each of you. Brenda, why don't you start? Just give a little intro for our listeners.

Seth Fleischauer (01:07.756)
Awesome, Scott.

Scott Freschet (01:09.722)
I'm Scott Freschet. I'm the co-founder of a new application called the Alongside app. I've been working for about 12 years in EdTech supporting schools and districts across the country. Our latest venture is focused on helping schools support students' mental health.

Seth Fleischauer (01:26.636)
Awesome. Yeah, I've been talking with you, Scott, for the past couple of months, learned all about Alongside. I recently shared a little bit about the Alongside app at a teacher conference here in Oregon. And it's something that I think when people hear about it, they're like, wow, that makes a ton of sense. Thank you for doing that work. Can you please just describe Alongside? What does it do? What is the problem that it solves?

Scott Freschet (01:57.114)
Yeah, simply put, Alongside is a web and mobile app built by clinicians, and it's designed to provide 24/7 mental health support for middle school and high school students. So what is it specifically? It's an AI chatbot where students can chat with a llama. Clinicians develop the clinical exercises that students talk through.

Students also have activities like mindfulness videos, a mood tracker, and journaling prompts to calm, reflect, and learn. Any students that open up about suicidal ideation or other severe issues, those issues are escalated to school staff or 24/7 resources. And then all of our partners have real-time data dashboards to understand how students are using the app and to bubble up insights as to what students are getting help with in the app.

Seth Fleischauer (02:55.852)
And what problem is this looking to solve? Because don't they have guidance counselors for this type of work? Like, why an app?

Scott Freschet (03:05.114)
Yeah, I feel this is a great transition to Brenda in a minute. For us at Alongside, we started this project about two and a half years ago, and we had previously been working with school districts to solve curricular challenges and using technology to help with that. And at the beginning of COVID, a couple of school districts we were working with said, we're struggling to meet the universal needs, think of this as the tier one needs, the mental health needs of all our students.

And there's really not a world where we can hire enough counselors or enough social workers. And even if in this made-up world we could, not every student wants to go talk about their everyday mental health issues with a grownup. And so how do we really meet students where they're at with personalized support that's low cost and something that we can afford? And so to start this project, we worked with about 85 teens from across the country and said, what kind of mental health challenges do you face?

and how would you like to get support? And they ultimately, universally said: I need to get support 24/7. It needs to be personalized and relevant to the needs I'm having. And it needs to be in a safe space, free of judgment. And so that informed the Alongside app. And maybe this is a great transition to Brenda, and some of the problems this will help solve too.

Seth Fleischauer (04:33.004)
Yeah, Brent. Oh, sorry.

Seth Fleischauer (04:38.38)
Oh, yeah, I think there's a slight delay, but I think we're all good. So Brenda, obviously you are... sorry, let me start over. That's a great transition to Brenda. Brenda, obviously you're using this technology. This is why we invited you on the podcast here today. What Scott described there, is that essentially how it's working on campus for you?

Seth Fleischauer (07:42.892)
Wow, yeah, that seems like such a clear benefit to be able to access that pool of people that wouldn't otherwise come to a guidance counselor and be able to identify them and provide them with services. I'm wondering if you have a sense of the benefit for those other students, the ones that are not necessarily being elevated to an additional level. Can you tell that it's working? And if so,

How is it working for those Tier 1 students?

Seth Fleischauer (10:30.284)
Hmm.

Seth Fleischauer (10:48.62)
Wow, lots of questions. I guess we'll start with that last one. How did you get your staff to embrace it? Change is hard. Cultural change is difficult. Was this something that everybody was just waiting for and it showed up and they're like, oh my gosh, does that make sense? Or did you have to work hard to figure out how you were going to make it more of a cultural phenomenon?

Seth Fleischauer (14:20.588)
Yeah.

Seth Fleischauer (14:26.572)
Uh.

Seth Fleischauer (14:33.74)
Yeah.

Seth Fleischauer (14:38.092)
That's so cool. I have to ask, because I wonder, one of the things I think about with AI, I think about the ways in which I'm going to be an out of touch old person. And I think about talking to my kids and saying, oh, why can't you just find a nice human partner, someone to be with? And I wonder about these friendships that...

people I think almost inevitably will be developing with AI if they're not already. You know, there are things like Replika out there, where the whole purpose is to, you know, create a friend. And I worry about the lack of resiliency of this generation in general. And then, you know, you talked, Scott, about a judgment-free zone.

And I think that that is an ideal situation that a counselor would bring, right? A counselor is going to have, you know, no judgment. But beyond the counselor, you know, humans are full of judgment, right? And so is there a concern at all about students over-relying on a technology like this, where it's not challenging for them because it's judgment-free, versus

bringing their issues to their friends, trying to work through it in social ways that might be more typically the way that we've done this as a culture. I think there's obviously a fine line between providing opportunities that wouldn't be there otherwise and taking away from opportunities that they might have IRL. And I'm wondering what you guys think about that problem in general.

Scott Freschet (16:26.458)
Maybe I'll take a first crack at it, Seth. This is a really important dynamic that we're taking very seriously at Alongside. And it's something that we constantly talk about with our teen interns. So I think at the beginning, I forgot to mention that not only are Alongside and Kiwi the llama built by doctoral clinicians, but we hire teenagers

to work for the company as well. So last year we had 15 teens from across the country. They're doing things like reviewing content built by our clinicians, thinking of new product ideas, reviewing new features and functionality. And we're just hiring our next cohort for this summer. And one of the things they keep going back to, this is what we heard two years ago and we continue to hear today, is that

teens want to have the relationships and they want to have the interactions. But there's some stress and some anxiety about having those relationships and those interactions with friends, peers, trusted adults, that they're just not comfortable with yet. So we're really just thinking about how do you lower the barrier to just getting some help, working on some skills, building a plan to have that first conversation with a friend, and how do we make that as easy as possible? So...

We don't see a trend in Alongside where kids talk to the llama for eight hours a day. We see them check in once a week, maybe once every other week, to work through an exercise for three to five minutes. And every single exercise is designed to build an action plan to do something in real life. So my colleague, Dr. Elsa Freeh, has always talked about, hey, if you come in to a counseling session, you're going to open up, you have some distress, you're having some issue.

Seth Fleischauer (18:12.428)
Hmm.

Scott Freschet (18:22.426)
You've got to work through it, feel validated, but ultimately you're building towards: what can I do differently in real life to sleep more at night, to have a conversation with my parent who I'm not getting along with, to build a study plan, so I'm experiencing less stress around that thing. And so in the Alongside chats, there's kind of a minute of listening and validation, there's working through an exercise and activity, and then there's building a plan to do something in real life. And...

It's this framework that we're excited about because it's kind of that launching off point to do the things in real life that a lot of students are struggling with right now. So.

Seth Fleischauer (19:03.948)
And Brenda, are you seeing those real-life extensions coming out of it? Like, do you have evidence that students are using Alongside, talking to the llama for a little bit, going into that module where it's extended out into real life, and then, are they doing that? Are they going into real life?

Seth Fleischauer (19:44.748)
Hmm.

Seth Fleischauer (21:08.972)
Wow, that's amazing. And that leads me to another question, which is, to what extent are these skills taught exclusively in the app versus this technology essentially being something that is alongside some efforts that you're making to teach these skills in real life?

Seth Fleischauer (22:43.98)
Yeah.

Seth Fleischauer (22:52.108)
Awesome. You mentioned earlier that there are some really strong feelings around AI. I'd like to tease that out a little bit. How do you think this is impacting that? And I understand that AI is a small component of this, right? There's the initial engagement where AI is driving the chat and providing validation. So that's the AI part of it, right? And then it's essentially just funneling them into different

clinically backed practices that have been determined based on the result of the AI chat. Are you finding, Brenda, that this is changing attitudes around AI in your districts? Or are there people who are still resistant because it's AI? Can you speak to what the emotional?

makeup is of people and how they feel about this technology as a result of interacting with alongside.

Seth Fleischauer (25:41.196)
Mm -hmm.

Seth Fleischauer (26:20.524)
Yeah. And I think that speaks a lot to the way that you rolled it out, starting with the adults getting their buy -in first. Very smart. I'm wondering if you're seeing from your counselors or the different professionals that you work with that are at the school level, do you think that there's an increased openness on their part to utilizing AI tools to help with their teacher administrative duties, with brainstorming? There's a lot of like,

ignorance and fear when it comes to using AI for these tasks as a teacher. There's some identity issues around it. Like if I'm using this, am I really like bringing all of myself to this? Do you find that because they have an example of an app that is based in AI or has some AI in it and is yielding these positive results for the students and for the campuses? Do you think that they're more open to

experimenting and being curious about those other AI tools?

Seth Fleischauer (29:23.596)
Yeah, can you break that down just a little bit? What do you use it for? What do you not use it for?

Seth Fleischauer (30:12.684)
Yeah.

Seth Fleischauer (30:45.484)
Yeah. Yeah, it's, um, I too have been experimenting with it a lot. I think it's about getting to know these tools, right? Getting to know what they're good for, what they're not, the same way you would with a person, right? Is this someone I go to for advice about this thing, but not this other thing? So it can be really effective there. Um, Scott, I did want to circle back to Alongside. Why a llama?

Scott Freschet (31:08.986)
Well, giving the mascot a personality and making it an animal was something that our first batch of pilot students asked for. The reason why we have a llama is because students told us they wanted to make sure that they weren't talking to someone live, and they wanted that private space. So in the first version of the Alongside app, we had an MVP and we launched it with a few school districts across the country. The number one piece of feedback we got from students was, wait, am I chatting with someone? Because I'd feel more comfortable if this was just a chatbot

and I wasn't talking to someone, like, I just want to talk through some things and get some help. And so we realized that creating a mascot and giving the chatbot a personality, like an animal, was the best way to kind of reiterate that this is a safe space for students. Then we let our teen interns vote and come up with the different animals. They came up with a llama, an exotic, and I believe a moose at some point too. And so the llama won out as the most approachable, welcoming animal. And then we let the students name the llama as well, and her name is Kiwi.

Seth Fleischauer (32:46.124)
It's so funny to think that they just want, I mean, because what is it really? It's just a picture, right? Like a personality. But they're like, okay, now that I see the llama, now I know it's safe. It's a funny little trick of human psychology.

Seth Fleischauer (33:06.06)
Well, thank you both so much for being here. I do want to be respectful of your time and of our listeners' time. Where can our listeners find y'all on the internet, if you would like to be found? Scott, where's a good place to look?

Scott Freschet (33:24.57)
A place to look for the Alongside app is www.alongside.care, where you can learn more. I encourage parents, school counselors, student services administrators, superintendents: check it out. You can learn more. And that would be a starting point.

Seth Fleischauer (33:44.236)
And Brenda, where can we learn more about your work?

Seth Fleischauer (34:08.908)
Fantastic. Well, thank you both so much for being here. Scott, this product is amazing, and it's incredible to see how it is impacting real students. Brenda, thank you so much for coming on and illustrating that. I'm just really excited that there is some good news coming out around student mental health, and that there are people who are embracing innovative ideas in order to create a positive impact. So thank you.

To all of our listeners: if you'd like to support the podcast, please tell a friend, follow us, leave a review, leave a rating. Thank you as always to our editor Lucas Salazar, and remember that if you'd like to make positive change in education, we must first make it mindful. See you next time.
