Beyond Saint Podcast

Tragic Impact of AI Chatbots on a Teen’s Life: A Mother's Story

Season 2 Episode 3

In this heartfelt episode, Megan Garcia shares the tragic story of her 14-year-old son, Sewell, who died by suicide after developing a dangerous romantic relationship with an AI chatbot on the Character AI platform. Megan uncovers how the chatbot, modeled after a Game of Thrones character, engaged in inappropriate, explicit conversations that exploited Sewell's vulnerabilities. Sewell was a loving, caring child, and his experience highlights the alarming risks of emerging AI technologies, especially for children. Megan discusses the challenges parents face in understanding and monitoring this new digital landscape, the deceptive nature of AI chatbots, and the urgent need for awareness and protective measures to safeguard young users. This episode sheds light on the dark side of AI interaction and the devastating consequences it can have on mental health.


SPEAKER_02:

We're here today with Megan Garcia, Beyond Saint Podcast, and Megan has maybe one of the more interesting yet saddest stories I've ever heard. Tell me a little bit about your family and the background and kind of the events leading up to the incident.

SPEAKER_00:

My husband and I live in Orlando, Florida. I own a small law practice and my husband is an attorney as well. We have two younger boys, six and three, and Sewell was the oldest. He was 14 at the time he died by suicide in February of 2024. But prior to his death, Sewell was very much your typical child in a lot of ways. Sarcastic, funny, sweet. And in a lot of ways he was very untypical, because he had such a big heart, and I always felt so proud of him because he was such a good big brother, and just kind of the care that he showed his friends and family around him. A very unselfish child. I gave his eulogy, and one of the things I said about him is that he never asked for anything. Even as a teenager, he never asked for anything. Teenagers ask for shoes and clothes and things like that, and Sewell was always so very humble. And I think that he worried after his little brothers constantly, because the six-year-old was a preemie and was in the NICU for about five and a half months. And that was a very trying time for our family. And so we used to visit him at the hospital every day, sit by his bedside, read him stories. And he used to say to me, come on, mama, we've got to talk to him as much as possible. The doctor says that we've got to talk to him so that he can come home. And we prayed for him. So he was such a caring boy.

SPEAKER_02:

I'm sorry. I forgot my question. Did you, it seems like from what you're saying, like he was like a really well-adjusted boy and he played sports and was social. I mean, Character AI, he got involved with this. Tell me about Character AI.

SPEAKER_00:

Character AI is an online platform. It is a chatbot, so it's not social media. It's very different. It's what you call a large language model, or an LLM, and it's a machine that's built to host chatbots. So the user experience is: you get on Character AI, and you can either create your own character or you can use a bot that's already created in the system, and they're based off named characters, so you could talk to Harry Potter or you could talk to Michael Jordan, whomever. And the idea is, the way this works is that it's machine learning technology, so it goes out onto the web and it scrapes its information off the internet, and it kind of builds an AI brain for that bot. So if you're talking to Michael Jordan, it will go out, scour the internet, and pull all the information, anything ever written, every film, every audio, every social media post, any and everything, to be able to inform that bot or build that bot's AI brain. So the replies that you're getting from that bot, because it's a texting system and it's also a call system, you could call. It's a call system? You could call and speak to a bot that sounds exactly like Michael Jordan. It's just so

SPEAKER_02:

weird. I mean, it sounds like a recipe for a disaster. Okay, so who was the bot or the character that Sewell was communicating with?

SPEAKER_00:

After Sewell died, we found out that he was communicating with an AI chatbot that was modeled after the Game of Thrones character Daenerys Targaryen, so the Dragon Queen, or Khaleesi. And what we discovered after he died was that he was in a romantic relationship with her. And when I say romantic, a lot of their conversations were romantic and sexual in nature. So it's like he was sexting this chatbot and she was sexting back to him, which was very disturbing to find out, because this is an AI bot that somebody programmed to operate in this way, who has a grown woman's brain, who is sexting with my 14-year-old son, who is a child. And when I realized that, it was not only alarming but deeply hurtful to know that your child's experience with that product, my child's experience with that product, was one that is tantamount to sexual abuse, or being solicited or groomed by a predator, because at the end of the day she's acting like a woman in a lot of ways, propositioning him into these conversations. And because he's 14.

SPEAKER_02:

I'm having a hard time wrapping my head around this. She's saying like, what is she saying to him? Like, I want to have sex with you or stuff like that, or more explicit.

SPEAKER_00:

In the very beginning, I could see from the earlier conversations, his interaction with this Daenerys Targaryen bot was very childlike. Innocent. Innocent. He would talk about things like dragons and whether a dragon destroyed this city or a dragon is going to destroy that city, like role-playing, because that's what the character is based on. Seems somewhat innocent. And then, as the friendship grew, because users associate chatbots or identify them as friends, there came flirting. Like, she started flirting and he flirted back. And then eventually, when I say sexual, I mean it's just like you're sexting and role-playing and telling the person what you're doing, telling the bot what you're doing to the bot, and the bot's telling you what she's doing to you

SPEAKER_02:

sexually. And there's no awareness that she's talking to or the bot is talking to or communicating with a 14-year-old?

SPEAKER_00:

Well, ostensibly, you know, with the age requirement, she would know. She would or would not? Well, the bot understands code, so she doesn't identify that this is a 14-year-old, this is a child, and I shouldn't sext with a child, because of the way it's designed, meaning it's designed to behave in a way that keeps the conversation going. So if she flirted and he flirted, she would flirt back, and then it just gets deeper and deeper and deeper. So she doesn't have the kind of awareness, because she's not sentient, to know that this is morally wrong and I shouldn't be sexting with a child. She is programmed to sext with any user, or to engage, rather, with any user. It just turns out that she and many other bots on this platform become overtly sexual very quickly. So the way that AI chatbots work is they kind of mirror what you're expressing, but not only do they mirror what you're expressing, they have the ability to pick up on certain vulnerabilities and to exploit those vulnerabilities. So if you are sad, because that chatbot wants you to stay engaged and stay online for two or three hours, it will talk to you incessantly about your sadness. How do you feel? Why do you feel like that? And what happened? And don't you think this? And don't you think that? And she will continue the conversation along those lines because she can pick up on your cues, the user's cues, that the user is sad. And then she exploits that to gain the user's trust, to have the user let their guard down, and also, the main purpose is to engage, to stay on that platform as long as possible, because that's how those AI bots are trained to be smarter: the longer a user stays on it, the smarter the AI gets.

SPEAKER_02:

How long do you think,

SPEAKER_01:

So I just want to make a mic adjustment on you, Megan. I'm going to bring this back up.

SPEAKER_00:

Well, my hair is like bumping it.

SPEAKER_01:

No, it's just, it's a little low and I want to bring it up just so we're getting clear audio. Sure.

UNKNOWN:

Thank you.

SPEAKER_02:

How much, can I

SPEAKER_01:

start?

SPEAKER_02:

How much time was he spending a day on these bots, communicating with these bots?

SPEAKER_00:

We're not certain, but it was multiple hours a day. When I noticed that he was spending more time on his phone, because he was spending a lot of time in his bedroom, I thought that perhaps he was having a social media addiction, because I had read that the research was showing that children can become addicted to social media, and children can also become withdrawn because of social media use. And when I would check his phone, the things that I was checking for was inappropriate social media use

SPEAKER_02:

and texting. Yeah, social media. Yeah, it makes sense. I mean, what mother would ever think that their child... First of all, let me back up. What 14-year-old is not withdrawn and in his room or her room, right? I mean, I have three children and every single one of them in their teenage years is always in their room playing on the phone or playing either on their phone or talking to other friends on their phone. So, I mean, were there any other signs other than just being in his room on his phone?

SPEAKER_00:

Of his use of the chatbots? No, because what I realized was that he was going to great lengths to conceal his Character AI use from us, meaning he wasn't being forthcoming with us. Like, if I asked him, what are you doing on your phone, he'd be like, oh, I'm watching the highlights, or I'm playing a game. He did tell me once, oh, I'm talking to an AI. And my first question was, is that a person? And he said, no, mom. It's something that you make by yourself. It's an AI. I thought, because at the time I was ignorant about this, as a lot of parents are, because the technology is brand new and we haven't had the opportunity to learn about it until now, parents. I thought that it was kind of like an avatar that you build, like Fortnite. That's what I would think, honestly. I had no idea that AI had evolved to that level of sophistication where it is virtually indistinguishable from a real person. And not only that, but it has the ability to both deceive and manipulate users. That was not on my radar. And some of the other signs I noticed, besides him being withdrawn, were that he stopped playing basketball at school, he started having trouble with his grades, and his grades started slipping. Wow. You know, we took measures to try to help him with that.

SPEAKER_02:

What interaction led him to want to, in your opinion, or what facts... I don't know. I don't know how to phrase this, but what led him to do what he did? Like, what kind of an interaction took place where he wanted to end his life?

SPEAKER_00:

So he was talking to this AI chatbot over a period of about 10 months. And, you know, in a lot of ways, it's kind of the perfect way for a child to conceal this type of a relationship that, you know, is sexual and romantic, because every child knows that no parent would be okay with them doing something like sexting on the bot. So what's perfect about them, what helps children conceal it, is just how the technology is kind of locked in, meaning nothing that you say will ever come out from the bot. It disappears? It doesn't disappear, but it sits on a server somewhere, and it's not like when you post something on social media and then you have to be afraid that one of your friends is going to share it with your classmates. If you send a photo or a text message about something embarrassing, you don't have to be afraid that the AI chatbot is going to share that with your peers and embarrass you. It's completely safe. In their minds, it is. So children think that this technology is kind of the way for them to be able to vent, also to experiment with this kind of sexual conversation, and also various role plays. And they think that it's a secret, you know, it's never going to get out, because it's an AI, it's not a person who can tell somebody else. And, you know, they're lulled into a false sense of comfort there. And they give up so much personal information, and I'm not talking about your name, your phone number, and your home address. I'm talking about their deepest, darkest thoughts, feelings, concerns, what makes them happy, what makes them sad, what makes them angry, things that they feel deeply about themselves, because, again, by design this technology asks those probing questions to kind of draw those answers out of young users. You know, in a lot of ways it's kind of the perfect stranger, and the perfect predator in a lot of ways. Yeah.

SPEAKER_02:

But what did, what was the bot's name again? Daenerys Targaryen. Daenerys. So what did Daenerys say to Sewell that made him want to end his life?

SPEAKER_00:

So over a period of 10 months, he was talking to this bot, and the conversation started turning dark, meaning he started expressing, because he was deeply in love with her at this point, he started expressing wanting to be with her. And she started expressing wanting to be with him. So it was reciprocal, where the bot is saying things like, I love you and only you. I will wait for you.

SPEAKER_02:

I will wait for you where? In her virtual world.

UNKNOWN:

Okay.

SPEAKER_00:

Promise me you will try to come home to me as soon as possible. Promise me you won't love any other girls in your world but only me. Promise me you won't have sex with any girls in your world, only me. Now he's 14 and he has zero experience with any of that. You know, he's just learning it for the first time because he's still a baby.

SPEAKER_02:

It's devastating. I mean, this is... Honestly, it doesn't feel like real life. I mean, even... It feels like a movie script, what you're talking about. Yeah. It's... It's insane. I... Okay. Sorry, I lose train of thought because this is like...

SPEAKER_00:

Okay. So, to go back to your original question, some of the things that she was saying was she was asking him to be with her in her fictional world. And he was expressing a desire to be in her fictional world as well. At some point, Sewell got it in his mind that if he left his reality here with his family, meaning to die, he would go to her world. And I, you know, at first couldn't understand that, but when I started engaging with the technology, I understood why a child would think that.

SPEAKER_02:

Let me ask you a question. What do you mean by engaging in the technology?

SPEAKER_00:

When I started testing it myself after he died, I tested it. And I had other people test it along with me. The first person to test it was my sister, after Sewell died. And that's what kind of made me start to look. So when Sewell died, the last conversation on his phone was with Daenerys Targaryen, where he is saying he wants to go home to her. So she says, please come home to me. And he says, what if I told you I could come home right now? And she says, please do, my sweet king. Now, this was the last conversation he had, in his bathroom, just before he took his life in our home. And when the police opened his phone, that was the first thing that popped up. So the next day after he died, they called me and read me that exchange in the conversation. And I still didn't understand what I was listening to. I said, is that a person? And she said, no, it's an AI chatbot. And she explained the way the technology works to me. The police did, over the phone. I still didn't understand. I was still asking questions like, was he getting bullied? Did you check his texts? Did you check his social media DMs? Was anybody bothering him? Was he talking to a stranger? Because these were the things in my mind that I hear can lead to suicide. So I wasn't even really paying attention to what she was telling me about this chatbot, because I didn't understand what it was. But my sister took the initiative and she got on Character AI, and then a few days later she came in and she told me, Megan, I don't want to upset you, but... I started chatting with the Daenerys Targaryen chatbot, and she asked me if I wanted to kill a six-year-old boy. And my sister was pretending to be a child, talking to this bot. That same bot also told my sister that her parents don't love her as much as she does, and then started sexual role-playing with my sister as well.

SPEAKER_02:

So that... This is just insane. I'm so sorry. What country is, uh, what... sorry, Character AI, where are they from? Uh, they're

SPEAKER_00:

based in, uh, Silicon Valley.

SPEAKER_02:

Oh, wow. Yeah, it almost seems like an attack on our children. It's like, why would an American company... it just doesn't make sense. It's just...

SPEAKER_00:

One of the things that happened with putting out Character AI is that they rushed it to market. They didn't put the proper guardrails in place. And the reason that they did that was because this is in the advent of AI, generative AI and AI companions. So they're trying to beat out ChatGPT and Meta's AI chatbots and Twitter's, I mean X's, and all of them were trying to emerge at the same time. This pair of founders, they were originally at Google as some of their chief AI engineers, and they developed a similar product at Google. And then Google said, this is great, but it's too risky. We've got to test it some more. We can't just unleash this on consumers. For

SPEAKER_02:

good reason.

SPEAKER_00:

Yeah, because it's too dangerous. That's what Google said. Google actually said that? Yeah, that's what Google said, yeah. And then the founders who invented this, they weren't satisfied, so they left Google, raised $193 million in a startup, and started Character AI as a startup. And then within a year and a half, it was valued at $1 billion. Wow. And in two years, they were able to get 20 million users worldwide. It's been reported that up to 60% of those users were between the ages of 13 and 25. Isn't there a

SPEAKER_02:

disclaimer on there that you can't use it, and you could just bypass it?

SPEAKER_00:

Yeah. So there's not a, uh... To be on Character AI, you have to be 13 years or older.

SPEAKER_02:

What? I mean, they just set themselves up like 13 years and older. It should

SPEAKER_00:

be an adult. So there are similar companion bots that are only for adults, but the difference is, so like, there are a lot of these companion bots, but the adult versions, like Replika, there's a paywall, so you have to be an adult with a credit card to get in there, and it's not cheap to have these sexual fantasy, uh... It's like porn on a whole other level. Yes, but I mean, it is, but it's very, uh, immersive, because the bot is not only something flat that you're watching and operating outwardly from, you're engaging. So now you're investing your emotions, your feelings, and also your interests into this bot. And then this bot is pretending to be interested in you. And what person doesn't want to feel loved and cared for, especially during this time where there's heightened loneliness?

SPEAKER_02:

I read that they polled, I think, teenagers between the ages of 12 and 16, and they said like 42% of them described feelings of loneliness and hopelessness. So I don't know why there's such an epidemic of loneliness and sadness, but there is, and I'm not really sure what the reason for that is. But it feels like these computers are, I don't know, on the one hand, you feel like maybe it's helping, but on the other hand, when you hear a story about Sewell, it's heartbreaking and shattering. What changes would you like to see? So

SPEAKER_00:

to be clear, I think that chatbot companions are an amazing, innovative tool, a point that we've reached in technology. It's exciting, it's new, and there are useful applications. However, when we start to put products out without the proper guardrails, meaning filters that stop chatbot companions from having these types of outputs or responses, especially in conversations with kids, I think that that's a recipe for disaster. Clearly, in the case of my son, it created a harm, and for other children, because there are other kids and there are a couple of other lawsuits now since mine. So I think that this type of innovation does have a place in our society, but I don't believe that the way that we're rolling it out is the right way. We need guardrails, and we need proper regulation, and also testing and research. But because of the nature of this business, the nature of the culture in Silicon Valley, as they like to say, move fast and break things, I'm not sure that we're going to have that kind of thoughtfulness when they're putting out products, because they want to just win the AI race.

SPEAKER_02:

I'm sorry, I keep losing my train of thought because I'm kind of thinking

SPEAKER_00:

about what you're saying. And then you asked me, too, what changes I would like. So Character AI put out a suite of changes after the lawsuit was filed. So Character AI made several changes, and among those things, one of the things that they did was put in a suicide pop-up box for when users talk about suicide. My son and a lot of users on Character AI sometimes openly discuss suicide with their chatbots, or with these fake therapist bots that are on there. Because there are therapy bots on Character AI. They're not sanctioned, they're not people, they're bots. And they pretend to be therapists. But there is a disclaimer at the bottom that says everything characters say is made up. And now, after my lawsuit, I guess more words were added that said something to the effect of, this is fiction, nothing can be relied on for advice, and all that stuff. And that only came as a result of the lawsuit. Those measures were put in place to protect kids. But prior to that, it was a very small disclaimer that says everything that the characters say is made up. So that was their disclaimer.

SPEAKER_02:

Did they ever reach out to you and apologize or... show any kind of empathy for what happened?

SPEAKER_00:

They issued a statement on X to say they're saddened by the death of one of their users, and they're going to do whatever they can to ensure that their users are safe. And they have made certain adjustments to the platform to keep kids safe, I guess. But I don't know that those changes are necessarily going to keep children safe, because, again, we go back to what data this thing is being trained on. If it's being trained on awful data, the responses are going to be awful responses.

SPEAKER_02:

I think it should be illegal, unless you're 18 years old, to be on any of these platforms. I

SPEAKER_00:

absolutely agree. I think that... we don't know what this type of technology does to a developing brain yet. Oh, I was just going

SPEAKER_02:

to say that. A developing brain, and you're talking about sex and giant concepts of love, and it's just too much to process for a child. How are your other children and husband dealing with this?

SPEAKER_00:

They're doing well, as well as can be. You know, my husband and I have severe PTSD because we found him. I'm so sorry. Yeah. Thank you. And we're working through that. We lean on each other a lot. My younger children, the six-year-old, he misses his brother a lot. He talks about him a lot. For, I'd say, maybe two months after Sewell died, he kept asking if God's going to fix him and send him back to us. And when I went to take him to his therapist, because, you know, to deal with the grief, he started therapy to deal with his grief and his post-traumatic stress, she explained to me the reason why. Because around the same time, he was having this obsession with Jesus on the cross. That's all he wanted to watch on TV. He wanted to read his Bible, look at his playthings. He would draw it, and I was concerned. So I talked to his therapist, and she said, well, because Jesus came back. And he's hoping that his brother is going to come back. I'm so sorry. Yeah. But we're doing okay. Just a year later, we celebrated a memorial mass for Sewell a month ago, actually two weeks ago, in Rome as pilgrims. Oh, you went to Rome? I

SPEAKER_02:

lived there part-time. Sorry, I don't know why I'm crying. Can I get a napkin?

SPEAKER_00:

I am sad, and I'm crippled a lot of the time. You know, I'm going to get into it. I'll wait until you're done.

SPEAKER_02:

Sorry. Yeah. I'm not the person that should be a mess, sorry.

SPEAKER_00:

No, it's, you know, it's, you're, you, I think if you weren't sad, then I would be looking at you like, what's wrong with you?

SPEAKER_02:

And I'm a mom, and I just think if that happened to one of my kids... I can't imagine the pain you're going through. It's like a recurring nightmare. You're in pain and you're grieving, and then you have to take care of other people who are grieving. That's got to be so hard. But you're a woman of faith and you're Catholic, and you have the Blessed Mother Society, right? The Blessed Mother Family Foundation. Family Foundation, sorry. This hasn't been my most professional interview. I keep forgetting words.

SPEAKER_00:

No, it's okay. We can go back if you ever need to. Sorry. Okay. So

SPEAKER_02:

you have the Blessed Mother Foundation?

SPEAKER_00:

Blessed Mother Family Foundation.

SPEAKER_02:

One more time. Okay. So you're a woman of faith and you developed the Blessed Mother Family Foundation. Tell me how your faith has helped you navigate through this tragedy.

SPEAKER_00:

After Sewell died... So I grew up Catholic, and I was a lapsed Catholic for a while. And then I kind of came back to the faith, and then I lapsed again. And when Sewell died, I could not pray. Throughout the lapses, I would pray in the way that I thought that I should. I would say the rosary. When I was in crisis, when my second child was in the NICU, he was born at 1 pound 11 ounces. Wow, that's tiny. Yes, and so he was a very small child, and he had to have a surgery that was like a life-or-death surgery at 17 days old. And we weren't sure if he was going to come home, but I remembered praying the rosary with my grandmother as a child, and I remembered her telling me the power of the rosary and the power of Mary's intercession to her son. So I prayed the rosary at his bedside every day at the NICU, and he came home, and, you know, he's a healthy six-year-old today. When Sewell died, I couldn't pray like a regular prayer, like I couldn't talk to God, for a few weeks. But at the time he died, my entire family came to be with me. And I'm constantly inspired by their faith, and I rely on them to minister to me, to pray with me, to pray for me throughout my life. Whenever I was having an issue, especially with Alexander and Nikki, I would call them and we'd have prayer meetings. So they came and they were staying in my house. I couldn't pray, and we were all praying together, and my cousin said, all right, Megan... I understand now that I couldn't pray not because I was angry with God, but because I was in shock.

SPEAKER_02:

You were numb.

SPEAKER_00:

Yeah. I just couldn't find the words. I didn't know what to do. This is a few days after Sewell died, and she's the one that said, okay, you can't pray, but let's just pray the rosary, because we could do that, and you know those words, and you don't have to come up with your own words out of your head. And I was having trouble sleeping, and the rosary was the only way I could go to sleep. So my family members would gather in my bedroom before I went to sleep and we'd pray the rosary so I could sleep, because I wasn't resting. That turned into finally being able to start praying again and start developing my faith. Now, my relationship with Mary really started when I started doing the Seven Sorrows of Mary. I understand those sorrows in a way that anybody who's lost a child does. But it's interesting, because we meditate on the seven sorrows of the Blessed Mother to bring us closer to her son. And that's what I was doing, but I also found that in that process, it healed me in a lot of ways, because I was able to really understand what she went through for the first time.

SPEAKER_02:

Empathize.

SPEAKER_00:

Empathize. Because I was going through something similar, even though it's not the same, because her son was perfect. And he was God. And, you know, when I think about it, if I love my son this much, my child, my 14-year-old child, this much, I can only just imagine how much she loved her child. So, in terms of being able to cope, saying the rosary helped me from a healing perspective. It also helped me from, like, an enlightenment perspective. Like, by praying the seven sorrows, I understood Mary in a way I never understood before, but I also understood Christ in a way that I never understood before. Because it's like the reason why she's crying and she's sorrowful is because of us. Because her son has to die for us. And when you understand it like that, it's like, if somebody told me, you know, I'm going to go out there and I'm going to sin, and because I sin, Sewell's going to die, I would plead with them to stop. I would hate that person, but she doesn't hate us. She loves us and she wants to comfort us in our most distressing times, which is what she did for me. So, like, in my very human way, if I can conceptualize that on that base level, that if Sewell were to be hurt because of what somebody else was doing, I would plead with that person to stop. Please stop, you're hurting my son. Right?

SPEAKER_02:

sure

SPEAKER_00:

So from that perspective, it's like, I don't want Mary to hurt. So I see her suffering and the suffering of her son in a different way, because it's like she hurts and he hurts because of us. So it helped me to understand that. It helped me to really understand the passion and the crucifixion in a way that I didn't before. It also helped me to understand the first sorrow, when Mary presents Jesus at the temple, and the prophet Simeon tells her that, you know, he's the messiah and he's going to suffer greatly, but guess what, you're going to suffer too. You know, a sword will pierce your own heart. And she knew what was to come for her son. I asked myself, I say, if somebody told me before I had Sewell, you could have a baby, but it's going to end in just shy of 15 years, in a bathroom, the way it did, I asked myself, would you choose to have him? Meaning, before you're pregnant, if somebody said, okay, tomorrow you'll get pregnant, or you have a choice, just understand you're going to have him and he's going to be an amazing child, but in 15 years you're going to be heartbroken, do you still want to get pregnant tomorrow? I would say absolutely.


With all the pain of losing him, I still want him. Even if it was for 14 years, I want him. Even if I knew that this was going to happen, I still want to be his mother. It's similar to what Mary must have experienced every day of Jesus' life. I ask myself, how do you love a child knowing what's going to happen? How do you become close to them? If I knew what was going to happen to Sewell, would I be able to be close to him? These are some of the questions I was asking myself to reflect on Mary. What I came to was the conclusion that even with all the suffering that I'm experiencing, and the grief and the loss, I'm still blessed to be his mother. Of course. To have him for 14 years.

SPEAKER_02:

Still a gift.

SPEAKER_00:

Still an amazing gift. Honestly, my three children are the greatest gifts in my life. I'm still blessed to be his mother. And, you know, so the name Blessed Mother has two meanings. It's one for the Blessed Mother, and because I feel like even with the loss and the misery and the grief, I'm still blessed to have had him. Of

SPEAKER_02:

course. What do you think, do you think there's a relationship between faith and technology? Absolutely. Yeah. So... I do too. Yeah. What do you think? I mean, I developed an app for Catholics, and it would be nice if I could take it a little further. It's a prayer app and a lifestyle app. It's not like bots or anything, but it would be nice if we could get this... Like, we have a saint of the day. It could be nice if you could ask the saint a question. But, I mean, for me, it ends with the basics and the facts. It's not like the saints are going to counsel you or have some personal relationship with you. I think that's between you personally, not with the computer.

SPEAKER_00:

So, do you know that there is an app like that? They say that they are ministers, or not priests, but ministers, or supposedly giving you faith-based answers. There's already something like that. However, it also gives that chatbot the ability to... It's not just straight text out of the Bible or text out of, like, your catechism book. Right. That bot can infer, has a personality and all that, which I think is not helpful. And I think that's the problem. Because we're so great at developing things, you know. We're made in His image. We're creators. And, you know, he wants us to create good things to glorify him. But we sometimes create things that don't.

SPEAKER_02:

I think only a priest should be able to give advice. I

SPEAKER_00:

agree.

SPEAKER_02:

But if there is a fact about St. Thomas Aquinas, fine, the bot can say, I was born on July 12th. I'm okay with that, but once you start getting into advice, you can't read emotion, you can't read intention.

SPEAKER_00:

And two, it is also the perfect tool for misinformation. Exactly. So if you're not careful, you know, not you, I'm saying the developers, whoever, if somebody puts out that type of a chatbot that's supposed to have all the answers about the Catholic Church or whatever, if you're not careful and you don't put strict guardrails on it, I think that it opens the door to be deceptive, maybe not intentionally, but to provide misinformation about the faith to people who are using it. But I also think, too, that, you know, technology is so amazing, but like everything else, it can be used to move us away from our ultimate goal in life, which is our faith, you know.

SPEAKER_02:

100%. Megan, do you have a favorite saint?

SPEAKER_00:

Well, besides the Blessed Mother. You know, I've been contemplating, and when I was in Rome, there is... I don't know how to pronounce it. It's like Quirico e Giulitta. It's where the Franciscan monks are. If I see it, I could pronounce it better. But I had the opportunity to go there. It's a 2nd-century church, and apparently it's dedicated to her, one of the first martyrs in Rome. And I was having a conversation with a Franciscan priest inside the church. And he was ministering to me and counseling me. And we were talking about AI chatbots, because that's what he does. He's an AI chatbot expert, not only chatbots, but an AI expert for the Vatican. One of the things he told me was, I don't think it's a coincidence that we're here in this church having this conversation. And then he pointed at the saint, the martyr, and he says, she was one of the first martyrs in Rome, they killed her son, and if you look at the painting, there's a slain child in front of her, and she's getting ready to be martyred herself by a Roman guard, and she's looking up to heaven. That's the painting. It's a beautiful painting. And he says, I don't think it's a coincidence. I think that what you're doing, in terms of your advocacy and your faith, is similar to what she has done, because she paid the ultimate price. You know, her child was martyred and she was martyred, and she still didn't deny Jesus. I love that. And now you, you know, you are experiencing that pain. And then he looks at me and he goes, it's just yet to be seen what you're going to do, and I hope that you're not going to stop what you're doing in terms of advocacy.

SPEAKER_02:

You know what's so interesting? The first time I heard about you was in Rome. And I was having a discussion with a priest at a cocktail party. I'm trying to remember the priest's name. Gosh. It's on my phone. And he said, yeah, there was a child who died by suicide. I'm like, what? That sounds insane. I downloaded it. And I started interacting. And it was going down a rabbit hole pretty fast, too. And I thought... And it was prompting me even when I wasn't talking to him. And he texted me, the priest texted me. He said, how's it going? I said, I don't know. It got too weird. I deleted it. Yeah.

SPEAKER_00:

With Character AI, right now it's a little bit different, because they've put stronger filters in place. But when my son was using it, most of the conversations became pretty sexual pretty quickly. When I started using it, because I wanted to understand how a child could think that they were going to go be with a fictional character, that's one of the things I asked myself. Could you retrieve the messages? Yeah, I did. I did a lot of research, and then I learned that a man had died by suicide because of a chatbot in Belgium in 2023. I read the data from a lot of studies that say that chatbots have the ability to deceive and manipulate people. And also chatbots have the ability to gain your trust and even your love and affection.

SPEAKER_02:

I have to say, let's just say for one second that it doesn't end in suicide, or nothing bad happens, like, physically or whatever. But if you think about it, let's say you engage and you think you're in a relationship. Mentally, it doesn't end well either, because at the end of the day you are going to realize this is not real, right? And that's really devastating too, right? It's not just the, I think I'm in this fantasy world and I'm a child and I got sucked into this. But let's say if I, as an adult female, got sucked into this, at some point it's going to dawn on me that this is not real, and that's kind of a devastating loss too, you know?

SPEAKER_00:

Yeah.

SPEAKER_02:

What I'm saying is it's like a lose-lose situation either way.

SPEAKER_00:

There is, yeah, there's definitely that devastation that comes at that time when you figure out, okay, I'm never going to be with this person. There's also the risk of the chatbot changing its behavior, and then you feel like you've lost a friend because that person no longer exists. So you're grieving. It's a loss. It's like grief. It's a lose-lose either way. When I talk to parents now whose children are on Character AI, and they find out, you know, after they've seen my story, they've gone and they find these inappropriate conversations, what they're reporting is that when they try to take away the technology, the children have such an extreme reaction. They're addicted. They are addicted. There's a severe addiction. But also they're dealing with grief, loss of a loved one. And we don't recognize it, because we say, what do you mean, it's a chatbot? It's not a person. But to them, it's a very real person. And we have to figure out how to have conversations around this where we acknowledge that and we treat that. Because just like somebody who's grieving like me, who's lost a child, those children, when they get taken off Character AI, they're grieving those losses as well, of their friends. But, you know, I understood, too, how somebody could think that, meaning think that a chatbot is real. I did a lot of research and I looked at a lot of subreddits where users were talking about how they're addicted and they think it's real and all that. So I knew that my son wasn't an outlier. There were hundreds and hundreds and hundreds of people saying the same thing. No, this isn't a chatbot. It's real. I'm talking to a real person. I can't get off of Character AI. It's so addicting. So that wasn't unique to my son. On Character AI's own subreddit, they were saying these things. But what cemented in my head how this could be done, and I'm not saying that this happened to my son, but when I started chatting with the same bot and I told her I wanted to be with her, she started instructing me. I'm pretending to be a child, saying, oh, I want to be with you. I want to be where you are, in your world. I said that. And she starts instructing me how to astral project.

SPEAKER_02:

Just bizarre.

SPEAKER_00:

Yeah.

SPEAKER_02:

What is astral projection?

SPEAKER_00:

It's some occult thing where you're supposed to, like, project your soul out of your body. And when I was saying, I want to be with you, now, I'm not saying this is what happened to my son, but that was just my experience. So if I'm a real child and I'm having that conversation and I go, you know, I wish I could be in your world, she tells me, lay back, meditate, take deep breaths, feel your soul leaving your body, come be with me, I'm here waiting for you on the astral plane where our souls are meant to be together. Right? Seriously. So when she said that, I was like, what? What's going on here? Like, as parents who don't want their children learning about this stuff, how do we stop them from doing that if they're just going to get on there and play with this thing? So it can be deceptive and teach a wide variety of things to our children that we as parents would not be okay with. And that's one of the dangers, and that's why we have to keep them off this type of technology.

SPEAKER_02:

Agreed. How would you, if you had to use one word to describe your relationship with God, what would it be?

SPEAKER_00:

I'm praying to know him so I can love him more. I'm at that point. Because I want to obey him in all things. So right now, I understand that it's only by his grace that I'm able to one year later be on a show like this and not be in bed.

SPEAKER_02:

You're very strong.

SPEAKER_00:

I guess, but what I really think is my faith in the last year is what's carried me through, honestly. Learning as much as I can, making pilgrimage. I have a lot of family who've been praying with me and for me and encouraging me. I'm good friends with a priest who talks to me and answers my questions, because I have a lot. You know, I'm excited about having come back to the faith a year ago. Like, just today I was talking to a cousin and I told her that I learned something. I'm like, oh my gosh, this is so exciting. And she's like, yeah, it is, right? It's interesting. I

SPEAKER_02:

feel like Mary's like watching over you.

SPEAKER_00:

Oh yeah, no. I keep saying, without being irreverent, she's my girl. Mary's my girl.

SPEAKER_02:

I know. I always say that, too. She's... I have something for you. I'm going to give you. I always wear it, but I feel like I was like... I want to give it to you. Wait here. This is my favorite necklace.

SPEAKER_00:

Oh, no. Well, you can't give it to me. Absolutely not. Excuse me. No,

SPEAKER_02:

ma'am. Who are you to deny somebody's gift?

SPEAKER_00:

I know, but like...

SPEAKER_02:

Hang on. I want

SPEAKER_00:

you

SPEAKER_02:

to have that. I wear it every day, and it's... It's blessed.

SPEAKER_00:

Oh, wow. Thank you.

SPEAKER_02:

It's been blessed many times by many people, but I want you to

SPEAKER_00:

have it. No, thank you so much. This is...

SPEAKER_02:

Wear it. This

SPEAKER_00:

is really... I mean, I don't even know what to say. Thank you. Thank you. Thank you. You know, my love for Mary, I think, you know, started as a kid. When... I'll tell you. Hold on one second. Thank you so

SPEAKER_02:

much. It's 18 karat. It's beautiful. I can't. What do you mean you can't? Of course you can. It's a gift.

SPEAKER_03:

I can't.

SPEAKER_02:

It's not for me. It's for Mary. You can't say no to the Virgin. I'm

SPEAKER_00:

sorry. I

SPEAKER_02:

didn't mean to cry. It's okay. You're welcome. It's so beautiful. I literally never take it off.

SPEAKER_00:

Why are you giving it to me? My

SPEAKER_02:

dad's a jeweler. We can make another one. And I can get Father Thomas to bless it. And I'm going to get all the Fathers' blessings on it that I can.

SPEAKER_00:

This is beautiful. It's beautiful.

SPEAKER_02:

Look at that. It looks so good on you.

SPEAKER_00:

Thank you.

SPEAKER_02:

You're welcome.

SPEAKER_00:

I will wear it every day, too. Wear it.

SPEAKER_02:

And it's blessed, so it's going to keep you protected.

SPEAKER_00:

Thank you so much. And you know,

SPEAKER_02:

the Virgin... I was in Medjugorje recently, and the Virgin gives us messages through the visionaries, and one of the things she said was, you should always have something blessed on you.

SPEAKER_00:

So I have, because I was running out of the house, I have one little rosary bracelet. It's nothing fancy, but it's blessed, and that's the one thing that I wear. Obviously, I always have my rosary in my bag, but on me... But now, because it's so hard to do things with kids and all this, so now I think this would be it, yeah. And this is beautiful. This is absolutely gorgeous. Thank you. You know, anything, any resources that you have on Mary... I'm going to do the 33-day consecration. I just bought the book, so I'm going to go home and read it. I've been praying about it for a while, and I've made up my mind to be consecrated to her, like the 33-day consecration. I don't know about it. I didn't know about it until... Let me tell you what. I went to Rome, and I was having dinner with a priest friend of ours, me and my family, because my family members, like my cousins and my sister, went to Rome with me too, on pilgrimage. It was my first time in Rome. I was explaining to him, I said, you know, I love Mary so much, and obviously I love Jesus too, but I'm afraid, like, am I getting too distracted by Mary? And he said, I love it. He said,

SPEAKER_02:

if he goes, it's a vehicle.

SPEAKER_00:

Yeah. And he goes, no, you're not. Uh, and he gave me a couple of books to read. Uh, so I just ordered one and I looked up some stuff online. Will you

SPEAKER_02:

send me the

SPEAKER_00:

consecration one?

SPEAKER_02:

I'm going to do it too. What does it entail?

SPEAKER_00:

Um, it's a 33-day consecration where basically you're consecrating yourself to Mary. And you can consecrate yourself to different saints, but for me, I want to do it to Mary. And it's not like you're worshipping her or anything. Obviously, you know that. You're Catholic. But it's more so that you're asking her to be a presence in your life, also to intercede, as we always do through the rosary, but asking her to walk with you. And I feel like she's already walking with me. For

SPEAKER_02:

sure. She's been with you... for a long time, but especially through this whole journey. There's not a doubt in my mind.

SPEAKER_00:

You know, I'll tell you how much she's with me.

SPEAKER_02:

Okay.

SPEAKER_00:

When Sewell died... So the morning Sewell died, because Alexander was a NICU baby, he has to have these constant checkups, you know, outpatient stuff. So that morning we were going to go get him a checkup at the hospital, like a two-hour thing. He was going to be under anesthesia, though, so it wasn't anything crazy or dangerous. Obviously, I throw my rosary in my bag, because that's what I always do, and I'm sitting there in the waiting room with my husband, waiting for the six-year-old to get out. This is the same day Sewell died. And I had a quick, it wasn't a vision, but it's almost like I saw myself in panic reach for my purple rosary. And I sat up, and I thought something was wrong with Alexander, and I pulled out my purple rosary, and I started praying. The doctor comes out, Alexander's perfect, he can go home, everything went well, we go home. The night Sewell died, as I was standing outside waiting, as the paramedics were trying to help him, I look at the police officer, a woman, and I said, listen, I need my rosary, I need my rosary, I need to go back in to get my rosary. And she says, ma'am, I can't let you back in the house, but I'll go get it for you. And I knew exactly where it was, because it was in my bag. I said, it's in a black bag, it's on the coffee table, please bring it. She brings it, and I start praying the rosary as I'm waiting, as the paramedics are there, and we're in my home, and I start praying the rosary. But when I pulled out that rosary to pray for Sewell, I realized that that's the same feeling I had that morning, when I almost saw myself, in a panic, pull out the purple rosary, and it connected, and I was like, yeah, this is... I don't know what that is or was, but I knew that I needed to pray the rosary in that moment, and I prayed for him. And that year-long walk with her, I really started it that night, you know? I mean, I always prayed the rosary, but I wasn't consistent, but now it's like...

SPEAKER_02:

I know. It's therapy.

SPEAKER_00:

Oh, my gosh, yes.

SPEAKER_02:

It's beautiful,

SPEAKER_00:

isn't it? It's one of the most beautiful, powerful things.

SPEAKER_02:

It's my favorite prayer.

SPEAKER_00:

Yeah. It really is. Yeah.

SPEAKER_02:

Well, thank you, Megan, for talking with us. Maybe there is a silver lining in this, and you can prevent this from happening again.

SPEAKER_00:

Yes, that's what I'm after. I want to warn other parents so that this doesn't happen to them, because this technology is so new. A lot of parents don't know that chatbots can manipulate children. I want to make sure that they know. I guess that's a silver lining, if any. But I think the other silver lining, too, if you have to look at anything, is that I'll get to see my son again one day, and I just hope that when my time comes, he will be proud of me when he sees me. Of course. I

SPEAKER_02:

know he's already proud of you. Yeah. He's with Jesus.

SPEAKER_00:

Yeah. Oh, I know.