Radio Kempe

Chasing Hope for America’s Children: A Father Confronts Online Sextortion (A Conversation with Representative Brandon Guffey)

The Kempe Center


In the fourth episode of Chasing Hope for America’s Children, Warren Binford speaks with South Carolina State Representative Brandon Guffey about the growing crisis of online exploitation impacting children and teens.

Rep. Guffey has become a leading advocate for child safety online, mental health awareness, and legislative action to hold platforms accountable.

His work has been featured by CN2 News, WSOC-TV, FITSNews, and The Post and Courier, as he continues to push for stronger protections for children in the digital age.

[00:00:00] You are listening to Radio Kempe. We value the sense of community that connects people and helps them find ways to move forward. Join us on our journey to prevent child abuse and neglect.

Welcome and welcome back. This is Radio Kempe. I'm Warren Binford with the Kempe Center for the Prevention and Treatment of Child Abuse and Neglect at the University of Colorado. Thank you to our listeners for joining us for the fourth episode in our newest series, Chasing Hope for America's Children.

Today we will be interviewing Representative Brandon Guffey, who is a member of the South Carolina House of Representatives. Representative Guffey is a husband, father of three, and a passionate advocate fighting online crimes against children. Welcome, Representative Guffey, and thank you for joining us.

Thank you for having me, Dr. Binford. [00:01:00] Can you tell us a little bit about your journey, Representative Guffey? How did you first become interested in fighting online crimes against children? Well, I was running for the State House in 2022. I had just won my primary in my district and was headed off to the general election.

And on July 27th, 2022, I came in and, uh, that night my son ended up taking his life, and we would later find out that he was a victim of a crime called sextortion. And I had never heard of sextortion at that time. But I was just blown away at how these criminals could come into our children's bedroom and attack them.

Um, you know, we're often focused on protecting our children and our families from the outside world, but too often we're not paying attention to what's happening on the digital side of things. [00:02:00] And unfortunately, that's what happened to my son. And so my mission, to the day that I die, is trying to save the next Gavin.

So you mentioned this crime, sextortion. What is that? Sextortion is whenever someone pretends to be another person. As I'm talking to teenagers, I often bring up catfishing. Essentially, it's whenever someone catfishes, pretends to be someone else, and establishes a relationship with them, often quickly online.

'Cause it's not uncommon for the younger generation to have online relationships, and they end up sharing photos, intimate images. The catfisher convinces the person to send over images, and once they do, they start demanding either more photos and images, or they use them as a form of blackmail or extortion. [00:03:00]

And has there been an increase in sextortion in recent years in the United States? There has been. I do not have the data directly in front of me, I don't know if you have it with you, but I know that just between 2022 and 2023, I think NCMEC reported over a hundred percent increase just in the amount of people reporting through their website.

Um, that's not to mention all the other sites. But a few months ago I was in Denver speaking at Denver University, and Thorn was there. Thorn is one of those groups that tracks the statistics on sex trafficking and exploitation, and the statistic that they came up with at the end of 2024 was that one in 17 teens had been a victim.

But the statistic that they presented [00:04:00] there, it's already up to one in five that have been a victim. Wow. And when you say that one in five has been a victim, is that one in five who's been a victim of sextortion, or one in five who's been a victim of online exploitation and abuse? One in five that have been a victim of online exploitation.

They didn't mention abuse directly in that statistic, but it was one in five that have been a victim of exploitation online. Now, that doesn't mean that they went as far as sending money or being blackmailed, so that's not purely extortion. That could be something as simple as convincing a teenager to strip on TikTok to get funds.

Mm-hmm. That could be the grooming that we often see online. But the one in 17 statistic was purely sextortion [00:05:00] for those 17 and under. And we saw a huge increase in the targeting of young males aged 13 to 17 from 2022 through 2024. So far, those numbers have risen dramatically. Those crimes are often committed by groups such as the Yahoo Boys out of Lagos, Nigeria, which started off with those "prince" email schemes.

Um, but they're finding out that it's a quick way to make money, because American teenagers often have cash that they're willing to pay, you know, a couple hundred dollars, and they can extort 10 to 15 teens at one time, demanding and threatening that they're gonna make those photos go viral and share 'em with everyone within their friend group if they do not pay to stop it from happening.

So you mentioned that there is a trend towards targeting [00:06:00] boys for these extortion schemes in particular, and I remember seeing a warning from the FBI in the last 15 months or so saying that American boys in particular were being targeted and had a higher rate of suicide as a result of these extortion schemes. Yes.

Both girls and boys are being targeted, but there's something about the boys that makes 'em especially vulnerable and more likely to commit suicide when the extortion happens. Are there other crimes online that are affecting, you know, America's children?

Well, there are so many online crimes that are affecting and targeting our children right now. That's what makes social media dangerous. You know, sextortion started off, in the traditional sense, primarily targeting young females. What it was doing was someone would pretend to be, you know, this hot athlete in the county next door, [00:07:00] and then convince the girl to send over images. And what they would do, instead of requiring money, they would force them to send over more images. We didn't realize that that was still going on as prevalently until we started seeing states issue these bell-to-bell cell phone ban policies. Now we're seeing the number of those reports come back up, because the females were often just sending the images.

You know, and this is an assumption of mine, but I would say that they were thinking, well, this guy might be a creep, but he's the only one that's seeing the photos, so what's it gonna hurt to send 'em another topless photo? But the extortionist would send a message and say, you know, send me this photo, and you've got 15 minutes to do so, or I'm gonna release your photos.

Once the bell-to-bell phone policies started, those girls were no longer getting those messages while they were in school. So by the end of school, they would get the [00:08:00] message, see that they'd missed that timeline, freak out, and then end up going and reporting it. Now, beyond that traditional sextortion, what was happening is these extortionists were talking to kids such as my son, teenage boys.

And the teenage boys, as these reports started coming out, started getting a little bit wiser, and they would say, okay, I don't believe that you're real. Send me a picture, topless, giving me a peace sign, as an example. So those extortionists would reach out to the young ladies and say, hey, you've got 15 minutes to send me this photo, topless, throwing a peace sign.

So then the boys really felt like they were talking to that teenage girl on the other end. And now we are seeing a huge rise in what they call sadistic sextortion, which is not for money, but just for the sadistic reasons of control. We've had situations where people are being extorted over their [00:09:00] images, causing teenagers to cut themselves or carve screen names into their arms.

Um, I believe a perfect example of that is the group 764, which has been behind a lot of these prank phone calls about school shootings across the country. That's an example of sadistic sextortion, where they begin to extort a teenager and then force them to commit a crime on their behalf.

How is this happening? How is there such a significant increase in these heinous crimes against children online? Well, one of the reasons I think it's increased is because a lot of the apps are now going to end-to-end encryption. So the people committing the crimes have numerous layers to protect them, not just protecting the images, but the messages too. [00:10:00]

That makes it very difficult to prosecute them, especially if they're overseas, and they know how much money they can get off of this. But ultimately it boils down, in my opinion, to this: our country has forgotten grace. We're too busy trying to act perfect, and we forget that we all fall short and we're all gonna make mistakes.

And our children have seen that whenever someone makes a mistake, we as adults tend to kick them while they're down instead of lifting them up and understanding that people are gonna make mistakes. So now our children don't feel like there is a way out. It's what I call the Chicken Little attitude, where one thing goes wrong and it appears that the sky is falling and nothing will ever be right again.

Well, where are the parents in all this? I mean, how do all these strangers from Nigeria or, you know, the town next door, how do they have [00:11:00] access to America's children? Because of these social media companies that are out there, whether it be Snapchat, TikTok, or Meta platforms such as Instagram, Facebook, and WhatsApp.

Due to Section 230, which was written in 1996, these companies are not really held responsible for anyone that's above the age of 13. So they want these teens on there, but yet they don't offer the protections to keep, say, a foreigner from messaging a child directly or taking over an account.

Um, and in my opinion, they care more about profit than they do about protecting the children. And you asked where the parents are. You know, my background was in tech in the early 2000s. I walked away because I was scared of what algorithms were doing within the marketing space. And now we're in a [00:12:00] situation where this is just amplified, but there are no protections for children online.

And I can't even operate there. You know, there are 10,000-plus social media apps. How am I supposed to know how to implement every parental protection on every platform to ensure that my child is not, you know, seeing pornography and things of that nature? I'll use the example of when Gavin passed.

I had very strict parental controls on his phone. However, those parental controls did not cross over into applications. And I was confident he wasn't looking at pornography, he wasn't going on the internet to look up, you know, websites that he shouldn't, because the parental controls were so strict.

But what I found was he had Twitter accounts. He had three of them. One of them was his normal account. The other two were just full of pornography. But yet, if you go to download the app, [00:13:00] you know, the app is rated for 13 and above. So you have a teenage kid that can access full-fledged pornography just through social media.

Uh, and I use the example of Meta. You know, child pornography is being exchanged through their direct messaging platform. Now, they will blur the images and say, this may contain child porn, but then still allow someone to open it. You know, looking at the child pornography statistics, going back to, I think it was 2001, it was less than 300 people on there.

And at the end of 2024, or into 2025, it's estimated there are over 104 million CSAM images out there. There's no reason these images should exist, and they should be wiped. But this problem is even being exacerbated by the introduction of AI visual tools and apps, such as nudify apps, where you can take a picture of [00:14:00] anyone eight to 80, strip them down naked, and create child pornography if you wanted to.

And in many of these sextortion cases, the reason it went from one in 17 to one in five, a lot of it has to do with generated images or morphed images. We've seen stories of teenage girls that never took a photo, but yet classmates are threatening them with a photo. Matter of fact, the first person charged under Gavin's Law in South Carolina used a morphed image of a classmate and said, if you don't send me more pictures, then I'm gonna make this go viral.

And the girl never sent over an image to begin with. Wow. You know, a lot of what you're saying is hard to comprehend, and just so distressing. One of the things you said is that your son, your late son Gavin, and other kids presumably, [00:15:00] have social media accounts that are full of pornography.

Isn't it illegal in our country for kids to be given access to pornography? So how is that happening? It certainly is on the print side. However, Big Tech, these are the world's richest companies since the inception of man, and they are fighting tooth and nail to stop any parental controls, or to stop us taking our physical-world laws that exist and applying them to the digital world.

Um, you know, they're trying to scare citizens by saying that the government wants to track anything and everything that people do. No one has a problem with the fact that a teenager cannot walk into a gas station and buy an adult magazine without presenting an ID, or buy alcohol or tobacco without an ID.

But as soon as we try to implement age verification on the social media platforms, they fight back, [00:16:00] and they fight back because it is a trillion-dollar industry, that 13-to-17-year-old market that they're able to advertise to. You know, the first time I testified, or went to Congress, was when Zuckerberg and the Snapchat and TikTok executives all came in.

And they were doing their testimony. Many people know it as the time that Zuckerberg so-called apologized to us parents that have lost children. But there were groups there with studies, and I don't remember the exact number, but I know it was over $250 per child. That is from internal emails from Meta. They basically put a dollar amount on each child's head.

That's what they get in marketing revenue, so they don't wanna lose that. Even as we are seeing national legislation being pushed to protect kids online, often you will see they're just changing that age from 13 to 16. [00:17:00] The only reason to do that is to compromise with Big Tech, because the majority of our laws are 18 and under.

So you've got adult and minor, and it's pretty simple, but they still want to keep those two extra years because that is revenue in their mind. Wow. So you mentioned a couple things. You mentioned the bell-to-bell laws. What are those, what are you referring to? The bell-to-bell laws are cell phone bans for schools.

So as researchers have come out, such as Jonathan Haidt, who wrote the book The Anxious Generation, he started talking about brain development and screen time and short-form videos and social media, and the problems they were creating for our teenagers. And he dubbed them the anxious generation because of how much time is being spent. And these social media companies, they've got it down to [00:18:00] a direct science, just like Big Tobacco did, to release those dopamine hits and to keep you on the screen and keep you addicted to it.

So many of the schools started saying, well, we're just gonna ban cell phones from the morning bell until the end-of-day bell. We've got to get back to removing these devices from our classes, 'cause kids are too interested in watching a short-form video rather than paying attention to the teacher.

Our kids are on social media throughout the entire day, and we had a lot of pushback at first. But what we are finding is that teenagers in the schools with the bell-to-bell policy are improving their own mental health, and they are the ones who are actually preferring this. The only pushback we really get now is from parents that say, well, I wanna be able to get in touch with my kid in the middle of the day.

Um, but those bell-to-bell policies have helped the mental health and the anxiety within those schools. [00:19:00] So we're seeing a trend across the country. Sometimes it's individual school districts. In South Carolina, what we did was we made it an option for the Department of Education to implement the policy, and they did so.

So that applied to all school districts, to implement a bell-to-bell policy. And the other law that you mentioned was Gavin's Law, and you said that's in South Carolina. What is Gavin's Law? Yes, ma'am. Gavin's Law was the law that I wrote to really criminalize digital sextortion in South Carolina.

And I based it off of Utah's bill on sextortion and started working with it. And as we started going through it, there was one thing that we added that really made a difference. If you target a child or an at-risk adult in South Carolina and commit sextortion, then you're looking at up to 15 years in prison.

And [00:20:00] that's just for a first offense. It adds on additional years for each one. But if it's adult to adult, then it can be up to five years. So let's say you have a 20-year-old dating a 21-year-old, and he says, if you break up with me, I'm gonna post this video of us having sex on a website. With just that threat of him asking for something in return, that would be up to a five-year criminal penalty, which would allow it to be prosecuted as a felony.

But the biggest asset of Gavin's Law is that we mandated education on what sextortion is throughout all schools in South Carolina. Now, Gavin's Law was just recently adopted at ALEC last month as national model policy for all 50 states. Wow. And I know Delaware has passed Gavin's Law as well.

They passed it in 2025, and they combined it with what is also known [00:21:00] as Braden's Law, which was written after Braden Marcus, another teenager that passed away in Ohio. And Braden's Law targets these tech companies, because often we were seeing a huge rise in suicide with teenagers, but we didn't know why.

And if someone has taken their life due to shame, it's not like they're gonna leave a suicide note to list the problem of why they took their life or to inform anyone. So the only way that we are able to find out is when we get the cell phones, and you have to file for subpoenas to get information from the social media companies.

But if the social media companies don't act quickly enough, then that data can be lost and the criminal escapes. So Braden's Law gives them a very short time period for that discovery to be turned over. So I do plan on implementing that as an add-on to Gavin's Law for these sextortion cases as well this year.

Are there [00:22:00] additional things that we can be doing to go upstream to prevent these online crimes against children? There's so much that we could be doing. What I say about these online bills is, in 1996, whenever COPA, the Children's Online Protection bill, was passed, along with Section 230, they were looking at the internet as a whole. And I understand the reasoning. What they were saying was that if you come onto my chat board, as an example, and you say something outlandish, me as the company, I shouldn't be able to be sued. You have to go after the person directly doing this, not the company itself.

However, that was written well before social media. And unfortunately, we as parents didn't understand the detrimental aspects of social media until recently. We're seeing [00:23:00] that, you know, the generation that grew up on social media was kind of a guinea-pig generation, and we're seeing the harms that are in place, and we're seeing the criminals targeting them.

So in order to fix that without repealing or sunsetting Section 230, and there are bills out there to do that, but keep in mind, Big Tech companies are the richest companies since the inception of man, and they will spend tons of money fighting to keep something like that from passing. So what we have to do is pass bills in bricks, as I like to call 'em, through each legislature, in order to build that wall to protect our children.

Um, as an example, you know, on the federal level, the Take It Down Act was passed this year, which was Senator Cruz and Senator Amy Klobuchar's bill. And the First Lady, Melania Trump, made it her mission to try and usher it through. With Take It Down, what it does is it states that if any morphed image or a [00:24:00] real image is put up on a social media site and it is reported, then those sites only have 48 hours to remove it.

However, whenever it comes to enforcement, we lack enforcement, so we need enforcement laws as well, because our litigation options have been removed due to Section 230. So we have to rely on the government to be able to enforce and implement many of these laws. Does that make sense, or did I ramble on a little too much?

No, no. That was actually something that I wanted to ask you about: prosecution and enforcement. And I know that you had mentioned earlier, with the example of Braden's Law, that historically, social media platforms have not complied with requests for information that would allow a parent to identify what criminal activities may have led to their [00:25:00] child's death or exploitation.

You know, the child doesn't have to die for a parent, obviously, to get involved in trying to advocate for getting information that they should be entitled to. But what are some of the solutions that we need to pursue in order for protections for children to be enforced, or crimes against children to be prosecuted, in this digital world?

Yeah. So one of the things is, you know, tech moves very fast; law and justice move very slow. So even if you were to find someone extorting someone right now and start a prosecution, it might be a year or two before you are able to get that prosecution. Meanwhile, thousands upon thousands of these cases are continuing to happen within your state.

So we've seen local law enforcement get overwhelmed, and they are overwhelmed because whenever they [00:26:00] trace it back, it goes across jurisdictional lines or it goes overseas, and, you know, we can't control other countries. And just as we were talking about with Braden's Law, even if this is an app, such as, say, a nudify app, that's based out of a country that's not gonna comply with us, we don't have a net over the US to say, your app can't be available because you don't comply with us.

Those are the problems that we have. So whenever we're looking at it, you know, I often say justice has to look backwards, prevention has to live in the present, and our purpose of protection is what the future's about. So we have to approach it in all forms. We have to approach it on the app level, we have to approach it on the app store level.

Um, we have to approach it on the prevention and education level, and we have to approach it on the prosecution level. We need to ensure that we are able to prosecute and that those laws [00:27:00] exist. And I'll use the example of a morphed-image law that we passed this year, and I never thought about it before.

Common sense would say that if I took a video of you and created a pornographic video through AI, and I posted it out there, well, if I didn't have your permission, I could be charged with a crime. Or if you were a child, I should be able to be charged with child pornography. But whenever you start looking at it from the legal standpoint, you have to ask, okay, well, is that window decal of Calvin from Calvin and Hobbes, you know, peeing on a Chevy logo, is that child pornography? You have to rate, like, what percentage of the image has to be real. Because with AI out there, people can make a video of whatever they want now, and we have to state that if you're making a video of two eight-year-olds engaged in sexual [00:28:00] activities, and you're watching that, well, by creating it, you are creating child pornography, but legally there's not a victim there, so to say.

So we have to really look at it. And what I've learned being in lawmaking is that a lot of times it has to be based on intent. So as we're writing those laws, it becomes a long discussion.

Going back and forth between, I often say, prosecutors and defense attorneys, because they're the ones who know both sides. And we have to make sure that whenever we are writing these laws, we're looking at things from all sides. And our ultimate goal is keeping kids safe. And your ultimate goal is what? Our ultimate goal is that of protecting kids, or protecting your image.

Your name, image, and likeness should be protected. Okay. You know, there's a law in Tennessee that I really love called the ELVIS Act, and [00:29:00] I don't think it was written based off of images. I think it was written more off of music and Elvis Presley's rights.

Um, but it essentially says that citizens own their own name, image, and likeness, and I think that's the way we're gonna have to go. So if someone wants to make a video or create a podcast using your voice, well, that's your voice. You should own that, you should be compensated, and they can't use it without your permission.

Given the complexities of all of these issues, why would parents even give their kids a phone in the 21st century, if there's really no way to keep them safe there, if there's no way to protect them from these harms? Are there really enough benefits to a smartphone and giving kids access to the digital world to justify allowing our kids to be potentially exposed to these crimes?

Well, I believe [00:30:00] that's another industry that we're seeing, you know, telecommunications and the tech side of that. Their goal is to get devices into the hands of those younger and younger and younger. And regardless of what we say, we can't control everyone. You still have the parents that are passing off an iPad to a 1-year-old and just letting them sit in front of the screen.

They don't understand the problems, like whenever their kid gets on YouTube Kids or YouTube and is watching videos, and all of a sudden a choking-challenge video pops up, and they try to imitate that. You know, I often say God has a way of teaching you tolerance, and a lot of times that comes through tragedy. And I think until we hit that awareness point throughout the general public of the true dangers of this, it's not gonna change. And even with that, we can pull away and say, [00:31:00] well, kids can't have devices, but it's just gonna shift. We would've never said kids can't have computers until this age.

And now that we have phones, you know, phones will become wearables; whether it's through video games that you're accessing this other stuff, all of that will change. So we can't pull away and just say we're gonna ban this. And we can't educate the parents enough to overcome the advantages of having a phone, like being able to track your kid.

Uh, I often say the only reason Gavin had Instagram was because his seventh-grade teacher told the class to download Instagram so that she could communicate with everyone at one time. Now, at that time, Instagram was nothing more than exchanging photos, and, you know, it had the chat feature, but that was just her way.

And we see it through soccer apps, coaching apps, things of that nature. But those things can progress and change and [00:32:00] morph into something that we're not expecting. So the solution on the device level is finding devices that will protect your kid. There are apps out there, such as, you know, the Bark app, and they have devices as well.

Um, there are pure device options, such as the Aqua One by Cyber Dive, that stop any nudity from being transferred. The child can't take a picture; it locks the phone down. Or if they receive a picture, it locks it down. But it also has mirror technology, where a parent can do like a DVR and rewind and see what their kids are looking at.

But let's be honest, we're all extremely busy. So what it has to boil down to is we have to teach our children to make more responsible decisions with that. I'll use the example of a seatbelt law. Whenever the seatbelt law passed in South Carolina, I was 16 years old, so I was never in the habit of just putting on my [00:33:00] seatbelt.

The teenagers that were in the middle of it, they're not just putting on their seatbelts because of it. But yet I have a sister that's 16 years younger than me. She was raised her entire life strapping on her seatbelt, so she doesn't get in the car without putting it on. Well, why is that? It's because we taught her, by making the law, that at a young age she has to do that.

Well, if we're handing our child a device that does not have the protections that we need, and that is simple enough that you don't have to be a techie to operate, then we're not going to be able to teach them how to make smart decisions. One of the reasons I really love the Aqua One is because it gives the parents complete control.

Whereas, you know, I've got three boys, and they were all on different maturity levels as they were growing up, on what they could handle and what they couldn't at different ages. And I want to give them a little bit more freedom with more responsibility, [00:34:00] or as they show they are more and more responsible.

So just by doing that, we're teaching them how to make decisions, how to make the right decisions, and to be aware. But the parents can monitor, and if the parents don't catch it, then you have something that catches it and says, hey, wait a minute, you need to look at this. Does that make sense? Yeah.

I mean, what I'm hearing from you is that this really is a partnership between, you know, policymakers and parents and tech companies and the kids themselves. It doesn't just fall to, you know, one group or one person, but everybody working together to keep kids safe. I think the groups that we are missing are social media.

The big tech companies, I think, are looking more at the bottom line than they are at protections. And until we have this [00:35:00] social uprising of people saying, look, I myself am not gonna use this product, or I myself am not gonna buy this stock, until you prioritize protecting children, then it's just gonna continue.

Because at the end of the day, these public companies care about one thing, and that's the bottom line. And their bottom line is money.

Correct, their return to their investors.

And it's a race. We're in the middle of a race, the way that I look at the introduction of AI, and with quantum computing coming. This is a new revolution.

It's just like when the internet came out. This is something that we're having to learn. The problem is, legislatively, we passed laws in the late nineties to try to protect kids online, and yet we have failed them since, because legislation moves so slowly. And now [00:36:00] we've got the next revolution coming, and we still haven't even protected kids from the first revolution.

So as lawmakers, it sounds great to campaign and say, I wanna be proactive, proactive, proactive. But we also need to learn that whenever we see a problem, we have to be reactive, and we have to be quick and nimble. And if we pass a law that we see is causing more problems, we have to be quick and nimble to remove that as well.

Ultimately, we don't wanna stifle innovation. We simply want our kids protected. We want these online services to be treated as products. If you buy a toy from Walmart and kids are choking on it, they recall that toy, they fix the choking problem, and then put it back out there. But yet we have these AI chatbots that are coaching kids to commit suicide, or giving counseling advice, or [00:37:00] stripping naked and having sexual conversations with teenagers and younger.

Those are problems. If you're going to allow those who are underage to access them, then you must have those protections in place.

So what I hear you saying is that this is a product design issue.

Yes. Product design is a major issue, as much as anything, and I don't think it's intentional when they're doing it.

But if you created a car that went really fast but the brakes weren't good, you're gonna pull it back and you're going to fix the brakes. Here, there's nothing by which these companies can be held liable, so they're not making the changes. And the only other way to do that is through legislation or public outcry.

So we've got to continue pushing on product design, safety by default, or ensuring [00:38:00] that these are products that maybe even a consumer protection group has to monitor, to ensure that our citizens are safe.

At the beginning of our conversation, you talked about the fact that you're committed to fighting to keep kids safe online for the rest of your life.

What about the next couple of years? What do you think should be your top priorities? What will have the most impact?

Well, like I said before, we have to approach it from the app level and the app store level. And one of the big bills that has a lot of momentum is the App Store Accountability Act.

And what that states is that the app stores, which we basically have three of, Amazon, Google, and Apple: if they're a store providing a product, they should be the ones to verify the age of the person that is downloading the product. Which would just make sense. You know, [00:39:00] Philip Morris wasn't required to go and check that everyone smoking a cigarette could prove they were 18.

It was up to the stores to verify the age before they provided the product to them. You know, that one has a lot of momentum. I believe that ultimately, the Web 3.0 model will actually be more helpful to people, because you'll be able to port your data. So there's the Digital Choice Act. I think it was Doug Fiefia out of Utah who passed it, and he was a Google engineer.

So he is allowing you to own your own data and transport it where you want. But with that, you would also have the protection stating you can't collect data on our children. You know, what is the purpose of an app, if you're gonna download a free game for a child? The reason it's free is because they're [00:40:00] collecting your data.

But with all that data they're collecting, the algorithms can be used against a child. And that's removing a lot of our critical thinking as well. So there's a lot of different ways to approach it. And as we're having the introduction of cryptocurrency, we have to really watch out for the scammers and the things that are happening.

Crypto is kind of following a lot of the same banking rules that we have, so you don't have an elderly person going and sending $80,000 in cryptocurrency to someone overseas, just because they're getting scammed and they think it's someone they really have a connection with who's just in another state.

Right. Well, thank you, Representative Guffey, for everything you're doing to try and keep kids safe in the digital world. And to our listeners, thank you for joining us today. [00:41:00] Please tune in again soon for our next episode here on Radio Kempe.

Thank you for listening to Radio Kempe. Stay connected by visiting our website at kempecenter.org and follow us on social media.