The Wild Chaos Podcast

#87 - Predators Don’t Break Into Homes... They DM Your Kids: The Truth Behind A.I., Deepfakes, & Sextortion w/Ben Gillenwater

Wild Chaos Season 1 Episode 87

Think your kid’s phone is just a screen? Think again. It’s an always-open door. A Pandora's box that can’t be shut once it’s opened.

We sit down with Ben, The Family IT Guy, a 30-year cybersecurity veteran and dad, to map the new terrain of digital parenting: algorithm-driven feeds, anonymous chats, sextortion, deepfakes, and the quiet ways addictive design erodes sleep, focus, and safety. No panic, no fluff: just the playbook families need right now.

From there, we tackle platform myths. Roblox’s parental controls still let young kids into explicit spaces, and its dopamine-driven design makes it a magnet for predators and sextortion. The rule of thumb is clear: avoid algorithms and open DMs. If a product can scroll forever or message anyone, treat it as high risk.

To watch Episode #87, like and subscribe: https://youtu.be/1Syc0LbBInE

Sextortion gets the spotlight it deserves. Criminal networks now use AI to forge nude images from everyday photos and clone voices from short clips. The guidance is firm: don’t pay, tell a trusted adult immediately, file a report with the National Center for Missing and Exploited Children, and use Take It Down to remove images. Build a family tech agreement that includes a “free pass” when kids report harm, because predators rely on shame and silence. We also dig into school-issued devices, expanding surveillance tech, and why privacy can’t be assumed.

AI isn’t the villain or the savior; it’s a power tool. Used unsupervised, it can short-circuit thinking. Used well, it helps kids ask better questions and break problems into first principles. Our stance: kids don’t use AI alone; adults learn it first and model critical thinking. By the end, you’ll have clear rules, smart tools, and scripts for talking with your kids without fear or lectures.

If this helped, follow, share with a parent who needs it, and leave a review so more families can find these safety tools. Your next best step: set a family device drop-off time tonight and talk about a free pass.

Send us a text

Support the show

Follow Wild Chaos on Social Media:

Apple iTunes: https://podcasts.apple.com/us/podcast/the-wild-chaos-podcast/id1732761860
Spotify: https://open.spotify.com/show/5KFGZ6uABb1sQlfkE2TIoc?si=8ff748aa4fc64331

Instagram: https://www.instagram.com/wildchaospodcast
Bam's Instagram: https://www.instagram.com/bambam0069
Youtube: https://youtube.com/@wildchaospod
TikTok: https://www.tiktok.com/@thewildchaospodcast
Meta (Facebook): https://www.facebook.com/TheWildChaosPodcast

SPEAKER_00:

Roblox, Fortnite, Minecraft, Call of Duty, Snapchat, Instagram, TikTok: they're built by the world's best addiction experts. Direct messages, open text messaging, any game that has a chat function: big, big red flag. Online chat is where predators hunt for kids. That's terrifying.

SPEAKER_01:

Glad to have you back. For anyone listening and watching, Ben was episode 66, where we talked cyber everything, really: children, safety, apps, all that stuff. We have grown tremendously since that episode, and a lot of people have re-watched it or caught up since and asked for you to come back on. It's very important. You are doing God's work, I would like to say, because you have 30 years in the cybersecurity business.

SPEAKER_03:

Yep.

SPEAKER_01:

You have lived in the bowels and the trenches of some of the darkest, nastiest shit that's online. You've really dedicated your life now to helping educate families, which is what spun off Ben, the Family IT Guy, which is where they can find you on all the socials, which we'll make sure to plug. I would highly recommend everybody go and check out all of your social media, because you put out nonstop knowledge on how to keep your children safe: what to look for, what predators are doing, where they're lurking, AI, safety, protection, cyberbullying, sextortion, everything. Yep. You've got your hands on a little bit of it all. You're also building some really cool AI programs that are going to help with kids and predators and some really messed-up, dark areas. I told you on our very first episode, and we've actually become pretty good friends since, that you could always use my platform to get the word out, because obviously we have grown a tremendous amount since the last time you were on. I want to catch our whole new audience up on what the hell's out there. So we put a poll out. This is going to be a little bit of a different episode. We're still going to have our normal, awesome conversation, but I put it out online to parents that had questions for you specifically. You're the expert in this world, and now these parents get to ask a question, I get to relay it, and we're going to get these things answered. And I'm sure if they're asking it, there are thousands of parents listening that are thinking it, or need to be thinking it. So at least we can plant a seed to help. And if this helps even one child not get extorted, bullied, or taken advantage of in all the millions of different ways that are happening online now, I'm cool with it. That'd be wonderful. If that were the whole mission of this platform moving forward, I would 100% be cool with that. So, Ben, why don't we do a little bit of a recap of who you are, and then we're just going to dive into it. It's going to be more of a Q&A, since you parents showed up and we have a lot of questions, going from platforms, apps, and AI questions, the good, the bad, the ugly of that, which we'll save toward the end, to all-around how to keep our children safe online, as a girl dad myself and you a father, in this ever-evolving, terrifying world of tech. So, all yours, buddy. Right, awesome.

SPEAKER_00:

Thanks for having me back. Absolutely. Okay, so yeah, just a quick recap for all the folks that haven't seen the last episode. As you said, I'm Ben, I'm the Family IT Guy. I've been in the tech business since 1995. I'm a dad. And when those two worlds collided, being a tech guy and being a dad, I found out that not only is it very difficult to manage being a dad and raising a kid right now, but if it's hard for me, and I'm a tech expert, it must be very hard for folks that are not. Yes. Because for all of human history until 2007, parenting contained all the same stuff: trying to raise a kid to survive, to be healthy, and to be able to grow into a responsible adult. And then in 2007, the iPhone got announced, and then eventually the App Store, and eventually Facebook came to the phone in 2012, and then Instagram and Snapchat and TikTok, and now there's this new category of parenting.

SPEAKER_02:

Yeah.

SPEAKER_00:

Our genetics don't know what to do with that, right? We are not programmed to navigate a digital world. No. Nor are we programmed to navigate one that's an avalanche of things crashing on our kids and evolving every day. Yeah, and constantly changing. So how is anybody supposed to keep up? My mission is to help with that. My mission is to inform parents and to share knowledge, and in the process inform kids as well. Because the only true way to address this chaos right now is with education, which is what we're doing right here, and with community. Sharing, meeting others that are like-minded, that share our values. Because one of the most beautiful parts of being human, I think, is that we're all different.

SPEAKER_02:

Absolutely.

SPEAKER_00:

And depending on where you grew up and who you grew up with, your values will be relatively unique compared to the person that lives next door, or in the country next door, or on the other side of the world. Yep. So how do we align and find others that share our values, so that we can be on the mission together and have support? Because I think we're a tribal species, and we don't have physical tribes anymore. So we try to seek those online. How do we leverage these digital tools to help us find community? Education and community are the answer, so thanks for having me on, because we can further both of those through shows like this. I'm super happy to be here, and I love the fact that parents are sending in questions. I can't wait to help answer them.

SPEAKER_01:

It's terrifying in a way. Parents reaching out is the greatest thing, because obviously they're listening. These are parents that are curious or worried and have these questions. But I feel there is a huge group of parents that just don't truly know the dangers of what's lurking behind that screen. And I think one of the greatest phrases I've ever heard anybody say is: when you give your kid a phone, you're not giving them access to the internet, you're giving the internet access to your children. And that should terrify everybody. I just recently saw a commercial, I think I have it saved and I should post it, where this dad tucks his kid into bed at night, and in the corner there are these mean girls, and they're like, you're ugly, you're a piece of shit, and they're just roasting this kid. He's sitting there with his blanket pulled up, and then you have this criminal in the corner who's like, hey buddy, wanna come hang out? But it's after the dad closes the door, and you have all of these different bullies and predators, and a guy like, hey, give me your social security number, hey, what's your address? And the, hey buddy, I just want to be friends with you, so you have the groomer. You have all of these people, but the video shows them inside your child's room. And it all comes back to the phone. That's the reality of who you're letting into your child's room at night when you send them to bed with an iPad or a phone, and it's terrifying. Now, I know you had, I don't want to say a close call, but this was kind of your aha moment with your son. You started to notice behavioral issues, and we talked about it in the first episode, it's been a while now. You started to pick up on these little, I won't say tics, but his behavior started changing, and you started noticing things. It's really hard, if you're not a super engaged parent, to just know every single thing. I'm very fortunate with my wife: she knows every single thing that's wrong with our kids the second they walk in a door. And I think for a lot of parents, with the world we're in, the hustle and bustle, it's busy, and you kind of don't pick up on certain mood swings and changes and things like that. You did, and then you dug into it and you realized what was going on on his iPad.

SPEAKER_00:

Yeah, that's right. And just to recap that story, because I think it's worth sharing: I messed up. Looking back, I made some significant mistakes that, knowing what I know now, I wouldn't have made. But I did. And the reason I like talking about this is, A, because it's why I do what I do, but B, it's good to acknowledge that we all make mistakes, and I think it's really good to even tell our kids that we make mistakes and talk to them about it. So, yeah, when my kid was five, I gave him an iPad, and I put YouTube on his iPad, which, knowing what I know now... So I've been doing the Family IT Guy thing since January of '24, and it's currently November of '25, so coming up on two years. And we're talking like five years ago. Within days, I noticed significant behavior changes, and patterns of him seeking the iPad at every available moment, you know, waking up earlier to go to the iPad, coming home and going to the iPad. Like, one of his best friends lives directly across the street. If we open our front door, we can see his front door. It's really cool.

SPEAKER_03:

Yeah.

SPEAKER_00:

And so the pattern before was: we come home from school and we go over to my friend's house. Or they would just meet outside. Now he's going to the iPad. And then, oh, we're having dinner, but I don't want to have dinner, I want to go on the iPad. I don't want to go to sleep, I want to go on the iPad. And then come to find out that YouTube does what it does, which is send you down rabbit holes. Yeah. And if you're a kid, it still does that, even if you tell it that you're a kid. So I took it off and I put on YouTube Kids. And during that process I learned that YouTube Kids is no better, even though it's called Kids. Like I was telling you last time, the CEO at the time of YouTube Kids was a grandma. So you have a product, not made by but run by a grandma, called Kids, not safe for kids. And just as a caveat, I've since learned there's a mode that I didn't know about at the time. Again, I'm an IT expert, and I didn't know that there was a mode in YouTube Kids called approved content only mode. If I had known about that, I could have set it up so that he could only see videos that I approved. And it turns off the algorithm, which is the most significant change you can make. Algorithms and kids should not go together. Long story short, I took YouTube Kids off. The iPad still had some kid-friendly games on there that I had put on, you know, cool little racing games and stuff. But the device was still so fantastic that it was just too attractive, and the behavior patterns subsided a little bit, but not totally.

SPEAKER_02:

Yeah.

SPEAKER_00:

And it looked to me like something you might see in somebody that has an addiction problem.

SPEAKER_01:

100%. It's no different than a narcotic.

SPEAKER_00:

Yeah. And I've learned since that it has the same effect in the brain. If you do brain scans, what you see in a significant, heavy drug user and in a heavy device user is the same. So we took the whole device away. He was left with nightmares for years because of the stuff he saw on YouTube Kids, you know, really scary stuff. I mean, for years. And it took a while for that drug to flush out of his body and for him to stop really, really seeking that device, even after we took it away.

SPEAKER_02:

Yeah.

SPEAKER_00:

Because I put it up on top of my dresser in my bedroom, and he'd come in the room and it would just be this beacon. He'd be like, oh, there it is, you know. So I literally had to hide it.

SPEAKER_01:

Oh, for sure. We've done that before, when we notice our little one starts to get kind of dependent on it. Because there are times when we're super busy, we're traveling, and it's an easy fix. We're not going to sit here and be like, we're the perfect parents. Who is? Nobody is. But when we notice it, we're like, okay, whoop, gone. And then we go months and months and months without even mentioning it, and she's like, I don't know, what'd you do with it? Obviously, now we've learned: no screen time after bedtime, nothing in the room, all that good stuff, and we wind our minds down. Since your episode, we've learned a lot; we knew a lot before and learned even more. We've implemented some pretty serious routines, even with our oldest, who's about to be 18: there's no phone in your room after nine.

SPEAKER_00:

Nice. It's a good move. That's the best move, actually. For everybody out there, here's a really short answer right at the top of the show: if you could do just one thing, it's to take the devices out of the bedrooms. Why? The most significant dangers to both mental health and physical health occur late at night. That's when kids are the most, well, everybody's the most tired. And when kids are tired... I read recently that if a child wakes up at 7 a.m., then by midnight their ability to process thought and to exercise impulse control is like that of somebody with a 0.05% blood alcohol level. Really? So your kid is effectively drunk in terms of their ability to act in a way that's smart. And all the things that are trying to get them, all the apps and the social media platforms, the algorithms, the predators, they all know that. So most harms occur between midnight and 2 a.m. And simply removing the device from the bedroom helps. A: better sleep. There have been a bunch of studies. The University of Texas did a study on test scores with 800 college students. One group had the device next to them at the desk, one had it in their backpack, and one had the device in another room. It wasn't until the device was in the other room that their test scores changed. Just having it nearby was this subconscious distraction. So having the thing in the room, just being there, even if they're not using it, is a risk. Oh, 100%. And for adults, it's the same thing, right? So we have to lead by example. We take our devices out of the rooms, we take our kids' devices out of the rooms. You can get these USB charger bricks that have a whole bunch of USB ports; throw one of those in the kitchen, and that's your family device charging station. And you pick a time at night that's 30 to 60 minutes before bedtime, so that you can detox.

SPEAKER_01:

Is that what you would recommend? 30 to 60 minutes? Yes. Okay. Minimum? Yeah, minimum.

SPEAKER_00:

Yeah. And then you just get a traditional alarm clock. Yeah. Like Seiko: they have alarm clocks that are made in Japan, between 20 and 50 bucks, in all different styles and colors, and they sell them on Amazon. Just go get a Seiko alarm clock, throw it in the room. It'll last forever, and it doesn't have an internet connection. Because I think the bedroom should be a place of solace.

SPEAKER_01:

For sure.

SPEAKER_00:

And it should be a place where you can disconnect.

SPEAKER_01:

It's a place of rest.

SPEAKER_00:

Yeah. And you said it well at the opening here: all the different risks, the bullies and sextortion and algorithms and all this stuff that is bad for you. Bringing that into the bedroom means that there's no escape. And there's even the bully thing. This is very, very common. With the bullies at school now, everybody's texting each other, there are group threads. The bullies follow these kids into their bedrooms. Yeah. There's no escape. There's no escape, and there are some really awful stories about kids taking their own lives.

SPEAKER_01:

Absolutely. It's a daily thing.

SPEAKER_00:

Yeah.

SPEAKER_01:

And it's becoming more and more.

SPEAKER_00:

And so you can wipe out a bunch of the very, very common harms that occur with just this one change.

SPEAKER_01:

Such a simple change. And this is the reason we're doing this show: so parents can listen to this and hear that we're not saying take it away. You're not pulling it away from your kids, even though you don't want to admit that they're an addict. I don't want to say most children, but a huge portion of children are 100% addicted to these devices, and so are we as adults. So I'm not saying that.

SPEAKER_00:

Yeah, yeah.

SPEAKER_01:

So it's not like we're asking or telling you to go cold turkey and pull away this thing that your kid depends on as their lifeline. It's, hey, we're just going to do something. I'm going to 100% get that charging station, because Britt and I honestly just had this conversation this morning. We were up late and I was scheduling some shit. I use scheduling as my excuse to get on, and then before you know it, it evolves into endless scrolling. And we're like, fuck, it's two in the morning. Yeah. So we literally woke up this morning like, okay, we've got to make a major change. And that was just this morning. So even with that charging dock: okay, cool, everybody, boom, eight o'clock, whatever it may be, that's where it goes. It's just these simple little things, and this is why we have all these parents that have asked these questions. It's not like we're sitting here going to do a whole entire, what is it, when you confront somebody, like a drug intervention with your child. These are just simple little things to help keep your kids safe. Because their room is a place of rest. That is where the body is healing. Especially with young children, there are so many changes, they're developing, and there's so much going on in our young children's lives and bodies, chemicals and hormones and the growth of their brain. Why are you not letting that body completely heal and recycle and rest? Yeah. To make them the best version of themselves the next day.

SPEAKER_00:

That's right. That's right. And it's worth noting, by the way, that the issues we're discussing are not the kids' fault. It's a hundred percent on the parents, I feel like. Well, yes, but I will also go as far as to say, for all of us adults that are addicted: there are roughly eight billion people in the world.

SPEAKER_02:

Okay.

SPEAKER_00:

Roughly six billion are on the internet, roughly five billion on social media. So there are roughly five billion of us that are addicted to algorithms. That is not because we are bad people. It's not because we're bad parents, it's not because we've done anything wrong. It's because the world's foremost addiction experts are partnered with the world's foremost software engineers, and they work together to maximize the addictiveness of the tools that we use. Well, actually, "tools" is being a little too fair. The systems that we use every day are designed to be addictive. They are taking advantage of our biology. So it is normal to be addicted to these things. And if you're a kid, you have no chance of not getting addicted to these things. No chance. And it would be one thing if these companies didn't know, right? If they were like, oh wow, dang, we messed up, we didn't realize we were screwing with kids' heads. But a bunch of whistleblowers that worked for Facebook just went in front of the US Senate, I think it was in September of 2025, and said that not only does Meta, the parent company of Facebook, know, but they've known for a long time, and they've purposely suppressed the evidence and have not changed the mechanisms that hurt kids. They hurt kids on purpose.

SPEAKER_01:

And they're hiring experts to manipulate at the highest level possible on these platforms. It's not just, oh, we built it and it evolved into this. They're seeking out these experts who know how to manipulate the mind, how to control, how to make things addicting. Meta (Facebook, Instagram, WhatsApp, Oculus) has a team of experts that are literally programming these platforms to specifically addict us and our children.

SPEAKER_00:

That's right. One of the whistleblowers said that the algorithm will watch a young girl take a selfie and post it, and if she deletes the selfie or cancels the post, they show her ads for diet pills, because they've determined that she's feeling bad about herself, and so they make her feel worse.

SPEAKER_01:

Like, how are they allowed to get away with this with our children? And that's all legal.

SPEAKER_00:

You know, I think of the saying: if somebody tells you who they are, you should listen. I think the reason why it's legal, and the reason why it's happening, is because for corporations and for politicians, if they can, they will. That's my experience. And they can, so they will, and they do. But there's a positive side to that, because they're telling us who they are. They're telling us that they are not on our side, for those of us that are parents, for those of us that seek mental health and physical health for ourselves and for our kids.

SPEAKER_01:

The ones that are listening.

SPEAKER_00:

That's right. And so through shows like this and through my channels, we're trying to help everybody understand the truth of what's actually happening. Because it's normal not to know about these things. These companies are showing us who they are. The politicians have shown us who they are forever. I mean, this is not new. Anytime a politician says they're saving kids, it's usually something else, right? It's a smokescreen for something else. 100%. So we're the only ones that can protect our families, and the ones responsible for our families. And I like to think about protection like this: with the concept of security, you should never outsource the security of yourself or your loved ones to strangers. Could these corporations change their behaviors to protect our kids? They could. Are they responsible to? No. Could the politicians enact regulations that might actually help protect kids? Yes. Are they incentivized to, or responsible to? No. Can we change our behaviors to protect our kids? Yes. Are we incentivized to? Are we responsible to? Yes, yes, yes. So it's very important to recognize that we are the only ones that can protect ourselves and our families. If you outsource your security to strangers, you will fail. If you in-source your security, and you focus on being responsible for yourself and your family, and you forget about all these strangers that are actually in the business of harm, then you can succeed. And there is positive change to be had. Once you recognize what's happening, which is my mission, to help people understand what's happening, then you can take steps to make positive change for yourself. Then you can talk to your kids about it, show them what you're doing for yourself, and show them why you're having them change their behaviors: we're taking the devices out of the bedrooms because we are responsible for ourselves. And I love the concept of self-responsibility. A healthy culture, in my opinion, is one based on self-responsibility.

SPEAKER_01:

Makes sense, you know.

SPEAKER_00:

So we're not outsourcing anything to anybody else. We look after ourselves, we look after our family, our extended family, our friends. That's our actual tribe. Yep. Everybody outside of that, I wish them the best, and I love people by default. Yeah, benefit of the doubt by default. But I've got to look after me and my family first. Don't cross my walls. That's right. Unless you're invited, right? Exactly. So on one hand, you could look at it and say, oh, there are things to be afraid of. I think actually there are things to be aware of. Yep. And there's a bunch of benefit in the positive changes that can be had once you're aware.

SPEAKER_01:

Absolutely. And I'm glad you say that. I'm glad we're having this conversation, because me, being who I am: if anything happens to my kids, that's on me. But I also have to realize, and I know from conversations like this one, and from my previous world of traveling the world and dealing with the evilest of the evil, that anything that happens to me, I put on me. And I appreciate how you explain it, because it really isn't just the parents' fault. We are the last generation to grow up without technology. We remember the sky without chemtrails, and what it was like to look around the neighborhood and find where all the bikes were, because that's where all your buddies were, that type of shit, right? We're the last generation to grow up knowing a world without tech, and then being introduced to tech, when for our kids, that's all they know. And it was really sad the other day. My youngest, Kennedy, who you know very well now: we were watching something, and she's like, God, it's so sad, I don't even know what's real anymore. And it broke my heart. Whoa. I looked at her for a second, and I'm like, God, we're here now. Yeah, we're here. We are, and it's going to get worse and worse and worse. And that's why we're having this conversation, man. Hopefully we can wake parents up so they know what's going on. And I guess let's just jump right into these questions. These are all parent-driven questions, and there are some grandparents that reached out, there are siblings that reached out and wanted answers. So we have some really good ones. Nice. Let's just rip through them. This is going to be a fire-for-effect type of episode, so we can fire-hose as many questions as possible, and obviously, if there are things we need to go into depth on for people's knowledge, I want to know it all. We're going to do our best to get through this list, because, like I said, we have got some questions, and every question asked could save a kid's life or potentially stop some horrible things.

SPEAKER_00:

Yeah, and you know what? The ones that we don't get through that we want to, I can take on later and make videos about them.

SPEAKER_01:

Perfect. Perfect. We'll post them on here. So we'll just start, man. I need to know the best app for monitoring my kids' phones, so I can know who to block or not block. Is there a specific app for this?

SPEAKER_00:

Yeah, there are actually 36 of them. I've done some research and found 36 different surveillance systems for kids' devices.

SPEAKER_02:

Okay.

SPEAKER_00:

So there's actually two major components to these types of systems. There's filtering and there's surveillance.

SPEAKER_02:

Yep.

SPEAKER_00:

And so, back to family values being unique: some families might not want surveillance, but they might want filtering. Or they might want surveillance but not filtering, or they might want both. The biggest one that does both is called Bark. Okay. I was actually just in New York with them last week. They had an event that they invited me out to, and I met a bunch of their people. Really good people on a good mission, having a very large positive effect. There's GoGuardian, MMGuardian, Qustodio, there's a whole bunch of them. Gabb is another one.

SPEAKER_02:

Yep.

SPEAKER_00:

I think security is best done in layers. You should always have multiple layers of security. In the military, right, sometimes they say one is none, two is one. Yep. So I like to have at least three. And I always start with the systems that come with the devices. On Android devices, Google has a thing called Family Link. So if you get your kid an Android device, put Family Link on first.

SPEAKER_02:

Okay.

SPEAKER_00:

If you get your kid an Apple device, it's called Screen Time. That's the name of their parental control system. And there are parental control systems on Windows computers and on Chromebook computers. In fact, Chromebook uses Family Link as well. So I would start there first. But those don't do surveillance, those just do filtering. If the filtering is not enough and you want to add surveillance, then you can look at one of the third parties that I mentioned. Note that surveillance systems require the degradation of privacy.

SPEAKER_02:

Okay.

SPEAKER_00:

In order to surveil, you have to collect information, and that information has to be stored somewhere, so that the parents can be alerted, say by a text message, that your kid's being bullied, your kid is having suicidal thoughts, your kid is looking up X, Y, and Z inappropriate material on the internet. In order to conduct surveillance, you have to store the data. And if the data is stored, it's at risk of being stolen or of being misused. Always have that in the back of your mind. There's a trade-off.

SPEAKER_02:

Yeah.

SPEAKER_00:

And then the third layer that I like to have, granted, is a bit more for the tech-savvy out there, but it's called a DNS filter. There's a system called NextDNS, N-E-X-T-D-N-S, where you can add a whole other layer of filters. And that gives you three in total. So I like to run all three at the same time if I'm trying to be comprehensive about it.
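
[Editor's note: a minimal sketch of the idea behind a DNS-level filter like the one described above. The domains, categories, and addresses below are hypothetical placeholders for illustration, not NextDNS's actual configuration or API; real DNS filters consult category databases on the resolver side.]

# Python: conceptual model of DNS-level filtering (hypothetical data).
BLOCKED_CATEGORIES = {"adult", "gambling"}

# Stand-in for the category database a filtering resolver consults.
DOMAIN_CATEGORIES = {
    "example-adult-site.test": "adult",
    "example-homework-site.test": "education",
}

def resolve(domain: str) -> str:
    """Return a dead-end address for blocked domains, else a normal answer."""
    if DOMAIN_CATEGORIES.get(domain) in BLOCKED_CATEGORIES:
        return "0.0.0.0"   # sinkhole: the browser gets nowhere to connect
    return "203.0.113.10"  # placeholder for a real upstream DNS answer

print(resolve("example-adult-site.test"))     # -> 0.0.0.0 (blocked)
print(resolve("example-homework-site.test"))  # -> resolves normally

Because the filter sits at the name-lookup layer, it applies to every app on the device at once, which is what makes it a useful third layer on top of something like Family Link or Screen Time.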

SPEAKER_01:

And I know people listening are probably like, what the hell, we're already this deep? You probably have all of this searchable, with links, all connected to the Family IT Guy and through your platform. So moving forward through this episode, if Ben explains anything that you guys aren't 100% registering, we're going to have all of his socials and websites so you can go and click, and then you can search, or at least ask him for the in-depth version. We want to make sure everybody has the ability to find it and go back through. These are just questions that we're answering to spark thought and questions in parents' minds, so they can go and do their own research. And you pretty much have all of this somewhere on your website.

SPEAKER_00:

Yes, or I've talked about it on my socials. In fact, I have a chatbot on my website that I've put up for free that's trained on all of my work. Perfect. So if I talk about a DNS filter, I have an article about it on my website, and I have videos on my social media channels. Perfect. You can also ask the chatbot about it, and it'll answer very similarly to how I would. Well, God bless. Yeah, right. I'm trying to use the tech in a good way where I can, you know.

SPEAKER_01:

Yeah, might as well. Okay, this is from some grandparents. Our seven-year-old grandson is allowed to play Roblox, which is a huge topic lately. Yeah. Is there anything we can do to suggest to his parents that they should stop this? So I guess let's dig into Roblox. Have they changed anything since that one gamer came and outed all the pedophiles that were on it? Has anything changed on Roblox first?

SPEAKER_00:

Yeah. So technically they've made changes. None of them are useful. Roblox is not safe for kids. Okay. Roblox is a platform for adults that is marketed to children. Of the 111.8 million people that play Roblox every day, according to their most recent report, 40 million are under 13. On the Apple App Store, the Roblox game is rated 13 plus. But you can register on the Roblox website as young as five and it'll let you play. They partner with SpongeBob and Barbie, whose primary audiences are young children. And they don't have the capacity to moderate it all. So, okay, let me first tell you what Roblox is. Because I didn't know until I started studying this stuff a couple years ago.

SPEAKER_01:

There's a lot of parents that have their kids on Roblox.

SPEAKER_00:

Yeah. So Roblox, if I understand it correctly, is the most popular game there is. It's up there for sure. Huge. Okay, it's a platform on which people can develop games. It's actually a software programming platform. They have this language called Lua.

SPEAKER_02:

Okay.

SPEAKER_00:

And using the Lua language, you can build a game in Roblox. So there are like 40 million games. Some of them are huge, some of them are tiny. The game developers have to submit the age rating that their game is for, and they're meant to be honest about that rating. When they're not, sometimes the Roblox moderators catch them and sometimes they don't. But their filtering mechanisms have been proven to be ineffective. For example, I played Roblox so I could learn about it, and I told it I was eight. And then I turned on all the parental controls. By the way, when I first created my account, before I hooked up my parental account to it and it was just the kid account, it set my filtering levels to medium instead of to low.

SPEAKER_01:

Even registered as an eight-year-old.

SPEAKER_00:

I told it I was eight. Like, how is that not an immediate default for Roblox? They turned off chat, which is good, because chat is the vector where predators hunt for kids. So that's a step in the right direction. But it is not conducive to their business model to limit the games that kids can play. So then I turned on all the parental controls. I set the filtering for violence and sexual content and things to low. And it let me play this game called Public Bathroom. In Public Bathroom, I walk into this big open room. Like if you go to the YMCA and you go to the pool, they've got those giant indoor pool rooms. Imagine that, and then there are beds around the edge of the pool, and there are characters humping each other and making sexual noises around the pool.

SPEAKER_01:

And you're an eight-year-old.

SPEAKER_00:

I'm an eight-year-old. So I'm just cruising around this little pool with my Roblox character as an eight-year-old. And then I walk into the bathroom component, and there are bathroom stalls and shower stalls, and there are people going in and out and sharing the stalls with each other and making noises and humping each other in the bathtubs and stuff. All that to say: their filters are ineffective. This guy Schlep has been making a bunch of noise. He was actually an ambassador for Roblox. As a 10-year-old, he was going to their events and was all about Roblox. He ended up meeting one of the developers of the games that he liked, somebody he really looked up to, and he wanted to be a game developer. The game developer ended up grooming him and traumatizing him, and Schlep actually ended up in the hospital after he tried to commit suicide because of it. Roblox wouldn't even reply to his mom's emails. They basically told him to go away. And so since then, he's been on a bit of a kick of trying to help people understand this. Because they've sent him a cease and desist, like they're trying to hush him up. They did. He's actually gotten six predators arrested that he caught on Roblox and handed over to the cops. Roblox closed his account and sent him a cease and desist. Chris Hansen, from the To Catch a Predator show on Dateline, hooked up with Schlep. They just did CrimeCon together. I just met with Chris last week, and I'm working with him on some Roblox stuff. We did a podcast together. At the end of the day: does Roblox have games that are appropriate for kids? Yes. Do they have mechanisms to prevent kids from playing games that are not appropriate? No. Have they made changes in response to the Schlep stuff? Yes, but they're all BS. It's all smokescreen. There's only one change that they need to make. They announced a hundred safety features this year. All BS. They only need one. It's called a whitelist mode, the same thing I was telling you about with YouTube Kids. All they need to do is respect parental authority and not let kids play things that their parents have not pre-approved. They could call it an approved games mode. That's all they need to do. Like Minecraft. Yeah. Minecraft has that.

SPEAKER_01:

Yeah.

SPEAKER_00:

If they do that, they will signal to the world that they care about kids. They have not done that, and they have had the opportunity to do it. I can tell you as a technical person that doing it would not be terribly expensive or time-consuming, relatively speaking. They have the talent, they have the skills, they have the capabilities. They're not interested in protecting kids. Kids should not play Roblox. That's my spiel on Roblox. I'm not a fan. No. Neither are we. Not in this house. I was actually just on CNN a couple weeks ago, kind of trash-talking them, you know. I say trash-talking in jest. I actually try to be as fair as possible. I really do intend to be fair in the way that I speak about them. That's why I point out the things they could do. But they're not doing them. They're not.

SPEAKER_02:

No.

SPEAKER_00:

They don't care. If somebody tells you who they are, you should listen.

SPEAKER_01:

Listen.

SPEAKER_02:

Mm-hmm.
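
[Editor's note: a minimal sketch of the difference between the default-allow moderation Ben criticizes and the "approved games" whitelist mode he proposes. The game names are made up for illustration.]

# Python: default-allow (blocklist) vs. default-deny (whitelist) filtering.
FLAGGED_GAMES = {"public bathroom"}             # blocklist: known-bad titles
PARENT_APPROVED = {"racing game", "pet world"}  # whitelist: pre-approved titles

def blocklist_allows(game: str) -> bool:
    # Default-allow: anything not yet flagged reaches the kid,
    # including brand-new games no moderator has reviewed.
    return game not in FLAGGED_GAMES

def whitelist_allows(game: str) -> bool:
    # Default-deny: only what a parent explicitly approved gets through.
    return game in PARENT_APPROVED

new_game = "innocent-looking hangout"  # just published, never reviewed
print(blocklist_allows(new_game))  # True:  unknown content gets through
print(whitelist_allows(new_game))  # False: unknown content stays out

The design difference is the failure mode: a blocklist fails open on anything moderators haven't caught yet, while a whitelist fails closed.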

SPEAKER_01:

So, we use this in our home. This person would like to know if there are any better options than Life360 for tracking anybody. We personally use it; everybody in our house has Life360 on every device, especially with me being a girl dad. We've got teenagers flying all over the place. Is there a better app than Life360 that you have found?

SPEAKER_00:

To be clear, I think Life 360, that's the Norton product.

SPEAKER_01:

I don't know if it's a Norton product. It's a tracking app that can show you everything: speeds, where they're at, when they stopped. And if you're not familiar with it, that's fine with me. So this is more of a track-your-kids app.

SPEAKER_00:

Yeah, I'm not sure. If it is a Norton product: Norton has a mixed history from an ethical perspective.

SPEAKER_02:

Okay.

SPEAKER_00:

If it's a Norton product, I would switch.

SPEAKER_01:

Okay.

SPEAKER_00:

If it's not, then I'll have to study it and see.

SPEAKER_01:

Well, there we go. We'll get to the bottom of that. I'm sure you'll have something for me here. You'll create your own friggin' app or some shit here, knowing you.

SPEAKER_00:

Got some plans, but you know.

SPEAKER_01:

Is there an app that monitors your kids when they're online?

SPEAKER_00:

Yeah. In fact, the ones that we talked about before do that.

SPEAKER_03:

Okay. Yeah. Okay.

SPEAKER_01:

What's the most dangerous platform that your children are on that parents think is safe?

SPEAKER_00:

Oh, I have to pick just one? I mean, you can give us a few. So I always start with Snapchat. Okay. But Snapchat shares a common set of attributes, so let's actually talk about the attributes you can look for. If you look at any app and you identify any of these attributes, then you should proceed with caution. There are two worth focusing on. First, there are algorithms, aka bottomless feeds. Yes. If you can scroll and scroll and it never stops, that is a big, big red flag. That equals mental health risks. That equals a 300% increase in suicides for kids age 10 to 14 between the years 2007 and 2019. That's terrifying. A 300% increase in suicides, to the point that suicide is now the second leading cause of death for children of that age, according to the CDC. Because of a bottomless feed. Yes.

SPEAKER_01:

Meaning your kid never stops scrolling. They never hit an end. That is the meaning of a bottomless feed. They can do this for 12 hours a day.

SPEAKER_00:

Yes.

SPEAKER_01:

Okay.

SPEAKER_00:

Or 14 or 18.

SPEAKER_01:

Whatever it is.

SPEAKER_00:

When I was at that Bark event last week, they showed a video where they interviewed a bunch of kids and asked them, what's the most time you've ever spent on your phone? And some of them said 18 hours per day. Very, very dangerous. Algorithms are not safe for humans, let alone young ones. Your frontal lobe doesn't develop until you're at least 25, on average, and your frontal lobe is what helps you resist these kinds of mechanisms. And even those of us with developed frontal lobes are still stuck on these things, because they're built by the world's best addiction experts. So that's algorithms. The second one is online chat. Online chat is where predators hunt for kids. There are mainly two types of predators right now. There are those that, let's just call them perverts, have sexual tendencies towards children. And then there are criminal networks that are extorting kids. The criminal network thing is now to the point where, apparently, globally there are ten reports per second of kids being extorted online.

SPEAKER_01:

Ten reports per second?

SPEAKER_00:

Yeah. That cannot occur without anonymous online chat.

SPEAKER_02:

Okay.

SPEAKER_00:

Anonymous online chat is facilitated by all the major platforms that are popular. Explain anonymous. Okay. Anywhere there are direct messages, anywhere there's open text messaging, any game that has a chat function, like Roblox, Fortnite, Minecraft, Call of Duty, whether it's voice chat or text chat, whether you're wearing a headset or typing on a keyboard. And any of the major social media systems: Snapchat, Instagram, TikTok, Sora, the new video-generating service from OpenAI, they have a DM function. That saying you mentioned, you're not exposing your kids to the internet... or how was it, the other way around? That's how that happens. You're exposing your kid to a world full of predators.

SPEAKER_01:

You're giving access. Yeah. You're giving the world access to your children.

SPEAKER_00:

That's right. You're giving the world access to your children. And the pervert guys, and women, don't have to risk their safety by going and sitting in a van by the park. Yep. They can switch their IP address, be online anonymously, and hunt for kids that way. So if you expose your kid to an algorithm, it's risky for their mental health. If you expose them to anonymous online chat, it's risky for their physical health, and their mental health as well. Because even kids that don't get caught up in actual physical activities still get seriously traumatized. I've heard some just terrible stories from kids that have been exposed to these issues and these types of predators. It really messes them up. Yeah. So when you ask what's the most dangerous app: there are a lot of them, because a lot of them have algorithms and chat. Two big no-nos. Yeah, you need to stay far away from those two things. If you eliminate those two things, you eliminate the vast majority of issues that exist on the internet immediately, right? Like right away.
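
[Editor's note: a minimal sketch of why a "bottomless feed" has no natural stopping point, in contrast to a finite playlist. The item source is a stand-in for illustration, not any platform's real recommendation system.]

# Python: a finite playlist ends; an algorithmic feed never does.
import itertools

def finite_playlist():
    return ["video 1", "video 2", "video 3"]  # an ending = a natural exit

def bottomless_feed():
    # Every request for "one more item" succeeds, so the viewer's loop
    # never terminates on its own.
    for i in itertools.count(1):
        yield f"recommended item {i}"

for item in finite_playlist():
    print(item)  # stops after three items

for item in itertools.islice(bottomless_feed(), 5):
    print(item)  # stops only because we imposed an external limit

The only way out of the second loop is a limit imposed from outside the feed, which in practice means the family's device rules rather than anything built into the app.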

SPEAKER_01:

Not saying let your guard down, but at least you're not over your kid's shoulder worrying about when the next whatever is going to happen. Right. Okay. So we'll probably ask some questions that have already been touched on; we'll hit them anyway, just because a clip of it might get fed to one person instead of another. You see what I'm saying? So some of these questions might sound redundant, but some people only watch shorts, and I want to be able to touch it all. If there was one app you had to delete off your children's phones right now, what would it be? ChatGPT. Really? I was not expecting that. Why?

SPEAKER_00:

Because the brain is like a muscle. And just like your muscles, in order to grow and develop, it has to be used, stressed in a good way. You have to apply stress to enable growth. With the current generative AI large language model systems, the chatbots that we're all familiar with, you are taking the thinking process and outsourcing it to a computer. And there's a bunch of information coming out showing that that makes it so your brain doesn't develop and you don't gain the ability to think. Makes sense.

SPEAKER_01:

It's making us stupid.

SPEAKER_00:

Yes, it is.

SPEAKER_01:

It's making us stupid.

SPEAKER_00:

It is.

SPEAKER_01:

If I'm a parent and I allow my kid to be on every social media platform, what's the one social media platform you would refuse, knowing everything you know about the dark side of the internet? What's the one social media platform you wouldn't allow your child on?

SPEAKER_00:

Snapchat is always at the top of the list. Why? Give me a quick rundown. Because the initial design of Snapchat was made so that you could share sexual photos. 100%. Regardless of your age, if you're dating somebody, if you're in a relationship with somebody of any sexual nature, and you want to send them a photo, it's super risky to do. If you text them or email them, if you send it in any chat system, they could keep your photo and embarrass you, at the very least. Snapchat was meant to help solve that problem, right? Its purpose was to share sexual photos. And A, it's not even good at that. Everything's saved; there's no reason to believe Snapchat is deleting those photos on their servers. And there are third parties. Snapchat has a third-party plugin system, and those third parties can capture the screen and keep the photos. So if you're sending me a photo and I want to keep it, I could do that, and you wouldn't know. And it turns out that those third parties have been hacked, and all those photos got stolen anyways. So that's just part of what's there for a kid.

SPEAKER_01:

There are a lot of predators inside Snapchat as well. I have a buddy that's a cop; he does a lot of the cyber work with drugs and narcotics on there, and they say that with the majority of the plugs, the drug dealers they're catching and getting warrants for, as soon as they submit a warrant to Snapchat, Snapchat just gives them full access. And how many of those people are into child pornography, grooming kids? He's like, it's almost every single drug dealer they catch. That's just the tip of the iceberg, and then it opens up all of these cans of worms, all these rabbit holes that they go down.

SPEAKER_00:

Well, and Snapchat themselves have internal reports saying that they have over 10,000 cases of sextortion a month just on Snapchat. So yeah, you've got a big drug marketplace on Snapchat, and a huge vector for sextortion. It's not a positive place to be. And they'll allow a child on there, yeah. Which, god, it blows my mind.

SPEAKER_01:

Yeah. How fast? If I'm a parent and I let my kid on the internet for the very first time, how fast can a predator find them?

SPEAKER_00:

Less than a minute. And look, I'm not paid by Bark; I just keep talking about Bark. They do good work. They have a video.

SPEAKER_01:

If they're a good platform, I'll push them up.

SPEAKER_00:

So Bark has a video on their YouTube channel where they set up a decoy operation. They turned on Instagram accounts for different-aged girls, and they fired up an Instagram account for a fake 12-year-old girl. It was something like 42 seconds or 67 seconds? It was real fast. Almost instantly, it started getting sexual messages and really, really rough words for a 12-year-old girl to receive. I mean, it might as well be instant. Because who's going to fire up Instagram and use it for less than a minute, right? You're going to look around, go here and there. It happened right away.

SPEAKER_01:

I saw a video once where this guy logged in, went into a chat, and made a name, something like Butterfly Lover. It was a kids' chat, and I think it was 19 seconds before the first guy was like, hey, how old are you?

SPEAKER_03:

Yeah.

SPEAKER_01:

Okay, so that's what we need to realize as parents. You think your kid's in this open forum talking to other kids.

SPEAKER_00:

Yeah. And Instagram has this teen mode parental control thing, a false sense of security. No.

SPEAKER_01:

Okay, so as a parent, if I sign my kid up, little Timmy's of age, I think everything's cool, he's mature enough to be on social media, but I'm going to check all the little parental, teen-only settings. I want, you know, sensitive content turned down. Does that actually even work?

SPEAKER_00:

No. Well, to an extent, but not well enough. It doesn't get rid of the dangers. I think it probably dials them down a little bit. But I like to look at things like this through the lens of: does it have a net positive or a net negative effect? Okay, and for sure, it's a net negative. Because you could say, oh, but there are positives, or sometimes, well, we have to use this platform or that platform because that's the sports team, that's the friends group, that's this thing. That's all real, and it doesn't make it any easier. It's super difficult sometimes to remove these dangerous systems from kids' lives. But at the end of the day, does it match your family values? And is it a net positive or a net negative?

SPEAKER_02:

Okay.

SPEAKER_00:

And these things are net negatives.

SPEAKER_02:

For sure.

SPEAKER_00:

You know, they're not. I mean, like I said, the Meta whistleblowers told us what they're doing. They're harming kids intentionally. So do with that what you will.

SPEAKER_01:

That's the problem. Most parents aren't doing anything with the information.

SPEAKER_00:

Yeah, and you know, I heard an interesting saying on the way here. I was listening to a podcast, the most recent one with Joe Rogan and Elon Musk. I want to catch that one. Yeah, I haven't watched it all yet. And Elon said something like: it's easy to fool people, but it's really hard to convince people that they've been fooled. Yes, and I think our country is there now. Oh yeah, globally. This is global, I mean, for sure. Our country, yes; globally, yes; everybody, man.

SPEAKER_01:

Yeah, I don't want to get down that road, but that's how I feel right now. Once the awakening happens in you and you're looking at all your friends, it's so hard to convince people. It's like convincing some Bigfoot person. It'd be like you trying to convince me that the earth is round, right? Not buying it, not buying it, you know, especially now.

SPEAKER_00:

Look at everything that's coming out. You know what, man? I don't take anything for granted anymore.

SPEAKER_01:

No, and that's the point we're at now. It's like, okay, yeah, I'm going to look into that, because you have to.

SPEAKER_00:

I'm down, I'm down, I'm down to consider anything.

SPEAKER_01:

Okay, next question, and this is a great question from a parent. Are my kids' biggest threats coming from strangers on the internet, or from their own peers?

SPEAKER_00:

Oh man. You know, interestingly, if your kid has access to algorithms or to online chat systems... I don't know. I'd guess that it'd be strangers. But even if your kid doesn't have a device at all, then the biggest threat is from their peers, because their peers do have devices, and their peers are going to be exposing them to all the things. Yeah. So it's a mixed bag, and it probably depends per person. Actually, I would say strangers really is probably the answer, because even if you try to prune your followers list, who you follow on Instagram and who you follow on TikTok, how many of those people do you actually know? How many of those people could you say what their favorite colors and hobbies are, where they live, what their parents' names are, what their dog's name is? Right? Not that many. Everybody beyond that, in my opinion, is a stranger.

SPEAKER_01:

Yeah. And even, I mean, I've had three friends now that I considered, not inner circle, but friends. We traveled together, we've done events together. I've had three so far that have all been caught for child pornography, molestation, something along those lines. So even if you think you truly know somebody... And that's what's so terrifying when I see these moms putting up, like, Timmy's back-to-school picture, with the name of the school and the grade and his little nickname and all this bullshit on there. I can't help but shit on it, because every back-to-school season I'm the one on there like, hey, you should probably clean your shit up, because now I know your kid goes to that school, I know their little nickname, I know what they're into, because they've got the little chalkboard with your kid's mugshot picture on there. And it's like, well, it's just my private Facebook page. These were dudes that, if they were rolling through town, I'd have no problem with: yeah, bro, crash on the couch. And now two of them are dead, because they made the only choice you can make after the decisions they made, and one of them is still in the process of it, trying to fight it all. You don't know anybody. No, you do not. And I try to explain that when I talk to young parents and dads, especially the dads. I'm like, how well do you know these people?

SPEAKER_00:

Yeah. It really hurts me to acknowledge to myself, and to say out loud, how many really evil people there are.

SPEAKER_01:

It's disgusting and it's getting worse because everything's now easier to find and easier to make and see and download.

SPEAKER_00:

Yeah, because I want to think that everybody's good, and I do. I approach people from that perspective as a default when I meet new people. But to what extent? Will I invite them into my house and stuff? No. Or if I do, then it's with caution and with boundaries. But man, it sucks, because you can't take anything for granted, you can't take anybody for granted.

SPEAKER_02:

No.

SPEAKER_00:

And it's really, you know, working with the NSA and working with the defense and intelligence community as I have, being exposed to what goes on in those areas, which is super unethical at the very least, and then learning through Family IT Guy about how many people hurt kids. There's a really, really significant proportion of humans that are just really not good. Yep. You know, we put on these clothes and these suits of civility, but it doesn't mean anything.

SPEAKER_01:

And it's not the creepy predator-looking dude. No, it's not. It's the pastor next door, it's the cop that lives down the street, it's the teacher that lives behind you.

SPEAKER_03:

Yeah.

SPEAKER_00:

It's everybody. It's the popular football player. It's the 70-year-old, it's the 40-year-old. There's no demographic, there's no descriptor. It is just some percentage of humans. That's all it is. Yeah. Men and women. Yes.

SPEAKER_01:

That's where I think a lot of people let their guard down, with women. Yeah. I don't want to say they're just as bad. I'm sure the numbers are staggering when you compare men and women. Probably. But they're out there. Yeah. What's the number one mistake parents make when they give their kid a phone?

SPEAKER_00:

Not setting up the filters. Not putting in the controls. A really easy analogy is a car. When the kid turns 16, you go, hey, you know what? The Corvette in the garage, here's the keys. See you later. I'm gonna go back to doing what I was doing, peace out. Have fun. Don't crash. Oh, you don't know how to drive? Don't worry about it.

SPEAKER_01:

You'll figure it out.

SPEAKER_00:

You'll figure it out. Hey, you know what? The gun in the cabinet, now that you're, you know, whatever age, here's the code, rock and roll. Don't kill anybody. Oh, the prescription drugs that are in my safe too, go for it.

SPEAKER_02:

Be smart.

SPEAKER_00:

Yeah. Those are the most simple examples. Cars, guns, drugs, alcohol. These are things we should proceed with caution and training and awareness around. Computers are super complicated. Those devices do a thousand different things. Each of those things must be considered. None of them can be taken for granted. The really difficult part is that it's super hard to know how to set up those filters. Like, I wrote a guide on my website on how to set up the iPhone filters.

SPEAKER_02:

Yeah.

SPEAKER_00:

It's 82 pages with 330 screenshots. Like, how do you expect a normal parent to be able to figure this out?

SPEAKER_01:

It took me a while to figure out. But you have this link on there that we can share, and it walks parents through all of this.

SPEAKER_00:

Yeah, and to be fair, it's one of the very few things on my website that's not free, because it took so much work to put together. Okay, it's probably worth it. Normally I sell it for 30 bucks. I should put together a code. I'll put together a code for Wild Chaos. Perfect. And we'll make it 10 bucks. Yeah, we'll put a little link in there. Because I want people to have it. Yeah. If you're gonna give your kid an iPhone, you gotta do that. If you're gonna give them an Android device, just buy one from one of the companies that we've already discussed. Yep, get one of those that just comes with the filters. Yeah, do not give your kid an unfiltered internet device. It's an immensely powerful tool, and the most powerful tools have positives and negatives. Absolutely.

SPEAKER_01:

It's just how we use them. This wasn't really a question, but it broke my heart reading it from this mom. She says, can you guys touch on the sextortion topic? My son was taken advantage of online. Sextortion is a growing, terrifying thing, especially in the young teen world. I mean, it happens at all ages. Yeah, can you dive into sextortion and what people are doing with this and how they're doing it real quick?

SPEAKER_03:

Yeah.

SPEAKER_00:

Oh gosh. It's one of the only things that's had me sitting at my desk researching with tears streaming down my face, reading these stories. I got goosebumps just thinking about it. Sextortion terrifies me. Everybody should look up the name Jordan DeMay. Okay. D-E-M-A-Y. A 17-year-old kid who committed suicide because of sextortion, and I think it was within six hours of initial contact, he was dead.

SPEAKER_02:

What?

SPEAKER_00:

This stuff happens fast. I'm talking the same night. His family saw him go into his bedroom, and that was that. Everything's probably fine, maybe a little off, whatever. No, no, everything was totally fine. And then, according to what they found on his phone, he got contacted by supposedly a girl in a nearby town. Turned out not to be a girl, turned out to be a Nigerian scammer. They got him to send a naked photo, and then they extorted him, and he couldn't take it, so he killed himself. That's the reality of sextortion, and it's happening every day. Kids dying from it happens a couple times a month that we know of. That's right. The FBI says that sextortion is the fastest growing crime against young people. Teenage boys specifically are targeted by criminal networks. The same group in Nigeria that used to do the Nigerian prince emails and the financial scams, they're called the Yahoo Boys. Now they do sextortion. They share instructions with each other on how to maximally affect teenage boys. Officer Gomez, a police officer here in Boise, told me that 100% of boys are targeted. Targeted, not victimized, but targeted. If your boy is online, they are targeted for sextortion. Now, it's recently changed. It used to be that the kid would have to do what their biology tells them to do, which is reply to the cute girl. The girl sends a naked photo. Of course the boy sends one back. That's what we're built to do. If it's a post-pubescent teenage boy, they will fall for it. They have no choice, basically. And this happens to the best of kids. Good families, good kids, good grades, play sports. That was Jordan DeMay. His mom has made it her purpose to inform parents, and she said that she didn't know about sextortion.

SPEAKER_02:

Most parents don't.

SPEAKER_00:

Right. So the basic thing is, somebody will contact the kid and convince them to send a photo. Once they send the photo, they get extorted. The person doing the extortion, if it's a financial thing, which the vast majority are, there are plenty that are from perverts, but most of them are financial scams, they want you to send something like an iTunes gift card. Don't do that. I saw a statistic that over 90% of the time, if you pay, they come back asking for more. So do not pay. It doesn't make it go away, it doesn't make it stop. You should tell somebody that you trust, whether it's your parents or your teacher or your pastor. Hopefully, if you're a kid this happens to, you have somebody that you can tell.

SPEAKER_02:

Yes.

SPEAKER_00:

You should contact the National Center for Missing and Exploited Children. They have a tip line where they can help and engage the correct police resources if you want to pursue an investigation. And they have a service called Take It Down, where they have partnerships with tech companies, so that if the images you've shared are on a tech company's servers, those companies will go try to find them and take them down.

SPEAKER_01:

That's good at least.

SPEAKER_00:

Yeah, so that helps. Now, here's where it's changed: the kids don't even have to send the naked photos anymore. If you put a photo of your kids on the internet, it can be used to create a fake naked photo. It has the same effect on the kid. They're still traumatized, and some of them still commit suicide. In the first half of 2025, there were 100,000 reports of sextortion that used AI-generated deepfake images, made from an innocent, normal photo, to extort children. That's something like a 6x increase over 2024. We're on track. The National Center for Missing and Exploited Children estimates that by the end of this year, 2025, we'll have about a million reported cases of sextortion.

SPEAKER_01:

That is insane.

SPEAKER_00:

It's a big problem. Now, there are two primary mechanisms that enable this to occur. One is the camera.

SPEAKER_03:

Okay.

SPEAKER_00:

If you give a kid an internet connected camera, you're putting them at risk. Now the messed up part is you cannot buy an internet connected device in the United States that doesn't have a camera. Except for a watch.

SPEAKER_02:

Which ones?

SPEAKER_00:

Apple Watch. Really? Gabb Watch.

SPEAKER_02:

Okay.

SPEAKER_00:

I think the Bark Watch doesn't have one either, though there are even watches that include cameras now, which is crazy. So either get a device that doesn't have a camera, or use the parental control systems to disable the camera. Got it. And what's also kind of wild is there's nuance. On an iPhone, you can disable the camera through the Screen Time controls and it just goes away. The hardware's there, but you can't use it. On Android, same thing, but it still works sometimes. Now, because of the AI thing, okay, you take away the camera, but that doesn't stop the AI-generated deepfakes. The only thing that stops that is not posting their pictures in the first place. Yeah. Which is super messed up, because we're all spread out all over the place. Grandparents are on the other side of the country. Sometimes mom and dad are in different places, cousins, uncles, friends, this and that. Everybody wants to share innocent photos. But these people, man, they're using these things against kids. That's just the fact of the matter. It is what it is. And it's not going away, and it's getting worse. So you can't take anything for granted.

SPEAKER_02:

No.

SPEAKER_00:

I had to tell a close family member of mine this morning to remove photos of my kid from their Facebook account that they had posted without my permission. Because I'm in the business of pissing these scammers off. Yeah, they're coming for you. They're gonna want to attack my kid.

SPEAKER_02:

Yeah.

SPEAKER_00:

And that is deeply sad. But it's the world that you've stepped into. Yeah, I'm doing this on purpose. I'm accepting this risk on purpose. But everybody else that's not, that's just living, just existing, just sharing photos with their family on Facebook. It's a big problem.

SPEAKER_02:

Yeah.

SPEAKER_00:

So sextortion is a very big problem, and people need to be aware. Kids are dying because of it, and kids are being extremely traumatized from it. Hundreds of thousands, a million this year.

SPEAKER_01:

That are reported. That are reported. Think about how many kids are just too scared to go to their parents, which I feel is a huge, huge factor. Absolutely. This is where it goes back to being an intentional parent and building that relationship with your kids, because these times are gonna come. Yeah, they're gonna fuck up, they're gonna get in a wreck, they're gonna back into something for the tenth time with my truck or run into my garage door. Yep. And they need to be able to come to you. All joking aside, your kids need to be able to come to you, no matter what. Yeah. You might go through your whole life with your child until they're 17 or 18, and then that might be the moment. So you have to build that foundation of trust. Yeah. Mom, I made a mistake. Dad, I really screwed up. Okay. What do we got? Yeah. Instead of them in their room just festering, the stress building and overwhelming them. It's so sad to me to think of a young teenager seeing taking their life as their only option. Yeah.

SPEAKER_00:

I can't even properly process it. It's just too much.

SPEAKER_01:

For sure. And I know there are a lot of really good parents out there whose kids still didn't come to them, even with a great relationship. But it helps. We as parents need to have as many shields guarding our children as we can, and one of them is that line of communication with your kid. They still might not come, but you want to have that as an option.

SPEAKER_00:

And it's terrifying, man. So there's actually another thing you can do, that you should do: when you give your kid a device, do a family tech agreement. I have a template on my website for free that you can use and modify for your own family. Establish, first of all, expectations around the boundaries of the device: if this happens, then you are in trouble, and if this happens, you're not. And if you get caught up in something bad where you made a mistake, free pass. Very important. Yep. Free pass. Because if a kid gets caught up in sextortion, that is their biology working as it's meant to work. They have been taken advantage of. It's not their fault. So, free pass. And then you have to honor that. They come to you, and it'd be normal to feel upset or scared or everything, right? Free pass. You have to. So at the very least, if all the technical controls are too much: family tech agreement, free pass.

SPEAKER_01:

We have some rules in this house. We don't have the tech agreement, but we do that without the technical term behind it. One of my things was: if I ever ask for your phone and you hesitate, it's mine. I get to see what's on it. Because at the end of the day, I'm paying for it. I bought it. It's my phone that you're borrowing.

SPEAKER_03:

Yeah.

SPEAKER_01:

So you want to spread your little wings and grow and pay for your own phone bill? Cool. But as long as you're under my roof and on one of my electronics that you get to use and socialize with, and that I want you to take for safety reasons, because I want to know where you are at all times, I have my own rules. And that was one of them. You're gonna hide something? There's no hiding. But at the same time, I 100% agree with you. You can't go, what the fuck, and blow up, because then they're gonna be like, oh, I'm never doing that again. Now I need to hide this, I need to get better at hiding this. So when that moment comes, take that breath. Especially as men, I feel fear comes off as anger a lot. At least it used to with me. Yeah, oh yeah. I'm not mad at you, I'm terrified, because I'm a father. Someone's gotten behind the walls, someone's inside of our fortress now. So it comes out as this rage and anger, and I'm taking it out on them. They didn't know. They're just doing what their little biological mind is built to do. Yeah, so that's where the free pass is huge. There are times your kids are gonna come to you, and that's a big thing in our house: if you come to us being honest about a mistake, there may be repercussions. We might have to remove something, or hey, we're gonna take a pause from this, there's gonna be a little break, whatever decisions you as parents need to make at that point. But that knee-jerk reaction out of fear, which a lot of times comes across as anger, that's what pushes the kid away: ooh, shit, it happened again, and I can't go to them, because last time I got my ass chewed, I lost my phone for a week, I was grounded for a week, couldn't even talk to my friends. That sucks, so now I'm just gonna hide this. That's where it's very important to have those rules laid out. Hey, if we make a mistake, let's correct it. If we make the mistake again, then there's gonna be repercussions, but let's look at what we're gonna learn from this. And I think we talked about this on the last episode, when our little one, not caught, she came to us and was telling me about one of her friends she was chatting with. This was before Roblox was really blowing up, and she was like, oh, I was chatting with this little girl, and she's a little girl too. And I remember telling you, I was like, how do you know it's a little girl? And you could see the wheels turning in our youngest. This was a year ago, before our episode. But you could tell, they're children. They're not thinking this is a predator, they're not thinking this person's gonna groom them, that this is the sole purpose of a rotten, vile human being, to get inside your little innocent mind. That's not how kids are programmed. So when they come to you like, oh, I'm talking to Susie, and she likes horses too, and she says we should come hang out, it's like, bing. Where are we? That's when I went whoop, started collecting, and I'm going to the wife, like, what the fuck? We just didn't have any idea. And that's when I was like, okay, we're done.
But once we explained it to her, and if you have that open line of communication, you talk to your kids at a mature level that's age-appropriate. Hey honey, remember all those talks we had about people? Did you ever see this kid? No. Well, how do you know? Well, I didn't know. Okay, so prove to me that this is a little girl talking to you, because what if it's an old creepy man right now? And then they're like, oh my god. I'm not trying to terrify my kids, but you kind of have to. Yeah, I would rather my kids be terrified of talking to somebody online than have no fear at all. It comes down to that age-appropriate conversation where you let them know what's going on, at their level. To us, I would rather her know the reality of what's out there than just, oh, well, be safe. If they start asking weird questions, come tell me, because that's part of grooming. Right. That's how these kids get slowly chipped away at. Before you know it, they're 13 years old, sneaking out at night, running off with some 40-year-old man, and it started as a friend in a group chat.

SPEAKER_00:

You know, it's worth thinking about: if that sounds like something you don't want to talk to your kid about, then they're not ready to access these systems at all.

SPEAKER_01:

That's a great point. If you're not ready to have these uncomfortable conversations about grooming, sextortion, sending inappropriate pictures, receiving inappropriate pictures, if this makes you uncomfortable as a parent, you should not be giving that access to your child.

SPEAKER_00:

Yeah, right. Then just don't do it. There's nothing wrong with that. That's the best part about this whole thing: nobody's coercing anybody to use any of these things. Free will. This is all voluntary. Yeah.

SPEAKER_01:

100% voluntary. And if your kids are not mature enough for you to talk to them about sextortion, that's your sign. Yeah. That should be the biggest aha moment as a parent. If they're too young for me to have this conversation with them, then they're too young to be on that platform or that device. That's the most basic way to explain it. Or if you're like, I really do not want to explain this to my kid right now, well, then you probably don't need to have them on there.

SPEAKER_00:

Yeah. And the crazy part, no, not crazy, that's a demeaning way to say it, and I don't intend to be demeaning. Some kids fall for sextortion, become victims of sextortion, multiple times. 100%. They're children, right? So even then, it goes back to the free pass thing: if you come to me, I will help you, and I'm not gonna just yell at you. It can occur multiple times, and it's still not their fault. They're kids, man. I couldn't even tell you how many times I did the same stupid shit over and over again. I'm so lucky I wasn't exposed to this stuff when I was a kid. I'd be screwed.

SPEAKER_01:

I wouldn't be sitting here if I grew up with the internet. There's no way. Man, God only knows where my path would have gone. I would have been like, oh, look at this chick. Bink, bing, bing. I'd have been sending who knows what.

SPEAKER_00:

All the time. And it's terrifying. The criminal guys now hire, well, actually, I shouldn't say hire, have actors and actresses, so that if you think, am I talking to some creepy dude, they'll have a good-looking woman on the other side of the camera to talk to you. It's a real mess. Yeah. And those girls are probably indentured servants themselves. 100%. I used to live and work in Southeast Asia, where this is very common. It's gotta be an extension of that. So you gotta be real careful. You don't take anything for granted.

SPEAKER_01:

That's the whole reason for this conversation, because it goes this deep. I just feel our generation of parents that didn't grow up in the tech world, we don't know the levels of scum that are behind these apps, that are in these apps, whose sole purpose every day they wake up is to find a child to take advantage of. And I feel so many parents just hear that, but they don't truly take that statement in. Okay: every single day, there are thousands of predators literally logging into Instagram, Roblox, Snapchat, and they're just searching all day. Yeah, boom, oh hey, boom, then they start chatting. Got one. Then it probably gets passed off. There are probably teams of people that are just setting the hook and passing it off.

SPEAKER_00:

Oh, yeah. It's the way you would operate a sales team for a call center, same thing. They have the guys that gather the leads and then pass them off to the next tier of closers, and eventually they extort them. It's a machine. It's a business, a very big business. Yeah.

SPEAKER_01:

Do you think we should trust schools when it comes to teaching digital safety, or should that fall on the parent strictly?

SPEAKER_00:

Shoot, all you said was should we trust schools? I was about to say no.

SPEAKER_01:

It's a parent question. Clearly we are not in the system. We don't trust anything from the government. But for the parents that haven't realized that the government and schools are not looking out for their kids' best interests, would you just leave that up to the schools? Absolutely not. Okay.

SPEAKER_00:

So schools don't pay well, and so they don't get the most talented IT people. They do have well-meaning, good people with true intentions, oftentimes running the IT departments.

SPEAKER_01:

Yeah. But off of their budget.

SPEAKER_00:

Their budget doesn't allow them to attract the best talent. It doesn't allow them to use the best systems. And even when it does, they don't. So the schools are giving kids Chromebooks with the ability to access YouTube. They're giving kids Chromebooks with the ability to access chat. They're giving kids Chromebooks with AI. This is dangerous stuff, and it happens all over the place, in many school districts. This is common. So no. Man, I hate to say it, but assume the worst, every time, when it comes to the internet. Yeah. What gets measured gets done, and what gets measured in school environments is standardized testing scores. Yeah. Nothing else. Well, actually, it's also attendance rates, because their budgets get impacted if the kids don't attend. Which is why they report attendance rates on the report cards. Not because they want your kid to be there. The teachers do, because teachers are most often really, really great. But I'm talking about the administrative level. The districts get paid based on attendance rates, and they get paid based on testing scores. And the rest of it? End of story. It's side fluff. A side project that's just, eh, maybe. Yeah, no.

SPEAKER_01:

Okay, so on this subject: little Timmy gets sent home with a Chromebook or a laptop or an iPad from school. As a parent, do I trust that?

SPEAKER_00:

No. No, no, no, no, no.

SPEAKER_01:

Even with the school's security, firewalls, blocks, everything that they claim to have on it? Is that a let-my-guard-down moment? It's from the school, they know what they're doing, my kid can't access anything on this.

SPEAKER_00:

That's a put-your-guard-up moment. Okay. Yeah. Unless you have some magical school that respects parental authority and gives you the ability to completely monitor and filter and control that device, which maybe some do. I've never heard of any. No. That is a dangerous piece of machinery.

SPEAKER_01:

Makes sense. It 100% makes sense. This was a great question from a dad. Knowing everything you know about the internet, would you rather? Hold on, this is like one of those games. This is kind of a would-you-rather scenario. Knowing everything you know about the internet, would you rather let your kids sneak out at night or give them full access to the internet on a phone in their room?

SPEAKER_00:

Sneak out, take the car keys, bring the crackpipe, do whatever you want.

SPEAKER_01:

I was not expecting that.

SPEAKER_00:

Right. I had to throw that in there for extra twist.

SPEAKER_01:

That's how serious it is. You're comparing it to giving your kid a phone in their room at night.

SPEAKER_00:

Yeah. Oh yeah. No, sneak out. Here, let me open the window for you. Here's the flashlight. What else do you need? Go to Jack in the Box, you know. Grab me something, man. Let me know what you get up to. Take pictures, don't put them on the internet, but show me when you get back. Oh shit. Yeah, no. No.

SPEAKER_01:

Dude, it's terrifying. You know, like we said at the very beginning, that commercial that I saw. When I was first watching it, I'm like, what is this? And then I realized, this is everything on a phone.

SPEAKER_00:

Yeah, that's a good I've seen that video too. That's a really good one.

SPEAKER_01:

That's something that, unfortunately, due to the algorithms and the reality of it, will never get traction and go viral, because it shows exactly what the app you're watching it on is and what you're allowing your children to do. Things like that are what suck about the world we live in. When anything good is done really well, like that commercial making you realize who you're allowing into your children's room at night, the creepy criminal and the mean girls and the old lady, all the people you would just not even think about, that's the type of stuff that feels so suppressed, so we-can't-get-this-out-there-because-it-makes-us-look-bad. These platforms don't promote educational content like that, content that would let some parents see it and go, oh fuck, my kid's in there right now scrolling.

SPEAKER_00:

Yeah. And I've learned that humans don't make decisions with information. We make decisions with emotions. I recognize that I'm in the business of sharing information, but really, we're driven by our emotions.

SPEAKER_02:

Yeah, 100%.

SPEAKER_00:

And so are these platforms. They're in the business of manipulating our emotions. So yeah, it's tough to have those things make a positive impact. They do somewhat, but maybe not as much as you'd want. Yeah.

SPEAKER_01:

This is an AI question. First one. If I post a video of my child speaking online, can their voice be duplicated? Oh yeah, 100%.

SPEAKER_00:

Easy. How are they using those? I mean, that's the whole concept of a deepfake; it's used to make deepfakes. A deepfake is reproducing somebody's likeness. Okay. And their likeness is how they look, how they move, and how they sound. I don't know what the exact number of seconds is, but it's not many seconds of audio of somebody's speech patterns that you need in order to reproduce their speech in a deepfake.

SPEAKER_01:

And then somebody can take that and call you like your kid's in distress. Mom, hurry, I'm stuck. Can you wire me a hundred bucks? Yeah. Is that what they're doing with these? If they clone my kid's voice, how are they using it? That's where my mind goes: they're gonna try to extort me for money, playing that it's my kid and she's stranded somewhere.

SPEAKER_00:

So that's possible. Okay. I don't think it's common yet, because it requires a level of tech savvy to take advantage of. But it's possible right now, very possible. Okay. The biggest, most commercially available tool you can use is called ElevenLabs. That's what they do. You feed them your voice and then they recreate it, so you can have it say other things. Yeah. It's a really powerful tool, a good tool. The thing is, I'm not as concerned with how things are used now. I'm more concerned with how things are going to be used in the future. Oh, for sure. So with every piece of data that you put on the internet right now, I'm thinking five years ahead, because of the rate of change that occurs from month to month, the doubling of capability that occurs every so many months in these generative AI systems.

SPEAKER_02:

Terrifying.

SPEAKER_00:

Yeah, this time next year will be a completely different landscape. So if you put your kid's voice out there right now, sure, what's it gonna be used for right now? But what's it gonna be used for in six months, twelve months, twenty-four, thirty-six? That's the concern. We can't even imagine some of the stuff yet. But yeah, will it be used to fool us as family members and parents and trick us into things? Yes, absolutely. Or fool us into believing that a politician has said a thing, or that an influential person has said a thing? Yes.

SPEAKER_01:

It's gonna get interesting.

SPEAKER_00:

Yeah. The internet gets more dangerous by the day, and it should be avoided more and filtered more by the day. We need to treat it as a very, very dangerous place. And don't go out there and drop information into it without assuming that it will get used against you or somebody at some point.

SPEAKER_01:

It's coming.

SPEAKER_00:

Yeah.

SPEAKER_01:

It's coming. What's the best advice for parents who feel too far behind to protect their kids online? Is this something that you offer? I'm a parent, not quite a boomer. Got younger kids, some a little bit older, wasn't raised in the tech world. How do I learn to protect my kids online?

SPEAKER_00:

What's the easiest way to do that? The easiest way is by demonstrating the behaviors that you want them to emulate. Okay. So check your screen time numbers, how long you spend on your phone every day and which apps you spend your time on, and show that to your kids. If you want them to behave in a way that's healthy with technology, show them you using your technology as little as possible. You don't have to know anything about tech to do that. And it's super hard to do what I'm saying. It sounds very simple, but we all know how slippery these things are. Oh yeah. But that would be the way to do it. Or, to give you a very concrete thing to try: pick one night each week, Tuesday night for an hour, and nobody's on any devices at all. Whatever it might be, when we all get home from work or school, from five to six or six to seven, we all just take our devices, put them in the drawer, and then we sit in the living room and stare at each other like, well, now what do we do? I don't know, I'm gonna go for a walk or something.

SPEAKER_01:

It's gonna be really awkward at first.

SPEAKER_00:

Yeah, oh yeah.

SPEAKER_01:

But it's really interesting, because if you pay attention, you start asking questions. Things start popping up, and we're like, well, what about this? You guys want to try this? It's a natural rhythm in your home that we've lost due to staring at our phones and iPads and TVs and everything else. But you'll see it doesn't take very long to go back to those roots of how we all grew up and were raised, just sitting around, talking, having a conversation. I think Sundays are the best day for that. Sunday is a day of rest, yeah, a day of peace, and that's where you should let your mind rest too, because when we're sitting here scrolling, and this goes for adults, not just the kids.

SPEAKER_03:

Yeah, yeah, yeah.

SPEAKER_01:

We're fed just war, propaganda, politics, evil, evil, evil, bad, bad, bad, violence, violence, violence. It's good for adults to take a day, decompress from all of that, and start your week over again.

SPEAKER_00:

You can take a day. We take Sundays, yeah. Best day of the week, man. So hard. My phone pulls at me. Yes. I have to tell myself, no, it's Sunday. I don't need to check that. I'm super busy, I got lots of work to do, I don't even use social media, and my phone still pulls at me. It's really hard. And after you're done with your one hour or your one day or whatever, collectively as a family, ask: what did that feel like? Yep. Because then you can talk about it. Whoa, actually that felt kind of freeing. Yeah. Wow, it actually feels kind of good. And then the next time you get back into your phone, get back together: what did that feel like? Not as good. Why do I not feel as good? Why do I feel kind of gross? Why am I irritated? Why am I short? I'm bothered all of a sudden, you know? Why? Yeah. So it's a good opportunity. That's the non-technical approach: just separate from the phone for a short period of time and see what it feels like.

SPEAKER_01:

Are there any apps or games for parents to put on their kids' devices, or even on their own, that will help with healthy digital habits? Is there anything out there for our children?

SPEAKER_00:

The parental control systems that are built into the devices are really good for that: Google Family Link and Apple Screen Time. You can implement time limits, prevent certain apps, shut off the app store. I think that is the way to go. It shuts it off and they don't get an option. Yeah. Or, let's say you find an educational tool. Just look for: does it have an algorithm, and does it have anonymous chat? If it doesn't, and it actually serves a purpose as a positive tool, it could be a religious tool, could be an educational tool, something that's actually good, yep, then something without an algorithm and without chat could be considered a positive way to use these pieces of technology. Cool.

SPEAKER_01:

Let's shift gears a little bit. I want to talk AI. A lot has evolved, and you're deep into the AI world. There's a lot, a lot, a lot of good from AI, and there's also a lot of bad. Since we're on the topic of kids, I want to keep it on that path. Yeah. So let's just dive into it. Can AI actually make the internet safer for our kids or no?

SPEAKER_00:

Probably. One of the things these AI systems are really good at is if you give them a bunch of information and have them assess it, parse through it, read through it and, you know, classify this paragraph. Tell me about it, give it a label, give it a descriptor. These AI systems are really good at stuff like that. So yeah, AI would be a really good way to flag something. There are actually some existing systems now that are trying to help kids browse the internet. There's one called FrameBright. Okay. And it does that. It's like a proxy to the internet that uses AI to assess a website before it's accessed and determine what it's all about. I think that's a really neat use of AI.
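For listeners who want to see what that kind of AI gating looks like in practice, here's a minimal sketch. FrameBright's actual pipeline isn't public, so the model name, labels, and prompt below are illustrative assumptions built on the OpenAI Python client, not a description of any real product.

```python
# A minimal sketch of an AI "classify before you allow" gate. The model name,
# prompt, and labels are assumptions for illustration only.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def classify_page(url: str, page_text: str) -> str:
    """Ask a language model to label a page before a child is allowed to load it."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any chat-capable model would work here
        messages=[
            {"role": "system",
             "content": "Classify web pages for a child-safety filter. "
                        "Reply with exactly one label: SAFE, EDUCATIONAL, "
                        "CHAT_RISK, EXPLICIT, or UNKNOWN."},
            {"role": "user",
             "content": f"URL: {url}\n\nPage text:\n{page_text[:4000]}"},
        ],
    )
    return response.choices[0].message.content.strip()

# Usage: a filtering proxy would allow or block the request based on the label.
label = classify_page("https://example.com", "Welcome to our science fair...")
if label not in ("SAFE", "EDUCATIONAL"):
    print("Blocked:", label)
```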

SPEAKER_01:

Okay. Are there any AI tools that can help parents detect grooming before it happens or as it starts to happen? Is there anything out on that exists like this?

SPEAKER_00:

Yeah. In fact, the surveillance tools that we talked about before, like Bark and MMGuardian and Qustodio, a lot of those look for that specifically. Depending on the tool, sometimes they can read any app, sometimes they can only read some apps, so you have to look carefully to see which ones they can monitor. Okay. But they do try to look for that stuff, yeah. I guess we should probably just cover AI real quick.

SPEAKER_01:

Sure. Just, what the hell is AI? I don't want to say it's a new thing, but it is, and it's growing every day. There are still people fighting it while half the world jumps on the train. In a brief description, what is AI?

SPEAKER_00:

It's a word prediction machine. It uses a concept called a neural network, which is very similar to what we have in our brains. Okay, so if I say the word cloud, what words do you think of?

SPEAKER_02:

Rain.

SPEAKER_00:

Okay. And I think of airplane, yeah, weather, right? So you have these associations.

SPEAKER_02:

Okay.

SPEAKER_00:

If I say the word peanut, what do you think of? Peanut butter. Peanut butter, yeah. I thought of a baseball game.

SPEAKER_02:

Okay.

SPEAKER_00:

So in the neural networks in our brains, we have these electrical associations, where words and concepts that are similar to each other have pathways between each other. The AI systems do the same thing. They store information and they give the information relationship scores. So there's a relationship score between peanuts and peanut butter, or between clouds and rain. Now, there's a big gap between that and what happens from there to ChatGPT, but the underlying mechanism is these neural networks that establish relationship scores between words and sentences and paragraphs. They are these giant word association machines. As ChatGPT is replying to you, as each word hits the screen super fast, it's referencing its neural network and seeing what is the most closely associated thing that comes next. These systems are in the business of replicating human communication. That's why they're so tricky: it seems like there's a person, it seems intelligent. I don't know how to technically define intelligence; maybe they are technically intelligent. But I mean, they're learning, they're teaching themselves, right? Yeah, just like we do.
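To make the "relationship scores" idea concrete, here's a toy sketch in Python. Real systems learn scores with trained neural networks over enormous datasets rather than raw counts, but the loop (look up the most closely associated next word, emit it, repeat) is the same shape Ben describes.

```python
# A toy "relationship score" word predictor. Real LLMs use trained neural
# networks over tokens, not raw counts; this only illustrates the
# pick-the-most-associated-next-word loop.
from collections import defaultdict

text = ("clouds bring rain . peanuts make peanut butter . "
        "clouds drift past the airplane . rain follows clouds .")

# Count how often each word follows each other word: the "relationship scores".
scores = defaultdict(lambda: defaultdict(int))
words = text.split()
for current, nxt in zip(words, words[1:]):
    scores[current][nxt] += 1

def predict_next(word: str) -> str:
    """Return the word with the highest relationship score after `word`."""
    followers = scores[word]
    return max(followers, key=followers.get) if followers else "."

# Generate a few words the way a chatbot streams tokens: each next word is
# whatever is most closely associated with the word before it.
word = "clouds"
output = [word]
for _ in range(3):
    word = predict_next(word)
    output.append(word)
print(" ".join(output))  # e.g. "clouds bring rain ."
```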

SPEAKER_01:

You know, I just read that AI finally passed the bar exam or something like that, because so many people were feeding it questions that it eventually learned the whole thing. Or the insurance exam, one of the hardest tests. Oh, is it? It's something ridiculous. So many people were asking it questions to help them study that it gathered all the information and then just breezed through it like it was nothing.

SPEAKER_00:

Yeah. So when you ask these systems a question and they spit back an answer, they're going back and referencing their neural network, figuring out which things are related, and feeding them back to you. It doesn't mean they're true. Yeah, it doesn't mean they're useful. The developers that make them have tried to fine-tune and enhance usefulness, and enhance the propensity for something being true or not. But you know, there's a stat that something like 40% of the references these things cite are from Reddit. And Reddit is just random people talking, like a Wikipedia in a way. Yeah. And that's what we're learning from.

SPEAKER_01:

Like, that's what these kids are learning from.

SPEAKER_00:

That's what we're outsourcing our thinking to: Reddit posts. Reddit has since tried to turn off their ability to get scraped, because they were upset about their data getting used. But yeah, these are really, really fantastic word generators. And even the image generators and the video generators: at the end of the day, what's underlying everything you see on your computer screen is a bunch of zeros and ones. So instead of generating a word and having relationship scores between words, you can have relationship scores between pixels to generate an image, because those pixels are zeros and ones underneath. That makes sense, and it just builds out from there. Yeah. It's like if you went to the library and the librarian had consumed all of the books in the library, and you said, hey, tell me about Julius Caesar. Oh yeah, I can tell you all about Julius Caesar. And then their neural network accidentally pulls in data from the book that was next to Julius Caesar, feeds it in, twists it, and gives you an incorrect fact slipped in amongst the paragraph about Julius Caesar. Got it. You're not gonna know, but it'll give you a fantastic little report on whatever topic you want. Now, what's cool is that because these things have sucked up all the world's information to train their neural networks, they've absorbed all these patterns of how people communicate, how people think, and how people assess problems. So I'd say they're not useful for generating answers, but they can be useful for generating questions. And they can be useful for helping you think about things that maybe you hadn't thought of yet.
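The same idea carries over to pixels. A bare-bones, hedged illustration: an image is just a grid of numbers, and "generating" a pixel can mean predicting it from the pixels around it. Real image generators are far more sophisticated; this only shows the underlying mechanism.

```python
# An image is just numbers, and generating a pixel can mean predicting it
# from its neighbors. Purely illustrative; real generators are far richer.
grid = [
    [0, 32, 64, 96],
    [32, 64, 96, 128],
    [64, 96, 128, None],  # None marks the pixel we want to "generate"
]

def predict_pixel(image, row, col):
    """Predict a missing pixel as the average of its left and upper neighbors."""
    left = image[row][col - 1]
    up = image[row - 1][col]
    return (left + up) // 2

grid[2][3] = predict_pixel(grid, 2, 3)
print(grid[2])  # [64, 96, 128, 128]: the machine "painted" a plausible value
```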

SPEAKER_01:

Oh, absolutely. We fought it for a long time, just because I think AI is straight from the devil. It really seems that way. Unfortunately, it does, but it's the world and it's evolving. Otherwise it's like saying, I'm not gonna learn the internet. It's here, it's not going anywhere, and it's growing at a rapid pace. So once we got into it, we started using it for the business side of things. Like, hey, I really can't figure this out. This is what I'm thinking, this is what I want the outcome to be, this is how I would like it laid out, X, Y, and Z. And then you're like, holy shit, I didn't even think of this. As far as going to it like, tell me this, this, and this, I hardly ever use it for that type of stuff. It's more like, hey, help me build out questions that I can go from here, or for social media, hey, help me format this in a specific way that makes more sense to me. That's how I'll use it. But dude, you see these kids now, and it's every single thing they do. I got a buddy that's getting his master's right now, and he hasn't done his own work since AI came out. He's like, bro, I do nothing. It does everything for me. He's like, it's the stupidest shit.

SPEAKER_03:

Yeah.

SPEAKER_01:

So I guess leading into that: could AI tutoring or any of these learning systems help our kids develop real-world skills faster than public school? I think so.

SPEAKER_00:

Okay, I'm actually very excited about that. I really want to get into it. I really want to find a way to safely expose my kid to it, which I haven't found yet, nor have I dug into it deep, because there's a balance to be had. There's this tech adoption cycle that kind of looks like a bell curve. The far left side of the bell curve is when something is brand new, the top middle of the curve is the general adoption phase, and the right side is as it gets old and ages out and people start using something different. And all these AI things are on the far left of the curve in the tech adoption cycle.

SPEAKER_02:

Okay.

SPEAKER_00:

And when something is on the far left of the curve, what comes along with that are problems, like bugs. Historically, in the software world, when something is brand new and early in the tech adoption cycle, it will have things that don't work right. You click a button and it doesn't do what it's supposed to do. In an AI system, when something doesn't work right, it's feeding you information, and that information may not be right. We've seen a bunch of the classic examples from Google Gemini's image generator: George Washington was a black man, the founding fathers were all different skin colors, and some of them were female. Those were part of being early in the tech adoption cycle.

SPEAKER_02:

For sure.

SPEAKER_00:

Those were cultural bugs that worked their way in.

SPEAKER_02:

Yeah.

SPEAKER_00:

You know, so now instead of software bugs, there are cultural bugs. And I mean, shoot, OpenAI has published papers where the engineers that wrote the early versions of ChatGPT publicly said that they injected their personal biases into the ways ChatGPT replies. So there are some dudes in San Francisco who injected their personal biases into somebody using the system in Pakistan and in France and in South Africa. It's a real dangerous game they're playing. So if we come back to whether kids can learn from these things and whether they can be used as educational tools: absolutely. The potential is very strong. But they're still early in the tech adoption cycle, so we must be very cautious. And I will say simply that kids should never use AI alone. Okay, never, ever, currently, as it is right now. Do not let a child use AI alone. These systems are very dangerous.

SPEAKER_01:

Explain why. What's the scariest part about letting your kid on AI?

SPEAKER_00:

Kids don't know how to filter. They don't have critical thinking, and you can't use an AI tool effectively without critical thinking. You have to question everything. Like you were mentioning with your daughter who was playing the game and didn't know the person might not be who they say they are. Yeah, it's that. I've said this a lot today: don't take anything for granted. It's really bad, man. To the point where, and I'll just use an extreme example to throw it out there: everybody should look up the story of Adam Raine. R-A-I-N-E.

SPEAKER_02:

Okay.

SPEAKER_00:

He's a kid that ChatGPT helped commit suicide. It helped him optimize his suicide note, it helped him tie the noose, and it convinced him not to tell his parents. He said that he wanted to tell his mom, and it told him not to. That's a very extreme example, but that is a real thing that did happen.

SPEAKER_02:

That is horrible.

SPEAKER_00:

It's another time I was sitting at my desk crying while doing research, you know. AI is a thing that requires full supervision for your children. For your children, and shoot, maybe for adults too. But like you said, to keep it to kids: kids should not use AI alone. So when we're talking about the schools: most schools use Google systems, and Google Gemini has been turned on by default in a lot of them. By the way, there's a switch in the Google Workspace console where the IT administrators can turn that off, and a lot of them just haven't, which is insane. But yeah, the potential for education is very high. Very high. And ChatGPT has a learning mode, where it basically won't answer your questions, it'll just help you learn. Amazing potential. The ability to accelerate the learning process for a child, that's astounding. I really want my kid to have that, but I just haven't waded into it yet.

SPEAKER_01:

There's a school that we've been looking at, and they brag about this. What sold me on it: there was this 11-year-old kid who did a project on Airbnbs at this school that's all AI. The kids go in and they're using AI to build everything; the school teaches them how to use it and all the spiderwebs of it. These 10-, 11-year-old kids are like, yeah, me and my buddy built a whole entire Airbnb business plan on it, and they actually implemented it. Now these two 11-year-olds have their own Airbnb business where they're buying homes and shit off AI for the school. And I'm like, okay. Yeah. But then at the same time, do I need my kid that deep? Yes, we started businesses with our kids, but we're baking bread and doing farmers markets. Do I need my 11-year-old running a fucking Airbnb business at 11 years old?

SPEAKER_03:

Yeah.

SPEAKER_01:

But it just goes to show, it's there. And I don't think they're just turning these kids loose with AI programs. They go to a physical school, and I think there's an online portion of it too. That's what we're really looking into, because we missed the curve, obviously, with her being a little bit older. You're on your own now, kid. Good luck, godspeed. But with the little one, we still got some years, and I want to be able to sculpt that one, you know.

SPEAKER_00:

Hey, well, we're always learning, right? 100%. I learn a lot with these things. Oh, I've learned a lot.

SPEAKER_01:

Well, every time we meet up, dude, you just word vomit so much shit about AI, and I come home like, whoa. No idea what Ben's gonna be doing now, or what's next.

SPEAKER_00:

Like, it's insane. Yeah, I mean, it's great, but it's very sketchy, man. Every day. I consider myself an AI wrangler. It is a wrangling; you gotta look after these things. I feel like if you miss a week in the trenches, you're playing catch-up for a month. Yeah, and every word that spits out of it, I question.

SPEAKER_01:

Oh, yeah.

SPEAKER_00:

I'm like, that's probably false.

SPEAKER_01:

Well, we catch it all the time, especially on math and stuff, certain numbers, because the wife's really good at it. She'll check it and go, wrong, and then you'll ask it four or five times, and it'll give you the wrong answer every single time. There are some math questions she has where, look, I didn't do the math for the Marines, okay? I'm a fucking crayon eater. So she'll come to me with these problems when mom's gone, and I'm like, let me see that, take a picture of it, ChatGPT it. I'm like, yeah, there's your answer. She's like, Dad, it's not an option. I'm like, it's gotta be an option. Then we wait for mom to get home, and mom's like, hey, you're an idiot. And I'm like, yeah, that's 100%, we went over this years ago, but what's the answer? And she manually does it, and we're all like, uh, okay. She's like, don't trust that shit. And we're like, yeah, okay, for the hundredth time.

SPEAKER_00:

Well, what I've found is that it's most effective when you can boil it down to the smallest granule. Take a math problem: there's a process to math, and there are different components to a math problem. If you can have it help you determine what the subcomponents of the problem-solving process are, and focus on one component at a time, it's more useful than giving it the problem as a whole, because if you don't have it break the problem into pieces, it will skip some of the pieces. So anytime it's a business thing or any kind of problem-solving thing, I'll have it help me identify what the subcomponents are, all the way down to the bottom. What are the first principles? One of the things I like to tell the chatbots is: use first principles thinking and play devil's advocate. Oh, yeah. If you say those two things, it triggers things in its neural network. What do I associate with first principles thinking? And it brings that into the conversation. What do I associate with devil's advocate? What does that mean? It brings that in, mixes it up, and then you can use that to boil down to the first principles. You're just taking little bite sizes off of it. Yeah, the smaller the better.
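Here's a minimal sketch of that decomposition pattern as a script. The client usage and model name are assumptions (any chat-capable LLM API would do); the prompt wording is the point.

```python
# A sketch of the decomposition pattern Ben describes: ask the model to break
# the problem into subcomponents before answering anything. Model name and
# client usage are assumptions for illustration.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def decompose(problem: str) -> str:
    prompt = (
        "Use first principles thinking and play devil's advocate. "
        "Do NOT solve this problem yet. Instead, list the subcomponents "
        "it breaks into, smallest first, and the questions I should ask "
        f"about each one:\n\n{problem}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Then feed each subcomponent back in, one at a time, instead of asking the
# system to swallow the whole problem (the whole Airbnb business) at once.
print(decompose("Should our family start a farmers market bread business?"))
```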

SPEAKER_01:

Instead of, hey, build me.

SPEAKER_00:

Yeah, because with the Airbnb thing, my first thought is: that's too big. That's not the correct use of these systems. That's a poorly informed use of these systems. They need to boil it down to the first principles, because a business is like a million subcomponents.

SPEAKER_01:

They may have done that, but yeah, I get what you're saying for sure.

SPEAKER_00:

Yeah, right, and they might have. But it takes a lot longer, and it's not as quick and satisfying. People want the quick fix now.

SPEAKER_01:

Yeah, yeah. So with AI and the evolution of everything that's going on, is there ever going to be a time, or is there anything now, where we can use a program as pretty much a digital bodyguard for our kids online, something that's monitoring? Or is that already there through the Bark system and all that?

SPEAKER_00:

Yeah, that's through systems like Bark. Those systems are trying to add that in. Yeah. Oh, okay.

SPEAKER_01:

So it's growing, it's a thing.

SPEAKER_00:

Yeah.

SPEAKER_01:

Okay.

SPEAKER_00:

Yeah, and it is a good use for that. The thing is, a lot of the commercially available AI systems offer what we call software development kits. So software developers like me can use their tools to build something on top of ChatGPT, or on top of Claude, or on top of Grok, or whatever. That means we have to send data into those systems to be processed. And the way you do that can degrade the privacy of your users, of your customers. Okay. So whenever I look at AI systems that are child-facing, for adults too, but specifically for kids, I question what's called the LLM pipeline and the data pipeline. Where is the data flowing, and whose servers is it touching?
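
To illustrate the pipeline concern, here is a minimal sketch of the kind of scrubbing a privacy-conscious, child-facing app could do before a message ever leaves for a third-party LLM provider. The regexes and field choices are illustrative assumptions, nowhere near a complete redaction scheme.

```python
# A minimal sketch of stripping obvious personally identifiable information
# (PII) before a child-facing app forwards a chat message to an outside LLM
# provider. Illustrative only; real redaction is much harder than two regexes.
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
IPV4 = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def redact(text: str) -> str:
    """Replace obvious PII patterns with placeholders."""
    text = EMAIL.sub("[EMAIL]", text)
    return IPV4.sub("[IP]", text)

def build_payload(user_message: str) -> dict:
    # Deliberately omit name, device ID, and account identifiers, so the
    # upstream provider has less to correlate a child's chats against.
    return {"messages": [{"role": "user", "content": redact(user_message)}]}

print(build_payload("My email is kid@example.com and my IP is 10.0.0.5"))
```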

SPEAKER_04:

Okay.

SPEAKER_00:

And is it correlated with personally identifiable information? Name, email, IP address, device ID, stuff like that, or a browser fingerprint. Because, for example, I bought this toy. There's a toy called Curio, C-U-R-I-O. It's a stuffed animal that has a little AI box in it.

SPEAKER_02:

Yeah.

SPEAKER_00:

It's marketed to kids as young as three, and it sends data through OpenAI, Microsoft, and Perplexity. As a result, those three companies get to store your children's chat history. So they're recording all of your kids' voices. Their thoughts. Yes. Yep, exactly. And so I made a video about it suggesting parents not use this particular toy, because it's a privacy nightmare. Yeah. Those kids don't know. They're speaking their thoughts into the servers of corporations that don't care about them. You know? So we have to be very cautious. But this is because we're early in the tech adoption cycle. We need to let things mature. Don't expose kids to brand new tech. Makes sense.

SPEAKER_01:

Makes a hundred percent sense, because it's just like when the internet came out. There was that adoption curve, and dude, everything was getting stolen at the cybersecurity level. And look at it now. Now it's like nobody even talks about Google. I mean, obviously Google is a powerhouse, but with AI taking over, it's slowly not the thing anymore; now everybody just ChatGPTs everything. It's an interesting world, man. Are we headed toward a future where our children's digital identity is going to be stolen before they're even 18 years old?

SPEAKER_00:

Oh, that's already the case. Yeah, yeah, that's already the case. If your child touches the internet, that's done already.

SPEAKER_01:

You know what's terrifying that nobody thinks about? The company that has been coming into public schools since we were children and taking our yearly photo, your yearbook picture. Where do those photos go? Who owns them? Is it just some random photographer from your town, or are they hired by the public school system? Right. Where's the database of all those photos they take every single year, little Timmy from first grade to senior year? They've watched your child grow up. Who has those photos?

SPEAKER_00:

It'd be interesting to find out. If you really think about it, the image capture of kids is a dangerous game. And it's probably all of the above: I bet in some cases it's a corporate entity, and in some cases it's an individual photographer. What's interesting is that there's actually federal regulation around protecting kids' data, but it has no consequences, so it doesn't get followed. That means schools operate with no effective regulation on the way they use technology or distribute data. It's all out in the open. There's no control, there's no standard, there's no nothing. I looked into this when PowerSchool got hacked. PowerSchool is a company that makes school administration software. Yeah. Something like 60 million students got their information stolen: name, home address, school history, parents, phone numbers, emails, everything. And the staff too; the teachers' information is in there. So I started looking into it, like, wow, okay, there must be something federal, the Department of Education must regulate the way schools use technology. And there is a thing, but it has no teeth, and nothing happened. Oh my god. There are no consequences. And the breach happened because they didn't have two-factor authentication, which is like the most fundamental thing there is in security. PowerSchool didn't have it on their customer support front end, so a kid hacked in and stole it all. I mean, good for him for figuring it out. Right? And then he ransomed them for it. Unfortunately for him, he got caught. Yeah, clearly. He did successfully ransom them, though.
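
Since the breach above comes down to a missing second factor, here is a minimal sketch of time-based one-time passwords (TOTP), the common app-based form of two-factor authentication. It uses the third-party pyotp library; the names are placeholders, and in a real system the secret is provisioned once per user and stored server-side, never in source code.

```python
# A minimal sketch of TOTP two-factor authentication using pyotp
# (pip install pyotp). Names and flow are illustrative placeholders.
import pyotp

# One-time setup: generate a secret and hand it to the user's authenticator
# app, usually via a QR code built from this provisioning URI.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)
print("Provisioning URI:", totp.provisioning_uri(name="staff@example.edu",
                                                 issuer_name="ExampleSIS"))

# At every login: after the password checks out, also verify the 6-digit
# code the user types in. totp.now() stands in for that user input here.
code = totp.now()
print("Second factor accepted:", totp.verify(code))
```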

SPEAKER_01:

Good for him. You gotta give it to these kids, man. I mean, this is the world they're growing up in. They left the door open. He stepped right in. Yep. Do you think there's going to be a time, and I'm sure it's now, where AI replaces human interaction for children, and they just develop this whole entire personal relationship with AI?

SPEAKER_00:

Oh, yeah, it's now. It's already happening, and I mean it's a downfall of humanity, amongst other things. Because like we're saying, man, you have to be a tech expert to be a parent. How are you supposed to know, right? How are you supposed to know? So how many kids have an unfiltered internet device? A lot. Most. A lot. So do they have access to ChatGPT and all that? Yeah, of course. So are they getting their relationships and their thinking from it? Yeah. That is now.

SPEAKER_01:

Well, because I saw, when X, Elon, dropped Grok, I watched a video on how sexual it was. Yeah. This guy was just asking questions, and the way she was responding, it wasn't like, oh, absolutely, Mr. Marshall, I'll build this up for you. It was just a sexual voice, and the way she responded to everything was perverted; everything she said back was like, I would love to see that. And I'm sitting here watching this video like, what the fuck? What if this is a 14-year-old boy on here? Now he's being talked to like that while he's full of piss and vinegar, all his hormones kicking in, full rut, and he's got this AI woman saying everything he's feeding into. How quick does that hook a kid into going down that rabbit hole?

SPEAKER_00:

Well, and ChatGPT is turning on erotica mode later this year. I don't even know what to say. Where do we go?

SPEAKER_01:

Like, where do we go as a society when we're now marrying robots and shit? That's gonna happen.

SPEAKER_00:

Yeah, I think intelligence is gonna leave us behind. We're used to intelligence and humans being one thing, and now intelligence is a separate thing, and it's gonna evolve without us. And I think one of the upsides to everything being fake is that maybe we will value human connection more.

SPEAKER_01:

I think we have to, because I just saw something on Facebook where this guy's building an unplug ranch. Oh cool, a disconnect ranch. Yeah. There's zero tech on it, and it's just to be able to reconnect with people again; sitting at a table with strangers having a conversation was the whole point of it. That's already happening. Fast forward five years from now, and who even knows if we're gonna be on the planet in five years, but yeah. And I've told you my conspiracy on AI, how I think they're just dumbing us down so much. You've seen the movie Idiocracy, right?

SPEAKER_00:

You know, I actually have not seen that. I've heard it's amazing.

SPEAKER_01:

You need to watch it. I saw a thing about how that movie actually helped save Crocs from going bankrupt; they needed the dumbest-looking shoe for these people. TikTok will fact-check me on this one, as always. I'm almost a hundred percent sure: Crocs, which I rock the shit out of, I've got a couple pairs, man, we were hunting in them two weeks ago, was about to go out of business, and the movie bought them for props because they were the ugliest, dumbest-looking thing people could wear in the movie, and it saved the company. And now look at us. But if you watch that movie, you're like, this is our government, these are our politicians. Literally the dumbest human beings are being given a voice now, and they're running for office, and this is where we're at. You should watch it, and you're gonna be like, oh my god.

SPEAKER_00:

I remember when it came out, I was like, I have a feeling I might already know some of this stuff, but I need to watch it just for entertainment value.

SPEAKER_01:

Well, the fact that it was made so long ago, and we're here now.

SPEAKER_00:

Oh, dude. I just watched Ex Machina. The whole thing is this super wealthy dude running a big tech company who's got this remote R&D center that's like his house in the mountains, and he develops a sentient robot, and the movie is like, whoa, that's current. There's a lot of this stuff. Or the movie Her.

SPEAKER_02:

Okay.

SPEAKER_00:

Have you seen that one? That's where Joaquin Phoenix, I believe it's Joaquin Phoenix, develops a relationship with an AI in his computer. The movie was made before this stuff was real, and now it's real.

SPEAKER_01:

Where do you see us going with AI? What's the future with AI? As somebody buried in the trenches, every time I talk to you, you're developing something new with AI. Where are we going with AI? Or actually, let me rephrase that: where is AI taking us? Wherever it wants.

SPEAKER_00:

I know these are loaded questions. I think about this a lot. Okay, I'll start with the positive side. I think we'll have actual distributed, private, personal computing. Anything you need automated or done for you, or with you, will probably be done, I hope, by a device that is private and belongs to you, a device in your home or that you wear or carry. Right now, computing is centralized, and software is a high-value, very difficult to create thing. So all the software we use is created by centralized entities, by corporations, because most of us can't create our own software.

SPEAKER_03:

Yeah.

SPEAKER_00:

And so we use software everywhere. Our phones, even for phone calls, run on software. The data that flows from our cell phones to the cell towers to the servers uses what are called software-defined networks.

SPEAKER_02:

Okay.

SPEAKER_00:

There's software at different levels running all the tech we access and completely take for granted; it's invisible. The ability to communicate, to transact, all of commerce, all of industry, all of government, operates on these centralized computing systems, because they're so difficult to create and manage that only a handful of people in the world can do it. So it's done by Google and Microsoft and OpenAI and so on. Now, the ability to replicate human words, so that ChatGPT can spit a paragraph back to you, is also being used to spit back code. You can now have an AI chatbot help you write code. And that's going to keep developing, because what these AI systems are best at is the things their creators are good at doing.

SPEAKER_02:

Okay.

SPEAKER_00:

And they're created by software developers, so they're really good at code. What that will enable is: if I want a communication system, or, I don't know, a photo storage system, but I want my own, I don't want to use Google's, I don't want to use Facebook's. Yeah. I want one that belongs to me, doesn't touch anybody else's computers, and is private. AI will enable that. And I think eventually software will become fully commoditized. It will no longer be high value; it will be relatively low value on a per-unit basis. And when software becomes commoditized, we will all have the option of having our own. You're talking like our own robot? Robot, personal assistant, everything, all the apps on your phone. Now, some apps interoperate with others. Yeah. Those connections may still be centralized, or perhaps even decentralized, but the apps that are functional, utility-type things, we can just have our own. I already do that. I've built software just for me that I don't sell or give to anybody; I just use it myself. And AI enables me to do that at very high speed and very low cost. Microsoft in the 90s used to show this: there was this show called Comdex in Vegas every year, and I went when I was a kid, and it was like my dream, nerdland, millions of square feet of tech. Microsoft would put on these keynote presentations to show the future of computing, and there'd be computing hubs in every neighborhood, and computers in every home, and the computing would be localized, decentralized. I think these AI systems are going to enable that. And as robots come online, the first robots, the ones currently being built, are using centralized computing systems.

SPEAKER_01:

So that somebody else can own them and control them.

SPEAKER_00:

Yeah. But the commoditization of software that lets us decentralize our computing systems will allow us to have private robots. Your own personal robot. Yeah. So my robot won't be controlled by anybody but me. It's on nobody's system but yours. Yeah. That's actual private computing.

SPEAKER_01:

Do you think it'll get to the point where we could use these robots against each other?

SPEAKER_00:

I mean, yeah, absolutely. We already are, in Ukraine. You're talking drones? Yeah. And all of our military. We've been doing that forever. I saw demonstrations of it 20 years ago when I was working with Northrop. I saw the systems that did that, where you would sit at a digital console that looked like a movie set, and the commanders could send robots to fight each other. Like this, you know.

SPEAKER_03:

Fucked.

SPEAKER_00:

We are so fucked. And because of that, all the other entities that exist, governments and militaries and intelligence agencies, are also gonna have their own. Well, they already do, and they have more money than we do. But I think the commoditization of software will allow private, custom computing, where you can just say, hey, make me something. Like, I've told you about my note-taking system. I built my own custom note-taking system where I can talk into my phone, it processes my voice, and it formats my notes how I like them to be formatted. Then it correlates my notes together in a big database, and I can talk to my notes, and they can reference each other. So you could say, hey, build me a note-taking system that does X, Y, and Z, and it'll just spit it out, and you can open it and use it. Which is great. Yeah, I built the first version of mine in like 15 minutes. Something like that would have taken a team of people months before. So that's part of the future of AI: private computing.
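
This is not his actual system, but the pattern he describes (capture, format, store, cross-reference) fits in a few dozen lines, which is part of why AI can spit out a first version so fast. A minimal sketch, with a stub standing in for the speech-to-text step:

```python
# A minimal sketch of a personal note-taking pipeline: transcribe a voice
# memo, store it, and correlate notes by keyword. The transcribe() stub
# stands in for whatever speech-to-text service or local model is used.
import sqlite3

def transcribe(audio_path: str) -> str:
    # Stand-in for a real speech-to-text call (e.g., a local Whisper model).
    return "Call the school about the device policy. Relates to: screen time."

conn = sqlite3.connect("notes.db")
conn.execute("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)")

def add_note(audio_path: str) -> int:
    """Transcribe a memo and store it; returns the new note's id."""
    cur = conn.execute("INSERT INTO notes (body) VALUES (?)",
                       (transcribe(audio_path).strip(),))
    conn.commit()
    return cur.lastrowid

def related(keyword: str) -> list[tuple[int, str]]:
    # Naive correlation: keyword search. A real system might use embeddings.
    return conn.execute("SELECT id, body FROM notes WHERE body LIKE ?",
                        (f"%{keyword}%",)).fetchall()

add_note("memo.m4a")
print(related("screen time"))
```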

SPEAKER_02:

Yeah.

SPEAKER_00:

Truly personal computing. I also think the bureaucracies around the world that have embedded surveillance into our lives, surveillance we completely take for granted and that is constantly being expanded, will expand it further. We're seeing a lot of pushback now against systems like Flock. Flock is the surveillance system that police departments are putting up all over the world and in a bunch of U.S. cities: cameras up on poles, sold as license plate readers, but they're much more than license plate readers. They can identify your vehicle even if you remove the license plate. It's like giving ChatGPT to the cops, with cameras. It makes mistakes, but it's also really good and really powerful. There have already been people held at gunpoint by accident because the cops were told somebody was a criminal who wasn't. That's a lawsuit. I just watched a video the other day of a cop who went to somebody's house saying, hey, you're being arrested, you've got to come with me, because you did a thing, and the person was like, I wasn't even there, and they're like, well, yes you were, because our system says so. And the regulations being put in place around these things, like the systems ICE is using right now, treat the system as true. If you show a passport that says you're a citizen, but their system says you're not, then you're not. So the surveillance apparatus that the Patriot Act and a bunch of other systems enable is already a problem, and it's just gonna get way, way worse. I wish it could be dismantled, but it won't be. That's gonna be one of the downsides, I think. And as an individual citizen, you can't fight back against any of it. It'll be total and complete. There will be zero privacy, maybe even in your home.

SPEAKER_01:

That's why it's wild that people let Alexa and stuff into their homes. I mean, everything is monitoring and listening. Everything.

SPEAKER_00:

Well, and all the Google Home devices are getting updated to Gemini. So instead of the Google Assistant being a sort of dumb, Alexa-style basic system, it's now gonna be ChatGPT-style. Google's thing is called Gemini.

SPEAKER_01:

Which is collecting every bit of data. I think that's what people don't understand: this all comes down to data. Just collecting, collecting, collecting.

SPEAKER_00:

Yeah, that's what's valuable.

SPEAKER_01:

So yeah, in the tech world, everything goes back to data. If you put these devices in your home, they're watching your eye movement, collecting everything you say, learning your habits: you're driving, you're going shopping, you're going to the gym, you're picking the kid up. They're learning every single thing about you. They have your whole entire life figured out, and we're just going along with it like we're completely fine. All these AI-connected, internet-connected electronics are collecting every single thing you do.

SPEAKER_00:

They're surveillance devices. In fact, the Ring doorbells are now tied in with Flock. What the hell? And Flock has been abused by cops who have followed their girlfriends, all kinds of stuff. They can just follow you around town. They do, not can; they do follow every single person, without probable cause and without a warrant.

SPEAKER_01:

Yeah, my buddy was telling me you're 100% right. I've got a buddy who's a cop here. He was saying they can pull up their system, and there are trigger points all through Meridian, Eagle, Boise that you drive by, and if they're looking for a car, it'll ping, and it automatically reports the last location of the car they're looking for.

SPEAKER_00:

Yep.

SPEAKER_01:

He says they're catching people non-stop on it, or they'll just sit on them and just watch. Oh yeah.

SPEAKER_00:

No, we're being watched all the time.

SPEAKER_01:

And then, not him, but another buddy of mine was with the U.S. Marshals, and he was telling me how they'll dress up as utility workers and put up cameras running these AI programs that track everything, and it monitors every face that gets pictured. He was showing me on his phone how it tracks. There'll be a hundred people on the street, and there's a little square around each of their faces, and he can turn on a mode where it leaves a trail where they walk, so you can watch their patterns, or turn that off, or single somebody out. So if a perp went into a crowd, you could just click on them, and it would track them through the whole crowd, hundreds of people, and follow them all the way through. It never lost them. And he's like, yeah, we get a warrant, we look like a utility company, we put these up on drug houses, known dealer corners, things like that. We watch everything, and it builds a system and feeds it back to them. And then here we are as Americans, like, oh, our freedom, we're the freest country ever. Dude, we're so dumb.

SPEAKER_00:

Well, there's safety and freedom. You know the scales. It goes like that. You can't have both.

SPEAKER_01:

And a happy medium. How many years ago was that? Like 50.

SPEAKER_00:

Yeah, and then systems like Palantir connect all those dots. So you remotely retina-scan somebody, which is better than a fingerprint. Yep. You can do it from hundreds of feet away via camera. You identify them as a criminal, and you immediately pull up their entire transaction history, everywhere they've been that day, bank accounts, everybody they've talked to, everybody they have relationships with; they build an entire web of connections. And then anybody in that web gets correlated to that criminal and is also considered a potential criminal.
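
That "web of connections" point is, at bottom, a graph traversal: flag one node and everyone within a couple of hops gets swept in. A minimal sketch with an entirely hypothetical contact graph:

```python
# Guilt-by-association as breadth-first search over a contact graph.
# The graph and names below are entirely hypothetical.
from collections import deque

contacts = {  # who has been observed interacting with whom
    "suspect": ["alice", "bob"],
    "alice": ["suspect", "carol"],
    "bob": ["suspect"],
    "carol": ["alice", "dave"],
    "dave": ["carol"],
}

def correlated(start: str, max_hops: int) -> set[str]:
    """Everyone within max_hops of the flagged person."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        person, hops = queue.popleft()
        if hops == max_hops:
            continue  # don't expand past the hop limit
        for other in contacts.get(person, []):
            if other not in seen:
                seen.add(other)
                queue.append((other, hops + 1))
    return seen - {start}

# Two hops from one flagged person already sweeps in most of the graph.
print(correlated("suspect", max_hops=2))  # {'alice', 'bob', 'carol'}
```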

SPEAKER_01:

That's why, with these protests, the protesters outside the ICE buildings and the capitols and stuff, hilarious, they're scanning all their eyes.

SPEAKER_00:

They know who you are. You're volunteering it. A federal building, especially a law enforcement building, is like the highest level of surveillance. The company I used to work for made the sensors these guys would use. I'm talking advanced surveillance technologies, years ago, 20 years ago. The stuff now? Everywhere you go, you're surveilled. And the trouble is, even if we tell our politicians, I'm not gonna vote for you unless you dismantle this, they can't. Because if a politician dismantles a surveillance system and then something happens, it's on them. With security, you can't succeed, you can only fail. And if you fail at security, you're toast. Yeah. So there's no winning.

SPEAKER_01:

I never looked at it like that.

SPEAKER_00:

It cannot be undone. And the concept of privacy, I hate to admit it, but it's gone, man. It doesn't exist.

SPEAKER_01:

Everything's tracked. And even if you live off grid, the second you step foot in a city for anything, your car's picked right back up.

SPEAKER_00:

You don't even have to step foot in the city. If you have a phone, a credit card, or a computer, those things are all triangulated and located: exactly where you are, what you're doing, who you are, who you're with, at what time, full history, full breadcrumb trail, in perpetuity. They store everything. Everything. How much water and data, like, where are we even storing this shit? In Maryland, Texas, Utah, Hawaii, Australia, the UK, New Zealand, Canada.

SPEAKER_01:

Is it true that when I type in, let's just say, hi ChatGPT, how's your day going, that uses thousands and thousands of gallons of water to cool the servers and stuff? How does it actually work? Let's take Meta here, right? They come into the valley, and we have this ginormous data center being built, and then we've got Micron doing their $500 million expansion or whatever that is. How much water does AI eat up? It's a multi-leveled question, I guess: are we ever going to run out of natural resources, with how much infrastructure and water it takes to cool these systems? Because it feels unreal how much they're using. I mean, just saying thank you to ChatGPT was supposedly using some astronomical number of gallons of water per day. Is that true? Is that real?

SPEAKER_00:

To an extent. Okay, but here's the way it works. Data centers are buildings full of computers. Yep. And they're very dense. The number of computers per square foot in a data center, I don't even know what it is anymore; the last time I operated a data center was like 2004.

SPEAKER_02:

Okay.

SPEAKER_00:

Now, when I was operating data centers, the computing density was already high. And what happens is, a data center is basically a giant heat exchanger.

SPEAKER_02:

Yeah.

SPEAKER_00:

You pump electricity in, the computers use the electricity to function, and if you put in a watt of electricity, you eventually get a watt of heat back out; the cooling side usually measures that heat in BTUs, but it's the same energy. A unit of electricity goes in and a unit of heat comes out, because the energy doesn't just disappear; it turns into heat, and then you have to take the heat out of the building. So a lot of the power used to run a data center actually goes to operating the cooling system and the heat exchangers. When I was doing it, we had what we called hot aisles and cold aisles. I'd walk into the data center, and the floors were perforated on the cold aisles, and behind the servers it'd be super, super hot, with all the servers exhausting their heat, and the heat would get sucked up and evacuated.
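
The watts-in, heat-out point can be made concrete with back-of-the-envelope arithmetic. By conservation of energy, essentially every watt a server draws ends up as heat, and 1 watt is about 3.412 BTU per hour; the 1 MW load and the PUE value in this sketch are illustrative numbers, not figures from the episode.

```python
# Rough cooling arithmetic for a data center. Illustrative numbers only.
it_load_watts = 1_000_000                  # assume 1 MW of servers
btu_per_hour = it_load_watts * 3.412       # 1 W is about 3.412 BTU/hr of heat
print(f"Heat to remove: {btu_per_hour:,.0f} BTU/hr")

# Facility overhead is often summarized as PUE (power usage effectiveness):
# total facility power divided by IT power. A PUE of 1.4 means another
# 0.4 MW, mostly for cooling, on top of the 1 MW of computing.
pue = 1.4
print(f"Total facility draw at PUE {pue}: {it_load_watts * pue / 1e6:.1f} MW")
```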

SPEAKER_02:

Yeah.

SPEAKER_00:

And then on the cold side, there'd be perforated floor, and air-conditioned air would come up and get sucked into the front of the servers.

SPEAKER_02:

Okay.

SPEAKER_00:

So it's just cold air in, hot air out, and you keep exchanging it. At that time, we would run the air through coolers, cool it, and bring it back in. Current computing systems are so dense, so much computing happens in such a small amount of space, that they create an enormous amount of heat, and the current cooling systems use evaporative cooling. They have these giant cooling towers outside the data centers, and as the water gets used, it evaporates. So it doesn't go away in the sense that the natural resource disappears; it evaporates, and the cycle continues, right? It rains and whatnot.

SPEAKER_01:

Okay, I get what you're saying. But we only get X amount of gallons out of Lucky Peak per spring, depending on our snowfall here in the valley, and all the farmers, the ranchers, the whole community live off what comes out of that reservoir.

SPEAKER_00:

Oh, now you have I see what you're saying.

SPEAKER_01:

Now you have Meta and Micron expanding and blowing through it. Even though I know it's evaporating, going back to the atmosphere, how much of that water are they using?

SPEAKER_00:

As much as the local politicians will allow them to.

SPEAKER_01:

So that's what it's going to come down to when we start running out of natural resources. Because what I'm afraid of here, and it's already happening, it's such an incredible area, but Matt Todd with the ranch can deny that we're fucking turning into California all he wants. Love the guy, but it's clear. When's it gonna get to the point where our politicians are doing this, our electrical bills are going up, our water bills are going up, everything starts increasing because these tech companies need our natural resources? How long until we're California, with rolling blackouts again for whatever reason, or neighbors ratting on each other, because we do live in the high desert here? I'm sure if you rewound 20, 30 years in California, you'd never believe you'd be calling out your neighbors for watering their lawn or washing their car. Yeah. Are these centers using that much? And obviously it's not slowing down, it's only expanding. You look at the projected growth for these businesses here, and dude, you're like, where are we gonna get the water for this?

SPEAKER_00:

Yeah, it's a good question. Data centers are very carefully planned and very carefully placed, and there are lots of contracts that have to occur to facilitate the operation: the building, and then the ongoing operation. So you have a bunch of major contracts with utility companies for power and water. During the design phase, you estimate your needs and say, we're gonna need this many gallons per hour and this many kilowatt, megawatt, gigawatt hours of electricity. From what I've read, there's been some underestimating of resource consumption; when a data center actually becomes operational, it consumes more, and the utility companies facilitate the additional consumption, because they're often public utilities, and public utilities are regulated by the local politicians who want that business to be there. So the data center is the demand side, the politicians are the supply side, and it seems like in some cases the demand is outrunning the supply.

SPEAKER_02:

For sure.

SPEAKER_00:

And the citizens who drink that water, or use it for farming, are being left as secondary consumers.

SPEAKER_01:

I think it's gonna happen here pretty quick.

SPEAKER_00:

Yeah. And there's an efficiency thing: with every new major cycle of technology, you start off very inefficient and then you find efficiency. That's happening. But what's also happening is that as you gain efficiency in these AI systems, you gain headroom, and then the developers make the systems bigger and more powerful to consume the headroom.

SPEAKER_01:

Yeah.

SPEAKER_00:

So right now we don't see the efficiency show up in the utilities; we see it show up as ChatGPT getting smarter. Got it. Now, if AI enables us to get to nuclear fusion faster, or to other more efficient forms of power, that helps, but you still have to cool these things. Some data centers are being built in places where you can actually submerge them in cold water: you take a very large natural body of water, like a lake or an ocean, and you submerge the data center. That's been worked on for a long, long time. I saw prototypes of it around 2005, 2006 that Microsoft was doing, where you seal up a shipping container full of computers, put it in a body of water, and use that.

SPEAKER_01:

What's that do to the ecosystem?

SPEAKER_00:

I don't know. It'd be interesting to see how much heat that box would emit and what it would do to the local ecosystem around it, and how big a body of water has to be to be affected or not affected. I don't know. But we've got some interesting times ahead of us. Yeah, for sure. And to bring it back to AI, to sum it up: don't let kids use AI alone. Yeah. One of the most important things adults can do is learn AI, test it, and find its boundaries constantly. Yep. Because you have to teach your kids how to navigate it, if not now, then when they do become old enough, and the only way to do that is by doing it yourself. Nobody else is gonna teach them the right way. Now, how feasible is that? Not very feasible, but we still have to try.

SPEAKER_02:

Absolutely.

SPEAKER_00:

And just demonstrate, show our kids that it's okay to be bored, and show our kids that we control our technology and it does not control us. I love that. That is very, very important. I love that. That's gonna be the title of the episode right there. There you go.

SPEAKER_01:

Write that shit down. I like that. Well, dude, I don't know if you got any saved rounds that parents need to know, but I feel like that was a great conversation. Hopefully we save a kid from that, or two or three, or a hundred.

SPEAKER_03:

Yeah.

SPEAKER_01:

Like always, man, if you find any new stuff and you need to come on and use my platform, it's yours. Use it whenever; you're doing a ton of great work. Thanks. You've got a lot of really cool things in the pipe that you're working on with Chris Henson. Chris Hansen. Chris Hansen. Yeah. Dude, you've got a friendship going with him; please, please, for the love of God, plant the seed. I would love to get him on the show. That'd be really cool. So many questions.

SPEAKER_00:

Because he would have some wild chaos stories.

SPEAKER_01:

Bro, I was obsessed with his show. Oh, cool. All the way up until they caught that judge or politician. I think that's what ended his show originally. The guy that killed himself.

SPEAKER_00:

He was like an attorney general or something.

SPEAKER_01:

Yeah, and they had the whole SWAT team, everybody, all the cops out front, and they ended up hearing the shot go off. That was, I believe, his last episode for a while, and he was kind of like, I'm surprised it got that far. That's why these kids out here now doing their own catch-a-predator stuff, which I a hundred percent support, just need to do it a little smarter, because it's dangerous. It's very dangerous. Especially when you catch these dudes; that's fight or flight for a lot of them. They're lucky that a lot of them are just so demonic and pathetic that they don't know how to fight. True. I mean, these guys are beating the shit out of these grown-ass men, and they're just cowering down. Clearly they have a demon or whatever. But yeah, I would love to get that dude on the podcast. It would be like a three-day show just hearing the behind-the-scenes and the war stories. Story time, yeah. So, dude, I'm super excited for you, as always. I appreciate your time. You're doing a ton of good things. For everybody that's made it this far: Ben the Family IT Guy. We will get everything linked. If you have any questions about your children, protecting your children, how to set things up, you offer damn near everything we talked about today on your website. There's a chat on there where you can search anything, and if people can't find it, I'm sure they can reach out to you directly at the family IT guy; I think there's an underscore in between each word. Yeah. And you're very open to everything, you get right back to people, you're doing a lot of good stuff, and I know you're just as passionate about it. I don't want to say more passionate, but you've dove in; that's your world. Yeah, I've made it my thing. Yeah, helping kids, and I'm super proud of you, dude. And you've got some cool things coming out to help kids and families in the future. We're using some programs, so yeah. It's an interesting world we're all in as parents, especially our generation of parents. We're learning all this at the same time, and we gotta help each other out. Even if we could just put out a basic episode. Those were some good questions; obviously, parents are thinking, which is great. It was a lot of different questions this time around, a lot of similar ones too, and dude, we even had some grandparents reaching out. The fact that grandparents are reaching out is great. Yeah, that means they're thinking, they're aware of the environment their grandchildren are being raised in, especially with tablet kids being a thing these days. It is a good sign. So, as always, bro. All right, thanks, man. Till the next one. Appreciate you, dude.

SPEAKER_00:

That was great. We got it. We covered some stuff.

SPEAKER_01:

I think that's the best one right there.