nextTalk

AI and your Kid

January 08, 2024

You hear about it in the news, from your kids, and often linked to warnings about the latest scam. What exactly is AI? What do we need to know to protect our kids? Kim talks to a tech expert who breaks down how we can understand AI, conversations to have with our kids, and what's on the digital horizon. 


KEEPING KIDS SAFE ONLINE

Connect with us...
www.nextTalk.org
Facebook
Instagram

Contact Us...
admin@nextTalk.org
P.O. BOX 160111 San Antonio, TX 78280

Kim:

Welcome to the nextTalk podcast. We are passionate about keeping kids safe in an overexposed world. It's Mandy and Kim, and we're navigating tech, culture and faith with our kids. I have to tell you, Mandy and I have fielded so many questions over the years about AI, especially in the last year. It has just garnered so much attention in the news, and parents are just curious about it. Now, Mandy and I did do a show, AI and the Metaverse, but it was me doing research and not really knowing the details of it. I was coming more from a parent perspective, which is good, but we knew right away that we wanted to talk to someone who actually knows this industry and what it means for parents. And since AI is always changing and we're always getting questions, this is a show we wanted to do right off the bat. Anyway, I'm super excited to have our guest on today. His name is Marshall, and he's going to tell us a little bit about himself to kick it off, so you know exactly who we're getting to talk to today.

Marshall:

Hey, thanks so much for having me on. So it's really been fun to watch the AI industry grow and a couple of months ago I got to go to y'all's conference here in the Dallas area and my wife and I both really felt strongly that, since technology is always changing, this is a ministry that we wanted to pour into, because even in that one conference y'all poured a ton into our lives, into our family. So thanks for having me on.

Kim:

Yeah, that's awesome. I'm so glad to hear that. I know you spoke to Mandy after the event and she was just really impressed by you and your passion to be so helpful. So tell us a little bit about your background and your family.

Marshall:

Sure, Kim. So I have been writing software since I was 12. I actually started out using sign language, writing some software with that that helped kids in an orphanage in Africa, and so I've kind of been passionate in that area of helping people with whatever the latest technology is. I've worked in churches, I've worked in the secular market, and always pushing the boundaries: some of the first churches to do live streaming on the internet, back when people thought that, you know, the internet was demonic or something.

Marshall:

We were out there pushing the gospel on that platform, helping run it, but also even building some of the software that underpinned some of those broadcasts. In addition, more recently, I've been working in the virtual reality space, and I'm sure there are questions about how all of this stuff fits together. These latest buzzwords, I've been in a lot of them for various reasons. We have a small business that really tries to use technology to earn money as well as to specifically pour into the kingdom as our ultimate goal there. So everything we do really is about pushing the technology boundaries, because that's fun, but also using it for a purpose, which is to advance the kingdom, whether directly or indirectly.

Kim:

I love to hear when people are able to use their passions and their God-given gifts to further the kingdom. I mean, I think that's really what we're all supposed to do, right? Yes. You're a dad too, right?

Marshall:

Yes, so I have three kids. They're all in the younger grades. We do homeschooling, and that really gives us an opportunity to pour into their lives with Bible and all those regular subjects, but also even seeing how we can use this technology for younger kids, keeping them from being addicted to technology and not just being consumers, but also using it to distribute something that glorifies God. It's exciting to see that, where these kids each have their own unique talents, but God puts them with parents who can understand them, so it's really great seeing how their skills are developing with that. And of course, as technology changes, the things that are here now weren't there when I was a kid or when my parents were kids, and so God has put us all in a unique position in life, a timeline on a world scale. It's just really exciting. I love studying that, so it's great to see how it relates even down to my specific family.

Kim:

I love that. I love getting to talk to someone who has both perspectives, that we're looking at the parenting side. Well, really three: the parenting side, the faith side and the technology side. That's a hard combination to come by. So I'm excited to jump right in with some of these questions that I have, our team has, and that we've heard from parents. So I think we need to start base level, right? We need to know, really, how would you define AI and VR, and then maybe we can move into how those things are connected.

Marshall:

Sure, okay. So AI and VR, they're different. So I'm going to start with AI. It's artificial intelligence, and I promise, the only time I'm going to say it here: it's a bunch of numbers, so I'm not going to go into all the math.

Marshall:

This is not an advanced engineering degree or programming course where we're going to talk about all that stuff, so don't zone out here. Basically, it's trying to mimic how humans think. It's kind of a better version, or at least it claims to be, or the goal is to be a better version, of something like search, or creating an image, or even something like a health diagnosis or a legal document. It's trying to do that using all the knowledge out there. And some of these things are cross-cutting, just like when your kids learn how to put blocks together. That helps them later in life; there's no job out there that's putting blocks together.

Marshall:

It's not direct, it's an indirect way of learning about a future skill. And so with AI we're kind of doing that same thing. We're saying, here's a bunch of knowledge in these various areas, now go apply it to some other application. It has a whole new way of addressing things that people are still wrapping their heads around. AI can be overwhelming or confusing because of that, because of how unknown it is, but I think that's where we can come in and help understand it more.

Kim:

Okay, I think I got it. It's taking the information and applying it to all different types of applications, whereas normally computer programs are very direct, I guess very specific, whereas AI seems to be able to gather information and apply it beyond that one thing you're asking.

Marshall:

Yeah. So my wife is visual, so she's always like, stop talking and show me a picture. Well, we don't have visuals here, but I can at least paint a word picture. So with artificial intelligence on images, one of the things you can do is background removal, or, you know, removing an object. And because it has seen a whole bunch of faces, it knows that if there's a tree branch covering half of someone's face, it can kind of guess what the other side of that person's face looks like, just like you might do. That doesn't mean that it really understands faces or that it understands, you know, human anatomy. It's just kind of saying, well, the pixels on this side match the pixels on that side, and it comes out.

Marshall:

So if you look at a lot of the early models from last year, there were all these photos of people where the photo looked great from AI until you realized the person had seven fingers, or two of the fingers were mapped between different hands, or, you know, there was a thumb on each side of the hand, because it didn't know what a hand was. It just was like, hey, let's draw some shapes here. So there's an element here where we have to remember what AI is. I don't want to give the idea that it's only for evil. But, you know, God is creative; Satan is an imitator. Same thing here. Humans, we're still creative. That was put inside of us, and AI is still an artificial version, an imitation. It doesn't truly have this creative ability to build something out of nothing. It has to start with everything and boil it down to something.

Kim:

So that really helps me. I mean, when you said it can't come from nothing, it has to come from something, and we as humans are the creatives created by our Creator. And so I can also see the flip side of that, and why I think so many of our parents, including myself, are concerned about AI. I mean, there's all these stories in the news that we'll touch on, but, you know, correct me if I'm wrong, that creative force could be from someone who has an evil intent.

Kim:

So when we have someone who has an evil intent who is pouring into the AI world, basically, that's where we're seeing some of the stuff that is so scary for parents.

Marshall:

Absolutely, and some of the early adopters of this technology are a bunch of lonely nerds, at least that's the stereotypical version: a bunch of people who are smart, who can think a lot. A lot of those people don't have a lot of great human relationships because they've poured into computers, or, just because they don't have great relationships, they pour into computers, and so you do see a large percentage of those people looking at pornography or things like that. But then again, you know, look at sports, look at any other industry; so many of those people struggle with addictions as well. It's not that the people making this technology are more evil than others. They just happen to be building something that's a force multiplier through technology. One of the big things that we can look at here is that artificial intelligence, as you highlighted there,

Marshall:

is a multiplier of someone's intelligence. If someone intends to use it for good, most likely it's going to come out with good results. If someone intends to use it for evil, very likely it's going to come out with something that is destructive. Good people are using it to make better products, and even using AI to protect others. So while we may think, oh, AI understanding what pornography is or understanding what an evil text is, is bad, it can also be used for protection. People are using filters every day that use AI to detect pornography, to detect lewd behavior, to detect even aggression and tone of voice in text, you know, if you're writing something passive-aggressive, or down to flat-out aggressive, using, you know, anti-Semitic language or whatever, even trying to substitute letters to make it not be English but kind of read like a fake word. Using artificial intelligence, they can detect that stuff much more accurately and filter with nuanced logic. Like now, where I used the word anti-Semitic, I'm not being anti-Semitic, I'm talking about it, and so the AI can somewhat understand intent, if it's been programmed to do that, and help protect people.

Marshall:

I believe a specific case of that is Apple: they have, at least as something that can be enabled on their phones, the local, on-device ability to detect sexual content in images. So not only can it protect when you take a photo, and say, hey, wait a minute, on a child's account, are you sure you want to send this out? It's been detected to violate that filter. The same thing with messages coming in: it can try to detect those as well and see if those images may be something that they shouldn't be looking at. So it's using those filters here, obviously, for good. Someone could just as easily turn that filter around and only allow the bad stuff through.

Kim:

This is something we talk about all the time at nextTalk, in terms of anything can be used for good or for evil, and that's why the conversation with your kid is so important. And I think this is a great example. You said a couple of times "sometimes" or "mostly," and I think that's key: this AI development, as it's able to better understand the use of words or images, that's pretty incredible and can be used for real good, like the filters on the phone. But it's not 100%, and so we still have to rely on that relationship with our kids and explaining what AI is, the good and the bad of it, because things can still get through, and we've seen that a lot in the news with all kinds of things happening, and I do want to talk about that. But we did begin the show saying we're going to explain AI and VR, so maybe we can jump over to what VR is and then how those two are connected.

Marshall:

Okay, absolutely. So VR is virtual reality. A close cousin of that is augmented reality, or AR. So we have AR, AI, VR; it's easy to confuse all these. Virtual reality is the goggles that go on your face.

Marshall:

You know, you can't see out, and you're immersed completely in a different world. With augmented reality there's kind of some variation here. It goes anywhere from goggles, like the ones Apple's getting ready to release this year, that cover your vision but have cameras that look out, so you feel like it's just glasses, all the way to the Snapchat guys. They have glasses that are completely transparent, but they have cameras on the side that look out and can take photos, or I guess snap photos, all the time and be looking at things. Meta has the same thing as well, with some glasses as well as the goggles.

Marshall:

So there's kind of a merging of the technologies at different price points, different things they can do, and people are still trying to figure out how to use those.

Marshall:

One of the things that I read about in the last couple of days is that Meta and Microsoft and some of the big tech companies are looking at merging artificial intelligence with VR, and the way they're doing that is taking the goggles, which are physical hardware, and running these AI models on them.

Marshall:

So they've trained it on something like, with the glasses, being able to look out and see a sign and say, oh, this is a, you know, a pedestrian crossing sign or a stoplight. For someone who's maybe vision impaired, it can help and speak to them, and when you're walking through, you know, New York City or Chicago, it can say, there's traffic on the left, your light is green, do not walk, those types of things. Or even bringing those models forward to say, maybe, if you're autistic and you can't read people's facial expressions, can that model once again be trained on a human face and know, is this person happy, is this person annoyed? Maybe it's trying to understand body language and say, okay, they're starting to back off, or they're getting hostile to you, and be able to communicate that to someone who's blind or autistic or something like that.

Kim:

My son, who's 14, was telling me just the other day. He showed me this little clip of a guy wearing some of those glasses, and he was able to look at a plant and say, what is that? And I guess the AI explained to him what the plant was, if it was dangerous, you know, as far as being poisonous. And I think that's what you're talking about, like merging the two worlds together to add the intelligence to the visual.

Marshall:

Yeah, absolutely, and there's so many ways that this can go. It depends on what people respond to, what types of things sell, and the things out there that people focus on now, that are doing a lot, that's where they're going to try to insert AI first. So it's up to us as Christians, I guess, to kind of shape that market and say, here's where our demand is. Is it to make better animated characters for a Bible story, or is it out there to promote some ideology of the world? If they use it for the world, let's take that same technology and apply it for Christian principles, rather than running away and saying, oh my gosh, this technology is evil. Remember, it's not evil; it's just how it's being used that can be evil.

Kim:

It's so easy, and I've done it myself, out of fear. A lot of times that's the first place we go as parents. When we see something, we're like, oh my gosh, so many bad things could happen. Get rid of it, get rid of the phone, get rid of the VR, get rid of the AI, whatever. And we push it away, therefore making our kids sometimes even more curious, because we're saying no, no, no, but it's out there in the world, instead of learning it and talking through it with our kids about how it can be used for good or used for things that we value. I love that you said that, and I think, as Christians, that's one of those call-to-action moments that sometimes we're afraid to step into, but it's so important that we have a voice in this world that our kids are growing up in.

Marshall:

Yeah. A totally anecdotal story: at my house a couple days ago, on Christmas Eve, the lights burnt out. We have lights around our house; we do it for like every holiday, and we love to light it up, and people see our house. It's really prominent in the community, and everyone can see it, and they love the lights, they love to see what I do with them. And so that's kind of like a little witnessing moment, because then, when they see us in the neighborhood, they're like, oh, you're that house, and we have a moment to share, you know, just a tiny bit of the gospel. And on Christmas Eve, the lights went out. It was raining, and so they went out, and I reset things. It ended up that one of the power cords caught on fire, and there was this huge fire outside.

Marshall:

After that, obviously, I'm, you know, a little stressed out about the moment, and I told my wife, I was like, you know, I kind of just feel like I want to turn off the power in the house and step back and not turn on the electricity ever again.

Marshall:

And she's like, and what would you do if you did that? And I was like, well, we'd have to get out the oil lamps, which probably are a bigger fire risk than the electricity would be. And so, you know, it's obviously an overreaction, and pulling away from electricity isn't going to solve my problem of a power cord catching on fire. What will is to maintain it well, to take care of it, and to understand that that cord had been out there too long in the yard and was starting to decompose. It's not to say electricity is dangerous. If it's maintained well, those cords are going to transfer that power through them that's going to, you know, energize the house and let us do things that spread God's love around the community. So the goal isn't to run away from it, but to maintain it well, to grow and to understand it and to use it properly.

Kim:

That's a great story. High five to your wife. That's so cool. Well, okay, so I'm glad we were able to establish what AI is and VR and how they're now kind of being merged together. I think most parents, you know, and it is a generalization, but I think most of us want to know what's happening in the AI world and how it's being used now to trick kids, or how scammers are using it, things that are happening now or on the horizon that we need to be aware of, so we can talk about it with our kids and keep them safe.

Marshall:

Yes. So one of the places where our kids are the most is school, and there's a lot going on there that we need to be aware of. I know you sent me an article earlier about a school, and the article talked about how the school was implementing policies, or they planned on taking care of it with policies. And as deep as the article went, and I've seen several of these types of articles, all they say is "policies." They don't go into specifics, and I think the schools and other people are a little bit unsure of what to do. I think the thing is, as parents, we can ask for specifics and not expect AI to be this magical thing that can be solved. You know, kids have been passing notes in school, they've been drawing crude pictures or, you know, laughing at kids ever since school has existed, ever since there's been kids. So we're not going to solve it when we get to AI any more than we have with any other piece of technology. So I think the key is, you know, if you're concerned about something like this, to ask the school what happens, you know, discipline-wise, if something happens in the school. What is that policy? And, obviously, are you going to follow through with that outside of school? What about the legal system? What does your community, your state, you know, the federal government getting involved, what can they do to help take care of the situation?

Marshall:

For AI, most of us have probably heard of what's called a deepfake: a photo that's been modified, that's not real, that's been completely computer-generated based on some person. That type of technology, I mean, we've had it since Photoshop, or even before then, with being able to manipulate images. And so I think the thing is to ground our kids in truth. Based on their age, obviously, is how you talk to them. But if a situation like this has happened, where someone has Photoshopped or used AI to modify their body, put them in a compromising position or something, it is to say, look, did you do that? If you didn't, if you didn't take off your clothes, if you weren't in that compromising position, then that's the truth, and this image is just a lie. It's not you; it's a fake image. So we want to teach our kids to ask questions, to dig into it, to not just react, and not dismiss their feelings, but also teach them to be grounded in that truth of what the situation really is. The deepfakes, it's crazy how good they've gotten.

Kim:

And just to expand on that a little, there have been a lot of, you know, reports in schools where they have taken a child's head image and put it on a naked body, or, you know, even into a video, a pornographic video, and then distributed it. So for the parents that we've walked through this with, or some version of this, I love that you are suggesting detailed conversations with the school: how will you handle this if...? Which tells the school that you know what's going on, you are not putting your head in the sand, that you are aware and you want to be a part of the solution. Having these conversations with your kids about, here's what people are capable of. You need to know that ahead of time, you know, when you're taking a picture or when you have a social media account.

Kim:

These are things that can happen. But on the tail end of that, one of the things that has been really powerful is, parents, you really do need to say, no matter what, we will find a solution. And sometimes that solution is not convenient, or it's not something that we want to do. But if we are on our kids' side, and we know how deeply, at those ages, especially tweens and teens, how mortifying it can be to have something happen in your social circles, or your whole school seeing something, and it affects them greatly, you know, we're talking about depression and suicide and all these things that come alongside of that, we want you to take that time to tell your kids before anything happens: hey, I'm on your side, no matter what. If something like this ever happens, we'll figure out a solution, even if it means homeschooling, finding a new school, whatever. We will figure it out. Never think that you are hopeless or that we can't figure it out together.

Kim:

Kids need to hear that, and sometimes parents don't think to say that until after the fact, when their kids are already struggling. So I really want to encourage parents today, with this rise of things like deepfakes happening at younger and younger ages: have that conversation with your kids, and have it more than once. And it doesn't have to be a sit-down deep thing. It's a, hey, I want you to know, I know what's happening out there, here's what I've heard about. You can even use nextTalk and say, I heard it on this podcast with Marshall and Kim. I just want you to know we will figure it out. And pair that with their identity, who they are, which is a child of God, and that is it, period. Like you're saying, Marshall, that is so important, that they have those two pieces to hold on to so that, if something happens, they've already got that as the groundwork to stand on.

Marshall:

That's really well said. I think, even beyond the family talking to them as parents, it's having that extended circle of people that you can trust, your village of people, whether that's childcare workers or youth pastors or someone in your church, other people in the church that aren't even the clergy, but other friends and especially adults that they can trust to go to if something like this happens. You know, if there's something like this, where there's a deepfake of someone, and they're not comfortable going to their parent first, they need to go to one of these other adults who can, you know, tactfully handle it, so that they can get up the courage to go to those next steps and talk about it with their parents and the school district and all those things. There's still places where, because this is a new technology, sometimes it goes off in left field, just like your toddler.

Marshall:

You tell them, you know, it's time to go, it's time to go, it's time to go, and they start walking towards you, then they turn around and go get a toy and start playing. They just get distracted, and sometimes these AI models do the same thing. There have been instances where someone has said, "I feel like I'm going to kill myself. What do I do?" And it's like, here's how to get a piece of rope from, you know, the store to hang yourself.

Kim:

You know and like.

Marshall:

That's not what I was asking for, so you've got to watch.

Marshall:

You know, obviously, when those things come up, they look at it and say, okay, how can we fix this to not do that? But then someone else comes in and says, you know, I'm struggling with an eating disorder, and it gets confused, and it's like, eat more food, or eat less food, and that's the opposite of what you need to hear. Just like if you misheard someone say, you know, "I'm eating too much" or "I'm not eating too much," you can misunderstand someone easily; the AI models have the same problem. And so we need our kids especially, but even us as adults, to not turn to these models to truly change your life. Look at it with skepticism, that healthy amount of skepticism, making sure that it compares to other sources, that it's accurate, and don't use it when you're in a vulnerable spot.

Kim:

That is a great tip. That's something that kids can hold on to. That's very practical: don't use it when you're in a vulnerable spot or mindset. I love that. And don't use it as your source, or sole source, for things, because, like you're saying, I mean, those were some great examples, and you even shared with me before about some sexuality things that were answered for minors, like in a ChatGPT kind of situation. Can you share a little bit about that?

Marshall:

They're trying to block that type of content, you know, "what is sex" or some variation of those terms. But ask it to explain this, explain it like a haiku or a poem or in certain ways, and it'll come back and do that, because it's saying, okay, I'll take what you asked me. Or you give it some details of something and say, give me a summary of this, write it as a poem, something that a kid might be more susceptible to believing or taking to heart. We process songs differently than we process words, and so, just as we don't let our kids necessarily listen to the explicit rap that is available on audio streaming, we also don't want to let them listen to something like this, because it's going to go into their brain a different way.

Marshall:

You know, at the same time, you could possibly use it to take, you know, here's how God designed our bodies to work, you know, in the context of marriage, and put that to a poem, and it can come back with this maybe beautiful poem, maybe really cheesy, whatever that is. If it works out, and you modify it, you could have something that maybe does impact you. I think the thing is, you know, if someone has bad intentions, or someone's naive and happens to ask a question like this, it's going to use that, regardless of the subject matter. It's going to try to fulfill that request.

Kim:

You know, one of the things I talked about, you know, in a reel and then also on our show, was my son came home and had heard some stuff about AI and wanted to use it to help plan our vacation. And at first I was like, no, what? No AI. Of course, that's where I wanted to go first. But instead I was like, okay, let's learn it together, let's look at it. And it was really cool. We were able to put in the details of what kinds of things we like to do, the ages of our kids, where we were going, and it put together this super cool itinerary.

Kim:

But, in that same thought, the conversation I had with him was, imagine if you had not told me, and you saw what it could do, and then you asked, what could a guy and a girl do in this situation? It could have generated all kinds of things that may or may not be true or fall under your value system. And so it created good conversation there. But it also made me realize the good and bad sides of AI, and I think it's something we're always going to have to deal with.

Marshall:

I don't know what site you used for this, but there's so much stuff now. High definition, if you remember, you know, a decade or so ago, everything was "high definition" because it was the buzzword. So now everything is "AI" because it's the buzzword. You know, high definition sunglasses: sunglasses aren't a TV screen, so they can't be high definition. In the same way, there are some things out there that just really aren't AI, and they're being marketed that way.

Marshall:

The chances are the only way they're making money is by selling the information you type in or get out of that system.

Marshall:

So, you know, do you have to be worried about scammers as an average Joe? Most likely not, you know, as far as being targeted by some scammer for some high-value situation.

Marshall:

But AI here can kind of help bring that level down and say, okay, it's no longer just presidents, Fortune 500 CEOs and, you know, congressmen that we're going to target. We can now use AI to sift through a lot more information. Plus, if you're handing it to them in this app, you know, if your son typed this information into some disreputable app, what's to keep it, potentially, from saying, okay, so they're going to go on vacation at this time, they said they're going to leave from this airport, and they want to go for seven days? And they're able to look back, and, you know, maybe it uses location history, maybe it's something you typed in about which school you go to in a previous chat, being able to use all that and kind of put it together and build a profile of this person. What's to prevent them from breaking into your house, you know, while you're on vacation?

Kim:

Okay, so, I'm so glad you said that. I did not think about that, and it's such a good point that AI takes all the information and brings it together for a certain purpose that may not be the one that you intended. So, parents, this is another great conversation piece: remind your kids that maybe this one time they type in something about their school, and we always say, don't share any personal information ever, but let's say they're not thinking, and they put in their school, and then this other time something about the vacation, and this time something about, you know, whatever it is, maybe their hair color, and AI can bring all that together and create a profile for people who intend to do bad. Your kids may not be thinking in that way, and so this is another great point to talk with them about, and I'm so glad that you brought it up, because it leads me to this scam that we had heard about some while back, but I've heard it a few times now, where scammers are using AI to trick parents into believing that their child has been kidnapped.

Marshall:

Yes.

Kim:

And I don't even understand how that works, so maybe you can explain it. But again, collecting data to manipulate in this situation.

Marshall:

So I'm a skeptic of a lot of things, so I'll start with a quick disclaimer: it's been possible for years to mimic your child's voice, because the reality is, get any kid to scream "Mommy!" and you're probably going to think it's your kid on the phone. So a healthy amount of skepticism has always been applicable. But with that said, we are now entering the age where AI can be used to synthesize voice. Take the popular show Adventures in Odyssey from Focus on the Family. One of their main actors passed away two years ago or so. Their latest episode still has his character's voice in it; they're using a different actor to play him, but they're using AI to synthesize the voice over that actor's performance. And the reception online, from what I've seen, has been overwhelmingly positive: while maybe it's not perfect, it's still pretty good. Right now there are systems out there, and for the most part the models that allow you to do that are being locked behind paid contracts. Running these models makes the corporations a lot of money, so they're extracting that value by charging, but the paywall also helps prevent the average Joe from just using a model to make a fake video of a president or another actor. There are a lot of voice actors out there who would be very mad to see their voice cloned by some average Joe, so there's some good pressure, in this case from Hollywood, saying, we don't want other people doing that, please lock the models away. But it's only a matter of time before it becomes accessible to the average Joe, at least at some level of quality.
So with that, if I remember, even back in the Nazi concentration camps, they would take people into interrogation rooms and try to get them to talk and answer very benign questions. Then they would go back and splice the tape with an X-Acto knife, taking a "Yes, I agree" and a "Please don't" and putting them into a different interview: "Do you want to disown your family?" "Yes, I do." "Do you ever want to leave this camp, or are we treating you nicely?" "No, I don't. It's great here." He never said that. He said those words, but he never said that. And there they weren't even synthesizing voice; they were just re-piecing it together. So whether AI is synthesizing a voice it heard or just trying to mimic that child, it's going to get better.

Marshall:

So I think the answer to "what do we do with that" is to always verify. A very first step, if this person supposedly calling you from an unknown number says, "Hey, it's me," claiming to be your child, or a parent, or an aunt, is to say, "Hang on just a minute," and call their actual phone. Now, obviously they may be away from their phone, or, based on the travel itinerary, you know they're up in the air on a plane for the next six hours so you can't get hold of them, but someone's claiming they missed their flight and are now stuck and need $5,000 wired. So there may be times that make it harder.

Marshall:

One thing that would be useful: a guy named Johnson at the FBI shared some tips about this. He called it a family password; I would say a code word or something like that. Going back to Adventures in Odyssey, they used a code word in there: "applesauce." So don't use that one. But that word was kind of a secret that would let people know something was actually authorized. So as a family, it's useful to have a code word. Anytime it's used, make sure you change it to something else. If the kids tell the babysitter, change the word; if there's a situation and they use it, it's time to change the word again, so it stays a secret password.

Kim:

That's great.

Kim:

We we have encouraged that also.

Kim:

It's something we've always had in our family, and we've only had to use it once. It's scary when it's used, but also comforting in a way, because you have this way of verifying: one, that it's really that person, or two, telling your parent, "I need help now." So we use it for those two reasons.

Kim:

And something I would add, which I think a lot of us haven't thought of, is that our elderly community is similarly very susceptible to being scammed. If I got a call saying my kid was kidnapped, I may have more of the tools or the knowledge to say, let me verify this. But if "my kid" called my mother, his grandmother, she may be more susceptible to falling for it, because she wasn't raised in the digital age and may not know all of these things. So have that conversation with the grandparents: maybe have a special word with them too, or make them part of that extended family emergency password, because our elderly are definitely being targeted with scams and they need to know how to protect our kids and themselves as well. So those are some great tips. I appreciate that.

Marshall:

Yeah, there's family drama all the time, so I would suggest that those passwords are kept within the smallest group possible, just like you don't share passwords among different websites. Same thing here: if your kid is at your parents' house and they're feeling unsafe there for some reason, they need to be able to use the password without raising suspicion that it is the password. But it depends on your family. How good are your kids, or your parents, at remembering multiple different passwords? How well can you keep up with which password is which? Wait, did you say applesauce, or did you say the other word? You don't want to confuse yourself in those stressful moments, but having something like that to help verify is a good thing.

Kim:

I agree completely. Okay, so this has been a lot to process. I guess my last question for you would be Are there things that parents need to know, or things on the horizon or anything about AI that we haven't covered today that you think would be helpful as parents?

Marshall:

Sure. On the question of what's coming: it's only going to become more prevalent in our society, just like computers, just like cell phones and other technology. It's here to stay in some form or another. I think the key is teaching your kids how to use it as a tool and not as a crutch. For instance, if they're struggling with math, don't let them use it to solve their math for them, but there are AI tools out there that can explain a problem. So if you sit down with them, you're working with them, and you type in some algebraic equation, and it breaks it down and says this is what's going on behind the scenes, they've now learned how to use it as a tool, and they've learned the math a little bit better.

Marshall:

If your kids are more isolated, watch for this. We've had video games with NPCs, non-player characters that are virtual and maybe use AI to run them, for decades now, and that's useful for making a game more interesting. But it can cross over to where they're using it as a virtual friend, where they're disclosing their deepest, darkest secrets to it, whether by audio or text, something that can chat and talk and respond and look like you're in a video chat with someone real, but it's just a game character or some sort of simulated friend. Obviously there's the sexual side of that, which should be a clear, obvious line, but even on the more benign side, you need to make sure they're having healthy relationships with actual humans, and even if they're struggling, that they have very strict limits around it.

Marshall:

You can use AI as a cool party trick. I've made some fun things, like AI-generated Bible lessons for Sunday school, which is a cool trick. I wouldn't go and use that verbatim for Sunday school, but sometimes it's fun just to brainstorm and think outside the box. Using it as a brainstorming aid might be acceptable, but depending on it to do your work is most likely not.

Marshall:

Something I didn't mention about how AI works: the term is "hallucination." If it's not trained on something, it doesn't know about it, but it may think it does. It's like the toddler who thinks they know everything and just babbles on about nothing. The more generally known a subject is, like, oddly, the Bible, where there's a ton of Bible studies out on the Internet, the better it does; there's actually a good amount of output that draws interesting conclusions that are at least not completely far off, though you also don't want to rely on it as your only source. But on more nuanced things, like your high school in a small town, it may not know enough to tell the truth, even though it gives an answer that sounds just as sure as an answer on any other subject. So make sure your kids understand how to trust it, when to trust it, and when maybe not to trust it as much.

Kim:

Well, that makes me think of a good one-liner for your kid: it's knowledge, but not necessarily truth. It's information, but not necessarily to be trusted. I think that's a really important distinction for your kids to understand.

Marshall:

I think, going forward, you're going to see AI used in a variety of careers. Whether it's with augmented-reality goggles saying, oh, that pipe looks like it might be corroded, or the pressure is off on this line, helping you do your job better; or more on the creative side, somehow fueling that; or even on the legal side, or healthcare, where it's taking huge amounts of data and helping to distill it down so you can make a better decision. Even there, I think they're saying that for certain MRIs or CAT scans, the AI is already better at spotting tumors than an actual doctor. And if that keeps up, you still, as a doctor, need to know how to diagnose by hand, but the computer can look at millions and millions of these scans and use all that context. Even on a CAT scan that wasn't for tumors, that was for some other infection or something, it may notice a tumor the doctor wasn't looking for.

Marshall:

So teaching your kids to understand that it's just a tool will allow them to harness it in their careers, when they become doctors or lawyers or any other kind of professional, to use it to help, in our case, promote the gospel and keep people healthy. But don't use it as a way to slack off: well, the AI can kind of do it, so I'll just sit back, double-check that it looks right, and not really pay attention. God wants us to be proactive, and it says the worker is worthy of his wages, so you still need to work, and you still need to have discernment. And then also remember that truth there: God isn't about "mostly true." He is the way, the truth, and the life. So we want to always be seeking that, and never settle for letting the computer, whether it's AI or anything else, tell us what the answers are.

Kim:

You know, that brings us back to what we've said a few times here: whether good or bad, it's still a tool, and it's all about the conversations you're having with your kids. God has given us these kids, and it's our job to train them up and show them how to operate in this world as believers, no matter what is thrown our way. Because AI is today, but tomorrow it'll be something else, and it'll send us into a tailspin again of "oh no, this is worse and horrible." But the answer is still relationship, with Jesus first, and then bringing that into your home with your kids, being open, letting them ask all the hard and weird questions, and walking them through it.

Kim:

So, gosh, thank you so much. It's been so great to get your perspective, insight, and knowledge on AR and VR and AI, all the acronyms. Obviously I'm not an expert, but this has given me some really good, practical things I can say to my kids, and I know it's helped our listeners too. We really appreciate you being here.

Marshall:

You're welcome. Thanks for having me.

Kim:

This podcast is ad-free because of all the people who donate to our nonprofit. Make a donation today at nextTalk.org.

Marshall:

This podcast is not intended to replace the advice of a trained healthcare or legal professional, or to diagnose, treat or otherwise render expert advice regarding any type of medical, psychological or legal problem. Users are advised to consult a qualified expert for treatment.

Understanding AI and VR in Parenting
The Merging of AI and VR
Protecting Children From Online Harm
AI's Impact on Privacy and Security
AI and Virtual Relationships With Kids
Insight on AR, VR, and AI