
The Corey Podcast
Welcome to The Corey Podcast—where no topic is off-limits, not even the ones lurking in the shadows. I’m Corey Strong, your host, known from Surviving It, Being Happier, and Frankly Speaking. Here, we dive into raw, unfiltered conversations, and I share my unapologetic takes—some refined like fine wine, others with a bite you won’t forget.
If you’re a truth-seeker who isn’t afraid to question everything, you’re in the right place. Let’s get into it.
New Episodes Weekly. Please rate and review the podcast if you enjoy it!
The Future Is Now: William Welser IV on Tech, Trust, and Power
In this episode, Corey speaks with William (Bill) Welser IV, CEO of Lotic AI, about the importance of data privacy in the age of technology. They discuss the concept of Lotic as an 'anti-tech' company that prioritizes user agency and understanding over profit. The conversation delves into the implications of AI on mental health, the need for individuals to take control of their data, and the potential dangers of relying too heavily on technology for cognitive tasks. Bill emphasizes the importance of maintaining friction in society to foster growth and creativity, while also advocating for transparency in how AI systems operate and use personal data.
For more on William Welser IV and Lotic AI, check out the links below.
https://www.instagram.com/lotic.ai/
https://www.facebook.com/loticai
Email: talk2us@lotic.ai
00:00 Intro: Names and Identity
00:45 Excitement and Energy
02:57 Dinosaurs and Human Technology
05:58 Anti-Tech Tech Company
09:05 Behavioral Science and Technology
12:07 Data Ownership and Privacy
15:09 Empowerment through Personal Data
17:39 Understanding Privacy in the Digital Age
20:36 The Value of Personal Data
23:20 Awareness and Digital Security
26:17 Opportunities for Change
30:44 Musical Foundations and Personal Growth
32:19 AI and Mental Health: Uncovering Our Truths
36:57 Privacy in AI: Trust and Data Security
39:34 The Role of AI in Emotional Labor
43:12 Cognitive Offloading: The Impact of AI on Learning
47:19 The Dangers of AI Dependency
51:43 Encouraging Curiosity and Innovation
56:06 AI Literacy: A Call to Action
59:32 Hope for the Future: Embracing Friction in Society
01:03:59 Introduction to Technology and Curiosity
01:06:27 Outro
Share a voice memo with your questions or comments for Corey to coreystrongpodcast@gmail.com
Follow The Corey Podcast here:
patreon.com/CoreyStrong
https://www.tiktok.com/@coreystrongpodcast
https://www.instagram.com/coreystrongpodcast/
Order Corey's self-help Journal, ONWARD! here:
https://a.co/d/6DLwpbw
Corey Strong (00:01)
Hey everyone, it's Corey. Today I'm talking with Bill Welser IV, CEO of Lotic AI,
a company that calls itself anti-tech in a world where tech seems to be taking over everything.
Bill joined me for a candid conversation that really made me stop and think:
Who actually owns our data? Where is AI taking us? And how do we hold on to any real sense of power or privacy in a world that feels more digital than human? Bill doesn't give cookie-cutter answers, guys. He gives context, clarity, and a little bit of discomfort, in the best way.
This episode is for anyone who uses tech, questions tech, or wonders what it's doing to us.
Here's my conversation with Bill.
Corey Strong (00:58)
I wanna welcome you to the Corey Podcast, Bill Welser. Thank you so much. I'm so honored to have you here. And I wanna start off by just asking you, how are you?
William Welser IV (01:03)
Thank you.
You know, I am fantastic this morning. I'm usually not a morning person, but today I was really excited, and not to pump up your ego, but I was excited to talk with you. And then these past five minutes have just gotten me even more energized. So I feel like I've just had four shots of espresso and, like, I'm ready to roll.
Corey Strong (01:31)
Same. I actually did. I actually did. But
it's the same. Honestly, I get excited for all of my guests to come on. But that hidden nerd in me was just so excited this week,
researching you and looking at your videos. I'm just like, I almost feel like I'm having Bill Nye on today. My son was just like, don't embarrass us. I said, well, I'm going to try. I'm like, I'm going to try not to, but yes, it's mutual, my friend. It truly is. So I have to ask you.
William Welser IV (02:02)
I'll do the embarrassing.
Thank you.
Corey Strong (02:17)
Are we smarter than dinosaurs? Because I am a fan, buddy. I just have to let you know that resonated with me, and I'm just like, I have to talk to, I could talk to this guy. That was the first video when you guys reached out to me and I said, well, let me pull them up, let me see who this is, you know? And I was just like, dinosaurs.
William Welser IV (02:20)
You know?
That was a minute.
Corey Strong (02:45)
I was like, yes. I was like, schedule him. Schedule him now.
William Welser IV (02:50)
Well, that was a funny, it was an interesting period of time. It was while I was at the RAND Corporation, a large nonprofit think tank that focuses on policy. And I was spending a lot of time in the community that looks at near-Earth objects, so asteroids and whatnot. And it was pretty scary to me that the room was about 200 people or less thinking about this
Corey Strong (03:08)
Hmm.
William Welser IV (03:19)
you know, topic that, you know, could be Earth-ending if the thing were big enough. And I really got very deeply into this idea of, okay, well, what technologies do we have? And this is, you know, when I gave that talk, it was about 10 or so years ago. And at the time we really didn't have anything to push something out of the way. We had ideas, but we hadn't really tested anything.
Corey Strong (03:23)
Mmm.
William Welser IV (03:47)
And as I've found in my career as an engineer, as a scientist, until you test it, it's just, you know, it really is just an idea. Just like until you build or manufacture the couture product, right? Like, it's just a drawing. And so for me, it was, you know, we haven't done anything to test it. So frankly, like, yeah, we're really cool with our...
Corey Strong (04:03)
Hmm. Wow.
William Welser IV (04:15)
you know, smartphones, and we're really cool with, you know, every other piece of technology that we have. But when it comes to an Earth-ending event, we really aren't any better than the dinosaurs.
Corey Strong (04:25)
No, it's just like we're
waiting to see what's going to happen next.
William Welser IV (04:28)
Yeah, so since then we have, humans have rammed a projectile into an asteroid to move it, so that it kind of changes course. And space is so big, outer space is so big, that really the way that physics works, as I'm sure you're familiar, like, you just have to kind of nudge something a little bit. And if you nudge it a little bit far enough away,
Corey Strong (04:39)
Whoa.
William Welser IV (04:57)
it'll miss you by a lot. And so when we as humans kind of pushed this object by ramming it with a projectile, it moved just a little bit off course, and now it's going to miss us by quite a bit. But we are trying to prove that we can move something like that.
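The "nudge it a little, far enough in advance" idea Bill describes can be sketched with back-of-the-envelope arithmetic. This is a toy straight-line model with illustrative numbers only, not mission data; real deflection planning also exploits orbital mechanics, which amplifies the effect well beyond this simple drift:

```python
# Toy model: a small velocity change applied years in advance
# accumulates into a large lateral offset (offset = delta-v * lead time).
# Illustrative numbers only -- real planning uses full orbital mechanics.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def miss_distance_km(delta_v_mm_s: float, lead_years: float) -> float:
    """Straight-line offset from a constant small velocity change."""
    delta_v_km_s = delta_v_mm_s / 1e6   # mm/s -> km/s
    return delta_v_km_s * lead_years * SECONDS_PER_YEAR

# A 10 mm/s (1 cm/s) nudge, ten years before closest approach:
print(round(miss_distance_km(10.0, 10.0)))  # ~3156 km, roughly half Earth's radius
```

The point of the arithmetic is the one Bill makes: the earlier the nudge, the less push you need, because even a millimeters-per-second change compounds over years into thousands of kilometers.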
Corey Strong (04:59)
Wow.
that you can move this object.
Wow, that is so interesting. And, I mean, the dinosaurs, poor guys, they should have had this technology, but they didn't.
William Welser IV (05:32)
I mean, they probably, I don't know, there's some ignorance-is-bliss there, right? Like, there's something flying through the sky, and then everything's over. Like, I don't know, that might be better than the AI apocalypse. Just saying.
Corey Strong (05:34)
Exactly.
Yeah, it could be.
It could be very well that way. So you've described Lotic, and I'm saying that correctly, as an anti-tech company. You dare to say that in this time that we're living in, which is brave. I'm a fan of you for saying that. What does that mean?
William Welser IV (05:57)
Correct.
I do.
So I think that tech companies, it really is a play on the fact that most tech companies we see today, we as individuals or as the population might think that they're kind of just in it for themselves, right? They're in it to objectify us, to make money off of our data, to sell us stuff, to really not necessarily act in our own interests, but instead to kind of, you know, yeah, make our lives easier, but it's...
Corey Strong (06:30)
Mm-hmm.
William Welser IV (06:40)
really good for their bottom line and not so good for our personal bottom line. And so when I say anti-tech tech company, it's not that we're not using technology, like, that would be silly. It's instead that we've focused all of our technology efforts on giving individuals agency, helping individuals understand themselves better so they can make better decisions about little things in their lives or big things in their lives,
so that they can decide that they want to do something because they're well informed, versus deciding they're going to do something because they're kind of in a trial-and-error state and they're just going to see what happens. And right now we're in this really interesting space of technology where it is very reasonable to think that
Corey Strong (07:23)
Hmm.
William Welser IV (07:34)
technology can aid someone at the individual level in a meaningful way. And I'm not talking ChatGPT, like, that's not a meaningful way right now, but in a far more meaningful way than ever before. And companies need to fill the gap of giving individuals the tools that are focused on them and let individuals run with it.
Corey Strong (07:42)
Right, right.
Yeah.
William Welser IV (08:01)
And the last thing I'll say on this is, the difficulty is, like, how do you make money as a technology company doing that? Right? Like, that's a tricky thing, right? But when I say anti-tech tech, it's that we're driven by other things. We're driven by other goals. We're driven by other ethics. We're driven by other values.
Corey Strong (08:06)
That's the bottom line. Yeah, yeah.
Wow, that's great. So let me ask you this. Back at your time at RAND and working in aerospace, what made you shift from those spaces into building something that's more human-centered? And how do you think we can get these companies to get on board with this?
William Welser IV (08:42)
Great, so those are two really great questions. So the first is, I started my career in the military as an officer in the Air Force, building large technology systems like satellites. And I had a chance to build, like, a large chemical laser and really cool things. And I left the Air Force and went to the RAND Corporation, and I was there for 10 years. And I did two jobs at RAND. I led the engineering and applied sciences department, which was like 300-
plus engineers and applied scientists working across RAND's portfolio of defense and education and infrastructure and health, et cetera. And then I had my own research portfolio. And when I showed up, my research portfolio was entirely about advanced technology. What are the policies around advanced technology? How do you control advanced technology? All that sort of thing. The ethics of it, all that. And...
RAND doesn't have just engineers. They have behavioral scientists, they have political scientists, they have economists, they have statisticians, they have all these different types of experts. And I fell in love with the behavioral scientists. I fell in love with the people who understood how the brain works, so clinical psychologists; who understood how communities work, so I/O psychologists; who understood why we are who we are, like anthropologists and ethnographers.
And I really started bringing them into my research. So my research became not just advanced tech, but, like, the collision between advanced technology and human behavior. And so when I was done with a decade at RAND, and I love that institution, it is a fantastic institution, but I was tired. I wanted to start building, and I jumped into the startup space and I said, I want to build something that
Corey Strong (10:20)
Hmm.
William Welser IV (10:38)
is behavioral science first and technology second, and, like, led with how can we best help the individual. Again, back to the anti-tech tech: like, do this thing backwards, do this thing upside down, focused on the individual. And so that's kind of how that shifted over time. And it really is just because I think some of the most amazing minds out there
Corey Strong (10:49)
Yeah, yeah, yeah.
William Welser IV (11:07)
are behavioral scientists, are people who understand how this thing works, this complex system that's up in our noggin. It's so far more complicated than anything else.
Corey Strong (11:19)
I think that's fascinating.
Yeah, you know, I go back to watching, I was a Nova watcher back in the day, and just watching how the mind works, and, like, how scientists
William Welser IV (11:28)
Nice. Great.
Corey Strong (11:36)
always inform us that there's more going on than we think is happening in our minds. And I was always the type of kid also that I'm just like, well, can I really control something with my mind? You know, or can I? You know what I'm saying? Or what can I do with this? Or can I really think this into existence? And so fast forward to today, it makes me even more excited to see what's next, because,
William Welser IV (11:52)
Yes!
Corey Strong (12:05)
with all of that information that you just shared, there is clearly something more there, and we need to point it more so in the direction of helping humankind instead of just profiting from humankind. You know what I'm saying? If that makes sense. And I love that your company is centered around that. And so...
William Welser IV (12:26)
I do, I do.
Corey Strong (12:33)
Let's talk about the control aspect of it. Like you said, most companies, they want to take our data, you know, and they want to use it to shape our behavior. You're trying to flip that. So moving forward, how? How are we going to do it? How can you achieve that?
William Welser IV (12:45)
Yes.
Well,
yeah, there's a few different things that go into that. So the first one is, you want to make sure, at least I do, in building a system, that an individual's data is theirs. And I can't stop Meta or Google or Amazon or whomever else from collecting your data. Like, that's in their business model. You're going to continue to use those services. They're going to continue to collect data,
but I can help you create a data store or a database of yourself that is unique from what they have. And I can keep that secured at your individual level using advanced cryptography and make sure that it's not something that is easily hackable. So the first thing is privacy. How do I get privacy and security in place? And
that from the very beginning, we have focused on: if we're gonna collect data, that data is gonna be owned by the individual. We will help them collect it, organize it, process it, store it, and make sense of it, but we're not going to profit directly off of it. So that was point number one. And I think that's company values and also just ethical use of technology, in my view.
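A minimal sketch of the "your data stays yours" shape Bill describes: a record sealed with a key only the individual holds, so a service can store the blob without being able to read or silently alter it. This is an illustration only; Lotic's actual cryptography is not described here, and a real system would use a vetted authenticated cipher (e.g. AES-GCM or libsodium's secretbox) rather than this hand-rolled stdlib keystream:

```python
# Toy sketch of a user-held encrypted record. ILLUSTRATION ONLY --
# a real system would use a vetted AEAD cipher, not this hand-rolled
# SHA-256 keystream. The point is the shape: the service stores the
# sealed blob, but only the key holder can open or verify it.
import hashlib
import hmac
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive `length` pseudo-random bytes from key+nonce (counter mode)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def seal(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt-then-MAC: returns nonce || tag || ciphertext."""
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + tag + ct

def open_sealed(key: bytes, blob: bytes) -> bytes:
    """Verify the MAC, then decrypt; raises if tampered or wrong key."""
    nonce, tag, ct = blob[:16], blob[16:48], blob[48:]
    expected = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("tampered blob or wrong key")
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, nonce, len(ct))))

user_key = secrets.token_bytes(32)  # held only by the individual
blob = seal(user_key, b"journal: rainy day, shopped online")
assert open_sealed(user_key, blob) == b"journal: rainy day, shopped online"
```

The design choice mirrors the conversation: whoever stores the blob learns nothing without the key, which is exactly the inversion of the usual "service owns the data" model.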
Corey Strong (14:09)
Okay. Okay.
Yeah, that's important.
William Welser IV (14:20)
The basis of that data is really context data. And when I say context data, the best way that we collect it is spoken word. Our system collects spoken-word data from individuals. It's carefully prompted, with prompts that we developed with my behavioral scientist colleagues. So again, clinical psychologists, people who have PhDs in brain plasticity and stuff like that.
Corey Strong (14:47)
Okay. People who work with people.
Yeah.
William Welser IV (14:51)
people who work with people and understand people, right?
And we prompt individuals to speak, and we want people to speak in a messy way, right? Like, free association and no punctuation and run-on sentences and switching from topic to topic, and do it in kind of bite-size, 90-second to two-minute pieces. And it turns out that you can take that information, and if you...
match it up to other things that you might have, like wearable information or the weather outside or anything else that you might collect or might be collected about you, you can start understanding your subconscious. You can start understanding why you made certain decisions. Like, when it rains, do I buy more online? When I'm, you know, at a...
Corey Strong (15:32)
Hmm.
Yeah. Or do I
eat more in my case?
William Welser IV (15:44)
Right?
But those are real things, right? Like, we kind of think that that's true, but we're not really sure. We've built heuristics in our heads of, well, that's probably true, but we don't know for sure. And if we want to change a behavior, we need to know for sure. And so keeping this data at the individual level, because you, Corey, are an individual, you...
Corey Strong (15:48)
Yeah, it's real.
Yes.
William Welser IV (16:14)
Your data doesn't belong, you know, with mine, because we're different people, and the things that will serve you are different than the things that will serve me. And so why on earth would I want a big set of data that says, Corey, you're in the 77th percentile of people your age living in Ann Arbor, blah, blah, blah, and, you know, Bill, you're in the 68th percentile of people your age living in Austin, Texas, blah, blah, blah?
Like, what the heck does that mean? Right? I'd much rather have it be like, hey, Bill, when you work out in the morning, you tend to be in a negative mindset for the next 30-plus hours. And when you work out in the evening, you tend to have, you know, a little bit less, but like a 20-some-odd-hour upbeat aspect to your personality, to your mood.
Corey Strong (16:44)
That does nothing.
Now see, that's some data that I can get behind. Because, as I said, I listened to you explain that trial that you were involved with, and I would be interested to find out about myself that I should not be working out because it just doesn't lift my mood. When I heard that, I said, finally, someone who understands why we should not be working out.
William Welser IV (17:13)
Like, that's cool.
Corey Strong (17:42)
It just puts me in a really bad mood. Now, the bad mood for me happens before I arrive at the gym.
I realize we're joking now, but that is so important, and that is so empowering, to have that type of data. Because it gets down to the point of it being not just a... I hate to say it, but it's like how hospitals are here in the United States. It's just like, they see one problem and this is how we treat it,
William Welser IV (18:09)
Mm-hmm.
Corey Strong (18:14)
not giving you the research of, you know, taking time to understand how you arrived at this situation. Like, why are you a diabetic? Not just, well, you must have eaten 20 donuts, now you're diabetic, let's treat you. Instead of, you know, is this brought on by stress, environment, you know, genetics? So what I'm hearing from you makes me feel good, because
William Welser IV (18:24)
Right.
Corey Strong (18:43)
Clearly, this is a way that we can access AI and become more empowered and learn about ourselves on an individual basis, which I believe is very important. And in saying that, why do you think that we so easily and so willingly
William Welser IV (18:53)
Corey Strong (19:05)
give up our privacy to companies like Meta? And I know I cancelled my Facebook, but at the same time I'm like, well, I need my Instagram, you know? And I know what they're doing with it. But why do you think that, you know, we are so easy to just give up our privacy like that?
William Welser IV (19:14)
Yeah, right.
So I have three children. I have a 20-year-old daughter, an 18-year-old son, and a 16-year-old son. And we actually talk about this quite a bit. And I sometimes might ambush their friends and ask them about this and embarrass them.
Corey Strong (19:40)
Yeah, I know. I've been banned from talking with their friends. That ban should
be lifted by the weekend, hopefully.
William Welser IV (19:47)
I think that people really separate privacy from a physical sense and privacy from a digital sense, just like we would separate security from a physical sense. I lock my door at night because Austin does have some crime. I mean, not a ton of crime where I live, but some crime. So I lock my door at night.
Corey Strong (20:11)
Mm-hmm.
William Welser IV (20:16)
Am I as deliberate about locking my data down? Am I as deliberate about locking my computer down? Most people are not, right? And it's because, you know, the digital space is very ethereal, and, like, it's hard to touch and you can't see it, and it's very difficult to put sense around it. The physical space, like, I'm shutting the door and I'm turning the deadbolt,
Corey Strong (20:44)
Mm-hmm.
William Welser IV (20:44)
right? I'm setting the alarm system or whatever it might be. And so one thing is, I don't think people are making the connection between giving up privacy. They just see it as, like, hey, I get this really cool tool like Instagram or whatnot that excites and delights me. And if they have some information about me, like, what harm could that information do? You know, the best thing that it does is it shows me more stuff that I like.
Corey Strong (21:13)
Hmm.
William Welser IV (21:14)
The issue that I see, I mean, where people would really start to bring privacy into the space, is we have no idea, zero idea, how much that data is worth. So Instagram is free. Instagram isn't free because Meta is taking a loss. Instagram is free because it makes a ton of sense to make it free. And we...
Corey Strong (21:29)
Yeah.
Yeah.
William Welser IV (21:39)
oftentimes are giving stuff away because we see this thing that is free. We don't have a way to value it. You know, if Instagram makes $130 a day on me, and I'm just making something up, like, I have no idea what it is, would I still choose to use it if they didn't give me any part of that?
Corey Strong (21:56)
Yeah,
Right.
Yeah, because you want to post pictures. And you want people to say, you look nice.
William Welser IV (22:11)
There are other
systems that, maybe, what if there was a competing system that, you know, makes the same amount, could make the same amount, $130, but is willing to give you $50?
So it's not only free, but they're paying you to use it.
Corey Strong (22:26)
It's the mindset, I believe, though. Because I will have to say, honestly, I scroll past all of that privacy clause. I just want to get right into the app. You know, they give you this whole scrolling of the different terms. I've never read any of that, and I'm just like, accept, accept, accept, now let me post. And then when I hear about data breaches and stuff like this, I'm just like,
William Welser IV (22:28)
Right?
No one does.
Corey Strong (22:53)
well, that wouldn't be able to include me, because in my mind's eye, I'm just posting pictures. I'm just trying to learn about other people. I'm not aware of what could possibly, potentially happen. And this spring, I read a book. I don't know if you've heard of it. Well, I'm sure you've heard of it. It's called Careless People. And that book...
That book made, well, that's when I canceled Facebook, because I'm just like, well, maybe I need to start somewhere, you know what I'm saying? Because this is evil. There's kind of an evil intent behind this joy that we're feeling connecting on these platforms.
William Welser IV (23:27)
Yeah, yeah.
Yeah, I mean, and I don't want to get super political, but I'm going to state something which I think is really interesting. So some apps and some wearables suggest that they can track a woman's menstrual cycle, her hormone cycle.
Corey Strong (23:49)
Yeah, we won't.
William Welser IV (24:06)
That data is not readily available via the APIs of those apps or those wearables, because if someone has access to that data, a woman's data, and it shows that she's pregnant, and then all of a sudden she's not pregnant, and she lives in a state where abortion is illegal, like, whoa, right? And so it's very easy to say privacy
Corey Strong (24:27)
then hello, hello, red flag.
William Welser IV (24:35)
is, well, who wants my data, right? But it turns out that as these artificial agents get better at plowing through large sets of data, if that information were available, they could immediately, you know, a coalition of individuals who wanted to make sure no abortions were happening, they could immediately go and find all those people whose pregnancies, you know, quote-unquote, disappeared. Right?
Corey Strong (24:49)
and there's nothing you can do about it.
Just disappeared.
William Welser IV (25:04)
That to me is like, forget about whether or not you agree with abortion, right? That's almost beside the point. Like, we're talking about basic human rights, right? And those sorts of things are not too far off from using Instagram, using Facebook, you know, Google going through and searching through your Gmail, which it does. Like, all these things happening, it's not that far off
Corey Strong (25:13)
Yeah.
It's right there in the same yard.
William Welser IV (25:35)
to jump
to someone being able to say, hey, I noticed this was going on, now it's not. Like, did you do something illegal?
Corey Strong (25:45)
It's the same with the whole sharing of location. Like my daughter, she's at Arizona State, but that was one of the agreements that my wife and I made. It's like, you need to turn that on. If you're going out of the state, the same with my son. It's just like, you need to turn that on. So it's helpful as a parent to know.
But what if that information is shared with someone that you don't know, that has your location? And to be honest with you, when they first started having this location stuff on the phones, I didn't know if it was on or not. It's just like AirDrop. I got pictures from a girl last week who was sitting across the aisle from me at the coffee shop, because her AirDrop was just open to everybody. I'm just like, hey, you're sending me pictures. And she's like, oh...
William Welser IV (26:34)
wow.
Corey Strong (26:40)
But she didn't know that she was doing it. I just think she wants to use the phone, you know what I'm saying? And it's not, okay, what am I doing? So for myself now, it's just like...
William Welser IV (26:43)
no.
Right.
Corey Strong (26:53)
I consciously look at what I'm using, how I'm using it. I'm double-checking, like, is this cut on for this person? Is this the same? I used to tour professionally as a musician, and I used to post every single moment, everything. And I'm thinking to myself, to this day, like, I'll never have a fan, you know, no one, nothing crazy, you know? And then one day I was having lunch, and this girl approached the table and she was like, hi.
I was like, hi. And she just ran down. She's like, ⁓ I knew you were going to be here because you posted it. She's like, I love your music. I'm just like.
William Welser IV (27:29)
Wow.
Corey Strong (27:31)
And the feeling I felt, you would think that a bear walked up on me, because I felt so... It's that feeling of someone took your wallet, and you don't know what was in your wallet, but you know it's missing, and you violated my space. And now I've adapted that same feeling to when I'm downloading.
William Welser IV (27:44)
Yeah. Yeah.
Corey Strong (27:57)
And when I'm using the phone, I'm very suspicious, like, okay, what do you want to do? What do you want to know? Why do you want to track this? And it's hard to get that into, I think, everybody's routine, because, like I said, they just want to get on, they want to hop on and see what's happening. And how do you think we can
make people more aware, or more, just like you were saying with the whole locking your door thing: you lock your door because you know that someone can physically come through your door, you can have a physical encounter, but on the web you don't see that harm the same way. How can we get people to become aware that this is literally a real thing?
William Welser IV (28:33)
Mm-hmm.
This is a really hard problem that I talk with my behavioral science colleagues about a lot, because you can drive people only so far with fear before they just start ignoring it. Right? It's like, I can't do anything, I'm gonna throw my hands up, I can't do anything. So to what extent do you drive them with opportunity? Like, what is the opportunity if you...
Corey Strong (29:03)
Yeah.
William Welser IV (29:12)
If you secure something, if you operate in a way, if you act in a way where you're more secure, what kind of opportunities are you opening up for yourself? And one of the things, and I actually have a book that's coming out toward the end of this year, in October, that talks about a different way to generate wealth, that's more data-focused and not dollar-focused.
And I'm not gonna totally summarize it right now, but the idea being, like, that's an opportunity where if people can generate wealth, particularly in a time where the socioeconomic gap is widening and there's a lot of fear about the future of work and whatnot, like, if you can give people an opportunity to
Corey Strong (29:43)
No, because you have to come back for that part.
Yeah.
William Welser IV (30:07)
do something positive, they may start to pay attention to those other risks, right? And so that's a different approach. It's not a fear-based approach. I don't think fear-based approaches, particularly in this space, are too effective, because, you don't see people... people didn't stop shopping at Target when Target had the huge breach, right? They just kind of went, I hope it didn't affect me.
Corey Strong (30:30)
I know, I know. And I'm guilty
of it myself. It's just like, well, I'll get my money back at some point from somebody. Someone's going to give it to me. You know, I know I didn't misuse my information. So, yeah, I think that might be the way, by offering to incentivize it, you know, where you can, where you know you're getting something,
William Welser IV (30:37)
Right.
Right.
Corey Strong (30:56)
you know, to move people in that direction. Because I don't think we're... we're not afraid of anything anymore.
William Welser IV (31:03)
It's just hard. I mean, humans, it's wonderful, we are wonderful beings, but we have a very hard time thinking about, like, the downside risk. Like, the worst-case scenarios, we do not really wanna think about those. Very few of us want to focus on them. And I mean, it's one of the reasons why people don't start investing until later in life, right? Because they have a short-term outlook, not a long-term outlook.
They're not thinking about what could happen, why they might need that, et cetera. Just, it's the way that we're wired.
Corey Strong (31:36)
Listen, this is what my daughter tells
me when I'm trying to give her some advice. She tells me, and let me make sure I'm getting it right: I'm here for a good time, not a long time. And I'm just like, what does that mean? So she's basically saying she's just here to have a good time. Doesn't care how long. She's willing to take the risk.
William Welser IV (31:54)
I don't know.
Yeah.
Corey Strong (32:04)
And I think that's the mentality that everyone has now. She's 21 now. Yeah, because I'm just like, watch out for this, don't do that, don't do this. She's... I'm just here for a good time, I'm not here for a long time. Whereas I'm like, I want to stay as long as I can. You know, but they don't think that way anymore.
William Welser IV (32:08)
How old is your daughter?
Yeah, I have a feeling maybe she and my daughter share a lot of this. I feel like I've heard that same comment.
Yeah, yeah.
Well, it's also, I mean, I think it's scary to be an individual growing up, to be a young adult growing up in the world that we're in right now. There's so many things that are confusing. There's so many tools that are available to us, but, like, what do you trust? What do you not trust? Like, there's been this huge push with populism to start to distrust large institutions. And so how...
Corey Strong (32:46)
Yeah.
Mm-hmm.
William Welser IV (33:01)
how do you navigate that as a developing brain? I think it's really hard. I understand what your daughter's saying. It probably resonates really strongly with my kids too. And for me, I can't fault her for thinking that way, right? I mean, it's hard for me to think that way, but I kind of get it. So what instrument did you play, by the way?
Corey Strong (33:04)
Yeah.
Yeah, Mm-hmm.
Yeah.
Oh my goodness. Well, I was classically trained in voice and pipe organ, and I played some piano as well. So yeah, those were my instruments. Yeah.
William Welser IV (33:27)
Oh, cool.
Nice, nice, amazing.
I was wondering if one of your instruments was in there, because you've got a very resonant voice that I could imagine is killer when you start flowing some air through it.
Corey Strong (33:38)
Yeah.
Well, thank you.
I love singing. I used to be a wonderful singer, I would say, but I haven't sung as much in recent years. I kind of pivoted and went on to other things. I'm one of those
individual creators that was like, okay, that's enough of that, let's move on. So after this conversation with you, who knows, I may, you know, get into aerospace engineering. I don't know. I may pick it up for about a month, because you're making it seem really cool here.
William Welser IV (34:05)
That's all right. That's all right.
Amazing,
amazing.
Corey Strong (34:21)
So let me ask about mental health, because on my podcast we talk a lot about mental health and behaviors and, you know, how society and culture all play in with that. So let me ask you, Bill: from a mental health and a behavioral standpoint, what can AI, or what will AI, actually be able to help us uncover about ourselves?
William Welser IV (34:50)
Yeah, I think that the first thing is it won't help us uncover anything unless we're vulnerable, right? So there is no easy way to approach mental health or behavioral health without being vulnerable and honest and upfront about what you're feeling. Now, that doesn't...
That's your perspective, right? It's your truth that you're looking to share. But it's just like if you were to go see a therapist and tell them, you know, nothing that was true, everything that just...
Corey Strong (35:20)
Hmm.
I was just going to say
that, but I didn't want to interrupt you. When I first started with my therapist, it took me like two months to actually be real, because I just didn't trust talking to someone outside of myself. It did take a while for me to be... I surprised myself, because I'm not a liar, but I just did not give my true self.
William Welser IV (35:38)
Yeah. Yeah.
Yes.
It's hard. It is really hard. I mean, one of the things that Lotic does... you can go online right now, and there's a version of our technology that's free. It's at getlotic.ai, and you can speak into it and it'll walk you through some prompted conversations. You can freely converse with it.
Corey Strong (36:12)
I'm gonna share that.
William Welser IV (36:24)
And at the end, it'll give you insights about your patterns, your trends. It'll give you a mantra around the thing that you've just explored with it. And because of the way we deal with privacy, if you, Corey, were to go create an account today and start speaking into it, I wouldn't know, and I couldn't go find your data. I would just know that I have an additional user today.
Corey Strong (36:45)
Hmm.
William Welser IV (36:52)
And so that, for me, is extremely powerful, because it builds up this level of trust. And the feedback that we've gotten from people who have openly shared about using the system is that it has helped them with their relationships. So one person reached out and said, you saved my marriage. Another person reached out and said, you've helped me from a behavioral standpoint: I'm
not in the same mindset of depression that I was previously, because I'm seeing the patterns that were leading me to dive down those holes. So, being really clear here, we're not diagnosing nor are we preventing depression, but we have done studies that show that people who actively speak aloud their truth, who actively try to make sense of the world around them
Corey Strong (37:35)
Mm-hmm.
William Welser IV (37:48)
in either a written way, from a journaling standpoint, or a spoken way (and spoken tends to be better), they will improve their state of anxiety. They will improve their state of depression. Over a population, the population will improve. So statistically, you run a better chance of improving. Not everyone will, so I have to put a lot of caveats there. But it is about being vulnerable
and speaking your truth, your truth, not what you want someone else to hear. And when that happens, if the AI is properly tuned, which ours is, it'll give you insights that you can choose to take action on. And so, you know, be more empathetic with your spouse when these things happen. That's a set of insights that I've gotten.
Personally, right?
Corey Strong (38:46)
Wow, wow. Well, I know
I'm gonna be prompted. I just wrote it down so I can sign up. I'm probably gonna be so full of prompts that I'll probably call you tonight, like, Bill, you know, I need you to go into my data and I need you to adjust something.
William Welser IV (38:51)
Hahaha
That's fine. You can text me and we can...
Well, so here's the thing. I actually
had somebody... we were working with a boxing gym locally, and they were doing a fitness challenge, and part of that is mental health, behavioral health. And one of the participants said, hey, you know, I recorded on Tuesday and I don't think it saved; I can't find it in my history. Can you go and grab that file and add it to my history? And I was like, no.
Corey Strong (39:32)
No.
William Welser IV (39:33)
They're like, well, why? I was like, I don't know who you are in my system. I mean, I'm going to have to assume that maybe you didn't press record, or something else hiccupped, but I can't go in there and find it, because I can't see your data.
Corey Strong (39:50)
So you'll
never get subpoenaed, is what you're saying.
William Welser IV (39:53)
I mean, they could try, but I can't go in. The way that we've used advanced cryptography separates someone's data from their persona. And really, the only thing that we need from you, persona-wise, is a way to credential you. So we use email. But, you know, even if your email is your name,
it's not being tied directly to your data, because we've obscured it via normal cryptographic routes.
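Bill's description here is loose, and Lotic's actual scheme isn't disclosed; but the general idea he's gesturing at, deriving an opaque one-way credential from the email so the operator can recognize a returning user without being able to tie stored data back to an identity, can be sketched roughly like this (the salted SHA-256 below is an illustrative assumption, not their implementation):

```python
import hashlib

def credential_from_email(email: str, salt: bytes) -> str:
    """Derive an opaque credential from an email address.

    The hash is one-way: the service can recognize a returning user
    (same email + salt always yields the same credential) without
    keeping the email itself next to that user's data.
    """
    return hashlib.sha256(salt + email.strip().lower().encode()).hexdigest()

# Illustrative deployment-wide salt; a real system would more likely use
# per-user salts and a slow KDF such as Argon2 or scrypt.
SALT = b"example-salt"

cred = credential_from_email("user@example.com", SALT)
again = credential_from_email("USER@example.com ", SALT)

print(cred == again)   # True: deterministic after normalizing, so sign-in still works
print("user" in cred)  # False: the address is not recoverable from the hex digest
```

The design choice this illustrates is exactly the trade-off in the anecdote that follows: the operator gains nothing to subpoena, but also loses the ability to "go find" a user's recording on request.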
Corey Strong (40:24)
A particular profile.
That makes me
feel good. I don't know what it does for other people, but, number one, I get the feeling that this is a company that is doing exactly what it says it's going to do and that cares about the privacy of its users. So why wouldn't we want that, you know?
William Welser IV (40:36)
Right.
Corey Strong (40:51)
It's almost like, was that person calling you, were they testing you, just to see if you'd be like, all right, I'll pull your profile? Oh, really?
William Welser IV (40:57)
I don't know, right? They seemed actually genuinely upset that I couldn't get that data.
But again, it goes back to the mindset of what you brought up earlier. How do people think about privacy in the digital space? How do people think about it? It's hard for us right now because it's not tangible. And we've not been given the tools to see how powerful it can be for us individually.
Corey Strong (41:20)
Hmm.
Yeah. Geez, that is so interesting. So, I had a conversation a few weeks ago, just with a random person in Whole Foods, and we were just talking about AI. She was telling me that
there's this idea that eventually AI will start replacing workers, which we've seen in a lot of places like Amazon, and I know recently at UPS they tried to do similar things. But replacing the emotional laborers: therapists and caregivers. Do you see that happening, and do you think that should happen?
William Welser IV (42:14)
So I do see it happening. I don't think it should happen, but that doesn't mean that it won't. Yeah.
Corey Strong (42:22)
that it won't.
Because there's levels before you go there. Because
with me, I would go to the therapist, but then when they had telehealth, it was like, well, I can stay home and get the telehealth. And now, with you introducing me to this, it's like, well, why do I need the telehealth if I can go here? So I think it's kind of natural to wind things back. I mean,
William Welser IV (42:48)
Yeah, I am.
I think the biggest thing, for me, is knowing the system and the values placed in the system. And when I say values, I mean values, ethics, morals that have been placed in that system, and the transparency: are they showing me my data? Are they showing me what they've done with my data? Are they showing me results from my data beyond just the response of the chat bot, right? Of the
chat interface. You know, look, ChatGPT and the like, all those tools that are out there, are very powerful and very wonderful for certain purposes. But people are using them to figure out how to date. They're using them to figure out how to, you know, better communicate with their bosses. And those
tools have not been trained by people who understand behavioral science, who understand human interaction. And that puts us at a huge disadvantage if we're trying to rely on that instead of relying on someone with trained expertise. And so I don't think it should replace, but I'm watching it replace. And that's concerning to me.
Corey Strong (44:06)
Yeah.
Hmm.
So that is very concerning. It's 100% concerning, because it leads me to this next question. My son recently told me that there was a group of kids at his academy that got in trouble for using ChatGPT to do their work. I use ChatGPT myself.
William Welser IV (44:32)
Mm-hmm. Mm-hmm.
Corey Strong (44:38)
For things like organizing, you know, my podcast episodes, and nothing really, you know what I'm saying, too important; just to kind of give a little bit of guidance. But they turned in these assignments and basically just used the shortcut. So my question is, yes, they can do this, but
how does relying on this AI to do all the thinking affect the real learning and skill development that is necessary to do anything important, to do the job that you are doing? How can we depend on AI to help this young person, who thinks that this is the way to go,
you know, instead of taking the time to learn?
William Welser IV (45:37)
So I think you've just hit on one of the scariest things about where we are with artificial intelligence right now, and that is the offloading of cognitive tasks. I've watched... you know, the large language models came onto the scene, like, for real, in 2022. And
Corey Strong (45:53)
Yeah.
William Welser IV (46:05)
If you look at where we were then versus where we are now, the world is completely different. If you look at where we were six months ago, it's completely different in terms of how people are using these tools. And the offloading of these cognitive tasks is upsetting, because not only are you losing critical thought, not only are you losing the ways to problem-solve, creativity is kind of being
Corey Strong (46:10)
Mm-hmm.
William Welser IV (46:34)
Some of the tools are fantastic, but it's still starting to bound creativity, right? You're starting to really put creativity in a box. And what happens there is that people just get into a routine, a set of behaviors, where those skills that they used to have just atrophy. Now they've been relying on these tools, and basically, whatever that tool comes back with is truth.
Corey Strong (46:39)
Mm-hmm.
William Welser IV (47:03)
And that is, for me, terrifying. We should be better than this. Yeah.
Corey Strong (47:07)
It's scary. And it's embarrassing.
That's what I'm saying. It's embarrassing.
And like my son said, they were all shocked that the teacher knew. I was like, sure, it gave maybe 10 different variations, but it was all the same story. It's just like, that's so telling. And I already know, from a musical standpoint, being in the industry,
that was one of the things that made me feel most trapped: how they want to put you in the box. If you sing this particular style of music... sure, I'm classically trained. I may sing gospel, but I may even want to sing Western music,
based on how I feel that day. But no, you can't, because you're a pop singer and you can't do that. So it seems to me that the way that these other companies are using AI is just putting on another form of a bridle, to kind of hold you in this particular space where they want you to be, and
William Welser IV (47:58)
Mm-hmm. Mm-hmm.
Corey Strong (48:20)
It's kind of like, it's kind of suffocating, if that makes sense.
William Welser IV (48:25)
Yeah, it does. It totally makes sense. My perspective here: I want to believe that the behavior of the tech companies is not malicious in nature. They're not trying to be malicious; they're just trying to be capitalists, right? So OpenAI started as a nonprofit. And all of a sudden you have Microsoft putting in, you know, $10 billion, and you have others investing in it. And then
Corey Strong (48:43)
Mm-hmm.
William Welser IV (48:55)
you know, ChatGPT comes out and is available for people for free. And then there's a paid version, and now there's a, you know, enterprise version. And now there's going to be some sort of hardware, and so on. So it's a very common way of getting people hooked on a product and then starting to charge them. I mean, Netflix did the same thing, right? It's not a new thing. But it's different in the fact that the
Corey Strong (49:00)
Mm-hmm.
Oh, God.
William Welser IV (49:24)
product itself is allowing us... it's not entertainment, it's allowing us to offload these cognitive tasks, which is, you know, in turn atrophying those things that make us individual contributors to society. And that's scary. It does, right? I mean, it's funny...
Corey Strong (49:41)
Yeah, and they make us dumb. That's what I was gonna say. It makes us so dumb.
William Welser IV (49:53)
My AI practice within Lotic is led by a clinical psychologist. Her name is Dr. Katie Penry and she is brilliant. We brought her in to lead the behavioral science piece, and she's jumped in, and she's now, like, an expert in artificial agent generation and this, that, and the other. And she and I were talking about this idea of the decline in IQ,
basically. And it isn't that people are just going to get dumber; the way the brain works is that they're just not exercising those things, right? And so there's just going to be a decline in effectiveness. And that's...
Corey Strong (50:38)
Yeah,
but there's also an increase in Alzheimer's, and I said to myself, I think this is happening because we don't have to think as much. You know what I'm saying? And then on top of that, we're traumatized, we're slapping our brains with all this negativity every day. And at the same time, we're not exercising the brain, because we don't have to think anymore.
William Welser IV (51:06)
That's interesting. That is an interesting hypothesis for a study.
Corey Strong (51:13)
Well, see, you figure that out for me. Let me know. Seriously, I literally thought about that when I look at things. I'm not a scientist, but I literally thought, there's something with this, I bet. Because there's so many people... even my mom, she's
William Welser IV (51:15)
I'm already thinking about it like, wait a minute, is there a connection there? this is...
Corey Strong (51:37)
in her late 70s. She's not a telephone person, never has been, but now? You couldn't pry that iPhone out of that woman's hand. She's worse than the teenagers. She's worse than the teenagers, because you know what they say anyway: you're a baby, you grow up, then you revert back to being a child. And she's in her teenager stage. I literally was talking to her the other day and she's like,
William Welser IV (51:58)
Yep. Yep.
Corey Strong (52:07)
uh huh. I'm like, are you on the phone? She's like, I'm texting a friend. I'm like, this is just unbelievable. This is unbelievable. And then you ask her something, and she's like, huh, what was it? I'm just like... I'm like, you have Alzheimer's. And she's like, no, I don't, no, I don't. I was like, yeah, you do. I'm joking with her, but I'm like, is this really what's happening?
William Welser IV (52:29)
It's interesting. You're spurring me to think about... I'll just say one of my children talks about this a lot, saying, I feel like I don't have the rich memories that I should of things that have happened in the past. I don't remember that. Like, I know you've told me about this, but I don't have that same memory. And this is
all of my kids. I'm blessed with kids that do very well in school and whatnot. And so, yeah, that plays into what you're talking about. I need to think more about this, yeah.
Corey Strong (53:01)
Yeah, same.
Please, my
kids are the same. I took them somewhere and I was like, this is a place that we used to come all the time, and you really liked it. And they're like, when did we come here? You must've brought somebody else. I'm like, I know, because I have the receipts from paying for this. But they literally swear up and down this is not anything that they did with me.
William Welser IV (53:23)
Yeah. Yeah.
Interesting.
Corey Strong (53:33)
And they're just like, well, maybe you're having deja vu. I'm like, no. I literally, I'm like, okay. But they do not remember.
William Welser IV (53:40)
Yeah, you're not alone in that. This happens in my house as well.
Corey Strong (53:41)
They don't remember.
Yeah. Okay, you
gotta figure that out for us.
William Welser IV (53:52)
Yeah, we will. I want to go back to something you said a
few minutes ago, which is something along the lines of you not being a scientist. And I think it's really important for everybody who's listening to your podcast, who may not listen to podcasts in the space of, you know, deep, deep science theory or whatnot, because,
at least for me even, some of those things are very intimidating. The best ideas come from areas outside of the area of expertise. The breakthroughs are inspired by someone who hasn't really thought about it deeply, who isn't vested deeply in it, who hasn't biased their mind in terms of the way something works and how it should work, et cetera. And instead just says,
What about this? Why not this? Couldn't this be the case? And it's really, really important that we continue to do that, right? That we look at spaces outside of those areas that are our expert areas, our areas of comfort and safety. And if we see something and say like, why not? That we at least like go spend a little bit of time looking at it, right? Because you don't need a degree to go look at like
Corey Strong (54:50)
Hmm.
Hmm.
Yeah.
William Welser IV (55:18)
Why does a large language model work this way? Why does ChatGPT give me these answers? You can go look and be like, wait a minute, I can become informed about this, and maybe I should talk to somebody who knows about it. And maybe that's gonna spur them, because you never know. So don't sell yourself short. It's a long way of me saying, don't sell yourself short. The brilliant ideas come from people
Corey Strong (55:23)
Mm-hmm.
Okay.
William Welser IV (55:47)
who are running these things through their own internal models, their own internal filters, and saying, that doesn't make sense. And then sharing that with someone who's working in the space, who's like, oh, I've never thought about that. I need to look into that.
Corey Strong (56:04)
Well, listen, you're opening the door. You're going to be like, why did I ever connect with that guy? Because I am the type of person, like, why is this happening? Now I have someone to ask that to. 100%. Oh, jeez. Listen, I would love that. I would love that, because...
William Welser IV (56:06)
HAHAHAHA
I'm excited.
I'm excited. Not on the recording, but afterwards I'm going to give you my cell phone number, and you can text me whenever you'd like.
Corey Strong (56:29)
That is empowering. We have to... it's important, I guess, that we bring this data to people who know what to do with it. That is important. Jeez.
William Welser IV (56:35)
Yeah. Yeah. Just like
I was telling you at the beginning, before we started recording, it is amazing to me that people can do the things that they do with textiles in the fashion world, right? I have so many questions, like, wait a minute, how do you make that fabric do that thing on that person's body? Right?
Corey Strong (56:48)
Yeah. Yeah.
Mm-hmm.
and it's extraordinary. There
is a science to it, because I'm a detail-oriented person, and when I see someone who's not wearing the right fabric, or I know that fabric doesn't necessarily work for that body type, I know it, but I don't know why I know it. It's just something I know, and I know what I don't like. When I see a shirt in the store, I'm like, that shirt's not gonna be comfortable. And then my wife's like, no, that actually looks soft.
William Welser IV (57:04)
YES!
Both are very soft.
Corey Strong (57:29)
I know it. I know it. My wife is like, well, how do you know you're not gonna like it? I'm just like, there's something about it. I know I'm not gonna like it and I don't want it. I don't want it. So, okay. All right. And we're not gonna be political, but I have to ask you: if you had ten minutes with Congress, what would you tell them about AI and data privacy?
William Welser IV (57:39)
Yep. Yep.
Mm-hmm.
You know, I actually gave a talk about this at a TEDx in Manhattan Beach in November of this last year. And part of it was about informing elected officials. It's a great question. If I had 10 minutes, I would say: you don't need to be an expert, but you need to be fluent.
Corey Strong (58:20)
Hmm. ⁓
William Welser IV (58:20)
Like
it is required of you as an elected official to be fluent in at least what is happening. And if that means you need somebody to give you a primer every day, you need somebody to give you a tutorial every week, that's fine. If that means you need to have small group sessions where that's happening for a lot of people, that's fine, but please be fluent. And then please help the population be fluent.
Corey Strong (58:41)
Hmm.
William Welser IV (58:47)
And I think the best way to do that is actually via labeling. So when we go to the grocery store, we pick up a box, and if I'm health conscious or just interested, I'm used to looking at the nutrition label to see what's in it, you know, the amount of my daily calories, blah, blah, blah. We need something like that for AI systems.
Corey Strong (58:51)
Hmm.
William Welser IV (59:16)
Where is my data stored? How is it being used? How much is it worth to somebody? Where is it going beyond just this company? Are they reselling it? All those sorts of things. And those labels are super important to tell us, you know, how everything's working, how things are being used. Even starting with just a label that says "AI inside." You know, just like, hey...
Corey Strong (59:22)
Hmm.
William Welser IV (59:45)
this meat... you know, the animals received hormones, or the animals received some sort of treatment to help them grow, et cetera. "AI inside": I should know when I'm dealing with an artificial system. And so I pick up the phone and I'm talking to somebody for such-and-such airline. And if I'm talking to an automated chat bot that just has a real voice,
Corey Strong (59:55)
Hmm.
Yeah.
William Welser IV (1:00:12)
it should say, by the way, you're about to get on the phone with an artificial agent.
Corey Strong (1:00:19)
That's a very valid point. And I'm thinking of how nice it would have been for those kids at my son's school to receive a note that said, we've given the same assignment to the other 10 people before you.
It wouldn't have been great, but it would have been necessary, and it would have given them the choice: okay, do I still take the risk?
William Welser IV (1:00:36)
Right?
The interesting thing about this is that a year ago, 18 months ago, people at colleges and in high schools were talking about how students are going to use these large language models to write their papers, and, oh my gosh, there's going to be all this cheating, blah, blah, blah. Well, now professors can design artificial agents to look at those papers, those reports, those assignments, and determine
whether or not they have the markers of potentially being created by an artificial agent, right? Wouldn't it be amazing, instead of playing this cat-and-mouse game, for the professor to just say: hey, by the way, one of the things I'm doing... I've got these two teaching assistants over here, you know, they're humans. I've got a third one; it's an artificial agent. And so you can choose to use these other systems. Here's the rule set that you need to follow. If you break that rule set,
Corey Strong (1:01:33)
Yeah.
Mm-hmm.
William Welser IV (1:01:43)
I'm going to know about it. Right? And so let's all go in setting the expectation. But again, I think the literacy is so important, that Congress has literacy around these topics. I'm not going to give examples, but I have personal examples that were really, really sad, where I was dealing with people in Congress who just didn't know what they were talking about. And yet they were
Corey Strong (1:01:45)
Yeah. Yeah.
Hmm.
We see that a lot.
Of course, of course. And that's the sad thing about it, you know. Geez. Well, you know, we're connected, because that was gonna be my final question for you: what does hope look like for all of us who
William Welser IV (1:02:12)
on the committees creating policies.
But there's hope. There's always hope.
Corey Strong (1:02:41)
are looking towards tech and AI. What should we be hoping for? What do you think we should be hoping for?
William Welser IV (1:02:50)
Well, I'm gonna answer that in the converse. What we shouldn't be hoping for is a frictionless society. So we should not be seeking a complete lack of friction, complete convenience, everything just done for us, thought for us, et cetera.
Humans in general need friction. We will seek it. Even if we think we don't want it, we want conveniences, we want this, we want that, we will seek friction. And so I think we should hope for a future where these tools are great to augment and supplement those things that we want to get done, but where we still have this base level of friction
Corey Strong (1:03:22)
Mm-hmm. Yeah.
Hmm.
William Welser IV (1:03:48)
that we have to get through, that we have to work through. And whether that's interpersonal or intrapersonal, that friction is so important to our growth. It's so important to us staying sharp. It's so important to future breakthroughs. And it's really important for us just to remain individuals that are unique. So I would hope for friction to stay around.
I would hope for the frictionless, ultra-convenient society to never show up.
Corey Strong (1:04:25)
Mmm. That is something. Powerful. Powerful. That is fantastic. Jeez.
William Welser IV (1:04:31)
I don't know.
Other people are like, yo,
I'd really just like the grocery store to know what I need to order, so I don't even have to go, and have it just show up at my door. I'd really love that.
Corey Strong (1:04:43)
They don't know it at this time, but they need what you just said. We need it. That's just being real. Where can people learn more about Lotic and your work? Where do you want them to go and kind of poke around and dare to be real with themselves when it comes to AI?
William Welser IV (1:04:48)
It's so important.
Sure, so we have a version of our system that is available, like I said, for free, that people can play with. We've just put up some of our features. It's at getlotic.ai. Our website is still... I mean, we're coming out of stealth right now, so we're still a little bit, you know, kind of stealthy with our website. But I think the best thing to do is just go try it
Corey Strong (1:05:31)
Just try it.
William Welser IV (1:05:31)
and just go
use the system. Again, the system is built around individual privacy, and we've had tens of thousands of people use it. And people get back to us with everything from, well, that was really interesting, all the way to, oh my gosh, it changed this aspect of my life. So I'd really encourage individuals to go and check that out.
Corey Strong (1:05:49)
Mm-hmm.
That's amazing. Bill, this has been extraordinary. You are phenomenal. Thank you. I was so worried, like, oh my God, what am I going to talk to the scientist about? You know, what am I going to talk to him about? I was pretty sure everything I said was going to be like, what? But no, we did it, and you
William Welser IV (1:06:01)
Well, I thank you for bringing me on. You are phenomenal.
But we did it.
Corey Strong (1:06:22)
were extraordinary. I have to say that. Thank you. And I'm sure the listeners out there are excited to move forward and not be afraid to use AI. You know? Yeah. Yeah.
William Welser IV (1:06:24)
Thank you. Thank you.
Yeah, I think that that's a really great takeaway. And just be conscious of it, right? Just be
just eyes wide open.
Corey Strong (1:06:40)
There it is. Thank you so much, Bill. I appreciate it. Bye bye.
William Welser IV (1:06:43)
Thank you. All right, very good.
Corey Strong (1:06:52)
Fascinating, fascinating, guys. That was Bill Welser IV. He's sharp. He's really sharp. I really enjoyed this conversation.
Guys, we have so much to think about, don't we? About technology. I love that Bill is forward thinking and just so easy to talk to, you know? And that's the way it should be.
You know, what stuck with me the most was something he said: we're all scientists. We shouldn't be afraid of technology. That's what I learned from this interview. But we should be asking questions. And more importantly,
We should be getting those questions into the hands of people like Bill, who actually know what to do with them and who care enough to help us understand the answers. That's important. This talk honestly made me think twice about where my data goes. And who's using our data? We have to think about these things now.
It can be unsettling, sure, but it can also be empowering. We just have to stay curious and thoughtful about what we're using and why we are using it. Personally,
I'm going to sign up for Lotic AI, because I have some things that I need to work out for me. I'm
actually going to sign up because I truly believe in what they're doing. They seem to actually value user privacy, and that's a big, big thing, especially now in society. And I'll definitely report back
and let you all know how that experience goes for me. You know that I will. And if you are curious too, check the show notes. I'm dropping links to connect with Bill and to learn more about Lotic AI. And whatever tools you're using, let me just tell you, whether it's ChatGPT or something else, don't just lean on it. Stay involved, keep learning, and most of all, keep using your brain.
Thanks so much guys for stopping in and listening and I'll catch you next time.