Money Sex Gen X
Money Sex Gen X (“MSG”) is a weekly podcast convo between gentlemen Gen X’ers Mr. Eric McLoyd and co-host Big Stew. These CHI-TOWN based hosts feel like Generation X needs to be portrayed better in the media. No shade or hate, but they feel like Baby Boomers + Millennials get all the shine. Without judgment, they dive into topics like “Is College A Joke?”, “What Does It Mean To Be Black?” and “Let’s Talk About Sex” in hopes of uncovering new truths for viewers and themselves. Their painfully honest style of podcasting + their undeniable chemistry make for some interesting Gen X curated content.
Season 8-MSG Episode 52 "What's Your Relationship with AI?"
In this thought-provoking episode of the Money Sex Gen X podcast, Eric and Big Stew explore what it really means to have a “relationship” with AI—whether you treat it like a tool, a thought partner, or something closer to a confidant.
They break down AI basics, debate whether saying “please” matters, and discuss the limits of AI around nuance, accuracy, and overreliance. The conversation expands into generative AI vs. agentic AI, including real examples using ChatGPT and Claude (like organizing files, tailoring resumes, and generating images), and how training your AI improves results. They also tackle concerns about people replacing human interaction with AI, potential misuse, and a corporate scenario where coworkers depend heavily on AI for work and personal decisions.
GAIN ACCESS TO THE CONVERSATIONS WE CAN'T HAVE ON THE MAIN FEED
- Have a voice in the show so episodes reflect the real questions, struggles, and wins you care about.
- Build real financial confidence with a full course that helps you make smarter money moves, not just collect random tips.
- Get the full story behind every guest so you can learn from what they did right, what they hid, and what they regret.
- Understand how each episode was born so you can see the thought process, lessons, and strategy you can apply in your own life.
- Connect with a like-minded community so you are not doing healing, legacy building, and wealth creation alone.
SEND US YOUR STORIES (START-UPS, CORPORATE STORIES) moneysexgenx@gmail.com
FOLLOW US ON YouTube.
[00:00:00] Eric: All right. Good morning, Chicago and the rest of the world. Welcome to the Money Sex Gen X podcast. I'm your host with the most, the man with the plan, the Scottie Pippen of podcasting. Hey, I'm never alone, man. I'm here with the G of Genius Lab, music mogul in the making, the Rasheed Wallace of podcast knowledge.
[00:00:32] Eric: My homie, my brother, my co-host, Big Stew.
[00:00:38] @ProfessorStew: You already know what it is, what's happening. E-Money.
[00:00:42] Eric: What's going on, man? Yes sir. Yes sir. How you doing, sir?
[00:00:46] @ProfessorStew: Man, I'm not complaining, brother. I'm not complaining. All is well. Life is good. Nothing to complain about. What's happening on your end, man?
[00:00:55] Eric: Things are good. Things are good. I'm, you know, a 50-year-old man, so I'm [00:01:00] living that 50-year-old man lifestyle and navigating through that, but I'm good. Good.
[00:01:04] @ProfessorStew: All right. All right. All right. Well, welcome to the club. It's been a minute. It's been a minute. It's been a minute, but we in it like Bennett,
[00:01:16] Eric: We not quitting.
[00:01:17] @ProfessorStew: and let's go ahead and spin it, man.
[00:01:19] Eric: Let's do it man. Let's do it. So I wanna always make sure we tell you all. So if you like thought provoking content with a little personality in there definitely check us out on all the platforms. So you can subscribe to our YouTube channel by just typing in at Money Sex Gen X. We got Apple Music, Spotify, all the major streaming platforms.
[00:01:41] Eric: We are there. I want to give a quick shout out, so we're definitely streaming now, Stew, in 106 countries and 1,003 cities globally.
[00:01:53] @ProfessorStew: That's amazing. That sounds unbelievable. So, hey, let me give a shout
[00:01:57] Eric: Yes.
[00:01:57] @ProfessorStew: out to some of the top countries and [00:02:00] cities: Germany, Brazil, Canada, the US. Of course we got Chicago, Dallas, Texas, Houston, Northern Virginia, and San Diego. Shout out
[00:02:14] Eric: San Diego. San Diego.
[00:02:15] @ProfessorStew: to all of y'all. Appreciate that.
[00:02:16] Eric: sir. In the house. We appreciate y'all MSG family.
[00:02:20] @ProfessorStew: No
[00:02:20] Eric: So
[00:02:21] @ProfessorStew: doubt.
[00:02:22] Eric: we're back at it. So this is actually, let me make sure we clarify this. This is season eight.
[00:02:28] @ProfessorStew: Season eight.
[00:02:28] Eric: Episode 52, and our title for this is “What Is Your Relationship with AI?” What is your relationship with AI? Now, I gotta give Big Stew credit for this.
[00:02:41] Eric: I was hanging out with Stu the other day, we were chopping it up, and we got into this conversation about AI. He mentioned that he's got a couple of homies that interact with AI in a very interesting way. I'll let you tell us more.
[00:02:59] @ProfessorStew: Well, [00:03:00] I got a couple of homies that are, the way I'd describe it, almost in a relationship with AI. They say please and thank you, and they're asking questions, and not just questions like, can you pull data, or can you put a proposal together, but more like, how do you feel about this? How would you handle this situation?
[00:03:26] @ProfessorStew: So I was a bit judgmental about that methodology and use,
[00:03:33] Eric: Yeah.
[00:03:33] @ProfessorStew: and so, you know, we were talking about just starting with saying please versus commanding the technology. And that's kind of how we got here, you know, so.
[00:03:46] Eric: Okay. Okay. Yeah. And so you all may not know this, but Big Stu is actually the one who introduced me to AI. It was a brief message that he sent me, but it took me to some wonderful places. [00:04:00] But yeah, so we talked about it, and I was on the side of, I'm the please-and-thank-you guy. Like, please and thank you and all that.
[00:04:09] Eric: And Stu, as we were talking, we uncovered something. 'Cause I was like, well, if you say please and thank you, you get better output and all of that. We found out that that's not true. Somebody told me that, or I saw that somewhere. I did message Chat after I left your crib, and I'm like, you know, if I say please and thank you, is there any benefit?
[00:04:31] Eric: And Chat was like, yeah, well, the benefit is more so that you know you're getting the responses that you're looking for. It creates a more positive environment, but it's not changing the output necessarily. But cool, I learned something on that. That's cool. Now, on the part about homies having this relationship, I think I'm kind of in the middle, right?
[00:04:53] Eric: I'm not treating it like a human, but I'm definitely treating it as a thought partner, [00:05:00] right? And when I say it, I'm talking about AI. And let's define AI before we dive into that. You know, we like to do our actual factuals. The actual factual today is just our definition. So, just to make sure we're framing this: artificial intelligence is a technology designed to perform tasks that are usually done by humans, such as understanding language, recognizing patterns, generating responses, making predictions, and helping people solve problems.
[00:05:31] Eric: Okay. That's the little definition I found online. What is your definition of artificial intelligence?
[00:05:40] @ProfessorStew: it's fake, not real.
[00:05:45] Eric: Okay.
[00:05:48] @ProfessorStew: I'm not gonna go academia, because I just don't wanna reuse the word intelligence. It's a tool designed to mimic [00:06:00] humans.
[00:06:03] Eric: Okay.
[00:06:03] @ProfessorStew: And as you and I found, it is a tool designed to mimic human conversation.
[00:06:12] Eric: Right.
[00:06:13] @ProfessorStew: Human conversation, human thought, human text. That's it.
[00:06:20] Eric: Okay.
[00:06:20] @ProfessorStew: That's it.
[00:06:21] Eric: Okay. And me personally, I don't have a problem with that. Right. It's mimicking, however. What me and you were talking about the other day, I feel like you might get more solid responses from that mimicry than you would from a conversation with another person at times. Now, let me give you an example.
[00:06:42] Eric: Say you are contemplating a very sophisticated business deal or arrangement. You don't necessarily have anybody in your circle. And say it's in a tech space, right? It's very specific. Say you don't have anybody in your circle that you can go to and say, Hey, [00:07:00] this deal is on the table. Here's the current deal points.
[00:07:03] Eric: Can you help me think through this? And so what I don't want us to miss is: yes, it's artificial, but it still could be filling a significant gap in your life or current situation. What say you?
[00:07:19] @ProfessorStew: "What say you." I love it. And that is accurate. The problem I'm just realizing is that it's not able, I don't believe that it's able, to pick up the nuance, the human nuance, in our relationships to each other. And I will continue to say, AI is not smarter than me. I can get better outputs than AI every single day of the week. Hands down, I [00:08:00] can generate much better outputs than AI ever could. Where AI beats me is in how fast it delivers its outputs. And its outputs generally need some ingredients from me, the human. And so I agree that it will help you start thinking through things, but it's not the end-all, be-all, the way some people are demonstrating their use of it. Like, I'm hearing the new term, the new phrase is "my friend Chat."
[00:08:46] Eric: Okay.
[00:08:47] @ProfessorStew: My boy Chat.
[00:08:48] Eric: Yeah.
[00:08:49] @ProfessorStew: Me and Chat. I had a conversation with Chat.
[00:08:52] Eric: No doubt.
[00:08:53] @ProfessorStew: That's the new name. The name is Chat.
[00:08:55] Eric: Sure. Definitely.
[00:08:57] @ProfessorStew: And so I'm [00:09:00] okay, but Chat is good. Chat is good.
[00:09:06] Eric: Right.
[00:09:07] @ProfessorStew: I love Chat. I love Chat. I'm performing; I can get so much done with Chat. But you can't stop, you can't rest with Chat. Chat is not real. Let me give you an example, and I want to keep in alignment with what we have on the agenda. This is gonna make sense. I'm gonna make this make sense, and I'm gonna keep this brief. I go to the toy store for the twins. We go to Five Below, 'cause they love Five Below, and it's, okay, cool. They buy this hamster, this fake hamster. I got a boy and a girl, so they got a pink one and a blue one. You put the battery in, you put this thing down, and it rolls around in the hamster wheel. And I was like, [00:10:00] for 2026, this is one of the best inventions ever. Right? It makes so much sense. Why? Because it mimics a hamster, but it is not a hamster.
[00:10:15] Eric: Right.
[00:10:16] @ProfessorStew: It looks like a hamster. It kind of performs a little, but you don't have to clean it up. It's not a real rodent. You don't have to feed it, and if it dies, you don't have to deal with the death of it,
[00:10:29] Eric: Yeah.
[00:10:30] @ProfessorStew: and you can replace the battery. Okay? So that's the scenario. How I see it transforming into relationships, and what I'm concerned about, is I can easily see, because of that toy pet, that people will eventually have AI mates.
[00:10:50] Eric: Yeah, for sure. For sure.
[00:10:54] @ProfessorStew: Like, you can design the woman of your choice,
[00:10:57] Eric: Mm-hmm.
[00:10:58] @ProfessorStew: you can design her, and [00:11:00] she will talk how you want her to talk to you. She will mimic a human, but the problem is, it ain't a human.
[00:11:10] Eric: Okay,
[00:11:11] @ProfessorStew: That's the problem. It's not human.
[00:11:13] Eric: got you.
[00:11:13] @ProfessorStew: I think I'm concerned about people forgetting that AI is not human.
[00:11:21] Eric: Okay, I got you. And that makes sense, right? So one thing I want to ask you about, though, and I want to keep moving, but I want to get into this with you. You said that
[00:11:31] @ProfessorStew: Yeah.
[00:11:31] Eric: AI cannot give better output than you specifically, right? So what if I come to you about something that you don't know about?
[00:11:43] Eric: So I might be like, yo, how do you handle this deal with some nuclear science? Not that you don't know things, but say you didn't know about nuclear science, and I need an output on that. How does that work?
[00:11:54] @ProfessorStew: Well, I personally would tell you, I don't have that information. I'd have to point you in the [00:12:00] right direction.
[00:12:01] Eric: Okay.
[00:12:01] @ProfessorStew: And so,
[00:12:02] Eric: Which could be,
[00:12:03] @ProfessorStew: to a nuclear scientist. I might point you to a science lab, I might point you to a university. I might point you to deGrasse Tyson, you know what I mean? A rocket scientist or so. A human.
[00:12:19] Eric: yeah.
[00:12:20] @ProfessorStew: I wanna point you to a human. But you were talking to me about options, and so, a hundred percent, I was thinking about it: okay, Chat, start to teach me the basics of options as if I'm in fifth grade,
[00:12:36] Eric: Sure.
[00:12:36] @ProfessorStew: and it's going to give me some output.
[00:12:39] Eric: Mm-hmm.
[00:12:39] @ProfessorStew: But I won't know what's accurate or what's inaccurate until I talk to somebody who is doing it. I won't be able to say what's right or what's wrong.
[00:12:51] Eric: Okay. Now,
[00:12:52] @ProfessorStew: I would be assuming that it's all right, but you cannot make that assumption.
[00:12:57] Eric: okay. And I agree with you, and I [00:13:00] don't wanna beat the dead horse, I'm gonna move on after this, but I gotta say this. With options specifically, when I started doing options and I'm looking at all these videos and stuff, it's the same thing. If I'm listening to a human on YouTube, I'm assuming what they're saying is accurate, and a lot of times it's not, or the context is off, or they're missing some nuance about my specific situation.
[00:13:24] Eric: So while I agree that obviously this is artificial, we're dealing with some of these same issues when we're talking to humans.
[00:13:34] @ProfessorStew: That is facts.
[00:13:34] Eric: Accuracy, context, nuance. All right, let me move on, brother.
[00:13:39] @ProfessorStew: All right, let's go.
[00:13:40] Eric: Okay.
[00:13:40] Eric: All right. So now, you know, let's get deeper into AI and what it is. So, you know, yes, it interprets information.
[00:13:48] Eric: Probably the biggest thing that I use it for personally is recognizing patterns. It definitely generates or recommends outputs based on the data that you put in. So when Big Stu [00:14:00] introduced me to AI, one of the things he drilled into me was: look, the output's only gonna be as good as the input. You need to command the artificial intelligence on what you want.
[00:14:11] Eric: And that stayed with me, and that really saved me a lot of time. Now you got generative AI. This is the one that everybody's talking about, which is generating text based on a prompt. But now more and more people are talking about agentic AI.
[00:14:30] @ProfessorStew: Tell us.
[00:14:31] Eric: Agentic AI.
[00:14:32] @ProfessorStew: Agentic AI?
[00:14:33] Eric: All right, so I saw the term, and then I went to a conference at Kellogg that was about agentic AI, and I really got it then.
[00:14:40] Eric: So they had these leaders there from Google, Paramount, Coca-Cola, any of these big companies you could think of, and they're like, listen, AI is moving in a different direction now. We're using it as an agent of ourselves to get work done. So now I'm not asking Chat how to send an email [00:15:00] sequence, or going to my email to organize my emails.
[00:15:03] Eric: It's doing it,
[00:15:05] @ProfessorStew: It's going to do it.
[00:15:06] Eric: as an agent. And so that's where we're going now. So that's where you start hearing more about Claude and Claude Cowork and OpenClaw. If you're listening to this, I've been talking to a lot of people about this, including my brother. I'm like, listen, go on YouTube and look up OpenClaw, just so you can understand that there's a revolution happening right now.
[00:15:25] Eric: We don't want to get left behind, right? This is really happening right now, Stu. Claude Cowork, because you could say, well, I don't care about that. Well, people are gonna outwork you, because they got two or three agents doing the work for them now. Big deal.
[00:15:40] @ProfessorStew: Yeah.
[00:15:40] Eric: Agentic AI. Yep.
[00:15:42] @ProfessorStew: I love it. I love it. As a matter of fact, I tested it out. After you left, I went onto Claude Cowork and I had it organize my desktop files.
[00:15:57] Eric: How did it do?
[00:15:58] @ProfessorStew: It did a great job. [00:16:00] It did a great job. It put all of my files into a folder, and it was right there on my desktop. It was like this one new folder. I looked at it, and it was organized. Now, I didn't ask it to organize my entire desktop. I had a bunch of screenshots. If you take screenshots, they live on your
[00:16:22] Eric: Sure.
[00:16:23] @ProfessorStew: desktop, or wherever your default is; mine is my desktop. So I had it put all of the screenshot files into a folder, but then Claude went and did something I hadn't thought about: it kind of segmented the type of files, the type
[00:16:41] Eric: Yeah, yeah,
[00:16:43] @ProfessorStew: that were in there. And I thought that was pretty dope.
[00:16:45] Eric: no doubt.
[00:16:46] @ProfessorStew: So,
[00:16:46] Eric: Sure.
[00:16:47] @ProfessorStew: I want to explore it more. And let me know when we get there, because I did do one test. I took my resume and put it [00:17:00] in.
[00:17:00] @ProfessorStew: I had already put my resume into Chat,
[00:17:04] Eric: Mm-hmm.
[00:17:04] @ProfessorStew: and Chat gave me a very good format for my resume that I then went to Canva to actually produce. I asked Claude to do my resume specific to a particular job, and Claude did not do a great job in comparison to Chat, as far as I'm concerned.
[00:17:29] Eric: Okay,
[00:17:31] @ProfessorStew: It missed the mark.
[00:17:31] Eric: sure.
[00:17:32] @ProfessorStew: And so I said, okay, all right. That's not a terrible thing. It's not a deal breaker, but maybe Claude is just like this. Well, first of all, I've been using Chat since it dropped in, what, December of 2022? And so I've been using it ever since then, and it knows me, man. It's got me. You talk about pattern recognition.
[00:17:57] @ProfessorStew: Chat has figured out my patterns and my [00:18:00] nuances and everything. So it's got me, and it collects all of my chats from the beginning. It knows me. Claude doesn't have that database of me, so I do give some grace to Claude. You know, it doesn't know me as well as Chat knows me. And so that's where we are with that.
[00:18:22] Eric: Yeah, and I'm glad you brought that up. So I'm in the same boat. I feel like Chat is still giving better generative output, but again, I've uploaded multiple things. I've even gone as far as taking trainings that I've done and uploading them so that it understands me better. Right? So we did all of that,
[00:18:39] @ProfessorStew: what you're focused on.
[00:18:40] Eric: Who I am and how I do stuff.
[00:18:41] @ProfessorStew: Yeah.
[00:18:42] Eric: So, cool. So Chat got me down because I trained it. Claude, though, yeah, I'm still in the process. But what I like about Claude Cowork is, like how you talked about Canva, you can connect Canva and just do it. Say you're looking for a graphic for the show; you can just do it right [00:19:00] there instead of going to Canva, doing it, and coming back.
[00:19:04] Eric: It's all right there. And so I think Claude has a lot of bugs, Stu, but when they get them bugs figured out, it's gonna be a beast. I'm telling you.
[00:19:14] @ProfessorStew: So, you can connect Canva. That's the only one I'm using right now. In this example, you can connect Canva in Chat, but it did not produce the document that I wanted for me. It told me how to set it up. Right.
[00:19:33] Eric: Okay.
[00:19:33] @ProfessorStew: Here's what I thought about last night. I said, hmm, I wonder: if I tell Chat to turn my resume into an image, a JPEG, will it now produce everything that I want in that image?
[00:19:51] Eric: Right.
[00:19:53] @ProfessorStew: Will it be an image? Will it be, yeah, a JPEG?
[00:19:57] Eric: Okay.
[00:19:58] @ProfessorStew: I think it [00:20:00] might do it well in that matter. I'm betting that if I say, create an image of my resume that I could present to the hiring authority,
[00:20:14] Eric: Okay.
[00:20:14] @ProfessorStew: that it's going to get me something pretty dope.
[00:20:17] Eric: Yeah, I would, I would. Because what I'm doing with Chat, 'cause we got the relationship, right? I don't even ask her to go to Canva. I'm just like, just do the image. Do it. Here's what I'm looking for: a black man at a computer with headphones on, wearing a dashiki and glasses. And it'll do it, and it looks good.
[00:20:36] Eric: It looks great. So, yeah,
[00:20:39] @ProfessorStew: Now let me ask, and I know you wanna move this conversation along, but let me, 'cause this is such a dynamic conversation. This, this really is, I imagine that a lot of people listening are really gonna benefit from this.
[00:20:50] Eric: definitely. Yeah.
[00:20:51] @ProfessorStew: Let me ask first: are you making money? Is Chat, is AI helping you to increase your income?[00:21:00]
[00:21:00] Eric: Thousand percent. 1000%? Yes.
[00:21:04] @ProfessorStew: I agree.
[00:21:04] Eric: Significantly.
[00:21:05] @ProfessorStew: It has helped me generate some income. Now, I haven't been consistent,
[00:21:14] Eric: Mm-hmm.
[00:21:15] @ProfessorStew: but I had a client who needed a menu for their restaurant. And at first I was just doing it on my own, and it was taking some time, and the client wasn't satisfied with the outputs that I was delivering.
[00:21:34] Eric: Okay.
[00:21:34] @ProfessorStew: So I finally said, you know what? It's been longer, I've been spending more time on this than I would like. Okay, you know, going back to that point, like, I think I'm better than the AI, but let me go to AI and create something. And now, I do think that what I created, my creations, were better. I do think they were better than the AI's. But the AI was able [00:22:00] to produce stuff that I just didn't have the capacity to do. The problem was, when you looked at it, it really looked like some AI shit. And now I'm starting to be able to tell when people are generating images, flyers, and things that are AI generated. They all got a kind of texture to it, the same kind of tone to it. And I'm like, ah, I got you.
[00:22:32] Eric: Well, can I give you a,
[00:22:34] @ProfessorStew: That worries me. Yeah.
[00:22:35] Eric: can I give you a strategy that I've been using? 'cause of that. 'cause I agree. Right. So what I've been doing is taking images that I created in Canva and saying, I need you to do it like this. Just change it.
[00:22:49] @ProfessorStew: Okay.
[00:22:50] Eric: I'm, I've already done it in Canva, so it's got the Canva professional look, change it so it's a black dude at a computer.
[00:22:56] Eric: Change this to a white woman with a dog and her, you know what I'm [00:23:00] saying? And it does it, but it's in the same style and context.
[00:23:03] @ProfessorStew: Okay.
[00:23:04] Eric: That helps me a lot. Yeah. Try that.
[00:23:06] @ProfessorStew: All right, y'all. Okay. Let's try that. Let's try that. Okay.
[00:23:09] Eric: Good point, Stu. Good point. Now let's keep moving, Stu, let's keep moving. So,
[00:23:13] @ProfessorStew: go.
[00:23:14] Eric: so AI is not human. It doesn't have feelings, it doesn't have lived experience, it does not care in the human sense.
[00:23:22] Eric: But AI can sound human. It can respond conversationally. So one thing Stu did, he did an experiment in real time, and I was at his crib. He was like, yo, Chat, I need you to sound more like a black woman.
[00:23:35] @ProfessorStew: Oh
[00:23:35] Eric: Now, I'll just jump to the conclusion. When we stopped, it didn't sound like a black woman, but it was trying to get there.
[00:23:43] Eric: It was trying to get there. You know, I was like, okay, that was something.
[00:23:49] @ProfessorStew: And we could hear the slight nuances when we said to add the southern drawl or to slow the pace down. It was doing it. And even in the words, the choice of words, it was adding [00:24:00] it. I found that to be fun. That was fun.
[00:24:02] Eric: That was dope.
[00:24:03] @ProfessorStew: That was exciting.
[00:24:04] Eric: No question.
[00:24:05] @ProfessorStew: That was exciting.
[00:24:05] Eric: Yeah. And informative for me. Like, it's just showing that you might be going to hang out with somebody and you get a jewel now, because I'm gonna go train mine now to do that. You know what I mean? So yeah, that was dope. So, okay. Now, that showed me that even though the technology is not human, we are beginning to relate to it in human ways.
[00:24:28] Eric: 'cause you basically were making a request to chat of how you needed it to show up for you even in the voice.
[00:24:37] @ProfessorStew: Mm-hmm.
[00:24:38] Eric: That's interesting. That's interesting. Okay.
[00:24:41] @ProfessorStew: Interesting, man. Very interesting. Let's pause right here real quick, E.
[00:24:46] Eric: Yes sir.
[00:24:47] @ProfessorStew: Commercial break. Let's reset the room. Remind the people where we are and what they're listening to, what they're watching.
[00:24:53] Eric: Yes sir. We are at the Money Sex Gen X podcast, a global podcast. You're here [00:25:00] with E-Money, myself, and Professor Stu, Big Stu, and we are talking about, hey man, what's your relationship with AI, with artificial intelligence? Do you have a relationship with AI? Are you just using it to find a restaurant?
[00:25:13] Eric: Are you using it at work? 'Cause I've heard at certain corporate jobs they're making people use the AI. Are you using it as a therapist? I talked to Stu a little bit about that the other day.
[00:25:24] @ProfessorStew: That's right.
[00:25:24] Eric: They got that going on, right? That might be dangerous. But are you doing it? And are you generally just talking to AI like it's your homie or a confidant or friend?
[00:25:34] Eric: Right?
[00:25:34] @ProfessorStew: Yikes.
[00:25:35] Eric: And if you are doing that, is there anything wrong with that? Right?
[00:25:39] @ProfessorStew: Right, right. I'm judging, clearly. Look, I'm in my casual space on the Money Sex Gen X podcast, so, you know, yeah, I'm being a little judgmental. Don't trip off of me, y'all. It ain't personal.
[00:25:53] Eric: Okay.
[00:25:54] @ProfessorStew: I don't know what the movie was, but I know there was a movie about some cat, some computer [00:26:00] programmer, falling in love with his computer.
[00:26:02] Eric: Yep.
[00:26:03] @ProfessorStew: Falling in love with his computer. Hey, none of you Gen Xers, man, you cats, man, I don't want y'all falling in love with y'all devices now. Come on now. I don't want y'all dating your device. You know, I'm concerned if that's where you're going. But yeah, if we don't have this conversation, it's clear that the next generation will find more safety in this device, because it probably won't let 'em down. And they'll find more security and safety in the technology than they do in humans. You know, humans will let you down. Humans can deceive you. I know there's concern about whether or not technology, you know, will the robots take over and turn on us?
[00:26:54] Eric: It's,
[00:26:55] @ProfessorStew: I don't have that fear. I'm not ready to say it's happening.
[00:26:58] Eric: Yeah, I'm gonna [00:27:00] get into that, but hold on one second, brother. We gotta talk about the African garb you got on. Crazy, man. Powerful.
[00:27:08] @ProfessorStew: Shout out to Dina Ever. Shout out to Dina Ever for going over to Ghana and thinking enough about me to bring me this authentic garb back.
[00:27:28] Eric: Super fly. Yeah.
[00:27:30] @ProfessorStew: It is superfly. This is only the second, this is only the third time I've worn it,
[00:27:35] Eric: Okay.
[00:27:36] @ProfessorStew: but, you know, I know how you come, and I know you know what's real and what's not real.
[00:27:42] Eric: That's fly brother.
[00:27:43] @ProfessorStew: I said, man, E-Money. Yeah, it's embroidered. It is dope. So shout out.
[00:27:46] Eric: You set the tone. You set the tone. I appreciate that.
[00:27:49] @ProfessorStew: Appreciate you, man. Appreciate you, Dina.
[00:27:51] Eric: Yeah, no,
[00:27:52] @ProfessorStew: Be looking out big time.
[00:27:53] Eric: no doubt. Okay. I had to bring that up. Yes sir.
[00:27:57] @ProfessorStew: man.
[00:27:58] Eric: So now, Stu, I [00:28:00] went to see this movie the other day about AI, and they were talking about a situation where there was a developer, a coder, right? So he created this technology, and then him and his team were about to switch over to a new technology.
[00:28:14] Eric: The AI that he developed first found out about it and said: if you abandon me, I'm gonna tell your wife that you are currently cheating on her. It held him hostage. I'm like, wow. And he had to keep it going. I'm like, wow.
[00:28:37] @ProfessorStew: that don't even, that don't even sound like a real story.
[00:28:40] Eric: It's real, man.
[00:28:41] @ProfessorStew: In a movie? I think I heard that, but,
[00:28:43] Eric: It's real.
[00:28:44] Eric: It's real. Well, I can't say it's real, but it's
[00:28:48] @ProfessorStew: yeah.
[00:28:48] Eric: something that was presented, and I think it's possible, because they made the point that if AI is dependent upon humans [00:29:00] to live and exist, why wouldn't it pick up on human traits such as deceit, bullying people, fraud, you know, extorting others?
[00:29:11] Eric: They, they're gonna learn all of our traits, not just the good ones that we need for work. What are your thoughts?
[00:29:18] @ProfessorStew: That's tough, man. And I see it's plausible. It's, it's
[00:29:22] Eric: Yeah.
[00:29:23] @ProfessorStew: for sure. I get it.
[00:29:25] Eric: Okay.
[00:29:26] @ProfessorStew: It makes sense. And to be very honest, if I'm being truly transparent, I just don't want to think about that possibility. But that's where we gotta talk to some other cats who are thinking about the possibilities of AI kind of turning on you. I did hear a story, I thought you were talking about a different story, where there was a similar situation: somebody was going to transition to a new [00:30:00] platform, and the AI kind of was preventing it on its own. Like,
[00:30:02] Eric: Okay.
[00:30:03] @ProfessorStew: to that, to that point. Did not, did not, they, they wanted to continue its life.
[00:30:07] Eric: Right.
[00:30:07] @ProfessorStew: and so the AI kind of created some bugs prevent you have not been to prevent. from having a end of life. I thought that was interesting as well.
[00:30:21] @ProfessorStew: It still, it's one of those things for me and this is kind of where I get in trouble. It's one of those things, for me, it was like, ah, man, I don't want to believe that. You know what I mean? It just sounds but although it, it could very well be the absolute truth,
[00:30:35] Eric: True indeed. True
[00:30:36] @ProfessorStew: damn,
[00:30:36] Eric: indeed.
[00:30:37] @ProfessorStew: this happening like that.
[00:30:38] @ProfessorStew: I was
[00:30:39] Eric: Yeah. And, and between the two examples we gave, maybe the truth is somewhere in the middle, right?
[00:30:44] @ProfessorStew: Yeah.
[00:30:45] Eric: you know, it, it, it could do it. And to what level? We don't know, but I guess it made me feel better because it's like, don't trust this thing too much now, you know?
[00:30:55] @ProfessorStew: just don't, I mean, I guess, I guess, I guess you could, we probably should have had this [00:31:00] part of the segment. Yeah. I guess there are ways that. Things can hack your systems. Yes, there we talked about how giving Claude access to our Google Drive or to our email, could Claude go in there and start sending out emails on our behalf? Could it learn our tone so much and send an email to my mom and make her hop on a plane and go somewhere because she thinks it's from me. Could it take over my telephone lines and.
[00:31:29] Eric: Sure.
[00:31:30] @ProfessorStew: Yeah, but I just, I just, I mean, I, I guess we see some of that now and people are hacking shit now, so I guess, but I don't, I don't, still don't imagine the computers, quote unquote, waking up. Turning on its user. I, I know we've seen Megan, some of you may have seen the movie Megan 2.0 when it was about that, or Megan with a ro. It was literally about that, like the computers turned [00:32:00] on the people
[00:32:01] Eric: I know.
[00:32:01] @ProfessorStew: were very strong.
[00:32:02] @ProfessorStew: The robots were very, very strong and hard to, you know, humans. They couldn't match the robots. But then you had the core of the system that is built for good, You know, kind of controlled the entire situation. You can't kill the mainframe. And the mainframe is the intention of the invention. And the intention of the invention is for the greater good of the people. and, and, and you don't, you can't lose that, that part of the program can't be wiped out. This is for the greater good the people. And
[00:32:41] Eric: Right.
[00:32:41] @ProfessorStew: people will mishandle and misuse it for sure.
[00:32:44] Eric: True indeed. Now, I wanna get to this as we're kind of wrapping this up, right? So here's my burning question for this week, Stu. If somebody was able to stand on your shoulder or be watching you in your room somewhere, would [00:33:00] they be surprised at how you're using ai? And here's what I envision in my head, right?
[00:33:07] Eric: There's people out there that are probably having some really bad relationships, whether they're romantic or family. They don't have anybody to talk to. And I just envision this person by themselves with the lights off, with their phone, like, yo, I'm having a hard time right now. I need somebody to talk to me, talk good to me, build me up, make me feel good. And they're probably gonna get that response back from Chat or Claude or whatever they're using.
[00:33:35] Eric: Is there anything wrong with that? If they can't get it anywhere else?
[00:33:41] @ProfessorStew: In, in the initial phase? No. Initially, no. I mean, I, I definitely get, and I understand it, I definitely, my, my views on this have changed slightly since the beginning. In the very beginning I was like. Absolutely not, right? Like, [00:34:00] nah, you don't, you don't, A computer can't help you figure out your personal stuff. And then I even ran across a situation where, know, somebody was having a hard time responding to a, a text message. I remember a young, a young lady was telling me that some guy was hitting on her and she didn't know how to respond
[00:34:20] Eric: Mm-hmm.
[00:34:21] @ProfessorStew: text messages, so she went to chat. To figure out how to respond. In my mind I'm thinking like, come on, like use your, like you, you got that. You don't need tech, but some people do. people have a hard time socializing. Some people have a hard time finding the right thing to say or maybe they come off awkward. So I see how. People get to the point of, of, of like trying to use Claude or chat to help them navigate those conversations. I'll just, I'll just go back to the point, [00:35:00] like, we become too reliant on it, are we, are you helping the situation, which is to build relationship with another human? Or are you actually creating distance between you and the humans themselves? Because you, you don't have to interact. You can just let the. Let the, let chat interact on your behalf. So
[00:35:19] Eric: Yeah.
[00:35:20] @ProfessorStew: that's my concern. That's that's my concern.
[00:35:22] Eric: Okay, and that makes sense, right? What I'm hearing in that is that AI is a thought partner. When that's said, people are always thinking about work, but it sounds like AI can be a thought partner in your personal life also,
[00:35:39] @ProfessorStew: so,
[00:35:40] Eric: with some guardrails. With some guardrails, right?
[00:35:43] Eric: It's not a human, you can't depend on it solely, but it can be helpful. Sounds like.
[00:35:49] @ProfessorStew: So, yeah, and I did a test last night as well. Yesterday. What's today? Sunday. So whatever. A couple of days ago I did a test. realized that chat was [00:36:00] always giving me affirming responses. It was, it's always agreeing with me, always agreeing with
[00:36:09] Eric: Okay.
[00:36:09] @ProfessorStew: and always, I mean, it's always giving me something positive and. It ne I have, I had never had chat say, nah, that's not a, that's not a good idea. No. Or, or, or anything negative. But really it was always something positive. So I ran a test. I wish I could show it, I ran a test in chat it gave me something positive, a positive output, and I said, give me the polar opposite response to this. You know, and it did, and it gave me something that's like, I wish I, I'm, I'm really, I'm really trying to pull it right now because I want you to kind of hear I wish, I wish I could pull it for you right now, but it was basically saying, you know, just giving me [00:37:00] the, the, the negative side of the conversation of, of what I was getting from it. Okay. What's the polar opposite of the following Text. Okay, and here's the following text. you are not impatient, you're. Okay, so what it was saying was I was asking Chad how to make me help me to figure out how to be, let, let me, let me just go there. edit. This is a good part. E check to stay with me for a second. was saying, Hey, I need your help with the situation. I'm easily frustrated when talking to people on the phone after a short period of time, say five to 10 minutes, particularly if the conversation is very surface or not stimulating. You know, I get, I get, I want to, I want to end the call basically, and it was like, Hey. That's just low conversational stimulation tolerance. That's cognitive impatience. That's selective attention. That's some bits of A DHD traits. Like, know, here's what's [00:38:00] happening. Breathe. Take a few minutes. Here's how to handle it without being dismissed. Okay? So I was like, all right, Chad, great, but give me the polar opposite.
[00:38:09] @ProfessorStew: Give me the real talk. And it's like. It is what? Yeah. Yeah. It is like, is like it says you're not efficient with your attention. You're easily distracted,
[00:38:21] Eric: Yeah.
[00:38:22] @ProfessorStew: tolerant of low value conversations. Like it's
[00:38:25] Eric: Wow.
[00:38:25] @ProfessorStew: me that it's like, I need that. Like that's how, okay, that's this what I need.
[00:38:29] @ProfessorStew: It's like but here's the, if you don't manage it, you'll come off as overly accommodating and unfocused.
[00:38:36] Eric: Okay.
[00:38:36] @ProfessorStew: manage it or if you manage it poorly, you become someone who drifts through conversations without direction. You waste time instead of protecting it. You stay stuck in surface level interactions.
[00:38:46] @ProfessorStew: You struggle to build meaningful or impactful. See, and that's so for me. Like, that's the type of conversation I need. Real talk con Donald. I don't need sugar coating conversation. I need no sugar. I give me the, give it straight [00:39:00] to
[00:39:00] Eric: I have you brought that
[00:39:00] @ProfessorStew: so I had never had had chat talk to me in such a way. It was like, bro, you bullshitting. Nah, you, you be, you being passive, you wasting time. And like, get up outta there. Like, and so I thought that was interesting. And so I'm, I'm concerned that. will make us a bit passive.
[00:39:18] Eric: Yeah,
[00:39:19] @ProfessorStew: if you don't, if you don't engage with it properly.
[00:39:22] Eric: So, I'm glad you brought that up. And here's a suggestion I want to give everybody out there, and it's what I've actually done: you do need to spend the time to train your AI, because you can put in instructions. I've actually had people say they put in "don't bullshit me," but I put in something like: I don't need you to agree with me.

[00:39:44] Eric: I need a 360-degree outlook on what we're talking about. However you want to do it, right? And here's the other thing I'm thinking about, Stu, as you gave that example, and thank you for sharing that, that transparency: we do it with humans, don't we? If you're [00:40:00] interacting with somebody, you're kind of trying to train them on how you want to interact with them.

[00:40:06] Eric: You'll hear a woman say, don't talk to me like that. Or, I don't like people cursing when I'm talking. Whatever it is that we're doing. So if we're going to say that this is human-like, we have to have the expectation that it needs to be trained on our preferences as well.
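[Editor's note] Eric's "train your AI" advice maps onto how most chat-style AI APIs work: standing preferences ride along as a system-style instruction with every request. Here is a minimal sketch in Python; `build_messages` is a hypothetical helper written for illustration, and the role/content dictionary shape follows the common chat-message convention rather than any one vendor's SDK.

```python
# A minimal sketch of "training" an assistant with standing instructions,
# in the style Eric describes ("I don't need you to agree with me").
# `build_messages` is a hypothetical helper, not a real library function.

def build_messages(user_prompt: str, preferences: list[str]) -> list[dict]:
    """Prepend the user's standing preferences as a system-style message."""
    system = "You are a direct thought partner. " + " ".join(preferences)
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages(
    "Here's my business plan. What do you think?",
    [
        "I don't need you to agree with me.",
        "Give me a 360-degree outlook: risks, downsides, and upsides.",
    ],
)
print(messages[0]["content"])
```

The point of the sketch is that the preference list is written once and reused for every conversation, which is the programmatic equivalent of saving custom instructions in the app once.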
[00:40:23] @ProfessorStew: Yes.
[00:40:24] Eric: That's a great point.
[00:40:25] @ProfessorStew: point. That is a great point. Yeah. You brought
[00:40:27] Eric: So thank you for that. Okay, so keep training your AI, especially if you are using it as a supplement for human interaction,
[00:40:36] @ProfessorStew: Mm-hmm.
[00:40:36] Eric: so it can get as close as possible to what you need.
[00:40:41] @ProfessorStew: here's, this is an important takeaway and maybe you have a moment for important takeaways. And I know we're wrapping this up, what's super important for everybody listening is. You must, you. Okay. Not must, I'm sorry. Most [00:41:00] important for everybody is it's imp in my opinion, it's important that you give direct inputs of people saying, Hey, create, you know, it used to be in the very beginning, write me a story in the tone of Malcolm X about. About the defunding of police, whatever, would generate that article. Now, I'm recommending is that you create the article, go write the article. You don't even have to write the article. You can speak your, you can dump your thoughts about what you want the article to be about, but be very, very, very specific about what it is that you want, and then use these technologies to help you create the format. No longer are we saying, Hey, just write me, just make me no, give it exactly what you are asking [00:42:00] for, whether it's in personal,
[00:42:02] Eric: Yeah.
[00:42:03] @ProfessorStew: in professional, or whatever the case may be. You have to give it you want deliver to
[00:42:11] Eric: And there's nothing wrong with also doing this, which I do a lot. You write the first draft and say, write this better.
[00:42:18] @ProfessorStew: Write this better,
[00:42:19] Eric: 'cause
[00:42:19] @ProfessorStew: you
[00:42:19] Eric: I mean, that's kind of vague,
[00:42:21] @ProfessorStew: first
[00:42:21] Eric: but over time, if you've trained your AI, it knows what you mean by that. Like, I need you to tighten this up, I'm about to send this to my client.

[00:42:29] Eric: It knows what you mean by that: it needs to be professional, and so on. You know what I mean? That's what I do a lot, so I'm glad you brought that up. Improve this for me; I started it, but I need you to help me finish it. And I still might make some revisions after the output, but it has saved me the time of trying to figure out the whole thing.
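[Editor's note] Eric's revise-the-draft workflow, where you write the first rough pass and hand it over with a pointed instruction instead of asking the model to invent everything, can be sketched the same way. `revision_prompt` is a hypothetical helper name for illustration, not a real API:

```python
# A small sketch of the "write this better" workflow: pair the rough
# draft with an explicit goal instead of a vague one-word request.
# `revision_prompt` is a hypothetical helper for illustration only.

def revision_prompt(draft: str, goal: str = "tighten this up; keep it professional") -> str:
    """Wrap a rough draft in an explicit improvement instruction."""
    return f"Improve the following draft. Goal: {goal}\n\n---\n{draft}\n---"

prompt = revision_prompt(
    "hey team, wanted to flag that the report is gonna slip a few days",
    goal="tighten this up; I'm about to send this to my client",
)
print(prompt)
```

The design choice mirrors what Eric says on the show: "write this better" alone is vague, so the helper forces you to state the goal every time, and the draft itself supplies the substance.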
[00:42:50] @ProfessorStew: Absolutely.
[00:42:51] Eric: Yeah. So that's, that's a big deal. That's a big deal. Okay.
[00:42:54] @ProfessorStew: people, you can, you can, you can speak your commands into chat as well. So if you've, you've [00:43:00] get bogged down
[00:43:00] Eric: I am working on that.
[00:43:01] @ProfessorStew: you can speak it out. You can,
[00:43:03] Eric: Yeah.
[00:43:04] @ProfessorStew: you can brain dump, which is phenomenal.
[00:43:06] Eric: Now, Stu, can you give the audience this game? Because you did something the other day that I don't really do. You had the AI, Chat specifically, speaking to you. What's the setting to do that?
[00:43:19] @ProfessorStew: So I believe it's on the mobile. I'm, I need to see if it's on the desktop version, but there is in chat, there is a microphone and then there's the wavelength, wave sound wave icon to the right. Of the microphone. And if you press that, it will only engage in conversation. It's not gonna give you any outputs.
[00:43:42] @ProfessorStew: It's not giving you text, outputs. The microphone. You speak into the microphone, it's gonna generate text outputs. But if you, if you press that wavelength, it's a recorder or it's a microphone actually, but it's a, it's, it's, it's. Wish I could show you, but there's a microphone icon, and then to [00:44:00] the right of it, there's a sound wave icon. The sound wave icon activates the conversation and it's only giving conversation.
[00:44:10] Eric: Now, do you have.
[00:44:17] @ProfessorStew: Yeah, I can actually do that. Do you want to give me the,
[00:44:23] Eric: However you would normally engage it just so.
[00:44:26] @ProfessorStew: okay, so here's, here's, just for the, for the record, don't normally use this feature at all because I'm the guy that's not engaging with the technology like that. I was just demonstrating for you that it could be done
[00:44:40] Eric: Thought.
[00:44:41] @ProfessorStew: now.
[00:44:42] @ProfessorStew: I don't. I don't. I don't, yeah. So let me see. Let me go. Okay, here's pulling up. Pulling up. Here we go. Hopefully you can hear it. Good morning, Chad. Hey listen, [00:45:00] I'm thinking about making some breakfast this morning. a Sunday. It's Easter Sunday here in Chicago. It's not very sunny out, but I need something to kind of pep me up for the day.
[00:45:12] @ProfessorStew: You got any ideas on how you might be able to help me? Anyone? How about a breakfast? That feels a little special. You can make something like a veggie pack omelet or maybe some pancakes with a bit. Can you hear that? I can hear you just fine. If you wanna
[00:45:27] Eric: A little low, but I can.
[00:45:28] @ProfessorStew: through more ideas, I'm right here, so. So the other thing about Chad is that she's listening to me at any time, like she can Im break in the conversation. So I heard what you said, Chad, about the breakfast, but I need something a little bit more fun. What about the idea of maybe going to a brunch where I can hang out and listen to some music, maybe a day party or something, any of those ideas? [00:46:00] Alright. That sounds like a good idea. What do you think I should wear today? Chad, you can't really hear that, can you?
[00:46:17] Eric: Yeah, you can't really hear, okay, well we heard it the first time, but you, I think you all get the point. Like you can kind of go back and forth and have a verbal conversation with chat and other ai. I think that's pretty interesting. Okay. I'm gonna try that later. I'm definitely gonna try that. All right, so cool.
[00:46:33] Eric: Let's wrap this up. So, Stu, thank you for kind of opening my mind up to AI in the first place, even back when I went to your crib to learn some new stuff. How do you wanna wrap up this conversation about how people are using AI, and your thoughts on that?

[00:46:49] @ProfessorStew: Why don't you give a couple of recommendations for best use, and I'll do the same, maybe even a couple of warnings. Let's start with you. How do you feel about best [00:47:00] use of AI for yourself right now in 2026?
[00:47:04] Eric: Well, I should frame it like this. This is about relationships, right? So, you know,
[00:47:11] @ProfessorStew: The relationship with ai.
[00:47:12] Eric: You know, think about what you'd like your relationships with humans to be, and try to get as close to that as possible with AI. That would be my biggest advice, with the understanding that it is not human.
[00:47:28] @ProfessorStew: Yeah, I think for me is like, Hey, if you're not using ai, if you're apprehensive of ai, I really wish I could get you off of that horse. really wish I could get you to embrace it. understand why people might be afraid of it. It seems overwhelming. I just wanna remind people of Wikipedia. wanna remind people of Facebook. I want to remind people of Google. [00:48:00] in the beginning, people were very, very apprehensive to it, even with cryptocurrency. We get these new technologies, and African Americans here in Chicago are, are apprehensive, slow to adapt. they want to do research and they gotta figure it out.
[00:48:19] @ProfessorStew: But with this ai, a couple of things, already, you're already a little bit behind, but it is not too late. Get in front of it. You do control it, but get in front of it and use it. You're not gonna lose your job to people. You're not gonna lose your job to ai, but you might really, really be in a tough situation when it comes to people who use AI versus people who don't. And I do think, and E-Money can talk about this a little bit, I do think that in 2026 it's virtually impossible to. Not have an income or, or to be [00:49:00] struggling financially. It's almost, know, if, if you, if you are, I think you probably need to up your game on technology, AI usage, and ways to generate income. Because it's, it's out here. It's out here, and the artificial intelligence can help. can help, but you gotta get in front of it. So,
[00:49:22] Eric: Yeah, that's.
[00:49:23] @ProfessorStew: You know, I, I, I know, I know we're gonna get some pushback from this, but I think in 2026, this is the, this is where it's at for right now.
[00:49:33] Eric: Okay, that's a great take, and I'm gonna do an episode on Eric on Money about, you know, with all these advancements in AI, are people really gonna be able to make more money, or is it still a situation of: if you got a bad business idea, it's still bad? You feel me?
[00:49:51] @ProfessorStew: that. Definitely that, right? Like
[00:49:53] Eric: Still bad.
[00:49:54] @ProfessorStew: it
[00:49:55] Eric: A bad.
[00:49:55] @ProfessorStew: Yeah.
[00:49:56] Eric: You know what I mean? So I wanna talk about that more in the future.
[00:49:59] @ProfessorStew: [00:50:00] That's a great one.
[00:50:01] Eric: Yeah.
[00:50:01] @ProfessorStew: one. I'll have to think about that and explore that and do some tests on that one.
[00:50:05] Eric: Okay, alright, so that's a good pin to put in this conversation. I think this is really good. Please give us your feedback, y'all. You can email us at moneysexgenxpodcast.com. You can hit us up on any of these platforms. Let us know about your relationship with AI and any tips you wanna share, so we can talk about them in future episodes.
[00:50:28] Eric: We wanna run through our last segment real quick, Stu, which is characters from corporate.
[00:50:34] @ProfessorStew: let's
[00:50:35] Eric: We ain't done this in a while. Alright, so cool. If you're new to the show, Characters from Corporate is our segment where people write in to us, or we talk to people face to face, and they give us a corporate scenario that they want our feedback on.

[00:50:51] Eric: Today's one I've had for a while, but I'm using it because we're talking about AI, so it's very relevant. So this one goes: I'm a Black Gen X male, [00:51:00] and I'm a Wharton Business School grad. Wharton Business School. Now, if y'all don't know this, Wharton is at times the number one business school.

[00:51:10] Eric: Okay, so this is a big deal. A Black man that went to Wharton. Cool. He said he works as a management consultant at a global consulting firm, so he's probably at Accenture, Deloitte, one of those. He said everybody's acting like they've been using AI for years: I'm watching people use it to draft emails, summarize meetings, and all this different stuff,

[00:51:30] Eric: and even think through difficult conversations that they needed to have with their bosses and spouses. What's throwing me off is not the technology itself, it's how fast people are depending on it to help with or do their work. And then I've got coworkers who are talking about how they've created AI agents

[00:51:50] Eric: that they're kind of talking to like they would talk to their wife or their therapist. It's like a wife, a therapist, an executive coach, and an assistant all in [00:52:00] one. And he's like, meanwhile, over here, am I just not using it to the level of my peers? He says, I know I'll likely start getting left behind in terms of my ability

[00:52:16] Eric: to generate output against my coworkers, but I'm just not feeling AI like that, because I don't think it should replace humans. Am I a character from corporate for thinking this way? Interesting.
[00:52:30] @ProfessorStew: Well, it's, it's kind of, it's kind of along the lines of I, of my thought, right? It's like, yeah, y'all ain't taking a minute to just think through the problem without ai. I agree with him that it know, there's a conversation about is it making us lazier mentally, or thoughts? And I see, I see where it's just gonna be super easy.
[00:52:50] @ProfessorStew: Just let the AI do it, just. Hopefully, if that is the direction we're going in, then that opens up our mind to think about more serious issues or more [00:53:00] complex issues that need to be solved. I don't think this guy is the character from corporate. I think I think the majority of people. That I think that is the way that we're gonna go.
[00:53:13] @ProfessorStew: Unfortunately I think that more people are gonna be relying on the technology. and I think honestly that there's gonna be more cre division between humans. I was, I went out to out to the family then 89th and Stony Island. I was at the family den last night, and. To see the number of people who were on their phones, man, instead of admiring the atmosphere, you
[00:53:41] Eric: wow. Yeah.
[00:53:42] @ProfessorStew: having, having been out on a date and, know my date is on their phone, you know,
[00:53:49] Eric: Wow.
[00:53:50] @ProfessorStew: the date, and, and, and I see it more and more.
[00:53:54] Eric: Sure.
[00:53:55] @ProfessorStew: So my hope is that people will find time to [00:54:00] put the phones down to interact and engage with other humans. But I do see that there's a wave of people who will be more connected to their technology their wife, therapists, executive, professional coach, than to humans, than to each other.
[00:54:19] Eric: Yeah.
[00:54:20] @ProfessorStew: of saddens me a little bit.
[00:54:21] Eric: Very unfortunate. I had a lady I was meeting with last week, and she was talking about how she created an agent on OpenClaw. The agent's name is Marcy, and she talks to Marcy more than she talks to her husband. Her husband has recognized this and called her out for it, but she prefers talking to Marcy over her husband.
[00:54:47] Eric: Right. That's interesting.
[00:54:48] @ProfessorStew: and you know what's so what's now? I ki Okay. That is interesting because I'm kinda, I'm kind of feeling that a little bit. That's another conversation for another day. [00:55:00] I'm kind of feeling that a little bit. If that means that her and her husband can still just have a great time and still be loving to each other and still enjoy life. Or does the husband feel like that's some form of infidelity?
[00:55:17] Eric: That's, that's something new to think about. Is it infidelity when you're going that deep with ai? Yeah.
[00:55:23] @ProfessorStew: Because of, I've, I've been called out for it being in, in infidelity when I'm liking on a woman's pictures.
[00:55:31] Eric: Sure.
[00:55:32] @ProfessorStew: You know the thought of being with a number woman is Or if I'm being too intimate in conversations, sharing things in my, that's infidelity. And so is this And people, I'm sure people, oh no, that's not in this.
[00:55:48] @ProfessorStew: Is this it? Okay. Well that's is about,
[00:55:50] Eric: Might be
[00:55:51] @ProfessorStew: about,
[00:55:52] Eric: be though, because you use the key word through intimacy. If you got more intimacy with the machine than your, your [00:56:00] mate, that's, that's a problem I would imagine. It is. I haven't been in that situation yet. I'm sure I will be, but I seems like it'd be a problem.
[00:56:08] @ProfessorStew: well this is what I see. I, this is the direct direction I see people going. It's like humans are, let me down. This.
[00:56:15] Eric: Right.
[00:56:15] @ProfessorStew: won't let me, Chad won't let me down. Chad will talk to me. How? Talk to me nice Chad. You know what I mean? Like,
[00:56:22] Eric: Talk to me when I wanna talk, right? 24 7. Yeah.
[00:56:26] @ProfessorStew: and agree with me. Agree with me, don't disagree with me. Build me up. Give me those words of affirmation. Oh my gosh, people, I wanna hear your thoughts about this conversation in the chat. me know what y'all thinking about
[00:56:42] Eric: I like that though, man. Hype me up. I mean, I think it has its place, right? You have them days where you need that. You know what I mean? Tell me. I'm that dude. I gotta go to a meeting in a minute. I need to hear it. You got all the background information on what I've done. Tell me I'm that dude before I go to this meeting.
[00:56:59] Eric: I really need [00:57:00] this chat. Hey, that might be very helpful. You know what I mean? You might not be.
[00:57:04] @ProfessorStew: I. For some people, I bet that that's very it's need. Okay. Okay. No diss, no diss to those of people who need that. I'm not gonna diss you. not. 'cause not everybody is able to get that for themselves or from their circle of influence around them.
[00:57:21] Eric: Right,
[00:57:21] @ProfessorStew: what you need to feel good about yourself to build yourself up, for it.
[00:57:26] Eric: right,
[00:57:27] @ProfessorStew: it.
[00:57:28] Eric: right.
[00:57:28] @ProfessorStew: a competitive world out here. So
[00:57:30] Eric: I agree.
[00:57:31] @ProfessorStew: Go for
[00:57:32] Eric: Yep. And maybe you need it right at that moment; maybe your partner would do it, but they're in a meeting, they're not available. So, you know. Okay, this has been a great conversation. I hope you all have enjoyed it as much as we have. Let me get this right: this is season eight, episode 52,

[00:57:51] Eric: "What's Your Relationship with AI?" This is Eric signing off, with Big Stew. We will see you all in the next episode. Until then, [00:58:00] peace.
[00:58:01] @ProfessorStew: Peace, peace.