The Habit Architect
Hosted by Michael Cupps, The Habit Architect is designed to help you intentionally build the habits that lead to success and break free from those that hold you back.
Each episode, Michael guides you through practical strategies for designing focused, productive days that align with your goals and vision. Whether you’re striving for personal growth or professional success, this show will help you create the daily routines and mindset shifts needed to unlock your full potential.
Tune in for expert insights, actionable steps, and real-life examples to transform your habits and build the life you desire—one intentional habit at a time.
THA S02 EP#13 - Ghosts in the Code: AI’s Hidden Influence on Our Habits
For those of you who can't believe it's almost Christmas already, let's go back to the end of October with this episode. AI promises to make life easier, but what happens when these tools actually hijack our habits instead? In this conversation, host Michael Cupps talks with Cortnie Abercrombie, author of "What You Don't Know" and founder of AITruth.org.
They discuss why 90% of AI problems trace back to data, how algorithms reflect human biases, the difference between productivity monitoring and toxic surveillance, and what the new Texas Responsible AI Governance Act means for businesses. Whether you use AI for scheduling, hiring, or productivity, this conversation will help you understand what is really happening behind the interface and how to use AI responsibly.
This show is sponsored by TimeBandit.io
Check out our Live Show Events here: The Habit Architect Live Show
Subscribe to our Newsletter: The Habit Architect Newsletter
Michael Cupps: [00:00:00] Hello and welcome to the Habit Architect. My name's Michael Cupps. I'm the host, as usual, and it's a special kind of trick-or-treat episode today, because we're close to Halloween, if you celebrate that sort of thing. And certainly on every TV and in every store there's candy everywhere, all the things.
Michael Cupps: But the concept of trick or treat is something I want to explore with our guest today. And I'll tell you a quick story. AI is everywhere, and we're all tempted to use AI for just about everything. You know my background with productivity and habits and things like that.
Michael Cupps: I stepped out there and tried an AI tool that was gonna magically make my calendar work for me. And I'd like to say it was successful. I'm not gonna name the brand, but what really happened was it just filled my calendar with any request, and I couldn't figure out a way to prioritize the requests.
Michael Cupps: So instead of actually optimizing my time, it filled my calendar with everything and anything, and then it was just difficult to navigate. So I abandoned that tool, which was a good thing; it was only [00:01:00] a trial period. But the point being that AI is incredible, it can do great things.
Michael Cupps: But used incorrectly, or maybe because I didn't give it the right information, it became a disaster. And that's not even to mention that it could use my private information. So I was at a dinner not too long ago and I met our guest today, Cortnie Abercrombie. And I was really impressed with the conversation we had, not just she and I, but this whole group of people talking about AI and AI health.
Michael Cupps: Cortnie had some great observations and comments, so I immediately asked her if she would join us on the show. Cortnie is the author of a book that is now on Amazon, if you wanna go find it. It's called What You Don't Know, and it is about AI truth. She is also the leader of a nonprofit organization called AITruth.org.
Michael Cupps: You'll see the sticker that'll show you that. She's established herself as not only a board member and a technology leader but also a leader in women's health. But in particular, what we're gonna talk most about today is [00:02:00] kind of this AI trick-or-treat program. So let's bring Cortnie onto the show.
Michael Cupps: And while we do that, I did mention that she's got a book on Amazon. She's got a website; you can go to AITruth.org. She'll be here in a second and I'll let her introduce herself. Hello, Cortnie.
Cortnie Abercrombie: Hey, how are you?
Michael Cupps: Very good. Good to see you again. I'm really looking forward to this conversation.
Michael Cupps: I hope you don't mind the play on trick or treat and Halloween, but I think it's a great way to frame it, because there's so much out there. Most people wanna dive into AI and see what it can do for them, but there may be some tricks in there that we don't see on the surface.
Michael Cupps: So I'm looking forward to talking about that. But let's start with maybe an introduction of yourself. I tried to introduce everything, but I'm sure there's more about you that you can share.
Cortnie Abercrombie: I think that's a great way to introduce it. I appreciate the introduction. Yeah, I mean, I've been in this space doing things related [00:03:00] to AI and data, well, data forever.
Cortnie Abercrombie: And then of course as AI evolved in the enterprise, as an executive at IBM I was incubating different AI solutions, over a hundred different ones across a lot of different industries. So yeah, I've just been privy to all kinds of interesting AI adventures, let's put it that way.
Michael Cupps: Yeah, absolutely. And data's at the heart of it too: your own personal data, the company's data, and everybody else's data. Well, let's start maybe at a high level. What's the biggest trick, or the biggest threat, people should know about in using AI, if you can narrow it down to one?
Cortnie Abercrombie: well, oh yeah, I can.
Cortnie Abercrombie: I absolutely can. And I always say 90% of what is going wrong with AI is related to data. I mean, I know that sounds stupid, but a lot of [00:04:00] people don't realize that these things are not just magically existing on their own. They are fed by data, and people are what create that data.
Cortnie Abercrombie: We create data exhaust everywhere we go, little breadcrumbs behind us. And how we take in that information and how we filter it, even from our own standpoint, matters. Like when you talk about police witness testimony, you know, stuff like that. When you think about how different witnesses will tell you about the appearance of somebody, or what happened, it's often driven by filters, right?
Cortnie Abercrombie: So we all have filters. And at the end of the day, an algorithm is nothing more than a set of instructions for how a computer should analyze a given scenario. It all comes down to [00:05:00] people, what our filters are, and how we set those instructions up, and then it takes on a life of its own from there.
Cortnie Abercrombie: So we have this ground truth, but it all comes back to data and people and context.
Michael Cupps: Yeah. Well, I'm glad that you said it that way, 'cause it's interesting: a lot of people hear the word algorithm and don't think about it. You mentioned that it's an instruction set. If you look at OpenAI and Claude, are those instruction sets vastly different, or are they similar?
Cortnie Abercrombie: Yeah, it all comes back down to this: these are sets of instructions that humans give to computers, and we give them in all kinds of ways. Algorithms can be as simple as, well, the most simple thing I can think of: write down a set of instructions for your 2-year-old or 3-year-old or 4-year-old to make a peanut butter and jelly sandwich.
Cortnie Abercrombie: That's an algorithm. But if you get much more complicated, [00:06:00] in the way that we know physicists can and actuaries can, you're applying extreme mathematical concepts. And that's where it gets beyond a lot of us, beyond our own recognition. I don't know the last time you took Calculus III or beyond, or linear algebra or anything of that nature.
Cortnie Abercrombie: That's where it starts to get beyond our normal capabilities, unless we're mathematicians, to understand what these actual algorithms are up to. And that's even before we start talking about really complicated ones, neural nets and things of that nature, which tend to be a little more black box.
Cortnie Abercrombie: We basically set the algorithms up with mathematical equations and then we do a set-it-and-forget-it. Remember that guy, the Ronco guy, set it and forget it? So we tend to do that. That's what we do now. We do lots of massive large language model learning [00:07:00] and backpropagation and all kinds of really cool mathematical things with it.
Cortnie Abercrombie: And then we even do ensemble methods of those. So we're taking the most mathematically complicated things we can think of, and then, on top of that, we do ensemble versions of them. We take all these different linear pieces, put them together with these other capabilities, smash them all together, and create new ways to analyze data.
Cortnie Abercrombie: But it all comes down to math and sets of instructions.
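Cortnie's peanut-butter-and-jelly example can be made literal in a few lines of code. This is a toy sketch (the function name and steps are invented for illustration): an algorithm is nothing more than explicit instructions, and the computer follows them exactly, with no common sense to fill in gaps.

```python
# A toy "algorithm": an explicit, ordered set of instructions, exactly as
# you would write them for a small child. The computer knows nothing else,
# so a missing ingredient has to be handled explicitly too.

def make_pbj_sandwich(has_bread: bool, has_peanut_butter: bool, has_jelly: bool) -> list[str]:
    """Return the ordered steps for a PB&J sandwich, failing early if an ingredient is missing."""
    if not (has_bread and has_peanut_butter and has_jelly):
        raise ValueError("missing an ingredient; the instructions cannot continue")
    return [
        "lay out two slices of bread",
        "spread peanut butter on one slice",
        "spread jelly on the other slice",
        "press the slices together",
    ]

print(make_pbj_sandwich(True, True, True))
```

The same idea scales up: the neural nets and ensembles she describes are still instruction sets, just ones whose steps are mathematical rather than readable.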
Michael Cupps: Yeah, and people. And you made it sound simple, but it can be complex. So, you mentioned at the start that 90% of AI problems, if you will, are data problems. As an individual, and then we can talk about the company, but as an individual, what is the data problem that shows up as getting different responses, [00:08:00] or even wrong responses? What is it about our data that's wrong, or that maybe we're giving it?
Cortnie Abercrombie: Ironically, it comes down to the critical thinking of the humans who put together the algorithms: how they include data, and what they don't include in the data.
Cortnie Abercrombie: And so when I look at a set of data and I'm gonna analyze it, this process is called exploratory data analysis; it's part of what a data scientist is gonna do. They're gonna look at a set of data and analyze it for what they need out of their equations, you know, and how they're gonna engineer that data.
Cortnie Abercrombie: But say you look at it and you think, wow, women are only like 20% of this information, whereas men are like 80%, and we don't wanna exclude women in the future. Let's say we're looking at job potential [00:09:00] for job candidates, for engineers, but historically you only see like two women in this set and ten guys in this set, right?
Cortnie Abercrombie: Well, then that means you're gonna have to do something so that those women's features in the data will also be picked up when you're recruiting. Let's say those women are going to Pinterest, whereas most of the guys are going to gaming sites.
Cortnie Abercrombie: This is way oversimplifying, but let's just say that's the case as our example here. You would, by definition, leave out other women on Pinterest who are looking up engineering things, right? It makes my heart hurt, actually, to put it this way, but I think it's a simple way to deliver the message.
Cortnie Abercrombie: Versus, if you go out there and assume you'd pick up women on all the gaming sites and then recruit them into your engineering program, you might be wrong, you know? And so you have inadvertently left people out. But if you're a guy, you only [00:10:00] know what you know around you, right? Because that's how we all as humans look around us and learn: hey, all the engineers I know are guys, because 80% are guys and only 20% are women. All the ones I know are guys, and they all hang out on these gaming sites, so I'm just gonna go recruit there. And that means you're just gonna reinforce the imbalance.
Cortnie Abercrombie: So you have to be contextually aware, as the person who's bringing the algorithms to the computer programs, that maybe we do need to expand beyond what we have when we see only 20% here and 80% there, whatever it is. Let's say it's not women and men; it could be anything. You have to figure out: how are you going to weight things differently? Do you need to create a subset of data and analyze it separately? I mean, there's a whole bunch of options, and this is where things go [00:11:00] wrong and sideways.
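The "weight things differently" idea Cortnie describes has a standard form: inverse-frequency weighting, the same heuristic behind scikit-learn's "balanced" class weights. A minimal sketch, using the hypothetical 80/20 split from the conversation:

```python
from collections import Counter

# Hypothetical candidate pool with the 80/20 split from the conversation.
candidates = ["M"] * 8 + ["F"] * 2

# Inverse-frequency ("balanced") weighting: weight = n / (k * count),
# so each group's total weight comes out equal and the rare group is not
# drowned out when a model is trained on this data.
counts = Counter(candidates)
n, k = len(candidates), len(counts)
weights = {group: n / (k * count) for group, count in counts.items()}

print(weights)  # {'M': 0.625, 'F': 2.5}; note 8 * 0.625 == 2 * 2.5
```

Reweighting is only one of the options she lists; analyzing an underrepresented subset separately is another, and neither fixes data that was never collected in the first place.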
Michael Cupps: Yeah, and that makes sense. We do have a question; I'm gonna get to that in just a minute. But what about the HR leader who's sent that job description out and is now using AI to go out and find candidates? What are they to do when they get results back that are 80% men and 20% women? How do they change their behavior to get a diverse workforce?
Michael Cupps: And if you don't wanna use male and female, maybe it's the ages of the different candidates or something like that. What is the business person supposed to do with it?
Cortnie Abercrombie: Oh yeah, absolutely. It could be age, it could be anything, literally. Culturally, it could be ethnicities.
Cortnie Abercrombie: I mean, good gosh, it could go on for days, all the ways we can inadvertently exclude. And a lot of job descriptions are written by managers, right? In this case, the job description is basically gonna be like an algorithm, because you're going to go in and try [00:12:00] to replicate what you already have in your company.
Cortnie Abercrombie: Right, because you're gonna write it from the perspective of what's available. Well, even the word choices matter. Women will apply if they meet 90% to 100% of what's in the job description, whereas men will apply at 50%. So you have to be smart about it.
Cortnie Abercrombie: If you're in HR recruiting, and not just as a recruitment manager but in the actual development of HR recruiting tools, you have to build your instruction set correctly and build your data correctly. Sometimes you have to look at: well, how can I add more women to this?
Cortnie Abercrombie: Or how can I add more of a certain age group, or an ethnicity, so that I don't exclude them? [00:13:00] Sometimes you have to create synthetic data, though I'm not a huge fan of that. You know, how do you build up and bolster some groups, and downplay the ones that are crowding out the others with noise in your system?
Cortnie Abercrombie: So there's lots of methods data scientists can use, if they're contextually aware that they could even possibly create an algorithm that could be biased, you know?
Michael Cupps: Yeah. Well, and that's one of the questions. I don't know the name, it just says LinkedIn user, but it asks: do you use synthetic data to solve some of the imbalance in the data?
Cortnie Abercrombie: Sometimes you don't have a choice. But the better way to do this is always to try to go out and get more real data. There's different companies now that provide the ability to get insights from specific crowds. Remember the good old days of [00:14:00] primary research? I mean, literally going out and creating a focus group.
Cortnie Abercrombie: Well, it's kind of that same idea, but you're sending out a mass amount of questionnaires to get more data on a particular group that you're missing. Sometimes that can be helpful. Sometimes you can use hyperparameter tuning in your feature set, which is a data scientist's way of doing things, to create synthetic data.
Cortnie Abercrombie: But, I don't know, it's always hard. As long as you're aware and you're looking for the right things, you can in fact find them. I think the best way, though, is just to research those groups that are underserved in the aspects you're looking at, be aware of it, and just try to make sure, as you're pulling in results, that you're not getting the same exact people over and over, and that there's not some sort of weird bias. Like, remember the Amazon case with the Jareds? [00:15:00] By default you've left out pretty much anybody non-American, I think. I don't know; it seems like it's pretty United States based. But we would see, right?
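The crudest version of the synthetic-data idea Cortnie is wary of is random oversampling: duplicating minority-group rows until the groups balance. A sketch with invented records (real synthetic-data methods, such as SMOTE, interpolate new feature vectors rather than copying rows, which is part of why it's hard to do well):

```python
import random

# Hypothetical resume records, heavily skewed toward one group.
data = (
    [{"group": "M", "id": i} for i in range(8)]
    + [{"group": "F", "id": i} for i in range(8, 10)]
)

# Naive random oversampling: duplicate minority-group rows until the group
# counts match. Note the copies add no new information, and any quirks of
# the two original rows get amplified, exactly the "same exact people over
# and over" problem described above.
minority = [row for row in data if row["group"] == "F"]
majority = [row for row in data if row["group"] == "M"]
balanced = majority + [random.choice(minority) for _ in range(len(majority))]

counts = {g: sum(r["group"] == g for r in balanced) for g in ("M", "F")}
print(counts)  # {'M': 8, 'F': 8}
```

Which is why her preferred fix, going out and collecting more real data from the missing group, is the better option when it's available.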
Michael Cupps: Exactly. Exactly. Yeah, it's being careful, and I think maybe the right word is aware.
Cortnie Abercrombie: And monitoring what's happening with the results. Like, what am I actually bringing back from the real world, and is that working? Do I need to immediately adjust, shut down, adjust, and then redo? And ultimately Amazon shut that one down completely, right, because they couldn't make it work.
Michael Cupps: Yeah. And so, you also do a lot with women's health, and I'm struck by that, because I was thinking about one of those Oura rings. I haven't invested in one yet, so I don't know that I will.
Michael Cupps: But there's a lot of AI in health now, in health tech, and you're also, you know, a champion for women's health. Do you have big concerns in that category, first, and then maybe we can drill down a little bit? [00:16:00]
Cortnie Abercrombie: Of course. I'm gonna sound like a broken record, but of course it's the data, right?
Cortnie Abercrombie: I feel like it's like politics and "it's the economy, stupid." I mean, "it's the data, stupid" is what I would say. And again, the problem is that a lot of these things come from systemic issues, right? Because women's hormones fluctuate so much throughout their lives,
Cortnie Abercrombie: we've literally been left out of so much research and so much data because of it. Even the pharmaceutical companies haven't been able to figure out how to account for women's fluctuating hormones in how we metabolize, you know, even COVID shots and everything else on the planet.
Cortnie Abercrombie: And even how we present when we have heart attacks is just so very different. And even what we know about women, even what women know about ourselves. If you [00:17:00] ask a lot of women what the number one killer of women is, a lot of them, because of breast cancer awareness, might say breast cancer. But guess what?
Cortnie Abercrombie: Heart attacks are way up here, and the data for all the cancers combined, including breast cancer, uterine cancer, ovarian cancer, all the cancers combined for women, is way down here. So, cardiac up here and all the others down here. And mental health, by the way, is also starting to emerge as a huge one.
Cortnie Abercrombie: Here's an example: a lot of women get sent home with a Valium when they present with a heart attack, because, and this is still being studied, there are new studies going on and new data being prepared, but it turns out a panic attack might actually be a lead-in to a woman's heart attack, whereas men have jaw and [00:18:00] arm pain, you know, leading into a heart attack. Women present with a lot of different stuff, stomach upset, I mean, weird things that guys don't seem to have. So yeah, I think the thing that scares me the most is that the data's not out there to make these bold clinical diagnoses using AI, because there's just not the level of research that men have typically had.
Cortnie Abercrombie: But I'm looking forward to more research being funded. I think that as we get more data, we'll just get better and better.
Michael Cupps: You just made me think. And by the way, you see people behind me; I did a very bad job of picking my location in this airport, because there are two bowls of nuts and chips and things ahead of me, and people want to get at them.
Michael Cupps: So excuse any disruptions; I'll help them get to the snacks. Now, what you said there is interesting, 'cause when I think about any medication, and you read the directions and cautions on the back of it, it's usually based on weight, [00:19:00] and maybe age, you know, children under 12 or something like that.
Michael Cupps: But you're right, it doesn't take into account other things. When you said hormones, you're absolutely right: there could be an adverse reaction that's not even on that box, because they really tested it for your weight relative to the amount of medication going in. So that's a big open-ended thing. I mean, how do we get there? How do we close that gap?
Cortnie Abercrombie: Right. I mean, that's what I'm hoping to do with my new women's health initiative, which I am secretly, you guys, don't tell anybody, working on as we speak. It's about helping women at least connect with other women about how their health is
Cortnie Abercrombie: presenting, so that we start to get some of that data out into the world, so we can start using it to help each other, and honestly to help pharmaceutical companies help us too. And then also, I don't know if you know this, but a lot of [00:20:00] testing and benchmarking is not even covered by insurance.
Cortnie Abercrombie: That's ridiculous. I mean, women need benchmarks of where they are with their hormones on a regular basis. Instead, what happens is you show up at the doctor's office, they take a blood draw, and they're like, well, here's where you are today. And because of the fluctuations, that may or may not be helpful.
Cortnie Abercrombie: So I think, as we learn more and as we get more testing funded by healthcare organizations, we can start truly getting to some solutions and some validation. Let's just admit it: when it comes to health, a lot of women feel alone out there. You know, hey, is this crazy? I'm kind of losing hair, and I can't sleep at night, and some weird things are happening with my skin.
Cortnie Abercrombie: But it turns out estrogen affects so many metabolic [00:21:00] processes inside women: how we digest even pills and everything else, and how we manage our anxiety. It can all be affected very much by hormones. So we have to understand those better, and we're getting there. There's more research being funded than ever before, but it's still lacking, and we still need to get there faster.
Cortnie Abercrombie: Hopefully we can use some AI tools to get there.
Michael Cupps: There you go. There you go. And somebody else commented, I think it's Gula Ra, and made a good comment. It's a warning that prepackaged software may have biases that you don't know about. So is it more of buy versus build, if you will?
Michael Cupps: The reference was to HR hiring, back to the conversation we were having earlier. But maybe be cautious when you're buying prepackaged software to do your candidate tracking or something like that. Is that fair? Is that the right assumption?
Cortnie Abercrombie: I think so, absolutely.
Cortnie Abercrombie: Well, I [00:22:00] mean, not every company's gonna have the ability to build. So what you need to do is understand, from the development teams that create these tools, especially if you're using ones that aren't well documented out there just yet,
Cortnie Abercrombie: what they're putting into it. Understand: how did they account for any bias? Ask the right questions. Put together a discussion question guide for the development team or the sales team. Not many people have the ability to develop their own recruitment software.
Cortnie Abercrombie: And honestly, a lot of recruitment software out there specializes in specific industries as well, so there are lots of benefits to using it. You just have to make sure that you're not gonna exclude any of the people that you wanna make sure don't get excluded. Like, you need to understand: how do they make up for different ethnicities?
Cortnie Abercrombie: How do they make up for gender and age and that [00:23:00] kind of thing, so that you're not gonna end up with a biased result? Because at the end of the day, and I've done plenty with employment agencies and with the government, you really have to understand the disparate impact.
Cortnie Abercrombie: That's the term they use when they come after you, right? So you'd better understand it. Plus, you wanna be good citizens. So just make sure you understand the results you're getting. And if you're not getting results that are ethnically diverse, you need to start asking questions before you actually implement that program based on what you've got.
Cortnie Abercrombie: Right. And so it's just constant monitoring of results. I mean, you gotta make sure you've got the results you want.
Michael Cupps: Yeah. Well, that may be a good time to talk about your nonprofit. What's the primary mission of the nonprofit, AI Health dot...?
Cortnie Abercrombie: Oh yeah, AITruth.org. [00:24:00] No worries, no worries.
Cortnie Abercrombie: Yeah, it's to educate businesses and government leaders, and just regular moms and dads, you know, about how kids are affected. I do work with CARU, the Better Business Bureau's Children's Advertising Review Unit, a child advocacy group. I mean, all kinds of advocacy, just to make sure that people know what's going on and how it's affecting their lives in terms of their jobs, their productivity, even their rights.
Cortnie Abercrombie: You know, we can see that there's a lot of monitoring software out there that police are using to surface individuals. And ironically, people are having to use AI to get out of these situations. Like, hey, my social media program has me GPS'd over here, but you think, using your facial recognition algorithm, that I was over there. [00:25:00] So there's lots of AI used against AI, actually, which is kind of interesting.
Cortnie Abercrombie: So yeah, we're seeing it in every place in everybody's lives. From children, you know, looking at Character.AI, which is developing your own avatars, and I mean, Disney now has the ability for you to develop your own avatars, and making sure those are safe for kids, all the way through to helping you understand how AI can affect your job opportunities.
Cortnie Abercrombie: And then working with business to make sure they understand how to truly integrate that into every aspect of their AI development inside the company, including the much-overlooked organizational change management that goes with [00:26:00] a lot of the AI tools that need to be implemented.
Michael Cupps: Yeah, yeah. We had a guest on earlier, his name is Tony, and he spent more time on change management than anything, because of the nature of it. So let's dive into the workplace. First off, what should leaders be saying and directing their
Michael Cupps: next-level managers to do? What's the core message they need to deliver to make sure that their employees are safe, their customers are safe, and all of those things?
Cortnie Abercrombie: Well, there needs to be full-on training of people when any AI is gonna be used inside the company, whether the company develops the AI itself or is going to go out and use AI with its users.
Cortnie Abercrombie: In the case of Disney, they both develop AI internally and use AI externally. So all of your employees and board members, and boards are an overlooked area as well, need to know: what is this thing gonna do? How's it gonna work? How could [00:27:00] the rails come off?
Cortnie Abercrombie: And if those rails come off, what's our plan to remediate that? It can't just be, well, we're gonna have a big PR conversation. No, it can't just be about schmoozing the news people. This has to be honest to goodness, because customers' trust is completely eroded in the marketplace right now. Completely.
Cortnie Abercrombie: I love the Edelman Trust Barometer; I watch it. And now there are others coming out. PwC, I think, came out with a trust barometer for business users as well. So trust is something you're gonna have to earn. And if you can't show people how you're going to protect them, what guardrails you'll put in place, and what your plan is, then they're not buying it.
Cortnie Abercrombie: Literally, they're not buying.
Michael Cupps: Yeah. Well, when you think about rolling out technology inside a company, a lot of times the employees don't get a choice in what they're starting to use, because it's [00:28:00] been decided by IT or someone else. So how do you train for that? I mean, what's your training methodology?
Michael Cupps: Is there just always a policy that says, here's our ethical view of this AI? Or is it broader than that?
Cortnie Abercrombie: Well, first you have to start with the strategy. It goes all the way back to the moment you think you wanna develop a particular type of AI. You need to evaluate it.
Cortnie Abercrombie: Is this a high-risk AI, or are we just picking colors? Maybe colors are important to some people, but I don't care about that. I mean, if it's something that's gonna impact someone's livelihood, safety, or happiness in a big structural way, like giving them no hope
Cortnie Abercrombie: of being able to change the system, so to say, and just kind of replacing [00:29:00] them and their purpose and their meaning, then that's a problem, and you have to rethink how you handle that type of AI. You have to be super careful about putting policies in place from the very beginning.
Cortnie Abercrombie: Making sure that you're always analyzing: do we have a representative set of data going into this situation? Is this even a humane use of AI? There's a list of questions on AITruth.org that I promote to business leaders: can you answer these questions? If you can't answer these questions, you're not even ready.
Cortnie Abercrombie: You're not even ready to implement that AI.
Michael Cupps: Oh, I love that.
Cortnie Abercrombie: Yeah. And as you go through the process, I've also put up a process map so you can work through this wheel at each part of the AI [00:30:00] lifecycle. At each step, you should be able to do those things and answer those questions.
Cortnie Abercrombie: And if you can't, then you're not gonna do a good job of leading with AI in your company. You have to be able to implement policies and train your employees. At the very least, every employee should know what type of high-risk AI is being used, even if they don't necessarily get a say.
Cortnie Abercrombie: But I would encourage that the key stakeholders inside a company always be given a chance to weigh in. You should always have a process and a time and a place for people to weigh in, because we see what happens when you don't do that. We've seen it at Google: employees are a force.
Cortnie Abercrombie: They are absolutely a force to be reckoned with, and they can absolutely tank a product well before [00:31:00] it comes out. So we say employees don't have a choice, but they kind of do. They don't, because you could roll it out anyway; but if you don't go through the steps of trying to get them on board first...
Michael Cupps: Yeah, that's a
Cortnie Abercrombie: big mistake in my opinion.
Michael Cupps: Yeah, I think so. There's so much you can read about AI replacing people's jobs, and does it really? There's so much hype out there that before companies roll it out, there needs to be a strategy and a conversation about what it is, what it isn't, and why it's important to the company. Right.
Cortnie Abercrombie: Yeah. If you're gonna do surveillance, you better make sure that you are explaining, and developing a transparency framework that says: this is what it's gonna be used for, this is how we're gonna take your data, and we're going to empty out that data so you have a new chance to start over, so you're not always locked into whatever your performance was.
Cortnie Abercrombie: Let's [00:32:00] say they're doing performance ratings off of surveillance, which, you know. But if that's what they're gonna do, then they should at least be a good enough company, with strong enough leadership, to go in and say: hey, look, this is what we're using this for, and it'll only be on and used during these times.
Cortnie Abercrombie: And put it to a finite measure. Say: hey, we will only use it from this time to this time. You won't know exactly when, because obviously that's why they're doing it; they're trying to get a baseline of what people are doing without them really knowing.
Cortnie Abercrombie: But people should at least have some idea, and they should know what ultimate decision was made, why that decision was made, and how long it will take before the next cycle, so that there's a new opportunity, because people are driven by hope and opportunity.
Cortnie Abercrombie: Come on, you know? You don't wanna be the kind of company whose [00:33:00] employees don't trust you anymore and feel stressed and competitive with each other.
Michael Cupps: Yeah. I've been part of many productivity-based technology rollouts, and some of them go fabulously because
Michael Cupps: they get people behind the project and the reason why they're doing it. And there are others, the surveillance type, and those are miserable environments.
Cortnie Abercrombie: There are fun ones where you're competing with other teams and your team gets closer because of it. And then there are ones that are basically just beatings on the head.
Cortnie Abercrombie: And that's not the way to use AI.
Michael Cupps: Yeah, exactly. So, we're almost out of time. I'm curious about your book. Can you tell us a little bit about it?
Cortnie Abercrombie: Oh, thank you so much for mentioning it. It's What You Don't Know: AI's Unseen Influence in Your Life and How to Take Back Control, since we've been talking about all the unseen influence.
Cortnie Abercrombie: So, when I talk about kids, at the end of the chapter I say: [00:34:00] make this contract with your kids before they get that first device. This is what you will and will not do. You will not post pictures of yourself with your home address in the background. You will not post pictures of yourself at school by yourself.
Cortnie Abercrombie: Things that could set up predatory situations. And then I talk about how to handle the AI interview process: if you're gonna be interviewed by AI, here's what to look out for and how to claim back some semblance of yourself while still getting the AI interview done.
Cortnie Abercrombie: And then, of course, I talk about what happens in employee productivity monitoring: what your rights are and how to know that you're being monitored. All the grocery stores, for example, that monitor you as you go through their store. I have lists of those.
Cortnie Abercrombie: I mean, it's like everything. And then for business people, it talks about the major business [00:35:00] functions that I've seen AI used for and how to be ethical about those. And then at the very end, the most important part, is a challenge to all the leaders out there: to take the AI challenge.
Cortnie Abercrombie: To say, yes, we will do these things in an ethical way. A lot of it is based on the questions on the website right now. So if you go to AITruth.org, you'll see all of those questions, and if you can answer them the right way, then you're doing pretty well. You'll probably meet the EU AI regulations, and now the Texas AI regulations too.
Cortnie Abercrombie: Coming up in January, everybody. January. It's gonna be enforced in January.
Michael Cupps: Yep. Well, I've got two more questions for you that I ask everybody, but before that, I just wanna remind everybody that this show is sponsored by Time Bandit. Go to TimeBandit.io and check out the mobile app and the book, as well as the training opportunities on Time Bandit.
Michael Cupps: [00:36:00] So, two questions I always ask everybody. But first, did you wanna mention the Texas thing for a minute? I feel like I cut you off.
Cortnie Abercrombie: The Texas Responsible AI Governance Act. It will take effect in January 2026, and enforcement will be on a per-complaint basis through the Attorney General's office.
Cortnie Abercrombie: If you document what you're doing, and honestly, if you can answer all the questions I have on AITruth.org, you'll be in a good position with the Attorney General as well, according to the rules stated in that act, the Texas Responsible AI Governance Act, TRAIGA.
Michael Cupps: Yeah. And I would encourage everybody: in the state of Texas, I think it's through the Texas Workforce Commission, they will actually pay small businesses, or give a stipend to small businesses, to do AI training for their employees. I urge everybody, no matter what state you live in, to go see if [00:37:00] that's available, because you started this conversation with education; it starts with education. That's great.
Michael Cupps: And by the way, Trayvin said, "Great insights, Cortnie, and absolutely correct." He agreed with everything you said, so that was a comment I wanted to mention.
Cortnie Abercrombie: Oh, thank you.
Michael Cupps: So the last two questions. Number one: what is a daily habit that Cortnie has that's non-negotiable?
Cortnie Abercrombie: Oh, well, for me, it's that every morning I get up and pray.
Cortnie Abercrombie: That's non-negotiable. I feel like we need a lot more love and care in this world.
Michael Cupps: Ah, that's great. Yes, absolutely, that is fantastic. And secondly, I've been toying with this idea that tomorrow is more important than yesterday. If that statement is true, what are your thoughts on it, and how do you focus on today instead of dwelling on the past?
Cortnie Abercrombie: Oh, I don't dwell on the past much at all, actually. I try to just always keep moving forward, because we're always gonna make big, huge mistakes. We're always gonna do [00:38:00] things that we regret, and we just need to keep trying to make things better and better. As a matter of fact, I start my book with that very situation, talking about something I kind of regret involving a big cigarette company.
Cortnie Abercrombie: And then I talk about how we've gotta move forward. We've gotta use our past to educate our future. And if I give you nothing else, this one leave-behind is that relationships are the single most important thing.
Cortnie Abercrombie: If we use AI in any way to tear relationships down, whether we're creating distrust or not following through and helping people get from point A to point B, if you're an automated vehicle group or whatever, if we're not doing things to safely [00:39:00] help people, then we're not doing it right. We've gotta value human-to-human relationships more than anything on the planet.
Michael Cupps: Yeah, absolutely. I love that. Relationships, and the importance of them, are a great note to end on. So, Cortnie, thank you so much for being here. For everybody listening and watching: you can find us on YouTube, Spotify, Apple Podcasts, all of that, so go like, follow, or subscribe, whatever the mechanism is on your platform,
Michael Cupps: to see every episode. Cortnie, thank you so much. Everybody should go check out AITruth.org and get the book, What You Don't Know. And Cortnie, thank you for helping everybody learn more about AI, responsible AI, and keeping everybody safe.
Michael Cupps: Thank you so much for everything you do.
Cortnie Abercrombie: Thanks for having me. It's been great.
Michael Cupps: Excellent, everybody. Thanks for tuning in. We'll see you next time. Have a great trick-or-treat [00:40:00] Halloween, if you celebrate that sort of thing. Take care.