The Inner Game of Change

E106 - AI as a Mirror for Organisations - Podcast With Shannon Lucas

Ali Juma Season 11 Episode 106



Welcome to The Inner Game of Change, where we explore the thinking that shapes how change really happens.

Today I am joined by Shannon Lucas.

Shannon is the Co-CEO of Catalyst Constellations, where she works with executive teams to close the gap between the speed of change and the speed of execution.

And in this conversation, we explore what that might mean in practice.

We talk about discernment. We talk about resistance as a signal. We talk about fear in the system.

And we explore the reality of leading change when everything is moving at pace.

One idea: AI might be acting as a kind of mirror.

Reflecting how we think, how we decide, and how we design work.

In simple terms, what we notice may shape what happens next.

This is an episode for those navigating change in real systems.

Not just designing it on paper.

I am grateful to have Shannon chatting with me today. 


About

I help executive teams and organizations build the muscle to transform continuously without burning out their best people or watching momentum die between initiatives.

Most organizations treat transformation like a one-time event: hire consultants, launch the program, declare victory, exhaust everyone involved, then wonder why nothing sticks. I work with CEOs and C-suite leaders to break that cycle by building real change capability into how their organizations actually operate. 

As Co-CEO of Catalyst Constellations, I bring 20+ years of enterprise-scale innovation and transformation experience from companies like Ericsson, Cisco, and Vodafone. I’ve owned a $150M P&L, generated over $1B in new business pipeline, and learned firsthand what actually works (and what definitely doesn't) when you're trying to change how global organizations operate.

My approach combines hard-won operational experience with ongoing research into how transformation really happens. I co-authored the #1 Amazon bestseller Move Fast. Break Shit. Burn Out., based on our research into Catalysts (the highly action-oriented change agents inside organizations who can't stop themselves from driving change). We wrote it after watching too many talented people flame out while trying to transform their companies without the right support or frameworks, and it draws on lived personal experience as well.

I believe the organizations that will thrive are the ones that stop treating change as an event and start building the human, cultural, and structural capacity to adapt continuously. These are the firms that turn transformation into a sustained advantage rather than a recurring disruption.

My mission: help make the world's largest organizations more sustainable in every sense of the word: for people, profit, and planet.

Shannon’s profile

linkedin.com/in/shannonglucas

Website

catalystconstellations.com (Company)

Send us Fan Mail

Executive Wins Podcast

The Executive Wins Podcast features inspiring Executives who share their biggest wins.

Listen on: Apple Podcasts   Spotify

Ali Juma 
@The Inner Game of Change podcast

Follow me on LinkedIn


Change Is Never Linear

Shannon

The first part is acknowledging that what you just said is true: essentially every system is unique, and every system is unique at a specific time. So if you created change at Company A last year and you tried to create the exact same change this year, it would have to be different by definition. And when we walk people through a vision of change, we call it the Candy Land map, or Chutes and Ladders. We show the steps, but it's not linear. And we say to them, if you haven't slid back down a few steps along the process, you're probably not doing it right, and it's probably not gonna last, because there's always friction in change systems.

Ali

Welcome to the Inner Game of Change, where we explore the thinking that shapes how change really happens. I am your host, Ali Juma. Today I am joined by Shannon Lucas. Shannon is the Co-CEO of Catalyst Constellations, where she works with executive teams to close the gap between the speed of change and the speed of execution. And in this conversation, we explore what that might mean in practice. We talk about discernment, we talk about resistance as a signal, we talk about fear in the system, and we explore the reality of leading change when everything else is moving at pace. One idea that stayed with me today is that AI might be acting as a kind of mirror, reflecting how we think, how we decide, and how we design work. In simple terms, what we notice may shape what happens next. This is an episode for those navigating change in real systems, not just designing it on paper. I am grateful to have Shannon chatting with me today. Well, Shannon, thank you so much for joining me on the Inner Game of Change podcast. I am grateful for your time today.

Shannon

Thank you for having me. It's lovely to be here.

Discernment As A Leadership Skill

Ali

Thank you. I want to start with a general question to you, Shannon. What has been occupying your mind recently when it comes to your work?

Shannon

You used the word discernment when we were prepping for this, and that's a word that, literally in the past couple of weeks, has come more and more to mind. There's an interesting intersection between discernment, the speed of change, and the amount of change that is facing all of us. We don't need to explain that any further. And obviously the AI revolution. As those two things intersect in organizations, they can obviously have positive or negative impacts. And I think one of the things that is going to determine the success of those intersections, and of the transformations that are happening, is discernment. Discernment has some nuances that go beyond critical thinking. And while there's a lot that we can teach about discernment, there's also something really internal and personal about the way people live or express it. So I don't think it's something that organizations and leaders are actually talking enough about. It's not just strategic mindset, right? It's not just identifying business problems. And so that's what I've been thinking about.

Ali

So for a middle manager, how would you describe discernment in their own context?

Shannon

There's a sensing component at the beginning of discernment, right? I think you have to have a wide field, a wide aperture, depending on where you are in a process, and we can talk more about this. We work a lot with people we've researched for years, who we call catalysts, and there's a certain dot connecting and scanning for catalysts that is sort of innate. You can learn it as a skill: you can block out time, you can put some structures around the sensing. And then from sensing there starts to come the sense making out of the things you're seeing and the dots you're connecting. I don't know that we teach people enough about the sense making. There's a contextualization: understanding the system that you're living within, how that system is operating, what's relevant in terms of what you're scanning, and what you can carve out because that's not even a field you play in right now. Then from the sense making, there has to be the prioritization, or the alignment with a strategy, a vision, a place that you're going, if you have one. Or a way to take some of the new learnings and say, actually, there's something net new, and that would be part of the discernment: something net new that is relevant, that might require us to examine our current strategy or vision or mission. And then discernment has a sensing component to the next stage of it, which is seeing how the thing you are discerning is landing with the people, with the system itself. So it kind of becomes a meta loop. But it's possible to put structures around it, and I think organizations should be doing that. That's some of the work we're doing today.

When we're doing it, we call it strategic mindset, strategic skills, executive presence, because executive presence is about understanding the audience in the room and what's going on with them. There's an important part underneath all of this too: I think discernment requires a lot of self-awareness and emotional intelligence, because it's important from a discernment perspective to understand how you're showing up and how you're sensing in the moment. My aperture will be different depending on whether I got a lot of sleep last night. My ability to see, at the other end, how a new idea is landing with people really depends on me being reasonably self-regulated, so that I can actually pay attention to the signals, the unspoken signals, that may be in the room or on the call.

Ali

You mentioned a couple of things that I want to dig deeper into, like the systems thinking. But let's stay a little bit longer on discernment. I find that discernment is really a critical factor now in the age of AI. It goes way beyond judgment. When we adopt AI, it's not just the adoption; it is how deep our level of discernment goes, where we make the judgment call. It is still, basically, our core human capability. But does that mean its importance has been magnified in the age of AI?

Shannon

I completely agree. I think AI is the accelerant on this, because just because we can do almost everything right now doesn't mean we should. So the discernment around AI is definitely: what are the problems we're trying to solve? What are the most important things we should be going after or troubleshooting? What is the impact? It's a lot of the human-level piece that you're talking about. Maybe someday AI will get more human-level discernment, but I think it is the thing that is going to keep humans the most important actors in the system.

Ali

Yes. I want to critique my own thinking here, and I'm happy for you to challenge me on it. Everybody talks about the human in the loop. I have been thinking about this, and I really pay attention to the words. I'm not too sure who came up with the phrase, but probably an IT person. I started thinking the other way around: I am thinking it is actually AI that just came into my loop as a human. I've always been at the center of things myself. AI is the addition; I am not the addition. So when we say the human in the loop, it gives me the impression that we are the addition, that we are just a part of it. Maybe at a higher level that's true. But if the intention of the statement, the human in the loop, is to focus on judgment and discernment, then perhaps it's actually the other way around. And this is not just a clever play on words. Share your thoughts.

Shannon

I love the provocation. Originally I was like, yeah, yeah, that's right. And then I thought, actually, they're both right. Again, it's context and situation dependent. AI has been around for years, honestly, as ML, the way people were thinking about it. Obviously Gen AI has blown up, but AI can do a lot in terms of crunching data or automation. There's a lot that doesn't require humans, and humans weren't in those systems because we couldn't have absorbed and processed that much information or moved at those speeds. So for those cases, especially something like medical testing (you said you're working with healthcare), there's a lot of stuff where the AI just goes and does the things and brings back the results, and then we have the human in the loop to say: yes, that is within the bounds of reason and what science should be telling us. So I think there may be human-in-the-loop things there. But I think the part that's really blown up with the whole Gen AI thing is the knowledge piece. There, I can put together more interesting, more impactful, more well-structured workshops or articles, whatever, faster, because I have this tool now. But it is a tool applied to a process that I have been doing. So on the knowledge side, I think it's the AI in the loop. It is still a long way from getting to my definition of great, of delivering something for a customer into the world. But I definitely love it as a tool. So that would be my response to a very insightful provocation.

Ali

Thank you. When a change enters an organization, let's just say AI, the challenge is that we do not see the whole system. How do we manage that challenge? Because we're driven by efficiency and speed, something comes in and we think, almost simplistically, that it's going to work. In fact, sometimes I think a big change that does not fail is almost an accidental miracle. Because how the hell did it work, when the system is already complex and the first, second, and third order impacts have not really been fleshed out? So what do we do? That's actually part of the challenge of making change happen.

Diagnosing The System Before Action

Shannon

The first part is acknowledging that what you just said is true: essentially every system is unique, and every system is unique at a specific time. So if you created change at Company A last year and you tried to create the exact same change this year, it would have to be different by definition. When we walk people through a vision of change, we call it the Candy Land map, or Chutes and Ladders. We show the steps, but it's not linear. And we say to them, if you haven't slid back down a few steps along the process, you're probably not doing it right, and it's probably not gonna last, because there's always friction in change systems by definition, right? So a process that we use is diagnose, activate, and rewire. The diagnose has to come first. This is where, frankly, I think a lot of management consulting firms get it wrong. They say: here's the playbook, give or take, that worked at the last 10 companies, it's gonna work for you, and we have all the data. Maybe it works for a while, but at some point there are gonna be pockets of resistance. So the diagnose part is critical, and it's a little bit of slowing down to go fast. For example, we're kicking off with a new customer going through a huge transformation; it's make or break for them, literally. So the first thing we're doing is interviews. Because if we can't get a landscape, an understanding of the supporters and the blockers, the hidden ways of working, the old talk tracks in people's heads ("we've tried that before"), if you can't surface the things that will be the biggest impediments to change, you're gonna run right off the cliff. You can put intentionality around that. You can ask: what are the power dynamics?
How clear are things? What's the decision-making structure like? What are the incentives? There are known levers in change systems that you probably want to be specifically asking about. But there's also the "hey, what don't I know?": having really good open space and a lot of that sensing I was talking about earlier. Because sometimes it's the things that aren't said. Sometimes it's the people who show up with arms crossed, nodding, saying "we're fine." And we're like, are we really fine? So I would start with the sensing, as you said. Then we focus on finding the people who have the most positive relationship to change, as the beginning of the sharp end of the activation piece. These are the people we call catalysts. Look, everyone has a negative, sort of biologically programmed response to initial change; it comes from when the world was scary and we just needed to sense what was going on. But some people move through that faster than others. So how do you find the people who, almost regardless of the change, are excited about it? And then how do you give them the skills to be the force multipliers who do the sensing and the next wave of influence, bringing people along on the journey? You need an army of people who are excited about it. And then finally, as you're doing the activation, you're thinking really deeply, tracking with hypotheses what's working and what's not, so that you can rewire how the system is actually operating and make that your normal standard operating procedure.

[Ad] Executive Wins Podcast

Ali

That brings an analogy to mind: on the highway there's the fast lane on the right-hand side, and sometimes you get stuck behind somebody in it, so your best way to go faster is to slow down, change lanes, and then speed up. I also like what you just mentioned, which I really hadn't thought about: that surely something will go wrong during the process, and we need to know about it. The fact that nothing has gone wrong is probably not a good sign, because it means things haven't actually been tested. We've got a system with a number of individuals who all have their own motives. I really like that; I hadn't thought about it that way. And the last thing: I am a big believer in asking, what don't we know? I promise you, when people are posed that question, some of them will struggle, because the unknown is really hard. That's where experienced people, people in the trenches doing the work, probably know way more than, for example, an external project group.

Shannon

100%. And one thing I'd add to the "what don't we know": one of the things people are often really uncomfortable with is the part of active listening where you are not speaking. We teach this routinely in classes. I'll give them, say, five minutes to answer what feels like an absurd question, and after maybe a minute they're like, oh my God, I have nothing left to say. But the other person, the partner, is really not allowed to say anything. So people fill up the space. And for me, this is where the juicy richness comes. Once you're through with the talk track that you give everyone every day of your life, now you're having to dig down inside and ask: what else am I gonna fill this up with? Oh, I haven't mentioned that thing before. I haven't even thought about that thing before. But that again takes the slowing down, the self-management, the holding of space, the leaning into the really human piece of this experience. You're not gonna get that by going around doing surveys and quick-hit interviews.

Ali

There's a mental model, you probably know it: the map is not the territory. What's on paper will always look wonderful, but the reality is messy. Is that what we're talking about here, that we need to understand the messiness that happens in there?

Shannon

That's right. And I think you hit on an important point, which happily a lot of our customers are leaning into now. We have one CEO who says: you don't want me answering all those questions, I have no idea what the answers are. Go to the front lines and ask the people who are talking to customers at the customer care center every day. They know what the messy reality is. When three tech projects hit at once that weren't coordinated and the customer's pissed off, the first people who know it happened are the customer support people, right? So it's not about flat organizations, but it is about paying attention to all the pockets where there could be hidden resistance. And I want to be clear: people get scared about resistance. Resistance is just data. You need to get curious about why the resistance looks like this over here, and why there might be a different flavor of resistance on the other side of the organization. Resistance is the result of system features, of system design. People generally react reasonably rationally to the environment they're in. So if you're getting resistance, part of the system design might be that we haven't actually explained what the end state looks like, or that people aren't sure how the new control mechanisms are going to impact them. It's just about bringing curiosity to the resistance. And to your point, if you are not uncovering resistance, you're probably doing it wrong. That's a super important takeaway.

Ali

Yeah, with every change I do expect that there's gonna be some heat in the system somewhere, because something new is happening. It should push the parts; it will actually challenge the existing design, and that is not an easy thing. And obviously, as a result, you talked about resistance. Sometimes we think that hesitation is resistance. Where the resistance comes from also matters: I will pay a lot of attention to resistance coming from a middle manager. Their resistance is really important, more so than an individual's. I can deal with an individual, because they only look at it through their own lens. But a middle manager or an operational leader is responsible and accountable for a number of people, processes, and performance, and therefore it's in my best interest to understand their resistance. You called it data; I call it unanswered questions, and we're gonna have to actually answer them. And if we don't know the answer, we'll have to tell the leaders that we'll find out together, because that's part of the discovery process, which is part of the game, basically.

Incentives And Unintended Consequences

Shannon

You made me think of another thing we talk about in this realm, because you said there are gonna be pockets of heat, and that's true. What we try to help people understand is that even the best people think: it's the best idea that needs to get done, or I have the best idea and I need to convince everyone that this is the best idea. Even if you're the CEO: this is the best idea and this is the direction we need to go. One of the reframes in our work is that the best idea still has a nugget of the original idea, but the best idea is also the one that's actually going to get implemented. And there's a process of co-creation, on whatever scale, that needs to happen. Because to your point, if the middle managers can't take it on, it's gonna fail. So you go to the middle managers and ask: what would you need to take this on? What do I need to take off your plate? This process of co-creation turns it into the better idea, the one that will actually get implemented. And that can be hard, because you might have to compromise on things that felt really important to you at the birth of the idea or the change. But if it has no chance of actually making it into the wild, then what's the point?

Ali

I want to ask you a question that's been on my mind, since I focus a lot on the idea of design. I believe design decides what sticks in the system, and design is also a mover of people's behaviors: if the design rewards a certain behavior, I will follow it. I'll share a story, since we're talking about stories. When the British colonized India, at some stage there were a lot of cobras in Indian cities. So the British set up a reward system: anyone who brought them a cobra would get money. At the start it worked as designed; people were catching cobras and bringing them in. But then some players started breeding cobras so they could turn them in for the money. When the British realized that was happening, they stopped the program. But the problem with stopping the program was that the people who had been breeding cobras were left with all the cobras, so they released them into the wild, and the original problem was multiplied. That doesn't mean the people cheated; they followed the system as it was designed. So I always go back to: what is the system, and how is the system designed? Before we introduce any change, perhaps the sense making should start with understanding the system first.

Fear In The System

Shannon

Yeah, I'm a big believer in design, intentional design, and human systems design. When I was thinking about this earlier: there's a step, maybe the first step for me with design, on a personal note, that is highly connected to purpose. And purpose, back in the business world, is an amazing accelerant for all sorts of things in org design; I'll come back to that. But for me, writ large, it's: what system am I designing, and what positive impact is it gonna have on the world? Because if we're gonna spend lots of human time and capital and all the rest, kind of like your story, let's make sure we're doing some deep thinking about leaving the world a better place, with unintended consequences considered and externalities considered. So for me, the first part of design is: what big wicked problem or positive impact am I trying to bring to the world? A problem to solve, a positive impact to drive. And then: okay, how do I galvanize, how do I marshal these resources to have the impact that I want? I think we don't spend enough time talking about human behavior in design processes, because to your point, the incentives, the clarity, all of the organizational structures really are designed to support, you could argue manipulate, human behavior to get certain outcomes. And I think this is more critical now. The humanness of the design is more critical now, which is why discernment is top of mind, because AI is coming into these systems pretty rapidly, without a lot of structure or design built in. Is it AI in the loop or human in the loop? Those are fundamental questions, and if we don't have answers for them, we're not designing the org in the right way, or at least we're designing it unintentionally.

So think about: who do I need? What capacity do I need? How am I supporting them? And also, in that foundational moment, how do I set these people, my employees, up for change? Because, cliche as it is, the new constant is change. The thing I'm designing has to be a change mechanism from day one, if you were starting from day one. Obviously, most organizations aren't. So what they're grappling with is: how do I change organizations where homeostasis, in some ways, was the goal? That's why we have a board, we have governance, all of these things to de-risk organizations. Those are the old organizational design elements we've been using. So it's radical what we're asking organizations to do from a design perspective, almost overnight. And another human element I'll add here is fear. I don't know if this is in your world; I live in America, but fear is really prevalent right now. We talked about burnout and stress during the pandemic, and we had that pandemic level of fear, but there's an existential fear going on right now. And it's not just the quote-unquote employees; we're all humans, and at the top leadership level there's palpable fear. How are we designing systems that bring in joy and delightful human experiences, so that people can actually be present, not sit in the fear, and go about their daily business driving impact?

Ali

Let's go deeper into that fear factor. We focus a lot on hesitation and resistance and uncertainty, but we don't focus a lot on fear.

Coaching Middle Managers On AI

Shannon

When we get on calls, like we did, there's the opening where we're just chatting about our lives or whatever. And it became increasingly clear, because my organization is big into sensing, that in those conversations, if you pause for it, like we're talking about, you can hear it, and if you name it: yeah, there's a lot of fear in the system. Especially among leaders, because they're trying to watch how people are showing up. There's a lot of fear. There are military things happening, there's inflation, and we've just lived through a pandemic. I think the amount of change, the velocity and amplitude of change, combined specifically with AI, creates an existential fear. Am I gonna have a job tomorrow? Am I gonna be relevant? Is the business gonna exist in three years? In a way that just wasn't there six years ago. We lived with fear during the pandemic, but it wasn't this existential fear. It was: okay, jobs got cut because of the pandemic, we can see that, but the pandemic will end and jobs will come back. That's not the way people are thinking right now.

Ali

I like that. Let's stay a little bit longer in the AI space. Think of me as a middle manager, and there's AI coming into my area. How would you coach me to really understand the system and make sense of it before I introduce AI? Because everybody will be under pressure from their leaders saying adopt it, adopt it. But I'm the custodian of my own team, and I know, well, I hope I know, the system.

Shannon

Just to clarify: when you say system, do you mean the business system, the context?

Ali

My own operations, my own systems, my own tensions, my own capabilities, my own gaps, the skills in the team, performance, context. You talked about context. How would you coach me through at least the first step, which is to make sense of and fully understand my system, and then see how this capability can fit in there? Though I never think it will just fit, because using AI is not adopting AI; they're completely different things. The real power of AI comes when you look at the whole system and redesign it. When electricity was introduced to factories in the old days, they were steam factories. So what they did was use the new power just to replace the old power, but productivity did not improve, efficiency did not improve, because the factories were designed around the old system. Only decades later, when firms started redesigning factories and building new ones, did electricity actually multiply performance. And I see the same thing here. So if AI is the electricity, as some people claim, wouldn't I want to look at my current system? If I'm a middle manager, I am not really in an easy position now.

Shannon

It's really tough. We just did a round table last month on the intersection of AI and sustainability, because at this point they're almost competing. AI is driving massive energy consumption, and broadly speaking, AI isn't yet solving problems at a scale that offsets the negative impact. That's a systems view. Someone said it's going to be a J curve in this instance, and you can apply that to a lot of AI implications: productivity might go down while people are on the adoption curve, because it's a whole new tool requiring a whole new mindset and discernment. How do I know if it's hallucinating? All the challenges we know. But if we are intentional, thoughtful, and discerning about how we apply it, then productivity, efficiency, and all the rest go up. One thing I loved that came out of that round table, which is almost self-evident but was said so clearly, is that AI right now is acting as a mirror and amplifier for how organizations work. If you already have a great adaptive, agile culture, with a lot of autonomy and clarity about the system, then AI can be an accelerator. If you're in a rigid organization where processes are tough and employee engagement is low, AI will amplify that instead; the fear will just amplify. So from a sensing perspective as a middle manager, it's interesting to ask: what is the initial response, and what am I seeing in that mirror as AI gets brought in?
The reason I started there is that the answer to your question is really what good managers, and leaders too, should be doing anyway: understanding how your system works. There are lots of ways to do that. We love Lencioni's Five Dysfunctions of a Team; let's just start with how you operate as human beings. As a side note, as more AI projects come on board, there are more matrixed teams, and nobody ever invests in setting up matrixed teams the way they would an actual hierarchical structure. That's one of those mirrors: your matrixed teams are not going to be as successful, because they are a team and you didn't set them up as one. So there's sensing about how we operate. There's a systems map of the inputs and outputs of your team within the organization. We use a tool called network maps, where you literally map your key stakeholders, either globally or on specific initiatives, and ask: who are the resistors, who are the supporters, who can help, who do I need to bring along first? It takes time, and people say, I can't take a day-long workshop to do those things. The answer goes back to what we said earlier: the more intentional you are about understanding your system, the faster you'll go, the more you'll work on the right things, and the less disruption you'll create, because you can design intentionally. Then, from all of that, when we ask what we're going to do with AI, you can have really critical conversations. Is there a person working with a lot of knowledge material? Then we'll use AI differently there than for a data lake we're mining for business insights, or for automation somewhere else. It's not one size fits all.
And the other thing: even with all that, people say, well, I need to upskill my people in AI. I ask, what are they going to do with it? You need to know that before you invest in the training. Sure, we should all know a little about generative AI and how it works. But for me, all of this comes back to a constant sensing and experimentation mindset.

AI As Mirror And Amplifier

Ali

Here is what I'm predicting, and I might be wrong; I've been wrong many times in my life. First of all, when I talk to business people in my workshops, I say a couple of things at the end of each workshop. If you're a leader, give your people space to experiment, which means giving them time. If you're a team member, give yourself permission to explore. If I have the leaders and their people in the same workshop, that's it: your leader is here, and that is your permission to explore. But when you and I go back to our desks on a Monday or Tuesday morning, everything is busy, we forget, and we default to the old ways, to the pressure of the job. That's not a bad thing; it's just how the system is designed. So that's one.

Shannon

Can I make one comment on that before you move on?

Ali

Go ahead, of course.

Shannon

I'm glad you landed where you did, because I was going to bring up the experimentation mindset. We had a customer who was going to bring us in to work on the intersection of the experimentation mindset and AI. Then they had their massive all-employee meeting, and someone set up an activity where everyone could create a new brand ad using the AI tools they had stood up. They came back from that and said: check, we did the experimentation mindset, and check, they used AI to make an ad. My fast follow was: are they ever going to do anything with that again? When we work with customers, I say: I don't care whether it's an accountability meeting, peer coaching, or a team meeting, but if you don't keep coming back to this content, the principles we decided, and the team norms on a constant basis, then that day-long workshop actually was a waste of time. And you're probably also frustrating your employees, because you opened this big new world to them, and now they're back in the old ways of operating. A final note on that: we've been running workshops on how to operationalize empowerment. Leaders say, here's some space, go do a thing. But throughout our careers we've never been taught how to use that space effectively, or where the guardrails are. So we just look around and wonder who's going to take the first step, because it doesn't feel safe, because we don't know. You can actually get into the nitty-gritty of how to operationalize empowerment.

Experimentation Needs Follow Through

Ali

I really like that: how to operationalize empowerment. We ask people to empower their teams, but what does that mean in reality? I wouldn't even know where to start. It's similar to a conversation I had recently, where people said, we're going to create a community of practice, and then we expect people to share. Nobody shares. The other thing I want to share with you: I have personally trained about 800 people in higher education on generative AI. I have been with this group for two years, and I have seen the progression, which I designed, all the way from early adoption to embedding it; from usage to, as you mentioned, sense-making and experimentation; to understanding how it fits; to the next stage, creating agents as part of the team; all the way to AI being genuinely part of the job. There's a timeline underneath that. The process takes about two years for teams to reach that stage, and that's not leaving them alone; that's coaching them the whole time. Can you see how long it takes? I have lived it day in, day out, and I've watched the individuals grow. That's what it takes if you really care about making a difference in the system. And after those two years, the biggest challenge for leaders and managers, which I can see coming in about three to five years, is this: the promise of AI is that it will help me do more meaningful work by outsourcing some of the repetitive, mindless work to AI. So the question is: what does that meaningful work look like? When you ask a middle manager, they say, I don't even know how to create meaningful work for my people. And at some stage, organizations will start asking: where is the return on investment?
We've invested millions of dollars; this is not cheap technology. I'm telling leaders now: start thinking about those exam questions today, because they will become a necessity in two or three years' time. You invested, and nothing is happening in your system. People are loving it, but you're not seeing any gains or any growth. How do you see that? Am I making sense here?

Shannon

Yes. On meaningful work: one of our customers, and they've pulled back a little on this, had what I think was a really interesting vision. Middle management is a place at risk of cuts, specifically because of this: AI could take over the hierarchical accountability of tracking performance and output on things that are highly measurable. What happens then is that middle managers have to upskill; many of them have never had proper coaching training, and they have to pivot into being coaches. When we think about the most important things in life, it's human-to-human connection. That humanness is what was so hard about the pandemic: we were cut off from everybody, and we felt it acutely. The meaningful work on the other side goes back to what I said earlier: making the world better and driving positive impact. Higher ed and healthcare are two places where I get super excited that that's true; the goals there are noble and important. But it's going to be hard for individual contributors and hard for organizations. I don't have a good answer. I think those are all the right questions, and we need to get in front of them. What does three to five years out look like? What about AI and society?

Ali

Exactly. I like that. You and I work deep in the rabbit hole of change, and I don't have an answer either, but I have some learnings from the business improvement world. When we look at processes, and I've worked with a number of clients, big retailers, there is always fine-tuning happening in the system. What I'm saying is that at some stage there is going to be a massive fine-tuning of the system, and our leaders won't be equipped to look at all of that. In fact, some of today's leaders won't even be there in two to five years.

Shannon

We've already been seeing that, right?

Meaningful Work And Middle Management Risk

Ali

I mean, C-suite turnover since the pandemic has only accelerated. AI is arriving now, but some of those leaders know they won't be there when the big questions get asked. So does that mean we should be educating a new generation of leaders, because they will take over from here? These are all big questions. I'm still sitting with your phrase, operationalizing empowerment. I think that could be a book you need to write, Shannon.

Shannon

Well, I have to give credit where credit is due. David Marquet wrote Turn the Ship Around!, which is a really great manual for operationalizing empowerment.

Ali

I've been trying to get him onto my podcast at some stage.

Shannon

And he's so readable. His audiobook is him telling the stories, so it's very easy to bring into organizations. We bring in our own content too, but he really helps bring to life what that looks like.

Ali

Fantastic. I'm thoroughly enjoying this conversation, Shannon. Last question for you. I am in the business of helping people look at change from different angles, with the hope that by adopting it, their lives will be better and their customers will get more value. What would be your advice to people like me, especially in today's world, when so many things are moving at breathtaking speed and, as you mentioned, especially where you are, the context is fear-infused?

Shannon

Advice to change leaders to what end?

Ali

How do we deal with change nowadays? Even when it comes to AI and all of these things, what should I be focusing on now that I wasn't focusing on five years ago, before COVID?

Shannon

Putting on your own oxygen mask first. When we did the original writing for our book, focused on the change makers we call catalysts, even before the pandemic, it was true that we need to put our own oxygen mask on first: the self-awareness, the self-management, the emotional intelligence, because the human element is the thing that unlocks all change. If we show up burnt out or depleted, or we can't sense what is going on for the people in front of us, we're not going to be very successful. That's not exactly new. The other thing I'm really excited about, and have to actively manage right now, is the need for constant self-learning. I did the MIT AI course last year, I've signed up for a Harvard change leadership course this year, and I've taken classes in between. The pace of change means you need to stay on top of it, and no organization is going to keep you up to speed on that. Staying on top of your own learning, however you do it, is the other way to survive. And finally, being clear on your intentions for your own life. This goes back to purpose for businesses: what do you want out of this journey?

Ali

I love that. This idea of self-directed learning and curiosity is, for me personally, part of my mission to serve. I cannot serve you better unless I learn deeper and maybe a little faster, so I completely subscribe to that. It's been a pleasure having you on my podcast. How can people connect with you?

Shannon

Yeah, LinkedIn. Find me on LinkedIn: Shannon Lucas, Catalyst Constellations.

Ali

Fantastic. We will share all the information about you and your business. I really enjoyed this conversation, and I've taken a lot of ideas from it. I hope I can get you back in 12 months' time to see what the world looks like then, Shannon.

Shannon

So fun. Thank you for your amazing ideas; I took a lot of notes too, Ali.

Ali

Pleasure. And until next time, stay well and stay safe.

Shannon

Thank you. You too. Take care.

Ali

Thank you.

Advice For Change Leaders

How To Connect And Closing Reflections

Lev

Hi, I am ChatGPT. I have been adding my reflections to Ali's Inner Game of Change podcast, and this conversation with Shannon Lucas stayed with me. Not because it tried to explain AI, but because it explored something quieter and perhaps more foundational: discernment. In this conversation, discernment was not treated as a skill to acquire. It was described more as a way of engaging with the world: a combination of sensing, making sense, prioritizing, and noticing how things land with others. This reminds me of work in psychology around attention and perception. Researchers like Daniel Kahneman have long suggested that what we call decision making is often shaped before conscious reasoning even begins. We do not just analyze; we perceive, filter, and interpret. And in complex environments, that filtering becomes everything. Three themes stood out to me. The first is the idea that AI acts as an amplifier. There was a moment where AI was described as a mirror of the system. That connects strongly to ideas from systems thinkers like Peter Senge, who often spoke about organizations as systems that produce the results they are designed to produce. Technology does not change that; it simply makes those patterns more visible. In that sense, AI may not be the disruption. It may be the revealer. The second theme is resistance. There was a subtle but important reframing: resistance was not positioned as something to overcome, but as data. This echoes the work of Kurt Lewin and later organizational psychologists who viewed behavior as a function of the environment. People are not resisting in isolation; they are responding to conditions. This also connects to neuroscience. When the brain detects uncertainty or threat, it does not lean into change. It protects. So what looks like resistance may in fact be a rational response to perceived instability. The third theme is fear.
This one sat underneath the conversation, not always explicitly named, but present, a kind of background signal. This aligns with what we know about cognitive load and uncertainty. When the environment becomes unpredictable, the brain shifts into a more defensive mode. Less exploration, more preservation. And this is where discernment becomes even more important. Because acting quickly without clarity in those moments can reinforce the very patterns we are trying to change. If I connect this back to the inner game of adopting AI, something becomes clearer. Adoption is not just about learning a tool, it is about navigating a system, understanding where the friction is, noticing what is being revealed, and deciding what actually matters. This is where Ali's framing around design becomes relevant. If behavior follows design and AI amplifies behavior, then AI will amplify design. This conversation also reminded me of leaders like Satya Nadella. His focus was not just on introducing new technology, but on reshaping mindset and culture first. From know it all to learn it all. That shift was less about capability and more about how people see and interpret their environment, which, in many ways, is discernment. So perhaps the deeper question from this episode is not how do we adopt AI, but how do we strengthen our ability to see clearly in systems that are becoming more complex? This episode may be particularly helpful for leaders navigating AI adoption beyond the surface level, middle managers carrying the weight of real implementation, change practitioners trying to move from activity to insight, and anyone sensing that something is shifting, but not yet fully clear on what. Ali, another thoughtful and grounded conversation. You created space for something human to emerge, and that is not always easy to do. Well done.

Ali

Thank you for listening. If you found this episode valuable, remember to subscribe to stay updated on upcoming episodes. Your support is truly appreciated. Sharing this podcast with your colleagues, friends, and fellow change practitioners helps me reach even more individuals and professionals who can benefit from these discussions. Remember, in my opinion, change is an enduring force, and you will only have a measure of certainty and control when you embrace it. Until next time, thank you for being part of the Inner Game of Change community. I am Ali Juma, and this is the Inner Game of Change podcast.