The Digital Project Manager

Why Top-Down AI Strategies Might Fall Short Without the Right Leadership and Expertise

Galen Low and Kelsey Alpaio

When is the right time to hire an AI leader, and what do they actually do once they’re in the door? In this episode, Galen Low sits down with Tim Fisher, VP of AI at Black & White Zebra, to unpack the real-world impact of AI leadership roles. Together they explore the tension between hype and practicality, the mix of skills needed to bridge tech, business, and people, and why AI leadership is less about flashy experiments and more about building trust, change readiness, and operational maturity.

Tim shares candid insights from his own path into the role, offering a grounded look at how organizations can approach AI without losing sight of their business goals—or their people. If you’ve ever wondered whether a VP of AI is a made-up job, or how AI leadership can actually smooth project delivery, this conversation is for you.

Galen Low:

What the heck does a VP of AI do?

Tim Fisher:

This title, I think, can mean a lot of things. This role should be practical. It should not be flashy. I have business results in my head. This "do cool stuff with AI" thing, which I think people think is the job description, I don't think that's valuable to a CEO or a CFO.

Galen Low:

When is the right time or wrong time for an organization to hire an AI specialist like yourself into the leadership team?

Tim Fisher:

I think it might be too soon if leadership is still debating whether AI is a fad, if there's no data infrastructure, or if your culture punishes experimentation.

Galen Low:

What's the biggest challenge that you see in front of you?

Tim Fisher:

The biggest challenge is creating a culture of change, a culture of failing fast, and a culture of comfort with ambiguity. Implementing AI in an organization is way more of a human thing than it is a technology thing.

Galen Low:

Welcome to The Digital Project Manager podcast — the show that helps delivery leaders work smarter, deliver faster, and lead better in the age of AI. I'm Galen, and every week we dive into real-world strategies, new tools, proven frameworks, and the occasional war story from the project front lines. Whether you're steering massive transformation projects, wrangling AI workflows, or just trying to keep the chaos under control, you're in the right place. Let's get into it. Today we are talking about AI leadership roles within digital-first organizations and how hiring an AI executive can impact the way projects and operations run — for better and for worse. With me today is Tim Fisher, the new VP of AI at The Digital Project Manager's parent company, Black & White Zebra. Tim is a deeply experienced media tech executive, entrepreneur, data science geek, and hack engineer, hailing from companies like People Inc. (formerly Dotdash Meredith) and retail giant Target. But more than any of those things, Tim is someone who likes to solve problems using people, technology, and a little bit of cleverness. His role as VP of AI at Black & White Zebra is a new role within our media organization. So we're gonna put Tim on the hot seat a bit and unpack what his role is all about, and what other pieces need to be in place before an organization can unlock the value of this kind of role. Tim, thanks for being here with me today.

Tim Fisher:

Thank you very much, Galen.

Galen Low:

And welcome aboard. So, yeah, full disclosure, Tim and I work together. He's a new hire, and as part of our hazing ritual, I dragged him onto the podcast. But I thought it might be a really good lens, because there are a lot of organizations out there right now looking for AI leadership to drive their AI strategy, or create their AI strategy, or unwind and untangle their AI strategy. And I think there's a lot of murkiness around what that means. Some skepticism, some optimism, some zealots. So I thought maybe we could unpack it today. I joke about the hazing thing, but as soon as we got on a call, I knew I needed to have you on the show, because you've admitted to me that you are as nerdy as I'd hoped you'd be. And I know that my listeners want to hear the raw, ego-free lowdown on the impact that an AI leader can have on an organization. So far, you and I are pretty hit and miss when it comes to following an agenda versus just going down interesting rabbit holes, but here's the tentative roadmap I've sketched out for us today. To start us off, I wanted to get one big burning question out of the way — that difficult, direct, and maybe even insulting question that everyone wants to know the answer to. But then I'd like to zoom out from that, and I wanna talk about three things. First, the skills that someone stepping into an AI exec or leadership role needs to have, as well as when the right time is to hire for that role. Then I'd like to talk about the impact of AI leadership and what that looks like for folks in the trenches who are delivering projects and tackling day-to-day operations. And lastly, I'd like to zero in on the future of AI-enhanced ways of working: what that looks like as it pertains to projects and general operations, and maybe even what we need to be careful about.

Tim Fisher:

That sounds wonderful.

Galen Low:

So as I mentioned at the top, there's been a lot of talk in my project management communities recently about the need for AI experts to be at the leadership table, in executive roles, in senior leadership roles. But there are also a lot of people who think that a chief AI officer, or any kind of AI leadership role, is actually not a real job — or not yet, anyways. They say that it's a bit cart before the horse, that it's a hiring move that's maybe a little more flashy than it is practical. So I thought I'd be direct about it and ask you: what the heck does a VP of AI do? What does that mean for digital project leaders, and how does that translate into AI-enhanced ways of working throughout an organization?

Tim Fisher:

Well, first of all, I'm not sure I've heard this "not a real job" thing. So maybe this is just you, and this is part of the hazing thing, but.

Galen Low:

You come from a long line of AI executives.

Tim Fisher:

No, I'm kidding. I've certainly heard that before, and I'll be honest, even to me, VP of AI sounds a little made up. At my last job, I actually did make up the title that I had. But look, this title can mean a lot of things. It can be really tech focused, it can be really product focused, it can be business focused, or it can be a hybrid of all of those, which is how I would describe my role. And this whole cart-before-the-horse thing — look, this comes up all the time. The thing is, this is new. There's no AI process yet. There's not an employee rubric for what an AI title means or where it sits in the organization. There's no definition of the role that everybody's agreed upon. But just like with everything in AI, you have to start somewhere, and absolutely some organizations are gonna get this wrong at first, and that's totally okay. And like you said, this role should be practical. It should not be flashy. I have business results in my head, real business results, here at BWZ and also previously at People Inc. This "do cool stuff with AI" thing, which I think people think is the job description — I have a real job description — I don't think that's valuable to a CEO or a CFO. And I'm sure that's driving a lot of the eye rolling and the "what is this job for?" For a real, practical AI role, the goal of a role like that in leadership is that the person is tasked with building capabilities, with starting something, focused on real business results. That's really the job. I have a lot of responsibilities: education up and down the org chart, automation strategy, operational rigor where there currently isn't any, AI workflow consulting throughout all the different departments of a company, change management, legal and risk management. It goes on and on.
There's a lot to this, and the leadership role can be really helpful in bringing it all together. It depends on the size of the organization, the organization's needs, and the skill sets of the hire. It might be building a team, like I did at People Inc. Or it might be dotted-line relationships to existing product teams and legal teams in a really big company, or maybe more of a solo independent contractor, or a mix of them. The big secret here is not really a secret: AI in executive titles feels a little bit silly right now, and I totally admit that. But if you scope it properly, it can bring clarity, order, some light process (not too much), and real KPIs for something that otherwise feels incredibly ambiguous right now. Specifically for digital project leaders, and really anybody delivering anything at all, an AI executive brings an opportunity to be a really close partner, to translate a bunch of chaos into really practical tools and workflows, which for someone who's delivering something means smoother projects.

Galen Low:

I like the framing of change agent, and it kind of sits in between. You're right, it's nobody's role necessarily — arguably maybe a chief operating officer's, but I think there's more to it than just operations and the way it transforms ways of working. The other note I took was change leadership. And what I meant by that is, you know, we were joking about a made-up title. You actually did make up your title in your last role, but the core of it was that it was necessary, because someone needed to take on the challenge of this change. It happens to center around AI, but to your point, it's not just about the technology and tinkering with it and having cool ideas and flashy headlines and good PR about cool AI stuff. It's actually operational change. It's change management, it's people leadership, and somewhere tucked in there is what the technology is capable of. I really liked your inflection on education, right? Because part of that change leadership is bringing people along, and the AI stuff is moving so fast. Any kind of emerging tech moves fast, but I think this is pretty unprecedented in terms of what I've seen in my time. Trying to even just keep up — that is a full-time job. Having someone educate us about what is current is just so much better than swimming around Reddit and Udemy and YouTube and all these things and hoping for the best. And then bringing it all together into ways that we can do things: actual workflows that have business impact, and business impact that's measurable.

Tim Fisher:

I think you hit on a lot of really important points there. I especially like the one about people being in different places, or bringing people along. I think one of the struggles of a role like this is to meet people where they are and hold their hand and bring them into the future that this role maybe lives in, right? People exist all along the spectrum. There are people who are just getting started, because they've been really scared, or haven't had the opportunity, or whatever it is. Then there are people living in the right now, and in three and six and 12 months ahead. And an organization, especially a larger one, can have large groups of people all along that spectrum. Being able to meet folks where they are, and do that when there are so many people in so many different places, is, I think, a really big challenge.

Galen Low:

Yeah. It's like the unification — or maybe that's even too big of a word — but even just taking people from where they are and accounting for that. Because a lot of the organizations I see shooting from the hip, and I won't get into it right now, are the ones we're kind of making fun of in the memes, right? They're like: do the AI, do it now, we need the AI done yesterday, do it, please. That's not AI leadership. That's just dictating four words at a time. So I agree, and I appreciate the disambiguation here. I wondered if we could zoom out a bit, because I've kind of been framing this as, I don't know, a change agent. And I think maybe some folks listening might be like, well, anyone can do that. That's just change management, and you just happen to have AI in your title. But I don't wanna brush the technology bit under the rug. I know we've been talking about the fact that it's more than just the technology, sure. But when it comes to the technology, we're talking about fast change, and some folks will say that AI isn't something anyone can even be an expert at yet. It's too new. We haven't figured it out yet, we haven't run all the studies and gotten all the results, so how can anyone be an expert? So I'll challenge you a bit and just ask: can you tell me a bit about your background? What credentials or accolades or experiences make you qualified to represent AI at the leadership level?

Tim Fisher:

I think it helps that I have the physics and math background. Super big nerd. At some point I switched to business because I needed to pay rent, but that's definitely where a lot of my excitement comes from: lifelong science nerd. I got into tech and systems engineering and things like that. I had an opportunity at a previous job to create documentation around processes and to design workflows, and as incredibly nerdy as this sounds, I absolutely fell in love with that. So I've spent 30 years communicating about complex ideas, and back to this mesh of technology and people things, that's a really important part of the role. Also, at People Inc., I built a top 10 technology site almost from scratch over two decades. My role was demystifying technology and explaining it to people who struggled to understand it, and I scaled that skill across the internet. So that was, I think, a really big set of experiences that helped me learn how to talk about this stuff in a way that is really important right now — back to our conversation about all the different places people are right now. I've also started a few small businesses and run about a dozen media brands, so I've been responsible for the business up, down, left, and right, and I understand how it all works really well. I think the culmination of all of that is a qualification in AI, because I have this deep technical understanding and I also have this ability to clearly communicate to multiple different types of audiences. So there's this nerd layer, right, where I know how language models are trained and how they operate, and then there's this business layer that sits on top, where I'm able to confidently communicate and understand when an LLM is good for X and maybe not for Y, and how we can get from zero to one. So did I go to some sort of AI leadership school? No, I don't think that exists.

Galen Low:

The two-day bootcamp.

Tim Fisher:

Right. I say this, but there's probably an AI leadership school right now that I'm just not aware of. All of those things are spinning up as we speak.

Galen Low:

Well, it's really interesting you say that. We're joking about it, but in some ways what you've described to me is this rare — some might say unicorn — sort of trifecta. There's the business and entrepreneurial understanding: you know what makes a business go, you've run businesses, you've started businesses. There's also the geeky technology layer, right, having that deep understanding of it. And then this communication layer of being able to explain it, almost being the translator between the two — between the three, I guess: technology, business, people — to say, how can we get this done? And then you've done it, right? With technology, you've scaled operations and teams and sites. Not necessarily all AI, but AI was almost the natural progression of what you were doing already. So it makes sense that you'd make up your own title at your last job. I'm taking this too far, but you had mentioned physics and math, and I immediately — not to draw comparisons between you and these individuals — thought of folks like Neil deGrasse Tyson on the astrophysics side, or Hannah Fry on the mathematics side. People who just have this mix of qualities where they get it, they get the blend of things they need to do to educate, to enact change, to keep people pointed in the right direction, and to lead, while still understanding all of the inner workings of it. Now that you're saying this out loud, I'm like, gosh, this is a really difficult role to hire for in some ways, you know.

Tim Fisher:

It is. I don't know. I'm struggling to move past you throwing my name out there with Neil deGrasse Tyson, so we can stop right now if you'd like.

Galen Low:

Well, you know, what an amazing role he plays, and others, right? Of bringing it down to a level that everyone can get on side with, but not blurring it into lies, and not simplifying it so much that it's too basic — actually distilling the essence of what needs to be done. Choosing what level, right? We were talking about bringing people along from wherever they're at. Some people wanna know how the language model is trained, and you can explain that. And some people just wanna know: am I supplying too much context to ChatGPT? It's taking me like 25 minutes to write a prompt. Is this normal? Just tell me, help me, Tim. Right? And I think that's the job in some ways — to be like, okay, well, you're here and you wanna know this, and I can educate you at your level. Is it Wired that does those videos? The "five levels" of something, right? Five levels of harmony, five levels of, you know, juggling or hacky sack or something like that. You can explain it on all the different levels and still bring people along. We joke about "explain it to me like I'm five," but actually that might be the most helpful thing for some people who just need to get over that hump.

Tim Fisher:

Sometimes it is.

Galen Low:

I alluded to it earlier: timing. Timing for this role, hiring for this role. I came across this meme that I think is circulating around. It's CEOs, and they're like, "Who are we? CEOs. What do we want? AI. When do we want it? Now. What do we want it for?" And they're like, "Eh." So there's this sense that business leaders, CEOs, want AI without really understanding the outcome they want from it. I get it, and you get it from the competitive perspective, right? If you're not doing it, then you're kind of falling behind. You kind of have to be on this bandwagon. But when I look out there, some organizations just don't look ready, even if they're saying that they're ready, that they're doing it. They just don't look ready. So I thought I'd ask: when is the right time or wrong time for an organization to hire an AI specialist like yourself into the leadership team? What level of maturity is required to achieve AI-related goals? And maybe, what are some examples of what those goals might be?

Tim Fisher:

I think it depends on a lot of factors. Obviously, CEO and executive support — some organizations actually don't have that, which oddly often comes along with a lot of blind pressure to do something. And the org's agility. Most organizations aren't particularly agile, and these are not small changes. These are large changes happening inside organizations because of AI. I think when conversations shift from "let's see what we can do," or "let's experiment," or "let's see what's possible" to "let's do this at scale," that's a good sign. I also think scattered bottom-up AI use around the organization that maybe needs some coordination, some sort of order, and some light process — that's another really good sign. It might be too soon if leadership is still debating whether AI is a fad, and that is something that is still happening. Or if there's no data infrastructure. As anyone who has spent any time working with AI at all knows, the inputs are extraordinarily important, so if you don't have data that can go into an AI system, your company might not be ready. Or if your culture punishes experimentation, or there's not a lot of tolerance for ambiguity. Wow, is this gray. This is some really gray stuff. It's getting better, but if you're not prepared to absorb and deal with that, that's not a great sign. Most importantly, though — and I think we all take this for granted in the digital world we live in — there's the digital nature of the organization. There are many organizations out there that aren't particularly digital, even in the year we're in now. It's all about inputs and outputs, and if most of your stuff is still in file cabinets, you're better off hiring somebody to come scan your documents before you create some kind of an AI strategy. It's really important. It's also the right time if there are teams drowning in repetitive work, if you know there's opportunity for automation, right? And you even mentioned this a little bit ago: competitors winning at automation.
Look, if your competitors are doing something that's clearly good for their business that you're not doing, it might be time to ask yourself if you need to make some changes. Right? And then of course, if your CEO is asking about AI in your strategy. It gets to a point sometimes where waiting is the bigger risk than jumping in. That's how I would summarize all of that. You also mentioned goals, and I think goals are really interesting. Unless you're an AI startup or a foundational model creator — so if you're OpenAI or Google or Microsoft or Amazon, or maybe you're building something that literally was not possible before AI — then your goal is to do something with AI. AI is the actual goal. But I think for most businesses, it's about transforming the business and operating at its best using the latest technology. So I would say the business goals should not change at all. You just change how you reach those goals. Maybe a hot take, though it probably shouldn't be: I think the AI revolution for most companies isn't actually about AI itself at all. I think the value is the forcing function for automation and transformation. In fact, I was just talking to somebody recently, and I said I'd estimate maybe 25% of the projects I've deployed so far in my AI roles could have been deployed before large language models. It's just that the business didn't prioritize those things, because there were no dedicated transformation leaders. There was no inward pressure to ask ourselves, how can we do this better? We just do what we've always done. So back to the title thing: I honestly think these types of roles will probably evolve away from having AI in their titles and toward having transformation in their titles, because that's really what I think it's about.

Galen Low:

You know, with the titling, we've seen fluidity in titling over the past little while in areas like cybersecurity — like DevSecOps. Even RPA, process automation. I haven't looked, but I'm sure there are dozens of VPs and directors of process automation, RPA, all of these roles created to do exactly what you said: become more efficient at achieving the same goals. And the target may shift, right? In other words, I just had folks on as guests who do AI call center transformation, and I'd guess all those metrics stayed the same, but the target is to reduce call volume, or reduce call time, or increase the resolution rate — did we solve our customer's problem? I'm sure there are AI metrics within that, but the core business goal is not to become an AI organization. It's to better serve our customers, better serve our agents, hopefully, and also just do business better. What I liked about what you said earlier is — we use this phrase "business maturity." In my head it's like this flat, still water, where it's like, oh, we've achieved business maturity nirvana, everything is calm, and now we're ready to launch into the next thing. But you didn't say that. You said either there is support and data infrastructure and you're ready, or things are happening at the lower levels — people are experimenting — and if you don't organize that energy and focus it somewhere, it's just gonna disperse, and it's gonna be mayhem and chaos, and it could actually damage culture before it impacts your business in any positive way. And I thought that was neat, because I do see that a lot. There's a lot of energy, and sometimes it's positive and sometimes it's frantic, panicked energy, like,
I need to figure out AI, so I better do all these things and then hope that I keep my job. And some people are just really excited about it. They're like, oh, I might not have to take notes during a meeting. Fabulous. Say no more. Let's do it. Let's build a machine. And to your point, not inventing AI, but using AI to do what they're already doing. But I think the biggest thing I picked up on in there was that notion of safety, right? This psychological safety to experiment. Because I think the other bit is that there are some organizations where all of that experimentation is happening, but behind a closed door, secretly. No one's talking about it. No one knows it's happening. And frankly, there's probably no governance around some of the data going into those experiments. It could be mayhem that business leaders don't even see, 'cause they haven't created that culture of it being okay to raise your hand and say, hey, I'm playing with this, is that okay? Here's what I'm doing. And if people think they're gonna get their hand slapped, then you can be damn sure they're gonna keep doing it quietly, to keep themselves relevant for their next job, not for their current job.

Tim Fisher:

Amen. Yes.

Galen Low:

This is really interesting, by the way. I think I came into this conversation with almost a utopian vision of what AI transformation looks like. But actually, it can be messy. It could be solving problems, it could create problems, and it needs leadership. And I guess that's the crux of all of this: it needs someone who can zoom out and zoom in, see the forest for the trees, and guide people along, so that the energy that gets spent on it is productive and actually enacts change that can be measured.

Tim Fisher:

Exactly. Yes.

Galen Low:

I wanted to zoom into what I call life in the trenches. My background is in project management, business development, account management — we are in the day-to-day operation of a business. We've been talking a bit about this as we go, but when it comes to adopting AI into the way we actually collaborate on our projects and do day-to-day operations, do you see that as something that is best done top down within an org? Is that the idea of the AI leadership layer, the AI leader?

Tim Fisher:

I think you have to do both, top down and bottom up, and I think the right AI leader can make both of those directions successful. Top down, I think, is a little more clear, right? Setting goals and strategies, a really clear direction. You can tackle really big workflows and systems, create these big projects, and completely change how your business does this enormous thing it does every day. That's the one that's a little more clear. The bottom-up piece, I think, is often thought about a little too late, or it's assumed that it will take care of itself, or that it's not important. But that's the education and the empowering of individuals — and you're getting at a lot of this in what you were just saying. A really interesting example: at my last job, we had an intake form for great ideas. You know, "I've got this thing I do every day, and based on your TED Talk and my research, I know that LLMs can solve this for me, and I would love for your team to build that." But a lot of what individual people do are individual, bespoke things in their roles. So there's not this enormous ROI to throw a bunch of engineers and product folks at a solution for something that one person does — but it can be transformative for that one person. So empowering people to solve problems themselves with some of these low- and no-code tools that a lot of us are familiar with, and using ChatGPT or Gemini or some other conversational AI as your coding partner for Apps Script behind Google Sheets — empowering people and giving them permission to do things like that is incredibly impactful.
So I think it is both of those things, sort of meeting in the middle, but I think they both have to happen to be really successful.

Galen Low:

In my head, there's almost this layer that is similar to a project management office: is this a project? If it is, we can provide support for it. We've got our team, we can put it on rails, we can educate you. If it's actually not really a project, go ahead and do it yourself, and we can kind of guide you. I think that is a really interesting division right now, because for me there's this tension between, on one side, no code — you could just do it yourself, you should have done it already, just do it yourself, alone, by yourself — and then the layer of, actually, it's really complicated. We need engineers and people who understand the ethics and the governance, and more care needs to be taken with these projects if we wanna do this right. But I think you said it really clearly: there is this line between what needs to scale into impact for a business versus what is more of an individual productivity thing. Folks on my team, they're in Lovable, they're in n8n, right? They are just building a thing to help them do that thing that they do. And maybe we can actually walk through this pipeline, because for us, we're kind of individually making that call: ah, that's just a me thing, so I'm just gonna do it myself. Or: wow, this could impact every podcast that we record — we gotta build this, guys. Could we step through that flow of a bottom-up approach, where people are experimenting, what they can do with an idea, and how you would evaluate whether it should be, okay, yeah, we should build an entire operational machine around this and use our resources to develop it —
don't worry, you don't have to code it yourself in Lovable — versus, actually, I can show you how to do that, it's definitely gonna be useful, but the scale is smaller in terms of what we wanna invest in it?

Tim Fisher:

Yeah, I think that's an interesting question. This is such a bespoke answer depending on the organization and how you're put together. And it's funny, as I'm listening to you ask this question, I'm thinking about the variety of situations out there with respect to how much individual contributors even understand what they do and how it has an impact on the business. So we're back to this transformation and introspection thing I mentioned a little bit ago, where we don't spend a lot of time, or I would argue nearly enough time, in business critiquing what we do, how we do it, and its value. And I'm feeling very lucky at this particular company because I see this happen a lot, which is great: the push from the top to help people understand how every part of whatever it is they do in their job impacts the goals of the business. That constant challenge of communicating these big strategies, having them trickle down into what people do, and having clear metrics and KPIs and things like that. So I say all that to say there has to be a process by which, when you are experimenting on yourself and automating something that you think maybe has an opportunity outside the walls of your job, you're able to quantify the value to the business. Depending on the maturity of the business around the connection between the top and the bottom, around the value of the work people do and the business goals, that can be one of two things. It can be obvious: something you're constantly tracking and reporting on anyway, with those trackings and reportings visible to other team members. Then it becomes an easy task for an individual contributor to say, this is the value I think automating this would bring to my organization.
But without that, which I think is much more common out there in the business world, there needs to be some sort of process for quantifying it. I know at People Inc., one of the enormous tasks we went through across the editorial organization was digging really deep and trying to figure out what is it that people even do, what is the cost of those tasks, and what is the value they bring to the business? And that helped us tackle it from a top-down perspective. But it was interesting, and frankly very fortunate, that we saw a lot of the individual things people were working on and excited about matching up in ways that made us excited to scale some of that work. But it's definitely not going to happen, and this is back to this whole conversation we're having about an AI leadership role, it's not going to happen without someone wearing the hat of, this is something we need to create a little bit of process around, or at least have the information about. And again, I think back to the forcing function thing. I think a lot of organizations five years from now are going to come out of their initial experiments and tackling of AI with maybe not a bunch of fancy PR-worthy AI products to talk about, but with a level of maturity around how they think about things and how they decide to stop or start doing new things. I think that will be worth whatever anxiety came along with, I don't know what to do with AI now in this organization.

Galen Low:

One thing that I had turning in my head is that earlier you said there's a sort of consulting aspect to an AI leadership role, right? A hundred percent. And as you were talking about understanding what people do, that whole process of auditing, the word audit showed up in my head and I was like, that's a really scary thing for your classic C-suite to say: hey, we're gonna do an audit of what you all do and decide whether it's valuable or not. Everyone's thinking of DOGE, right? They're thinking, oh my gosh, someone's gonna tell me that I'm not valuable and haven't been for the past 15 years, and I'm gonna be out on the street. Whereas I think there is that leadership layer in between, almost, to be like, okay, well, we do need to think about the way we work through the lens of AI. It's not AI for AI's sake. We need to use it to enhance our business goals. And like you said, we're gonna find out that some of the stuff you're doing is amazing, things we aren't giving you credit for that are great, and we wanna amplify that or keep it as it is. And there are also gonna be things that are tedious and boring and that you hate to do, and we wanna look at that too, because there's ROI there as well. But circling back to what you said originally, that person needs to be trusted. You can't just walk in with the chainsaw, the proverbial chainsaw, or a real one, and go, hey, we're gonna trim out all the fat and then robots will take your job, or whatever. There's this trust layer of someone who can paint the picture clearly at both levels: there's a business impact and there's a people impact, and here's how it all fits together. Whereas without that layer, it could be riots in the streets, right?
It's like, yes, we're gonna look at everything you do, pull up your process documentation, we're gonna inspect it to make sure that you're adding value. That's what everyone doesn't want, but everyone does want the tedious stuff to go away. Trust is, it's absolutely about trust. Yeah. No, I think that's huge. I hope I can have you back on the podcast, 'cause what I'd like to do next time is dig into, and it occurs to me now, all these things that come up from the suggestion box, which I love, by the way. I love that idea that if you've got the right culture and people understand the goals, you're gonna get really good ideas in that submission box. Or if you don't, you're gonna get a lot of silly ideas and someone needs to parse them. But either way, there's that layer that says, actually, you know what? This is really good. Someone's made a really strong business case, 'cause I do think it's business casing at the end of the day, to build a thing. It's green-lit by leadership, and now it's a project. That's what I would love to get into next time: some of these things spawn projects, which obviously my audience is very interested in. I have been talking a bit about AI-enhanced ways of working, like how can we also deliver projects better using AI, so I won't get too deep into that now. But I thought maybe I could roll it up, because it's become clear to me in this conversation that it's a very challenging role. I know a lot of people, on paper, are like, that's a made-up role. What does that person even do? Aren't they just the messenger for the CEO with the proverbial chainsaw? But it is challenging.
So I thought maybe I'd just get your perspective on like from where you sit right now in this organization, Black & White Zebra, what's the biggest challenge that you see in front of you, and what is your hypothesis about how you're gonna solve it?

Tim Fisher:

The big question. I think the biggest challenge is creating a culture of change, a culture of failing fast, and a culture of comfort with ambiguity. Notice I didn't say a single tech thing. The headlines are all about model hallucinations and data quality and prompting challenges, all these awful, horrible things, and these are all solvable challenges. The models update constantly and they get better all the time. There are plenty of ways to deal with hallucinations: grounding your inputs in more trustworthy information, RAG for anybody that's into this stuff. But implementing AI in an organization is way more of a human thing than it is a technology thing. In other words, I think an AI executive's job is to manage the expectations at the top, which is me telling the CEO, no, there's no magic AI button for us to buy or build, that's not how it works. But otherwise it's managing change all over the org. That's really what it's about. And you mentioned this earlier: there are all kinds of popular change frameworks out there for businesses. There are a number of companies that make billions of dollars a year helping to manage all of that stuff, right? But I don't think they're gonna be right for the speed at which AI is coming at us and the scale at which it's making changes. This change is faster and scarier than the internet. AI fundamentally alters the what and the how of work. We don't know what tomorrow looks like. That's frightening. I had an opportunity a couple of months ago to go spend some time at OpenAI, the company that made ChatGPT. They've hired a chief economist because so many people, them included, of course, want to understand and talk about and help direct how this is all changing everything at a really high level, you know, the economy.
So I'm in this room with all of these C-level executives. A few of us have AI titles, but mostly it's CEOs, CIOs, CFOs, the people who are making decisions about these things. And the conversations in the room were not about their businesses or about where their vertical was headed. They were about what their kids should major in in college. This is such an enormous change that these people, in this once-in-a-lifetime opportunity to spend time at this company, in the big room, talking about these things, had a very human reaction, because they know just enough about what's happening to know that my daughter, who was gonna be a computer scientist, is now wondering if that is the right path forward. And the other side of that: well, what is the exciting next role in the world? Where should we be spending our time thinking? That really hit me hard, hearing all these people step outside, without a second thought, of why they were there, and they were there representing lots of really big companies. It wasn't the questions that I expected to hear, but it made all the sense in the world. So I tell that story just to make really clear: these are enormous changes, and it's not just the people down at the bottom doing the work. Everyone is very anxious about what this all means. All that said, though, I think the opportunity every company has is buried in all of that anxiety and fear. Humans and organizations that can change are going to win, full stop. That is what will happen. The other part of this was, what are my plans for doing this, right? I don't think there's a universal template. I do not. I think there are some themes that are really important. So I would say: normalize AI. It's here and it's messy. So talk about it a lot. Talk about it honestly, talk about it openly.
Get leadership on board if they're not on board, and then get them to talk about it more. Talk about it honestly, talk about it openly, and no secret plans. That is such a destroyer of trust. Talk about the plans, talk about 'em honestly, talk about 'em everywhere, all the time. There's a theme here you're probably hearing, which is radical transparency. I'm a huge fan of it, and it's the fastest trust builder I've ever come across. And you've said trust a couple of times; I cannot express how important I think that is. I also think you have to normalize rapid change. I hear a lot of frustration with leadership changing their mind all the time, but changing your mind with new information is strength. That's a reality-based response to a world where new information is being thrown at you all the time. I would be very fearful of an organization that sets an AI strategy that doesn't change about every quarter, because that probably means you're not paying attention to what's going on in the world. I also think you have to normalize failure. You have to fail fast. People hear that phrase all the time, but you have to build systems to make sure that you don't beat a dead horse, as they say, and just continue on down. There's just so much change happening. The harder part of this, and the one people don't talk about often enough, is failing publicly, and I mean that inside of your organization. We all, I think, deeply understand that when we fail, we learn something, and then we change and we move on. If you fail publicly in your organization, the entire organization learns something. That is frightening for most people to consider. But when you fail in public, whether that's in a public Slack room or in a meeting or whatever it might be, everybody there learns something, and I think that makes the organization move faster.
Then I think the most important thing, and I hope I say this enough: admit that you don't know. Normalize "I don't know." This overconfidence in what the AI future means and its impact on your organization or the world? Someone's selling you something, and often themselves, when they're expressing that. I know Sam Altman has a job as a marketer for his company, right? That's a very different story. But for leaders in AI and traditional leaders in an organization, that's a bad sign. So yeah, I have an easy job, right?

Galen Low:

Yeah, totally. So the one thing that really stood out to me there: we started talking about this notion that, yeah, when I was working at a big consultancy, there was a whole team of change managers, it was an offering to manage that change, but it was always manage that one change, or manage some change. And what I appreciate about your framing is the notion that this is a generational change, right? You're in a room full of executives and they're talking about what their kids are gonna major in, because we are thinking about change at a scale that is actually impacting multiple generations ahead of us. It's not, should we renovate the lunchroom? This will fundamentally change the way organizations and groups of people are structured to also deal with more change. We talk about the pandemic and how it primed us to unlearn some of the myths we had about the way we work, and this is almost the next wave: guess what, we don't have to have an airborne virus to figure out new ways of working. This technology can push us forward, but it's gonna change everything, not just the work we're doing today, but the education system, right? Career pathing, all of these things. That's why it's a big deal. That's why it makes sense to have it in the title: not VP of Change or VP of Change Management or Executive Change Agent, but VP of AI, or Chief AI Officer, CAIO, I guess the acronym is. And the reason is, it's not a buzzy title. I mean, it might sound buzzy, but it actually represents this fundamental shift. The change, not just a change. That's what's really interesting about it. Its impact is pretty big and cannot be overstated, and even still,
most of it's not about the technology. It's about the radical transparency, this trust. And if I were to be honest with you, to show my bias, that's what I'm super interested in, because I've worked in a lot of places where psychological safety is something that might be written on the wall but isn't always the thing we practice. That cover-up-your-failure, cover-your-butt instinct is systemic. It's natural, as, you know, social creatures. This is a moment of innovation where leadership is not knowing, but maybe taking a chance on it, and then figuring out how to pivot and change as we fail, as we learn together.

Tim Fisher:

Agree.

Galen Low:

Tim, thank you so much for this. Just for fun, do you have a question that you wanna ask me?

Tim Fisher:

Would it be too rude if I ask you: you mentioned at the very beginning the question of whether VP of AI is a real job, so I'm going to flip this right back to you. Let's imagine a world where AI has reshaped everything, certainly delivery work. Do you think being a project manager is about to become a made-up job title? Is it about to change so much that it doesn't make sense anymore? Evolve into something bigger, something different? Where do you see this taking project management?

Galen Low:

Touché, Tim Fisher. Touché. But I deserve it. And actually, you know what? Here's an unpopular opinion: I would say probably yes. The function, I think, elevates, but the role and the title actually could go away. For anyone who's seen me speak in person over the past couple years, we do this exercise where we make up jobs that don't exist, right? Director of Human and Machine Ways of Working and Methodologies, stuff that we don't do right now, but gosh, if we could, everything would run smoother. It's just that we're so stuck in the weeds, and the job is, well, everyone describes it as herding cats. You're a project leader, you don't have authority over anyone, everyone's running around, running amok, leadership keeps changing their mind, and you've gotta manage the iron triangle of scope, schedule, and budget. Part of me actually hopes it elevates beyond that. And to your point, maybe it gets a new title, because it's already imbued with a lot of stuff. We already have a branding problem with project managers, but the excellent project managers out there do more than what most people think of as project management. And they deserve to be titled as such; their role deserves to be described as such. And there's an education piece there too, just like with AI: delivery of value through human and machine collaboration is gonna be really important. It's important today, and it's gonna be more important later. As we talked about, more projects are being spun up because of emerging technology like AI. And in some ways it's just not good enough to make a plan and hope it stays the same for 18 months. That's just not gonna happen anymore. But yeah, coming back to it, I think it will become a made-up job title. It'll become this anachronism, but in a good way.
Because I think we then shift. I don't want my title to be "computer" in 2025, right? In 2045, you might not want your job title to be Project Manager. Heck, 2035, 2030, change is happening fast. But yeah, I do think so. I actually do. I think it will change, and I think that could be good.

Tim Fisher:

Yeah, I agree.

Galen Low:

Thanks for playing along with that. And Tim, thank you so much for spending the time with me today. This has been a lot of fun. Before I let you go though, where can people learn more about you?

Tim Fisher:

Just head to LinkedIn, search for Tim Fisher, BWZ. Fisher without a C. And you can't miss me.

Galen Low:

Thanks again, Tim. I really appreciate this.

Tim Fisher:

Thank you very much, Galen.

Galen Low:

That's it for today's episode of The Digital Project Manager Podcast. If you enjoyed this conversation, make sure to subscribe wherever you're listening. And if you want even more tactical insights, case studies and playbooks, head on over to thedigitalprojectmanager.com. Until next time, thanks for listening.