Teaching Evidence-Based Management
The show is dedicated to sharing challenges, techniques, experiences and best practice when teaching evidence-based management to others (from undergraduates through to executives). Hosted by CEBMa Fellow and 20-year change management veteran Karen Plum, each episode features experts in the field who have practical experience of teaching evidence-based management and bringing it to life!
Working with uncertainty: a conversation on evidence-based management
Working with evidence-based management can be uncomfortable. It asks people to sit with uncertainty rather than reach for certainty — something many of us are rewarded for in organisational life.
This conversation explores what that discomfort looks like in practice, and how uncertainty shows up when people engage seriously with evidence — whether they are managers making decisions, students learning to apply evidence, or educators supporting that learning. The discussion ranges across research, teaching, and organisational decision-making, touching on credibility, confidence, risk, and the realities of working with evidence in complex, high-stakes environments.
Rather than offering tools or prescriptions, the episode stays with the experience of uncertainty itself — including where it becomes personally or professionally risky, and where it can open up better questions, more careful judgement, and new possibilities when people are willing to stay with it.
This is not a “how-to” episode. It’s an opportunity to listen in as experienced educators and practitioners think together about uncertainty, without rushing to certainty.
Host:
Karen Plum
Guests:
- Eric Barends, Managing Director, Center for Evidence-Based Management
- Denise Rousseau, H J Heinz University Professor, Carnegie Mellon University, Pennsylvania, USA
- Preston Davis, Clinical Assistant Professor of Management, Coles College, Kennesaw State University, Atlanta, Georgia, USA
Contact:
Eric Barends, Managing Director of the Center for Evidence-Based Management
Hello and welcome to another live episode of the podcast. I'm your host, Karen Plum, and what I hear when I talk to teachers and practitioners of evidence-based management is that this work can be uncomfortable. It asks people to sit with uncertainty rather than to reach for certainty, which is often what we think we need and what we're rewarded for. But we live in an uncertain world. Goodness knows we had a toxic dose of uncertainty during the COVID pandemic. And it's clearly not something that's going away, either in organizations or everyday life. So in this episode, I really want to look at how taking an evidence-based approach changes our relationship with uncertainty. And I've invited three people who spend their lives immersed in evidence-based management and thinking to dig into it. So let's first of all hear from them and then we'll dive into the discussion. Firstly, a warm welcome to Denise Rousseau.
Denise Rousseau:Pleasure to be here, Karen. I'm a professor at Carnegie Mellon University in Pittsburgh, PA, and I'm the academic chair of the Center for Evidence-Based Management, along with my colleague Eric Barends. I've spent 20-plus years doing research on evidence-based practice, helping think through what teachers need and what practitioners need, and developing training materials to facilitate their learning of a different way of approaching practice.
Karen Plum:Thanks, Denise. So, Eric, how about you?
Eric Barends:Well, I'm not as eloquent as Denise, so I don't have a nice slogan, but I'm the managing director of the Center for Evidence-Based Management, which always sounds very impressive. But the Center for Evidence-Based Management has only one employee, and that's me. So my performance appraisal meetings are always stellar, because I'm in charge of myself. But by nature, originally, I'm a practitioner. I worked in healthcare, I worked as a consultant, and now I know a lot about research and evidence-based decision making, etc. But I started out as a plain vanilla practitioner, needing to make decisions about employees or people or customers or whatever. And I think I never lost that background. I'm often in academic situations, but I first and foremost consider myself a practitioner.
Karen Plum:Okay. And last but by no means least, welcome to Preston Davis.
Preston Davis:Hey, yeah, happy to be here. I'm a consumer of what these fine individuals put out into the world, but I like to joke that I'm still a practitioner in a quasi-academic role. I'm a clinical assistant professor of management at Kennesaw State University, and I'm also still part of a boutique venture firm where we're pretty much agnostic and look at venture deals across pretty much the entire spectrum. So I generally see somewhere between six and eight hundred pitches a year in my capacity and role with them, as well as getting to take some of that and teach about it in my capacity as an assistant professor at Kennesaw.
Karen Plum:A lot of decision making going on in there then.
Preston Davis:A lot of uncertainty.
Karen Plum:Yay. Okay. Brilliant. So it's great to have you all here. Thanks so much for your time. And I want to kick us off by asking where you most often see discomfort showing up when people start to get serious about evidence-based management. Denise, can you kick us off?
Denise Rousseau:There are many, you don't want to say pain points, but sensitivities as people begin developing the skill and appreciation for evidence-based decision making. But I think the biggest surprise, and therefore the first confrontation with uncertainty, is realizing, as we well understand in the practice of trying to be systematic in decision making, that virtually no piece of evidence is perfect. Everything comes with uncertainty, some degree of information value, and some degree of questions about its trustworthiness. So no piece of evidence typically tells the whole story in organizational practice, and therefore we need to combine information from different sources: the science, practical experience, what organizational data tell us, and what stakeholders are concerned with. But in combining different sources of evidence, they can tell different aspects of the problem, like the blind men and the elephant touching different parts. And so there's uncertainty with regard to how the pieces fit, since they're potentially talking about different things, not confronting the same factor or concern. So those two issues, the quality of the evidence that exists and the potential for contradiction or missing pieces, lead to a sense of uncertainty that is inherent in paying attention to the quality of the information you use in your decisions.
Karen Plum:Yeah, so there are no easy answers. Eric, where do you see the discomfort first asserting itself?
Eric Barends:Uh, well, I have a little bit of a different approach. My problem is that I don't see the discomfort when management students start. And I see Preston laughing, because that is a phenomenon that we know all too well: specifically the executive students, those are managers, directors, people with the scars on their back and 20 or 30 years of experience, and there's no uncertainty there. Zero. They know exactly what they want to do and how they want to do it. And they're open to learning some new stuff in this executive MBA course or whatever it is, but their position is completely different from, for instance, medical students. Medical students are inherently uncertain. The first time in their residency when they're in charge of a ward, they're scared that something will happen, and they're also scared to call their supervisor because they don't want to look stupid. So the whole idea of being uncertain: if you go to your doctor, your general practitioner or family doctor, and you say, hey, I have all these symptoms and a pain in my ass, what's going on? Then the doctor says, I don't know, I have to run some tests, I can't say that beforehand. While a well-trained MBA, if you say, hey, what's going on here? The well-trained MBA will give you the answer with a lot of conviction and certainty. So where do I see the uncertainty? After a few lessons. They have these strong ideas, and I usually try to be a little bit provocative on the first day. So when it's, okay, tell a little bit about yourself: yes, I'm doing an agile project. Who isn't? Agile, I mean. And, you know, making fun of it a little bit, and asking what exactly is the problem, and they already start to get nervous. Like, who is this guy? This is annoying.
And when you get further, you get to what I think Denise calls the valley of despair, or that's how it's officially referred to, I believe. So when you bring in: well, we all think this, but the evidence shows that. It's nice to have some of those examples where people are absolutely convinced that it's X, but the research or evidence suggests, well, that really depends, it's not always the case. And you know, even large consulting firms claim stuff that turns out to be, you know, questionable at least. Yeah, then they start getting nervous. And it's crucial that you get there with your MBAs and management students, that you get to that point of doubt and uncertainty. So I'm really curious to hear from Preston about the pitches that these people make, because they have this brilliant business idea, and that of course is going to work and make them a billionaire. And then you've got, you know, this nasty professor or person asking nasty questions. So what's your experience there, Preston?
Preston Davis:Oh, come on, you know I'm a pretty nice guy. You know, reflecting on what you all were saying, especially in terms of our executive MBA students, and to your point, Eric, the way that I see it is that in their eyes, and I would argue even in the sense of how we feel when we go to a doctor, you're rewarded for certainty, right? They are being rewarded, promoted, for this certain mindset, this absolute determinism: I know what I'm doing. The second that we start giving this structure and this understanding, this way to critically think about something, it's not that they feel it's gonna contradict them to any extent, which I think is interesting, but I think the uncertainty that comes in, that really creates some level of anxiety, is that it takes their absolute determinism of "I know for a hundred percent fact this is what I'm gonna do" and moves it down to "now I'm at 80%, now that I've done this research and understanding." And what I always try to say is that's actually really great, because I'm all about the improvement of probabilistic outcomes, right? So if we think of Bayesian theory or something, that's the way I try to approach it, or teach it, or try to bring it alive. And the way you all discussed it, I think a lot about that Dunning-Kruger effect. And you can literally watch the students through the course, at least the program that I'm dedicated to, which is that executive MBA course: from the very beginning, being absolutely, overly confident in many ways. Somewhere in the middle, they're all like, I don't think I knew anything about business, and I've been doing it for 20 years.
And then finally, when they come out the other end, they're starting to come back up that escalation ladder of: oh, well, now I have these tools at my disposal, and I can start piecing together my gut with some evidence from academia, with other research that I can decipher, and then raise my probability of certainty a little bit more than at their lowest point. But it almost never gets back to that overwhelming amount of confidence. And when it relates to these pitches, I think it's interesting. I almost tend to gravitate more towards people that can be very humble in some respects and give the "it depends" answers, but with full confidence in their conviction and grit to find out what the real answer is, and willing to modify it, as opposed to the ones that come in with: this is the silver bullet. Whenever it starts from that standpoint, I'm almost immediately on the defensive of: well, no, you're gonna shift about a hundred times, and I've got to believe that you can do that, that you can run through and look at the evidence as you go out there and you're beta testing, or you're finding new different approaches in a lab and some new novel thing comes up. It's like, no, I want you to be able to explore that and not be so convicted in your original mindset. So I think I take an interesting position there, in that I actually welcome and kind of want: hey, this is our direction, but we're willing to accept all the inputs that are coming to us as we go about it, and change as we need to. I mean, that to me is gold, right? If you can find founders that are able to do that.
Denise Rousseau:I think what Preston's described is beautiful: becoming appreciative of uncertainty as something to manage. Reduce it where you can, and then anticipate different responses when you can't reduce some part of the uncertainty.
Karen Plum:I suppose what's coming up for me is that we treat uncertainty as an enemy. And it's not. It's there to help you to be more questioning, to go a bit deeper, not to rely purely on yourself. I suppose I'm trying to capture the thought of what being uncertain might contribute: that it might be a benefit for you, it might give you something you didn't have before.
Eric Barends:Well, it starts with the first step. I mean, the whole basic premise of why we take an evidence-based approach: it's about reducing uncertainty. But if you don't experience any uncertainty, you're not going to take an evidence-based approach, because you already know the answer. So it's the starting point, and that means the first step of evidence-based decision making is asking questions. And the first question is: what exactly is the problem we're trying to solve here? Or, how do we know for sure that this is a problem? Or, you know, in Preston's situation where a pitch is being made: hey, we have this great product here, it will solve this problem. Then the question is, how do you know? Did you do any beta testing? Did you talk to stakeholders, users, or whoever? So the first step is already problematic if you are very certain of yourself. And we have in many, many presentations this quote, I think it's a Mark Twain quote: the problem is not so much what you don't know, it's what you know for sure that turns out not to be so. So that's what I usually tell students. There's a lot of stuff that you think you know for sure, you know, common truths. We all know this in management, or change management, that's a big one, a lot of big assumptions there. And then you say: just take the position that it's probably not completely correct, or maybe it's sometimes plain wrong. Take that position and ask questions. Because if students don't ask questions, yeah, that's the basic part. The nice thing is, if I can choose between an evidence-based practitioner that's really knowledgeable when it comes to research, effect sizes, running surveys, you know, analyzing organizational data, or a person that can ask stellar critical questions, I would always go for the person with the inquisitive mindset asking questions. And I think there's some truth in there, absolutely. Yeah.
Karen Plum:You're still open to possibilities, aren't you? And bearing in mind that, of course, our teacher audience and our student audience are broader than just the exec students, we need to think about our undergrads as well. But it seems to me, when I talk about the discomfort, certainly what I've heard is that people are uncomfortable about the evidence itself, particularly when it doesn't align. How do you think about working with evidence differently in those two conditions: when it broadly aligns, and when it doesn't? I know we've talked about this on the podcast in our aggregation episode, but I thought it might be worth revisiting it here.
Eric Barends:Yeah, well, as we said in that podcast, thank God it doesn't happen that often. Nine out of ten times, most of the research kind of aligns or is supportive; it's usually in the contextual realm, like, oh, but there are factors, or this type of employee, or that type of company, or whatever. It's not often that it's full head-on contradictory, especially when you bring the evidence together in what we call a cross-validation discussion. That's a fancy term that just means you bring the scientific evidence to the practitioners and say, hey, this is what we found, and what do you think? And then usually the discussion is like, yeah, my experience is X, it's different from what the evidence from research says, but I understand it; within an organization, you try to find out what makes it different. I don't want to steer the discussion away from that, but you mentioned undergraduates. What I notice is that they are maybe more uncertain about what to decide, but they are well trained. They go to college and they learn that science is important and science needs to be trusted and followed. Follow the science, trust the science. And then there are these evidence-based teachers pointing out how terribly flawed most of the research is, and that you really, really need to be careful. And then they get uncertain, because they thought, hey, that's the magic bullet: we take an evidence-based approach, meaning we have a look at the research, and the research tells us X, so it's X. And then, nah, it's a little bit more complicated. So that's where I see the uncertainty grow when you try to point that out. And the other way around, when you aggregate things, often practitioners actually come up with things where you go: oh, but in this situation I actually have a different experience, and it was never researched by the researchers, because it's a contextual factor and they didn't know.
So, for instance, all the back-to-office mandates now: because we're doing a review on work from home, we found there was not that much research on that topic, because it's new. I mean, specifically in the United States, everyone has to go back to the office before the academics, you know, can keep up and do the research and publish it. We have to rely on the more experiential evidence from the practitioners and see what happens. So yeah, I see uncertainty with undergraduates too, not so much because things contradict, but because some of the sources turn out to be less reliable than they figured. Yeah.
Preston Davis:Yeah. I just think that's interesting in context, because I did teach undergrad a little bit here and there, but just from the mindset of how we're, you know, brought up, thinking about the person standing in front of the room, this expert. Which I'm glad we're not going to call this anything to do with experts, because I am certainly not one in any way, shape, or form. But I think there's a difference, right? Because when I teach the executive MBA students, it's very different than teaching undergrad. And they give everybody kind of a slogan, each of the professors they meet with. So mine is: I'm the "it depends" guy, which I think is perfect, and suits exactly who I am. But I don't know if I would get that same kind of nickname with undergrads. And so I think that would be an interesting topic for you too, Denise, because I know you teach undergrad and have a lot more experience there: I think undergrads look at you as the expert. And so it does create, like Eric was saying, this weird uncertainty when all of a sudden you say: well, no, you've kind of got to challenge some things, right? It depends. There's all this contextual, you know, nonsense that might have to come up. And from that standpoint, when all of a sudden that person standing at the front of the room is admittedly not an expert, in some degree or sense, what does that do for the environment? And when I'm in the executive MBA session, they already come in, especially the last few years, with science being totally under the gun to some extent, almost like a naughty word, right? So sometimes the first thing I get is: well, I'll find whatever evidence I want to support my claim. And I'm like, well, that's not exactly what we're doing. And I get it, all right.
I get it. But I think that's been an interesting kind of macro effect right now: science or evidence being under a different type of microscope, a different kind of attack on those terms and that idea. And now you can go to social media or wherever and find any guru you want to tell you whatever you want, right? So how do we critically appraise that, and how do we teach that? And how do you get people just to be open-minded about it? I know I just dropped a lot in there, so I'll be quiet, but it's interesting when I hear Eric talk about that.
Denise Rousseau:I think the thread through this discussion is very much what Preston put his finger on, which is this issue of trustworthiness. And if you come from an initial position of: this is good, this is bad, I'm an expert, you're not, you're failing to really ask the critical question in evidence-based practice, which is: what is the information I have available or can get, and how trustworthy is it when I get it? That's one competency that evidence-based practice really does grow, building a critical sense of what makes a body of research or a systematic review trustworthy, or somewhat so, or not, and what makes this practitioner an expert or not. And there are clear criteria for trustworthiness, based on the areas they've practiced in and the success of their previous practice, by which you can begin to gauge whether a so-called expert is trustworthy or less so. That's an important concept for people, because it broadens their thinking as well as their questions. And going back to Eric's point about the critical thinker: if we can broaden the questions people ask, so they're not just peppering with questions to show how smart they are, but asking questions because they're curious. Like, how reliable is that information? What's the track record of this person who's making a claim? You're creating a path so that it's a little bit easier to figure out: well, what can I know with reasonable trustworthiness or confidence? Where are the points of doubt? And I'd be interested in how Preston would approach this issue, which is that in some circumstances you can pretty much predict what will happen in an analysis of the situation, because the evidence is so strong. Like in the context of healthcare, where they have checklists for intakes of people with chest pains in an emergency room. That's based on very rigorous research and meta-analyses over years. So we know what to do.
There's almost a standardized path, as opposed to: hmm, you know, this is kind of a different situation in an unusual context, and there may not be a historical answer to what to do in this case. But in evidence-based practice, when the uncertainty is high, we know to run experiments, we know to build scenarios, to test and learn and revise and test again. And that's how uncertainty is dealt with in the real world: not going a hundred percent with one approach, but being more deliberate and learning what is working and what's not. So in the investment biz, what about the role of truly uncertain situations? How would you suggest people handle that?
Preston Davis:Wow. Yeah, it depends. I knew I could bring that up. I mean, it really is "it depends," because it depends on what that uncertainty is. In a lot of ways, you know, the diversification of what we do at that level means you have an expectation of failure, right? From an investment perspective. So you can run your betas and try to figure out your overall positioning across all the places that you have. And if you need to be right one out of ten times, it's different; and obviously it's different for managers, whether or not they can actually go out and test. And when we work with portfolio companies, I'm a big A/B tester. If you're gonna do a pricing thing and you have enough clients, why are you not gonna test it out on a couple of different cohorts, randomly sampled, and then say: oh well, now we can see that this is the better approach, and we can roll it out in a bigger way? And I think people don't often do those things, which is unfortunate, because it adds a lot of value. But that level of uncertainty, again, really does depend at the foundational level. Because with my students, I always go back to: well, what's the best available evidence we have? Is there anything that's pointing you in some direction? And then you can give it the "it depends," you can give it the contextual disclaimers or what have you. But going back to probability: if right now I'm at a 10% probability, and I can gather enough little data points to move me even to 15%, that's a big change in the investment game, too. Right.
And so if I can pull in these extra data points, it helps me modify my approach, and I can get a few more percentage points of probability towards an expected outcome. That's huge, right? I don't need 90% or 100%, especially with what we do. And I think that's really scary, depending on the context, right? Because obviously, you go see your doctor, like Eric was talking about earlier: I don't want to walk in there and have them do all the blood work and all the analysis, and they still come back like, yeah, I don't know, we're like 20% sure you have this, but there's probably another 20% that it could be that, so we don't know what direction to send you in. So I think that depending on the field and the analysis, uncertainty has, you know, different weightings. And I think with managers, it's hard. It's hard when, again, going back to the core, you're rewarded for your confidence and commitment in decision making, and that's how you got to where you are, right? And then you want to level up, and you want to think about the things that you could have done better, and all of a sudden people start bringing in evidence and contextualizing everything, and you realize: wait a minute, maybe I shouldn't have been so certain all the time, and some of this was probably luck. That gets scary, you know. I mean, that's why I did it.
Karen Plum:What strikes me is that, for these executives, they're in positions where it's personally risky for them. They're having to navigate this space, as you say, Preston, where they were hired because they have a reputation for great decision making and always knowing what to do. But there are real reputational and political consequences to making decisions, and making those decisions under pressure, and the temptation to fall back onto gut instinct, what I've always done before, must be really strong.
Eric Barends:Yeah, I mean, that's exactly the reason why I love undergraduates and bachelor's students, because they don't have that burden. And I always explain to them: you're in such a fortunate position, because you're going to start your career in an organization and you're this person, this young puppy. I knew you were gonna call them puppies! In the eyes of the seasoned executive manager, you're this young person with no experience whatsoever; these people think you're, you know, completely ignorant, stupid. Just treasure that label and use it to your benefit, and start asking what you think are stupid questions, because you get away with it. You can ask stupid questions. You can ask in a whole room of executives, when the CEO says, oh, we know this da-da-da-da-da, so we're going to do that and that. I would be careful, because it can be a career-limiting exercise. But in general, you're in a better position to ask completely sincere, and maybe naive, questions: why is that? Can you explain? Because I don't know how that works. Can you explain how that works? And then it turns out that people are actually not as certain as they think they are, because they have to acknowledge: well, yeah, to be honest, we haven't looked at the data. But yeah, maybe we should. Well, this is the general idea. And then you see the uncertainty, you know, shining through. So I usually explain to undergraduates that they have a superpower, which is being able to ask questions, and they should feel comfortable in their uncertainty. I think that's the point. Being uncertain is absolutely okay. But if you feel comfortable in your uncertainty, you know, you can act really strongly. And I think that if you're young and you're uncertain, you're maybe less uncomfortable about it, while when you're an executive, a person who's supposed to know what needs to be done right now, you go: oh God, I don't have a clue.
And you just fake it, because you feel you have to. But I've seen CEOs, executives, you know, stating: hey, we have too limited information, there are a lot of uncertainties here, there are a lot of strong assumptions. I know the guys and girls sitting here, all my executive colleagues, have strong opinions about that, but to be honest, there is a lot of uncertainty here, and I think we should look into that. If a CEO or an executive makes that statement, you can see that people appreciate it. So even there, if you embrace the uncertainty and feel confident with the uncertainty, then it's a completely different situation. So it's a very important part of teaching evidence-based decision making, but also of learning how to take an evidence-based approach: how to deal with that uncertainty. It's crucial.
Karen Plum:Preston, coming back to my statement about the situations where it's personally risky for people, is that something that you spend much time in class helping people address?
Preston Davis:Yeah, I think that's an interesting question, and it resonates a little bit more, too, with what Eric was saying. I love the idea of undergrads not having the experience paradox, if you will. The way I pitch it is that we run all kinds of simulations and experiential learning in our program. And so I'm always a big one for: try things out in our safe environment before you take them back to work and decide you're gonna totally revamp, say, adopting a coaching mindset with your team. Does it apply the right way? You know, you might not have enough psychological safety on your team right now to do some of these things, so you've got to build that. How do you build that? So I don't think they wrestle with it in the sense that they think it's going to totally jeopardize them. And obviously, these are people that are wildly successful, or at least definitely on the path to being wildly successful. So they're not going to put their total reputation out there on something. Normally the way I try to approach it is: just bring it in, even if you don't a hundred percent share it or reveal the approaches you're taking initially, because you're afraid to say, you know, I'm using these evidence-based practices and I'm doing these rapid evidence assessments, even if you're only mind-modeling those to some extent. Give yourself some context and support for what you're going with, and baby-step into it. I think that's where I guess I lean. I've never had any of the manager or director level people say: oh, well, I'm gonna get fired for this. I'd be like, oh yeah, don't do that.
But I think the harder one to grasp is seeing yourself as somebody who already makes good decisions, having been successful for a while, and then, and this goes back to that idea of metacognition, being able to think about your own thoughts. Can I think about the decisions I've made in the past, whether they were right in the moment given the evidence that was there? Can I rethink all these things I've thought about, get into it in a humble way, and look at myself in a self-aware manner so that I can actually grow from it rather than getting stuck? And again, the group is already biased, because the people who come back to school at this stage of their careers and lives are already wanting to seek more, find a new path, or grow in some way. Maybe they'll grow in a different way than they originally thought, but they're there for different purposes than a traditional undergrad, where some people are thinking, "I need to check this box, jump through this societal hoop, to get a job." When you come back to school at 40, you're a very different, differently motivated person: you want answers, you want understanding. From a critical thinking standpoint, I love being able to use evidence-based management in that way, and I hate using the word framework, but to give them a structure for a really deep, thoughtful process they can go down to challenge themselves, right? To be open about it and embrace this idea of critically appraising evidence. So that's a long-winded non-answer, maybe, to your original question, but I love it from that standpoint, because I was the exact same way.
You know, I was a practitioner, and I like to think I had some success. Then I really started to question myself: was I making the right decisions, and why do I think this way? I always hated the phrase "this is how we've always done it, and it's always worked." I'd think, but there might be a better way, right? So I started having all these questions, and I thought, I need to go back to school, I need to learn what maybe I don't know. And I fell into that same trap: now that I feel like I know a lot more, I am much more uncertain. But I love it, because I love wrestling with uncertainty and being in it.
Karen Plum:Yeah, yeah, you've learned how to walk with it.
Preston Davis:Yeah, and it's real, right? Because that's real life. I was saying this earlier to a group, so I'll use it here: I think one of the worst things we ever did as a society was make complexity out to be an evil thing. Everybody wants the silver bullet, something simple, as if there are simple solutions out there and if you just do these three things, life is great. And I say, no: what's really cool and unique about life, what makes it so worthwhile on this small rock in an infinite universe, is that life is really complicated, really complex, and it's from that complexity that we get all this beauty. Living in the uncertainty, to some extent, gives you exposure to that. That's a much more philosophical lens, but I try to bring it into evidence-based practice to say: it's good to feel uncertainty, it's good to question, and it's good to think about your own thoughts in a way you might not have done before. So I take that approach to help push people in that direction: you don't have to have all the right answers, and you don't by any means always need the simplest solution, because there might not be one, and that's okay.
Karen Plum:Okay, well, that certainly sounded like a manifesto, and not an "it depends" in sight.
Preston Davis:No, no, you'll have to kick me off my soapbox every once in a while.
Karen Plum:So I'm going to move us towards a close, because I think it's about time we wrap things up. I'm entirely with you, Preston, on embracing uncertainty and sitting with the discomfort long enough for the evidence-based approach to do its work, and I suppose we each find our own ways of doing that. I think I've probably just heard a lot of what you would say in answer to my question, so I'll start with Denise and Eric: what does help people to sit with the discomfort, progressively, do you think?
Denise Rousseau:Having task strategies for addressing the uncertainty. It's not a void; it's an array of activities and processes. One of the first issues for people is to be able to think about: what is the uncertainty here? Is it something particular to this situation, or are we really doing something novel? Think of people coming back into the office after years of remote work. In a truly novel situation we have no historical knowledge, so research will not help us. The more novel and non-routine something is, the more you learn by doing. What I heard Preston describe was very much an opportunity for pilot testing, rather than betting your whole company that somebody made the right decision. That test-and-learn approach is one of the reasons why organizations that spend their time dealing with uncertainty get good at it. I'm thinking of the US Army as an example: they do after-action reviews all the time, not just to train the people who went through the action to do it, quote unquote, better, but to think about what they did in this potentially novel situation, what seemed to work and what didn't, so they could build an understanding of what they were trying to do and see whether it could be replicated, which would tell them they're on the right path. If you have a lot of uncertainty, your prior knowledge won't be that useful when it's true uncertainty, not a historically familiar circumstance. But if your uncertainty is about something that has existed before, where there is the possibility of research and expertise, then I think you go down a different path to try to reduce your uncertainty.
Karen Plum:Okay. Eric, any final thoughts from you on that subject?
Eric Barends:Well, I was thinking of a meeting that Denise set up in Toronto. They were almost all academics, with a few who had a practitioner background. And I always felt like: come on, let's not make it too complicated, the evidence-based approach is just a set of skills, just do it, follow the steps, and so on. But I'm afraid the academics are right. There's more to it than a set of steps or skills or practices; I can hear Denise cheering. The whole idea of dealing with uncertainty, being able to deal with ambiguity, what Preston says: this is not a black-and-white answer, not a dichotomous, binary situation of "should we do this or that?" That aspect is also something you should train your students in. The way we teach the evidence-based approach, evidence-based decision making, it's mostly skills: this is what you do, this is how you study the evidence, this is how you figure out whether the evidence is trustworthy, you look at this and that methodologically, and so on. But maybe, if you teach evidence-based decision making and you have five sessions together with your students, half of one of those sessions should be about how you deal with uncertainty, how you deal with ambiguity, how you deal with complexity, maybe not even as a professional, but more generally, as a person, in your life. Life is full of complexities and uncertainties. And not running away from them is actually what evidence-based decision making is about: reducing uncertainty. But first you need to acknowledge the uncertainty and be able to confess it, to speak up: sorry, there is so much uncertainty here, I really don't know which way to go. It depends. Yes, depends on what?
There's so much uncertainty. So dealing with this is something to be, I hate using the word, but mindful about with your students, and something to dedicate some time to. Because it is a thing: if you don't handle it well, if you don't know how to deal with uncertainty, you won't be as good an evidence-based decision maker as you could be if you really knew how to embrace it.
Karen Plum:Thanks, Eric. Preston, last few thoughts from you?
Preston Davis:Yeah, still somewhat indirect, but I go back to some of the simple things. One is naming it, like you said: naming the uncertainty, calling it out, understanding it. The other aspect, which I think was underlying here, is that there's a level of, call it community, or group, or just people. It's about not doing this in isolation, as if you were the sole master of figuring something out. When we get to executive MBAs versus undergrads, the one thing I do like, and sometimes we have to reteach them a little, is that they already understand that academia, research, knowledge is not a selfish pursuit where you work alone and compete with everybody else. They've been practitioners long enough to know that groups matter, teams matter, and that it's really hard to build or do anything by yourself. I like that especially as it relates to uncertainty, because the second you can name it, and the second you have a community you can talk about it with and understand it, it opens you up to a better opportunity to feel comfortable with it, right? To share in that uncertainty, as opposed to sitting isolated in a room by yourself, thinking you have to figure it out and wrestle with that uncertainty alone. So I think that's always the last element. I love the concept of after-action reviews for the same reason: they're not done alone. You're in a group of people having open discussions, naming uncertainty, calling it out from an accountability perspective, asking the next hard question: what could have looked different here? Hey, we were successful, but how could it have gone better? How could it have gone worse?
What did we do? And again, I'm a huge introvert and I love being by myself, and even I will recognize that being around people and having these difficult conversations out loud, sharpening that knife, is super important. So that's my closing thought: wrestling with it doesn't have to be done only by yourself. Go out there and participate in the world together with other practitioners, other academics, whoever it might be, so that you can really fine-tune it and get it out in the open.
Karen Plum:Okay, so alongside your "it depends" strapline, we might have "evidence-based management is a team sport," right?
Preston Davis:Love it.
Karen Plum:Okay, well, I'm just gonna wrap up now and say that I think what I've taken away from the discussion is that an evidence-based approach isn't about being more certain, it's about being more careful, more curious, more able to stay with the not knowing. And perhaps part of the work, whether we're managing or teaching, is modeling that stance and that approach for other people. So that's it for this episode. Thanks very much, Denise, Preston, and Eric for exploring uncertainty with me.
Eric Barends:Thanks for inviting us.
Karen Plum:Thank you. And if our listeners experience this sort of discomfort, then they hopefully will take some comfort that they aren't alone and that there's something to be learned by sticking with it and seeing what it reveals. Thanks for listening to this episode. See you next time. Goodbye.