Dr. Journal Club

Challenging Assumptions in the Race to Diagnose

January 25, 2024 · Season 2, Episode 3

Embarking on a thought-provoking journey, our latest episode delves into the often-unseen impact of cognitive bias in the medical field. Drawing from Daniel Kahneman's "Thinking, Fast and Slow," we explore Type 1 and Type 2 thinking, uncovering how these mental shortcuts influence clinical decisions. We share stories, strategies, and insights on skepticism, emphasizing the value of revisiting initial diagnoses.

Navigating medical heuristics, we examine how biases like availability bias and base rate neglect shape our clinical landscape. We discuss balancing Occam's razor with comprehensive differential lists and the influence of diagnostic momentum. Confronting the challenge of 'doing nothing' in medicine, we highlight the art of balancing intervention with watchful waiting. Our reflections stress self-scrutiny and continuous learning in clinical decision-making, inviting the audience to deepen their understanding of evidence-based medicine and its psychological complexities.

Learn more and become a member at www.DrJournalClub.com

Check out our complete offerings of NANCEAC-approved Continuing Education Courses.


Introducer:

Welcome to the Dr. Journal Club podcast, the show that goes under the hood of evidence-based integrative medicine.

Josh:

Please bear in mind that this is for educational and entertainment purposes only. Talk to your doctor before making any medical decisions, changes, etc. Everything we talk about here is to teach you guys stuff and have fun. We are not your doctors. Also, we would love to answer your specific questions. On www.drjournalclub.com you can post questions and comments for specific videos, or go ahead and email us directly at josh@drjournalclub.com. That's josh@drjournalclub.com. Send us your listener questions and we will discuss them on our pod. Hello sir, how are you doing?

Adam:

I'm doing great, Josh. How are you?

Josh:

I'm doing great. Welcome everybody to the Dr Journal Club podcast with Josh and Adam. We are doing great. That's the latest, with all the caveats therein. I don't know about you, but I've just had this crazy day, running around like a chicken with its head cut off. I just got back from Seattle so I'm playing catch-up. It's one of those catch-up days for sure. I don't know what's going on.

Adam:

Have you seen that meme where the dog is sitting in the kitchen saying everything is fine while his whole entire kitchen is on fire? It's on fire, yeah.

Josh:

I was wondering how broadly that applied. Is the algorithm just so good that they know what my life looks like, and that's why it's getting surfaced to me, or is it just relevant to everyone?

Adam:

I think that's just the life of everyone.

Josh:

Yeah, fair enough. Today I was thinking we take a little bit of a departure from the type of thing we normally do to talk about cognitive bias. This was a paper I came across for another project, actually the forensics project. I was doing some background reading and I was like cognitive bias, yeah, it's relevant in forensics but it's super relevant in clinic. I found a really good review paper and I thought we should chat about it.

Adam:

Yeah, I would say: the human brain is a complex organ with the wonderful power of enabling man to find reasons for continuing to believe whatever it is that he wants to believe.

Josh:

Yeah well, Adam, I don't think people have realized he's a philosopher, he's a polymath, he's a polyglot. He definitely is a polyglot and a polymath. I don't know about the philosopher part, but yeah, that's how the paper starts. He didn't just have that quote in his back pocket, but I thought it was a good one.

Adam:

I just casually memorized it. You didn't have that in your education, Josh?

Josh:

Well, that's how I'd pick up girls back in the day. You'd get a little bit of philosophy memorized and you thought you were...

Introducer:

You're a poet and didn't even know it.

Josh:

What's that?

Adam:

You're a poet and didn't even know it.

Josh:

Yeah, yeah, there you go. I think I was quoting Bertrand Russell. I was this deep atheist.

Adam:

Were you actually?

Josh:

Yeah, no, I was a total nerd. I was a total...

Adam:

You didn't know how to pick up girls?

Josh:

Yeah, no, it was just ridiculous. Oh my God.

Adam:

I'm so sorry.

Josh:

Certainly, maybe with Voltaire it would have worked, but certainly not with Bertrand Russell. Anyway, we move on to the matter at hand. Okay, so tell me everything you know about... well, here we go, let's talk about this. In your medical training, because you went through training after I went through training: I didn't get any formal training in cognitive biases and how they play out in medicine. I don't think it was really until I went to a doc talks maybe 10 years ago, and Ryan Bradley got up there and gave a whole lecture to a room full of really experienced doctors about how they're making cognitive errors left and right and need to consider it. I just thought it was a great talk and I got super hooked on cognitive biases. But are they teaching it now in school? Did you come across it in your training?

Adam:

Honestly, I don't know if that's a yes or no, but I do know that part of the educational landscape in general is actually changing, in a way of trying to, not eliminate, but kind of reduce our biases on a social level, and I think that then plays into other aspects of life, including medicine and other occupations.

Josh:

Yeah, yeah, I think that's fair. There's more of an awareness of how our thought processes are biased in general. Okay, so in this paper, and we'll give a link to it in the show notes (I'll send the paper to Michele), there's this idea of cognitive biases, also known as heuristics. So I guess let's start with that. We're used to thinking of biases as this bad thing when it comes to social bias and racial bias and all that. Cognitive bias is a little bit different. These are sort of mental shortcuts that for the most part, maybe even 90% of the time, are extremely efficient, correct, and get you what you need with less cognitive energy, less fatigue, and a faster response. And my sister sent me this book for my birthday last year, Thinking, Fast and Slow. Tells you how bad I am at reading: she sent it to me a year ago and I'm like 60 pages in.

Adam:

Oh yeah.

Josh:

But this is a great book, and when I was reading through this paper I was like, oh, this is what Kahneman's talking about. We'll also give a link to the book; it's a great one. I don't know if Kahneman invented or pioneered this. Was it System 1 or System 2 thinking, or Type 1 or Type 2 thinking? I can't remember the phraseology now, but basically this idea that there's fast thinking and slow thinking, and the fast thinking is almost always good and that's what we use all the time, but every once in a while you need to slow down and use your more thoughtful evaluation. Anyway, long story short, it's relying on that Type 1 thinking that can really bias us clinically, and that's what they focus on in the paper.

Adam:

Yeah, and I don't know, I think it's an interesting thing to think about and to kind of study, because in a way, as humans, we're kind of programmed to develop cognitive biases, right? Like if you touch a hot stove when you're a kid, you don't know it's hot initially, but the first time you touch it you scream and cry because you're in pain. And that's a way of learning: hey, don't touch a hot stove. In medicine specifically, we sort of see one, do one, teach one. We see a patient present with a complaint, and that complaint has certain characteristics, and we try to identify patterns to come up with a diagnosis and a treatment strategy. We see that as students, then we do it as residents, and then as attendings we teach it to other students. So there's this cyclical process of things being ingrained, and you don't really recognize it; you only become challenged with it when someone or something tries to break it.

Adam:

And it's something that happens all the time, especially if you're a really busy clinician, right? If someone comes in with, you know, chest pain, you kind of go, okay, let's go down the heart attack...

Adam:

You know, algorithm, if you will, to make sure that's not what's going on, we get them acute care, and then, if it's not that, we jump to the next most likely thing and the next most likely thing. But it can become a problem when something's not really fitting the mold and you use that shortcut a little too abruptly and kind of stick your head in the sand, if you will. Let's use Crohn's disease as an example, or ulcerative colitis: oh, they don't have bloody diarrhea, so it's unlikely that. Sure, it's unlikely, but it's still possible. And so I think that sometimes as clinicians, and this is probably what the paper is alluding to, you can get in trouble with these cognitive biases, even though these heuristics are what we use as mental shortcuts.

Josh:

Mm-hmm, yeah, I think that's right, and I like the word mold, because as long as things fit the mold, this Type 1 thinking, this fast thinking, works most of the time and you're probably good. It's just that you're going to miss some stuff, and sometimes that can have really important outcomes clinically. So I thought it would be fun to go through some of the examples of the biases that they bring to the fore. There are all sorts of different ones, and at the end they talk about different ways to bias-proof yourself, I guess from a cognitive bias perspective anyway, like cognitive bias training, and a lot of it comes down to just awareness. So I think going through these and talking through some examples might make sense.

Adam:

Yeah, yeah, you just start from the top.

Josh:

Yeah, I think so. Just a couple of other things I wanted to underline that were in the paper that I thought were interesting. So this was kind of crazy: some of these studies looking at error rates amongst doctors estimate that about 80% of these errors involve a cognitive error within a patient encounter, and a lot of this stuff is happening with the patient themselves, which I think makes a lot of sense and is reasonable. And then also this idea that it's been really hard to study cognitive biases in medicine, because the clinical decision-making process, to use their words, is somewhat invisible and mysterious, which I thought was great. We're trying to standardize it with guidelines, sort of like these algorithms that are coming out, and that actually helps a lot; these sorts of checklists help us avoid some of these cognitive biases.

Josh:

But a lot of times there's that gut feeling, that intuition when a patient comes in, which is often right but can be quite wrong, this intuition or gut feel that this is what's going on with the patient. Are we calling that intuition? Are we calling that Type 1 thinking? Are we calling that heuristics? What are we calling it? I think it's probably the mold that the mind is making after seeing hundreds and hundreds, maybe thousands, of cases, and how useful that is over time. So anyway, it's a very hard thing to study. What else, yeah.

Adam:

Well, I think it's also curious that you can actually kind of see it in the real world, if you pay really close attention or you have insight into some subtleties that occur. You'll see it with younger docs: they'll do maybe overly extensive lab ordering or diagnostic imaging, or really spend focused time doing clinical exams. And then you have a really well-seasoned doc who is so used to seeing so many things that, not that they're doing bad work, but they're not doing as thorough a workup, and thorough doesn't necessarily mean good. It's kind of that heuristic, that cognitive bias, coming into play, where they've seen so much that they use that gut instinct, if you will, to know what's going on, whereas a younger clinician is more likely to do an excessive workup.

Josh:

Yeah, I think that's right. And so younger clinicians, because they don't have that experience, are using Type 2 thinking almost exclusively. They're thinking through the problem, looking at alternatives, because that's what they have and they're not comfortable with it yet. And then the more comfort they get, the more they shift to Type 1 thinking and just being like: this fits the mold, this is what it is, it's a standard UTI, let's move on, type of thing. And so, yeah, I think that's exactly right. And again, there's value in these heuristics, especially in emergency situations, ERs and things like that. These things can be very helpful, but the problem is when it doesn't work, et cetera. So, okay, let's talk about a couple of these I liked. I highlighted a few. This is a great paper; I think they give like 20 different examples, if people are interested. I highlighted one, two, three, four, five, six that I thought were kind of cool to chat about. One was the availability bias, which I thought was a kind of neat one.

Adam:

I really liked that one.

Josh:

Yeah, okay, should we go through our favorite ones? Do you want to do availability and then I'll do base rate neglect? That's a fun one too.

Adam:

I mean, yeah, sure. So with availability bias, basically what it is, is maybe you missed something that was pretty significant, and so when someone comes in with a similar complaint, you're doing that workup for the big thing you missed on a lot of people, even when it might be excessive. The example they gave in the paper was someone who had a pulmonary embolism, a blood clot in the lung. If you miss that, then for the next person who comes in with shortness of breath, even if it's just from a viral upper respiratory tract infection or maybe asthma, you're ordering lung CT scans on everybody, as opposed to really working up: hey, is this actually a pulmonary embolism, is that the more likely diagnosis? Or is it just the fact that you missed something pretty significant, and now you're overcompensating for it?

Josh:

Yeah, and I was thinking of a corollary for me. The examples they gave, and that you explained really well, are this rare thing that happened once recently, or that was really important because you messed it up, and so now it's front of mind, it's available in your mind. But also, in our niche practice, I think about this all the time. Like 99% of the patients that I see are for small intestinal bacterial overgrowth, and so when people come to see me and the symptoms fit that, that is my lazy way of thinking: okay, well, yeah, it fits the criteria, it probably is small intestinal bacterial overgrowth, or there's a few other things we should rule out, or yada, yada, yada. So I think even if something's not rare, you just see it all the darn time. You know, if you've got a hammer, everything's a nail. If you're a surgeon, you look at things one way; if you're a psychotherapist, you look at things another way. And so I think that's also part of it. Maybe that's a different type of bias, but the framework in which you work really colors the case. If I were in family practice and someone came in with all these symptoms, wouldn't I view it differently? I guess that's one thing I think about a lot.

Josh:

So the other thing is somewhat related, which they call base rate neglect, which is this idea of how likely something is given your base rate risk. So let's say you come in with some stomach discomfort, some stomach pain, and you're, I don't know, 20 years old, with no family history of anything, and it happens with really heavy meals, and maybe you're a little bit overweight, yada, yada, yada. You are going to think about reflux and things like that, sort of the obvious things. And that's because the rate of people who would have, say, stomach cancer is going to be very, very low in that population, right? And so a good clinician is constantly thinking about the base rate, or at least that's the argument.

Josh:

Sometimes we think about this in Bayesian terms: what's your pre-test probability and your post-test probability? If someone comes in looking like this, with these demographics, what are the chances that they have this condition? And that should inform your workup moving forward, right? So that's the idea. And if you neglect that, maybe because something is front of mind, like the availability bias, and you're like, I just missed a stomach cancer, so I'm going to scope this person right away, type of thing, that might be one example of base rate neglect, which I think is interesting. On the flip side, yeah, sorry, go ahead.
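
To make the pre-test/post-test idea concrete, here is a minimal sketch of the Bayesian updating described above. The sensitivity, specificity, and pre-test figures are invented for illustration; they do not come from the paper or the episode.

```python
# Minimal sketch of Bayesian pre-test -> post-test updating after a
# positive result, using the positive likelihood ratio.
# All numbers are invented for illustration.

def post_test_probability(pre_test_p: float, sensitivity: float, specificity: float) -> float:
    """Probability of disease after a positive test, via the odds form of Bayes."""
    lr_positive = sensitivity / (1 - specificity)  # positive likelihood ratio
    pre_odds = pre_test_p / (1 - pre_test_p)       # probability -> odds
    post_odds = pre_odds * lr_positive             # multiply by LR+
    return post_odds / (1 + post_odds)             # odds -> probability

# Hypothetical test: sensitivity 0.90, specificity 0.85 (LR+ = 6).
# Low-risk patient (pre-test 0.1%): a positive barely moves the needle.
print(post_test_probability(0.001, 0.90, 0.85))  # ~0.006, still under 1%
# Higher-risk patient (pre-test 30%): the same positive means ~72%.
print(post_test_probability(0.30, 0.90, 0.85))
```

The same positive result carries very different weight depending on the base rate you start from, which is exactly why neglecting it distorts the workup.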

Adam:

I was gonna say, I think another good example is if you practice in the States and, let's say, you're somewhere in the Midwest with a primarily Caucasian demographic, what you might be used to may be laceration repairs and diabetes and hypertension and high cholesterol. If you then move somewhere a lot more diverse, or you see a large immigrant population, you might see lots of different infectious diseases or different presentations. Perhaps a fever, to one clinician, where they're practicing, is just an upper respiratory tract infection, but once they move to a different demographic, the base rate of, let's say, an infectious disease may be much higher given the population or the area they're living in. And if you don't have that knowledge going into it, you may not consider, I don't know, malaria at the top of your differential.

Josh:

Yep, yep, I think that all plays in, and so it's just a fascinating way of thinking about medicine: the probabilities going in and the probabilities coming out. And that's a really good point, that you have to know your population well enough to shift that base rate, if you will, before evaluation and testing. But also, they brought up a neat example, and this comes into the evidence-based medicine skills stuff that we talk about. If you have a really low base rate, let's say one in a thousand in this population for having this type of cancer, then even if the cancer test is very, very sensitive and catches 100% of people with this cancer, and even if it's pretty stinking specific but not perfect, because the base rate's so low, when you're testing thousands or millions of people it is very often more likely that a positive test result is a false positive than a true positive, right? And so they give a couple of examples of that, just working out the math. But that's really important to think about, because even though I know this, when I get a patient's test results back and it's a positive result, I'm immediately concerned. We had a, I don't know, positive CA 19-9 the other day and I was like, oh my goodness, we've got to work this up. But the base rate was really low for this patient, and it turned out it was a false positive. And whether it's CYA or concern for the patient or whatever it is, you see a positive test, you want to act. We very often forget that we have to take that in stride with the base rate, and that even positive tests, even good tests, if the base rate's really low, there's a very high chance that the positive result is a false positive. Yeah, exactly, cool, cool, all right. And we've got a bunch of videos on test interpretation on the Dr Journal Club website, so you should check that out.
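
To see why a positive result is so often a false alarm at a low base rate, here is a minimal sketch of the arithmetic. The 1-in-1,000 base rate echoes the example above; the specificity value is an assumption for illustration, not a figure from the paper.

```python
# Minimal sketch of the base-rate math with illustrative numbers: even a
# very good test yields mostly false positives when the condition is rare.

population = 100_000
prevalence = 1 / 1_000   # 1-in-1,000 base rate, as in the example above
sensitivity = 1.00       # assume the test catches 100% of true cases
specificity = 0.95       # assumed: "pretty stinking specific but not perfect"

sick = population * prevalence                  # 100 people with the disease
healthy = population - sick                     # 99,900 people without it

true_positives = sick * sensitivity             # 100 correct alarms
false_positives = healthy * (1 - specificity)   # 4,995 false alarms

ppv = true_positives / (true_positives + false_positives)
print(f"true positives:  {true_positives:.0f}")
print(f"false positives: {false_positives:.0f}")
print(f"chance a positive is real: {ppv:.1%}")  # about 2%
```

Even with perfect sensitivity, roughly 98 of every 100 positives in this scenario are false, which is exactly the trap described above.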

Josh:

Look, the thing is we don't do this for money. This is pro bono, and quite honestly, the mothership kind of ekes it out every month or so, right? So we do this because we care about this, we think it's important. We think that integrating evidence-based medicine and integrative medicine is essential, and there just aren't other resources out there. The moment we find something that does it better, we'll probably drop it; we're busy folks. But right now this is what's out there, unfortunately, and so we're going to keep on fighting that good fight.

Josh:

And if you believe in that, if you believe in intellectual honesty and the profession and integrative medicine and being an integrative provider, and bringing that into the integrative space, please help us. You can help us by becoming a member on Dr Journal Club. If you're in need of continuing education credits, take our NANCEAC-approved courses. We have ethics courses, pharmacy courses, general courses, interactions. Follow us on social media, listen to the podcast, rate our podcast, tell your friends. There are ways that you can help support the cause. All right, what's your next favorite on the list?

Adam:

My next one is what they have written out as the conjunction rule. I like to refer to it as Occam's razor. I think Occam's razor sounds cooler, and I like to just drop it mid-conversation with people: oh, it's probably Occam's razor.

Josh:

That's a philosophy reference, isn't it?

Adam:

Yeah, I guess that's actually a pretty well-known one, but I don't think a lot of people actually know what it means. I didn't know what it meant at first either. So I remember looking it up and just being really fascinated by it, and I worked with one clinician who would always quote Occam's razor, and he was a clinician that I really looked up to. So I think there's some emotional attachment to this one. And I also like this one because, I don't know if this happened when you were in med school, but people would always ask, okay, well, what's on your differential? And people would get upset if your differential wasn't this broad, expansive list. I was always like, oh, it's probably this, this, or that, and then they would ask, well, what else could it be? And I'd be like: nothing.

Adam:

It's probably one of these three things, most likely this one. I'm not playing this game. I hated playing the whole what's-on-your-differential game, because I would always only have one or two. And the reason is the idea that if you have multiple possible outcomes, the most likely outcome, the one that is most statistically likely to happen, is probably the diagnosis. With Occam's razor, or the conjunction rule, the definition is the incorrect belief that the probability of multiple events being true together is greater than that of a single event. So it's likely an upper respiratory tract infection, not the pulmonary embolism, so I'm not going to have a pulmonary embolism on my differential.
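
A tiny worked example makes the conjunction rule concrete: two things both being true can never be more probable than either one alone. The probabilities below are arbitrary illustrative values, not from the paper.

```python
# Minimal sketch of the conjunction rule: P(A and B) <= P(A), always.
# The probabilities here are arbitrary illustrative values.

p_a = 0.30           # P(A): say, the chance a given diagnosis fits
p_b_given_a = 0.50   # P(B | A): chance of an extra finding if A is true

p_a_and_b = p_a * p_b_given_a  # chain rule: joint probability
assert p_a_and_b <= p_a        # the conjunction can never be more likely

print(p_a, p_a_and_b)          # 0.3 vs 0.15
```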

Josh:

Unless the risk is super, super high, right?

Adam:

Right.

Josh:

Yeah, absolutely, and I think it's interesting. So a corollary to that, again from the SIBO stuff: when you've got a condition that can have lots of nonspecific symptomatology, I don't know what we would call this, the opposite of Occam's razor, I think there is a tendency to ascribe everything to SIBO. My patients will say, well, I've also got this headache, and my hair is also falling out, and I'm also depressed. And technically all of that can be related to gut issues, absolutely, and you can come up with a mechanism. But the alternative is, well, lots of people have headaches, or maybe you're reacting to something, and so that's always the challenge. It's almost this reverse: when it comes to nonspecific things, maybe not everything is related to the SIBO. Maybe it is, but maybe it's not. So that's like the corollary to Occam's razor.

Adam:

You know what that's called, right? Yeah, you know what it's called. It's called Goldenberg's plastic spoon.

Josh:

Goldenberg's plastic spoon. Okay, fair enough. Now you've heard it first on this podcast, folks. I don't know that that's going to take off.

Adam:

Obviously, the Occam's razor is a Goldenberg plastic spoon.

Josh:

Goldenberg's spoon is a Goldenberg's spoon.

Adam:

Plastic spoon specifically.

Josh:

Okay, that's an interesting one. I'm going to have to think that through. All right, what are some other ones? I think confirmation bias is something everyone's very familiar with from social media discussions, so I'm going to skip through that.

Adam:

Well, for those who don't know what it is, why don't you just give like a super quick, like layman's definition of it?

Josh:

Yeah, and I guess it is relevant in clinic. It's like, if you have a position, as new evidence comes in you're going to glom onto the evidence that supports your position. And I think that is definitely true for me clinically. If I'm working through a case with a patient and I've come up in my mind with what I think is going on, very often as new data comes in from the narrative or whatever, I'm like, oh okay, that can fit in there, or maybe I underweight that one. I feel like there is subconscious overweighting and underweighting once you've come up with a diagnosis or with what you think is going on. So I think that definitely plays a role, and it's probably related to confirmation bias. Or, like, there are supplements that I can't stand, and if a patient comes in and says one didn't work, I'll be like, yep, that's something that's worthless; and if it did work, I'll quickly ascribe it to something else or discount that it actually was the thing that helped. So I definitely fight these biases every day in clinic. But that's confirmation bias: you have a position, and then you basically glom onto, overweight, underweight, or just ignore evidence that supports or refutes that position.

Josh:

One that I think comes up a lot in forensics work, in expert witness work and also with some doctors, is this overconfidence bias, where it's like I am the expert, I know all and I can't be fallible and so this is what I think is going on and if you disagree, you're just an idiot.

Josh:

And I think I see that with a lot of these star clinicians. I feel like a lot of patients like the certainty; the certainty feels good, and I can totally see that. I'm much more of a hemming and hawing type of clinician. I'll be like, well, it could be this, it could be that, the evidence is sort of balanced here, or I'm not 100% sure what's going on, but here's our strategy. Whereas I think some docs, especially in the integrative medicine world, and probably everywhere, will be like: this is what's going on, this is what's wrong with you, this is what you need to do, and you will get better. And having such confidence in a health partner I think helps a lot with the nonspecific effects of intervention. But yeah, I think that's definitely something we need to be cautious about as clinicians.

Adam:

Yeah, that's a good one. I think another one, and this actually might be the corollary you were looking for to Occam's razor, is representativeness. Excuse me, it's not representativeness, it's search satisficing: basically, you stop looking for additional information or alternative solutions or possible diagnoses once the first plausible one is found. So you're like, oh, it's probably this, and it's likely that, and we're done.

Josh:

Yeah right, yep, yep, yep. And so I see that with family members currently family member who has really really bad back pain and they did some imaging and they're like, oh yeah, there's definitely stenosis here and that's probably what the pain is and maybe it is. And I mean, I'm not a neurosurgeon but I feel like you find this radiographic evidence that supports the symptomatology and you're like that's what it is, period, and then the rest of the search kind of stops. And one thing we know from imaging is like, yeah, you can have zero symptoms and really bad imaging incidental omas or whatever we want to call them and the converse too. Right, and you can have normal imaging and still have symptoms from it.

Josh:

So I think, with imaging in particular, that can be a big issue, because it's not always a one-to-one correlation. But it fits the narrative well, and it feels objective, because people point to it and show you on a screen: that's what's causing your pain, this is the mechanism. It's very plausible, so it kind of makes sense that people would just stop searching. But I feel like that's definitely an example of that. Yeah, let's see what else. One I definitely want to talk about is diagnostic momentum. I've been guilty of this in the past.

Adam:

I was just going to say yeah, yeah.

Josh:

Okay, do you want to go through that one?

Adam:

You can go for it, okay.

Josh:

Well, maybe we have different experiences with it, but I find this a problem when I rely too heavily on other doctors' opinions. So you get a patient and they have this existing diagnosis, and particularly if it was made by a clinician you respect, or maybe just a specialist in the field, you just assume that it is God's honest truth and work from the position that that's what's going on. And that's the diagnostic momentum: someone gave the diagnosis, it just sticks around and builds and builds and builds, and then it's in your chart, and then it's in the next clinician's chart, and no one reexamines it. And, I'd like to get your experience here: when I was young in my career I definitely would defer a lot. I'd be like, oh wow, well, they saw a neurosurgeon and this was the diagnosis, that's probably what's going on, they're probably right, and then I would do my little thing. But as I've seen more and more patients over the years, I feel like I've gotten a little bit more skeptical, and I think everyone should, especially if things just don't fit right and the narrative doesn't quite fit the patient presentation and the labs.

Josh:

It is always good to go back and retake stock and just retake that case, and I'll do that when I get stuck. Again, curious your thoughts on this. When I get stuck with a patient and we're just not making progress, I'll do this for myself. I'll say, okay, we've been working at this a long time, we are not making the type of progress I was expecting, I want to retake the case. And I'll often just go back through all my old notes, try to look at it afresh, maybe even bring in a colleague and write up a case report. But just retaking that case: what I thought was going on, maybe it's wrong; looking at alternative possibilities. I think that's really important, and you kind of need stop points, lines in the sand, to say: no, this has been going on a long time, what are we missing here? Let's not just do another year of this without progress.

Adam:

Yeah, I haven't encountered that too much. But in the sense of deferring to other clinicians, I have definitely sometimes not really trusted my gut and looked instead toward a specialist or a more senior physician, and then maybe put too much stock in what they had to say, even if what they had to say was unlikely or maybe incorrect. That has happened. But yeah, I think it's pretty common to see that, where you send to a specialist for something that you're not totally sure of or don't feel really comfortable managing, and then you just buy their workup as the truth, even though, if they had gone to the same kind of specialist at a different institution, so instead of, I don't know, OHSU they went to the University of Washington, and saw two different cardiologists, they may have gotten two different answers.

Josh:

Mm-hmm. Yeah, no, totally, totally. Yeah, I think that's true. Over the years I have learned to trust my gut a lot more, like a lot more. And I remember seasoned clinicians telling me this when I was a young doc: this idea that when things just don't feel right, or you have this initial impression about things, you still do your due diligence and work it up, but really trust that. That has served me well. And with those diagnostic momentums, when my gut is saying something is wrong here, I've just learned to really trust that. I mean, like: no, we need to retake the case, or I need to reassess this, or you need to go see another specialist.

Josh:

Because that gut feeling, and maybe that's the Type 1 thinking, or maybe that's the mold, is that you just have enough cases that you can't quantify it. It's a bit of a black box, like AI: you don't know how you got that gut feeling, you don't know that you could recreate the rationale, but more often than not it serves you well. And so there's this mush in your head, and it comes up with a signal, and you learn to respect it more and more, knowing that biases exist but also kind of respecting it. So I've got one more that I like. How about you? Any other one you want to talk about?

Adam:

I don't think so.

Josh:

I wanted to do one more. The last one I want to do is what they called commission bias. Yes, okay, yeah, as opposed to, like, omission bias.

Adam:

I like this one a lot.

Josh:

Yeah, do you want to talk through that one?

Adam:

Yeah, basically, and I see this all the time: sometimes doing nothing is actually the answer. I think a lot of times as clinicians we feel like we need to do something, right? And you see it all the time with upper respiratory tract infections or sinus infections, where we know that the majority of the time they're actually viral and so antibiotics won't do anything, but we feel like we need to do something. Or someone says, oh, they made me feel better; it's like, probably a placebo effect. And so we want to do something, so we give them something, even though it may actually not be the right thing, or even possibly a harmful thing to do, as opposed to just giving reassurance, or saying, I'm not going to give you anything, let's work this up and figure out what's going on, which is sometimes the correct answer.

Josh:

Yeah, and this is a delicate dance too, right, because there are dynamics with patients. If a patient is worried enough to come in and see you and take time from their day, then the question becomes: what happens when you quote unquote don't do anything? Now, we know that we've done a full assessment and a physical examination, we've done an assessment in our chart and thought this through and made brilliant chart notes about it. But to the patient it feels like we've done nothing, done nothing to help their suffering. So you get this sort of weird dynamic, I don't know, maybe you don't, I certainly do, with patients where there's this expectation for something, and I can feel that pressure. Whereas, yeah, sometimes the best thing is just... I just had this the other day, last clinic day.

Josh:

I had this wonderful patient who's really invested in her health, doing all this work, has seen all these specialists, very concerned about things. She laid everything out, and, you know, I'm not cheap to see, so she'd put out a lot of money. And I'm listening to this whole case and looking at these other reports, and I don't think there's anything there. I think going after these lab results is not going to serve her; it's more likely just chasing lab values that some of her other providers are basically telling her to chase. She has these flashy little printouts from these labs. And I'm just like: you're feeling great, you're healthy, you're essentially asymptomatic, there are better explanations for the stuff that you are feeling. I don't think you need a thousand dollars a month of supplements, potentially blowing up your microbiome in the process, trying to chase these lab values.

Josh:

And it was a tough conversation, because she had invested a lot of money into these products, and she had other providers telling her the complete opposite thing. In my mind, the best thing to do there was to not do anything; she was in a good place, and I didn't want to risk upsetting the apple cart. And it took a lot of time and a thoughtful review of labs to come to that conclusion, but I quote unquote wasn't doing anything. And so I'm curious, if she were asked to do a report on the quality of the visit, what she would say, how satisfied she was. Maybe she would be very satisfied, I don't know.

Adam:

I think a lot of this stems from the microwave society we live in, right, where everything is instant. You order something from Amazon and it comes within an hour; your dinner, you throw in the microwave for two minutes and it's done. So we're just so used to having a fix. And so you're going to someone, you're paying a lot of money, and your expectation is that this person, who has studied medicine for all these years, is going to fix you. And we have that pressure of, oh, I need to fix them, I need to make them feel better, when sometimes the answer is actually, like you said: either you're fine, or there's nothing to be done.

Josh:

Or I can't figure it out.

Adam:

Yeah.

Josh:

Right, or maybe, you know, that's the time to refer. The way I phrase those conversations is:

Josh:

I see what your clinician is saying. There is this viewpoint among some clinicians that there's value in this, and this is why. I try to steelman the other side's argument, and then I say: I don't subscribe to that, and this is why. And, you know, I want you to feel better, I want you to do well, and maybe I'm the right clinician for you, maybe I'm not.

Josh:

But I try to explain it a little bit, because it could just be a philosophical thing. A lot of these clinicians are just chasing lab values, and if you don't do that, that means you're not trying to get to the root cause, or you're just not a smart enough clinician, or something like that. And so I think all of this plays into it, which is sort of interesting. So maybe there should be a new bias, the flashy lab bias. Let's do that. You get this flashy lab result in color, it says things like microbiome, DNA, biochemistry, and there's stuff in red that pops up, and there's this bias that it needs to be treated. The flashy lab bias. That's another one that needs to be added, at least in my world, for sure.

Adam:

Well, I've actually heard from several well-seasoned clinicians, either at the end of their career or retired, who say: hey, in addition to treating the labs, you're treating a patient. So maybe they're the ones not chasing numbers. Yeah, it's a little hard, because in the system we have now, at least in America, so much of it is metric driven: how many of your patients have an A1C less than seven, how many have their hypertension well controlled, how many have their cholesterol at a certain level, et cetera, et cetera. And because you're pressured to do that in a small amount of time, for busy clinicians, you can get caught up in it, and these biases push you toward that, so you're treating the lab values as opposed to, quote unquote, treating the patient in front of you.

Josh:

Yeah, and that's one of the things I loved about my naturopathic training: that was drilled into our heads. Treat the patient, not the labs.

Josh:

And I think that as labs have gotten more sophisticated, some of these fancy molecular biology labs, where we don't fully understand the microbiome enough to fully understand the clinical implications, it's just become like that. And that's what I'll tell them: look, we treat the patient, not the lab. But then you've got this counter, and I know I'm going off on a tangent, but this counter of: well, even if I feel good right now, this is red and it shouldn't be red, and does that mean I'm going to have this amorphous inflammation that's going to cause cancer in 10 years? And how do you negate that argument and argue for fewer interventions? It's just kind of a complicated clinical dance that I sometimes find myself in. Yeah, I don't know, any thoughts on that one as we wrap up the last few minutes here?

Adam:

Well, I think that one thing that's pretty interesting, and sort of hard for a lot of people to wrap their head around on the topic of screening, is: don't screen for something if we don't have a treatment for it. Even if screening for something may possibly seem good, if we don't have any treatment for it, or anything to do with the lab result, or any sort of intervention besides just screening for it, what are we doing?

Adam:

Yeah, and just to wrap that up: the obvious question is, well, why don't we screen for this kind of cancer? And it's like, well, we don't have any treatment for it.

Josh:

Yeah.

Adam:

But what if you want to know? It's like, well, then we're going to run an expensive test for something that we can't do anything about.

Josh:

Well, yeah, but you could make an argument for a patient, if it involves prognosis or quality-of-life decisions. So, to your greater point, which I know you're making: if it's not going to impact clinical care and it's not of import to the patient, do we need to run this? And I run into that. There's a test on the market that's super brilliant, that's very clever, that I think is going to get the developer the Nobel Prize in Medicine or whatever. It's just very, very clever. But basically, if you present with all these symptoms, you're, I don't know, 70% likely to have IBS, and if you take this test and it's positive, you're, I don't know, 90% likely to have IBS. Either way, we can work under the assumption that you have IBS and basically go from there.

Josh:

And very often it doesn't change what the treatment looks like, and so that's a conversation that we have. Some patients, like the engineering types, are just like: no, I want to know, I want to know the mechanism, I want to understand it, I want to see it objectively in writing. And other people are like, yeah, I don't know, is it going to change what you're going to do? And I'm like, no, I'm going to do the same thing regardless, because I think that's going to manage the symptoms and work on what I think is going on. And they're like, yeah, no, we can save the money. So sometimes it's value-specific for the patient.

Adam:

Right.

Josh:

Cool, cool, all right. Well, this was a little bit of a departure, a clinical turn, but I thought it was kind of interesting. And the other interesting thing, which we don't have time to get into, is that there's just not a lot of research on this stuff. So we're talking about being rigorous and research-based and thinking about cognitive biases, but there isn't actually a lot of research on modifying cognitive biases and how that impacts clinical care. It's a big research gap. It doesn't mean it doesn't help; it just means we don't know much about it. So we'll end with that as well. All right, Adam, thank you for your time. Yeah, go ahead.

Adam:

I just want to say I did like the conclusion of this paper. They had, I think, basically six bullet points on how to kind of deal with this, which I thought was really interesting. Okay.

Josh:

Walk us through. Yeah, I thought you were going to quote a philosopher.

Adam:

Yeah, go ahead.

Josh:

Yeah, go through the points.

Adam:

So the first one was to just slow down, which I think is kind of funny, because if you're a really busy clinician, that's probably the hardest thing to do.

Josh:

It's like telling an anxious person to calm down. That does not help; it makes things worse, yeah.

Adam:

Or a depressed person to just be happy. Yeah, that's right, just be happy, right.

Adam:

Yeah, but still, I do think there's some validity behind it: if you slow down your thought process, you go into a different realm of thinking, if you will. You won't feel rushed to make a decision or take an action; you can take things as they come and make a perhaps more educated approach to what you're doing. Second: be aware of base rates for your differential. I thought it was really interesting that they actually picked out a specific cognitive bias to think about. So, given the population in front of you, what are the rates of what you're thinking is going on, within the context of your clinical setting? Then: consider what data is truly relevant, and actively seek alternative diagnoses.

Josh:

That one I liked a lot.

Adam:

Yeah. Then: ask questions to disprove your hypothesis. And I thought this was interesting too, because I had one attending whose approach to the differential was: don't look for what it likely is, look for what it is likely not. If you shift it that way, you can get to a more likely diagnosis. Interesting, yeah. So if someone's coming in for shortness of breath, think as much as you can and get the relevant information as to why it's not a pulmonary embolism, and if it satisfies that criteria, move on to the next diagnosis, and sort of work backwards as opposed to forwards. So, ask questions to disprove your hypothesis, similar to what I just talked about. And then, I really like this one: remember, you are often wrong. Period.

Josh:

Mm-hmm.

Adam:

Yes, consider the immediate implications of this.

Josh:

Yeah, yeah, yeah, yeah, that's true, that's really true. I think that's good.

Adam:

So they have this whole paper, and I just think it's brilliantly ended: remember, you are often wrong.

Introducer:

Mm-hmm.

Adam:

All those years of schooling and reading and residency and training and conferences and nose in the books. You are often wrong.

Josh:

I love it. Yeah, I really like these. Some of these I do, some of them I don't. I love the idea of actively trying to disprove your hypothesis; that's so cool, and I don't know that I do that. And I love the "remember you are often wrong and consider the immediate implications" one. I'll sometimes set timelines: this is what I think is going on, but if you're not feeling better by this date, we need to do some imaging, because we could be missing something big, type of thing. So yeah, remember that being wrong has costs: you're losing time, and the cost to the patient, and all these things. There are a lot of implications to that.

Josh:

But loved it, awesome. I'm glad you pulled that out; that's a really good way to summarize this. We will post this wonderful article in the show notes. Thank you, dear listener, for listening, and we'll see you again next week. Take care, bye-bye.

Josh:

If you enjoy this podcast, chances are that one of your colleagues and friends probably would as well. Please do us a favor and let them know about the podcast, and if you have a little bit of extra time, even just a few seconds, if you could rate us and review us on Apple Podcasts or any other distributor, it would be greatly appreciated. It would mean a lot to us and help get the word out to other people who would really enjoy our content. Thank you. Hey y'all, this is Josh. You know, we talked about some really interesting stuff today. One thing that's relevant: there is a course we have on Dr Journal Club called the EBM Boot Camp. It's really meant for clinicians, to help them understand how to critically evaluate the literature, et cetera, et cetera, some of the things that we've been talking about today.

Josh:

Go ahead and check out the show notes link; we're going to link to it directly. I think it might be of interest. Don't forget to follow us and interact with us on social media, at Dr Journal Club, DR Journal Club, on Twitter. We're on Facebook, we're on LinkedIn, et cetera, et cetera. So please reach out to us; we always love to talk to our fans and listeners. If you have any specific questions you'd like to ask us about research, evidence, being a clinician, et cetera, don't hesitate to ask. And then, of course, if you have any topics you'd like us to cover on the pod, let us know as well.

Introducer:

Thank you for listening to the Dr. Journal Club podcast.

Chapter Markers

Exploring Cognitive Bias in Medicine
Medical Decision Making Heuristics and Biases
Analyzing Diagnostic Biases in Medicine
Trusting Gut Instincts, Challenging Diagnostic Momentum
The Challenge of Doing Nothing
Clinical Decision-Making and Cognitive Biases
Promoting EBM Boot Camp and Engaging Fans