See, Hear, Feel

Ep8: Everyday illusions with Dr. Christopher Chabris, PhD

May 04, 2022 Christine Ko/Christopher Chabris Season 1 Episode 8

How well do you think your mind works? We all have fluency of thoughts, but how much should you trust a confident person? Whether you are confident in your own mind or not, everyday illusions can be examined with the lens of metacognition. Tune in to hear from Christopher Chabris, PhD, co-author of The Invisible Gorilla (link: http://www.theinvisiblegorilla.com/). Dr. Chabris is also a chess master and previously wrote a monthly column in The Wall Street Journal on games. For more information, visit his website at http://www.chabris.com/

[00:00:00] Christine J Ko: Very excited today to speak again with Dr. Christopher Chabris, who is a Professor and Co-director of the Behavior and Decision Sciences Program at Geisinger Health System. We already had a Part 1, where he delved a little bit into everyday illusions and his work on researching perception and how those illusions affect us. Currently, he is faculty Co-director of the Geisinger Behavioral Insights Team, abbreviated BIT. He was formerly an Associate Professor of Psychology at Union College in New York. He has an AB in Computer Science and a PhD in Psychology, both from Harvard University. And he worked at Harvard subsequent to that for about 5 years.

[00:00:35] He is one of the originators of one of the most famous psychology experiments, and hopefully you checked the show notes for Part 1 for that, but I'll put it in again for this one. And he's co-author of a really great book called The Invisible Gorilla, which really, I can't recommend highly enough for anyone who is in a profession where attention and confidence and knowledge are important. It's really a must read. He is, amazingly, a chess master among all of that. He is very prolific as a writer; he used to write for the Wall Street Journal, a monthly column on games. He has very wide ranging interests, and for more information on him, you can visit his website which is in the show notes.

[00:01:13] I'll jump right in and just ask you: what everyday illusion do you think people are most susceptible to? 

[00:01:19] Christopher Chabris: So that's a good question, but I don't think there's really an answer to it, because when we talk about illusions of attention, memory, confidence, knowledge, and potential, which are five of the main ones we talk about, it's hard to quantify how much of one there is versus another. I think the main point is that we are all susceptible to all of these mistaken beliefs about how well our own minds work. An everyday illusion is a mistaken belief about how your own mind works. They can surface in really all kinds of activities. We have examples from law and order, not the TV show, the world of the criminal justice system; sports; finance; healthcare; games; politics; pretty much anything there is, because these are sort of fundamental misunderstandings that we have about how our own minds work. So I wouldn't say there's one that people are most susceptible to, but I think one that people often don't recognize as readily as others is the illusion of confidence. That's in part because the idea that we pay attention to and believe people who are more confident is not necessarily irrational.

[00:02:21] Usually there is some relationship between how well you understand something, how much you know about it, how accurate your memory is, and so on, and how confident you are about those. It's not that there's zero relationship between them. So for example, in eyewitness testimony, more confident eyewitnesses are, on average, more accurate in their identifications of suspects than less confident eyewitnesses. The problem is, they're not perfectly accurate, right? So they're still overconfident. The correlation is not as good as we'd like it to be. And yet juries will sometimes convict defendants based on just one compelling eyewitness. There's a reasonable chance that eyewitness could be completely wrong for a whole variety of reasons that have to do with how visual perception works, and memory for faces, and stress and trauma, and all kinds of stuff like that. But they all combine to make our memories not as good as we think they are and to make our confidence too high; that can play out in a lot of different ways. I think it's one of the illusions that has the biggest effect across a wide variety of our everyday experiences.

[00:03:17] Christine J Ko: Yeah. I really like this quote from your book, which says, "We take the fluency with which we process information as a signal that we are processing a lot of information, that we are processing it deeply, and that we are processing it with great accuracy and skill. But effortless processing is not necessarily illusion free." That really speaks to someone like me, who has been working in the fields of dermatology and dermatopathology for a long time now, which takes a lot of visual attention, recognition, perception. Sometimes I feel something's easy, but that doesn't necessarily mean ease or fluency translates to being right.

[00:03:53] Christopher Chabris: Yeah, I think we wrote that in the book, in part, because we were looking to figure out why people make these mistakes, right? What are they paying attention to in their own thought processes that could lead them to think that they're doing better than they actually are? And I think part of the reason is fluency, as you mentioned. Fluency, in the cognitive psychology world, is an internal experience of how quickly and readily and plentifully your thoughts happen. It's analogous to your fluency with a language, right? When you're fluent in a language, you can speak it more quickly, you make fewer errors, you understand it better, and so on. It's an analogy to that, but with our internal thought processes. Some thought processes we have operate very quickly. Danny Kahneman, famously, in his book Thinking, Fast and Slow, referred to these as System 1 and System 2 processes, which is a way of summarizing the fact that some mental processes happen so quickly, so outside of our awareness, and so automatically, that the only sense we get is that they are happening instantaneously. And that is a signal that we often take to be indicative that they're operating well; a lot of visual perception and memory retrieval has that quality, right?

[00:04:57] When we're looking out at the world, we don't experience any difficulty, really, in processing things. Every so often we do see one of those visual illusions, right? Where it flips back and forth between two things. Visual illusions are nice because they do reveal some of the boundaries of our perceptual abilities. Even when we're remembering things, sometimes we have trouble remembering things, but often when we don't have trouble remembering things and we do retrieve a memory, it just feels automatic and effortless and we mistakenly then assume that it must be correct. Whereas, in the background, unbeknownst to us, our memories are constantly being distorted and updated and rewritten, especially ironically, when we retrieve them. Every time we tell a story, we change the memory a little bit.

[00:05:35] It's almost like playing a game of telephone with ourselves. In that game, a message gets distorted as it's passed from person to person; the more often you tell a story, the more you're passing it back and forth within your own mind, and it gets distorted. Whatever part of it you emphasize this time might become a more solid part of it the next time you retrieve it, and parts you don't mention this time sort of drop off. And maybe something you accidentally add this time gets put back into the memory, so that the next time you tell the story, it's got something it didn't have before. But we don't really notice any of that stuff. That all happens in the background, automatically. And we have a fluent experience of memory retrieval or visual perception.

[00:06:05] Therefore we think we're noticing everything and remembering things accurately. I think that's not the only reason we have these problems, but it's an important one that our fluency doesn't always give us an honest signal. 

[00:06:15] Christine J Ko: I'm glad you mentioned Daniel Kahneman's book Thinking, Fast and Slow because that's another one of my favorites. It's actually the book that first got me thinking about metacognition. What I found interesting about it is that he points out that expert thinking, the thinking that doctors try to achieve through training, through med school, residency, fellowships if they do one, and then practice... is to get that fluency, that expert thinking or System 1 thinking, which I think was previously called Type 1, or still is, maybe by others. And that is hardly thinking, as you're saying: sometimes we just easily retrieve something. It just pops up. We don't really have to think hard about it, versus System 2 thinking, which is more analytical, when we start to be aware of something and really parse it out in a more logical, analytic way.

[00:07:01] Would you say that these everyday illusions are more part of System 1 or Type 1 thinking? 

[00:07:07] Christopher Chabris: I think they are related to it. Reflecting on your own thought processes and asking, What am I missing? How could I be going wrong? And so on. That does seem to be inherently a System 2 process. Some of the many words used to describe System 2 thinking are reflective, and analytical, and sequential, and slow; thinking about your own thought processes and noticing when they might be making errors seems inherently a slow System 2 process. Also, System 2 processes are often hard to initiate. They seem to take some kind of special mental effort. They are effortful. They feel effortful, unlike the fluent System 1 processing. So I would say that checking your own intuitions and instincts and so on is certainly more of a System 2 kind of thing.

[00:07:47] The problem with everyday illusions is that the marvelous output we get from System 1, which is able to do amazing things, like recognize the faces of people we haven't seen in decades, remember things we haven't thought about in years, recognize thousands and thousands of different patterns on your slides or your x-rays or your chess boards or whatever, comes with no insight in real time into what's going on there. That is where we become susceptible to the illusion that whatever we've noticed is everything there is, that whatever we've remembered is accurate, that whatever our first instinct was must be the correct answer. 

[00:08:19] Christine J Ko: In your book, you end with what you call the myth of intuition, which I think is what you just touched on: that with experience, especially in chess, or in healthcare when making a diagnosis, part of expert thinking is this intuitive sense of, I've seen this before. But then we can be wrong, inadvertently missing other things that would steer us differently. 

[00:08:41] Christopher Chabris: Yeah. The reason why we called it the myth of intuition was that it's definitely true that experts can recognize patterns, and recognize them instantly, in a way that amateurs, novices, less experienced people can't. One of the main engines of expertise, of skill, that enables one person to be much better at something than someone else, is not natural talent. It's practice, and training, and the acquisition of expertise. And that's been documented in cognitive psychology in all kinds of fields, from medicine to chess, to other games, to sports, to pretty much anything.

[00:09:11] And it's quite well understood how that happens: you gradually build up a vocabulary of different things. It might not be a vocabulary of words, but a vocabulary of visual patterns or associations between things and so on. And that's what enables a doctor to say, that kid looks sick, even if they can't immediately tell what's wrong with them, or to feel, before they know the diagnosis, that they're going to figure it out. Of course, it's often good to confirm your intuition later on. Where the myth of intuition comes in is twofold. First, the intuitions happen before the confirmations happen. So if you stop before the confirmation, you're left only with the intuition, but you might not realize that you never really confirmed it. And this has happened in many cases. One thing I've been thinking about recently, and we write about it in the book also, is the art world, where often you'll have these paintings that are said to be done by a famous artist. And the way they're authenticated is just by showing them to experts, and the experts look at them. Often the experts just instantaneously say, yes, that looks exactly like a Jackson Pollock. Maybe they'll look a little bit more, but there are other ways to confirm that. Still, people often take the expert intuition as the final judgment. In medicine, often we don't do this; we're trained... again we, I'm not a doctor... people are trained to look at laboratory tests to confirm things, to get a radiologist's opinion, to send it to the pathologist, and so on. But still, the initial intuition, I think, can have a big impact on the further direction. Even experts can be wrong in their initial intuition. Coming from the domain of chess, where I have a lot of experience, often you will see the best players in the world, literally the best players in the world, top 10 players, say, Oh, I didn't even think of that move. 
And on the chess board, there's nothing easier for chess players to think about than where the pieces can move. That's like the fundamental thing. There are 30 possible moves. And yet sometimes you just don't even think of one, which turns out to have been the best one, or the best one for your opponent, or something like that.

[00:10:53] So that initial intuition, that initial pattern recognition is powerful. But it doesn't go all the way. And in fact, when it's wrong, we can pay the price. 

[00:11:00] Christine J Ko: When an expert chess master says they didn't even think of that move, they're just questioned afterward, and they say that they didn't think of it? 

[00:11:08] Christopher Chabris: Yeah. People play chess at all different speeds. So believe it or not, there's something called 15 second chess where you have 15 seconds for the whole game. 

[00:11:14] Christine J Ko: Wow. 

[00:11:14] Christopher Chabris: And then there's something called correspondence chess, where a game could take 2 or 3 years. And there's every possible speed in between. Even at elite professional tournament speeds, which is, let's say, 5 hours for a game, people spend like 20 minutes thinking about a single move. They'll come back afterwards and say, I made a mistake here because I never even considered that option. And I think that happens in a lot of professional decision making; chess just gives me a window into it. I'm sure there are many cases where the ultimate diagnosis was something that the first person who saw the patient never thought about. Like the New York Times column on diagnosis, where the solution is not anything that the first doctor or the second doctor or the third doctor even thought of.

[00:11:47] Medicine is hard, but if we pay too much attention to confidence, if we believe too much in intuition and so on, you might go with whatever the first person said. And maybe 90% of the time, that's right. The problem is the other 10% of the time, because it's an important 10%. The hard cases are hard for a reason, but they're just as important to get right as the easy cases, in chess as in any other discipline. 

[00:12:06] Christine J Ko: Yeah. Is there a way around it? 

[00:12:07] Christopher Chabris: I wish that there were a way around it. I think we all have to respect expertise. People without MDs don't suddenly become doctors when they go on Twitter and talk about vaccines and respiratory illnesses and so on. We have to respect expertise, but at the same time, we have to recognize the limitations of expertise. It's not just being able to instantly classify things correctly; it's also understanding what other kinds of information you need to come to a definite conclusion. There's a famous quote from Keynes, where he said something like, When the facts change, I change my mind. What do you do, sir? I think experts should be expert in that, right? If the pathology report says something they didn't expect, they don't just throw it in the garbage and carry on. They integrate that into their understanding of the decisions they're trying to make. And that's a struggle, because once we form a belief, it's hard to just let it go. So that's a struggle that requires training, System 2 reflection and analysis, and so on. I wish I had a magic formula for that, but the first step is knowing about it. The second step is designing the systems around us to force us to do that. I don't think I have any magic bullets for healthcare, but often just prompting people to reflect on things and consider alternatives can be helpful.

[00:13:12] Christine J Ko: I love that. I love how you summarize that. Respect an expert, but also recognize the limitations of expertise. That's great. Do you have any final thoughts? 

[00:13:22] Christopher Chabris: One final thought I do have relates to what you mentioned earlier. We have at Geisinger a Behavioral Insights Team that Michelle Meyer and I started about 4 years ago now, as of when we're recording this. We really enjoy collaborating with clinicians and other people inside healthcare, because it's such a rich environment for trying to help people make better decisions. I think there are so many opportunities, not because everyone's making terrible decisions in healthcare, but just because the situation is so complex and there are so many behavioral components to it, and I think we often neglect the behavioral elements. We focus a lot on, Does the drug work, does it not work? How big an effect does it have? What combination of treatments is best, and so on. Even with the fastest-developed life-saving vaccine in human history, we found that there were huge behavioral gaps in getting the maximum healthcare value out of it. And that, I think, has shown us, in more extreme form, what is true in the rest of healthcare also. We know that there are all kinds of other things that are behavioral that make a difference. So we're really focused on trying to apply this kind of science and knowledge and ideas to helping improve outcomes for the system, and the patients, and the clinicians, and everyone involved. I'm excited to be working in this field. 

[00:14:30] Christine J Ko: I love it. I'm such a fan of your work, and one of my passions is getting better healthcare for patients. So I think the work that you're doing now is just so valuable. Thank you again for spending your time with me, Chris. I really appreciate it. 

[00:14:44] Christopher Chabris: Oh, it's been a pleasure. Thank you.