Thinking 2 Think

Why Smart People Make Dumb Decisions: The Knowing-Thinking Gap in Leadership

Michael Antonio Aponte Episode 58

Why do highly educated people make catastrophically bad decisions? In this episode, Mike Aponte reveals the "knowing-thinking gap"—the critical difference between memorizing information and actually thinking critically.

Drawing on his experience as a Merrill Lynch wealth manager during the 2008 financial crisis, Mike shares the story of a brilliant cardiothoracic surgeon who "knew" finance but couldn't think strategically about his portfolio. He also explores why high school students can memorize facts about clean energy but fail to think through second-order consequences.

You'll discover:

  • Why expertise in one domain doesn't transfer to critical thinking skills
  • The cognitive bias that makes smart people overconfident in unfamiliar areas
  • How to recognize when you're "knowing" vs. actually "thinking"
  • A framework for developing true critical thinking beyond knowledge accumulation
  • Real examples from Wall Street, education, and law enforcement


Perfect for educators, leaders, executives, and anyone who wants to move beyond surface-level knowledge to deep strategic thinking. Learn the decision-making frameworks used by top performers across industries.

Support the show

Join My Substack for more content: maaponte.substack.com

🎧 Don't forget to like, share, and subscribe to join our growing community of thoughtful individuals!

📖 Check out my book: The Logical Mind: Learn Critical Thinking to Make Better Decisions

📲 Let's connect on social media:

  • https://x.com/Thinking_2Think
SPEAKER_00:

I had a client at Merrill Lynch, a cardiothoracic surgeon. Brilliant guy. And I mean brilliant. His name was on research papers about cardiology, highly cited. When he talked about the human heart, he could break down mechanisms that would make your head spin. And he thought he was smarter than me about finance. To be fair, he was pretty sophisticated. He understood P/E ratios. He could talk about option strategies. He knew the terminology. But knowing terminology isn't the same as thinking.

This was early 2008. The market was starting to act strange. Housing was showing cracks. Credit was tightening. I sat down with him and said something along the lines of, we need to think more internationally, possibly emerging markets, precious metals for hedging. He scoffed. And I remember this, he actually scoffed. "Mike, I've been investing for 20 years. I know what I'm doing. The U.S. market always recovers. This is a buying opportunity." And in some ways he was right; he had seen it before. In 2000-2001, the market dipped and came back. Pattern recognition. A mental model. But this wasn't 2001. The context was different. I tried to explain that the fundamentals were different this time, that we were looking at systemic risks, not just a market correction. He didn't want to hear it. He knew better. Eventually I convinced him.

Then September 2008 happened. Lehman Brothers collapsed, and the entire financial system seized up. The market dropped 40%. His friends, other doctors, lawyers, successful professionals who knew about finance, were down 40, 50%. Some even more, because they were leveraged. My client was down about 20%. He called me, and he was ecstatic. "Mike, you're a genius. I beat all my friends. They're getting killed, and I'm only down 20%." I didn't say what I was thinking, which was: you'd be down 10% if you'd listened to me six months ago. But I didn't need to say it, because his ego was intact. He could go to the country club and tell everyone how well his portfolio was doing relative to theirs. I'm sure he took all the credit. I'm sure he told them about his diversification strategy. And honestly, I didn't care, because I learned something more valuable than any commission or fee Merrill Lynch collected.

Knowing a lot doesn't mean you can think. This man could perform open heart surgery. He could diagnose complex cardiac conditions. He could save lives, and he probably has saved plenty of lives. But he couldn't think about financial decisions when his mental model, "the market always recovers," was leading him astray. He knew finance, but he couldn't think financially. And that gap, the knowing-thinking gap, is destroying more careers, relationships, and decisions than ignorance ever could.

This is Thinking 2 Think, the podcast about making better decisions in a world designed to make you think worse. I'm Mike Aponte, also known as M.A. Aponte: former NYPD officer, former Merrill Lynch wealth manager, trained actor, and current executive director at a charter school in Florida. Last week we talked about how your brain actually works. This week, we're talking about why smart people, people who know a lot, still make terrible decisions.

Here's the uncomfortable truth. Education doesn't teach thinking. It teaches memorization. You go through school, you learn facts, dates, formulas, definitions. You take tests, you recall information, you get good grades, and somewhere along the way you start to believe that knowing things is thinking. But it's not.
Knowing is possession. Thinking is process. Knowing is "the market always recovers." Thinking is, under what conditions does the market recover? What's different this time? What am I missing? Knowing is "the student is disruptive because they want attention." Thinking is, what's driving this behavior? What need isn't being met? What am I not seeing? Knowing is "we've always done it this way." Thinking is, why did we start doing it this way? Is the original problem still relevant? What's changed?

The knowing-thinking gap is the space between what you've memorized and what you can actually do with that information in a novel situation. And most people, most smart people, live in that gap without even knowing it exists. Let me break down what the knowing-thinking gap actually is and why it's so dangerous. There's a model I use when I'm diagnosing whether someone actually understands something or just thinks they understand it.

Level one is recall. You can repeat information back. You've memorized it, and you can pass a multiple-choice test. Example: the mitochondria is the powerhouse of the cell. You know that. Every high school bio student knows that. But do you understand how cellular respiration actually works? Could you explain why a mitochondrial disorder causes muscle weakness? Could you apply that knowledge to a novel problem? Probably not, because you're at level one, recall. And to be honest, I'm not a scientist, so I'm probably at level one here too.

Level two is comprehension. You can explain it in your own words. You understand the concept well enough to teach it to someone else at a basic level. An example would be: the mitochondria converts glucose into ATP, which is the energy currency the cell uses. That's better. You're not just repeating, you're explaining. I had to actually look that up. But can you troubleshoot when something goes wrong? Can you adapt when the context changes? No, not yet.

Level three is application. You can use the knowledge in a familiar context. You've practiced it, and you can execute in situations that look like the ones you've seen before. For example, a doctor diagnoses a patient with a mitochondrial disorder because the symptoms match the textbook presentation. This is where most professionals operate. You know your field, and you can apply your knowledge in situations that resemble your training. But here's the problem: most real-world problems don't look like textbook problems.

Level four is transfer. You can adapt the knowledge to novel situations. You can see patterns across domains. You can think with the knowledge, not just about it. An example would be a researcher realizing that mitochondrial dysfunction might be connected to neurodegenerative diseases, even though that's not in the textbook. They transfer their understanding of cellular energy to a completely different problem. And I had to look that up too. This is thinking, and most people never get there.

Why does the gap exist? There are three reasons. One, our education system rewards recall, not thinking. I see it at my school constantly, and I see it in state testing, in my state and in others. I could count on one hand the number of high school students who can have a real dialogue about complex topics, and that's sad. Most of them have been trained to memorize, regurgitate, and move on.
And when you challenge what they believe to be true, when you ask them to actually think through the logic, they shatter. I want to give you a specific example of this from my own class. I use Socratic lectures to show students that the world isn't black and white. One day I brought up clean energy: solar, wind, the future of the planet. Every single student believed, passionately believed, that solar and wind are the solution. So I showed them a lithium mine, the environmental destruction, the fresh water usage, the oil required to manufacture and transport the equipment. Then I showed them wind turbines, the birds they kill, the maintenance costs, the fossil fuels required to build them. And I watched their faces: confusion, cognitive dissonance, everything they knew to be true being challenged. They left the class questioning everything. And the next day, they came back with the exact same conclusion, solar and wind are the future, as if the lesson never happened. That's the knowing-thinking gap. They knew "clean energy equals good," but they couldn't think through the second-order consequences. They couldn't hold complexity. They couldn't update their mental model.

Two, expertise creates blindness. The more you know about one thing, the more you assume you know about everything. When I was cold calling as a stockbroker, before JP Morgan and Merrill Lynch, we ran into this all the time. Doctors, lawyers, engineers, professors: incredibly knowledgeable in their fields and incredibly ignorant about everything else. But they didn't feel ignorant. They felt smart, because they were used to being the smartest person in the room in their domain. So when they stepped into finance, they assumed the same rules applied: I'm smart, I can figure this out. But finance isn't medicine. The logic is different, the feedback loops are different, the risks are different. And the more educated they were, the more resistant they were to being taught. Because knowing a lot about one thing makes you feel like you can think about everything. But you can't.

Three, mental models ossify. You build a model, it works. You use it again, it works again. And over time the model becomes invisible. It's not a model anymore; it's reality. My surgeon client had a model: the market always recovers. The model worked in 2001. It worked in 2003. It worked in 2005. So by 2008, it wasn't a belief anymore. It was a fact. And when I challenged it, I wasn't challenging his thinking. I was challenging his reality. That's why he scoffed. That's why he resisted. Not because he was stupid, because he wasn't, but because the model, his model, had ossified.

Now I want to give you a case study on the difference between knowing and thinking. Let me tell you about two wealth management firms I worked at: JP Morgan Chase, and Merrill Lynch, which at the time was affiliated with BlackRock. Both elite institutions, both dealing with wealthy clients, both employing smart people, but they think about wealth completely differently. At JP Morgan Chase, at Chase Plaza, there's an entire education and training program. I don't know if it's still around, but it was when I was there. They teach you how finance works: interest rates, banking in general, portfolio theory, risk management. You learn the mechanics of money, the technical knowledge, the formulas. By the end of that training, you know finance. At Merrill Lynch, the training is different. They teach you the psychology of money.
Not just how portfolios are constructed, but how clients think about their portfolios. Not just risk-adjusted returns, but how people feel about risk. They teach you something that's rarely taught in finance degrees: the art of money. How money is built, how it's maintained, how it's expanded, but more importantly, how people relate to it. Because here's the insight that separates good wealth managers from great ones. The technical knowledge is table stakes. Everyone knows it. What separates outcomes is whether you can think with that knowledge in the context of human psychology. My surgeon client knew finance. He could read a prospectus. He understood asset allocation. But he couldn't think financially in the context of his own ego, his own risk tolerance, his own cognitive biases. He knew the mechanics, but he couldn't think through the psychology. And that's the gap.

The gap shows up in education in much the same way. I had a teacher at a school, and I'm going to be very vague here: an expert in mathematics, really knew her stuff, could probably explain calculus six different ways. But she was aggressive and mean to students she couldn't connect with. She knew math, but she couldn't think pedagogically. She couldn't see that a student struggling with fractions wasn't stupid or lazy; they were missing a foundational concept from three years ago. She couldn't adapt her teaching to different learning styles. She couldn't troubleshoot when her explanation didn't land. She was operating at level three, application in a familiar context. But teaching isn't about applying the same method to every student. It's about transferring your knowledge to novel situations, each student being a novel situation. That's level four. That's thinking.

And I see this constantly in educational leaders too. Principals who know instructional theory but can't think through a discipline problem with a complex student. Teachers who know curriculum but can't think through how to engage a disengaged learner. Administrators who know policy but can't think through how to actually implement it in a way teachers will accept. The knowing-thinking gap isn't an intelligence problem; it's a transfer problem.

Now, back to clean energy. Remember my example with the students? What they hit was cognitive dissonance: the discomfort you feel when new information contradicts what you already believe. And here's what happens when people hit cognitive dissonance. There are really two options. Option one, update the model. This is thinking: "Hmm, I believed X, but this evidence suggests Y. Let me reconsider." Option two, reject the evidence. This is protecting what you know: "That evidence is biased, the source is unreliable, this doesn't apply to me." Most people choose option two. We see this in current events, in politics, in our society. And it's not because people are stupid. It's because updating your model is hard. It's cognitively expensive, and it threatens your identity. If you've spent 20 years believing the market always recovers, and I tell you that belief is wrong, I'm not just challenging your investment strategy. I'm challenging your competence, your judgment, your identity as someone who knows finance. That's why my surgeon client scoffed. That's why my students came back the next day with the same belief. Because updating the model would mean admitting, "I was wrong."
And knowing-centered people can't do that, because their identity is built on being right. But thinking-centered people want to be wrong, because being wrong means you get to learn something.

So how do you close the knowing-thinking gap? Here's what I've learned across my four careers.

Strategy one: test your knowledge against reality. Don't just ask, "Do I know this?" Ask, "Can I use this in a situation I've never seen before?" When I was training to be an actor, directors would give you a scene and you'd prepare. You'd memorize your lines, you'd plan your blocking, and then they'd change everything. Now do the scene angry. Now do it sad. Now do it while walking backwards. Now do it as if you just found out your character is lying. That's not testing your knowledge of the scene; that's testing your ability to think with the scene. The actors who could only perform the scene one way, the way they'd memorized it, were operating at level three, application in a familiar context. The actors who could adapt to any direction? Level four, transfer. Apply this to your own work. If you're a teacher, can you teach the same concept five different ways, or do you only have one explanation? If you're a leader, can you solve problems you've never seen before, or do you only know how to handle situations that look like your past experiences? If you're a professional, can you adapt your expertise to a new industry, a new context, a new challenge, or are you only valuable in one specific environment? That's the test.

Strategy two: seek disconfirming evidence. Most people look for evidence that confirms what they already believe. That's confirmation bias. Thinking-centered people do the opposite. They actively look for evidence that contradicts what they believe. My surgeon client should have asked himself, "What if I'm wrong? What would have to be true for Mike's recommendation to make sense?" But he didn't at the time, because he was protecting what he knew, not thinking. Here's a practice I use. Before I make a big decision, I write down what would have to be true for the opposite decision to be the right one. If I'm about to hire someone, I ask, what would have to be true for this to be a bad hire? If I'm about to implement a new policy at school, I ask, what would have to be true for this policy to backfire? This forces me out of confirmation bias. It forces me to think, not just know. And here's another little trick if you're into Dungeons and Dragons or the like: imagine you rolled a critical one. You've made your decision, and the roll comes up a critical failure. What happens now? That's a tabletop strategy. Anyway, back to it.

Strategy three: teach what you know to someone who knows nothing. The fastest way to discover the knowing-thinking gap in yourself is to try to teach something to a complete beginner. If you can't explain it simply, you don't understand it; you just know the jargon. I see this all the time in education. Teachers who can explain a concept to other teachers but can't explain it to a seventh grader. That's because they know the concept in the language of their expertise, but they can't think with the concept at a foundational level. Richard Feynman, the brilliant physicist, is often credited with saying that if you can't explain something to a six-year-old, you don't understand it yourself. He was talking about the knowing-thinking gap.

Strategy four: operate in multiple domains. This is the one that changed my life.
I've been in finance, I've been in acting, I've been in law enforcement, and I've been in education. And the single biggest advantage of that career path, beyond adaptability, is that I could never hide in the comfort of expertise. Every time I switched careers, I went from expert to novice, from knowing to not knowing. And that forced me to think. When you're an expert in one domain, you can coast on what you know. You can apply the same patterns over and over. But when you're a novice, you have to think. You have to question, you have to adapt. That's why I encourage leaders to step outside their domains regularly. If you're a principal, go sit in a corporate boardroom and see how they think about problems. If you're a teacher, go shadow a social worker and see how they diagnose behavior. If you're a finance person, go teach a class and see how you adapt your expertise to people who don't speak your language. Don't just know your field. Think across fields.

And if you want to go deeper on closing the knowing-thinking gap, if you want frameworks for moving from recall to transfer, from expertise to adaptability, I write a Substack newsletter every week. On the Substack, you'll get frameworks for testing your own thinking and downloadable templates for decision making. With the $10-a-month subscription, you also get my show notes, more detail on the thought process behind each episode, some philosophical theory, and the hardest decisions I made that week and what my thought process was, plus my Friday reflections and additional downloadable templates. So please subscribe and support the show; it would be really appreciated.

Next week, we'll be talking about the cognitive ladder: the five levels of thinking, how to diagnose where you are, and how to move up. If you've ever wondered, "How do I actually get better at thinking?", that episode is for you. Please subscribe, share, and send this episode to someone you think it will help in their thought process. Thanks for thinking with me. I'm Mike Aponte, also known as M.A. Aponte, and this is Thinking 2 Think.