Talking Rehab with Dr. Fred Bagares

Dr. ChatGPT Will See You Now

Fred Bagares Episode 71

What happens when patients consult ChatGPT before they see a doctor—and the physician becomes the third opinion in the room? In this episode, Dr. Fred Bagares breaks down the new hierarchy of medical advice, why most doctors are reacting to it the wrong way, and how AI can actually become the physician’s greatest ally.

Rather than fighting “Dr. Google 2.0,” the future belongs to clinicians who learn to interpret, contextualize, and personalize AI-driven information for their patients. You’ll hear why being a curator, not a gatekeeper, is the next competitive advantage in medicine—and how to make that shift now.

⏱️ Episode Guide

  • [00:00] The new hierarchy of medical advice: why physicians are now the third opinion
  • [01:00] A patient’s journey: from ChatGPT → self-editing → biased question for the doctor
  • [02:30] The five major failure points in this system—and why they distort care
  • [05:00] Why physicians should celebrate, not fear, AI in the consultation room
  • [06:00] Three tactics for turning AI into your competitive advantage
    1. Embrace the AI consultation
    2. Become the patient’s “prompt engineer”
    3. Address selection bias directly
  • [08:00] Example: shoulder pain, ChatGPT vs. physician personalization
  • [09:00] The uncomfortable truth: most doctors are replaceable—unless they reframe their value
  • [10:00] Lessons from travel agents and Expedia: adapt or become irrelevant
  • [11:00] Advice for patients: how to use ChatGPT wisely before seeing your physician
  • [12:00] The big shift: from knowledge gatekeeping → to personalization and context
  • [13:00] Closing thoughts: why the future is not physician vs. AI, but physician + AI

CTA:
If this episode made you rethink your role in recovery—or your relationship with technology in healthcare—hit subscribe and share it with a colleague or friend. And if you’re ready for clarity in your own care, visit MSK Direct to schedule a same-day consultation.

Support the show

I want to talk to you about something that's happening in healthcare right now that most people, including physicians, are completely missing. It's about the new hierarchy of medical advice, and if you've ever Googled your symptoms, asked ChatGPT about a diagnosis, or walked into a doctor's office with printouts from the internet, this affects you. Here's the punchline: physicians have become the third opinion in healthcare. Not the first, not the second, but the third. Unfortunately, physicians are extremely offended by this, when they should actually be celebrating, and here's why.

I'm Dr. Fred Bagares, and this is the Talking Rehab podcast, where we challenge the conventional wisdom about recovery and healing. And today I want to talk about whether you need a protocol or a customized treatment plan. Quick favor before we dive in: if this podcast has changed how you think about your body or your recovery, hit that subscribe button. It's free, it takes two seconds, and it's how we keep bringing you these conversations every week. No fluff, just real talk about what actually works. Thanks for being here. Now let's get into it.

Let me paint you a picture of how medical consultations actually work in 2025.

Step number one: the patient feels knee pain. They go to ChatGPT and type in, "Why does my knee hurt when I run?" ChatGPT spits out a comprehensive answer. It talks about IT band syndrome, patellofemoral pain syndrome, meniscus tears, arthritis, and the patient reads it all.

Step number two: the patient becomes their own editor. They take ChatGPT's answer and subtract what they don't like, what they don't understand, or what they simply don't want to believe. Maybe they ignore the part about stopping running for six weeks because, well, they simply don't want to stop.

Step number three: the patient walks into the doctor's office. But here's the kicker: they don't ask the original question, which was "Why does my knee hurt when I run?" They ask an iteration of it, filtered through their own biases and fears. So the question comes across as, "What is the likelihood that I have a bucket-handle meniscus tear?" The doctor isn't getting the clean, unbiased question that ChatGPT got. They're getting a modified version. In essence, the physician has become the third opinion in a game of medical telephone, and most physicians, unfortunately, are really upset about it.

So let's think about this systematically. What are the failure points?

Failure point number one: the patient's original prompt to ChatGPT might not be very good. "Why does my knee hurt?" is not the same as "I'm a 35-year-old recreational runner with knee pain that started two weeks ago, occurs only during the first mile of running, and improves with activity."

Failure point number two: ChatGPT gives a probabilistic answer based on patterns, not a personalized diagnosis based on examination, history, and clinical context.

Failure point number three: the patient cherry-picks their information. They keep what confirms their hopes and discard what threatens their lifestyle, or things they simply don't want to entertain.

Failure point number four: the physician gets a distorted version of both the original problem and the AI's response. It's like the childhood game of telephone, except instead of whispering "purple monkey dishwasher," we're whispering about people's health. The result is physicians trying to solve a puzzle with half the pieces missing.
Failure point number five: physicians let their emotions about the whole situation get the best of them. They're offended that people come to them having tried to do their own research, trying to, quote unquote, "Dr. Google" themselves. That is not going to result in a good outcome or an unbiased view from the physician if they let their emotions take over.

But despite their feelings, here's where most physicians are thinking about this completely wrong. They're seeing it as a threat to their authority. I completely get it. However, I see it as the greatest opportunity in modern healthcare, and it's a huge opportunity for them too. Why? Because the physicians who figure this out first are going to absolutely dominate their markets. They're going to have the most satisfied patients, the best outcomes, and frankly, they're going to run much more successful practices as a result. Let me break down the strategy.

Tactic number one: embrace the AI consultation instead of getting defensive. When a patient says, "I asked ChatGPT about this," the smart physician says, "Great, what did it tell you? Let's start there, and I'll show you where the AI got it right and where it might have missed some crucial pieces." You're not trying to compete with AI. You're trying to partner with it.

Tactic number two: become the prompt engineer. Most patients are terrible at asking AI the right questions, so teach them. Say, "Here's how you should have asked that question to get a more useful answer." Then show them the difference. You've now positioned yourself as the AI interpreter and an ally to the patient, which is extremely valuable.

Tactic number three: address the selection bias. When you see that a patient has clearly cherry-picked information, address it directly: "I can tell you didn't love everything ChatGPT suggested. What part of that worried you? What part did you disagree with?" Now you're not just treating their knee; you're actually trying to understand them as a person, their fears, their beliefs, and ultimately that's going to translate into a much more specific and personalized treatment program.

So here's the mindset shift everybody has to wrap their head around. Patients now have access to medical information before they see you. There is simply no way around this. It's available to them, it has been for decades at this point, and it is something physicians have to accept rather than ridicule patients about. This is an amazing opportunity: be the person who can contextualize, personalize, and prioritize that information for your patient. Think of it this way: in the pre-internet era, physicians were like librarians with exclusive access to the books. Now everyone has access to the library, but they sometimes still need a curator. The question isn't whether you're going to be replaced by AI. The question is whether you're going to be the curator or just another book on the shelf.

Let me give you a concrete example of how this plays out. A patient comes in and says their shoulder hurts. They've already consulted ChatGPT, which mentioned rotator cuff tears, impingement, and frozen shoulder. The old-school physician's response would be, "Well, let me examine you and we'll figure out what exactly is going on." The new-school physician's response should be, "Great, so you've done your homework.
ChatGPT covered the main possibilities. Now let me show you how we narrow it down to what's actually happening in your shoulder, so we can come up with a treatment plan for you specifically, as opposed to generic advice." The second approach makes the patient feel heard, validates their research, and positions the physician as a specialist in personalization, not just diagnosis. As a result, the patient leaves feeling they got the best of both worlds: AI's comprehensive knowledge and human expertise.

Now, here's where I'm going to take a position that might make some physicians uncomfortable. Most of you are replaceable. Most of you are not nearly as irreplaceable as you think you are. Being a physician does not necessarily mean that you are irreplaceable in the year 2025. If your value proposition is "I have medical knowledge and you don't," then yes, AI is going to be a threat to you, because AI ultimately has more medical knowledge than you, it's available 24 hours a day, seven days a week, and it doesn't have a bad day. That being said, it does not have your expertise. It doesn't have your experience, and simply having data does not make you a good physician. So this is your opportunity. If your value proposition is "I can take complex medical information and make it actionable for your specific situation, your goals, and your constraints," then AI just became your biggest competitive advantage.

The physicians getting angry about patients using ChatGPT are like travel agents who got angry about Expedia. They're focusing on the wrong problem. The smart travel agents pivoted to become travel consultants. They used the online tools to do the research faster, then added the human layer of customization and service. The same principle applies here, and travel consultants, formerly known as travel agents, are certainly still around in 2025.

So what can you actually do with this information? If you're a patient: first, ask ChatGPT your medical questions, but ask them well. Be specific about your symptoms, timeline, and relevant history. Second, when you see your physician, be transparent about what you've already researched. Don't play games like Go Fish; personally, I hate it when people feed me information just to see what I'm going to say, as opposed to simply asking me the question. Third, use your doctor as an interpreter, not just a diagnosis machine.

If you're a physician: stop being so defensive about AI. Start being curious about how your patients are using it. Learn to prompt AI yourself; you'll give better medical care if you understand the tools your patients are already using. Position yourself as the bridge between generic information and personalized care.

But in the big picture, this is what I think is actually happening. We are moving into an era where simply having information is not enough. As physicians, we have to be able to process that information while understanding the patient's needs, concerns, and worries in order to deliver excellent care. This has always been the case, but it now includes understanding and working with AI. The physicians who understand this will thrive, and the ones who don't will simply become irrelevant. And for patients, this is unquestionably good news. You get more informed, more engaged in your care, and you have a safety check on both AI recommendations and physician advice.
The only losers in this transition are physicians whose primary value was information gatekeeping, and honestly, those physicians weren't adding much value to begin with. The future of healthcare is not physician versus AI. This is the time for physicians to become physician plus AI. The physicians who figure this out are going to have amazing, fulfilling careers and be extremely successful. So the next time a patient tells you they asked ChatGPT first, don't roll your eyes. Say thank you. They just did half your homework for you.

As always, thank you for listening. Until next time. Thank you for listening to the Talking Rehab podcast. I hope this podcast stimulates you to question your own practice and how you approach rehabilitation. I truly appreciate your time and attention. If you enjoyed listening, make sure to like and subscribe to the podcast. I wish you a movement-filled day. Take care.