Digital Pathology Podcast

232: AI and Digital Pathology in Case-Based Renal Education

Aleksandra Zuraw, DVM, PhD Episode 232


Paper Discussed in this Episode:

Integrating AI-Powered Digital Pathology With Case-Based Teaching: A Novel Paradigm for Renal Education in Medical School. Zhou H, Cui L. Clin Teach 2026; 23(3):e70421. doi: 10.1111/tct.70421.

Episode Summary: In this journal club episode tailored for healthcare trailblazers, we explore a massive paradigm shift in medical education. We examine a 2026 perspective article that uses the notoriously complex field of renal pathology as a stress test for a brand-new teaching model. Moving away from dark lecture halls and static, perfect images, we discuss what happens when artificial intelligence is actively combined with flipped classrooms, fundamentally redefining what it means to be a competent physician in the digital age.

In This Episode, We Cover:

The "Bottleneck" of Renal Pathology: Why the kidney is the ultimate teaching hurdle. Students must translate the dense, flattened 2D reality of an H&E stain into an understanding of a patient's complex systemic autoimmune response.

The Danger of the "Curated Reality": Why traditional teaching methods that rely on textbook-perfect, heavily curated slides create "brittle" mental models. When students finally encounter messy, real-world biopsies with overlapping, ambiguous pathologies, the traditional educational foundation falls apart.

The "Spell Checker" for Histopathology: How collaborative AI elevates Whole Slide Imaging (WSI) beyond just high-resolution screens. The AI acts as a concurrent guide, using pixel-level pattern recognition to highlight regions of interest simultaneously and simulate the complex reasoning process of an expert pathologist.

The Case-Based Flipped Classroom (CBFC): The pedagogical engine that anchors these AI tools in clinical reality. Instead of passive lectures, students are handed the "detective's case file" beforehand to actively interrogate annotated slides, synthesizing diverse data streams to defend diagnoses in collaborative groups.

Redefining Medical Competence (The "Clinical Editor"): Why the new bottleneck in medical education isn't memorization—it's critical appraisal. We discuss the necessity of teaching "digital literacy," training students to skeptically manage AI, recognize its blind spots (like confusing a physical tissue fold for an abnormality), and actively audit the algorithm against the messy human reality of the patient.

The Impending Culture Collision: A look at the fascinating future where freshly minted, AI-native residents enter a legacy clinical workforce still transitioning away from physical glass slides, potentially reversing traditional medical hierarchies in the hospital.

Key Takeaway: The goal of modern medical education is no longer just memorizing histological patterns, as that heavy lifting is being outsourced to algorithms. By fusing AI-powered digital pathology with the necessary friction of case-based learning, we are training a new generation of diagnosticians to view AI not as a crutch, but as a powerful collaborative tool that must be thoughtfully scrutinized and audited for safe patient care.

Support the show

Get the "Digital Pathology 101" FREE E-book and join us!

Imagine holding the future of a patient's kidney in your hands.


That's a heavy thought right there.


It really is. You're looking at a biopsy, right? And that single piece of tissue will determine whether they undergo this massive life-altering treatment or, you know, if they go on a transplant list.


Yeah. The stakes are incredibly high.


Exactly. But here's the catch for a lot of doctors out there. The only training you've ever had to make this critical decision involves sitting in a dark lecture hall.


Oh, the classic medical school experience,


right? You're just staring at a single perfect unmoving photograph of a tissue sample


while a professor uh points a laser pointer at the screen.


Yes, exactly. And for decades, I mean, that is exactly how we have taught renal pathology. So, welcome trailblazers to the digital pathology podcast. We are so glad you're here.


Thrilled to dive into this one today.


Today, we are opening up a journal club style discussion and we've tailored this specifically for you, the listener. So, whether you're a medical educator who's, you know, redesigning a curriculum or a clinician looking right down the barrel of where diagnostic training is heading, this one is for you.


Yeah, we're looking at a really fascinating perspective article today.


We are. It's a 2026 piece from the journal The Clinical Teacher. It's titled "Integrating AI-Powered Digital Pathology With Case-Based Teaching."


And then the subtitle is "A Novel Paradigm for Renal Education in Medical School."


Right. Authored by Zhou and Cui. And our mission today is to explore how combining artificial intelligence, digital pathology, and case-based flipped classrooms is, well, completely rewiring how medical students learn complex subjects.


Totally. Moving them away from those passive, dark lecture halls towards something much more active.


So, let's start right there because the timing of this paper, it feels really critical, doesn't it?


Oh, absolutely. Zhou and Cui are essentially mapping out this massive transition. It's a shift away from isolated passive learning toward an environment that is actually clinically relevant and


and powered by AI.


Exactly. Powered by AI. And what's interesting is they're using renal pathology as basically the ultimate stress test for this new model,


which makes sense. I mean, the kidney is notoriously difficult,


right? The logic is if you can fix how we teach the kidney, you can arguably fix how we teach literally any visual medical discipline.


Yeah. Let's talk about that sheer complexity for a second. When we look at standard undergraduate medical education, renal pathology is so often the bottleneck for students.


It's a massive hurdle.


It's not just about, you know, memorizing a few cell types. Students are suddenly hit with incredible histological complexity,


right? Because you're dealing with the three-dimensional architecture of the glomeruli, the tubules, all of it.


The interstitial spaces, the vasculature, and it's all just flattened into a two-dimensional pink and purple H&E stain.


Yeah. The cognitive load is immense. Yeah.


And you know, the problem isn't simply the visual identification part.


What do you mean?


Well, the true bottleneck is teaching students the abstract clinical correlations.


Ah right. Bridging the gap.


Exactly. A medical student might learn to spot say the subtle thickening of a basement membrane on a slide. But translating that microscopic morphological change into an actual understanding of a patient's systemic autoimmune response.


That's where the traditional model just falls apart.


It breaks down completely because traditional teaching relies on the professor selecting a perfectly framed ideal example of a lesion.


It's a highly curated reality


highly curated. It doesn't look like real life.


I always think of it like this. Comparing the learning of complex renal pathology from those static images to real-world diagnostics, it's like trying to understand the plot, pacing, and the emotional weight of a three-hour film by just staring at a single blurry photograph.


That is a perfect analogy. Yes.


Right. It just lacks the context. It lacks the movement necessary to actually figure out the mechanism of the disease.


And we have to talk about the clinical danger of that disconnect because it really cannot be overstated.


Yeah, spell that out for us.


Well, if a student's diagnostic foundation is built entirely on those curated textbook perfect images, their mental models are fundamentally brittle.


Brittle. I like that word for this.


Yeah. Because when they enter clinical practice, they don't get perfect slides. They encounter messy, ambiguous biopsies


with multiple overlapping pathologies.


Exactly. And if their training hasn't given them a mechanism for navigating that ambiguity, that gap between their anatomical knowledge and their clinical judgment,


it becomes a huge liability for patient care,


which is terrifying, honestly. So, if the human eye and a 2D snapshot just aren't enough to scale this kind of contextual learning, how do we bridge the gap?


Well, that is where Zhou and Cui bring in the technological antidote,


right? The integration of whole slide imaging or WSI combined with artificial intelligence.


Yes, exactly.


Now, I have to push back here for a minute because, you know, my skeptical brain always kicks in whenever we throw technology at a teaching problem.


Oh, sure. The shiny new toy syndrome.


Exactly. A high resolution screen doesn't automatically make someone a better doctor. I mean, are we just taking a physical glass slide, scanning it, putting it on a fancy iPad, and calling it a revolution?


And that is the exact trap a lot of institutions fall into. They just digitize a flawed teaching method and expect different results.


Right? So, what makes this different?


The integration of AI is what elevates this. It's not just a format change. It's a cognitive shift. These platforms, they don't just display gigapixel images. They actively collaborate in the diagnostic reasoning process.


Collaborative AI.


Yes. The paper actually brings in context from a framework developed by Hang and colleagues which explores this specific pathologist AI dynamic.


Okay. So, how does that collaboration actually function at the pixel level for a student? Like what are they seeing?


So, rather than a student just aimlessly scanning a vast slide hoping to stumble on the pathology, the AI serves as a concurrent guide.


Oh, interesting.


Yeah. It uses pixel level pattern recognition to highlight regions of interest across the entire cortex simultaneously.


Wow. All at once.


All at once. It can suggest potential patterns of glomerular damage or even offer a differential diagnosis based on the structural degradation it detects.


That's incredible.


It is. It's essentially simulating the reasoning process of an expert pathologist. Like calculating the ratio of healthy tissue to fibrotic scarring in seconds,


which is a task that would take a human immense cognitive effort, right?


And tons of time to quantify manually.


Exactly.


So, it acts almost like a real-time spell checker for visual diagnostics.


Yes. A spell checker for histopathology,


highlighting, like, the grammatical errors in the cellular structure.


That's a great way to put it.


And the real power there to me seems to be scalability. You know, you cannot have a senior pathologist standing over the shoulder of 200 individual students.


No, physically impossible. Guiding their eyes through the Z-axis of a complex slide one by one? But the AI scales that expert mentorship. It allows every single student to have an interactive dialogue with the tissue sample.


And what's crucial for you listeners to understand is that this directly mirrors the reality of the modern laboratory.


Oh yeah. It's not just a classroom trick.


Not at all. Zhou and Cui point to work by Otsuka and colleagues, who demonstrate the use of whole slide imaging for rapid treatment decision-making in actual clinical environments.


Real world application.


Yes. And furthermore, they reference research by Eker and colleagues detailing a specific AI tool called Galileo.


Right. Galileo. That's the one used for evaluating pre-implantation kidney biopsies, isn't it?


Correct. Galileo doesn't just look at the kidney. It assesses the viability of a donor organ by rapidly analyzing complex features like glomerulosclerosis


and tubular atrophy. I imagine.


Yes. Against massive data sets. So, by introducing medical students to these exact AI platforms during their foundational education, we are removing the latency,


the latency between the classroom and the clinic.


Exactly. Between what they learn in med school and the tools they will actually rely on to make life or death decisions later,


which is amazing, but it forces us to ask a really critical logistical question here.


Go for it.


If we have this incredibly powerful diagnostic spell checker, how do we actually integrate it into a medical student's brain without them just, you know, blindly trusting the software?


Uh, the automation bias problem. Exactly. If you just hand a student an AI tool that gives them the answers, they aren't learning pathology. They're just learning how to read an output screen.


And that brings us to the actual pedagogical engine of Zhou and Cui's proposal, the case-based flipped classroom, or CBFC.


The CBFC,


right? Now, as medical educators and clinicians, you trailblazers are already deeply familiar with the concept of the flipped classroom,


right? We don't need to rehash the basic definition of flipped learning here.


No, we don't. The innovation here isn't the format itself, but the highly specific execution of this framework for visual diagnostics.


Exactly. What I found fascinating is how the CBFC model anchors the AI tools in clinical reality.


Yes. The hybrid model.


It leverages the annotated digital slides right alongside complex unfolding clinical cases. So class time is entirely devoted to active problem solving,


right? And it's basically like giving them the detective's case file before the briefing. Correct.


I love that analogy.


Yes.


Like instead of spending class time just reading the facts during a didactic lecture, they spend class time actively interrogating the suspects,


interrogating the annotated slides. Yeah.


Exactly. And solving the crime together as a team


because the physical posture of learning fundamentally changes here. Think about it. For over a century, the microscope was this incredibly isolating tool.


Just one person looking down the barrel.


Exactly. Looking down the barrel alone. But in the CBFC model, students are looking up and out at shared digital screens in small groups.


It's collaborative


Very. They're given a patient's history, their lab results, say, you know, elevated creatinine and proteinuria, and they get the AI-annotated whole slide images.


So they have to synthesize all of those diverse data streams to actually defend a diagnosis to their peers.


Yes. It forces a shift from passive absorption to aggressive inquiry. And the evidence backing this up is really substantial.


Yeah. The paper cites a lot of recent evidence on this.


Zhou and Cui cite multiple studies, including work by Kai L and Yang W.


Oh, right. The studies on undergraduate pathology.


Yes. Demonstrating that this exact combination of flipped classrooms and case-based learning drastically improves active learning metrics.


Yang's paper was particularly illuminating to me because it highlights how this methodology significantly improved the critical thinking abilities of international students specifically.


Oh, that's a great point.


Yeah. When you anchor complex medical terminology in visual interactive problem solving, it sort of transcends language barriers


because it's visual and collaborative.


Exactly. The students cannot hide in the back row of a dark auditorium anymore. They must articulate their clinical reasoning out loud.


Bridging that gap between what the AI highlighted and why it actually matters to the patient's prognosis,


which is fantastic. But, you know, this brings us to what I think is the most profound paradigm shift in the entire paper,


redefining competence. Yes. If students are no longer spending hundreds of hours trying to memorize histological patterns because the AI can just instantly highlight the glomerulosclerosis for them, what exactly are we teaching them?


It's a huge question.


Doesn't this entirely redefine what competence actually means for a graduating physician?


It dismantles the historical definition of competence entirely. Memorization is basically being outsourced to the algorithm.


Right? So what's left for the human?


The new bottleneck in medical education is critical appraisal. The definition of competence is shifting from biological pattern recognition to the effective skeptical management of artificial intelligence.


Skeptical management. I really like that phrasing. And this is where we need to dig into the Waldock study from 2025 that the paper references.


Yes, the Waldock piece is crucial here.


It explicitly looks at how medical curricula must adapt to teach students how to evaluate AI outputs. We are no longer just training students to read a slide. We are training them to be the clinical editors of the AI.


The clinical editor. That concept is really the core takeaway for educators listening. Waldock's research underscores that students must develop what we can call digital literacy in nephrology.


Digital literacy meaning they have to understand the algorithm's flaws.


Exactly. They have to understand the confidence intervals of the algorithm and more importantly recognize its blind spots.


Give us an example of a blind spot.


Okay. For instance, the AI might flag a physical fold in the tissue sample as an abnormality


just because the tissue folded over itself during preparation,


right? Because it detects a density of pixels that mimics a lesion, but the competent student must recognize that artifact for what it is and override the machine.


Oh, that makes total sense. Or consider a scenario where the AI is, say, 98% confident that it's seeing tubular atrophy, but the student looks at the patient's history and sees completely normal kidney function labs. The student has to understand the underlying science well enough to audit the algorithm.


They have to trust the whole clinical picture, not just the AI's tunnel vision.


Exactly. Because if they don't have that deep mechanistic understanding of the pathology, they will just default to the software,


which is a massive clinical liability.


Huge liability.


The AI is an incredibly powerful diagnostic instrument, but it lacks clinical intuition and it lacks longitudinal context about the patient.


Right. It doesn't know the patient sitting in the exam room.


Exactly. So teaching A student to audit the AI requires them to hold the algorithmic suggestion in one hand and the messy human reality of the patient in the other and actively negotiate the truth between them.


That is so well said.


And you cannot teach algorithmic skepticism through a passive lecture. It requires friction.


The friction of the case-based flipped classroom.


Yes. Where students practice making the wrong call. Where they debate the AI's findings and refine their clinical judgment in real time.


It's brilliant. Honestly, because it addresses the complexity of modern medicine, not by simplifying the science, but by upgrading the cognitive delivery system to match it.


That's a great summary.


What Zhou and Cui have constructed in this 2026 Clin Teach article is essentially a comprehensive rescue road map for renal education.


It really is a road map. By fusing AI powered digital pathology with the rigorous debate of a case-based flipped classroom, they are preparing students for the exact technological environment they are going to inhabit.


The ultimate payoff here is a generation of diagnosticians who view artificial intelligence not as a crutch, right? But as a collaborative tool to be scrutinized and applied thoughtfully.


Exactly. They'll enter the workforce with a diagnostic foundation that is elastic. It'll be capable of adapting to new algorithms and novel diseases without fracturing,


which is exactly what we need. But, you know, I want to leave our trailblazers with a final thought to mull over today as you look at your own institutions and practices. Let's leave them with something provocative.


We've been talking about minting this entirely new generation of medical students who are educated in this AI native digital first environment.


The true digital native,


right? They will be absolute experts at navigating whole slide imaging, quantifying pixel level data, auditing algorithmic outputs. But when they graduate,


oh, I see where you're going with this.


Right. When they graduate, they're going to enter a legacy clinical workforce. Hospitals that are in many cases still painfully transitioning away from physical glass slides and traditional microscopes.


It sets the stage for a really fascinating collision of cultures.


A massive collision. I mean, will we see a complete reversal of the traditional medical hierarchy here?


It's very possible.


When it comes to digital pathology, will these freshly minted digital native residents end up teaching their senior attendings how to navigate the diagnostic tools of the future?


Can you imagine?


How will a veteran pathologist who has spent 30 years looking down the barrel of a microscope react when a first-year resident challenges their diagnosis based on a pixel-level algorithmic audit?


That is going to be a very interesting conversation in a lot of hospitals very soon.


We are definitely about to find out. Well, thank you for joining us for this deep dive into the Clin Teach paper, everyone.


Yes, thanks for listening trailblazers.


Keep challenging the old models, keep auditing the new tools and keep exploring the future of medicine. See you next time.