AI Innovations Unleashed

AI in 5: How Your New AI Study Buddy Actually Thinks (May 11, 2026)

JR DeLaney Season 19


Your student’s AI study buddy is already in the room—are you ready to talk about how it actually works? In this episode of AI in 5, we pull back the curtain on the large language models powering today’s most popular study tools, from ChatGPT to flashcard generators. Host JR explains what it really means that these tools are “pattern machines”—not truth machines—and why that distinction matters for every teacher, parent, and student. Learn why AI hallucinations happen, how to spot over-reliance, and three simple human-in-the-loop rules your classroom or family can start using today.

Show Notes Links:

• Sal Khan on AI tutoring: https://www.khanacademy.org/about

• Dr. Ethan Mollick, Co-intelligence: ethanmollick.com

• Join The Unleashed community: aiinnovationsunleashed.com

• Subscribe & review on Apple Podcasts: https://podcasts.apple.com/us/podcast/ai-innovations-unleashed/id1776672844

 

APA Citations
Khan, S. (2023). Harnessing AI for education. Khan Academy. https://www.khanacademy.org/about

Mollick, E. (2024). Co-intelligence: Living and working with AI. Portfolio/Penguin.

Mollick, E., & Mollick, L. (2023). Assigning AI: Seven approaches for students, with prompts. The Wharton School, University of Pennsylvania. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4475995


JR:

Hello, this is AI in 5, where we break down one AI idea into about five minutes-ish. Today is May the 11th, and we're going to talk about your student's new favorite study partner, AI, and how it actually thinks. Let's get one thing out of the way up front. This isn't hypothetical anymore. Across recent surveys, a strong majority of high school and college students say they're already using generative AI tools for schoolwork, often every week. So if you're a teacher or a parent, there's a good chance there's an AI study buddy sitting right next to your learner, even if you never see it. So let's start with what this thing actually is. Picture a student: they upload their notes, click a button, and suddenly they've got a nice, neat summary, a set of flashcards, and a practice quiz. It feels like the app understands them. It feels like a tiny, patient tutor in their pocket. But under the hood, that's not what's really happening. Generative AI tools, like ChatGPT or those study apps that take a PDF and spit out review materials, are trained on enormous amounts of text and other data. They don't know facts the way humans do, and they don't believe or care about the truth. What they do is spot patterns. A simple way to think about it is this: it's like autocomplete on steroids. Instead of predicting the next word in a text message, it's predicting the next word in an essay, an explanation, a quiz question, or even a whole study plan. The engine behind many of these tools is called a large language model. You don't need the jargon, but you do need the picture. It's a pattern machine that has seen a huge chunk of the internet and a library full of textbooks, and it's incredibly good at sounding right. That's our first key idea. Your student's AI study buddy is not a mini teacher. It's a very powerful pattern machine. Once you see it as a pattern machine, the good and the bad both, well, start to make sense. Let's talk about the good first.
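For readers who want to see the "autocomplete on steroids" idea in concrete form, here is a toy sketch in Python. It is purely illustrative (real large language models use neural networks trained on vastly more data, not word counts), but it captures the core idea the episode describes: predicting the next word from patterns in text, with no notion of truth.

```python
from collections import Counter, defaultdict

# Toy "pattern machine": count which word tends to follow each word in
# some training text, then predict the most common follower. This is a
# made-up example, not how any real study app works -- but the principle
# (predict the next token from learned patterns) is the same.
training_text = (
    "the cat sat on the mat the cat ate the fish "
    "the dog sat on the rug"
)

followers = defaultdict(Counter)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    followers[current_word][next_word] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the training text."""
    if word not in followers:
        return None  # the model has no pattern for this word
    return followers[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" -- it follows "the" most often
print(predict_next("sat"))  # "on" -- the only word ever seen after "sat"
```

Notice that the predictor has no idea whether "the cat sat on the mat" is true; it only knows which words tend to appear together. That is exactly the "pattern machine, not truth machine" distinction from the episode.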
Students are using AI to do the things many of us wish they'd do more often. Turn dense notes or textbook chapters into short summaries. Generate flashcards and practice questions for review. Ask for new explanations of the same idea, simpler, or with a different example, or even in another language. They get low-stakes, judgment-free help on the dumb questions they're too shy to ask in class. On campus surveys, students say they lean on AI most for researching topics, brainstorming for writing, and studying for exams. In other words, it really is acting like a 24/7 study buddy. But remember, it's a pattern machine, not a truth machine. That's where the cracks show. Because it's predicting what sounds right, AI can have these hallucinations: it can confidently invent facts that are just, well, completely wrong. It can produce made-up citations, throw in some wrong dates, or even give explanations that are smooth but misleading. And because it's trained on human-made data, its examples and explanations can carry biases and blind spots from all that gathered data. It might consistently pick certain kinds of names, stories, or perspectives and ignore others. Finally, it's incredibly willing to over-help. If the student asks, write my essay, the pattern machine will spit out a full essay. If they say solve this problem step by step, it'll do that too. Researchers are already seeing a split. Some students use AI as a thinking partner, to get unstuck, to practice, to see ideas from another angle. Others see it as a shortcut machine to avoid the hard parts of learning. So here's the second key idea. AI is fantastic at helping you practice and explain, but it's terrible at deciding what's true, and it's very willing to do the work for you if you let it. So, what do we do with all that? There's a phrase that's quickly becoming a North Star in AI and education: human in the loop.
It means let the AI suggest, but keep a human, whether that's a teacher, parent, or even the student, in charge of checking, deciding, and learning. Here are three simple rules that you can use in your classroom or at home. Rule number one: check important facts elsewhere. If a student is going to submit it or study from it, they need to verify the key facts in a trusted source, like a textbook, class notes, or even a reliable database. The AI can be the first draft of an explanation, but not the final word. Rule number two: use AI to practice, not to copy. Teachers might say, use AI to generate quiz questions or to explain your steps, but not to write the paragraph or solve the whole problem for you. Parents might say, ask it to quiz you, not to do your homework. And rule number three: be honest about when AI helped. Normalize little disclosure statements like, I used AI to help brainstorm these ideas, or, I asked AI to quiz me on this chapter. When students say how they used AI, teachers and parents can see where they're actually learning and where the AI might be taking over. When you put this together, the message for students is simple. AI can absolutely be your study buddy, but remember, it's a pattern machine, not a brain. You're still the one who has to understand. Closing thought: next time you hear a student say, I'll just ask the AI, you've got the language to answer. It's a powerful helper, but it's not a teacher, and it's not a shortcut around thinking. In the show notes, I'll link a couple of short guides for parents and teachers, plus a survey or two that show just how quickly AI has become part of everyday study. As we wrap up, if you don't want to face AI in education alone, come join The Unleashed. It's a community for going deeper than these five minutes, sharing real classroom workflows, parent conversations, and what's actually working with AI. You can find it on Facebook or LinkedIn right now.
You can also check the notes down below, and there'll be a link to let you come be a part of it. This has been AI in 5. If this episode helped you see AI a little more clearly, share it with a colleague or a parent who keeps hearing "study buddy" and wonders what's really going on.