Designing with Love

Guiding the Classroom with AI Copilots

Jackie Pelegrin | Season 4, Episode 93


AI can feel like a runaway train in classrooms and training programs—powerful, fast, and a little scary. We take the controls and show how to turn generative tools into true co-pilots: clear roles, simple guardrails, and small pilots that free us to focus on coaching, feedback, and real human connection.

You’ll hear role-based examples across K-12, higher education, and corporate learning: differentiated reading passages and exit tickets, outcome-aligned case prompts and quiz banks, and realistic scenario practice plus microlearning nudges for on-the-job performance.

Want to put this into action? Grab the pilot checklist from the show notes, try one workflow this week, and tell us what changed. If this helped, follow the show, share it with a colleague, and leave a review so more educators and L&D pros can build ethical, effective AI co-pilots.

🔗 Resources and Related Episodes:

If you’d like to explore today’s topic further, here are a few resources to check out:

📝 Interactive Resource

AI Copilot Pilot Checklist: A ready-to-use guide you can copy and adapt to your context. Use this template to plan, run, and debrief a small AI copilot pilot—from choosing one workflow and setting guardrails to defining success and capturing what you’ll keep, tweak, or toss.

📊 Research Report

2025 AI in Education: A Microsoft Special Report: This research report, produced by Microsoft, provides key insights into how AI is transforming education and nearly every other aspect of society worldwide.

🎧 Listen Next: Related Episodes

Episode 54: Beyond Learning Outcomes: Designing for Humans or Learners?: A practical look at human-centered design, focusing on how to move beyond check-the-box objectives and create learning experiences that serve real people, real needs, and real contexts.

Episode 79: Top Emerging Technologies Shaping Instructional Design: A tour of five emerging tools, including VR, AR, and MR, and how they’re reshaping the way we design for real-world skills and authentic practice. 

Send Jackie a Text

Join PodMatch!
Use the link to join PodMatch, a place for hosts and guests to connect.

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

Support the show

💟 Designing with Love + lets you support the show, keeping the mic on and the ideas flowing. Click the link above to provide your support.

Buy Me a Coffee is another way you can support the show, either as a one-time gift or through a monthly subscription. 

🗣️ Want to be a guest on Designing with Love? Send Jackie Pelegrin a message on PodMatch, here: Be a guest on the show

🌐 Check out the show's website here: Designing with Love

📱 Send a text to the show by clicking the Send Jackie a Text link above. 

👍🏼 Please make sure to like and share this episode with others. Here's to great learning!


Setting The Stage: AI Copilots

Jackie Pelegrin

Hello, and welcome to the Designing with Love Podcast. I am your host, Jackie Pelegrin, and my goal is to bring you information, tips, and tricks as an instructional designer. Hello, instructional designers and educators. Welcome to episode 93 of the Designing with Love Podcast. In this episode, I'll help you put AI co-pilots to work ethically and effectively with a practical starter kit, responsible use guidelines, age-appropriate and role-based examples, and fast pilot steps you can run in K through 12, higher education, or corporate training. So, grab your notebook, a cup of coffee, and settle in as we explore this topic together.

Let's start with a simple question. When I say AI co-pilot, what exactly do I mean? And what do I not mean? When I say AI copilot, I'm talking about AI tools that sit alongside you as a thinking partner, not as a replacement for teachers, trainers, or instructional designers. Think of it as a smart assistant that can help you brainstorm ideas, draft first passes, and personalize support. But at the end of the day, you are still the pilot. You decide where you're going, how you get there, and what good looks like for your learners. If we think about traditional tools we had while we were in school, things like spell check, templates, and search engines, those were helpful, but they didn't really generate content for us. With generative AI, we now have tools that can create text, images, quizzes, scenarios, and even outlines in seconds. That's where this idea of a copilot comes in. It's actively suggesting, not just checking.

I also want to name some of the fear that naturally comes with this shift. Fear of job loss, fear that students will cheat, fear that we'll lose the human connection in our classrooms or training programs. Those are very real concerns that we should not easily dismiss. However, the reframe I want to offer you as an educator and instructional designer is this. Our goal isn't to hand over the classroom to AI. 
Our goal is to use AI wisely so we can spend more time on the human parts of teaching and learning, such as coaching, feedback, encouragement, and meaningful interaction. If this whole conversation around AI co-pilots and teaching agents is sparking your curiosity, I also unpack these trends from a broader, future-focused lens back in episode 79, Top Emerging Technologies Shaping Instructional Design. In that episode, I talk about copilots and teaching agents as part of a larger ecosystem of emerging tools. So if you want the 30,000-foot view after today's episode, episode 79 is a great companion to this one. I have provided the link to the episode in the show notes for you.

So once we know what an AI copilot is, the next big question is, how do we keep it from flying off course? That's where the guardrails come in. Before we talk about specific tools, I always like to start with the why. Learning outcomes first, tools second. AI should be in service of good design, not the other way around. There are a few core guardrails I recommend whenever you're bringing AI into the classroom or training environment.

First is transparency. Learners should know when and how AI is being used. That could be as simple as a note in your syllabus, a quick explanation at the start of a course, or a short slide that says, here's how we'll be using AI in this class. Second is privacy. We never want to paste sensitive or personally identifiable information into an AI tool. A simple rule of thumb I like to share is this. If you wouldn't want the information posted on a bulletin board in your hallway or in your company's annual report, don't paste it into an AI tool. Third is boundaries. Make sure to be clear about what AI can be used for and what it cannot replace. For example, AI can help with idea generation, practice questions, and first-draft feedback, but it should not replace original thinking, final assessments, or the relational parts of teaching and planning. 
We are the educators and instructional designers, so we need to always review the content generated with a critical eye. Fourth is integrity. For schools and universities, that means aligning AI use with academic integrity policies. For corporate, it means aligning with compliance, legal, and IT. You might say something like, AI can help you brainstorm and revise, but copying outputs word for word is not acceptable. This is where the first-draft principle comes into play.

I'm a big fan of creating a simple, plain-language AI Use Guidelines document. It could be a one-page handout or a short slide deck. Currently, what I do in my college classes is post announcements for my students that remind them of the guidelines each week. Then I provide more detailed guidance for my assignments where AI usage is explicitly mentioned. For younger learners, you'll want to ensure you provide extra supervision and age-appropriate platforms. For corporate training, you'll want to make sure your tools are approved by IT and respect security and privacy standards. Once those guardrails are in place, we can finally talk about the fun part, what AI copilots can actually do for us in the classroom or training room.

Great, so now that we have some guardrails in place, let's look at a few quick role-based examples so you can start to picture AI copilots in action. For K through 12 teachers, AI can be a planning copilot. You might ask it to draft differentiated reading passages on the same topic at different levels. Or you could have it generate practice questions or exit tickets, and then you review and tweak them so they fit your standards and the specific needs of your students. AI can also be a feedback helper. For example, you might ask it to suggest a few possible comments focused on clarity, organization, or tone. Then you choose the best ones and edit them instead of having to write each one from scratch. 
For higher education faculty and instructional designers, AI can serve as a design copilot. Here you can take your learning outcomes and ask the AI to suggest discussion prompts, case studies, or formative quiz banks that align with those outcomes. You still evaluate and curate what you keep, but you've cut down on the time you spend staring at a blank page. AI can also support students more directly by helping them break down large projects into steps, clarify confusing concepts, or practice with exam-style questions. Again, make sure you provide clear guidance for your students about what is and isn't appropriate.

For corporate L&D professionals, AI can function as a performance support copilot. For example, you might use it to create realistic role plays or scenario-based practice that mirrors real situations on the job. You could also have it personalize follow-up messages or microlearning nudges for different roles or regions. The sweet spot in all of these examples is using AI to remove friction from the prep work so we can pour more energy into coaching, feedback, and the human side of learning.

To bring this to life a bit more, let me walk you through a quick real-world style example of what an AI co-pilot might look like in action. Let's imagine Maria, a ninth grade English teacher. Maria has a wide range of reading and writing levels in her classroom, and she struggles to give every student personalized feedback on their drafts. She's interested in AI, but she doesn't want students to use the tool to write their essays for them. So she designs a small AI co-pilot pilot. First, she writes clear guidelines for her students. She tells them AI can help you brainstorm ideas, it can suggest revisions or alternative sentences, and it can help you check for clarity. However, it cannot write your entire essay, and you may not copy and paste the AI response into your own work. 
She also adds a requirement that every student include a short AI usage statement at the end of their paper, explaining if and how they used the tool. Next, Maria uses a district-approved AI platform to support her own feedback. She pastes anonymized examples of student writing and asks the AI for suggestions focused on clarity, organization, and tone. She never includes names or identifying details. She then edits those suggestions into personalized comments for each student. She also asks the AI to create simplified versions of her assignment instructions and a visual checklist for students who need extra support.

In class, she shows students how to ask good questions of the AI tool. Things like, can you suggest a stronger topic sentence for this paragraph? Or can you give me three different ways to say this idea more clearly? As students draft and revise, Maria walks around the room, coaching them not just on their writing, but also on how to question the AI instead of accepting the first answer. Over time, she notices a few things with her students. Number one, more students are turning in drafts on time. Number two, the quality of revisions improves. And number three, she spends less time writing repetitive comments and more time having meaningful one-on-one conferences.

That's the heart of AI co-pilots, not doing the work for our learners, but helping them see more possibilities and get unstuck faster, while we remain firmly in the role of guide, mentor, and facilitator. So how do you create something like this for yourself without overwhelming your schedule or your stakeholders? Let's talk about a simple pilot plan. When you're just getting started, I recommend thinking in terms of a small pilot, not a full overhaul. This way you're not overwhelmed with too much in the beginning. Step one is to pick one small slice of your work. This could be a single lesson, one module, or one workflow, like feedback, brainstorming, or generating practice questions. 
Step two is to define success in plain language. Here you can ask yourself: if this pilot works, what will look different? Maybe it's more drafts turned in on time, fewer confused emails from learners, or more confident participation in discussions. Step three is to choose your tool with intention. Make sure to use organization-approved platforms so you don't risk proprietary information getting out on the internet. I suggest checking with your IT, legal, or admin team to make sure the tool aligns with privacy and security expectations, especially if you're working with minors or sensitive information. Step four is to create a short AI co-pilot brief for your learners or facilitators. In that brief, you can outline what the AI can help with in this pilot, what it cannot do for them, and a couple of examples of good use and poor use. And finally, step five is to plan your reflection moment. This might be a simple debrief survey, a discussion question, or a short reflection activity that can include the following. What worked well? What felt weird or concerning? What should we keep, change, or stop?

Once you've run that first small pilot, the real magic is what you decide to do with those insights. As you start to see what's working, you can use those pilot results to refine your guidelines and decide where AI truly adds value and where it just adds noise. This is also a great time to involve more stakeholders. In K-12, that might include students, parents, administrators, and instructional coaches. In higher ed, it might be department chairs, academic integrity offices, and your teaching and learning center. In corporate, it might be managers, HR, compliance, and IT. Here, I like to keep coming back to one guiding question. Does the use of AI help learners think more deeply, act more confidently, or perform more effectively? If the answer is no, it might not be the right use case, or it might not be worth the time and complexity that AI adds. 
If you've been with me for a while, you might remember episode 54, Beyond Learning Outcomes: Designing for Humans or Learners? In that episode, I dig into what human-centered design really looks like in our work and how we can move beyond just checking boxes on learning outcomes. That human-centered lens is exactly the lens I want to bring to AI. We're designing for real people, real contexts, and real constraints, not just what's flashy, efficient, or trendy. So if you'd like a deeper dive on that foundation, episode 54 pairs really nicely with today's conversation, and I'll link it in the show notes for you. AI will keep evolving, and our policies and practices will evolve with it. But if we stay grounded in human-centered design and clear values, AI copilots can be powerful allies rather than distractions.

Alright, so here's your small action step for this week. Choose one class, one course, or one training program, and identify just one workflow where an AI co-pilot could help. Maybe it's drafting quiz questions, brainstorming case studies, or giving first-pass feedback on writing. Sketch out a tiny pilot using the steps we talked about today. Number one, set your guardrails. Number two, define what success would look like in plain language. Number three, choose your tool with intention. And number four, plan a short debrief to capture what you learned. If you'd like a simple AI co-pilot pilot checklist to guide you through those steps, make sure to check the show notes. I have linked a Canva template you can copy and adapt for your context.

As I conclude this episode, I would like to share an inspiring quote by John Dewey, an American philosopher and educator of the late 1800s and early 1900s, that ties nicely into using AI in education. If we teach today's students as we taught yesterday's, we rob them of tomorrow. 
John Dewey believed that learning should connect to real life, encourage inquiry, and adapt to the needs of the learner, not just deliver content in the same way over and over. AI copilots, when used thoughtfully, are one of the ways we can avoid teaching yesterday. They're not the goal in themselves; they're tools that can help us design more responsive, relevant, and supportive learning experiences for the humans we serve. Thank you for spending this time with me today. I hope this episode gave you a hopeful, practical lens for guiding the classroom with AI co-pilots without losing the heart of what makes teaching and learning so powerful. Until next time, keep designing with love and intention.

Thank you for taking some time to listen to this podcast episode today. Your support means the world to me. If you'd like to help keep the podcast going, you can share it with a friend or colleague, leave a heartfelt review, or offer a monetary contribution. Every act of support, big or small, makes a difference, and I'm truly thankful for you.

Podcasts we love

Check out these other fine podcasts recommended by us, not an algorithm.

Buzzcast (Buzzsprout)

Podcasting Made Simple (Alex Sanfilippo, PodMatch.com)

The eLearning Coach Podcast (Connie Malamed: Helps people build stand-out careers in learning design.)

The Visual Lounge (TechSmith Corporation)

The Way I Heard It with Mike Rowe

The WallBuilders Show (Tim Barton, David Barton & Rick Green)

Bible Verses 101 (Daniel Lucas, Karen DeLoach, and Jackie Pelegrin)

Wake Up the Lions! (Rory Paquette)

Seven Mile Chats (Julia Strukely)

Book 101 Review (Daniel Lucas)

LOVE Letters (Daniel Lucas)

Mental Health 101 (Daniel Lucas)

Movie 101 Review (Daniel Lucas and Bob LeMent)

Geography 101 (Daniel Lucas)

Abstract Essay (Daniel Lucas and Sal Cosenza)

KAJ Masterclass LIVE (Khudania Ajay)

lethal venom (Noah May)

Hidden Brain (Shankar Vedantam)