
The TechEd Podcast
Bridging the gap between technical education & the workforce 🎙 Hosted by Matt Kirchner, each episode features conversations with leaders who are shaping, innovating and disrupting the future of the skilled workforce and how we inspire and train individuals toward those jobs.
STEM, Career and Technical Education, and Engineering educators - this podcast is for you!
Manufacturing and industrial employers - this podcast is for you, too!
AI Can Close the Learning Gap in Underserved Classrooms. But We Have to Guide, Not Just Give - Sam Whitaker, Director of Social Impact at StudyFetch
In schools with limited resources, large class sizes, and wide differences in student ability, individualized learning has become a necessity. Artificial intelligence offers powerful tools to help meet those needs, especially in underserved communities. But the way we introduce those tools matters.
This week, Matt Kirchner talks with Sam Whitaker, Director of Social Impact at StudyFetch, about how AI can support literacy, comprehension, and real learning outcomes when used with purpose. Sam shares his experience bringing AI education to a rural school in Uganda, where nearly every student had already used AI without formal guidance. The results of a two-hour project surprised everyone and revealed just how much potential exists when students are given the right tools.
The conversation covers AI as a literacy tool, how to design platforms that encourage learning rather than shortcutting, and why student-facing AI should preserve creativity, curiosity, and joy. Sam also explains how responsible use of AI can reduce educational inequality rather than reinforce it.
This is a hopeful, practical look at how education can evolve—if we build with intention.
Listen to learn:
- Surprising lessons from working with students at a rural Ugandan school using artificial intelligence
- What different MIT studies suggest about the impacts of AI use on memory and productivity
- How AI can help U.S. literacy rates, and what far-reaching implications that will have
- What China's AI education policy for six-year-olds might signal about the global race for responsible, guided AI use
3 Big Takeaways:
1. Responsible AI use must be taught early to prevent misuse and promote real learning. Sam compares AI to handing over a car without driver’s ed—powerful but dangerous without structure. When AI is used to do the thinking for students, it stifles creativity and long-term retention instead of developing it.
2. AI can help close educational gaps in schools that lack the resources for individualized learning. In many underserved districts, large class sizes make one-on-one instruction nearly impossible. AI tools can adapt to students’ needs in real time, offering personalized learning that would otherwise be out of reach.
3. AI can play a key role in addressing the U.S. literacy crisis. Sam points out that 70% of U.S. inmates read at a fourth-grade level or below, and 85% of juvenile offenders can’t read. Adaptive AI tools are now being developed to assess, support, and gradually improve literacy for students who have been left behind.
Resources in this Episode:
- To learn about StudyFetch, visit: www.studyfetch.com
Other resources:
- MIT Study "Experimental Evidence on the Productivity Effects of Generative Artificial Intelligence" (Noy & Zhang)
- MIT Study "Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task"
- Learn more about the Ugandan schools mentioned: African Rural University (ARU) and the Uganda Rural Development and Training Program (URDT) Girls School
Matt Kirchner:Welcome to another episode of The TechEd Podcast. I am your host, Matt Kirchner. I was at a huge artificial intelligence event this week, actually, in the shadow of Lambeau Field in Green Bay, and one of the speakers said this. She said that artificial intelligence is the great democratizer in this economy and in this age. And you know, as I thought about that, I couldn't agree with her more. The truth of the matter is that AI doesn't pick favorites. Artificial intelligence is available to so many people in so many different ways. And so I think we are entering into an era, maybe already in an era, where it doesn't matter your background, it doesn't matter your education, it doesn't matter your economic status, none of that stuff is important in terms of your access and ability to invent and to innovate using artificial intelligence. Now that's not going to happen on autopilot. There's some things that have to take place along the way, but I do believe that this is one of those technologies that, if we play our cards right, can be available to everybody, can do an incredible job of knocking down disparity across our economy here in the United States and, as we're going to talk about today, around the globe. We're talking about that with a guest that I'm really, really excited to welcome to the studio of The TechEd Podcast from Philly. Our guest is Sam Whitaker, the Director of Social Impact at StudyFetch. We're going to learn in a little bit about the work that he's doing. But I want to start first of all, Sam, by welcoming you to The TechEd Podcast. Thanks so much for being with us.
Sam Whitaker:Thank you for having me, and just to confirm, it's home of the world champion Philadelphia Eagles. It's not just Philly anymore.
Matt Kirchner:Yeah, well, you're saying that to the guy that was sitting at Lambeau Field yesterday. What we should really do, if we had more time, is compare the number of world championships, and then you know which of those two teams had the first three of them. But that's a topic for another day. We're going to talk about artificial intelligence instead, and I want to dive into that with you, Sam. We're going to talk about an aspect of artificial intelligence that doesn't necessarily get quite as much news as I think it should, and that's the potential positive impact for people in underserved populations, anybody coming from an underserved community. Before we get into that, I'd love to hear a little bit more about your role as Director of Social Impact and why that work is so important to you.
Sam Whitaker:It's funny, I was having this conversation yesterday. I don't really have, like, an origin story for this. You know, I went to great schools, I had great teachers, I had a very supportive family, things like that. It's just always been something I'm passionate about, and it's actually how I got started at StudyFetch. My friends started the company. I'd known them for years, and we were talking, and I've done work at a charter school here in Philly called Esperanza Academy. And the first thing I thought was, hey guys, can you donate some licenses to the school? They said, absolutely. And that was the genesis of me becoming all encompassed with StudyFetch and AI education.
Matt Kirchner:Awesome. And my Spanish might be a little bit rusty, but I took a ton of it when I was going to school and spent some time in Spain. Esperanza Academy would be Academy of Hope, if I'm not mistaken. And we're going to talk about hope today, talk about the hope that AI can provide to so many individuals across so many different socioeconomic backgrounds, different geographical backgrounds. Let's start with this. How do you think AI platforms in general can start to address some of the educational disparity that we see, particularly in rural and inner-city communities? I live in Milwaukee; it is still one of the most economically and racially segregated cities in the world, sad to say. A lot of progress being made, certainly a different world than it was in the 70s and 80s when I was growing up here, but so much work still to be done. How does an AI platform help address some of those differences in educational opportunities for our young people and people of all ages?
Sam Whitaker:Well, what you started out with, democratization of opportunity, I agree with you 100%. But then you made one point that I think is very important, which is: if we do it right. That's the key. Students need to be taught, and need to learn, how to use AI appropriately and use it for good, for lack of a better phrase. And that needs to happen early, and it needs to happen soon, because if that doesn't happen, and if it's still the underserved populations that are just using unrestricted ChatGPT while students in what we would consider more privileged settings are being taught to use it appropriately, then that gap will actually increase, as opposed to decrease. It's about creating an environment, an AI environment, where students can use AI specifically to learn, with products that are built for learning. ChatGPT is a productivity tool. That's what it's built for. That's what it's supposed to do: it's supposed to make you better at your job, supposed to allow you to accomplish more in less time. That's not necessarily what you want for students. Students need to learn. They need to learn how to learn. And studies are coming out. I don't know if you've seen the recent MIT study comparing, real quick, ChatGPT, Google, and basically handwritten essay writers over four months. The results were pretty staggering. The ChatGPT group saw decreased alpha brain activity. Their work was very generic. They couldn't remember what they wrote, let alone recreate it. And probably the worst thing is, after four months, those effects lingered, and the students had to kind of relearn how to be creative and generate content on their own. And we're so early into this. That's after four months. How long before those effects become permanent? We don't know yet.
Matt Kirchner:Those are, again, really fascinating observations. I'll point to another study from MIT and kind of juxtapose the two of them. Noy and Zhang did one probably two or three years ago, both students at MIT. They basically took college-educated professionals, divided them into two groups, gave one group ChatGPT, let the other one do it the old-fashioned way. Found that the group using ChatGPT, number one, finished their work 37% faster. Number two, the quality of their work improved. And number three, their job satisfaction in real time improved as well. So that kind of points to the benefits of ChatGPT, and certainly, who wouldn't want to be able to do their work faster, more accurately, and enjoy it more? Nothing wrong with any of those things. But at the same time, to your point, if we're not utilizing certain aspects of our brains, and I'm not pretending to understand the science behind this, in a way that we learn, in a way that that learning becomes real to us, in a way that we're thinking through it, that we're thinking critically, we're thinking creatively and training our brains how to think, as opposed to just relying on a pre-trained transformer to do that for us, then there's a huge loss in terms of retention, among other things, if that's the way that we're leveraging artificial intelligence. Is that correct?
Sam Whitaker:I agree 100%. I always make the analogy to learning to drive. So when students are learning to drive, they get a learner's permit, they take some classes, they drive with their parents for a while. They can't drive at night for a while. And eventually, we've deemed that they've successfully completed that and they've mastered driving to an extent. Although, when I was 16, I definitely hadn't mastered driving.
Matt Kirchner:I'm not sure I have yet, but I'm still working on that.
Sam Whitaker:But we don't have that process right now. We're handing them a car. An automobile is a big and powerful and dangerous machine, and so is AI, and if we don't teach kids how to use it appropriately, it's very, very dangerous.
Matt Kirchner:You know, a friend of mine years and years ago told me this lesson. He had two daughters that were learning to drive, and he said, you never want to take a 3,000- or 4,000-pound vehicle and move it in a direction without knowing what you're doing and looking exactly where you're going. And it's kind of the same thing here, where, you know, if we think about AI as a powerful machine, and indeed it is, and anybody who's used ChatGPT or other GPTs certainly has the ability to appreciate and recognize the power there. But in the same way that we get in big, big trouble taking a motor vehicle and running at 70 miles an hour down the interstate without some controls, without some guardrails, without some understanding of how that's operating and how to do it safely, we can get ourselves in big, big trouble. That's certainly true if we're driving here in the United States of America. It's certainly true if we are driving in other parts of the globe, and you had the opportunity to do just that. Now, we can't drive from here to Africa; you have to take a plane or a boat to do that, and you actually did. You traveled to Uganda, as I understand it. Over there, you were introducing AI to students at African Rural University and the Uganda Rural Development and Training Program Girls School. That just sounds fascinating to me. I mean, you start thinking about, you know, how we're using AI here in the West, here in the United States. I would love to hear a little bit about what you saw in Africa. What familiarity do those students have with AI? Was it similar to what you would see here? What about their teachers? And how did that experience maybe differ in some ways from what we're seeing here in the United States as well?
Sam Whitaker:So it's interesting that the segue there was from driving to Africa, because in a lot of Africa, driving is a big part of it. The school was actually seven hours away from the airport by car, but that was only to go about 180 kilometers. The roads are really more pothole than road. They actually have a phrase for it. They call it an African massage, because you're getting bounced around the entire time, and there's a lot of weaving, and lanes don't really exist. But getting there, all of that, was completely worth it, just to be there. So to answer your question, to start, familiarity with AI was one of the first questions I asked to kind of try to get a baseline. There were 30 girls in the class, and I asked, how many of you know what AI stands for? Everybody raised their hand: artificial intelligence. How many of you are using AI? 28 out of the 30 students were already using AI. To give some reference again, seven hours from the airport, this is about as far afield as you can get where you still have internet access. And it's mostly ChatGPT and DeepSeek. And then I asked, how many of you have taken classes to learn how to use AI, or how did you learn how to use AI? And the answer was always, well, I just started using it. That was a little bit scary, because, honestly, and to your point, it's not very different from what's happening here.
Matt Kirchner:Yeah, interesting. It's curious to me. You think about AI here in the United States, and certainly we're very familiar with DeepSeek, you know, one of the Chinese generative artificial intelligence models, really some innovative technology in terms of how they're doing searches, and doing it in a way that's more efficient and in some cases consumes less energy as well. We could probably do a whole episode just on the juxtaposition between ChatGPT and DeepSeek. But interesting, we don't hear DeepSeek much here in the United States yet. Anyway, I know it's available, and people are talking about it, but certainly not as ubiquitous as some of the others. And you're saying that it was as common as ChatGPT when you were in Uganda?
Sam Whitaker:Probably about 50/50, yeah.
Matt Kirchner:yeah, fascinating. So we'll see where that technology takes us, and also how different cultures are going to influence how we're basically democratizing and spreading artificial intelligence around the globe, whether we're right next to an airport here in the United States, or seven hours from one in Uganda. Why Uganda? I mean, of all the places you could pick to go and study artificial intelligence, why did you pick Uganda?
Sam Whitaker:It wasn't so much picking as it was just kind of where it happened. I was at a conference in Singapore at the end of last year, and I met a teacher who had done some work at this school in Uganda, and we started talking, and it was a long process. I mean, it was like six months to really kind of get everybody on board. And then we did a lot of training with the teachers virtually, where we were kind of showing them the tools, and they were starting to get to use it a little bit. And then when we got there, we made sure that all the students were set up with our software, StudyFetch, and made sure that they were ready to go. And when we were there, I mean, there were generator issues, and the internet went down a few times. But it was worth it. And monkeys running through the campus the whole time, which was awesome. I was saying when I landed, I was like, I gotta see a monkey before I go. And my friend, the teacher who came with me, she was like, don't worry, you'll see a monkey. And as soon as we pulled up, well, the school is built into the forest, so there are 2,000 monkeys that live just off of the campus. They just kind of run through and back and forth the entire day. It was amazing. It was so cool.
Matt Kirchner:Yeah, yeah. That is really cool, and certainly a lot different from what we see, of course, here in the United States. First of all, by coincidence, this morning, in our studio here at The TechEd Podcast, our internet went down. It almost never happens anymore, so it's kind of an anomaly here, but somebody had to go reset the router. In this case, we've never had somebody have to go fire up the generator just so we could get our internet working, and certainly not having to dodge the monkeys while they were doing it. So that sounds like a really, really cool experience that you had. Come back and we'll do an episode on DeepSeek and monkeys, right? That would just be fascinating stuff. But for the time being, we're going to stick to this topic of democratization of artificial intelligence. What role do you see AI playing? You know, I teed it up a little bit in the intro, this conference, this meeting I was at yesterday, where somebody saw it as the great democratizer, and especially in terms of high-quality learning. You talked about the importance of learning to use generative AI the right way, learning the right things. I mean, do you see this as really kind of creating an environment where there's less disparity in terms of, you know, whether it's socioeconomic, geographical, political, whatever, access to education? Do you see it having that impact?
Sam Whitaker:100%. It can, if we do it right. Back to that. So, going back to Uganda really quickly: we ended up having about two hours with our software where the students could really just sit down and learn. I found an NVIDIA video, about an hour long, on deep learning and neural networks, which is the basis for all GPU software design. The students had never heard of NVIDIA. They had never heard of deep learning or neural networks, anything like that. About 60% of their education is actually agricultural as well; they're learning to live their lives in their world when they graduate. This was nowhere on their radar. So they had about two hours with our software and this video, and they were tasked with just learning, and they were preparing to give a presentation at the end based on what they learned. And when we got to it, I had no idea what to expect, I really didn't. I was just happy that we were there, and happy that they were introduced to the software and they were kind of learning. And the students, all of them, even the professors, were really kind of timid, a little nervous when we first started, but then as you get into it, it's just a classroom, and the students are getting more excited, and they're laughing and they're joking with each other. And when they gave these presentations, one girl wrote a poem about neural networks. Another wrote an entire short story on deep learning. Another one wrote a song. And then some did just great presentations. They're in no way experts, not at all, but this is advanced stuff, and they can hold a conversation about it now, after two hours. That's what can happen with AI. But again, if we do it right. So if you look at democratization and kind of the haves and have-nots, China, for instance, is mandating AI education for students starting at six years old, starting this year. And I guarantee you, they're not just giving them access to DeepSeek. They're teaching responsible AI use. They're using platforms to teach students how to use AI appropriately so they can build the next generation of DeepSeek engineers. However, in other countries, in the US, for instance, we're being incredibly hesitant with how we roll out student-facing AI to students, and it's born of fear, and I get that completely. But the sad fact of the matter is that if we wait, we lose. We have to act now, and that's where the democratization comes in. We don't want to get to a point where it's the haves and the have-nots. We want to create equality of opportunity across the globe, and the only way we can do that is if everyone is rooted in responsible AI skills.
Matt Kirchner:Everyone is rooted in responsible AI skills, for sure. And I've got a bit of a mantra going through life, Sam, which is that I don't need to understand everything about something in order to be able to use it. In other words, I don't need to understand everything my laptop is doing in order to be able to use it. Same thing with my smartphone, for that matter. Same thing with my smart car anymore; I can use the smart aspects of it without understanding why or how it's doing what it's doing. Autonomous vehicles, same thing. I rode in one not too long ago in Phoenix, and you get in a car and there's no driver. I don't know how it does that. I kind of do, but I don't need to know that in order to get from point A to point B. And yet, you're talking about, and you did it in two hours, it's not like you're creating AI scientists or PhDs in artificial intelligence, but you're talking about teaching deep learning, and you're talking about teaching neural networks. Why is it important to teach some of the framework of artificial intelligence, what it does and how it does it, in order for a student to use it responsibly? Or, I guess, do we have to teach that first of all? And if so, what?
Sam Whitaker:I think we do. I think a basic basis of knowledge is good. To your point, you don't have to understand everything about neural networks in order to use ChatGPT, I agree with that. But a breadth of knowledge is just a good thing to have, and kind of being able to have a conversation and understand where these things are coming from, that's incredibly useful. And it wasn't specifically because we were doing AI that we were talking about AI. It just happened to be a good video that I thought would be good, on a topic I was fairly sure they would know nothing about. That's kind of a baseline.
Matt Kirchner:Yeah. And it proved to you, then, that you could use this platform to teach students a topic that they know nothing about, enough so that in two hours they're writing poems and singing songs and doing presentations about that technology, which is really, really cool. I want to switch gears a little bit and talk about a related topic, for sure: this whole idea of AI literacy, the idea that students are learning how to use AI tools. Really, really important. But let's talk about AI and literacy, putting that word "and" in the middle, using AI as a tool to help students who are behind in such subjects as reading and writing, and maybe, you know, written comprehension or verbal comprehension. Help us understand the landscape of literacy. Let's start here in the United States. I know we've been talking about Uganda, but are students falling behind now in these areas, in terms of reading and writing and so on? And if so, do we see this problem worse in certain communities than others?
Sam Whitaker:They're not falling behind. They are behind. I've been doing a lot of research on literacy recently, and when you really dive into the stats, it's horrifying. Some of the scariest stats are when you look at the prison population: 70% of inmates in the United States read at a fourth-grade level or below. 85% of juvenile offenders can't read. You want to talk about making an impact on society, and then there's a multiplicative impact of education in detention centers in terms of recidivism. So the more inmates are educated, the less likely they are to end up back in prison. It just shows so clearly how education is so important. And if you think about traditional literacy, it is, I mean, it's the basis of being able to be functional in society. You need to be able to fill out forms, you need to be able to read signs, you need to be able to do things. And there are kind of levels of literacy. We don't really talk about grade levels anymore. We talk about functional literacy, and then kind of meeting expectations, and then exceptional literacy. So someone may be able to read a newspaper, and just because they can understand all the words, that doesn't mean that they understood the impact of what is being said. That's comprehension. There's a distinct difference there. And then you talk about functional literacy, things like banking, things like healthcare. And now we're introducing AI, and AI is going to be, it already is, and it's going to be even more of, a required skill for success in the world. Also, traditional literacy is something that's tangible in districts and with policymakers, whereas AI still isn't really quite there. So what we're trying to do is help to solve the traditional literacy problem here, starting in the United States, while also teaching AI literacy at the same time, so tackling two of the many, many pillars that need to be hit.
Matt Kirchner:Got it. Yeah, years and years ago, and I mean years and years ago, when I was in Scouting, one of my friends was working on his Eagle project. Of course, to be an Eagle Scout, as many people know, maybe most people know, you have to do a service project, right? You organize it, you plan it, you lead it, you do the post-mortem and talk about the impact of it, and so on. And one of my friends, his project was to paint a literacy center in Milwaukee. So we went into, it wasn't quite the central city, but the near north side of Milwaukee, we would call it. And we went in, and we spent a weekend just painting this literacy center and kind of beautifying it. And it was my first introduction, growing up in a, you know, middle-class suburb, to the idea that there were a lot of people in my community, or the greater community in Milwaukee, who didn't know how to read. And it was really fascinating to me, and not in a good way, this whole idea of, how do you function, how do you learn, how do you do a job, how do you gain knowledge, especially back then, when tools like smartphones and other ways that we have to learn weren't readily available. If you wanted to learn, you went to a lecture, you watched a TV show, or you read a book. That was how you learned. And so that was a real eye-opener for me. And yet, even in this day and age, when it's, you know, 30 or 40 years later, there are still tons and tons of people who don't read at the level they could, or some people, you know, hardly read at all. And to your point, it has a huge impact on recidivism. We spend a lot of time in corrections, and the more educated somebody is, the more they can come out and have a skill, can get a job that helps pay the bills, the less likely they are to reoffend. And it's exponential. So huge impact on the literacy side. Certainly talking about artificial intelligence is kind of another chapter of that, right? In the same way that if somebody couldn't read, or can't read now, life is going to be really, really tough for them, if they don't have a command of artificial intelligence, probably the same thing coming at them. So I get the two of those separately. Let's talk about the two of those together. So artificial intelligence: is it a tool to help with reading, with comprehension, to help improve learning? Talk about that.
Sam Whitaker:Without a doubt, and not just, when you're talking about demographics, for, you know, areas where literacy is particularly low; there are also students across demographics. A friend of mine, she's actually the Secretary of Education of Oklahoma, she likes to call them the jagged-edge pieces. When we're talking about the puzzle that makes up the United States of America, we're all jagged edges in some way, and we're figuring it out together. Some students are more jagged-edged, and we call them neurodivergent, or we call them on a spectrum of some kind. There are so many that are failed across demographics in education, and when we talk about how AI can affect that, it's about personalization. It's about meeting students where they are and also when they are. So AI having the ability, for instance, in the literacy modules we're building now, we're building out ways to not only assess where a student is now, but as a student starts to read a little bit more and starts to get a little bit better, challenge them a little bit. And then if they're having a little bit of trouble, then you back off a little bit, and then you challenge them again. And slowly that inches up. But that process is different for every student, and in our education system, we try to fit every student into this box. And if you're in this box, and you do it well, you're smart. If you're not in that box, you're stupid. And even though we don't say that, that's the impression kids get so early on, and by second or third grade, so many just give up because they think, I'm dumb, I'm never gonna do this.
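To make the adaptive loop Sam describes concrete, here is a minimal, hypothetical sketch in Python: assess the reader's last session, raise the difficulty a little after strong comprehension, back it off after a struggle, and hold steady in between. The class name, thresholds, and step sizes (LiteracyState, 0.8, 0.5) are illustrative assumptions, not StudyFetch's actual literacy module.

```python
from dataclasses import dataclass

@dataclass
class LiteracyState:
    level: float = 3.0      # hypothetical reading level (grade-level equivalent)
    step_up: float = 0.2    # how much to raise difficulty after success
    step_down: float = 0.3  # how much to ease difficulty after struggle

def next_passage_level(state: LiteracyState, comprehension_score: float) -> float:
    """Return the difficulty of the next passage given the last comprehension score (0-1)."""
    if comprehension_score >= 0.8:      # student is succeeding: challenge a little more
        state.level += state.step_up
    elif comprehension_score < 0.5:     # student is struggling: back off a little
        state.level = max(1.0, state.level - state.step_down)
    # scores in between hold the level steady so the student can consolidate
    return round(state.level, 1)

# Example: strong and weak sessions alternate, and the level slowly inches up,
# which mirrors the personalized pacing Sam describes.
if __name__ == "__main__":
    student = LiteracyState()
    for score in [0.9, 0.85, 0.4, 0.7, 0.9, 0.95]:
        print(next_passage_level(student, score))
```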
Matt Kirchner:No, you're exactly right, and we talk about that a lot on this podcast. Our audience is probably sick of me telling them about my journey through grade school and middle school and high school, and how I just learned differently. I couldn't sit still. I did fine at school. I mean, I had to put the effort in and get through it, and I did. But there were so much better ways for me to learn than the traditional education model of sitting in a classroom and listening to somebody lecture. And look, there are a lot of students that learn great that way. I know a lot of them in my own life that are really good at just listening, taking notes, and capturing all the knowledge. Great for them. Not so great for the other, whatever percentage of students it is, that don't exactly learn that way, and we try to fit everybody into this exact same model of education. I agree with you that artificial intelligence is the great opportunity creator in terms of changing the way that students learn. I agree that we're going to get away, if we haven't already, from grading people by what grade they're in. It's really: do you have the competency, do you have the ability, or do you not? And if you don't have it, you're not gonna be punished for that. Let's figure out how we can instill that competency in you, if it's important to you and it's important to society. And if you do have it, there's no reason for you to sit next to the student that doesn't yet and have to wait for that student, because it's holding other students back. And by the way, those same two students, if you're in a different class, may be in different positions, where the student that maybe struggled a little bit more in an English class goes into a technical education or STEM class and excels, where the student excelling in English maybe struggles a little bit more there, just as one example. So it's not saying good student, bad student. It's saying we're all different. It's saying we all have different interests, we all have different ways of learning. And AI really creates an opportunity for us to customize and personalize learning in a way that we've never, ever done before. That's one of the great opportunities that comes from generative AI and AI in general. There are some risks that come along with it, and a lot of educators, there are some that are saying, look, you know, we need to embrace this, we need to find ways to use it. There are other educators that are saying, look at ChatGPT, it's just doing the work for the students. So how do we redesign the experience in the classroom? Right? Because our classrooms are set up for a sage-on-the-stage, one-size-fits-all, lockstep learning model, and now we're talking about having 30 students in a classroom, 30 different modes of learning and ways of learning. How do we prepare the classroom for the new way of teaching and learning in the age of artificial intelligence?
Sam Whitaker:More than anything, we need to get started, and we need to start now. We have to get student-facing AI into classrooms now. That's why we're starting where we are; I was talking about literacy being tangible. With so much fear around AI and so much hesitance, what we're doing is tackling something that is tangible, that is a crisis, and that people realize is a crisis. And to that end, we're actually going to be offering it, we're spinning it off, as a completely separate product, separate from our core StudyFetch, and we're going to be offering it as close to free as we possibly can. In fact, we're working with some companies and foundations to offer it for free. We're spinning it off because it's not a profit center. We believe it's a necessity in society. And to that end, we're going to be working with any competitors, any experts, anyone out there who's listening now or who, you know, is reading the transcript. If you want to help and get involved, we'll bring you on. We don't care who you are, if you help us get to a solution. But that's where we have to start, with those steps. We have to start with getting it in front of students, and we have to be successful solving a tangible issue, and then it grows from there. And we have to figure out different ways to evaluate students. It can't be the typical, you know, go home and write this essay. It's too easy to say, hey, ChatGPT, write something about The Catcher in the Rye that sounds like me, and it's done. In-class assessments, understanding and real retention, having discussions, almost kind of mirroring, like, a dissertation in college, but at much younger levels.
Matt Kirchner:You know, one of the most fascinating things I saw on TikTok was basically showing students how they can use software that's been created to help people with Lou Gehrig's disease, or ALS, speak. So it's basically language-generating software for people that can no longer communicate on their own, and they're basically using that software, combining it with generative AI like ChatGPT, having content written by ChatGPT, filtering it through the language software that actually makes the words sound like the student themselves, right, in the way that they would speak. And then you think about a poor teacher that's trying to police the use of generative AI outside of the classroom. It's like, that's a fool's errand. The students are in some ways going to be so far ahead of teachers that are trying to catch cheating that we're really going to have to flip the education model. And you know, I used to say, we used to say, that school is the place we went to learn, and home was the place we went to practice. You'd go to school to gain information, to sit in the class, to have the students, the teacher, rather, download their wisdom to you. Now we go into this world of the future using artificial intelligence, and we're not necessarily going to be learning at school and going home to do homework and practice. We're going to be maybe learning outside the classroom and then going back into the classroom to do the practicing and to do the work and to talk about the meaning of what we've read, or to apply what we've read, or to do, in this case, you know, some of the songs and the stories and the poems that came from this work you did in Uganda. Do you see it the same way? I mean, are we unleashing creativity in ways that we haven't been able to in the past? And is that world of education going to flip in that way, that we're actually going to class to basically experience and to perform and to practice what we learn and to perfect what we learn in ways that we never have in the past?
Sam Whitaker:So I always say with AI, we kind of mentioned spectrums earlier, I think AI is absolutely not a spectrum. AI is going to be extremes. It's going to be great, or it's going to be awful. I don't really see how you have an in-between, because if it's great, if we teach students appropriate use early on, and we teach them to leverage AI to expand and make them more productive and make them more creative, then I can't even imagine the amazing impacts on society. But if we don't, you're talking about the death of creativity. You're talking about the death of critical thinking skills. I mean, you've seen the movie WALL-E, where all of the humans are just on these hovercraft wheelchairs, and they just get scooted around, and they watch shows and they eat food, and that's it. Sounds incredibly dystopian, but it could happen. I mean, AI is going to be able to do everything for us. We may almost even have to choose to work. Think about it: where we don't have to, AI can take care of all the necessities, and the only way we're going to choose to work is if we're doing something we love. And that's another area: AI can start early and identify things kids love and things they're good at, and make it a part of their learning journey. Identify things kids are good at and they enjoy early on, make that a part of the entire educational journey, and that can lead into apprenticeships. When did vo-tech schools become kind of a bad word in the United States? Somehow they got that stigma, that that was where the dumb kids went. Not every kid needs to go to a four-year college, necessarily, but some kind of apprenticeship path, where students, even students who are on, like, a PhD path for machine learning and they're going to work for NVIDIA someday, that doesn't mean an apprenticeship along the way can't be super valuable, and even more valuable than a classroom experience, perhaps.
Matt Kirchner:Absolutely, depending on the student, without question. And we're a huge fan of our technical and community colleges across the United States of America. They are in vogue again. They never should not have been, because the truth of the matter is, there's huge pride and incredible opportunities for careers that can be born out of our technical colleges. But you and I agree 100%. You know, we start to think about, and this is really something that I've been locked in on for a while, I don't know, and I don't mean to put you on the spot, but have you read Genesis yet, Henry Kissinger's book? He wrote it, and they published it posthumously. You'd love it. You've got to read this book. We'll link it up in the show notes. I've talked about it probably five or six times on the podcast. There's a whole section, Sam, in that book about two things that you just mentioned. Number one, think about a world in which we don't have to work. Think about a world in which your food is farmed for you, created for you, prepared for you. Your house is heated using advancements in energy that require very few resources. You don't have to drive anywhere. Basically, your lawn is mowed, your kids' diapers are changed for you, using robotics and automation, where, literally, if you don't want to, you can just not work. You can just sit and do nothing, and it doesn't really cost you much, depending on how all that plays together. That's a really, really different world from the world in which we live. And they go into some of the concerns about, you know, we get a fair amount of purpose from the work that we do, whether, you know, some people love their jobs, like me, some people are so-so on theirs, whatever; there's a purpose that comes from going to work every day that goes away when we don't have to do that. They go into this whole idea of what happens if we let AI go unguarded, right? There are no guardrails, and we create these artificially intelligent technologies that start to go to war with each other, right? So I've got one AI and another AI, and one decides that the other one is moving in on its space, and so it decides to attack it, and it decides to take over a weapon system, where it decides to destroy the other AI's data center. I mean, this is really, you used the word dystopian, and I don't mean to go too far down that dark path, but you and I agree. I mean, if we don't think about how artificial intelligence gets deployed, if we don't have some guardrails around the ethics and the uses, and training people on how to use it and not to use it, things can end really badly. And on the other hand, it has a tremendous ability to solve problems in healthcare and improve our quality of life and give people opportunities that they've never had before. But it really is, I mean, the way that you put it, both in terms of the value of work and what AI is capable of doing, in these two extremes, you and I are on the same page with regard to that. And so I think it's going to be really interesting to see how it plays out, but also really, really important to make sure that we're utilizing AI and generating public policy and thinking about how we regulate and don't regulate it, because I think overregulation of artificial intelligence can be a real problem as well. But reflecting on all that a little bit, are you seeing it the same way?
Sam Whitaker:In many ways, yes. So I'll go back to Uganda, because that's kind of where we started. The founder of the school there, his name is Dr. Michelle, and he's one of those guys you meet where you're just like, wow, I'm really glad I got to spend some of this man's time with him. He's incredibly inspirational. You know, he's got a great backstory. He exposed some corruption early in his career, from people who were stealing money from a nonprofit. He exposed it, there was an incident with a grenade thrown into his house and shrapnel, and then he ended up getting the grant, and he founded ARU and URDT. Just an amazing person. And we were having a conversation, and he was talking about a lot of these issues. He's not a religious man, but he was talking about another Genesis, right, God creating man in His own image. And then he was talking about, you know, humanoid robots, and then we have AI that can reason and think, and that's going to go into these robots. And at what point are you blurring the lines between what's human, what isn't human, what has certain rights, what doesn't have certain rights? When you talk about safeguards and guardrails, I agree with you completely. And the true dystopian stuff, the T-1000, Skynet, you know, all of that, if that's gonna happen, there's probably not a whole lot I can do about it. So I'm gonna focus on the stuff I can do something about. To me, the more insidious way AI takes over is us letting it happen, right? Just becoming complacent and becoming lazy, lazier, lazy to the nth degree, where AI is doing everything and we're not doing anything. And that's where, back to what we're talking about, that's where we have to step in, and we have to step in now.
Matt Kirchner:So let's talk about that. You know, I think you actually dovetailed that right into the next topic that I wanted to cover. We've talked about some of the challenges with AI. I agree on the whole idea that once we're creating humanoid robots that have the ability to think and reason like humans, but maybe don't have the, I believe, innate ethical tendencies that I think are endemic in most people, and they just go rogue, that's something to worry about. All right, so let's bring this closer to home for our teachers. We have a lot of educators that listen to this podcast, and we're proud to have them along every week. And I know the reason that a lot of them listen is because they want to figure out, okay, what am I supposed to do about this? So, you know, I get that we should be worried. I get that we have to do it the right way. You talk about the idea that the issue in education isn't the tool, it's not the AI itself; as you've already referenced, it's how we introduce it in the classroom. If I'm a teacher and I'm listening to this podcast, and I'm saying, okay, I get it, there's risk here, but what do you want me to do, specifically, if possible? What is it that we want them to do? What's the call to action for an instructor, a teacher, a professor in education?
Sam Whitaker:There's a phrase, and I think it was mainly associated with the early days of Facebook: move fast and break stuff. I hate that phrase. I really do. I think it just pushes thought away, it pushes preparation away, and it gives people an excuse to just throw something out there without really thinking about the ramifications. Having said that, we kind of have to do it a little bit. We have to be as thoughtful as we can be, we have to be as purposeful as we can be, but we can't do that to the point where we're waiting too long for a perfect solution. It's another, I'll keep up with the phrases: a good plan today is better than a perfect plan tomorrow. We have to get some stuff into the hands of students and figure out if it works. And so from a teacher's perspective, I understand the fear, I understand the hesitance, all of those things, but you're doing your students a disservice if you're not teaching them appropriate use of AI. And there has to be a constant feedback loop between teachers, between researchers, and between industry, and we have to figure out what's working, what isn't working, fix it, try something else, come back, try it again. These traditional methods of research, where it's a year-long study, followed by six months of peer review, followed by finally publishing, and then maybe a few years from that you see the results: the MIT study I was talking about, for instance, the lead researcher specifically said in the release, I intentionally did not submit this for peer review. I have now, but I wanted to publish it first, because I believe the findings are so important right now. Teachers have to be a part of that, and industry has to welcome teacher input, teacher involvement across the board. And the ones who do, and I can say that we do, we answer every single teacher who reaches out to us on social media, we answer anyone who emails us, and we take their thoughts, their concerns, and their suggestions under advisement, and we put them in the platform as quickly as we can, because teachers do know best.
Matt Kirchner:Yeah, in many ways they do, for sure, and that's what drives continuous improvement: listening to the voice of the customer, listening to the voice of the user, building improvement into the model, moving quickly in many cases. You know, I think about it when you juxtapose education versus industry. I spent the first 25-plus years of my career outside of the world of education. The last 10, I've been inside, or at least tangentially inside, education, spending much, much more time with educators than I ever have in the past. To be honest with you, in industry, we didn't peer review anything. I mean, you came up with a good idea, you didn't hand that over to a group of experts to tell you it was a good idea. You implemented it, and you saw if it worked. And, you know, I used to say, especially in small companies, what I loved about being in small to mid-sized businesses: I could come up with a change on Monday, implement it on Wednesday, and I could see the results on Friday, and if I didn't like the results on Friday, on Monday I could start that whole process over again, right? And you compare that to, in some cases, the several-years-long process in academia and in some areas of research that it takes to be able to advance the technology. This is moving so fast. I mean, think about where we were, you know, with generative AI just a year or two ago, and where we are today. If you're waiting two years to study what GPT-3 looked like to figure out what we should be doing today, you're just never, ever, ever going to keep up. So I think it's really, really good advice for our higher educators, the idea of getting started now, doing it in a responsible way, educating themselves, and for our educators in maybe, you know, K-8 and K-12, super important. What about students? So I'm a student listening to this podcast. How should my teachers be kind of making sure that I don't get lost and kind of lose my creativity and that whole idea of wonder and joy in education? So what advice would you have for a teacher and for a student to make sure that we don't lose what can be really, really special about the educational experience, which is really opening our eyes to the magic of what's out there and not getting too focused on somebody else's version of reality?
Sam Whitaker:Man, it's tough for students, because I think back to when I was a student, you know, 10 years old, 12 years old, 14 years old, and you have adults who say things to you all the time, like, trust me, you'll regret that when you're older. Fine, let me just go back to watching TV, right? Yeah, you really should probably start working out now; your older self will thank you for that.
Matt Kirchner:Oh, for me, it was like, yeah, you think sixth grade is hard? Just wait till you get to seventh grade. And boy, you have no idea what's coming when you get to high school. And then you get there, and it's like, nah, this isn't so bad.
Sam Whitaker:It's hard, imagining saying it to myself now, when it's the easiest thing in the world to use ChatGPT to do your work for you. I know a kid who graduated from a top-tier university, and when I say top tier, I mean, like, top 10. Just graduated this past May. He got a job, is going to be out in the workforce, or probably is out in the workforce now. He did not complete an assignment himself for his last two years; he used ChatGPT for every single assignment. And what did he learn? Right? What did he learn? But he got the diploma. It's really hard to say that to a student when it's so easy. And I say it absolutely, and if it hits anybody, if anybody hears it and says, okay, you're right, I'm really going to do this, I'm going to kind of look for ways that I can learn appropriately, great. But there's a reason that we educate and that we protect children as much as we can until they reach an age where we've deemed that they're old enough, just like the driver's ed analogy, right? I think for students: keep being curious, but also get out there, show up other places. Don't use just the AI. Build your interpersonal skills, because they're so important. But it's really incumbent on the rest of us to make sure that students have a safe way to learn.
Matt Kirchner:And the truth is, for a student, you know, yeah, I was always a path-of-least-resistance guy myself, right? I mean, why do extra work? We bang it into our heads in manufacturing: why do more work than we should to get to the end, right? That's why we call it overproduction or overprocessing in manufacturing; if you're putting more effort in than you need to, that's just a form of waste. I get why a student would use generative AI to complete their coursework, if that's what gets them to the diploma, and the diploma is their goal. There's really no differentiation in that, though, right? Any student that can speak the right prompt into ChatGPT can then complete that assignment. Where's the differentiation in being a student and being a human being? So, certainly I understand that aspect of it, both sides of it, right? Both taking the path of least resistance, because that's what I always did in education, but also recognizing that you're probably missing an opportunity to learn when we do that. And then the greater message, I think, is to education in general, which is: let's create an environment where that isn't what it takes to get a diploma, and let's have people learning throughout their journey. And I know you learned a ton through education, but you've talked already about some of the things you believe about education, personalized learning, those kinds of things. Are there other things you believe about education that would surprise other people, or that might be a little bit outside the mainstream?
Sam Whitaker:So honestly, I think something that would be outside of the mainstream, especially based on a lot of the talk, if you read a lot of the headlines about AI and education: I don't think we're screwed. That's the bottom line. I think there's absolutely hope. I believe there's hope. I think we have a chance to turn around, not only turn around what we're doing in AI, but turn around a system that's been failing students for 60-plus years. As many times as we talk about the dangers of AI, I believe wholeheartedly that if people like you and people like me, and many people that I know, keep pushing forward, keep having these conversations, keep doing the right thing in our businesses, we're going to see a brighter future. We're going to see better education. We're going to see more equality of opportunity. We're going to see equality of access and all these wonderful things that we want. I truly believe it's going to happen.
Matt Kirchner:Yeah, I believe 100% everything you're saying is so aligned with my view on education. I'm always careful also, by the way, inasmuch as we have so many educators that listen to the podcast, when I criticize, and I won't speak for you, but you can tell me if you disagree, when you criticize the model of education that's been around for 60 years, in many cases our model hasn't really changed much since shortly after World War Two. Believe it or not, if you look at most American classrooms, they look about the same as they did 60 and 70 years ago. That's certainly not to criticize the great teachers that we have across the United States of America, who I know are going in for all the right reasons and imparting wisdom and preparing students for life and doing incredible work, and in many cases not getting anywhere near the gratitude that they deserve. So we're always careful to make sure that criticism of the model of education is not necessarily criticism of the educator. In fact, a lot of times I think the model that we've created has held a lot of great educators back. So huge optimism for the future, as you have for the future of education. I appreciate the way that you answered that question. One final question for Sam Whitaker; it's a question we love asking all our guests here on The TechEd Podcast. You turned the clock back a moment ago, and I listened intently when you said 10, 12, or 14 years old, kind of thinking of yourself as a student. We're going to click the clock one year forward, to the age of 15. You're a sophomore in high school. That young Sam Whitaker has his entire life ahead of him. And if you could give that young man one piece of advice, what would it be?
Sam Whitaker:Just show up. Go to conferences, join clubs, get out in the world. I promise you, you will never regret the reel you didn't watch or the meme you didn't share, but so many good things happen when you just show up, when you get out there, when you have conversations, when you meet people. You never know where you're going to end up. And while it's a concern about AI that people will stop showing up, the more we do, the more AI is just going to enhance things. It's going to enhance the rest of our life. So for my 15-year-old self, I was generally okay about it. You know, I did a lot of things, but I wish I did more. I wish I did it all, right? I try to live that way now. So I talk to students all the time, and that's the one thing I say: just show up. Just do it.
Matt Kirchner:I've got a really good friend who talks about, we'll rest in the next life; that's his line. I'm like, I love that one. He's like, let's just get it done now. Let's get out and do as much as we possibly can. The truth is, if we use AI right, if we leverage it, if we use it responsibly, that's going to give us more opportunities to interact with each other and to have those social opportunities, opportunities to learn, opportunities to engage in person, as opposed to just flipping a thumb on the screen of our smartphones. Totally agree with you. Great, great advice: just show up. We're really happy, by the way, that Sam Whitaker showed up for this episode of The TechEd Podcast, one more example of that. We love meeting fascinating people, and he certainly meets that definition. So, Sam, we can't thank you enough. Sam Whitaker, Director of Social Impact at StudyFetch, thanks for being with us on The TechEd Podcast.
Sam Whitaker:Thank you so much. It was a pleasure.
Matt Kirchner:Terrific conversation with Sam Whitaker, talking about how artificial intelligence is transforming our entire economy, talking about how it's transforming different parts of the globe, and how areas like Uganda are doing things maybe in some ways the same and in other ways differently than we are here in the United States of America. The importance of literacy, the importance of guardrails, the importance of educators bringing AI confidently into the classroom, but doing it in a way that is responsible, that prepares our students for the future, that prepares them for the age of artificial intelligence, where technology can do so much for us and it's still so very important to show up. We talked about a lot of different studies, we talked about different articles, we talked about some books, we talked about some resources. We are going to link those all up in the show notes for our audience. We do have the best show notes in the business; people tell us that all the time. So if you heard something you want to learn more about, rest assured, it's going to show up as a link in those show notes. You will find those at TechEdPodcast.com/whitaker, that is TechEdPodcast.com/w-h-i-t-a-k-e-r. When you're done with that, and right before you show up doing something in person, check us out on social media. We are on Facebook, we are on Instagram, we are on TikTok, we are on LinkedIn. Wherever you go to consume your social media, you will find The TechEd Podcast. While you're there, reach out and say hello. We would love to hear from you. And in the same vein, we would love to see you again next week on The TechEd Podcast. Until then, I am your host, Matt Kirchner. Thank you for being with us.