aiEDU Studios
aiEDU Studios is a podcast from the team at The AI Education Project.
Each week, a new guest joins us for a deep-dive discussion about the ever-changing world of AI, technology, K-12 education, and other topics that will impact the next generation of the American workforce and social fabric.
Learn more about aiEDU at https://www.aiEDU.org
Pat Yongpradit: Coding isn't dead, it's evolving
What if AI didn’t kill coding but made learning it more meaningful?
We sat down with Pat Yongpradit (former Chief Academic Officer at Code.org) to unpack why computer science still matters, how AI is reshaping classroom practices, and what real AI literacy looks like for students and teachers.
We also zoomed out to note that coding is only one piece of computer science, which also spans data science, cybersecurity, and the social and ethical impacts of technology.
We also talk about jobs, nuance, and the future of post-secondary education. BLS projections still show growth in software engineering even as “programming” tasks automate, which shifts demand toward systems-based thinking, UX, and human-centered design. Pat makes the case for a blended future where a renewed liberal arts education integrates technical fluency with ethics, communication, and problem-solving in order to prepare students for AI-infused roles across every sector.
If you care about teaching, learning, or the future of work, our conversation with Pat delivers a clear message — keep the human at the center, use AI as a partner, and give students the agency to build, critique, and create.
Alex Kotran: It's so funny to have you on as a guest, Pat, because we've been on so many calls. I kind of feel like this is just another Zoom call. But I'm trying to get my game face on and approach this with the respect and, uh, gravity it deserves, because we have Pat Yongpradit on, Chief Academic Officer at Code.org, one of the largest, maybe the largest, organizations advancing computer science for all students, and increasingly an organization that is part of the now-cacophonous conversation about what the future of education looks like, and what the future of computer science looks like, now that we have this kind of wild technology that can do stuff. Pat, thank you for coming on. And before we dive into it, maybe you could just share your own journey? Because you've been at Code.org for actually quite some time and you've seen a lot. This is definitely not your first rodeo.
Pat Yongpradit: How long do you want me to speak about my journey? How about two minutes and thirty seconds? I first learned computer science in fifth grade on an Apple IIe, and I learned Logo. It happened to be the best class I'd ever taken in my entire life. It was three trimesters: Logo programming, slot cars, including building the slot cars, and chess. And this was at a private independent school, and they were allowed to do stuff like that. It was amazing. Then I went to high school and learned computer science as well, at a special magnet program. But you know what? I didn't have any software tech people around me. No one even thought to tell me, "Hey Pat, this might be a thing to do as a career." So I got a neuroscience degree in college, and while I was getting it, I realized I didn't want to be a researcher. I started tutoring in a special ed program in a neighborhood high school, loved it, and decided, hey, I'll redo my degree, get an education degree, and be a teacher. My parents said, "That's crazy. Finish your degree and get a certification in teaching or whatever. Just finish your degree." So I did, and then I finished my master's and a certification, and then I taught for thirteen years: special education, science, and computer science, middle school and high school. And honestly, Alex, I'm still a teacher. I could blink and all of this would be gone, and I'd be back at my desk in the classroom. That's reality; this is all just a dream kind of situation. And I contracted for a variety of organizations during summers, and even during the school year for the last three to five years of my teaching career, so I didn't take the summers off. I haven't taken a summer off since I was a kid, a high school kid or something, because you've got to make money as a teacher.
And then I got injected into the national computer science education community, because there were very few of us at the time. Code.org was starting, and there were very few people who could do what I did. Boom: I'm at Code.org leading curriculum development, professional learning, school outreach, dabbling in government affairs and international development, and now leading the TeachAI initiative, you know, with folks like you. And it's busier than ever.
Alex Kotran: There you go. It really is busier than ever. It's been a couple of years now since the AI zeitgeist was really thrust upon us. I know you all were tracking AI, and even had content about AI, way before. So, where are you in this journey? Are you in denial? Are you excited? Nervous?
Pat Yongpradit: So I'm naturally a skeptic. Someone once asked me, "Hey Pat, you don't believe in any conspiracies, do you?" I'm like, no, I totally don't. And because I'm naturally a skeptic, Alex, I constantly question myself: is this stuff for real? Does it really matter? Am I spending my time wisely on this AI stuff? And every single time I ask myself that, and go through the process of reflecting on what I'm reading and what I'm projecting, I think, no, this is my best bet, at least for changing the education system and providing opportunity that's relevant for kids. So yeah, I'll continue doing it. That's kind of where I'm at. And I'm investing everything I can, at speed, because I know there are kids out there right now who really need this and aren't getting the opportunity to learn what they need to learn.
Alex Kotran: What have you seen that convinced you this is not just another hype cycle?
Pat Yongpradit: Oh, hmm, yes. So what is the evidence, or even just the observations? I'll start with my own personal experience with AI, which right now, with generative AI, chatbots, LLMs, often amounts to it helping me think through stuff and write stuff. And I find myself extending my cognitive abilities, for example moving from a writing perspective to an editing perspective, but I also see myself over-relying, and certain skills atrophying. That's personal, anecdotal evidence that tells me: gosh, if I, an adult who is very aware of this stuff, am dealing with some of the negative consequences, like over-reliance (I haven't gotten into the loss-of-critical-thinking stuff, but over-reliance for sure), then kids, without guidance, are for sure going to be affected negatively overall by all this AI stuff. And so I've got to do something about it.
unknown: Right.
Alex Kotran: Because it's being thrust on them, with or without organizations like Code.org or aiEDU. And the question is how we make sure there's a good outcome. Do you mind digging into the skills atrophy? Because I feel it as well. I find myself instinctively, as I'm trying to write a complicated email, instead of just powering through, thinking, oh, I'll just pull up the AI slot machine and see what it comes up with. It's often not very good, but the fact that that's my instinct is kind of weird. I've been self-conscious about that. Wait, what does that look like for you?
Pat Yongpradit: Here's the thing. I don't think I've fallen into this, but I'm constantly tempted to. And honestly, it doesn't even have to do with chatbots, because we've had smart compose and stuff like that for a while anyway. So it doesn't matter whether it's an application or a feature inside a current application I've been using: I feel tempted to not express myself authentically and just take this well-worded version of what I wanted to say and use that. And I'm constantly battling between two things. Ooh, this actually sounds more sophisticated, or it just sounds clearer. But honestly: clearer at 100%, or where I was already at 90% but with authenticity? I'd rather take the 90% with authenticity than the 100% artificial. So that's a constant temptation.
Alex Kotran: Yeah. Michelle Culver had this Substack post, and it wasn't very long, but I'm actually still thinking about it right now. She asked: would you rather get a birthday message from a friend, maybe even a close friend, that's just very short, like "happy birthday, smiley face emoji," maybe "happy birthday BB, smiley face emoji," clearly just shot off while in transit or something? Or would you rather get this eloquent, poetic, really long text written by AI? It's clearly written by AI; it's not even being misrepresented.
Pat Yongpradit: Yeah, obviously I know where you're going with this. And the answer for most people would be... I would love to see poll results on that question. I couldn't imagine even 2% of people wanting the long AI version. But for me... oh, you think that's interesting? Okay, okay, good, keep going. Well, I'm just thinking that's an extreme example. The more realistic example is that someone got some help and it was 70% AI and 30% them. Would I want that over the other thing? Even then, I don't think I would. Because in that context, in that situation, it's the thought that we're optimizing for, right? The thought that counts. It's not the product, because it's a personal product; it's not going to be shown anywhere, and it's a very simple message, you know? Happy birthday.
Alex Kotran: Yeah, do you need a poem? I certainly don't need a poem from my few... I don't have that many friends, but if you're listening, you don't need to send me a poem. So, a hot topic in AI is the future of computer science. And I think that's because you have the CEOs of some of the biggest AI companies talking about their perspective that computer science is going to become a lot more accessible and democratized, and some are going so far as to say there won't be computer science jobs anymore, or there'll be a lot fewer. It's hard to read through that, because obviously if you're the CEO of a frontier model company, your primary goal is raising capital to train bigger and better models. So it benefits you to stoke this hype: oh, these models are going to be so powerful, so incredible, that we're not gonna need... So I read all that with a grain of salt. But whether they are right or wrong or hyperbolic, it is causing folks in our space, in education, to ask this question. How are you thinking about it? Do you have a perspective, or are you still trying to learn?
Pat Yongpradit: No, I have a very deep perspective. You know, the TeachAI initiative has a whole project called Guidance on the Future of CS Education in the Age of AI. (I did know that; I was teeing it up. Yeah.) And it touches on this very subject by not being afraid to ask the hard questions. When I talk about the future of computer science education to general education crowds, I let them know that the questions we're asking are the same questions they should be asking. For example: what AI experiences should students have? Really it's every student, not just computer science students, but at the very least, what AI experiences should computer science students have? What are the foundational AI experiences for the field? That's going to differ for English, social studies, math, et cetera, but you've got to ask that question. You've also got to ask: is learning to blank still important? In the case of computer science, is learning to program still important? But also, is learning to write still important in English? Is learning to draw, or paint, still important in art class? So we're asking these questions, and the answers are quite nuanced, because these people at their level simplify things for headlines, for funding, whatever. And the truth is that the biggest difference between what they're talking about and what we're talking about is that they're talking about industry and the workforce; that's their perspective. We're talking about education. Ultimately, we learn lots of stuff in school that we have machines doing for us when we're adults, and it's still valuable, so we can start with that. But I have a whole list of reasons why kids should still learn to code, in an evolved way that utilizes AI.
Alex Kotran: Yeah, this is something we've been thinking about. You spend years just learning and writing code, and then you get onto the job, and that's where you're often learning to work with customers or team members and building the soft skills, the durable skills, whatever you want to call them. If AI is allowing us to spend less time learning and writing code, then perhaps, even in the process of learning computer science, you're actually building some of those skills before you're thrust into either the workplace or college.
Pat Yongpradit: Yes. Let me give you my top reasons why computer science is so important. Let's start with this: when these people are talking about computer science, they're mainly talking about coding or programming. A lot of them actually just straight up say coding: you don't need to learn to code. They don't say don't learn computer science; they say you don't need to learn to code. Or if they're saying computer science, they actually just mean coding and programming. The fact is that computer science education covers much more than just coding. It covers AI, it covers data science, it covers social and ethical impacts, it covers how these machines are even built, and lots of other topics as well, like cybersecurity. So saying you don't need to learn computer science because AI can code only addresses a subset of what we consider computer science education. Second, even if computer science in education were just coding and AI could do a perfect job of it, the purpose of learning things in school is not just the product; it's also the process and what students learn in that process. There are very few activities you can do as a student that let you examine your thinking as much as coding, because you're putting a very elaborate process of thinking into a medium that you can examine, and, you know, debug, and then refine. And that's beautiful. It's a wonderful metacognitive experience that all kids should have, and frankly, a lot of them don't. So that's yet another reason why learning to code is still important. And I'll add one more, just for fun. The software engineering industry is probably the industry that has embraced AI tools the most. GitHub surveys year after year show that more than 90% of software engineers are using AI tools.
And you don't find that in any other field. So what's beautiful is that students in computer science class can start getting familiar with the effective use of AI tools in their high school experience, and almost transfer that experience to using AI in other aspects of their life: learning to evaluate outputs, learning to refine outputs, learning when to use the assistant and when not to.
Alex Kotran: This question of how we get folks to a place where they can use AI really effectively at work is something I've also been ruminating on quite a lot. On our team, and I don't know if this is the case at Code.org, we've realized it's actually a lot more complicated than just saying, hey folks, we want you to really use AI tools and find ways to harness them, not just to be more productive but to augment your work and unlock your creativity. In practice there are a lot of small tactical things, like which tool you use. Sometimes tools that have built-in AI are not using the latest and greatest reasoning model. Sometimes they have pre-prompting. And sometimes you don't want your team to use AI at all; sometimes that human, authentic component matters, especially if you're interacting with schools or with partners. You know, I wouldn't deign to send you an AI-generated email, Pat. Now, if you have sent me one, you're very good, because I pride myself on being able to catch stuff like that, and I feel like your emails are always very clearly human. But just to give you some examples: there's so much nuance in this. It's an art, not a science. There's no algorithm you can teach, like "use AI in this case," no decision tree. Are you finding the same thing? Is Code.org doing internal trainings, or are you kind of just letting folks feel it out?
Pat Yongpradit: No. Well, sorry, the "no" is that we are not just letting people feel it out. Our people operations team has done a number of things, including giving us a stipend of $500 a year to explore a variety of tools, whatever $500 can buy, and has set up opportunities for people to share what they're learning from their experience with their peers. Thank goodness, right? We're walking the walk that we're talking. Now, I can't tell you that we have some kind of AI principles, principles for the use of AI at Code.org; we don't have anything as explicit as that. Even our curriculum development process right now, it's not baked in at a technical platform level. Folks are using these other apps and doing what they're doing, but it's not like we have an actual technical process that utilizes generative AI to inform our curriculum development. I mention that because some orgs do have that, Alex, and I'm actually jealous of them. (Are you jealous? I'm skeptical.) Well, they say it's useful and worthwhile. In fact, I was talking to one of these orgs, that's who I was jealous of. They were a small org, and they said, yeah, we have a whole technical process, a gen-AI-enabled curriculum developer that they crafted themselves, and they're a small team and they've found it quite useful. And I've looked at some of their stuff, and it ain't bad.
Alex Kotran: So they said it's multiplied their productivity. It is interesting. It almost would be better if you don't tell me the name of the org, and maybe you give me three or four orgs' curriculum, and I try to figure out which is which. Seriously. I mean, if we can't tell the difference in the output, set aside what it means to replace humans with AI. If I can't tell the difference, or if a teacher can't tell the difference, between something that's AI-generated and something written by a human, does it matter? Is there some sort of intrinsic value in human labor?
Pat Yongpradit: It depends on the context, just like I said with the whole birthday-wish thing. (With the birthday message, yeah.) Yeah, it just depends on the context. That's what I love about this AI stuff: it's so super nuanced, because it's about intelligence and creation and evaluation and ethics all at once, so you can't ever just paint it into a corner. And you know, it's funny, Alex. I was just thinking about this recently. These conversations that we're having right now, we may not get answers to them, or have any type of deliberate societal consensus. But a consensus will be built, and most of these questions just won't matter. That's what I think. I think they just won't matter.
Alex Kotran: What do you mean? Like we're just not gonna... we're gonna move past the metaphysical layer of it? The discussion around AI is just gonna seem so...
Pat Yongpradit: Let me give you an example, Alex. Anthropomorphic language when describing AI and how AI works. Look, I understand; we even put that as an issue in the framework on purpose, because ultimately it's a reflection: using anthropomorphic language as a kid, as a student, ultimately reflects your mental model of AI, and we don't want people to have erroneous mental models. So language matters. But you know what? Give it a couple of years. Everyone's just gonna say, "I partner with AI," or "AI made me feel this way," or "my AI was unhappy today," or something. And people are just gonna get it. People are gonna know: no, we know that AI ain't alive. I use this language loosely, but no one believes that AI is alive or sentient or whatever. We're just using that language because the technology has gotten so sophisticated that it's basically acting that way, but I know it's not alive. So battling over language ain't gonna matter. That's what I'm saying. It's gonna matter at a K-12 level, at a point, but for most folks, it just isn't gonna matter.
Alex Kotran: I think you're right, even though I don't agree with you that it doesn't matter. I do think it matters.
Pat Yongpradit: I'm saying it does matter, but it's not... yeah, yeah. You know what I'm saying?
Alex Kotran: Yeah, I do. So let me... I don't want to misrepresent you. What you're saying, bringing it back to you, is: yes, it's a really interesting question what the implications are of imbuing AI with this sort of agency, this sense of being a thing you're interacting with, as opposed to just a technology system. But it's like the whole thing about privacy. We had all this hullabaloo about our data and privacy. I remember GDPR; I was working in Europe around the time GDPR was really at center stage. And nobody, I mean nobody, can really authentically tell you, "Oh, I read the terms and conditions, I keep track of where my data is used." Everybody has the Ring cameras, they're using TikTok, which, by the way, I didn't know this, but it actually shares not blinded and anonymized data but just all of your data, full stop; all of your data is out. And I cannot tell you how many panels I attended where people were just obsessed with this idea of how we're going to make sure people have control over their data. Is it even important to them? Policy people.
Pat Yongpradit: Yeah.
Alex Kotran: I do feel like this is where education is such a powerful medium, because it's not the sexy... I mean, it's unsexy, right? You're not in Washington, DC. You're in these often kind of old buildings, maybe even slightly musty classrooms, but you're talking to people who are literally cultivating hundreds of people into this world. And there's a lot of excitement, but also responsibility. So, to bring us back to your question about why computer science is important and how we even talk about the value of education: that's kind of where we're landing, right? If you want to address the future of computer science, you actually have to step back and say, well, what is the purpose of education? (Yes.) Why do we teach students about art even though they're not all going to become artists? (Yeah.) Why do we have them read Shakespeare, even though I can't tell you the last time I actually quoted Shakespeare? I think I'd probably come off as a bit of a...
Pat Yongpradit: My biggest argument for learning computer science, regardless of an AI context, though it definitely applies to an AI context, is what I was getting at with putting your thoughts and your creativity into a medium that you can debug and refine. There is a beauty and a joy in being empowered enough to create technology that you otherwise thought was just a thing other people created in some mysterious way and gifted to you. We live in a super-duper digital world, and yet so many kids are just passive consumers of the media and the technologies themselves. If I can give kids an experience where they're actually changing those technologies, and then turn them toward thinking about real problems? Oh man, that's why I do this. That's why I do this. And if they're using AI to help them do that, and they're still having that wonderful experience, cool. Let me just evolve what I'm doing here, so that I'm not teaching them to write code as much anymore, but maybe to read code and evaluate code and debug code. Cool, great.
Alex Kotran: You said it so simply, but this is the entire debate: it's not just about teaching them to write code, it's about teaching them how to do things with code. (Yes.) And I think it's the same when we do a lot of work with English teachers and social studies teachers who are trying to figure out how to deal with cheating. And I'm like, well, it's not cheating, it's really a shortcut. But stepping back, it is important for students. Here are the analogs: you need to be able to write code; you don't want someone who has never written a line of code, in the same way that you need to be able to write an essay. But today in English class, students are writing in some cases hundreds of essays a year. Surely that's not actually what English class is about, being really good at writing essays. I haven't written an essay in years. I've written a lot, but not essays; it's varying formats. Sometimes I'm speaking, sometimes I'm on panels. Essay writing is actually about conveying your knowledge and your thinking on a topic. So you're basically describing an evolution of how we teach, more so than what we teach. Is that an accurate reflection? Because it's so nuanced, but I think that's actually what we need to do right now: help teachers unpack it. It's not "you're not important anymore, what you're teaching isn't important because AI is gonna replace it," but it's also not "you can just ignore AI." (Say that again.) There are two schools of thought. There are the folks who say AI is just gonna displace learning; we need to keep it out of the classroom (Okay. Yep.) as long as possible.
And there's the other end of the spectrum, which says we go all in on AI, AI is gonna replace the need for us to teach these things, so let's just focus on, you know, self-actualization. And I think you and I are in a middle ground: sure, we can't ignore AI, but the project of education is still very important, and we can't let go of the fact that it is important for students to learn to read and write, to learn math, and to learn computer science. So how do you balance those things?
Pat Yongpradit: Well, let's get to AI literacy, actually, because what's beautiful about AI literacy is that it's not a new thing. It takes bits and pieces from lots of things, puts them in a relevant context that is affecting everyone right now, and both reinforces and supercharges the learning of those other things. And so it leads us into that promised land of interdisciplinary learning that I think you were alluding to just now.
Alex Kotran: You know, I think you're taking AI literacy very seriously. You published a framework through Code.org's initiative TeachAI, but I think Code.org was really driving that. This is really you, your team.
Pat Yongpradit: Yeah, the AI literacy framework is a project that was led by the European Commission, the OECD, and Code.org, with a whole group of international experts, some of whom you know. From the US, there are three: Dr. Pati Ruiz from Digital Promise, Dr. Joseph South from ISTE+ASCD, and Dr. Victor Lee from Stanford. And then there are a whole bunch of other academics from all over the world who make up the rest of that expert group. A lot of other thinking went into this, too. Even though those are the listed people, you mentioned the TeachAI community, and even aiEDU's AI Readiness Framework fed into a lot of these ideas. Actually, Alex, the thing I took away from the AI Readiness Framework that informed a key decision in our framework was the idea of not trying to identify new skills for AI. There aren't new skills; there are just skills that mattered before and matter now, and we need to frame them in an AI context, but they still matter. And one of my favorite skills in the set of seven we identified as relevant to AI literacy is actually self- and social awareness, which isn't your typical techy thing. Communication, collaboration, problem solving, all of those can show up in some tech-leaning framework, or a framework about a technology. But self- and social awareness? If anything, that might be one of the priority skills people need. And this gets into the more human side of AI literacy. So again, this is a framework that learned from lots of other frameworks, but I feel like it's added a bunch as well.
There are four domains: engaging with AI, creating with AI, managing AI, and designing AI. The last two, I think, are very worthwhile additions to the thinking about AI literacy. Managing AI is exactly what you think it's about: AI agents and AI teammates, understanding the capabilities and limitations of humans and AI, knowing when to augment and when to automate, and even deciding whether or not to use AI based on the nature of a task. And designing AI really gets at my constructionist roots, because to truly understand something, you need to get in there, see how it works, and get hands-on. Now, we don't need a whole bunch of AI engineers, though we actually do for US competitiveness. But putting that aside, not every kid is going to become an AI engineer, so designing AI is not about setting them up for that path. It's about giving them a hands-on experience. Of all technologies, we have this complicated AI thing, and people aren't even coding and learning computer science, so wanting kids to be hands-on with AI sounds like crazy talk to most people. But in reality, as you know, because you've been in this game for a while, it's not actually crazy talk. A lot of people just haven't had the experience, at a young age or at all, of actually shaping AI's behavior.
Alex Kotran:Yeah, and what's powerful about the way you're framing this is that it's not just about having students be really effective, savvy users of AI who can navigate that world. It's a chance to get closer to this golden fleece of truly interdisciplinary learning, especially in the context of technology. And this is not the first time Code.org, or the computer science movement, has thought about this. There have been a lot of efforts to bring other teachers into exploring code. Hour of Code, I know, is certainly not facilitated only by computer science teachers. Do you see AI literacy as a chance to move that ball forward? How much of a game changer is AI literacy in terms of our ability to bring non-STEM teachers into that work?
Pat Yongpradit:Yeah, I'll answer that question, but I actually think about it the opposite way. As we all know, STEM careers and STEM learning have dominated a lot of the education and workforce talk for 10, 15, maybe even 20 years. And rightly so, because of our job needs and our deficits in STEM education. Meanwhile, enrollment in the humanities has dropped quite a lot over those same 10, 15, 20 years. I think AI literacy shines a light on the need for a liberal arts education. But an evolved one, where the line between STEM education and liberal arts education has been blurred, or where there is no line at all. It all becomes a liberal arts education, honestly. That's one way to think about it. So liberal arts and STEM become one thing over here, and then straight-up hardcore STEM is another thing. And I think that's beautiful.
Alex Kotran:No, this is fascinating. I was just talking to a senior person on the research team at a really big technology company, and there was this commiseration: we have no idea what the future of post-secondary education is going to look like. It's just so perplexing. But I think you're onto something. The vast majority of students could go into a general liberal arts education, maybe with focus areas. You might have some kids focusing on computer science, some on marketing and communications, but a lot of it is building the general ability to learn, building durable skills. And yes, for a subset there's this track: the AI engineer, the astrophysicist. Although physicists, I've heard, actually have one of the highest unemployment rates; they're something like number two or three on the list, or at least very high up. And you wonder, why physicists? It's because the field is actually very niche. I've also heard an interesting take, by the way, from a podcast about a Y Combinator survey of their cohort, where a lot of founders said they're using AI to write code. What they actually said is that it's allowing folks in the hard sciences, who weren't necessarily majoring in computer science, to bring their expertise and depth in computational and logical thinking. So that's kind of exciting. And that connects to this story about computer science as a pathway, where the unemployment rate is looking higher than usual.
I'm increasingly of the opinion that that story isn't nuanced. It's not taking into account that there are a lot of jobs that aren't computer science jobs but are now being imbued with technology. Take Costco: I just learned that Costco went from, I think, 200 to 2,000 technologists over the last X years. I don't know if those are computer scientists, but my guess is those are jobs where, if you had a degree or background in computer science, you'd do well.
Pat Yongpradit:You know, I was looking at BLS projections. And BLS projections could totally be wrong, but this just came out four or five months ago. They said that between 2023 and 2033, software engineering occupations will actually grow by, I don't know, 11, 16, or 19 percent, I can't remember which, but definitely at a higher rate than job growth overall, or even similar-ish jobs. But they said that programming jobs, and notice how they delineate between software engineering and programming, are going down, because that's just the basic stuff, like coding. Software engineering and software development occupations involve user interface design and working toward a larger picture: a lot more systems-level thinking, deployment, and actual people using the product, thus the UX and UI work. So when I see unemployment numbers, I know there's a lot of nuance. I know it's our economic situation in this country right now. And for some of the tech companies, I actually know a lot of Microsoft people who were let go in this last round, where thousands of people lost their jobs. A lot of it was because these people had become really expensive, and Microsoft wanted to figure out how to save money and probably use it for, I don't know, more AI training. And these people are going to find other jobs. The people who need to worry, and that's what we're seeing right now, are the kids who are just graduating and expecting to get an easy job like it was five, ten years ago. And you know why they aren't?
It's not because AI has replaced all computer science jobs. It's because there are so many of y'all, and that's honestly partly Code.org's fault, because we got a lot of kids interested and they continued to pursue it in college. A lot of the higher ed programs probably have not evolved with the times, so now there are a lot of kids who know only a certain model of thinking about computing. There are just too many of them, and that model might not be as relevant as what the kids who also took an AI or data science course bring. Yeah.
Alex Kotran:I don't know if it's your fault, though. I think it's post-secondary, and K-12 really has no control over what happens in post-secondary. In the spaces where you and I operate, there are lots of folks from academia, but as I've seen it, they're generally experts in K-12 education. I've been in very few settings where there's actually a holistic conversation about K-20.
Pat Yongpradit:Well, that's another one. That's another way to make the most of this AI opportunity: expand the conversation to K-20 and really think about it deeply at that level. Because we're all kind of in the same boat, at least on this AI stuff, right? I mean, the cheating conversation in college is the same cheating conversation in high school. So that unites people. Now we're able to do what you were suggesting, Alex. Let's think deeper about the purpose of education, but let's think about it holistically now, K-20.
Alex Kotran:It is thrusting this long-overdue conversation upon us. But what's powerful is that it's bringing it to every dinner table. Everybody's talking about it, whether or not they actually know anything about education policy, because we're all touching and feeling it. Yes. And what the professors and the teachers are seeing is that grades on homework assignments are going up while test scores are going down. And I feel for them.
Pat Yongpradit:And you know, I'm not one of those people who say, oh, y'all just need to rethink your assignments. I mean, I'd certainly say they do need to rethink their assignments, but I know it's not easy. I understand. I can just blink and I'm back in the classroom, and I know this kid just totally copied this code from another kid, because it's so easy: you just swap files, you edit some stuff. And I know how that burns you as an educator. So I feel their pain. I do. And I know the easy way out is to use some type of detection tool, but it's so lame. I would love to talk to a teacher who's really using those detection tools, just to understand the pros and cons and how it's played out. I say, even if the content detection tools were perfect, would you want to set up a police state in your classroom where everything has to be evaluated for authenticity? That's not fun. I just want to trust kids.
Alex Kotran:And you know, I don't think too many kids listen to this podcast, so I don't think I'm revealing too much, but the dirty secret is this: if you're a student and a teacher accuses you of using AI, it doesn't really matter what detection tool they have. You can just dig in, because those detection tools are not foolproof. If you just say, look, no, I actually did write this, that detection tool is wrong, that's kind of the end of the conversation. Unless, to your point, that teacher had keystroke monitoring on laptops. Maybe that police state is the solution for schools that don't want to evolve. That's a scary thought.
Pat Yongpradit:Yeah. But I definitely feel for my educator peers and the difficulty of validating learning now.
Alex Kotran:But doesn't that bring us to AI literacy? Because what you're describing is a set of skills that, yes, can be important for students, but also for teachers. Ultimately, I don't know how they could redesign these assignments without understanding AI, how to use it and how to wield it. And that's hard, because it's now another thing they need to learn.
Pat Yongpradit:There isn't a quick fix, but here's a little dirty secret about frameworks. Yes, teachers are one of the most important audiences, but teachers don't really use frameworks. They use curriculum or resources created by people who use the frameworks, or they get trained by people who use the frameworks to craft relevant professional learning agendas, right? So, considering that there are a bazillion AI-and-education shops out there these days, and everyone's an expert, I'm not worried that teachers won't have useful guidance or off-the-shelf curriculum that kind of does the work for them. Maybe I'm being too hopeful. But in our world of open education resources, and given that generative AI can do a pretty good job of creating a lesson plan, I'm not too worried about the operationalization of AI literacy, or about saddling teachers with too much load.
Alex Kotran:Yeah, I mean, there's going to be a lot of work to be done. And I say that as we're going into the summer.
Pat Yongpradit:Alex, what is the hottest thing that aiEDU has planned for the future? And what is the biggest thing on your mind right now that, if you could just do it, you'd be so happy?
Alex Kotran:I try not to make these about us, but: reimagining school. School transformation, and actually following through. And what's different this time around? Because we've been talking about school transformation for decades, so observers would be right to be skeptical. The answer is the AI moment. We cannot ignore it. No matter how much people want to ignore it, it will be forced on them. We are going to be forced to deal with this. And our big thing is: let's harness the disruption it's going to usher upon us, and use it to address long-standing problems in school that are also tied to student success in the age of AI. Things like durable skills and project-based learning, sure, but also just making learning meaningful and engaging, making school a place where students feel like what they're learning is relevant. Because sometimes a quarter, sometimes 40 percent of students in some of the districts we work with are not showing up to school. And it kind of doesn't matter what the curriculum is, how great the teachers are, or how they adapt their assignments if the kids aren't in the classroom. That's a really hard thing, but you asked for the big thing. That's the big thing.
Pat Yongpradit:Well, so Alex, wait. Reimagining school. Are you telling me that you all are engaged in solutions that will get kids to just show up at school in the first place?

Alex Kotran:Well, that's what we'd like to do.
Alex Kotran:We don't have the data yet, because it's going to take time to get it. But that is the goal: make learning more engaging and meaningful. And it's so multifaceted, because it's not just about curriculum. It's about teachers.
Pat Yongpradit:Teacher training. And it's about money and policy, things outside of our usual scope, actually. It's about family.
Alex Kotran:Yes, and that's an important, sobering add-on: the causes of absenteeism are multidimensional. Learning not being engaging enough is part of it, but actually quite a small part. No doubt, no doubt.
Pat Yongpradit:Wow, if Alex is doing that. You know what's funny, Alex? Sometimes I think I want to just bag all this stuff. What am I doing? Because of what you just said, I should be working on absenteeism. I should be working with families. That's what I should be doing. Or I should be working on economic policy. You know what I mean? I'll let all the other people figure out the education stuff; I'm going to focus on just getting kids into the school. I sometimes randomly think about that, because I want to spend my time most wisely.
Alex Kotran:Well, that's a hard thing. Deep down, there's the question: why are kids cheating? Kids are cheating because they think this is busywork that's not real, not meaningful. Teachers feel it; there's this intuition that kids are phoning it in. And look, no organization can save the world. That's the thing: you pick the problems you're going to try to solve. But that doesn't mean you can't think big. It doesn't mean you can't push a conversation about whether we need more funding if we actually care about workforce readiness. And maybe, if you're a policymaker asking what you can do to make sure students are better prepared, the answer is also making sure their schools are adequately funded and their communities have access to health care and healthy food. There's a world where you just don't shy away from those conversations, even if it's complicated. Yeah, that is the hard thing, but it is hard.
Pat Yongpradit:But adding a new subject to the K-12 menu was hard, and Code.org decided to tackle it. You know, about what you said, that no organization can save the world: I have this delusion that, no, actually, there are organizations that can save the world. When I thought about what you just said, I instantly thought about the Gates Foundation and Bill Gates giving, I don't know how much, over the next 20 years. How much money is he ultimately going to give over the next 20 years? Tens of billions?
Alex Kotran:Infinite, yeah. Mind-boggling.
Pat Yongpradit:Yeah, you know what? I know what they've done for childhood mortality and malaria and things like that. They can draw a direct line between their money and kids' lives being saved. To me, that's saving the world. And I know they could do so much more if they just have the right people guiding the ship and engaging with people. I actually do believe that an organization like that could save the world.
Alex Kotran:Well, if you're not trying, what are we here for? It's not the money.
Pat Yongpradit:Yeah.
Alex Kotran:Pat, this was really fun. And this is just the beginning, I think. I know you have other big stuff cooking; Code.org always has stuff cooking. I'm very excited to see what's coming next, and maybe we can find some more time to jump back on the pod.
Pat Yongpradit:Yeah.
Alex Kotran:I don't know, is it a pod? I'm not sure it's a pod, actually, because it's more of a YouTube show. But nonetheless, back on mics and virtual screens, or maybe in person.
Pat Yongpradit:Yeah, yeah. Well, for sure in person, very soon.