aiEDU Studios
aiEDU Studios is a podcast from the team at The AI Education Project.
Each week, a new guest joins us for a deep-dive discussion about the ever-changing world of AI, technology, K-12 education, and other topics that will impact the next generation of the American workforce and social fabric.
Learn more about aiEDU at https://www.aiEDU.org
Victor Lee: Rethinking school and AI literacy
If you’ve ever wondered whether AI has 'broken' school, this conversation with Stanford associate professor Victor Lee cuts through the noise and gets to the heart of the issue.
We start by mapping out three lenses every learner needs:
- The user who applies tools well.
- The developer who grasps core concepts like models and training data.
- The critic who sees bias, persuasion, and societal impact.
That trifecta becomes a practical compass for teachers, parents, and leaders trying to decide what matters by the time students graduate.
Victor also shares fresh findings from his widely-cited study on AI and academic integrity:
- Cheating didn’t spike after ChatGPT arrived. The baseline was already high.
- When work is irrelevant or purely procedural, students seek shortcuts.
- When tasks demand interpretation, personal voice, and real evidence, AI becomes a helper rather than a loophole.
We also explore how to rethink writing beyond the five-paragraph essay, turning “writing is thinking” into prompts that reward judgment over regurgitation. Think role-play, multimedia analysis, and context-rich arguments that students can own and AI outputs can’t fake.
Lastly, Victor examines computer science in a world of co-pilots. Coding isn't going away, but the value shifts from syntax to decomposition, abstraction, testing, and reasoning about systems. The best AI-assisted developers have strong fundamentals, and the same is true in every field: domain knowledge multiplies what AI can do. That's the essence of AI readiness: deep subject grounding plus AI fluency and skepticism.
If you care about smarter classroom assessment, meaningful tasks, and preparing students for an AI-shaped world, this episode offers a grounded and hopeful roadmap.
AI and learning
Alex Kotran: Victor Lee, thank you for joining.
Victor Lee: Thanks for having me.
Alex Kotran: We've shared at least one, I think more than one, conference stage. This is a bit self-indulgent, because I actually haven't had the chance to sit down with you and really dive in at length. So this is really an opportunity for me, and hopefully our audience, to hear more about your research, but I think more importantly your perspective on this big question of AI and education and the future of learning. We have an hour to solve all of that.
Victor Lee: Is that all? Easy. We'll be done in 30 minutes, and then you can mention the advertisements or anything like that.
Alex Kotran: So just tell us about who you are. You're doing a lot of research, and maybe you can share it. There's one piece of research that has really made a lot of waves in the space, you know what I'm referring to, and you can talk a little bit about that piece, but also about your work more broadly at Stanford.
Victor Lee’s research portfolio
Victor Lee: Sure.
Victor Lee: I am an associate professor of education at Stanford University. My training is in the learning sciences, which means I think a lot about what people learn, how they learn it, and what kinds of environments, settings, and tools facilitate that. I also serve as the faculty lead for AI and education at the Stanford Accelerator for Learning. The Accelerator is a unit on campus that acts as a bridge entity for real-world impact on questions or problems related to learning across the lifespan. It has some key focus areas, but AI has definitely taken a large chunk of that. In that role, I'm mobilizing folks from computer science, from music, from the humanities to collaborate with one another, to connect out to the community, and to try to move research along faster in ways that are more legible and consequential: taking interesting questions from the challenges people are seeing, or the wonderings they really have, and creating a kind of virtuous cycle of research. Those are two of the main hats I wear at Stanford. I also maintain my own research lab, and because I'm from the learning sciences, where we combine research and design work, there are a number of projects, ranging from CRAFT, an AI curriculum resource repository through which we've been exploring how teachers think about the design of AI literacy experiences, to the study that has made some waves: the cheating study, which looked at academic integrity. The last publication on that looked at academic integrity before ChatGPT came out, and then at the end of that first academic year. The headline was that, despite the very scary stories that kept popping up in the news, the overall level was staying the same.
What a lot of people didn't know is that the overall level was already really high, and had been high for decades. There are some interesting complications in there. But in addition to that research, I have collaborations, some domestic, some international. I'm having a really fun collaboration right now with Google Research, thinking through digital game-like experiences for students. It's been a lot of fun to operationalize what we know about learning into some of the game mechanics, and I think everybody's learning a lot from that collaboration. I could go on, but the hour would disappear while I listed project after project, so let me turn the table back to you.
The Cheating Study: What changed?
Alex Kotran: Yes, you have a lot of projects. We're going to talk about the cheating study, but I do want to zoom out first. You're one of the foremost voices right now providing guideposts as we think about AI education and AI literacy.

Victor Lee: Oh boy, then we're in trouble, aren't we?

Alex Kotran: AI literacy is a term that has become mainstream, and yet I myself don't feel like I fully understand it. When someone says AI literacy, they could be saying one thing, they could mean something else. What does AI literacy mean to you, broad strokes? Then we can unpack it.
Victor Lee: My honest answer is I don't think we know, and by "we" I mean researchers, leadership, and educators collectively. I have a chapter coming out in a handbook with Duri Long. Duri is a professor over at Northwestern University and did one of the first big studies synthesizing AI literacy in the research literature. There's a bit of talking past one another happening, with people declaring "these are the most important ideas to learn," or "if someone doesn't learn how bias is a product of society that gets embedded in the data, they didn't learn anything." And we have a huge number of interesting topics: some are looking at the mechanics of AI, some at the social impact of AI, and some are saying most people aren't going to be dealing with the mechanics at all, so the question is just how you use it as our jobs and daily lives change with AI in them. So in this forthcoming chapter we've put forward a characterization of all the different voices that are talking past each other, saying no, this is AI literacy, no, that's AI literacy. Basically, three perspectives are getting emphasized. One is the user perspective, which is really just focused on questions like: how do I use this to help me write, to help me think through problems, to analyze data? Then you have the developer perspective, which, because it's an AI literacy angle, doesn't have you jump straight to developing AI, but is about actually understanding things like what a decision tree is, what a neural network is, and what training data and testing data are. It gets to the foundations you'd need to go further into the technical side, whether you want to build a system or critique one learning algorithm versus another.
And then there's the critic perspective, which speaks to the societal impacts, but also to the healthy skepticism people should have about what gets generated, especially by an LLM or other probabilistically based technologies. There's no right or wrong; there's value in all of those. But what Duri and I say in this particular writing is that the different stakeholders pushing for AI literacy land in different spots in that triangle. Some really emphasize developer and critic; some might be user-focused only. It's not a right or wrong, it's just the entire space that folks are playing in. I've even been doing a little side analysis to see where different people's documents or frameworks land, and so far it's looking like, yeah, we're capturing some of the different poles being put around that term. So you asked me for a definition, and I completely avoided giving you one.
Alex Kotran: I mean, this is a common thread. The true experts in the space are the ones least willing to make really certain predictions about the future, and I think there's a humility in saying there's a lot that we don't know. The more you know, the more you know you don't know. So, speaking of critics, this is an opportunity for you: I'd be curious for your critiques, if any. At aiEDU, I don't know exactly where in the triangle we fit; I think there are little pieces of all of them. Broadly speaking, we think of AI literacy as how you understand what AI is and how you use it. But when we do work in schools, what I believe is really the big thing we have to solve is not AI literacy, it's AI readiness. Our definition of AI readiness is the skills and knowledge you need to thrive in a world where AI is everywhere. The analogy I sometimes use: if you go back to 2007, you'd be correct in saying that everybody is going to be using their phones. You cannot do a job, certainly not a job in knowledge work, without the ability to use your phone. And yet I think we'd probably agree that the focus back in 2007 should not necessarily have been mobile phone literacy; at most it was something you might talk about in school. I kind of feel the same way about AI literacy. The idea that teachers are going to teach students how to use AI seems perhaps quite silly. I think students are going to be teaching the teachers. They're getting it on TikTok already.
The user/developer/critic trifecta
Victor Lee: I mean, they're picking up lots from their friends and whatever influencers are on there. You raised a lot of interesting points. For one, and people may not know this about me, I actually don't really like the term AI literacy. It's the one that's taken hold, and it's just a lot easier to say in a quick conversation, or for tagging what a subject is about. But if you unpack "literacy," it gets very complex very quickly. I don't love the term because I don't think we even have agreement on textual literacy. There are different notions of what it means to be literate with text, and that gets complicated because we have these new forms of text: graphic novels, or arguably film as a form of text, broadly conceived. So I never really loved the term AI literacy. I think readiness is apt. I also hear "AI education" being talked about, not AI in education, but education about AI, and some folks position that as the more technical stuff, the "here's the math of AI." What I tend to think a lot about, though, is that as we settle on a kind of literate standard as a country, as a society, the question is really: what do we want every person in our country to know by the time they finish 12th grade? Every person, even if they're not going to work in an AI profession. And we should be sure the public education experience they've had has equipped them with that base knowledge. That means not everybody's going to be building AI, and not everybody's going to need to distinguish between supervised and unsupervised learning.
And there are going to be different resonances or cases that are important for understanding why we have justified concerns about societal biases or other harmful effects AI could have. So I tend to think about literacy as: what is it we want everybody to have? And I don't think we all agree on our model of "everybody." Now, go back to 2007: yeah, we wouldn't necessarily have had mobile phone literacy, but we probably do kick ourselves right now over information literacy, especially around misinformation, social media, and responsible social media use. Hindsight is 20/20. For AI right now, though, things have moved very quickly, in a very dramatic way, and not everybody has a common base sense of AI. Even the basic ideas: is AI sentient? Can AI solve all the problems? Is AI really going to take my job? There's so much hype going around that I do think, in the near term, we have a bit of an imperative to get everybody up to speed. Whether 50 years down the road there are AI literacy standards because it's already so stitched into the fabric of our lives, or whether it needs to be a specific school subject, who knows. We'll be on to different things. One of the activities I am working on, however, is with the National Academies of Sciences, Engineering, and Medicine, which has a consensus group that's been meeting over the past year and a half, two years. Because what's happened in so many of these new tech developments and data advances is that things keep changing. We started talking about computational thinking and computer science in all the schools, and that movement's not very old.
We've been talking about data science, and then people said we need machine learning education, and now it's AI education, and then there's talk about quantum, and then there will probably be something about bioinformatics education or robotics or whatever the next thing is. It's just become this pile-on. So what we're focusing on is: what are the core competencies that, despite whatever the next development is, will be the thing to hang on to and connect back to? Rather than "let's teach you a specialized quantum thing and fit that into the negative hours you have in the school day," what's the core idea there, and what have we been missing that all of these things are trying to speak to?
Alex Kotran: You're honing in on exactly what I was getting at. And let's just use "AI literacy," which we also do in part because it's now common vernacular. The critical piece is the potential for AI to present itself as authoritative fact. My understanding is that the single thing AI can do better than people, at the very top of the list, is persuasion.
Victor Lee: Recently I saw it was really good at sucking up to people.
What should everyone know by grade 12?
Alex Kotran: Yeah. Which is all sort of part of the persuasion. So it's absolutely critical that people have the awareness that they need to ask these questions. To be clear, I'm not really arguing against AI literacy as something that should be happening in schools. My curiosity is that there's sometimes a bit of a two-step, where we're talking about AI literacy and then suddenly what we're actually talking about is workforce readiness. And that is where I'm not so sure. In fact, our relatively strong opinion is that we don't know what the jobs of the future are. Using AI is not going to be a differentiated skill. So if you think about how we make sure you're able to thrive in the workforce, what you're really asking is: how do we equip you with skills, expertise, and knowledge that will command economic value? Using AI is going to be necessary but not sufficient. All the studies I've seen, and I haven't gone super deep on this, maybe you have, I'd be curious. We're in the midst of doing our own quantitative research right now with the Burning Glass Institute. But basically, my lived experience and my understanding, at least so far, is that the most effective users of AI tools are people who have domain expertise in the domain they're using those tools for. And if you abstract that, it sort of sounds like the skills and knowledge you need to thrive in the workforce are math, reading, writing, science, social studies, computer science, data science: the stuff we're already teaching in schools.
I worry sometimes about AI as the shiny object that distracts us and makes us feel like we can pivot to this new thing, when actually we need to double down. Now, can you use AI to help you teach math better? Sure. But it's not going to solve the problem by itself. It's a tool. Feel free to poke holes in that.
Victor Lee: Some people were trying to wait out the whole AI thing. Was it the new shiny object? Would everybody get distracted six months after ChatGPT? The naysayers were saying this is a fad, we've seen this come and go. It was blockchain two years ago, and who talks about that now? Or crypto, although apparently we're talking about crypto again. That's what people were waiting on. I believe the realization with AI is that it has the sort of influence that social media or the internet or broadcast television had, where it becomes so stitched into everything we do that it's not one of those flashes in the pan, even if we stop talking about ChatGPT the same way we don't talk about Friendster or MySpace anymore, while still having social media. But I hear you. It's really tough. And I have the utmost empathy and respect for educators in the classroom, because they're just being dumped on. Some of it is: teach this new content, teach that new content, your standards just changed. We're not really giving them the space to do the jobs and work they're really good at. That's frustrating, for sure.
Alex Kotran: You know, you're sitting here as one of the foremost experts on this topic, and even you have the humility to admit that we don't know the answers to a lot of these questions. And teachers are expected to navigate this and retool their assignments, I don't know about assessments, but certainly their homework assignments. I feel like they're in over their heads.
Victor Lee: Yeah, that's taken for granted, and there are a lot of pressures going on with the teacher workforce and retention right now. But a growing focus in research is how teachers learn and adapt with AI, and how AI is integrating into teaching practice in line with everything we know makes really effective teachers. But I'm going to throw something at you. You know, Alex, you're a smart, good-looking guy. The good-looking part doesn't matter, but the smart does. I'll just throw out the PhD-student-101 question you get when you start a PhD in education: why do we send our kids to school? Why do we have school?
Alex Kotran: Well, I guess I never had to answer that question in the abstract like that. I mean, to prepare them to live and work in society, and to do so in a way where they have dignity. I don't think it's just careers. Career is a really big part of it, and maybe we think about career a lot, but school is also a formative part of your development, where you learn social skills and develop an awareness of your place in the world.
Shortcuts, temptation, and integrity
Victor Lee: Yeah. Schools are a relatively recent invention, historically. I'm not going to go all lecture-y on the history of schooling, but it's kind of complicated. I pressed on it because you're talking about thriving and economic value. And I do think, especially in the world we live in, there's importance to that. As a parent, I'm thinking about that, and I think about it for the young people I work with in general. But this is a debate that plays out in a lot of places, like: why do you go to college? Some folks will say it's to get a job. Others say it's just to get the piece of paper, and you're really there to network. Still others will say, no, it's to learn how to think and be prepared for whatever changes. Similar questions go down to K-12 schooling. What's the relationship of the state to the schools? Is it to help develop citizens? Intellectuals? Is it strictly a workforce thing? Depending on which answer we prefer, that has pretty different implications for what we do in schools and what we do for and with schools. I would also say there are some folks whose answer is childcare: several hours of childcare a day. As a parent, and I know other parents, as soon as your kid starts going to K-12, your life changes; you're not on call all day, and that's really underappreciated. I don't think it's the primary calling for teachers who go into education, but it is one of the functions of school. During COVID, with remote work and kids crashing into our calls, it was rough, because we didn't have that. So it's tough. I do think thriving is the right direction to go in, but thriving in what kind of world, and in what role in that world?
How much are they joining existing companies versus taking on a social cause? That's why we pose it as the 101 PhD question, because we want to get at the roots of why and how we've designed educational systems. What did we do even before schooling, with the apprenticeship model? How was schooling something just for the religious and then the wealthy and elite? Why do we have a school day structure modeled on Harvard from over a hundred years ago, with the Committee of Ten, that we still stick with to this day? I think education is incredibly fascinating and complicated. So when we think about AI in schools, I'm not one to push for any policy of "this is the new algebra" by any means. We all take our very best informed guesses at what's the important stuff to know, for a generation or population, at a time and place.
Alex Kotran: We're close to Stanford, though not literally in Stanford, so in an effort to get out of the ivory tower a bit and speak directly: if you asked most people, Billy on the Street style, I think most would give you some variation of learning about the world and how to live in it, but the mode answer is certainly going to be getting a career. And the reason I say that is because, as we do this work, and I think our efforts intersect quite a lot in trying to create clarity, or at least orient people toward the curiosity that's going to be required to eventually get clarity even if it doesn't perfectly exist today, the most accessible way to meet people where they are, to address what I think will very soon become their absolute top concern, is going to be jobs. Because at the end of the day, if we can't ensure that a student is on a path to being gainfully employed, however that looks, the other stuff is nice, but if you're unemployed, your mental health definitely suffers; it doesn't matter if you have self-regulation and all the rest. And this is a world that doesn't have universal basic income, right? We're not ready; we're operating within the political environment that we're in. I think it's actually interesting, though, because when I hear people talk about AGI, depending on how far down the road you look, some people say we're five or ten years away, or three. I think Dario said two or one. But even ten years is not that far.
When I founded aiEDU, this was 2019, and I started in the space in 2018, the conversation about social impact in AI was really driven by the effective altruism movement. It was focused on AI safety, and essentially the argument went: AI is going to potentially destroy humanity, so all of our effort around artificial intelligence should be focused on mitigating that risk. But the challenge is that it shuts down the conversation, because it's a complete about-face. If we're actually talking about AGI, then what we're talking about is literally restructuring the social contract and our relationship to capitalism, and you're almost moving out of the school of education into the school of philosophy. And maybe the argument is that the school of education needs to be a lot of things; that's what the PhD is for.
Victor Lee: I'm a doctor of philosophy; it just happens to be in education, which then gets into messy things about society and the world.
Writing, thinking, and overreliance fears
Alex Kotran: So, taking for granted that it's not that we can't address those things: you and I are often at conferences where people are out looking for answers, and right now they're orienting, I think very heavily, toward using AI in education as the primary thing to be solved for. Which brings us to this question of cheating and your research. I think part of the reason your research has been making waves is not just that it's a topic people are really interested in, or something teachers are obviously concerned about and so the natural thing for everybody to obsess over. It actually gets at the crux of why AI is, I think, quite scary for education. Teachers are, I think, correctly intuiting that their students are taking shortcuts, and I like using the word "shortcut" instead of "cheating" because it removes the judgment; "cheating" sounds unequivocal. And the students are more advanced than most of their teachers. So there's this sort of arbitrage where, if your teacher doesn't really know how to use a language model, you're always going to be able to outsmart them. The solutions I've seen, like "I have the students use AI, but then I ask them to critique the AI," no, you can just ask the AI to critique itself.
Victor Lee: Right, that doesn't work. Or the Trojan horse, where you add some hidden white text with a silly instruction. The last one I saw used the "&nbsp;" entity, because that's a character sequence a student wouldn't type themselves but it just renders as a space. Maybe that does work, but at some point your students are going to catch on, and they have a very effective communication system.
Alex Kotran: This will get out there very quickly. So the question of cheating really comes down to: how do we address the concern people have that these shortcuts, the use of AI as a shortcut, will undermine productive struggle?
SPEAKER_01I think one of the things that made the cheating work really intriguing is that the fear that we have with AI um for one, being able to do a bunch of things that we were always um in cultured to believe are just not within the range of what computers can do was the sense that the kids would be corrupted. It's it's just too tempting and they will not do anything. It it it has almost these almost these puritanical kinds of uh things that it's that evil temptation and once it's made available it is going to turn them into lazy, thoughtless cheating slackers. Um and and I think the cheating study itself said, well the folks that you were concerned about, they were already doing versions of that. You know there was a lot of use of check there was a lot of use of copying from Wikipedia and restating the words or passing old versions of tests down from uh from the same class or having your friend help you out uh on something that's supposed to be a solo effort. Um so what I have seen actually just uh the the top lines of the study what it could do you mind just giving like the the TLDR okay the TLDR for this the study was we happened to um have data available on some schools before Chat GPD came out and it was data that was included amongst it what was happening in terms of cheating um in all different activities whether it was plagiarism whether it was you know um forging you know excuses from parents whether it was uh you know copying from an outside source um and forgetting to give credit whether it's looking over someone's shoulder at their test um so yeah all of those I've I've heard some really amazing and clever ones in there but then because that was already there chat GPT came out there was the moral panic um that everybody was going to start cheating and whatever research came out it would have been like shocking just shocking however many people are using it but we actually saw what was happening with the cheating beforehand and what was happening um 
in that first academic year after it had come out, and it was the same. The numbers stayed the same; in fact, there were some weird decreases in there. Now, cheating on a test by looking over someone's shoulder is still different from using AI to write a paper, so we then looked at the different items that would point to plagiarism or copying, and it's subtle. There's a line where students don't want to use it for their entire paper. They do want it to help them get started, or to polish some of their ideas if they're having a hard time. What they really like to use it for is to have it explain concepts they don't feel like they understood from class. And they're coming up with a lot of really creative uses for it. So there's this interesting gray zone of use, and we talked a lot about that in the original study. But the top line, that it didn't increase, has at times been misinterpreted to mean, eh, don't do anything about it. No: if you look at the numbers, it was already high to start with, so if you really are concerned about cheating, we need to go to cheating itself and think about what's going on. And here I'll channel my colleague Denise Pope, who has been working in this space for a long time: a lot of it is school engagement. The kids are stressed out, they don't feel like the assignments are worth it, it's not relevant to things they're doing. If you really want to address that, then okay, let's address that. If it's just this idea that they're very corruptible beings, well, we're in a tough place in terms of how to respond to these super corruptible little people who are going to be inheriting the world in the next decade.
Alex Kotran: Yeah, I think this is why we just need to get rid of "cheating" as the moniker for what's happening.
Victor Lee: It's like "AI literacy." I don't love the term, but it's the term everybody knows, so you go with it. That's just the sociology of language and how it works.
Alex Kotran: Yeah.
How work changes when tasks automate
Victor Lee: Oh, it'd be interesting if you had recommendations for someone; I know some sociologists. After this, I'm pretty sure you're never going to invite another college professor again, because I'm hogging my speaking turn: I talk for eight minutes, Alex gets one minute, I go on for another thirteen minutes, Alex gets one minute.
Alex Kotran: Well, the point of this is actually not for me to talk. I have plenty of opportunities to talk. This is really a chance for me to learn, and people are just along for the ride. Okay, I'm going to push a little bit, and I'm not debating this for the sake of debating; this is truly something I'm trying to unpack.

Victor Lee: Careful, you're going to want to get a PhD one of these days. You're starting to exhibit the signs.

Alex Kotran: Truly, if I could reinvent myself, I would go get a PhD in anthropology, and maybe sociology, and linguistics. That could be really fun. But let's replace the word "cheating" with "shortcut" for the purpose of this discussion. Shortcuts are not universally bad.
Victor Lee: In fact, in the workplace we use shortcuts all the time; you'd be foolish not to. Automation is sort of a shortcut, in the interest of the worker who gets a little less of the repetitive activity, or of the company that's going to have a better bottom line. So yeah, we tend to appreciate shortcuts. Where it's been hard with the academic stuff is that we're heading toward almost a symbiosis of our lives with technology. As I said at the very beginning, the big shock was that AI can now do these things that we thought, oh no, that's only a people thing. Only people can do that. People must be trained to do that; they're the only ones who can do those things. And that has been turned a bit on its head. There's a bunch of things we now have the capability to automate that we thought were off limits. And we set up an education system that, aside from the childcare and the citizenship and other functions, was really there to train those things in people, in a world where they were not automated. But the world changes all the time, and it's pretty hard for the education system to turn on a dime.
Alex Kotran: But it doesn't need to. I really am convinced that what's different about this new flavor of automation, or computerization in general, is that the earlier shortcuts generally replaced routine work; they didn't necessarily replace thinking. There are certain models that could replace certain decision making with decision trees, but for the most part... well, if you want to get picky on words, I'll pick on "replace." "Replace" is very loaded by itself, too.
Victor Lee: And that's the fear, whereas a lot of the time it just changes things. The nature of the work changes.
Alex Kotran: Well, it replaces tasks. The argument sometimes goes: the ATM didn't actually eliminate bank tellers; it replaced some of their tasks, but total employment in the financial sector increased significantly, because now banks are doing all this other stuff. So I guess there's a bit of semantics about whether, if a basket of your tasks is replaced with different tasks, the job has been replaced.

Victor Lee: Yeah, I have to think about when I go to the grocery store and they have the self-checkout lines. Back when I was young and people were first seeing the self-checkouts, there was this movement to reject them.
We're not going to let this displace the folks working the register; we choose not to use it. Anyway, those lines are just as packed as the lines with people, and there's also one employee hopping around helping the folks doing self-checkout, and still employees helping direct people to lines, whether to the self-checkout or to one of the ones with people. So it might be that they didn't need to hire an additional five checkers; they have one instead of five to accommodate a station of six self-checkout kiosks, and maybe the number of employees changed. But I do not see a successful case, and I know there have been attempts, like Amazon Go, where we do not have employees in the stores. Even if you buy stuff online, there are help desks and call centers and moderators, so it's not a peopleless business. It's just that the work has changed, sometimes in ways we like and sometimes in ways we don't. Society changes.
Reframing assessment and relevance
Alex Kotran: I think there might be something materially different between using a shortcut to scan items in a grocery store, to make bookkeeping more efficient, or to search for information, and what's happening now. I completely understand that there probably were people, and in fact I was talking to somebody, I forget who, about the pushback against putting computers into libraries: well, students need to learn the Dewey Decimal System. There are examples of "this is the way we do things," but I don't think you can really argue that the Dewey Decimal System is a fundamental part of, let's say, being able to think. I think writing is a little different. Peter Gault at Quill.org keeps pushing this: writing is thinking. And I just had on Tony Wan, who co-founded EdSurge and does a lot of writing, and he was talking about how he uses these AI tools. He's played with them, but at the end of the day, the really hard thing about writing a newsletter or a book or whatever is that you're actually trying to express some idea that you have. And if writing is thinking, isn't there a risk that a student will develop the instinct, whenever they run into a challenge, rather than push through and really force themselves to do the hard part, which is getting the words and ideas in your head onto the piece of paper, which in many cases requires you to really understand what you're writing about, to go straight to their AI tutor and say, hey, I need help with this? And the tutor says, oh, I'm not going to give you the essay.
Even setting aside the instance where they do get the essay, even just getting help with the outline: is that reflexive? Is there a risk of over-reliance?
Victor Lee: Yeah, I think that's the current fear everybody has right now, and they're pushing that we need to save critical thinking. We need to make sure critical thinking is what everybody's learning to do, especially now that there's AI. And I think there's an argument to be made that knowing when to use AI is actually part of AI literacy: I am now at the point where I will go and use this tool, knowing what it can do for me. Here's the skepticism I'd put in. Obviously I'm not that old; I wasn't alive at the time. But in the scholarship on the history of literacy, there was the move from oral cultures, before we had developed writing systems. And there was real fear and anxiety. Not everybody learned to read and write; that was something for the privileged few, and it was intentionally withheld from some populations, as in the history of enslavement in the United States. So it wasn't universal. But as writing became more integrated into things, the naysayers were saying: this is terrible, it is going to make us forget. We've learned how to remember things, to tell stories, and to rely on that; now people are just going to write it down on a piece of paper, and no one's going to remember anything anymore. That was the freakout of that era. And granted, there are things that change: I rely on maps in a very different way as I navigate the world, and there are a lot of people's phone numbers I literally don't know because they're stored in my phone, and I don't have to remember them and dial them the way I did at a pay phone or on our home phone. So I think it's really hard to predict. The other example I'll put out there: in the '80s there was a song called "Video Killed the Radio Star." The '80s were great.
The idea was that video is so much superior that we're never going to do these audio recordings anymore; people aren't going to sit around by the radio after dinner as a family to listen to the story hour or the news. But now it's the year 2025, and we're sitting here recording a podcast. It's still there. We look better on video, so I think it's better for people to enjoy it on video with us, but there are times in the car, or when you're working out, when you listen. So we just have these different ways it has fit into different corners of our lives. I wouldn't say radio is dead, but I don't think radio has the influence it did in those times. People say print journalism is dead, or broadcast television is dead, and we haven't seen them go dead yet, but they are certainly being affected in terms of their level of influence. So when we look at AI, I do agree, as a parent and as a person working in education, that there are times when I probably would not want kids using AI, and would want them learning some other skills first and foremost, just as you wouldn't want to stick screens in a two-year-old's face all the time, because that's just not what our biology was suited for. So similarly, it's about knowing what the appropriate times are. But you could say, oh, with the automobile, people will just drive everywhere and get fat. Well, maybe that did happen.
Computer Science: Beyond coding syntax
Alex Kotran: I think that did happen. And I think we also remember a lot less. I think it was Guns, Germs, and Steel, which I know has had some criticism, but I've read other books on anthropology, and if you go back to...

Victor Lee: You are going to get your PhD, aren't you?

Alex Kotran: Yes, I find anthropology fascinating; I just wish I had the time to indulge in it. But I think the social intersection of AI is close enough. It really is a form of anthropology, and I think that's what's been missing, and why I'm so drawn to your scholarship and your voice in the space. Techies have basically dominated the conversation about artificial intelligence, and I think they're actually a relatively narrow part of what has to happen. Every time I try to get specific, you zoom out to this societal lens. And that's because we are dealing with something that is truly cross-cutting.
But to go back to your example: early humans in hunter-gatherer tribes would literally memorize hundreds if not thousands of plants; they knew exactly which ones were edible, and in some cases oral tradition meant memorizing really complicated knowledge that you needed to survive. That's changed, and I think the example is really good, because the point I'm drawing from it is: we lost a skill, but writing unlocked so many other opportunities for us to do more thinking, to advance our ability to think and to tackle problems. And that brings us to the question: what is the result of technologies that make it easier to do stuff? Before computers, students literally had to write on pieces of paper, and I think they would write about half a page per week on average in high school; today it's more like three to five pages per week. And presumably, if you work in an accounting firm, you can get through a lot more spreadsheets than you could doing them by hand. So there's this question of how we ensure teachers strike the right balance of raising the goalposts. If we assume we can't stop it, it's like going back to the beginning of the internet: the internet caused a lot of bad stuff to happen, it displaced lots of jobs, and there was no answer other than, this is happening.
Victor Lee: But on the flip side, it started a lot of businesses and a lot of jobs, and it expanded reach for lots of things. The fact that people are listening to us talk right now: thanks, internet.
Alex Kotran: Yeah, and YouTube dramatically disrupted the entertainment industry. So just because change is happening doesn't mean it makes sense to spend too much time trying to resist it.
Education policy, collaboration, and funding gaps
Victor Lee: We're going to zoom out again. My colleague Roy Pea turned me to this really great study of the history of housework. Imagine the time before we had all our Bosch dishwashers and things. I will disclose that I do have research funding from the Bosch Foundation, but I also do think that's a great dishwasher. And refrigeration, and so on. Think of the labor we used to spend on that; it was domestic work, and then these conveniences started coming out. And instead of creating more time to just do nothing, they became something everybody had to have, such that most houses have a dishwasher, most houses have refrigerators and stoves and the like, and it morphed from a luxury into a necessity simply to keep up. You can see that even with food: we still prepare food. We may have more packaged foods, and we also have restaurants helping out and contributing in that way. So with a lot of these things, it's hard to look at it as: we just lose stuff, we stop doing the things. In some ways we do so many other things now, with and because of all these kitchen gadgets. And it is possible, if we don't socialize and reaffirm the importance of what we do as humans, and how we as a culture work and operate and feel some agency in that, that we could go through some really rough patches, would be my guess. We don't know what the future will actually hold, but I think it would be wise to make sure everyone feels wise about AI, so that when they use it, they use it responsibly. But I would say the same goes for being wise about how you search for things, or how you respond to advertisements or things going viral, and understanding where all of this sits.
And for all the examples of folks who could forage and identify plants and navigate, there are things we value socially at certain times. Back then it was eating. Now, I look at my kids, and there's stuff they know about social media and peer interactions. We were talking about the '80s: we know the great music of the '80s, the great hair of the '80s; there's just so much stuff we know. So if we actually had the ability to count out the items of things we know or can do, it would look pretty comparable, and may even look a little better now, just because of longer lifespans and the ability to fit even more in. So it's hard to tell. As you were hinting at earlier, change happens. Some of it is going to be really bad stuff, some of it is going to be really good stuff. We're always just trying to reduce and prevent as much of the bad stuff as we can, make sure we have equitable access to the good stuff, and then something else changes. That's what it means to be human.
Alex Kotran: To zoom back in to computer science... actually, point taken. Just to evaluate what you're saying: I think we probably do deal with and manage way more information than a hunter-gatherer did.
Victor Lee: I'd argue that's part of why people have so much stress and anxiety right now, and feel so distracted, running around all over the place: we're bombarded, and our systems didn't really evolve for that, unless it involved running and chasing things down.
Alex Kotran: Computer science strikes me as something that is generally heavily anchored on jobs. You study computer science because it's going to give you the skills to go work in these fast-growing technology fields. And there seems to be a lot of dissonance, where you have people here in Silicon Valley and this exuberance that AI is just going to replace software engineers.
Victor Lee: I mean, yes, it's going to augment massively, and there are billboards advertising companies that will do the replacing for you. It's pretty dystopian.
Alex Kotran: And the VCs: this is qualitative, but VC batches, the founding teams, are getting smaller. They're getting to their Series A with much smaller teams. It's partially vibe coding, and not just vibe coding; there's also general downward pressure on cost, what have you. So you have Dario Amodei at Anthropic saying, basically, you don't need to learn computer science. Jensen Huang basically said the same thing: if I were talking to my kids, I don't think they need to learn computer science; everybody's going to be able to just use English to code. And yet you dig in a little, and what people also say is that the most effective vibe coders, the people using AI as a copilot to write code, are people with classical training in computer science. And in a world where it is, let's say, more competitive to get the really good-paying computer science job, surely companies are going to place a premium on someone who has both. Again, everybody's going to know how to use AI; that's going to be a baseline assumption. So they're going to want someone who knows how to use AI and has that classical training in computer science. And sure, maybe computer science classes change to be less focused on just learning coding languages and more on building apps and getting experiences like product management; it's probably a bit more holistic. But anyway, that's my current and latest interpretation of how computer science changes: it changes, but not as much as some people are claiming. And I think you can extrapolate to other domains of expertise, whether that's communications and marketing or design or art. I don't think everybody's just going to use AI to create art.
I think people are going to want to pay human artists to get something that doesn't just look like what everybody else can have.
Victor Lee: Let me take the computer science bit. No, I don't think it conflicts. This is where, wearing my learning scientist hat, we really puzzle over what it means to learn and what it means to know something. To learn computer science: well, what are you learning when you learn computer science? A lot of folks by default say, oh, you learn to write code. Okay. But what people are actually doing is learning to structure problems in a certain way, to think about certain processes and potential automations, and practices for what to do. When you study computer science, that's really what you're learning. It's not the specifics of this programming language, which will be out of date in ten years. A lot of coders have the ability to learn and code in other languages. So the question is: what do we need people to learn that is represented by those success cases within computer science, the things that have tended to have the most use and value for the people who've benefited most from that learning experience? It's hard to say, oh, you don't need to learn computer science. But I think the same goes for writing: can we think about writing beyond the five-paragraph essay? Most of the most interesting nonfiction we read, op-eds or really interesting long-form publications, is not the five-paragraph essay, but we got very used to: no, that's how you learn to write, you do this. Instead, ask what it means to actually write, and to do the thinking that the writing is. I'm a big proponent of using writing as a major thinking tool for myself; that's what we want to preserve and emphasize, and keep our eye on that ball. However, we learn things based on what we currently know, and we know what we're participating in, engaged in, or surrounded by.
So we have to speak to the now, but we can't let our gaze rest only on the now. There are some things that have enduring usefulness and seem to have some robustness: math tends to bring out that logic, reading literature and enjoying different sorts of media unpack the human experience, computer science offers those different kinds of creations of logic structures. Which ones are the must-dos may change over the generations, but we also tend to fall into the trap of conflating the thing we superficially do with the actual thinking, insight, and expertise, and how those develop. So I would be supportive of people going into computer science. Now, if it's computer science just so they can make X amount of dollars, I would encourage them to think about what they're looking for in life, and whether that's how they want to pursue things. Because if they want to focus strictly on dollars, there are many choices they can make.
Alex Kotran: Yeah. I think we're actually converging now, because I'm understanding your pushback on whether we should be worried about AI automating writing. What I'm hearing from you is: sure, the five-paragraph essay, we may not want to die on that hill. And the question isn't just that without the boring five-paragraph essay you lose productive struggle. It's a struggle in part because it's boring and not really relevant; I haven't written one since school, and even the subset I have written uses a different vernacular. So what I'm hearing from you is that an English teacher needs to abstract: what is the point of being able to write a five-paragraph essay? And if the point is being able to think and convey ideas about what you learned in this novel, there may be other formats for getting students to convey that knowledge.
Victor Lee: And, by the way, that's probably a good way to address the potential for shortcuts or cheating: if they have to come in and play out a persona from the story, maybe you use AI to create a sort of role-playing game, or use AI to record your own ten-minute movie interpretation, something with a different context that reflects the complicated human condition. In some ways that could be a real benefit for some students. Say you're dyslexic; this could be a way to show how keenly insightful you are, and not get hung up on the one thing that has never quite worked this well for you. So yeah, I think it's partly the teachers, but it's also the curriculum developers, the parents, the policy makers. We all need to make this shift and be willing to embrace it, while acknowledging it's not the kind of shift we feel comfortable with. People tend not to like change. It's laborious, it's scary, it's uncertain; there's unpredictability to it. But that's where I think everyone is; it's in the water. We need to change assessment. We need to change what kinds of activities we value, and what actually causes certain outcomes in the end. Is it that you went to college, or that you spent a lot of time thinking through very complicated problems? We tend to conflate a lot of those things. I am in favor of people going to college; I do work at one. But the same thing goes for writing. I think there was some very deep thinking going on among people who didn't have writing, and there's going to be some deep thinking for folks who go down many different life paths. We want to keep our eye on the deep-thinking part.
Alex Kotran: It's important to have some folks zooming out, because otherwise we're basically going to be chasing our tail in perpetuity: by the time the education system is able to shift, the hockey puck is somewhere else. And if you don't have some perspective on these deeper questions, we're always going to be reactive. But what advice would you have for organizations like aiEDU, CSTA, Innovate EDU, the Stanford Accelerator for Learning? I guess maybe don't advise your own organization. But for civil society organizations trying to figure out, okay, how do we help this along: what is the critical path to giving this the best possible chance? It's going to be very hard, and we're likely to make mistakes, but we have to try. What should the focus be? You're obviously advancing the research and the deeper thinking, but what are the other critical components?
Victor Lee: This is where I'm going to say I am not a policy expert, and that is very much its own expertise. But we do know policy can be a very big carrot and stick for certain changes to happen. It can be done very poorly, or more thoughtfully. So I would first say the community should collaborate. There is, unfortunately, some friction where different groups are trying to have more dominance over the thing, and we're all in for the same outcome at the end. If we can all work together and build on whatever insider assets or connections one another has, I think we'll get where we want to go faster. And in our current system, we're going to need to help steer policy. If we're going to look at big shifts in what's happening in schools, we need to pave the way for that in terms of the policy system, the support for teachers, and the new kinds of curricula or learning experiences, and it may end up looking pretty different from what we have now. There's also the risk of backlash, so I think we should do as much as we can to pave the way and make it easier. Actually, I'd be curious what you're reading nowadays, but one of the books I've probably talked about most in the past few weeks is Abundance, Ezra Klein and Derek Thompson's book. There are a lot of interesting points in there, but a lot of it is that we've kind of regulated ourselves into corners. We've gotten really, really good at creating structures that tie us up in knots, and they end up raising prices and preventing things from actually happening. Their thesis is that we ought to start banding together and trying to make forward progress toward common purpose: not necessarily end all regulatory structures, but be more empowered to take responsible risks in our decision making.
So with whatever we start to recommend, I believe the great folks working in this space are each trying to contribute different pieces, and we should all collaborate on that. Some folks are building out frameworks, some are building curricula, some are providing professional development, some are focusing on system leaders, and some are helping to rebrand what AI and education is for the mass public. The more we can do it together, the better. Right now, policy tends to be a big lever, but there's also hitting that sweet spot of the right resources at the right time, at low cost and low burden. So this is still going to be a bit of a gamble, but those would be my broad directions. How about you?
Alex Kotran: I think the same thing. We're really nitpicking when you go and look at all the AI literacy frameworks. They weight different things differently: some are a little more technical, some less so.
Victor Lee: Well, that's the challenge too, and that's where I say there's this inadvertent rivalry of, "No, we came up with this one. No, this is the right one." We don't have to zero-sum game this.
Alex Kotran: Yeah, I mean, there's far more in common than there is different. I don't even know if they're all nonprofits, but I think there are some for-profit actors that are really pushing just training to use AI, and I do disagree with that. The push should not simply be "teach teachers to use AI tools, teach students to use AI tools." There has to be more to it than that, although that is a part of it. But among all the nonprofits I'm sure you're alluding to that we work with, I think it would actually be more fun if there were meaningful, material disagreement, and this were a battle of ideas over the real AI literacy. Instead, we're mostly tripping over ourselves debating semantics.
Victor Lee: And then there's also the question of whose framework it is, who was here first, who's going to get the funding when there's only a certain amount to go around, and who gets to work with this state or that state. It's just part of the system. I wish we didn't have to be in these scarce-resource situations, but we're especially not seeing the investment to do this right now, except in certain private-sector efforts, given recent changes.
Alex Kotran: I think that's right. It's a good call-out that when people ask how we can ensure there's more collaboration in the space, the answer is that we need more funding in the space. The sum total of the budgets of the organizations doing this work is probably less than $100 million, maybe even less, depending on how you count it.
Victor Lee: I mean, we could ask AI; maybe it'll tell us.
Alex Kotran: I would not bet on o3 getting that right, although we should try it. The scope of the challenge is so large that I think it's inevitable there will be more funding, philanthropy, and investment in this. I worry that it will take a crisis, as is often the case. We will wait until it becomes a crisis, and then there will be this lagging follow-on investment. The work we can do now is to get as much forward progress as possible, so that when there is a crisis, however it unfolds, whether it's about economics or societal impacts or mental health (there are lots of rabbit holes of doom and gloom you could go down), we're clearly better served by having people who are fundamentally curious about the technology. And I'd be happy with the vast majority of the AI literacy frameworks. I'd be really curious about that book you mentioned, with the triangle. Oh yeah, the handbook.
Victor Lee: Yeah. And it's funny, because everyone says, "Oh, we want to see your definition of AI literacy." And it's like, well, it's actually a framework, a meta-framework for all the frameworks.
Alex Kotran: That's like the checkmate: I'm going to create a framework about your frameworks. And then someone can do a meta-analysis of the frameworks about frameworks. All the way down.
Victor Lee: Victor, or Dr. Lee, actually, but Victor's fine. Most people who know me know I will pepper tough questions at conferences, but I'm also highly casual.
Alex Kotran: And nice, and genuinely doing this for the right reasons, as I think almost everybody in education is. Nobody really goes into the education space out of blind, raw ambition. There are ambitious people, which is great, for sure. But raw ambition? Wrong sector. Yeah.
Victor Lee: Well, having me as a guest in this conversation, I think, is a good, honest gesture of having that crosstalk, sharing audiences, and continuing to forge those relationships, so that when we're on stage at the next conference, it's a hug rather than a high five. Right.