aiEDU Studios

Matt Sigelman: AI Raises the Bar – It Doesn't Lower It

aiEDU: The AI Education Project Season 1 Episode 41



Matt Sigelman has spent two decades building the most granular picture anyone has of how the labor market actually works – first at Lightcast, now at the Burning Glass Institute. So when he says schools are asking the wrong question about AI, it's worth slowing down.

The new research he's just released with aiEDU translates how AI is reshaping work into concrete implications for what schools should teach. Among the findings: AI raises the cognitive bar for students rather than lowering it. Writing matters more in the AI era, not less. And "AI literacy" is the latest in a long line of skill-of-the-moment dodges that let schools avoid harder curricular questions.

Matt joins our host, aiEDU CEO and Co-Founder Alex Kotran, for a discussion on teacher autonomy, what a 17-year-old who'd never used AI taught everyone at a recent vibe coding workshop, and why – even now – schools' obligation to teach what's "beautiful and true" doesn't go away.

Burning Glass Institute

  • https://www.burningglassinstitute.org/

aiEDU: The AI Education Project

AI Raises The Bar

Matt Sigelman

For students to use AI successfully across an array of domains and tasks, the cognitive burden goes up, not down. It's not that AI makes things easier. AI raises the bar. What we found is that greater levels of expertise are required. AI is giving you the tools to do what was easy so that you can do what is hard. Making sure that students have the ability to frame a problem, to ask smart questions, the kind of questions that come from going in depth, is exactly what we need to do to make sure students are going to be able to make successful use of AI.

Why Education Feels Unstable

Alex Kotran

So welcome to another virtual edition of aiEDU Studios. I'm in San Francisco. I'm with Matt Sigelman, CEO and founder of the Burning Glass Institute. Matt, you're in New York City right now?

Matt Sigelman

I am.

Alex Kotran

All right. So if you're someone who works in education, whether you're a teacher, a policymaker, an administrator, a parent, I think everybody's feeling the same thing, which is that the ground is shifting underneath our feet. This AI thing that came into the public zeitgeist back in, what, the end of 2022 seems to be going faster than ever. It seems quite clear it's changing the way we work, and it's changing what it means to be prepared for work. I think everybody intuitively feels that we're not necessarily preparing kids and people for the workforce of the future, because we don't even know what the workforce of the future truly looks like. And up until recently, I think the conversation about that has been really hand-wavy. We hear a lot of things like, well, there are going to be all these jobs we don't even know about yet, and it'll sort of work itself out. Or we hear about durable skills, or 21st century skills, a lot of the tropes we've been talking about for decades: a lot of speculation, not a lot of evidence, a lot of intuition. And that's why I'm so excited to have my guest Matt on today. For those who don't know, the Burning Glass Institute is, well, Matt, you should tell the story, because I know you were at Lightcast before, and you basically created this space of real-time labor market information. So I'm going to hand it over to you to explain what's unique about that. But just as a precursor: when we were thinking about who we could partner with to really try to get more fidelity on this question, everybody we talked to said, yeah, Burning Glass is who you need to be going to.
So Matt, can you tell us about BGI? What is it and what do you all do?

Stop Focusing Only On How

Matt Sigelman

So as you said, we spent 20 years building a company, now called Lightcast, which does phenomenal work generating big data sets about the nature of the landscape of opportunity. It came out of the realization that, look, we need a lot more granularity than you can typically get through public statistics. If we want to understand the nature of work and how it's changing, we can't just assume that every job in the same big bucket has the same skills. And so we created some really powerful data sets. I left almost five years ago now to found the Burning Glass Institute as a fully independent nonprofit. We consider ourselves a data lab. We're built on the recognition that we need powerful data sets, but that solving big issues at the intersection of work and learning, doing the things that will actually unlock greater economic mobility and improve outcomes for learners and workers, doesn't just take data. It takes a whole bunch of modeling. It takes the ability to start from the big problems and work backwards to ask, now what data can we bring to bear? That's why we call ourselves a data lab. We get to work with a broad array of organizations, sometimes education systems and institutions, sometimes public agencies, sometimes regional coalitions or employers, to ask, what are those big problems, and what data can we bring to bear to help solve them? And this problem you just spoke about, what should we be teaching in our schools, is exactly the one that got us really excited when we started talking about this together. Because exactly to your point, it's not just that there's a lot of prognostication. It's not just that when you ask people what the new jobs are,
they tell you it's a prompt engineer. And by the way, I just want to drive a stake through the heart of that example. The internet didn't create demand for Google search engineers, and AI is not going to create demand for prompt engineers. But put that aside. Put all the hand-waving aside. Because there hasn't been a great evidence base to draw on, what we've wound up doing instead is anchoring the discussion about AI in education to how questions. How do we use it? How do we personalize learning? How do we keep students from cheating? Those are important questions, but they're missing the what questions: what should students learn, and what needs to change in the curriculum? So the ability to say, okay, what are the effects AI is already having? How can we see those play out? And how do we map that back to what needs to be in the curriculum? That was a really big challenge, but one that really needed to be solved, and one we enjoyed working on together.

Alex Kotran

Yeah, and it's funny, because in the last two weeks I was at Stanford with Jobs for the Future, and they were talking about research they had done with Burning Glass. I was chatting with Lisa Gevelber at Google, and she was talking about research they were doing with Burning Glass. I was at the American Enterprise Institute, this was about a month ago now, and you were there, actually. It was an amazing group of people, former governors and secretaries of commerce. So you really are at the epicenter of the conversation, or at least part of the epicenter. Before we talk about the results, sort of what we found in our research, I'm curious what your hypothesis was going in. Clearly, this was not the first time you had been grappling with this question.

Matt Sigelman

This is a question we've spent a lot of time thinking about. But I think what's impeded us from really getting to an answer I would consider satisfying is that there's a very different language in the world of work from the language we use in the world of education. The world of work speaks a language of jobs and skills and tasks, and most of the modeling that's been done around AI is measured in those terms. The world of learning speaks a language of learning objectives, not surprisingly. There's some overlap in those languages, but they're not the same. So in the absence of that bridge, what I think you hear a lot, which is not illogical, is that, hey, AI is going to change tasks, but fundamentally that's going to elevate the importance of, air quotes, "social skills," or, as you said, durable skills. Again, that's not wrong, but it's not specific enough. And that's the frustration that made us want to tackle this.

Learning Curves And The New Bar

Alex Kotran

Durable skills was a very strongly held hypothesis for me. The question I didn't have an answer to was this: it's not that everybody's just going to be going around doing a bunch of communicating and collaborating in these jobs, right? It would be hard to imagine a world where durable skills aren't critical, but most of what we do in school is still math, English, reading, writing, social studies, science. And AI can clearly write really well. Not amazing, but quite well. It can do math increasingly well now. I think a quarter of all computer science papers have potentially been written with AI, so it's quite good at science, or at least at writing about science. And yet it wasn't clear to me that we don't need domain knowledge anymore. So maybe we can dive into some of these first insights. There's this assumption that AI is actually going to lower the bar, because if the machine can do it, people won't need to. Your research actually suggests the opposite. Can you walk me through why?

Matt Sigelman

What our research shows, specifically, is that it's very field-dependent. There certainly are sets of occupations where most of the barrier is a barrier to entry. There's a set of domain knowledge you need, and once you've acquired it, you're in. A lot of that knowledge may be knowledge you can acquire through extrinsic learning, that is to say, the kind of stuff you can explicitly be taught in a classroom. In occupations like that, where there's a technical knowledge base that's needed, AI may actually enable people to enter those domains more easily. But our research also found, and here I'm talking about some research we did together with my colleague Joe Fuller, who is at the Harvard Business School, something different when we looked at how AI changes learning curves. That sounds kind of abstruse, so let me tell you why it's important. In every given occupation, there's a different shape to how you learn. In some occupations, you have a bunch of mastery that needs to happen, and after that you kind of know it and your job doesn't change much. Think about a bus driver, for example. Bus drivers are not going to be terribly AI-impacted, but nonetheless: whether you've been a bus driver for three weeks or for 30 years, your job is show up on time and don't crash the bus. There's not a lot of opportunity for outperformance, for growth in the role. But most of the kinds of jobs that characterize the knowledge economy, that provide the best earnings, are the kinds of jobs that most of us listening today came up through. Those are typically professional jobs. They're jobs where you came out of school and were hired based more on your capacity than on specific capabilities.
You were given some humble tasks. You did those proficiently. Over time, you were given more complex tasks, and eventually you developed some sense of intuition and became good. It was a long learning curve. And in those kinds of jobs, AI may, to your point, do exactly the opposite: not help people get on the curve, but make it more difficult to get on the curve. Because in those kinds of jobs, if you compare the skills expected of experts in the field to the skills of people at the entry level, there's a significant difference in the nature and the level of the skills. And the skills demanded at the entry level, exactly the humble tasks we were talking about before, are the things that AI does really well in those fields. Think software development. There's been a lot written about this: serious software engineers are really jazzed about Claude Code. They're not worried about it, because it's making them hyper-productive. But it's making them hyper-productive because they already have deep expertise in the field. And so that's what I think frames a lot of the challenge right now for educators, which is both how do we rethink what students need to know, and also how do we raise the bar on proficiency?

Alex Kotran

I think it's worth getting concrete, right? Part of the magic of the Burning Glass Institute is that you looked over, I think it was 20 or 30 million job postings, pulled out a thousand common skills across all of those, and then we looked at the data on how those skills are changing. Not just do they matter, but what about them matters differently. Can you give me an example, or sort of take me through what that means? It's very abstract when I say it like that.

Mapping Job Skills To Standards

Matt Sigelman

So, starting from this earlier work I just described, which sort of framed the imperative, the work we've just undertaken together is really important because it says, okay, look, there are going to be these really big changes that could wind up raising the bar significantly for students in an array of fields. What does that mean, then, in terms of how we prepare students? To address that question, we tried to get past this language barrier. Remember, I was mentioning a few minutes ago that part of what's kept educators from understanding how to respond to AI is that we're not speaking the same language. We can say, okay, AI is going to change these jobs. We can hear all sorts of things about which jobs are going to go away, what's going to get automated or augmented, but it doesn't compute, because educators don't recognize what those skills are. A lot of educators say, hey, look, my job's not vocational prep. So what we did is we looked at a thousand different workplace skills and measured the interplay of automation and augmentation on those skills: which of them are ones where AI is essentially replacing tasks people are doing, and which are places where AI is making us more effective. We then tried to translate that. We created what's called a knowledge graph to map those skills to the other side of this huge data exercise: we went and collected the K-12 curricula of 21 states and distilled them into 140 different learning objectives. Now, here's what you get when you map them together. It tells you that the interplay of automation and augmentation changes what we need to prioritize in each area.
Let me give you an example. If you think about how we teach communication and presentation skills, some of the capabilities within that broad heading of teaching students how to present become increasingly relevant, and also increasingly rewritten based on how you would use AI in that domain, while some things become critical to anchor to. Within that field of teaching students to communicate and present, things like grammar and syntax are things we know LLMs handle really well through the kind of content generation they do. Maybe we still want students to know them, but in the interplay of automation and augmentation, there's more automation going on there than augmentation. Those are places where we can expect students are going to get support, in the same way that in previous generations we said, okay, yes, you still need to know how to do basic operations in math, but we also recognize that students are at some point going to graduate to having calculators. On the other hand, there are parts, like oral communication, where we can expect AI is probably not going to change things so much. And then there are things like how we construct an argument, how we teach rhetorical reasoning, how we develop the craft of writing. Those are places where AI makes us both more efficient and more effective at the same time. And that says students need those skills all the more, and they need them at a higher level of proficiency. It's easy to look at something like writing skills, within that broad domain of teaching students to communicate, and say, well, gee, LLMs can do students' writing for them.
They generate content really well. And I think a lot of educators are making a mistake, because we're not changing the way we teach. Look, students aren't dumb. They can see that if they give an essay assignment to an LLM, it'll generate pretty competent responses. We can try to play cat and mouse and track down what's AI-generated, or we can move to blue books, and both of those are probably legitimate responses. But at the same time, what you really want to do is say, wait a second, is writing just the practice of generating content? Or is writing the business of formulating an argument, structuring the argument, evidencing the argument, advancing the argument? In that context, writing becomes even more important than before. It probably requires a greater cognitive load than before. And in any case, it certainly needs to be taught differently from the way we're teaching it today.

Alex Kotran

Yeah, it's like we need to teach people how to think. And I think that's where durable skills reaches a semantic gray area: there are a lot of durable skills you can build in the process of developing the ability to think, but there is some aspect of sitting down in front of a blank piece of paper. I love the idea of AI as a writing assistant, because I often hear, well, the key now is we just need to teach kids how to partner with AI to write. And maybe when you're a senior in high school, and certainly in college, that actually is a skill, literally a technology implementation skill. But I think what the research shows is there is actually a tremendous amount of value in a student sitting down in front of a blank piece of paper and having to decide, before I go to the AI to brainstorm, what do I actually think about this? What do I want to say? And then maybe the AI can help you later. I was with ETS yesterday, and we were talking about what this research means in the context of assessments. I think they've seen it as very instructive as to what needs to shift around content knowledge and toward some of these other things to assess. I just want to throw something in, and I'm curious for your take on it. I'm a history buff. I studied history and political science in college. I loved history so much that I just read the whole textbook over the summer, because it was so interesting to me. But history for me was a broad accumulation of random facts. You know, when was the Tet Offensive?
Who participated in the various treaty meetings that happened in World War II? Embarrassingly, I can't even remember the specific ones. And yet I had this experience at one of our vibe coding workshops. It was at South by Southwest, we had about 100 people in the room, and there was one person who had by far the most interesting, sophisticated app out of anybody who had built something. And mind you, this is a bunch of technologists, right? These are tech-forward educators. It was a 17-year-old. Her name is Addison. I've been talking about Addison a lot. Shout out to Addison Polo. What's interesting about Addison is that her school completely bans AI. No AI allowed. This was essentially the first time she had ever used an AI tool. And in 20 minutes, what she built was basically a game to help improve focus for students with ADHD, based on neuroscience and cognitive science. How did she manage to build this amazing thing? Well, in the class she had just finished, her teacher gave all the students the opportunity to pick a topic that was interesting to them, and they had the entire class to do in-depth academic research. So she had just spent six months poring over the literature on non-pharmaceutical therapies for ADHD. Then she came, we gave everybody the quick primer on the tool, and there was something about what she had learned in that class that equipped her to completely run ahead of everybody else with AI. And it clearly wasn't AI literacy. So should we be thinking, maybe ambitiously, about whether history class, social studies, should be less about learning everything you need to know, trying to ingest essentially encyclopedic knowledge about history, and more open-ended? Hey:
Pick a period of history, spend maybe two years of high school going really deep into that period, and become a domain expert. Maybe that's not the best example. I'm curious, for educators or even parents who are listening in, what does this research instruct in terms of what it might look like to redesign or re-envision learning?

Matt Sigelman

Let me speak to two things here. First, you said something really important about the need to make sure we aren't always designing education to play catch-up. Every few years there's a new technology, and policymakers will say, hey, look, our students need to be ahead of the curve, and they need whatever the new skill is. So we're always playing catch-up, we're always a few steps behind, trying to say, okay, how do we make sure students have that new set of skills? Think about the analogy of language education. When I was a kid, the schools considered at the forefront were teaching kids Russian, and now they're teaching kids Mandarin. When we talk about teaching AI literacy, I think we're essentially doing the same thing. Look, these are technologies that, as a skill set, are actually designed for accessibility. So what we should really be trying to do is focus not on what AI skills are, so much as on the interplay of AI with the skills that are bedrock to education. One of the things our research together made very clear is that for students to use AI successfully across an array of domains and tasks, the cognitive burden goes up, not down. It's not that AI makes things easier. AI raises the bar. This goes back to what I was describing before, our earlier research with Joe at Harvard. What we found is that greater levels of expertise are required. AI is giving you the tools to do what was easy so that you can do what is hard.
And so what you described, giving students a greater anchor, making sure they have the ability to frame a problem, to ask the smart questions that come from going in depth, is exactly the kind of thing we need to do to make sure students are going to be able to make successful use of AI. It's not, how do we tack this thing onto our curriculum so we can say, okay, great, there was an AI unit. In the domain of history, I think it's a both/and. It is going to be important to make sure our students are historically literate. We're not just saying, okay, great, we're going to allow you to focus on history from 1850 to the present, and you know that really deeply, but you've lost an anchor in how we got to 1850. We need the ability to understand the narrative connections between epochs. But at the same time, historical analysis, to your point, means having the experience of going deep on things. And this is exactly the kind of design challenge we need to equip educators for. We need to help them think about where the areas are that we need to go deeper, and how we make sure that, as we go deeper, we also have some time left over so students still get the broader landscape view. Classroom time is the most precious resource a teacher has. So this can't just be, okay, we're going to add that in. Where are we going to reshape our priorities? How are we going to shift our time?

Alex Kotran

I'm glad that you tore down my proposal, because it demonstrates that it's not so simple, right? It's not so simple as, oh, well, let's just change everything we teach and make it all about this deeper, metacognitive skill development. It's going to be somewhere in the middle. And yeah, I agree. It would actually be hard to imagine how you could even do deep historical research on a time period if you don't have any grounding in the precursor events that led up to it. And I think the same goes for, well, it feels like every subject we looked at: there are some things we actually do need to anchor in. I thought the four quadrants were really helpful. There are some skills that AI can't perform independently but can enhance, like strategic thinking and complex problem solving, and we need to deepen those. There are some where AI is actually really good, you mentioned examples like writing, research, data visualization, and that's actually a place to transform: rather than teach just the process of processing data, think about how we could use this data to intuit different interesting things. And then there's anchor and streamline. We'll give folks a link to the report; we don't have to go through it in full detail.
I guess, as you look at the other work Burning Glass is doing now, and I just mentioned what I think is a very thin sliver of all the research that's probably in motion, how do we balance this? It feels like this question may never be answered in the time frame required to actually start shifting the system. It may be the case that we have an answer in five years. I don't know if you have any confidence in that. But you said something about teachers needing to be equipped to answer some of these questions, and that's interesting to me, because I think today there's this sense that, well, teachers just teach what they're supposed to teach, and the decision makers at the top decide the standards and the assessments, and that's just how it works. Is there a need to shift that a little bit this next go-around?

Matt Sigelman

I think there certainly is. Teachers play an incredibly important role as a bridge between their students and their communities on the one side, and their schools and their systems on the other. That intimate knowledge they have of both gives them the level of understanding needed to figure out how to adapt these kinds of approaches. And right now, the problem is that we haven't designed for that. Teachers in many cases have very heavy loads. They're teaching lots of students, they have lots of papers to grade, perhaps even more if we're thinking about moving to blue books again. This is going to take space, and I think it does take air cover. So this isn't something where we can just put it all on teachers and say, hey, go figure it out. Systems have a heavy responsibility for figuring out how to adapt these kinds of findings, how to reshift priorities, how to make sure we're dividing and prioritizing time in ways that are going to enable student success. But teachers play an important role in this as well, because they're going to be the most responsive to their students. In a time when things are moving very dynamically, this isn't something where we can do a traditional multi-year curricular development exercise, with all sorts of A/B testing of different methods, and then in seven years we'll have a recommended curriculum. We're going to need to give teachers the flexibility to try different approaches within a set of priorities that are determined more broadly.

Alex Kotran

Yeah, my takeaway from the research could be summed up as: wow, this is actually really complicated. It's really hard to provide a list of specific changes a teacher should make. We can give examples, but what this report doesn't do is say, okay, here's the roadmap to exactly what changes need to happen. And you sort of framed this almost as: we need to build the culture. I've been thinking about this a lot, that schools need to be these living labs, and how do you do that? You talk about Burning Glass as a data lab. And you can tell me, but my guess is that part of building a culture of innovation is that you can't just write it down as a workflow. There's no standard operating procedure document for creating a culture of innovation. There's a lot of nuance and change management and empowering individual leaders on teams, and managers. And I'm sure you have a huge role in making sure the entire organization is thinking about its work through this lens of innovation. But you just described all the barriers we've created for schools. Essentially, we've treated school as this extremely linear, dictated thing that happens; it's a very rigid system. Maybe you can bring some of that perspective in, because you also talk to a lot of leaders in industry who are thinking about this question from the perspective of reskilling their own workforce. And if you're talking to people similar to the ones I'm talking to, they're starting to realize it's not so simple as, okay, I need to pick which AI tool.
Because when we started trying to evaluate the tools, the landscape literally changed over the course of six months. And now there's a whole new set of tools, and I don't have any confidence it's going to be the same set of tools in six months. What have you seen the private sector do in terms of trying to build the conditions to make some bets, given all the uncertainty?

SPEAKER_01

So I think there are a couple of core ingredients that enable some organizations, whether private sector or public sector, to be more innovative. It starts with providing space for innovation and for entrepreneurialism. You used an important word: empowerment. We need to make sure that people on the ground are empowered. That means both giving them the flexibility and authority to experiment, and making sure we're giving them the means for tracking how their experiments are going. If we're going to live in systems that are generally accountable, we can't just say, hey, we're not going to punish you for an experiment; we also have to ask, how are we going to know whether experiments are successful or not? And that speaks to another thing we've learned from studying highly innovative organizations: they have a clear set of goals, and they have a feedback loop. I think one of the problems we have right now is that our goals in the world of education tend to be myopic. How many students are graduating? That would actually be considered a fairly longitudinal one. More often it's: how many students pass a given test, or reach a given level of proficiency on an assessment regime? One of the things this moment forces a debate on is whether we should be holding ourselves to account for a longer arc of outcomes. Is the role of education simply to make sure that students pass tests? Or is it to make sure that students launch effectively into their careers, and that they progress over time?
When we start to think about a longer arc of relevance, that's going to be critical, because it opens the door to teachers and schools and systems rethinking their priorities and trying things that drive innovation, optimized against a yardstick that's much more meaningful.

College Signals And Lifelong Learning

Alex Kotran

Yeah, I hadn't quite thought of it like that, but I guess that's why the private sector can move so fast: they know what their goals are, like more profit and staying ahead of the competition. In schools right now, one of the primary indicators is college matriculation. You've talked a lot about lifelong learning. Even a year ago I was listening to a podcast you had done, and it was very prescient, because you were basically saying the one thing we need to index on is people's ability to learn, and even the act of getting a job, which I thought was quite insightful. You shared the number that the average person today is going to hold twelve and a half jobs. So even the act of getting a job is part of this. But at the same time, Burning Glass also did research into credentials, and if I understand it correctly, only a small portion of credentials actually demonstrate meaningful value to the folks who go through them. So what advice would you give parents right now who are asking: should my kid go to college? I don't even know if there are going to be jobs; should they take on debt? But it also seems like a big risk not to go to college. And I hear about all these blue-collar jobs, but I think culturally we haven't quite come to terms with how to talk about apprenticeships and alternative learning pathways. When I was in high school, that was treated as the failure path: if you failed to get to college, it was, oh, alternative pathways for the non-college kids.
And only about 10% of people in the trades in the US have a LinkedIn profile. That's my favorite stat. In Germany, it's around 70%. So does your advice change compared to where it might have been five years ago?

SPEAKER_01

What I would say today is in a lot of ways pretty similar to what I would have said five years ago, which is that we live in a very dynamic labor market. In fact, about four or five years ago, we did an analysis where we looked five years back at how much the skills of an average job had changed over the prior five years. This was before LLMs. We found the average job had changed over 37% of its skills. So literally, over a third of the things you do every day, a third of the things you need to know in order to do your job, are different.

Alex Kotran

And this was 2021? How far back did the 37% go?

SPEAKER_01

It was between, let's say, 2016 and 2021, or 2017 and 2022. Exactly. LLMs have perhaps accelerated that, but all that really does is accentuate the fact that we can't graduate students equipped with everything they're ever going to need to know. It's a fool's conceit. And it's just as much a fool's conceit that students, when they enter the world of work, will just be able to pick everything else up as they go along. We know that one of the reasons why displacement from work, a serious issue in the age of AI, is so disruptive to people's lives, and in a lot of ways destructive, is that we don't have a good infrastructure for helping people navigate. We don't have a good infrastructure for helping people identify what skills they need to learn. And to your point about credentials, when they do learn things, we don't have a good infrastructure for helping them signal what they know. All of this goes back in some ways to what we were talking about before with durable skills. We do need to do a much better job of helping students graduate with the level of proficiency that's going to enable them to be truly successful in leveraging AI-based tools. But we also need to recognize that students are going to need to pick up new skills as they go along. Do they have the skills to acquire new skills? What are those skills? And do we have, as a society, a longer view of where education and training happens? My friend Mitchell Stevens at Stanford likes to talk about the transition from a schooled society, one where learning is the exclusive prerogative of the young, to a learning society, one where we all have the opportunity to continue learning over time.
So in some ways, part of what we need to do is start to redefine what school is and when school happens.

Alex Kotran

Yeah, that's quite heady. But okay, I think this is a really interesting place to close. Being really good at learning is this grounding skill, and I think that can answer the question of why college. When I'm asked that question, my intuition is that you should still try to go to college, maybe just don't take on a bunch of debt, certainly not to go to a tier-two private school just because you want to say you went to a private school. But you talked about signals. At the end of the day, it's possible that you can do all this learning on your own with AI. But if you're an employer, how are you going to know whether somebody has built those durable skills, has built the muscles to continue learning? Well, if they spent two or four years going and learning about a topic, that will be a signal. And I think what you all have been proposing is the question: is that the only way to send that signal, or do we need to be thinking more broadly? But what I wanted to close with is this: teachers need to be able to embrace this transition moment, in part because if we're going to expect them to help teach kids not just what to learn but how to learn, they certainly need to be modeling that themselves. So as a diagnosis of the state of K-12 right now, as you understand it: are we investing in professional development in a way that treats learning itself as the goal, or is it, as my view holds, more compliance-based?

SPEAKER_01

I think this goes back to the question of autonomy and license. In a system that gives surprisingly little autonomy and limited license to teachers, it makes sense for training to be essentially an exercise in compliance. But that's not the kind of system that is going to be able to adapt. It's not the kind of system that's going to be able to innovate. It's not the kind of system that will be able to reprioritize and consider new ways of rising to the very difficult challenge of graduating students at a higher level of proficiency than we've managed in the past, and we've already been struggling at that. That's going to require a different way of helping people develop professionally. But first we have to make sure we have the kind of structure where, as teachers grow, they have the opportunity to exercise new capabilities and put them to work for their students.

Alex Kotran

Yeah, unfortunately it's just really complicated. You have to do both. You can't just focus on equipping teachers, because the agency frame makes sense: ultimately, if you want to give someone the muscles to leverage their curiosity, they have to have the space in which to actually practice that curiosity. So you have to be doing the systems-change work alongside the capacity building. And without research partners like Burning Glass, it's very hard to get decision makers to move, because it's very comfortable to say, well, there's just so much uncertainty, and to use that as an excuse to put off making hard decisions. What Burning Glass brings to the table is: actually, we have a lot of certainty now. There are still things we don't know, but we know enough to start making decisions. So I'm curious what your parting words of advice might be to a district leader, or maybe even someone at a state education agency, who is trying to figure out: how can I get to enough answers to go to my stakeholders and get them to start actually making some big, maybe even provocative, moves?

SPEAKER_01

So we started off this conversation talking about what and how: what do students need to learn, and how do we teach it, how do we use AI. I think those what questions really do need to be addressed centrally, because they represent a set of shared priorities, and because they represent a strategy for what we're going to emphasize, where we're going to bring resources to bear, and how all the individual efforts of different teachers are going to come together. We need to make sure we're having those what conversations very purposefully, in a way that's data-driven. We need processes in place that ensure we keep updating our awareness of how the landscape is changing and how the capabilities that are needed are changing. At the same time, we need to create greater license and provide for greater experimentation on the how side. And that's where I think we can strike the right balance between what we do as a system and what we do in our classrooms.

Escape The AI Vortex

Alex Kotran

Yeah, I couldn't agree more. It's a daunting amount of work. I wish it were as simple as just making sure everybody knows how to use AI tools. But I think it's also comforting, because it means there is still this incredible role for the human advantage. And while it may be complex to figure out how to cultivate that, I can think of no better project for education than to really hone in on what we need to do to cultivate the amazing people who are going to help organizations figure out how to leverage technology. Matt Sigelman, this was awesome. I really enjoyed it, and it's so fun to do this work together. I'll wait with bated breath for the next release; we'll put the word out the minute there's new data to share with the space. Until then, where can folks find you? And if there's one thing someone could read, besides obviously the report we did, maybe that can be your parting advice. What are you reading now? What's been drawing your thoughts in when you're on walks?

SPEAKER_01

You know, I've been reading, on the side, The Mill on the Floss. And I say that only partly jokingly. I am reading The Mill on the Floss, and I will offer that as parting advice. It's easy to get sucked into an AI vortex, like those all-AI-all-the-time news radio stations from the 1980s. I think it's incredibly important for us to recognize that the world of education at its core is not only about readying students from a practical perspective. Everything we've been talking about doesn't change the obligation of schools to teach what's beautiful and true, which includes The Mill on the Floss.

Alex Kotran

Yeah, get out of the vortex. Matt, thanks so much. Really enjoyed this.