Edtech Insiders

Building AI School Teams: How Leading Educators is Rethinking Teacher Support with CEO Chong-Hao Fu

Alex Sarlin Season 10


Chong-Hao Fu is the CEO of Leading Educators, a national nonprofit specializing in comprehensive instructional improvement. 

Over the past 14 years, he’s worked to scale exceptional teaching and leadership in some of the country’s fastest-improving districts while exploring how emergent technology like AI can bring new possibilities to instruction.

💡 5 Things You’ll Learn in This Episode:

  1. Why instructional coherence is the backbone of school improvement.
  2. How Leading Educators is integrating AI into lesson planning and professional learning.
  3. The importance of keeping teachers in the “pilot’s seat” when using new technology.
  4. How schools can build authentic, future-ready skills through civic and AI projects.
  5. Why scaling innovation requires both coherence and continuous improvement.

Episode Highlights:
[00:02:13] Leading Educators’ mission and unique research on instructional improvement.
[00:04:31] How coherence helped partner districts like Charleston achieve gains during COVID.
[00:07:47] Partnering with PlayLab and CZI to build AI tools rooted in high-quality curriculum.
[00:11:54] Redesigning school staffing and forming AI School Teams Collaborative.
[00:15:07] How teachers’ mindsets on AI shift when they get hands-on experience.
[00:30:07] Future-ready skills: pilots in Denver and Boston connecting AI, civics, and leadership.
[00:37:47] Why authentic, student-driven projects can transform both learning and teaching.

😎 Stay updated with Edtech Insiders! 

🎉 Presenting Sponsor/s:

Innovation in preK to gray learning is powered by exceptional people. For over 15 years, EdTech companies of all sizes and stages have trusted HireEducation to find the talent that drives impact. When specific skills and experiences are mission-critical, HireEducation is a partner that delivers. Offering permanent, fractional, and executive recruitment, HireEducation knows the go-to-market talent you need. Learn more at HireEdu.com.

Every year, K-12 districts and higher ed institutions spend over half a trillion dollars—but most sales teams miss the signals. Starbridge tracks early signs like board minutes, budget drafts, and strategic plans, then helps you turn them into personalized outreach—fast. Win the deal before it hits the RFP stage. That’s how top edtech teams stay ahead.

As a tech-first company, Tuck Advisors has developed a suite of proprietary tools to serve its clients better. Tuck was the first firm in the world to launch a custom GPT around M&A.

If you haven’t already, try our proprietary M&A Analyzer, which assesses fit between your company and a specific buyer.

To explore this free tool and the rest of our technology, visit tuckadvisors.com.

[00:00:00] Chonghao Fu: So if we want the same goal for our kids, we actually have to acknowledge the difference in the future. And she said, one of the things I'm encountering is that a lot of our more experienced teachers might be the ones most distant from AI technology. And there's also the threat of a world in which kids are bringing in a whole new set of technologies that you yourself have not played with and are less familiar with.

And so how do we create spaces for people to play, to experiment, to make things, to understand the limitations? And then bring that back into their craft so that they can also feel again, like pilots and not passengers. I think we're just not creating that opportunity, and that's way more than AI literacy.

It is also more like AI use and confidence and creation.

[00:00:44] Alex Sarlin: Welcome to EdTech Insiders, the top podcast covering the education technology industry, from funding rounds to impact to AI developments, across early childhood, K-12, higher ed, and work. You'll find it all here at EdTech Insiders. Remember to subscribe to the pod, check out our newsletter, and also our event calendar.

And to go deeper, check out EdTech Insiders Plus, where you can get premium content, access to our WhatsApp channel, early access to events, and back channel insights from Alex and Ben. Hope you enjoy today's pod.

For our interview today, we are here with Chong-Hao Fu. He is the CEO of Leading Educators, a national nonprofit specializing in comprehensive instructional improvement, including the use of technology. Over the past 14 years, he's worked to scale exceptional teaching and leadership in some of the country's fastest-improving districts while exploring how emergent technology like AI can bring new possibilities to instruction.

Welcome to EdTech Insiders. Hi Alex. Thanks for having me. This is gonna be fun. Yes. So first off, for those in our audience who may not yet have heard of Leading Educators and what you do, can you give us an overview of what you specialize in and some of the really interesting research you've been doing over the last few years to improve instruction through technology and teachers?

[00:02:13] Chonghao Fu: Absolutely. So Leading Educators was started 14 years ago. We focus on the coherent improvement of school systems, and one of the things that makes us unique is we have an incredible body of research. In fact, we're the first professional learning organization in 10 years to achieve Tier 1 What Works Clearinghouse evidence, from RCTs across a number of sites.

And it's really about how do we set up adults with everything that's gonna allow 'em to coherently improve and do their best for kids. And to do that, we build in a lot of alignment from teachers to teacher leaders, to principals, to principal managers, all up and down the system. So we received a MacKenzie Scott grant about four years ago, and we set up an innovation team to explore three key questions that no partner was yet asking us to work on, but we knew were critical for students given the changing state of the world.

So one question we decided to dig into was AI-enabled instruction. How would teaching and pedagogy change given all the emerging tech that's coming up? Second, we set up projects to think about, well, how is the role of the teacher actually gonna change in this new context? And then finally, we've been digging into future-ready skills,

'cause we know that the standards and competencies that students will need to really open up opportunity in the future are going to be different. The current content standards are still important, but they're likely not sufficient for what kids really need to thrive. So that's what we've been working on, and a big part of my job is helping to integrate the innovation priorities we have as an organization with the core work of helping to improve school systems based on where they are.

[00:03:44] Alex Sarlin: Yeah, well first off, you definitely get the prize for clearest vision of the future, because four years ago was right before all of this AI stuff came out, and those topics could not be more relevant. You know, you must have had a great crystal ball to see that: AI-enabled instruction, the role of the teacher in this new era, and future-ready skills.

That is what everybody is talking about now. So let's dig into some of these things. I mean, before we get into those, even though they're incredibly interesting, let's actually look at the coherence. You mentioned coherence is sort of a core aspect of Leading Educators and that you've had these incredible effects.

Yeah. Through randomized controlled trials. When you look at some of the systems around the country, what are some of the elements that work for instructional coherence? How should they think about it? And what are some of the core strategies that you tend to implement to help schools really improve their practices?

[00:04:31] Chonghao Fu: Yeah. Well, this is my favorite topic. One of the things that we have been really disheartened by, I think, as an education sector is the slide that obviously happened in student results during COVID. And Sean Reardon and his team at Stanford have been gathering this big national data set from 2019 to 2024 to understand what actually happened.

And across the nation's 200 districts, only five of them managed to improve reading scores year over year during the years of the pandemic. Four of those five are Leading Educators partners.

[00:05:01] Chonghao Fu: And in fact, the district that had the biggest gain, Charleston, South Carolina, is actually one of our core partners. And here's what we were working on.

We know from research from the Carnegie Foundation from 10 years ago that when you're a new teacher, you might actually get feedback from as many as 18 different people. There's this famous study in Baltimore called the BTEN study where they found that new teachers had mentors, they had coaches, they had principals, and all of that adds up to a lot of noise, and it doesn't help people know what to focus on.

'Cause really, people can only get better at one thing at a time, and they need more coherent support. And so in Charleston, what we did is we supported the district's turnaround schools, but then across the whole district helped them adopt better curriculum so that teachers are using the most research-based stuff for kids.

And then we aligned all of the learning experiences back to the curriculum. So if the idea was, hey, we're gonna read more challenging, rigorous texts, but we're gonna help students do the thinking and discussing of that text, that's a very common look-for when people are using EL Education. That's the look-for for the teacher, so everyone knows what that looks like, but also the coaches know how to coach for that,

principals know how to monitor and support that, and then principal managers know how to support the principals. And if all four of those layers have the same look-for, and they understand their role in supporting it, it turns out you can make things happen really, really quickly, 'cause everyone's now focused on the same thing.

They all know how they're supporting it, and you just keep rolling cycles like that every quarter on whatever is most important right then for teachers to support kids.

[00:06:27] Alex Sarlin: I love that. So in some ways you're taking the curriculum, making sure it's high quality, and then making it the spine that everybody interacts with.

I mean, when I hear about 18 different inputs, and then teachers have all of these students that they're trying to differentiate for, and then up to 18 different mentors and coaches and various types of people in the building telling them how to improve, that is a very tough role to be in. You're getting squeezed from both sides.

Yes. But I can imagine that that type of coherence clarifies things, 'cause you know exactly what you're talking about. You're talking within the same language and you're talking about the same goals.

[00:07:00] Chonghao Fu: That's exactly right. 

[00:07:01] Chonghao Fu: And it's particularly critical now because, as we know, we're about to ask our schools to do more with less based on some of the policy changes and budget cuts.

And so it's really important for people to take stock of what actually are all the inputs and activities in our system, what we believe is having the most impact, and how we send clear signals around what's most important, and then align supports to that. And obviously that's true in the current state, but that's also gonna continue to be true as we think about AI-enabled instruction and ed tech in the future.

[00:07:29] Alex Sarlin: Yes. Let's talk a little bit about that. You know, you've mentioned your three pillars that you've been studying. Yeah. The things that you expected people to ask you about, and I'm sure they're beginning to ask you about them a lot these days. Yeah. So when you talk about the high-quality instructional materials like EL Education, how do you see the relationship between that type of curriculum and AI?

[00:07:47] Chonghao Fu: Oh my gosh. So I think this is huge, and this has actually been one of our biggest R&D areas. We've been working really closely with the team at PlayLab. I don't know if Yusuf has been on, but PlayLab is committed to building open-source AI infrastructure for educators to use, and they've been participating in a project where open-source, high-quality materials

are being knowledge-graphed by the CZI team so that large language models can reference them more accurately. And so using those materials, we've been building apps and tools aligned to Illustrative Mathematics (it'll be almost a year in December since we started this project), and they're about to roll out to a number of districts and charters across the country.

What's exciting about this is when we're building tools for Illustrative Mathematics, we're using the AI to leverage everything we know about the curriculum, everything we know about coaching and learning science, but then trying to design really usable tools. So I'll give you an example. One I'm kind of obsessed with is: what is the very best 10- or 15-minute app that could help a teacher get ready for his or her next lesson?

And so some of what's been created is a teacher will just ask AI, hey, can you shorten this lesson? Can you create more problems? Can you make these problems more real world? In those things, AI is doing almost all the lifting and all the thinking for the teacher, and it's not necessarily helping us as educators internalize the content.

And what we know from cognitive science is that you actually learn more when you're asked questions and you have to do the active retrieval and the meaning making. So our app is actually designed around, okay, what is the most important math for the next lesson? What are the most likely student misconceptions that you will encounter? And if you encounter those misconceptions, how will you actually reframe?

And as you give your responses, the AI will actually supplement your answers with everything that it can pull from the curriculum, from your district's professional learning, from your school's priorities. Like everything will kind of get filtered in as you are planning and engaging in a really tight way.

And then, because we've been able to actually build in visualizations, you can see additional math representations and you can start engaging with that. And so by the end of 10 or 20 minutes, you can actually do really thoughtful lesson internalization in a way that's much more interactive than what we have been doing, which is to read a very dense lesson plan or unit plan and try to annotate it with Post-its.

But it's still making sure that all the thinking is happening within the educator, and it's really setting up the educator to have the most productive use of planning and internalization. So that gives you a sense of the type of thing that we're working on.

[00:10:19] Alex Sarlin: Yeah, that's a really exciting vision for how teachers can do their planning, and I love how it leverages the expertise of the teacher and keeps the educator absolutely central and core to the experience, but also allows everything to be interconnected, and that's what PlayLab and the knowledge graph work do.

It says, well, let's have every element of the curriculum, any question, any PD, any pre-read, all at your fingertips. But still, you are driving as the teacher. It's not just reading or internalizing; you're driving, you're making sense of it, so that you're ready when you walk into the classroom. Yeah, that's a really exciting vision.

[00:10:52] Chonghao Fu: You hit the nail on the head. So one piece obviously is the coherence we already talked about, and kind of carrying that with us. But the second mantra of our team is: how do we ensure that teachers are the pilots and not passengers of this work? They have to be in the driver's seat. Their agency and their expertise have to be centered and supplemented and scaffolded, and the job has to be made more coherent.

And I think one of the early lessons of EdTech is we have thousands and thousands of apps, but they aren't necessarily being used or coherently integrated because we haven't situated teachers, like you said, in the driver's seat of the work. 

[00:11:27] Alex Sarlin: Exactly. So when you think about teachers being in the driver's seat, and you think a lot about redesign, you know, getting schools to think about teachers and making them more effective, you've talked about redesigning school staffing as well.

So I'm curious how you think about it. What are some of the innovative models? What are some of the ways you tell schools to think about how to get teachers who are engaged, excited, who are ready to develop and grow, and who, frankly, are ready to embrace some of this technology as well?

[00:11:54] Chonghao Fu: So based on a lot of our research at Leading Educators, we know that we get the best impacts for kids when we're not working in isolation with just teachers or just principals, but we're working with multiple layers. And particularly we found that working with a school team, where it's the principal and a team of teacher leaders, is really amazing, 'cause then there's that school-level coherence and there's also just peer accountability and reinforcement of the work.

Mm-hmm. And so last year, with an amazing organization, The Learning Accelerator, now Full Scale Learning, we launched this AI School Teams Collaborative. We put out a quick call across the country: hey, who wants to think about school-level AI practice change? And we quickly got 20 innovative schools across the country.

And what was exciting about this: this was, to our knowledge, the first project where it was a principal and a team of teacher leaders coming together to think about, well, what exactly is the change in practice, in instruction, and in the way that we run our schools, based on the emerging technology that's surfacing.

And we used a framework called VAT, the Value Add of Technology and Teaching, that we had spent two years developing with the Google for Education team. And what's cool about VAT is it's rooted in trying to understand how technology can actually add value to teaching and learning. And it has three core value adds.

One is around do more, which is the efficiency plays. Mm-hmm. Do better, which is around relevance and targeted learning. But then do new, which is a whole set of authentic, real-world experiences that we've always wanted kids to have, that technology can actually help us better unlock at this moment, in a way that is really, really promising.

And so we worked with this set of 20 schools and we said, where do you actually want to do more, do better, or do new? And let's help you identify where that could live in terms of planning, in terms of instruction, in terms of parent communication and teacher learning. And we are just about to publish, I think by the time this interview airs, case studies of all of these schools and what they took on and what they led.

And so I think what's amazing is folks obviously want to improve teaching and learning at their schools, and having a container where people can work together and where they can have access to really amazing technologists, ed tech tools, and coaches actually allowed these schools to put their visions into practice.

[00:14:12] Alex Sarlin: Yeah, it's incredibly exciting to hear, and one of the things we talk about a lot on the podcast, and I'd love to ask you about this frankly, is that when you read some of the popular media I've been chronicling over the last month, there have been maybe eight or nine stories about teachers, especially creative writing teachers or professors, who are struggling to incorporate AI or who hate AI because they feel like it's allowing their students to cheat or to subvert their assignments.

And the popular media narrative has been that teachers hate AI; I have just seen it over and over again. And yet you mentioned, hey, we put out a call for people who wanted to do innovative work with AI and collaborate in all these ways, and you got 20 schools in a heartbeat.

And I wrestle with this because these feel like opposite narratives. I'd love to hear you talk about your experience here. How are teachers dealing with this age of AI? Are they excited? Are they cautiously optimistic? Are they throwing up their hands?

[00:15:07] Chonghao Fu: Yeah, I hear the same narratives, and I think there is a lot of either-or thinking right now. You know, we're an organization that has grounded a lot of our learning in experiential learning, and one of the keys to experiential learning is that actually doing the thing is the fastest way to change mindsets.

And on our team, this is two years ago now, we had an all-staff retreat, and we have a lot of AI skeptics at Leading Educators, as well as an innovation team that's really advanced in its thinking. And we pulled up and we asked folks, who believes that AI is gonna make education better?

Who believes that AI is gonna make education worse? And let's actually gather all of the potential reasons, and then let's think about where we all are. And there's quite a spread at Leading Educators. And then we said, yes, these things are real possibilities, and the thing that's actually gonna move us from the more dystopian future to the future we want is actually our own agency and what we do about it.

So let's actually dive in, let's make some things together, and let's see. And so when people start to play, when they start to do data analysis with it, when they start to revise operational systems within our organization, and they actually work and create things, mindsets start to shift really quickly.

You start to realize it is an imperfect tool. I think John Bailey at AEI is very fond of saying that AI is like an eager but imperfect assistant. No one's gonna turn down an extra assistant in education, but nor are they gonna hand everything over. Right, exactly. So as you get better and you start to understand what it can do and what it can't do, and as you start to realize that that's gonna constantly shift, that changes your own mindset.

And I think if I'm an educator right now, one of the biggest challenges is that I am not the one necessarily deciding my district's academic integrity or cheating policy. Right. And so we've talked to teachers who would like to use AI more, but they're worried that they might actually get kids in trouble in another class based on the definition of cheating.

And so I think there does need to be district-level leadership and coherence that makes it safe for people to experiment. But I think everyone recognizes that the world is changing. We interviewed a set of teachers in Boston two weeks ago that were part of this future-ready leadership skills pilot that we're doing with a school in Boston.

And one teacher said it so eloquently when she said the goal is the same in terms of the opportunities we want for kids. The goal is the same, but the future is different. So if we want the same goal for our kids, we actually have to acknowledge the difference in the future. And she said, one of the things I'm encountering is that a lot of our more experienced teachers might be the ones most distant from AI technology.

And there's also the threat of a world in which kids are bringing in a whole new set of technologies that you yourself have not played with and are less familiar with. And so how do we create spaces for people to play, to experiment, to make things, to understand the limitations, and then bring that back into their craft so that they can also feel, again, like pilots and not passengers?

I think we're just not creating that opportunity, and that's way more than AI literacy. It is also more like AI use and confidence and creation. 

[00:18:16] Alex Sarlin: No, that's fantastic. And you got right to it. I was asking about basically how teachers feel about this AI coming into their world. There's this huge range of different feelings, and you're saying even within Leading Educators there is. It makes a lot of sense.

What I'd love to ask you about here, and I love your example, is that PlayLab is an organization that is also dedicated to that experiential learning piece, where it says, get your hands dirty, try building something that's gonna help you tomorrow. That's right. And it changes a lot of minds that way.

And you mentioned the need for coherence here, that even the teachers who wanna be very cutting edge with AI, they wanna use it, but they're afraid of getting their kids in trouble by saying, oh yeah, of course you can use it. I mean, Ethan Mollick always says, make the kids use it. That's his philosophy.

But if a teacher in a school does that. Then the kid goes to the next class and does the same thing and gets kicked outta school. That's a problem. So I'm curious, you mentioned the coherence, the need for coherence here. This feels like a sort of burning need for coherence within schools and universities, frankly, with this feeling of, well, if there isn't a coherent policy about AI usages.

especially for integrity, then everybody is left on their own, and frankly the students are going to jump in anyway. They're using these tools, they're using apps, but the teachers are left without any strategic approach. How do you think we're gonna get to coherence there?

[00:19:32] Chonghao Fu: I love this question.

You ask great questions. 

[00:19:35] Alex Sarlin: Thanks. 

[00:19:35] Chonghao Fu: Yes, and I think it's two things, right? Because it's both coherence, but it's also continuous improvement, 'cause obviously what AI can do is gonna be different three months from now, six months from now. And so we need something that is both coherent but that also brings people along to continuously improve and not view education as a really static thing.

And you mentioned PlayLab and Yusuf and team. So we were part of a convening that Yusuf and PlayLab and Alan Chang from the Coalition of International and Outward Bound Schools in New York hosted, with 60 high schools coming together. Leading Educators actually led the math room of teachers working on math and AI apps.

So folks from New York City schools, from Bank Street, from New Visions, many high school teachers, are coming together to design and build apps and tools. Now, obviously not everything is gonna be high quality, not everything is gonna be coherent, and there obviously need to be guardrails and screening.

But there is a lot of agency, there's a lot of testing, a lot of brainstorming. And one of the things we did in the math room was anchor the development to a cycle of learning, in terms of the planning, the instructional practice, the coaching, the data analysis. So the room was developing tools aligned to the natural planning cycle that teachers have to engage in for every lesson, in every unit, and thinking about where the opportunities are in that cycle to develop things that are useful, that either save time or allow us to give better instruction.

And then, across all of these high schools, one of the things that Alan and Yusuf did that I thought was so powerful was they said, we're gonna anchor this entire three-day summit around the idea of productive struggle.

Where can these AI apps actually encourage our students to think deeper and not take shortcuts? Where can we encourage productive struggle and not just lighten the lift? And then there's nearly a hundred educators working over three days developing apps and tools.

Then they said, you know what, we're gonna fund some design fellows across all of our schools who are going to be kind of leading lights in terms of what great AI use looks like. And here are the principles, around productive struggle, around really encouraging student thinking as opposed to diminishing student thinking, that are gonna anchor app development and design.

And so you're putting out a few principles about what good AI use is, and you're highlighting and funding a few leading fellows within your district. 'Cause we know there's an innovation bell curve, right? And if all our energy goes towards people who are resisting, we're not actually going to move forward what's possible.

And so there's always the opportunity to invest in early adopters and innovators and build more concrete examples that people can get really excited about. That's what we were seeking to do in that AI School Teams Collaborative: let's build some models that other folks can see, and then that makes it easier for us to do broader adoption across the sector.

So I think those are some ideas, but I think it's gonna be a repeated process, and for folks who are serious, we actually need to bring teams of educators together pretty routinely and ask, okay, what can the apps do now? What is the agentic version of what we just did? What is the team-based version? There are so many things that are gonna continue to roll, and then we need serious conversations too.

Where might we need to add to the content standards, in terms of things like entrepreneurship or AI literacy? Where might we be able to prune back the existing content standards to create more room? These are things that feel really critical, and we need to have honest conversations about them.

Conversations that both embrace the fear and the possibility and don't paper over the differences. I think that's what's actually needed at this moment.

[00:23:04] Alex Sarlin: Yeah, the case studies that you just mentioned, we will definitely link to them in the show notes for this episode, and I think that's a perfect example of creating models.

I think there are just not enough models right now of how schools can think about this, of how individual educators can think about this, and frankly, even how students can think about this. I think students are left looking at this wall of apps or going to ChatGPT and saying, okay, I'm gonna try to figure out how to use this.

But there's just not enough clarity, I think, in the world about what works, 'cause we're so new in this, to put it simply. Right.

[00:23:34] Chonghao Fu: Yeah. One of the interesting things we did in Harlem this past year, Harlem District 5, where we've been able to support the New York City Reads initiative: we did this work with the Value Add of Technology and Teaching, and we said, hey, let's work together to make the literacy instruction more coherent.

[00:23:49] Alex Sarlin: Mm-hmm. 

[00:23:49] Chonghao Fu: Because we actually know from districts adopt way more digital supplemental materials right now than core instruction. So it can be very confusing around like what to use, what, when, and when. We started the year, many educators, like we don't have enough technology. And we're also overwhelmed by technology because it was almost like there was too much to just take in and to know if you actually had what you needed.

And by the end of the year, that actually flipped. The majority of folks are like, you know what, we actually have what we need. We now just need more time to use it in a coherent way. It's even model building across different apps and tools: what's the vision of a really strong literacy block, and how do these things fit together?

I think that building those things together with teachers is also really, really powerful. And that might change six months from now or a year from now, as different tools evolve.

[00:24:39] Alex Sarlin: One thing that struck me when I read both Khan's book and Ethan Mollick's book about AI and education is that they start the books exactly the same way. They described when this technology first came across their radar; for Khan, it was very early 'cause he was sort of inside the beltway there.

You know, they were like, I stayed up all night, multiple nights. I just could not stop playing with it. I was just pushing on the limits. What could it do? What couldn't it do? What's new? What's possible? And you mentioned earlier this idea of the bell curve of adoption and the fact that getting hands-on sort of changes your perspective.

I think a lot of people who are very scared of AI are scared of it in the abstract, and then they actually start playing with it, and they say it's like an eager assistant who sometimes makes mistakes or has limits. And I wonder, you mentioned this future of AI and teaching, and I wanna get to the future-ready skills next, but how might we get to a place where teachers can feel comfortable playing with these tools as they change, as they grow, in groups like the ones you're mentioning? And how can even seasoned teachers who have been in the classroom for many years just get their hands dirty with it?

How can they play with it and try different things, build their own apps, build their own questions, try adapting lesson plans, or use it to pull data together or to build syntheses? You get where I'm going with this, but as an ed tech field, can we help educators just play with it and not feel so black or white?

For some teachers it's like, oh, this thing is over here and I don't want to touch it, but my students seem to love it, but I don't wanna touch it.

[00:26:03] Chonghao Fu: Absolutely. I mean, I'm gonna start off with an internal answer and then I'll get to a more external one. So at Leading Educators, you know, we trained everyone up.

We've given everyone a basic set of trainings. We acknowledge and embrace the skepticism, and then we stood up an innovation team that could go really deep and go fast. This year we've set the expectation that everyone across the organization really understands how AI will embed in their core work.

And we've set up an AI strategy team that's running all of these projects that are AI-connected, but they will become embedded in the core team over the course of the year. Interesting. And for that AI strategy team, the main bet that we're working on is kind of where we started.

It's this link between high-quality curricula like Illustrative Mathematics or EL Education or OpenSciEd and thinking about, well, where are the AI levers in terms of doing better planning, better instruction, more real-time analysis, better parent communication. There are lots of different places where we can strengthen and support that core instruction.

And once that becomes embedded in all of our professional learning and coaching, it then becomes really just a natural part of teacher learning, the same way, to use a crude example, that computers did. It's perfectly natural to say, hey, let's take out our laptops and let's look at this lesson, and let's look at this data set.

I mean, you could just as easily say, now let's use this app that is gonna help you plan, or this app that's gonna help you differentiate this lesson or analyze this data. And so it's going to become embedded in the core tasks themselves, and it's gonna solve educator problems that are proximate and time consuming and are pulling educators away from the core work of really seeing students and building relationships and helping kids master new concepts.

And so if we can embed it in our core work and not even have it be its own separate thing, but just have it be immediately useful, I think that's gonna be a really powerful entry point. And then, as people are using it, we can do more: hey, if you want to dive deeper into building your own apps and tools, if you want to enhance your AI literacy, take this left turn here. And we're gonna make those easy opportunities as well.

That's what we've seen in our own team over the last two years. Everyone is using it, everyone's reporting on it. We actually had a new executive join our team who is absolutely amazing but is coming from an organization where there was a lot of AI skepticism, and so we're just doing it in our check-ins, in whatever the upcoming task is that she's thinking through.

Well, let's see how AI might enhance our thinking, our research, yeah, our drafting, and then testing it, and then we're gonna do micro-steps every week.

[00:28:46] Alex Sarlin: That VAT model you mentioned, do more, do better, do new, feels like a really powerful framework even for our own work as adults, whether or not you're an educator. In any work, there's this idea of do more: what can this do for me that improves my efficiency or productivity or takes some of the onerous tasks off my plate?

It makes me able to do more agenda-making, you know, than I usually would, because it can do it for me. And then do better: how can I actually improve the output? And then do new. It feels like a really powerful trajectory for all of us to keep in mind, educator or not, about how to move through the AI world.

How do you think about the different things it can do? It feels like you're doing a lot of that internally. That AI strategy team approach, I think, is a really interesting one. I think that type of approach is happening in a number of different organizations, nonprofits and for-profits alike.

You have an internal team with AI at the top of its mandate, and then it becomes baked into everyone's processes, because it's just hard to take a moment, stop your normal work, and make sense of this new technology and what it can do, even if it's powerful. But it feels like that's a really promising approach.

I wanted to ask you about the future-ready skills piece of this. This is the third leg of your MacKenzie Scott grant, of the innovation team mandate, and an incredibly important one. Yes. What were you looking at and what did you come up with in terms of future-ready skills? You mentioned, you know, pruning back certain parts of the standards and adding other things in.

Where's your thinking on that right now? 

[00:30:07] Chonghao Fu: Yeah. Well, since we're newer to this space, let me also just pause to acknowledge all the incredible thought leaders that we've learned from here, like Kim Smith at the Learner Studio, all the team at Big Picture Learning, people like Sujata at Incubate Learning.

There are all of these people who've been thinking about what the future of learning would look like for a long time, and so we're learning from so many examples. And then we're also thinking about our seat. We're working right in the thick of it, shoulder to shoulder with districts and charter management organizations, from the current reality.

And we're thinking about, well, where are the bridges? How do we build that bridge to the future, given that we're in institutions? And so one of the things that we realized, and we did this with the Learner Studio team that Kim Smith leads, was that we could maybe design some cool summer experiences, 'cause there's a lot more flexibility in summer learning.

And so in Denver and in Boston, at the Denver School of Science and Technology and then the Eliot Innovation School in Boston Public Schools, we stood up and ran some summer learning pilots that were at the intersection of AI and civics and leadership. And the core question is, what are the skills that are going to set kids up to thrive in the age of AI, that AI can't replace, that are uniquely human?

And so one of the theories was that it related to civics, what kids would actually want to solve in their communities, and what AI might then allow them to do. And so I was looking at some of the Denver projects, which are incredible. It's high school students working in teams, basically run like a summer camp within the district or charter.

The final projects were really compelling, 'cause it's what kids cared about. So the one I was just watching was a team of high school seniors whose families had been impacted by some of the recent ICE deportation raids in their community. And they had used AI to learn how to do GIS maps. And so they had then made GIS maps that overlaid where deportations were happening in Denver against where the mental health resources were in Denver,

and then against where their families and communities at their school lived. And then they came up with recommendations about where the missing mental health infrastructure might be. Then they also looked at the research on secondary mental health impacts of deportations, not just on those who are detained, but on the family and the community.

And then they made a case for what the resource needs were in their community. And all of that, the presentations, the podcasts, the GIS maps, all of those were AI-enabled. Right. And we helped facilitate the process where students surfaced the civic problem that they wanted to work on.

But they all reported, all the kids were like, okay, I feel dramatically different about what AI can be used for. I feel different about leadership and school, and I feel different about what my own agency is in terms of solving problems. And, you know, Denver School of Science and Technology was one of the schools in our AI School Teams Collaborative, and Zach, who leads their AI strategy there, is an amazing thought partner.

As part of our work together, he was having kids design apps to help with voter registration back before the November election. And so I think there is a thread at that school that is really powerful, and it allowed us to see a very different vision of what's possible, one that I don't think requires us to throw the baby out with the bathwater. We're still gonna need literacy.

We're still gonna need. Literacy. We're still going to need so many of the content standards, and there is actually a space for like a much more authentic, empowered learning at this moment now more than ever. And I worry that we're missing. I mean, that might be the thing more than anything that gets educators really excited.

We've all wanted our kids to see the immediate relevance of our disciplines. Exactly. And to feel more empowered. And I think there's a unique opportunity right now if we design together and lean in. And so that's one of the projects I'm most excited about.

[00:34:15] Alex Sarlin: Yeah, that's an incredible story. Authenticity and relevance feel like really key concepts here, because listeners of this podcast will recognize that I have been talking for two years now about how I'm just desperate for these case studies of students doing incredible things that are enabled by AI,

things that they could never have imagined doing before, that are so much more sophisticated. And I think you just gave one of the best examples I've ever heard of that: taking something of personal relevance that's local, that's meaningful, that's obviously very current.

It's an incredibly modern moment, something that's happening right now, and then being able to do something so complex and thoughtful that it has legal relevance and civic relevance and policy recommendations and all these outputs. And I think one part of that project that really excites me is, as you say, it makes students feel

both that AI is incredibly powerful and relevant to them, but also that school is incredibly powerful and relevant to them, and that the things they're doing in school are actually part of their life and not a parallel track, the things they have to get through to get back to their normal world, which is, you know, dramatic and has a lot of stuff going on.

So I think that's an incredible example. I want a library of these stories.

[00:35:26] Chonghao Fu: Yeah, we have, we're gathering some, by the way. I agree with everything you said, and even the places where the kids got more AI skeptical, I think, are bright spots. So there was a middle schooler in Boston. As part of the final presentations in Boston, they used AI to create podcasts of their work.

And then one of the kids was listening to the podcast and they're like, I didn't say that. They just made that up. And so this idea of AI hallucination, as part of just how it's designed, became an active conversation for middle schoolers in Boston, and so, exactly, it all becomes part of the learning.

[00:36:01] Alex Sarlin: Right. And I wouldn't even consider that AI skeptical. That's AI realist, right? That's knowing the limitations, which are real right now. And this is something that I think a lot of students don't understand when they ask AI to do their homework for them, that it still does make mistakes or have errors sometimes. But yeah, I think there's a media literacy piece to that:

being able to say, oh, that podcast misquoted me, or it put words in my mouth. Wow, I really have to keep that in mind. I have to make sure it doesn't do that, and I have to make sure that anything I listen to, I'm thinking about it through that lens. That's still AI literacy, even if it's not a positive experience.

Hundred percent. Yeah. This is what gets me most excited. You know, we've been talking to so many different founders and policymakers and various people over the last couple of years, but the story you just told, I think, crystallizes so many of the pieces that I'm most excited about for AI, which is that if done right, with really innovative and interesting experts, you mentioned this guy Zach Kenly, I'd love to talk to him.

You know, it can actually take educational experiences and curriculum and standards and students' interests and put them all together. It's such a melange of different things that it's very hard to put all of these different factors together into one assignment or one project, but that is what AI does.

It takes incredibly large amounts of data and makes sense of them and synthesizes them; that's literally its core function. So that is what is happening there. It's taking all of these different students, it's taking the current political situation, it's taking the curriculum and standards and some of the things that need to be accomplished academically, and putting it all together into a project that is

meaningful and changes minds. It's really, really amazing. The next thing I'd like to see is those students presenting that policy to the local community board or whoever could make decisions, and them actually getting those mental health resources. That could be a life-changing experience if they could actually make the change that they're recommending.

Yeah, that's right. 

[00:37:47] Chonghao Fu: Absolutely. Everything comes back then to what is the authentic purpose of education, right? What do we believe to be true of learning? And then even for kids to know that if I use AI for this, I do actually shortcut my own learning, 'cause learning is the result of effort. What exactly am I trying to get to, and what is the purpose of all the things that we're doing?

And to create a space where people can ask those questions. And I think, to your earlier point about some teachers feeling more skeptical, teachers sometimes are not being asked and don't have the space to process and help to shape the course. And so what I think was also really powerful in the summer learning pilot is that the educators were really excited.

Those were the teachers who said, you know what, the goal is the same, the future is different. That's what I'm taking away from this process. And so those are teachers who are also leaving really excited about, well, what can we create next that builds from everything we know works in schools and also embraces the emerging tech?

[00:38:45] Alex Sarlin: Goes back to the coherence, right?

If everybody is pulling in the same direction, if the teachers and the students and the policymakers and the heads of the school and external organizations like yours who are part of this are all saying, we all see this vision of future-ready skills and using AI to teach new things and be more authentic and relevant,

we're all excited about it, then amazing things can happen once everybody gets aligned. So last question for you, and it's probably the hardest one of all: how can we get to a world in which projects like that, projects that are authentic and relevant and team-driven and coherent, that bring a whole community together and use technology in new ways, become more and more of the norm in schools?

[00:39:28] Chonghao Fu: Yeah. This is where I'm actually more optimistic than many of my peers, because at Leading Educators, we've worked on a number of projects where we've seen things scale across an entire system. Like with DC: 10 years ago we worked on what it would look like to align all of teacher learning with the curriculum, and that became the LEAP program, which is still running in DC.

It's now been copied by a number of districts across the country. This summer learning pilot was really quick to put together because there's a lot of flexibility in summer school and summer learning, and so we could quickly design these with many school systems across the country and get them stood up for summer.

And one of the amazing things: there's a study from a math program called Mathalicious, which is a more engaging, problem-solving math curriculum. And it turns out that when teachers teach even one Mathalicious unit, I love the name, not only does it make teaching and learning better for that unit, it actually changes how they teach the whole year,

because powerful learning experiences like that also impact educators, not just students, and some of those skills and dispositions and mindsets are transferable. So if we could design really powerful units, summer experiences, beginning-of-the-year experiences, end-of-the-year experiences, maybe it's that month after state testing when people are asking, what's gonna be really meaningful?

There are ways in which we could change mindsets, skills, competencies really quickly, and I think once we get folks excited, we can scale. I also think, in general, we're at an inflection point where people are asking the questions of, okay, what actually is the purpose, and how do we design a future-ready education system for our kids?

And so I think there's gonna be space to think bigger and to think differently, now more than ever. And so I'm excited to be a part of that process.

[00:41:16] Alex Sarlin: Yeah, that's a really exciting vision. It's also very smart. I think there's an implicit lesson in there that a lot of the edtech folks listening to this should take note of, which is that the month after testing, or summer, there are areas of the school year that are lower stakes, so to speak, where people are a little more open to trying new things.

And if you find those windows, people are maybe open to trying new things in a way that they wouldn't be at other times. That's a very good insight. I mean, that's meaningful, that's a human-driven insight, and it's real. And I love your point about being able to scale a summer project; that makes sense for the same reason, right?

People aren't always that attached to what they're doing in the summer. They don't feel as constrained by the standardized testing or the curricula. So I think that's fairly powerful, to find the areas around the edges where you can inject this. And then, to your point about Mathalicious, once people get a taste of what learning might look like, it can be transformative for educators and students alike.

I just can't wait. I've been waiting almost three years now for the moment where people start to be like, oh my God, you can do that? That's what school could feel like. You just gave a really, really good example of it. A lot of edtech companies are playing with that in various ways, and it's exciting to see.

Thank you so much. This has been a great conversation. Chong-Hao Fu is the CEO of Leading Educators, a national nonprofit specializing in comprehensive and coherent instructional improvement, and they've done really interesting research about AI-enabled instruction, the role of the teacher, and future-ready skills. Thanks so much for being here with us on EdTech Insiders.

[00:42:47] Chonghao Fu: Such a pleasure, Alex. Thanks for having me.

[00:42:49] Alex Sarlin: Thanks for listening to this episode of EdTech Insiders. If you like the podcast, remember to rate it and share it with others in the EdTech community. For those who want even more EdTech Insiders, subscribe to the free EdTech Insiders newsletter on Substack.
