Edtech Insiders

Week in EdTech 10/1/25: OpenAI’s Sora 2 Transforms Learning, Anthology Files for Bankruptcy, Code.org Launches Hour of AI, LingoKids & Outsmart Funding, and More! Feat. Sari Factor & Jason Fournier of Imagine Learning and Caleb Hicks of SchoolAI

Alex Sarlin Season 10


Join hosts Alex Sarlin and Matt Tower as they break down the biggest headlines shaping the future of education technology, from OpenAI’s new video model to major EdTech funding rounds and the rise of curriculum-informed AI.

Episode Highlights
[00:03:56] EdTech Week 2025 preview at Columbia University featuring OpenAI’s education keynote.
[00:06:50] SETDA report: AI overtakes cybersecurity as top K–12 tech priority.
[00:09:05] OpenAI’s Sora 2 video model brings lifelike multimodal AI to education.
[00:14:10] Rise of AI actors like “Tilly Norwood” underscores new media literacy concerns.
[00:18:30] Code.org launches Hour of AI to expand AI literacy in schools.
[00:24:40] Debate: Is “learn to code” still essential in the AI age?
[00:29:30] Microsoft Copilot adds Study Mode with shareable learning tools.
[00:32:00] Anthology (Blackboard) bankruptcy exposes failed 2021 PE merger.
[00:38:10] Funding: LingoKids raises $120M; Outsmart (ex-Duolingo) raises $40M.
[00:43:50] National test score slump fuels “End of Thinking” education debate.
[00:46:10] Calls for clear new visions of learning in the AI era.

Plus, special guests:
[00:53:00] Sari Factor, Vice Chair & Chief Strategy Officer, and Jason Fournier, VP of Product Management for AI & Data at Imagine Learning, on curriculum-informed AI.
[01:04:00] Caleb Hicks, CEO & Co-founder of SchoolAI, on AI tutors and personalized learning.

😎 Stay updated with Edtech Insiders! 

This season of Edtech Insiders is once again brought to you by Tuck Advisors, the M&A firm for EdTech companies. Run by serial entrepreneurs with over 30 years of experience founding, investing in, and selling companies, Tuck believes you deserve M&A advisors who work as hard as you do.

Ednition helps EdTech companies reclaim rostering and own their integrations. Through its flagship platform, RosterStream, Ednition replaces costly data providers and complex internal infrastructure with direct, secure connections to any SIS or data source. The result: scalable integrations, lower costs, and seamless experiences for schools and districts of every size. With Ednition: Reclaim rostering. Own your integrations. Cut the cost.

Innovation in preK to gray learning is powered by exceptional people. For over 15 years, EdTech companies of all sizes and stages have trusted HireEducation to find the talent that drives impact. When specific skills and experiences are mission-critical, HireEducation is a partner that delivers. Offering permanent, fractional, and executive recruitment, HireEducation knows the go-to-market talent you need. Learn more at HireEdu.com.

[00:00:00] Matt Tower: I think that a lot of the time, when we react outta fear, it's usually because we don't have the skill set required to properly weigh the pros and cons, so we just assume it's all bad. So yes, I'm super excited that students will get this exposure. I hope it is as successful as the Hour of Code has been, and I think all 50 states now have either a computer science requirement or at the very least a computer science option in every high school across the country.

I think that's a really big deal, and it took more than a decade of effort to get there. I hope AI gets there, and I hope we do it for adults just as much as we do it for children.

[00:00:41] Alex Sarlin: Digital literacy means something very different today than it did a month ago. I'll just say that straight out.

It is becoming near impossible to distinguish between video evidence of something real and video evidence of something that somebody has created, and this is on day one. Nobody's even trained on it, nobody's even been playing with it or actually trying to master the art of how to make it convincing.

So, I mean, that part is incredibly scary. I say this with a smile 'cause I'm excited in a lot of ways about this world. I think it's really fascinating, exciting, just to be in a world where people can make anything they can imagine, which is basically what we're talking about here, but there's enormous risk as well.

Welcome to EdTech Insiders, the top podcast covering the education technology industry, from funding rounds to impact to AI developments, across early childhood, K-12, higher ed, and work. You'll find it all here at EdTech Insiders. Remember to subscribe to the pod, and check out our newsletter and also our event calendar.

And to go deeper, check out EdTech Insiders Plus, where you can get premium content, access to our WhatsApp channel, early access to events, and back-channel insights from Alex and Ben. Hope you enjoy today's pod.

Welcome to This Week in EdTech from EdTech Insiders. We have a very exciting episode for you today. We have with us one of our very favorite guest hosts of all time. I've known him forever; he's been following the EdTech space and reporting on it in incredible ways for years.

Matt Tower, from Whiteboard Advisors. Matt, welcome to the pod.

[00:02:30] Matt Tower: Thanks for having me, Alex. I'm excited to chat.

[00:02:32] Alex Sarlin: I always love talking to you. You always come with so much insight about the overall trends in this space; I always learn a ton when we get to chat. So a couple quick things coming up on the pod: we talked to Jamie Candee from Edmentum.

Always doing incredible things with career-connected learning; I highly recommend checking that out. As well as Nick Chen, who has just been doing PlayMath. He's an ST Math product manager starting his own really exciting math games site. Check that out. And last, this week we talked to a teacher and his 11-year-old student who are co-founders of an AI ed tech company.

It was such an amazing interview. The company is called CoGuide.ai, and they're just incredibly fun to talk to. And we talk about micro companies and new types of business. A teacher and a student starting a company together is a new one for me. I'm thrilled, and it was so fun to talk to them, and they're doing really interesting things.

And we talked to Vanessa Gill and Lucy Stevens of Social Cipher, which makes games for neurodivergent youth. So incredible stuff on the pod. And of course, in the event space, we have EdTech Week coming up very, very soon. It's coming up so quickly. Our EdTech Insiders preview newsletter is coming out, so keep an eye on your inbox for that.

It has dozens of sessions that we recommend and all sorts of things. Ben will be there. Matt will be there. I will be there. We are all hosting sessions in different ways, and we've been doing a series of webinars about EdTech Week with some amazing guests as well. So I hope you're following all of that.

But actually, what are you excited about for EdTech week, Matt, before we even get into the news? 

[00:04:06] Matt Tower: I think it's become an annual event on my calendar. I think of the tentpole events of the year as GSV in the spring (South By sometimes as a nice warmup to GSV), EdTech Week in the fall, and HolonIQ. Those are really the moments during the year where I feel like I get to see folks like you and others in the space in person, which, as fun as remote work is,

I really do sort of cherish those moments in person, and it makes it easier the rest of the year.

[00:04:34] Alex Sarlin: Couldn't agree more. And these tentpole events are on different coasts; I feel like it creates a really nice sort of tick-tock, where you get to check in on both sides of the country at two different but related events.

You do see a lot of the same people at some of these, but not all the same people: lots of different types of ed tech entrepreneurs and investors and watchers and operators. I'm really excited about it this year because I just feel like EdTech Week has been expanding and expanding. This year it's at Columbia University, at Alfred Lerner Hall, a big building on the Columbia University campus, which is new.

People who have been there in previous years know it's sometimes spread around different buildings downtown; it was at NYU last year. Now it's moving uptown, and I love Columbia. I'm a Teachers College graduate and grew up nearby. I didn't know that. Cool. Oh yeah. Yeah. So this is like coming to my hometown turf.

My dad went to Columbia, my mom went to Barnard; I have a lot of history there. So it's absolutely exciting to come home and see EdTech Week up in Morningside Heights on the Upper West Side. It's really exciting, and we're gonna see incredible people. Among the amazing guests, Leah Belsky from OpenAI is giving the keynote on the first day about all the things that are happening with OpenAI and education and all the training they've been doing with the AFT and UFT.

There's just gonna be a lot there. So, Matt, let's start with the AI space, as we like to. What was the biggest story for you in AI this week?

[00:05:52] Matt Tower: Great question. I think, you know, we were sort of chatting before that it feels like we had the back-to-school rush, where there were these big monster announcements, Anthropic and OpenAI and Google all doing their sort of proprietary study modes.

And now it feels a lot more like block and tackle, and I think we're starting to get a feel for the response and the usage at schools. Right? So, like, I know one of our topics for this week was the SETDA report. Now that all these tools are accessible to students, the administrators and teachers are figuring out: how do we set the right guardrails?

And that was their sort of top concern in this survey of education technology leaders: do we set this at the classroom level? Do we set this at the school level, the district level, the state level? And you see implementations literally across the whole spectrum, even some chatter at the federal level.

Although I don't think we'll see a federal level AI policy. So I think, you know, it's interesting to see like companies do big wave of announcements. Teachers, administrators, and students react. And we could talk about how like maybe they should be a little bit more in sync. But it's interesting to watch.

[00:07:07] Alex Sarlin: It is. SETDA is the State Educational Technology Directors Association, which put out this report with survey data from 47 states. And the big news was that this is the first year in which AI took over from cybersecurity, which has been the top issue for the state education technology directors for several years.

AI just surpassed it as the number one thing that they are prioritizing and thinking about this year, sort of across the board. But to your point, the policies are all over the place, and everybody has to be very reactive, because you have the big tech companies with this drumbeat of new features, of new opportunities for people to use AI to learn, to create, to discuss; they have companions.

And meanwhile, you have all of the education space saying: okay, you're moving really fast, and we are convinced that there's something there. There's a great New York Times article this week about how schools have really gone from banning ChatGPT to embracing it and really starting to believe that there's some there there, which I'm really excited about.

But the devil's always in the details, and there are so many details when it comes to AI policy: what's allowed and what's not allowed, and how does it interact with existing systems. So I agree with you; I like that metaphor of block and tackle. The big tectonic plates move, and then you have to figure out exactly how it's gonna affect any given state, district, classroom, teacher.

What do you do about problems that may arise or are gonna be arising? Speaking of the tectonic plates, though, what I thought you were gonna say, 'cause I know this is something on your mind, is one of the big announcements this week from OpenAI: they put out their really enhanced video model, which is called Sora 2. And by put out, I mean they announced it; they put out some really amazing videos about what it can do.

It is in private beta; it is not accessible by very many quite yet. But what it showcases is that multimodal, as we like to call it on this pod, video-based AI is going very, very quickly. Between Google's Veo, which we've been talking about here, and OpenAI's Sora 2, which now includes sound as well, it's just incredible: realistic, very, very indistinguishable from reality.

We are really entering that multimodal era of AI, I think in earnest now. What did you make of the Sora announcement?

[00:09:20] Matt Tower: So first of all, to give the concrete "what is the product": it's a ten-second video. To your point, multimodal, it has audio and visual cues, and effectively no rules, he says, sort of cringing.

You can say, I wanna see Alex Sarlin in Pokemon Yellow trying to catch a Magikarp, and it will do its best to create that. Right. And so you're seeing, one of the jokes was, they took the guardrails off particularly for videos using Sam Altman as a protagonist, and they did that, I think, to allow him to be a little bit of a punching bag.

But I think the most interesting part about it is it really does incorporate existing characters: there are videos of Mario, there are Pokemon videos. There are... I'm trying to think of some of the others I saw.

[00:10:09] Alex Sarlin: I mean, all sorts of celebrities doing all sorts of things. Exactly.

[00:10:12] Matt Tower: Yeah. And very lifelike. There were certainly some where you're scrolling through your X feed or whatever and, like, whoa, I can't believe that happened. And it turns out it didn't happen; it was made up.

And I think that gets at what we were just talking about: how is a teacher or an administrator supposed to respond to this? The immediate inclination is to withdraw and say, I have no idea how to manage this, so I'm just gonna ban it.

And fortunately, or unfortunately, I'm not sure which, Sora is still not broadly accessible. It's the number one app in the App Store, but it's still very much sort of invite-only, you-gotta-know-a-guy type of thing. But I don't know, my reaction was: this is really cool, but I also totally get the fear

A lot of people have. 

[00:10:56] Alex Sarlin: Yep, I totally agree. It raises the stakes of the risks and the rewards significantly. And, you know, anybody who's played with Runway or Kling AI, which is an incredible video creator, knows these video tools are very powerful, and the realism has come along so quickly. You can do any style; you can make it look like particular types of film.

You can make it look like particular types of video. You can include any kind of character or real person. I was playing with HeyGen, which has a video creator now inside its tool suite, where you can actually give it a topic and it'll create whole videos, with sound, pretty similar to what you're seeing there.

I'm not sure what it has under the hood, but we are entering an era that is just totally unprecedented in human history. We say that a lot in relation to AI, but digital literacy means something very different today than it did a month ago. I'll just say that straight out. It is becoming near impossible to distinguish between video evidence of something real and video evidence of something that somebody has created.

And this is on day one. Nobody's even trained on it, nobody's even been playing with it or actually trying to master the art of how to make it convincing. So, I mean, that part is incredibly scary. I say this with a smile 'cause I'm excited in a lot of ways about this world.

I think it's really fascinating, exciting, just to be in a world where people can make anything they can imagine, which is basically what we're talking about here, but there's enormous risk as well. That said, the educational possibilities are incredible. For anybody who has not played with any of these video tools: I have a video of a teacher opening up the world in front of the classroom, and you see all the layers of the inside of the earth, with the iron core and the magma and the mantle, and it all looks totally realistic and live.

Or videos where you have a teacher going inside an atom and down to different quantum levels. These are a prompt away right now, and I really encourage you to try them, because they're incredible. But at the same time, yeah, everything changes when video is involved. Just as with the internet: everything changed when the internet became a video platform, and I think we're heading into that era very quickly.

[00:13:00] Matt Tower: Yeah, and I think on the workforce side of things, one of the interesting things I saw was, you know, James Cameron, of Avatar, Titanic, Terminator, well known for his CGI chops. He was talking about how it's less that, sure, probably some people will lose jobs related to this; his point was more that when you're editing a massive movie like Avatar, the compute power just to render all of the different graphics used to take days and days and days.

And so he was excited that his editing professionals could compress those timelines to hours or even minutes with all these new technologies. So he was talking about how this actually gives him the capacity to make more movies.

Sure, there may be an element of too many movies. But I'm more interested in this giving superpowers to the creators and producers of this type of stuff than just saying, well, the entry-level job that's not that fun gets crushed. So I thought it was interesting to get his perspective, as somebody who's pretty ensconced in the workforce side of this world.

[00:14:16] Alex Sarlin: Totally. And I think it's gonna change things up and down the stack, right? For professional filmmakers, it changes the process in a way that allows CGI to happen at a scale even they could never have imagined. Even people who are working with Industrial Light and Magic doing incredible effects can do more than they've ever done.

And it means that the ninth grader can wake up in the morning and say, I had the craziest dream, and describe the dream, and then have a video of the dream that they can share with us, which is so 

[00:14:42] Matt Tower: cool. Exactly. Have you guys covered Tilly Norwood yet? No. Oh man. So there was this other announcement last week about one of the first AI actors to get a lot of attention, Tilly Norwood. She is not a real person, but she is available for any movie you should like to create, with presumably a nominal fee attached to her name, image, and likeness. So that same 14-year-old could create their own Tilly Norwood, which is fascinating.

[00:15:15] Alex Sarlin: I mean, it's gonna have a huge effect on social media, because it means that anybody can create social media influencers, no matter who they are.

Right? Yeah. Which is a very different world than we've been in, where people had to look and act a certain way to be an influencer. I had a crazy experience; I just wanna share this quickly, without going too deep down the rabbit hole. My wife and I had followed an Instagram star, this sort of Russian makeup person, this very odd, glamorous-looking girl.

And in one video at one point, there's this tiny little glitch and you suddenly see a face behind her. And it's not her; it's a man, a totally different person. Wow. And we were like, I mean, this person has millions and millions of followers. And I think this stuff is already out there.

Yeah. And we haven't even gotten started yet, but I think it's already out there. So, anybody who's read Neal Stephenson's Snow Crash, which was sort of prescient about all of this; that was the book that coined the term "the Metaverse." Yeah, I mean, we are racing toward it. I've been trying to get Neal Stephenson on the pod for six months now.

I think we're gonna get him, I'll say it here. Oh, that'd be 

[00:16:23] Matt Tower: amazing. He's got a great newsletter as well. 

[00:16:26] Alex Sarlin: I haven't read it. That's really interesting; I gotta check that out. But between Snow Crash and The Diamond Age, he's just been looking at this future for a long time, and I feel like we're racing into that future.

But one of the big ideas behind the Metaverse was the concept that everybody can have avatars of any kind that represent them in the world, and I think we're very close to that. I mean, the social ramifications of this are enormous. The educational ramifications of this are enormous. I mean, you can also have teachers that are not real, right?

You can have photorealistic, incredibly convincing, incredibly charismatic educators, including celebrity educators, including real celebrity educators that you are faking, or celebrity educators that you are creating outta whole cloth and making into celebrities. We're entering a very crazy new world, and I hope that as a community we can get ahead of it, but it's really exciting.

So if you have more to say about this, please do; we could talk about this forever.

[00:17:19] Matt Tower: No, maybe just to leave the audience with a question: if I told you that Britney Spears's Algebra 1 class would raise NAEP scores by five or 10 points, not an enormous amount, but also not a trivial amount, an incremental amount, how would you feel about that? And it's more complex than a simple good or bad, right? There's a bunch that goes into answering that question that I think is worth spending time thinking about on your own, outside in a chair, without a device near you.

[00:17:55] Alex Sarlin: Agreed. I mean, those questions are very close. I remember back in the Coursera days, we would joke sometimes about how these courses have such incredible content, but imagine if certain ones were delivered by really charismatic people.

And then, like a few months later, MasterClass came out, which was basically: hey, let's have really famous people teach, you know. And in America, celebrity goes a really long way. The idea of a Chappell Roan or a Taylor Swift-led math class, or any music class... I mean, let's keep it simple: how about a music class? It seems pretty appealing to a lot of people.

Also, maybe that's what we should do.

[00:18:30] Matt Tower: We should pitch Taylor Swift on, like, a basic K-12 curriculum. Just core K-12 curriculum taught by Taylor Swift,

[00:18:40] Alex Sarlin: ELA and music, right? And then you put 'em together and make music. We're raising

[00:18:43] Matt Tower: 50 on 500, you know. You just respond in line to the deck on the Insiders Substack.

[00:18:50] Alex Sarlin: I mean, what I find interesting about it is that people who have brands like that don't have any way into education. Of course they're interested in helping educators and helping schools and lending their brand. LeBron James did something with Khan Academy years ago where they put together all of these basketball-themed physics videos, and I don't think it went very far, but it showed you can get very, very big-name people if you're doing something educational.

It's pretty intriguing. Anyway, I think your question's good: do we want that? Is that something we want, or is that the road to hell paved with good intentions? One other relevant piece of this is the Hour of AI launch this week. For people who are not following this: Code.org, which had been doing CS for All and the Hour of Code, and has been working for over a decade to get coding into the classroom in various ways, has seen AI coming for quite a while and is now launching Hour of AI.

We interviewed Kareem, the chief product officer of Code.org, just a few weeks ago on the podcast, but it is now live. They have Google, Microsoft, Amazon, Zoom, LEGO, Adobe, ETS, our friends Amanda Bickerstaff from AI for Education and Alex Kotran from aiEDU, and Khan Academy. EdTech Insiders is a partner for Hour of AI.

We're really excited to be amplifying their work, so I'm into it. I think it's a really great idea, but they're looking for partners. They are very actively looking for partners in the EdTech community, and this is a fantastic opportunity. Speaking of the block-and-tackle piece, Matt, where people are saying: hey, how do we make sense of this AI moment?

Things keep changing all around us; the policies are unclear. But what Hour of AI is attempting to do is convene this incredible collaboration network of partners and build really fantastic first experiences with AI: experiences that are educational about AI itself, as well as how to use it in ethical ways, how to use it in ways that are creative and interesting, introductions that actually mean something to learners and can train educators at the same time, and then launch them in a way that feels comprehensive and open and safe and approachable for classrooms all over the country.

So CSforAll.org slash, I can't read this URL, but go Google "Hour of AI partners"; there are places online to apply to become a partner of Hour of AI. I really recommend it. They're an amazing organization, and they are bringing everything they've brought to bear for the Hour of Code to this AI moment and saying: how do we actually make the best of this and go quickly with it?

I know you followed the Hour of Code and now Hour of AI. What do you make of this movement, Matt? What do you think ed tech companies should think about when they imagine what Hour of AI might look like in a year or two?

[00:21:22] Matt Tower: It's really important; that's the short and sweet way to frame it. I think I would like to see it not just for students, but for adults too: for company leaders, for politicians, for everybody. You know, we talked about the policy guidance before.

Anybody who's setting policy around AI should probably be more familiar with it than most are today, is sort of my lukewarm take on this. I think that a lot of the time, when we react outta fear, it's usually because we don't have the skill set required to properly weigh the pros and cons, so we just assume it's all bad.

So yes, I'm super excited that students will get this exposure. I hope it is as successful as the Hour of Code has been. And I think all 50 states now have either a computer science requirement or at the very least a computer science option in every high school across the country. I think that's a really big deal, and it took more than a decade of effort to get there.

I hope AI gets there and I hope we do it for adults just as much as we do it for children. Like, don't stop at the classroom. 

[00:22:33] Alex Sarlin: Totally agree. I wanted to get your quick take on something, Matt. I'm very curious about it because Natasha Singer is a New York Times reporter and she's been writing a book about Code.org and the coding education push over the last decade.

I heard her on a podcast last week, and it's really interesting, 'cause her take is that the big tech companies have promised everybody that if you learn to code there's a job for you, and they've gotten coding into the curriculum in a lot of states. Certainly they've helped make computer science the number one major in the country, or very close to it; I think it is the number one major throughout almost all of higher ed.

And she's saying it's now a sort of bait and switch, because coding is not the skill of the future. She was the one who wrote the Times article saying coding graduates are working at Chipotle. And something is rubbing me very much the wrong way about this take, that it's some kind of sinister plot: they tried to make everybody learn to code just to pull the rug out and go to AI. It feels a little strange to me. But I'm curious what you make of it, because I can see that take in some ways; if you squint the right way, it can seem like some kind of sinister, quote-unquote, big tech plot.

Yeah.

[00:23:40] Matt Tower: This gets me a little riled up, 'cause that's like saying: well, not everybody's a writer, so why do we teach writing? Right? It's how the principles of society work. Everybody is exposed to digital technology. Sure, you could ignore how technology works and just be like, yeah, I don't know,

Facebook shows me stuff, it's pretty interesting. Or you could be like: oh yeah, there's an algorithm that looks at my interests and the data points that I've given it, and it develops an opinion of what it thinks I will like. And then I have some agency in deciding whether or not I like it. So I don't have to just be on the receiving end of technology.

I can influence it, both on an individual level and on a macro level. So sure, if you take it literally, it is silly for everybody to learn how to code, 'cause not everybody is gonna be a programmer. Yes, that is literally true. But philosophically, in my opinion, it's naive and even harmful not to understand, on some fundamental level, how the products you use are built. I'm not arguing everybody should know how to code Facebook, but understanding the principles involved, of why it serves you the product it does, is in my opinion extraordinarily important.

[00:24:57] Alex Sarlin: Totally agree. I just interviewed the CEO of a company called AI Create that's doing AI literacy.

She said something very similar to what you're saying here. She's basically saying that knowing how these technologies work gives you autonomy and agency and freedom; you can listen to them, or go the other way, or build your own. We live in a digital age; the idea of ignoring digital literacy and technology and coding seems so strange.

And then there's the idea that coding is now turning into AI-enhanced coding, which I think is a technological improvement, and that this change somehow undermines the claim that coding is a useful skill set. There's just something, I don't know, very naive about that take. It feels like, if you've already decided that these big companies are sinister and purely profit-driven and everything they do is trying to screw everybody for their own benefit, then you can make a really good narrative about how the coding-for-all movement is not turning out quite the way people planned.

But that is not how I see it at all. And the other thing that comes out is you end up defending the status quo, right? I mean, if you are attacking the people who have been trying to push for computer science to be a new literacy in the classroom over the last 10 years, what are you fighting for?

You're fighting for that. It should just be reading English and math. Like that's the more important way. It's odd. So I dunno, it gets me riled up too, and I'm trying to control my rile, but that book is gonna come out and I think there's gonna be a lot of conversation about this at some point among certain circles.

When I read The Atlantic now, every article about AI makes it sound like there's this cabal of evil CEOs just trying to screw everybody. There's a certain journalistic take on this moment that is really trying to fit it into the big-tech-as-evil-empire narrative, and I'm finding it so divorced from reality after having talked to, you know, we're coming on our 400th episode.

We've talked to people at all of these big tech companies. They care so much. They're so desperate to upskill people. We've talked to people at IBM, we've talked to people at Microsoft, we've talked to people at OpenAI and Google, of course. And the last thing on their mind is to try to lead students down some garden path and get them to spend all this time on skills they're not gonna use.

It couldn't be more wrong. So, I don't know, it's strange to me. So speaking of Microsoft, another ed tech news item that may have slipped under the radar this week, but I thought it was pretty interesting: we saw Mustafa Suleyman on LinkedIn basically post about how Copilot launched its own study mode.

It has a study and learn mode, it has quizzes. Microsoft Copilot is mostly known as a coding support, I believe, but it is sort of Microsoft's core AI product. It is still widely used, even if we don't talk about it nearly as much as ChatGPT or Claude or Gemini. What do you make of the Microsoft world putting out their own learning mode and their own, they call it, always-on tutor in your pocket?

[00:27:50] Matt Tower: Yeah, I mean, I think it's, everybody else has this, so we have to offer it too. I would be curious, you know, Microsoft hired Mustafa to develop internal models, right? They made that giant $13 billion bet on OpenAI two or three years ago, and the basis for it was that they would incorporate OpenAI's models into all their products.

And Copilot was born from the OpenAI API access, which Microsoft has a right to for, you know, an extended period. That's like a whole other side to the OpenAI negotiations. They hired Mustafa as that partnership came apart, and his charter was to develop Microsoft-exclusive models. I don't know whether this Copilot leverages those or not.

What we have to remember is that Microsoft products are baked into billions of computers around the world. So even something that doesn't get a ton of headlines will probably get a lot of usage, and I'm sure it's at least as reasonable quality as a ChatGPT study feature or an Anthropic study feature or whatever, possibly because they use literally the same models but have a different distribution channel.

[00:29:02] Alex Sarlin: I agree. I think those are great takes. There's a little bit of me-tooism, a sort of follow-along nature to this, because we've seen a lot of the other big tech, big AI companies launch study modes over the last few months. One thing that stood out to me here, and this is not a hundred percent different from what we saw from Google, is that the chats and quizzes in their learning modes are also shareable. So there's this idea of creating learning artifacts. Like if you are a student studying molecular biology in college and you create a really powerful quiz for yourself, like, here are a hundred questions to understand some of the core principles.

You can take that quiz and share it with the rest of your class, share it with your professor, or share it with your study group. Not that that's by itself the killer feature, but it does create a kind of interesting positive feedback loop that I don't think we've actually seen in action as much as you, or as much as I, would think in AI so far. What do you make of that?

[00:29:56] Matt Tower: Yeah, I mean, I think there are some other study tools, you know, I think Quizlet allows for sharing. So, to our earlier point about digital literacy and whatnot, to me, I'd rather have this sharing in the open and be a part of the classroom conversation, where the teacher says, hey, look, you know, Alex created this really wonderful study guide.

I recommend you guys use it and collaborate on it. So, you know, I think particularly in a product that has as broad a distribution as Microsoft's Office suite, to me, it's really exciting. And again, I'm glad it's explicit in the product rather than hidden, which is where it's been historically.

And I sort of trust that classrooms will adapt to being more collaborative rather than having this sort of dark corner shadow cheating world. Right. 

[00:30:51] Alex Sarlin: I agree. And I, I think of things like jigsaw models or classroom wikis or some of these collaborative models we've seen in the past that we haven't really talked about in the AI era.

But you can easily imagine, just to continue on your example there, right? The teacher saying, Hey, we have five groups in this classroom. This group is gonna create an incredible study guide and tool and interactive quiz about this aspect of the, you know, whatever the, the American Civil War. And this group's gonna do this and this group's gonna do that, and this group's gonna do that.

And at the end we have this unbelievable suite of tools that you can all use that the next year's class could use, that you could share with the world. I mean, it's pretty exciting. 

[00:31:28] Matt Tower: Couldn't agree more.

[00:31:29] Alex Sarlin: I'm really glad I have you on here, Matt, because this is a topic where, if Ben were here, I would lean on him.

This is something that is really out of my wheelhouse, but there's something that happened this week that I think we should talk about, and I know you've been interested in it and have been talking to some great thought leaders about it. So, Anthology, which is the company formerly known as Blackboard, basically, which was part of a private equity purchase in 2021, declared bankruptcy this week and is basically selling off most of the parts of the company, retaining the Blackboard piece itself but selling off other pieces to other big ed tech companies, including Ellucian. Do you wanna give us the headline about Anthology and Blackboard and what people should know?

[00:32:08] Matt Tower: Yeah, so we're gonna go back in history a couple times here. So first we'll go to 2021, and there were a number of companies that either were bought out by private equity or raised a lot of money in 2021 that effectively undid them in the subsequent years. So, you know, I've written a ton about Byju's, right?

They raised a whole bunch of money 2019 through 2021, and there were some positive fundamentals in their business, but they got overweight on valuation, and that's how the house of cards came crashing down. Similarly, 2U, I think it was roughly $800 million. They paid $750 million for Trilogy in 2018, and then they paid $800 million for edX in 2020, 2021, around there, again at sort of the peak valuations.

And subsequently, the debt they had to raise to make that acquisition is what undid them and took them into bankruptcy. Similar to what we're seeing here: Anthology was backed by Veritas in 2021. They did a buyout of Blackboard, so they were a collection of other companies in the higher ed space, student information systems, other sort of backend infrastructure.

They acquired Blackboard and became Anthology. Blackboard's the most well-known brand, so that's why these names sort of get squished together. It was at a peak valuation, and the thesis for the acquisition was: I am an education technology company, I sell a bunch of education technology products to a relatively small set of buyers, a university president, a CIO, chief information officer,

or a VP of technology. And the thesis was that those buyers were the same for each of the products they had in their portfolio, from student information system to learning management system to marketing and enrollment systems. All of that would be bought by the same person. Turns out that is really not quite how technology acquisition happens in higher ed. There are different stakeholders for each of those purchases. And we think, and you know, all of this is a little squishy unless you are an executive at the company or at the private equity firm, we think that thesis basically just didn't bear out, and it became a collection of assets that didn't really have any quote-unquote synergy.

And thus the valuation that was supposed to mean one plus one equals ten, actually it was just one plus one equals two, and they couldn't uphold the valuation they were bought out under. So this bankruptcy is essentially an unraveling of that. They're gonna sell the different component parts to different stakeholders.

Blackboard will be its own spun-out business, effectively on its own again. So, you know, the interesting thing is that the student information system aspect is being sold to Ellucian, which also was bought out in 2021, but with a vertical integration strategy, under the thesis that by buying a bunch of student information systems, you would get quote-unquote synergy.

I don't think we can technically say that was a successful thesis. We just know that Ellucian was a buyer in the bankruptcy process rather than a seller in a bankruptcy process. So it seems like that thesis has borne out better, being a vertical rollup rather than a horizontal rollup. So that was a little financey.

Apologies, but hopefully the product aspect makes sense. 

[00:35:37] Alex Sarlin: When you get into this infrastructural set of EdTech tools, you have lifecycle management and student success and you know, learning management systems. And it can feel like sort of inside baseball, unless you're in that particular world, which I think is, is an interesting world.

But I think the upshot here is that Anthology was, and this is what I'm hearing you say, but also what I've read from Phil Hill, you know, and others. Phil Hill, huge expert, 

[00:36:00] Matt Tower: and Phil deserves credit for being the earliest to call shenanigans. As soon as that original 2021 deal got done, he said, I don't get it.

I don't think this is gonna work. So he deserves accolades for holding that thesis for, you know, four years. Yeah. And, and being 

[00:36:15] Alex Sarlin: correct. Yeah, a hundred percent. And Phil Hill, long, long time observer of the EdTech space, especially the LMS world, which has sort of been his bailiwick for a long time. So he is like one of the absolute global experts in this space, like bar none.

And yeah, he saw this coming. But it's interesting to see the bundling and unbundling of ed tech at this scale. I think one of the takeaways I'm hearing from you is that there was an era, you know, basically the COVID years, where everything ed tech went through the roof. The big investment firms were investing a lot.

The private equity firms were looking to cash in and combine, you know, into synergistic amalgamations of companies that were gonna put things together. And I think we are now in a little bit of a retrenchment period from that. And it feels like this Anthology concept is getting redone.

And, you know, Veritas still owns Houghton Mifflin Harcourt, a huge ed tech and digital publisher, the Cambium Learning Group, and a company called Finalsite, which is now the owner of the K-12 section of Anthology. There's still a lot of moving pieces in this. But they're gonna come out of this in three to six months as Blackboard again, and it'll be Blackboard, and most people will probably never even notice that anything changed. But it's probably worth noting, you know, that this is a little bit of a rebuilding year for this particular type of private equity-based ed tech.

[00:37:29] Matt Tower: I think reflective of the valuation resets happening across the ecosystem, mostly quietly. This is sort of the like, you know, public version of what's happening in a lot of boardrooms right now. 

[00:37:41] Alex Sarlin: The other thing I'd love to pick your brain on, because you are the absolute master of this, is the funding rounds.

You know, there have been a couple of pretty notable funding rounds over the last couple of weeks. We've seen a lot happening in Indian ed tech, we reported on that last week, but there have been a couple of interesting ones this week. A couple stood out to me, and I'd love to get your take on them.

Matt, if there are other ones you wanna bring up, please do. But one is Lingokids, which had a $120 million funding round just a couple of weeks ago. It's, I believe it's a language learning platform, right? 

[00:38:08] Matt Tower: I think it's more entertainment content for two-to-eight-year-olds. It's a Spanish company with character-driven narratives that are a little bit more defensible versus sort of straight cartoons, if that 

[00:38:20] Alex Sarlin: makes sense.

And it's B2C. It's educational learning games, B2C. Yeah, with 3,000 shows and songs. Got it. So it's not pure language learning. What are the subjects? It's just broad-based academic, 

[00:38:32] Matt Tower: it's not really subject driven. It's content driven. It's more like a, got it, Netflix for kids,

With some strong guardrails than it is like, you know, scaffolded learning. 

[00:38:42] Alex Sarlin: Got it, including early literacy and phonics and things like that. I see ABCs and words now. Yeah. Got it. But yeah, a $120 million round out of Europe is definitely worth noting. And the other was Outsmart, which just happened this week.

Do you wanna talk us through these funding rounds and any others that, that you think our audience should be keeping an eye on? 

[00:39:00] Matt Tower: Yeah, I mean, for Outsmart I would just say it's a call for more information. They've raised about $40 million now but haven't publicly announced their product. You know, it's founded by two ex-Duolingo folks and a number of other high-quality education people.

So they have some pretty significant chops, but they've been pretty quiet about what exactly they're building. So I'm excited for them to sort of come outta the woodwork and, and say like, this is what we're up to, and be able to evaluate it with a little bit more depth. But they certainly have, you know, a, a great set of investors and a great set of founders.

We'll see if they can translate 40 million bucks into a big user base. And then on Lingokids, I think the US has felt pretty quiet this year from a funding perspective. There've been plenty of rounds, but Europe has really been where it's at. They have four of the top five funding rounds for this year.

Lingokids, I think, is number two behind AMBOSS, which is a test prep company. So, you know, the European ed tech ecosystem is really heating up in a way that it felt like the US was in sort of 2017 to 2019. 2020 through 2022 I still think is sort of anomalous, but that sort of heating-up period for US ed tech, I'm starting to feel in Europe, and I have been quite excited to follow along throughout the course of this year.

[00:40:16] Alex Sarlin: Great call. And the Oppenheimer report for Q2 of 2025, I think, just came out, and they also have been focusing on the European ed tech ecosystem, as well as AI being really central. That's worth Googling and looking into as well. When I hear about the Outsmart raise and I see that founding team, Gina Gotthilf and Jorge Mazal, these are executives from Duolingo, very high level.

Gina Gotthilf was one of my big ed tech heroes, actually, because she's really the person who made a lot of what we consider Duolingo's core product features. I think she was behind optimizing a lot of them, and Duolingo is all about optimizing, it's all about A/B testing, the streaks and the different models, and she was really right at the center of that as they were growing very quickly.

I remember a really great Wired profile of her at one point about how she was just this sort of product mastermind. It's a really great team. So I wanna call them the Duolingo Mafia, right? The Duolingo Mafia. And, you know, it's a better name, the Lingo Kids. They should be the Lingo Kids.

Right. But that would be, well, that would be trademarked in Europe. That's the problem. But, eh, I just thought that was kind of funny. So you've got the Lingo Kids and the Lingo Mafia both raising money this week. I would not sleep on this investment. I think Outsmart is going to be something we will be talking about a lot on Edtech Insiders over the next six months as they come out and actually explain what they do.

But it is a higher education play. It's from some very seasoned ed tech veterans who know a lot about B2C ed tech. They know a lot about, you know, engagement. There's definitely gonna be something interesting happening there outta the West Coast. And I think you mentioned this, but the investors piling into this, it's Khosla, it's Lightspeed, it's Reach Capital.

It's a lot of angel investors who are deep in the tech space, and I think that matters in ed tech. It matters when you have a set of investors who are really, really well connected and sort of know how to put the pieces together. So that's really exciting. I think we're almost at time here. The last thing that jumped into my mind this week, and this is a longer conversation, but the NAEP scores that came out about a month ago now, I believe, have created an interesting spate of op-eds, basically hot takes from a lot of different pundits around the press ecosystem in various ways.

There's a great article by Matt Yglesias, who was formerly at Vox, and a really great article somebody recommended to me just recently by Derek Thompson, who's an Atlantic writer, really thoughtful journalist. And they're basically saying, how do we put the pieces together when we have, you know, these scores that have not risen at all since COVID, that are sort of at historic lows?

We have AI coming into the school system and nobody knows what that's gonna do. We have the schools themselves, both higher ed and K-12, sort of under ROI attack. Everybody's trying to figure it out. There's a little bit of a throwing up of the hands, like, is this the end of education?

The Derek Thompson article is called The End of Thinking, and it's basically saying humanity is starting to reduce its own cognitive load, you know, in anticipation of AI. We're just not reading anymore. People are not trying to figure out how to write very well. They're not doing very well on standardized tests.

Matt Yglesias's article is called American Students Are Getting Dumber, period. And it's interesting because it dovetails in a way with some things we've been talking about on this pod for a long time, which is this idea that, you know, the education system has really been losing its luster for a while, in America at least.

And I think it creates both an opportunity for ed tech and a sort of scary period where nobody knows exactly what would step in to support it, or bring back the reputation of, hey, schooling is how you advance in this society, education is important. We've thought that for a hundred, two hundred years now, and I think that message is starting to feel kind of awkward, and I'd love to hear your take on it.

Matt, I know this is an enormous subject. We could have started with this; we could do a whole episode on this. But I'd love to hear your take on, you know, the combination of the NAEP scores, the Gallup poll about, you know, a lot of loss of faith in public education. Do you feel like we're in a moment now of education really hitting a record-scratch moment, and people starting to say, wait, what is this all about?

[00:44:14] Matt Tower: Yes, I think so. What is tricky for me is there's a lot of commentary about what's not working and very little vision for what should happen, right? And even among the school choice crowd, which I know is a big topic in the US right now, it's not that they have a specific theory of change.

They're just like, we don't like what's currently happening; we just want the ability to do something different. And I get that, because the status quo is not working. So, sure, it seems fair to want something different. But there are just very few people, honestly, who have a clear vision of what education should be.

Right. And, you know, I think in the post-World War II economy, we had the Russian threat, right? And STEM was the answer, and we wanted to go to the moon, and there were all these big meaty things that required a ton of math and science. Yeah. And we sort of lost that, arguably, when the USSR fell, and we haven't really found a new unifying thread to sort of get us through it.

And so, you know, last time I was on, we got into a little bit of a debate about Alpha School, and I think regardless of whether or not that is the answer, at the very least that team has a clear vision. Yeah, no, I agree. You know, that doesn't have to be the way, but I respect that it is a way. And I would love to see a lot of the folks complaining about the status quo present their own vision for what they think it could be. It could be classics, it could be AI, it could be vocational.

And there are actually a fair number of people who are more on the vocational beat, and I respect that too. I wish there was more of a push for, like, this is a way we could solve this, rather than, yeah, American students are dumb. Right? Cool. Great, nice headline. And the article is more nuanced than that.

So I don't wanna be too flippant, but it is more than people getting dumb. It's a change in preferences and priorities, and I don't think we have that many great visions for what it should be, the way we had for most of the 20th century. 

[00:46:30] Alex Sarlin: That's, I think that's a terrific take.

And this is one thing I really admire and love about the ed tech community. I think there are a lot of people out there with various levels of clarity, but with theories of change. I mean, part of what scares me about Alpha School is that I think it does have a very clear theory of change. It has a very clear vision of what it thinks AI education should look like, or what education in the AI age should look like.

But because they're one of the only ones, I think, with an extremely clear vision, I think they're gonna end up being a standard bearer. And I'm not sure that that's, 

[00:46:58] Matt Tower: and that might be why they're getting so much attention. There was this Chinese company whose name is escaping me that I just wrote about. Where is it?

[00:47:07] Alex Sarlin: Squirrel, 

[00:47:07] Matt Tower: Squirrel AI. It's Squirrel, yeah. And the founder, there was this interview with the founder, and he was just so transparent and direct with his belief in AI. Honestly, it was really refreshing. Like, again, I don't have a clear theory of change for what the K-12 system should look like.

I'd like to build one. Maybe we can build one together. There's something refreshing when I read about a person who has a specific belief. There's something really nice about that. 

[00:47:36] Alex Sarlin: I really agree. And I think, you know, in the little taxonomy you were starting to make there, you're like, well, there's the sort of vocational take on, you know, what's the future?

There's the sort of micro-school, student choice take, where it's much more localized. There's project-based learning, which is sort of related to vocational but not directly. When we talked to Jamie Candy recently, she had this concept of, you know, career-connected learning that's gone all the way down through high school and maybe down into elementary school.

The idea of having career orientation, of thinking about your eventual skillset and mindset as it relates to work, at the core of what you're doing from the beginning. That's actually a pretty clear vision too. But I think a lot of these theories of change are jargony. I think a lot of them are in ed-speak. And with Alpha School, it's like, look, AI comes with these sort of core assumptions, right?

AI can teach the basics, which nobody knows if that's true or not. And so we're gonna make it two hours of AI learning a day, and that's plenty, and then the rest of the time will be used for the things we think are really important, which are X, Y, and Z. I know what you mean. I saw that Squirrel thing as well, and it's like, when somebody comes in and just says,

no, here's what it's gonna be, let me say it really clearly, there's something clarifying about it. You get out of that ambiguity, out of trying to pull together the different quotes that different people have, that Sal Khan is putting out there, or that, you know, various people are putting out there, and saying, I'm almost imagining this, but I can't picture it.

And somebody's like, well, I can picture that. I can picture two hours a day, and then everybody goes into groups. Like, I can picture it; it's concrete. I'd love to see some more concrete examples of these taxonomies, a little more of a, here are five different visions of the future of education. Like, what does a classroom look like in three years?

Here are five different visions, and they're actually really concrete, and here are the assumptions behind them, here are the beliefs behind them, here's the theory of change, to use your language. I think that would be really clarifying right now, 'cause I mean, we interview a lot of people here, and everybody has elements of them in some way. You know, not to be too blunt about it, but ed tech founders who sell to schools have to really bend over backwards to make sure they don't say anything that could be interpreted as, hey, I can teach better than people. That's right. That's the number one thing they cannot say, which makes them have to talk in these sort of loops around certain things.

[00:49:48] Matt Tower: I agree. That is the sort of consensus opinion of how this should happen. And I'm not advising somebody to go into a school and say, hey, I wanna take all your teachers' jobs.

Right. And to be clear, I'm not saying that. But it might help if you are a little bit more direct, right? Like, people can sense baloney. So, you know, don't say you're gonna take teachers' jobs, but do say the actual thing that you want rather than trying to dance around it in the interest of not offending anybody.

[00:50:19] Alex Sarlin: Yeah. There's probably a political corollary as well, but maybe we don't have to get into that today. But I think that's okay. There's strength in clarity. Whether people agree or disagree, I think they attribute strength to somebody who's saying something that's very, very clear, and they can say, at least I know what you're talking about.

And then I can sort of make sense of it and decide where I land. So if somebody says the number one thing students should be preparing for is careers, it's like, okay, sure, that's a take. Agree with it or not, I don't know, we can think about it. Let's think about what that means. But at least you can engage with it, versus, you know, it's college, versus it's socialization, or durable skills.

A phrase that drives me a little crazy, but I get the importance of it, right? Anyway, it's always so fun to talk to you. I feel like we get under the hood of some of these incredibly important issues. I hope we haven't offended anybody with any of this. I don't think we've gone too far.

As always, we'll put the links to all of these articles in the show notes for this episode. I highly recommend them: the Natasha Singer article about coding education, the Derek Thompson article, all of these raises, and Phil Hill on Anthology. There's some good reading this week. And meanwhile, I guess we'll see you all at EdTech Week, right?

Yeah, I'm looking forward to it. If it's gonna happen in EdTech, if it's already happening in EdTech, you'll hear about it here on Edtech Insiders. Thank you for listening; we really appreciate it. And please, if you like this podcast, please leave us a review. We have not gotten a whole lot of reviews or ratings in proportion to the listenership, so, you know, let people know.

We are hoping to lend some much needed clarity to this really confusing moment. That's our goal, at least. Thanks so much for being here with us on EdTech Insiders.

We have very special guests this week on the Deep Dive for This Week in EdTech from Edtech Insiders: Sari Factor and Jason Fournier from Imagine Learning, an absolute giant in the ed tech field.

Sari Factor is the vice chair and chief strategy officer at Imagine Learning. She leads the company's vision to empower educators and unlock students' potential through innovative digital-first learning solutions. She's a seasoned K-12 education solutions leader with over 40 years of experience, and she's passionate about using technology to enhance and support teachers and to improve student engagement and learning outcomes.

Jason Fournier is the vice president of product management for AI initiatives and data at Imagine Learning. He's a pioneering product leader who has spent more than two decades transforming how technology meets education, and he drives the innovation behind the company's curriculum-informed AI solutions, which enhance curriculum and the classroom with proven pedagogy at the center.

With experience ranging from K-12 through workforce, he's deeply committed to changing lives through learning. Sari Factor and Jason Fournier, welcome to Edtech Insiders. 

[00:53:08] Sari Factor: Happy to be with you, Alex. Yeah, 

[00:53:10] Jason Fournier: nice to meet you, Alex. Thanks for the opportunity to be here with 

[00:53:12] Alex Sarlin: you. I'm really excited to talk to both of you.

Imagine Learning is a huge ed tech company, a really important one, that works with tens of millions of students. And Sari, let's start with you. You've championed digital-first, research-backed curriculum for many years, and one of the things Imagine does so well is use high-quality instructional materials as the foundation before layering in additional things like AI.

Tell us about that approach and how it's worked at Imagine Learning.

[00:53:42] Sari Factor: Alex, high-quality instructional materials, or HQIM as it's known in the field, are the backbone of effective teaching and learning. They make sure that every child across a school or district is taught to the same high standards. So, critically, they ensure rigor, equity, and alignment to state standards, providing a strong, research-backed foundation. States and districts that have moved in this direction, that choose HQIM to ensure instructional coherence, what they look for is making sure that every lesson connects to the high standards they're seeking, that students are building knowledge as they move through the curriculum, and that teachers have the tools they need to ensure that.

And that teachers have the tools they need to ensure that. It's important to remember that the districts and schools that use our products spend a long time evaluating curriculum before they buy, and then they spend more time and resources preparing teachers to implement our programs with fidelity, to ensure the results that they seek.

So our customers know that when the entire system is aligned around a program like Imagine IM or Twig Science, they create the best conditions for student success. Our goal is to reinforce, not detract from, that coherent curricular design, and our curriculum-informed AI does just that.

[00:55:02] Alex Sarlin: Yeah. This concept of curriculum-informed AI feels like a really core idea here.

And Jason, you've been leading Imagine Learning's AI initiatives. Tell us a little bit more about this concept of curriculum-informed AI. What does it mean in practice, and why is it different from other approaches that we've seen in EdTech?

[00:55:20] Jason Fournier: It's a great question. I think when we think about the broad array of tools that are available to people, whether it's commercial tools or tools specifically built for education, they often don't have the full context of the curriculum that the teacher is using in the classroom.

And so when you can add that context to the things that AI is generating, to the suggestions it's making, to maybe the lesson plans that it's creating, you give the teacher that much more of an advantage in creating resources that, as Sari said, have coherence with the rest of the materials in use.

We know that teachers are always looking for things to bring into the classroom to expand the way the material is presented, to find alternative perspectives or approaches to convey something to students. And that takes a lot of time for the teacher to go out and find and then customize and tailor.

And that's if she knows how to adjust it. There are always learning science and other elements built in that may not be apparent on the surface, things a teacher would have to know in order to include in something like a prompt to add that context engineering to what they're doing.

So when you start with the curriculum from a grounding perspective, you can build all of that in. That's the beginning point for curriculum-informed AI: defining the pedagogical approach, defining the things that matter, adding some of those touch points that Sari mentioned, standards and learning objectives, and then using that as your baseline to begin from.

The second thing that we think a lot about is the data. We prioritize efficacy and being able to measure the impact of our products. And I think when you start from a data-oriented perspective and think about how you can use data and the context of the classroom to also inform these things, you can increase the benefit even more.

I mean, the curriculum always has teaching notes and scaffolds and customization approaches, depending on where your students are, what you covered last week, what you'll cover next week. All of that data, plus performance data, can also be layered in to alter the outputs of these models. That in turn creates resources that are that much more effective, that much more tuned for use in the classroom.

So we think of it as really starting there, grounding in what we do and in that high-quality instructional material, using that to make permutations and variations, but not starting with the kind of random information that might be encoded in the LLM.

[00:57:30] Alex Sarlin: HQIM contains a huge number of different resources.

I mean, anybody who's worked with EL or IM or Twig knows you basically have all of these different resources that can be drawn from. It's not just that there's one lesson and one go-to piece for any given moment. There are all of these variations, all of this teacher training, all of these tasks and supplementals and additional pieces that you can dive into.

So the idea of using that as the data set underlying the AI, it's a very robust data set. And then if you combine it, as you say, with performance data from the students, you have a huge amount to work with. I'm curious, Sari, maybe you could just walk us through, just so we can really visualize it: what does this look like in the classroom?

If you have a teacher using, let's say, Illustrative Mathematics, and they're using curriculum-informed AI with Imagine's version of IM on top of it, what kind of thing might they ask, and what would it pull out using specifically the IM curriculum as its core?

[00:58:25] Sari Factor: So one example might be if a group of students

is below grade level and missing some of the prerequisite skills for a particular lesson. So she's gonna teach a lesson and she knows that these eight children need a little extra something. She can ask the system for that, and it can provide a mini lesson that engages those students around the prerequisite skill, almost preloading what they need to go into the full class lesson.

I think that's probably the use case we see that is most used and most obvious. Jason's got a lot more of them, because he's working on delivering a lot of these features through his team's work.

[00:59:10] Alex Sarlin: And that prerequisite knowledge would be within the IM ecosystem. Yeah, please, Jason. 

[00:59:14] Jason Fournier: I can give you a couple more examples on top of that.

I think you mentioned earlier, Alex, that a lot of these programs come with a tremendous number of resources for teachers. I was meeting with some teachers in a city on the East Coast in December last year, and one of them said, look, there are so many great things as part of the program, but we can't always find what we need at the time we need it.

And one of the benefits of these tools is they provide tremendous semantic search across the resources. Surfacing the right thing at the right time might seem like a simple feature, but it's so powerful for a teacher who's short on time, may not know all the resources available to them, and needs something to help scaffold or help a student.

So that's one example: powerful search and contextualization of the resources. The other thing you can think a lot about is, to your point, PL and PD, professional learning and professional development. Anytime a teacher asks for something, there's an opportunity to coach and suggest the pieces of the curriculum that might be most beneficial, or a different vantage point on the material that might resonate with learners.

And so that's another moment where you can really think about layering that in. And HQIM comes with a lot of suggestions and guidance on how to do that. We can bring that to the point of need for a teacher, and that's really powerful as well. And then I think there are just the fun customizations.

I know where I live, I know the history of my area, I know what my kids are excited about. But it's kind of superficial to bring that in when you don't know how to tie the concepts together across the different things the curriculum is trying to do. So how do you do that in a way that's inquiry-based, or in a way that's self-driven for the students?

So all of that can be brought in at point of need as well. 
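
The semantic search Jason describes can be sketched roughly as follows. This is an illustrative toy, not Imagine Learning's implementation: the resource strings and the query are invented for the example, and production systems would rank with learned embeddings rather than the dependency-free bag-of-words cosine used here.

```python
import math
import re
from collections import Counter

def vectorize(text):
    # Bag-of-words term counts; real systems would use learned embeddings.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def search(query, resources, k=2):
    # Rank every curriculum resource by similarity to the teacher's query.
    qv = vectorize(query)
    return sorted(resources, key=lambda r: cosine(qv, vectorize(r)), reverse=True)[:k]

resources = [
    "Scaffold for multi-digit multiplication with worked examples",
    "Exit ticket: supply and demand vocabulary check",
    "Mini lesson on fraction prerequisites for Unit 5",
]
print(search("scaffold to help a student with multiplication", resources, k=1))
```

The payoff is the retrieval step, not the similarity metric: a teacher asks in natural language and the closest curriculum resource surfaces, rather than the teacher having to know it exists.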

[01:00:48] Alex Sarlin: I love that point of need: the right thing at the right time for the right learners. And if you're bringing in enhancements based on the personal interests of the learners, you're doing it in a way that is actually incorporated cleanly into the pedagogical model.

It's not just, hey, they like this basketball team, so we're gonna mention that in passing in a math question. It's, let's actually bake that into the material and make sure it works in the context of the curriculum. That's hugely important, and it feels like a big step forward for what we're doing in AI.

Sari, I wanna ask you: many district leaders, you mentioned state standard alignment, that is obviously vital and related to the HQIM movement, but they also care about strong controls over AI-generated content, that educators can actually control what comes out of it. And there's a lot of fear and excitement around AI-generated content.

How are you balancing the standardization, the standards alignment, and the control features for what you're doing at Imagine, while also still innovating and not feeling too constrained?

[01:01:45] Sari Factor: Yeah, well, that's a great one. And it's very challenging, but transparency is the key. You know, one of the things we really believe in is that teachers need to be at the center.

We are doing the pre-vetting for them, but ultimately it's their choice about what to present to their students, what to share, what to assign to their students. And so that teacher empowerment is just a non-negotiable for us. It's designed to support educators and administrators. We don't want them pulling from some outside resource if we can provide the resource internally.

We've lived through Teachers Pay Teachers and all the Pinterest resources that lack the coherence that we believe we can provide. So we think we can give teachers control and transparency over the content: they can review it, they can customize it, they can approve it or say, I wanna do something different, give me another way to do it.

That not only keeps the instruction on track, but it builds trust and confidence among the educators who are using our programs.

[01:02:43] Alex Sarlin: Yeah, I love that phrase, teacher empowerment, that it's really about keeping the teacher, as you say, at the center of the experience. We know that in different areas or in different moments, teachers have had to grab a bag of resources from external sites, from wherever they can find them, to fill holes or to try to engage their students in various ways.

But the idea of empowering teachers within the context of the curriculum, within the pedagogy they're actually working with, is really powerful, and not having them flail and pull together all these pieces and patchwork something together. It feels like that's core to your vision there.

[01:03:15] Sari Factor: You also heard Jason talk a little bit about his work directly with educators. He and his team spend a lot of time listening to educators, through research advisory boards and the like for every one of our products, to ensure that we are listening to them about what they would most value the AI doing for them.

So it's not just us coming up with ideas on our own; we're working in collaboration with our school partners, which is what it's all about. If we're not helping them advance their practice, if we're not helping them build to the success that they aspire to, we're not doing our job.

[01:03:52] Alex Sarlin: Jason, you mentioned professional learning and professional development and the idea that teachers can be learning as they go, continuing to expand their own skill sets as they use various tools. But one of the things educators also struggle with right now, some are very on board with AI tools and some are worried about them, sometimes very reasonably, is that one of the big concerns with AI is inaccurate responses, or what they call hallucinations in some cases.

How do you use your approach of grounding AI in the curriculum, this curriculum-informed AI, to address teachers' concerns about hallucinations or inaccuracy? It feels like there's a really clear match there to reduce risk.

[01:04:33] Jason Fournier: It's a multifaceted approach, for sure. And the first thing I would say is, and this is gonna date me, but I've been around the industry long enough to say we've always had corrigenda and errata in our products, because there was always the chance for mistakes and there was always something we might need to correct.

And so as content creators and software developers in the educational space, we've always had to have a path to fix things. That's one thing. The second thing I would say is you can't fix what you can't measure. And so there's an evolving landscape of evaluations and benchmark tools that you can use, from a technical and implementation perspective, both to understand the characteristics of the system you're building and to understand when it's performing in an abnormal way.

And hallucinations are one of those. So you can build evaluators where you have a data set of gold answers: here's a set of questions, here's a set of requests that are common, here's what the right answer is. And then when you run those in production, you can compare the answers and see if the system is finding that answer in the curriculum.

And if not, you can fix, you can build to adjust for that. So we start there: what are the guardrails and the tools that we can implement at a build level that can increase the confidence that teachers have? And that comes back to knowing what's in the curriculum and starting there as our beginning point.
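
The gold-answer evaluation Jason outlines can be sketched like this. It is a minimal illustration, not Imagine Learning's code: `grounded_score`, `fake_system`, and the sample question set are hypothetical stand-ins, and real evaluators would typically use semantic similarity or an LLM judge rather than the substring check used here to keep the sketch self-contained.

```python
def grounded_score(ask_system, gold_set):
    """ask_system: callable taking a question and returning the system's answer.
    gold_set: list of (question, gold_answer) pairs drawn from the curriculum."""
    hits = 0
    for question, gold_answer in gold_set:
        response = ask_system(question)
        # Did the system surface the known-correct curriculum answer?
        if gold_answer.lower() in response.lower():
            hits += 1
    return hits / len(gold_set)

# Toy stand-in for the AI system under test.
def fake_system(question):
    answers = {
        "What lesson covers fraction equivalence?":
            "Grade 3, Unit 5, Lesson 2 covers fraction equivalence.",
        "Which unit introduces ratios?": "I'm not sure.",
    }
    return answers.get(question, "")

gold = [
    ("What lesson covers fraction equivalence?", "Unit 5, Lesson 2"),
    ("Which unit introduces ratios?", "Grade 6, Unit 1"),
]
print(grounded_score(fake_system, gold))  # 0.5: one grounded answer, one miss
```

Run against a fixed gold set on a schedule, a drop in this score is exactly the kind of abnormal-performance signal Jason describes, whether caused by a regression in the product or an underlying model change.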

The other thing is to then have those systems run at scale. It's not enough to just do it in the lab; you have to monitor in production, you have to be tracking the types of real-world requests you're getting and understanding performance. So, building those systems in. And then there's, as Sari said, the integration of experts.

And so at Imagine Learning, we have learning scientists and researchers and efficacy scientists, and all of those folks can be involved in looking at the outputs of these systems and making sure that what we're providing is accurate. Then we can run those checks in production to see if underlying changes to the models, for instance, are causing differences in system performance.

So I would say we need to think about this all the way around. It's gotta be a mix of guardrails, a mix of those little thumbs-up, thumbs-down controls we see in all the AI products we use. It has to be grounded in the curriculum. And then it has to use cutting-edge science and research to tune these systems to perform well in the classroom.

And we think by doing all of that in a transparent way, making sure that teachers understand the methodologies we're applying and why we're applying them, we can build confidence and trust. And then we have to be responsive when they point out errors that the system is making. So it's really a robust approach, and there's always room to improve.

We're always looking for new learnings and new developments, but we really try to take a broad approach to that.

[01:07:06] Alex Sarlin: That's a really powerful answer, and I think you're addressing some of the core capabilities, but also some of the core constraints, of AI. It's that you have this stack, right?

I mean, you have the foundational models, which change. You have the set of data they're trained on, which grows over time. You have the set of data the system is drawing from, in this case the curriculum, so that it's, as you mentioned, grounded in the curriculum. And then you have the user layer, right? You have the user asking different things in different ways and getting slightly different answers, and it's impossible to control every single thing going down.

But having evaluators in place, having really clear processes, experts, feedback tools, creates an ecosystem of very quickly adjusting and improving and making sure that everything is really secure and correct. I love that answer because I think you're really getting at that. It's not about just a one-to-one relationship.

There are all of these layers happening inside that one query. When an educator says, I want that remedial task for these eight students who are behind, if they ask it in a different way, they might get a different answer. If it's using a different model, they might get a slightly different answer.

So there's a lot of work to do there. 

[01:08:11] Jason Fournier: That's spot on. And Alex, I'll add one more thing to that. Anyone who's been around education and educators knows that context is so important. And so when you think about all of those pieces of the stack that you described, there are certainly topics that in the right educational setting are appropriate and in a different setting are inappropriate.

And that's a piece of this that the technology providers don't always have the ability to adjust; they're making commercial products at scale. So there's the ability to know that, say, when you compare two Viking tribes, there are gonna be aspects of violence in that comparison, and that might be appropriate for a certain age level and topic area.

Whereas in a different setting, you might wanna flag that for an administrator. It's that context that really is important for us to capture. And it's all of those nuances that high-quality instructional materials are intended to help teachers navigate as they deliver on learning outcomes.

Like you said, it's a very complex ecosystem, and you're trying to manage all of these pieces to build robust, trustworthy systems that you put into the classroom.

[01:09:11] Alex Sarlin: It seems like a lot of thought and a lot of work, but when you put it all together, it creates incredible teacher empowerment.

They can do so many different things with the combination of AI and the curriculum. So I want to ask both of you, and Sari, let's start with you: as you think about the future of this curriculum-informed AI, and about all the different things AI can offer the classroom, especially for a major provider like Imagine, you have core curriculum, you have supplemental, you have credit recovery, you have so many different areas in which you work.

What do you think is the biggest opportunity for AI in education going forward, especially for educators? How is it going to make educators' lives better, not worse: save them time, give them that sense of empowerment that you mentioned?

[01:09:58] Sari Factor: I think there are a couple things, Alex. One is the notion of having a place where you can go. Our whole thesis about digital first is that you have everything at your fingertips.

But when you have a lot of different components, like we do in Imagine IM, the teacher still has to know where to go and when. So there's the notion of being able to save her time and tee up what that next thing might be, based on what her need is, and over time learning how she likes to teach, the students that she has, and what they need.

Informed by that student data together, not only can that save the teacher time, but it frees her up to develop more meaningful relationships with her students in the classroom. I mean, when I think about a middle school or high school teacher who has 150 or 180 students: they don't get much time today with individual students or small groups of students, and we've asked them to change their practice pretty dramatically with an inquiry-based or problem-based approach.

So the fact that they can be using the curriculum and improving their practice day by day through curriculum-informed AI, this is what it's really all about: how do we help lift teachers up so they can improve their practice, so they can then be more satisfied in their jobs and hopefully stay in those jobs longer than teachers typically do.

That's what I'm excited about, because the ultimate is the creativity that teachers have and how they bring their talents to the classroom to improve student learning.

[01:11:38] Alex Sarlin: I love that idea of the AI learning how the teacher teaches, how they like to teach, how they prefer to teach, and then starting to give recommendations or ideas or suggestions very much in relationship to that.

That feels enormously powerful. Jason, same question to you: what do you think is the biggest opportunity for AI, and this curriculum-informed AI specifically, to save teachers time while still protecting their creativity, their autonomy, their excitement about the role?

[01:12:04] Jason Fournier: We recently published a study about some writing feedback tools that we gave teachers about a year ago.

What we were able to show was that teachers using those tools gave feedback to students on average 28% faster, nine hours faster, than when they were doing it by themselves. And I think it's that speed of feedback, giving students what they need in the moment so that they can respond more quickly and integrate it more quickly.

That's important, but the reality is that's table stakes; efficiency gains are table stakes. It's everything else that Sari mentioned about unlocking teacher creativity that's really the exciting part about all of this. So I think the vision is that curriculum-informed AI approaches give teachers kind of a team of experts around them.

You can imagine a squad of agents supporting them in bringing to life the experiences that they always imagined and dreamt of being able to give their students but didn't have all of the resources they needed to do it. And so we can encode the expertise we have as a company with deep experience making these products into these tools, and give that to teachers so that they can leverage it to bring the most powerful experiences they can imagine to the classroom.

And in doing so, when it's grounded, they have the opportunity to make sure it's aligned, that the representations match, that there's consistency, that they lower cognitive load, all the things we know are critical, but they can do so in ways where really their imagination is the limit. And so I'm excited about not just the generative AI capabilities that make text, but multimodal models, evolving technologies like Google's Genie, where now teachers can bring all of these tools to bear, with all the supports and the context they need, to really innovate in the classroom.

So that's what really gets me passionate about this: there's just so much opportunity to reimagine the learning experience for teachers.

[01:13:51] Alex Sarlin: A hundred percent. And yeah, anybody who heard him mention Google's Genie and has not seen the Google Genie demo should be googling that right now.

'Cause it's incredible. And your mention of "imagination is the limit" feels like the motto of the AI era, because that is where we're heading. We really are heading to where whatever you can imagine doing in a classroom, or with a group of students, or to get to a certain learning objective, is gonna be possible.

Now you can do it through videos and games, multimodal, like you said; you can do it through curriculum-informed instruction, bringing all of these different pieces together. It's a very exciting vision. I really appreciate you sharing it with us. Sari Factor is the Vice Chair and Chief Strategy Officer of Imagine Learning.

Jason Fournier is the VP of Product Management for AI Initiatives and Data at Imagine Learning. Thank you both so much for being here with us on EdTech Insiders Week in EdTech.

[01:14:40] Sari Factor: Thanks, Alex.

[01:14:41] Jason Fournier: Thank you, Alex.

[01:14:41] Alex Sarlin: We have a fantastic guest this week on the Deep Dive for Week in EdTech from EdTech Insiders. Caleb Hicks is the CEO of SchoolAI.

He's a lifelong educator with 20 years of experience as a classroom teacher, instructional designer, and EdTech founder. He's created an entrepreneurship program for teens, led instructional design at Apple, and co-founded Lambda School, one of the groundbreaking coding bootcamps, and he's been focused on creating personalized learning experiences that drive outcomes for years.

Caleb Hicks, welcome to EdTech Insiders. 

[01:15:17] Caleb Hicks: Thank you, Alex. It's fun to be here. I appreciate you having me on.

[01:15:18] Alex Sarlin: I appreciate you being on. I am a big fan of SchoolAI. I think you're doing incredibly interesting work, but let's start with some of this journey you've been on. You've done a lot: you've been at Apple, you've co-founded Lambda School.

What do you see as the through line that connects all of your experiences in ed tech that culminates in SchoolAI? 

[01:15:38] Caleb Hicks: Yeah. I think the core thing is I was a classroom teacher. I taught middle school and, as you mentioned, I started a teen entrepreneurship program. I taught all the computer classes. One thing that a lot of people don't think about with middle school teachers is that they have hundreds of students at a time.

So I had 42 desks in my classroom and I was teaching seven or eight periods a day. You do the math: I had upwards of 300 kids walking in and out of my door every single day. And as a teacher, you're faced with this impossible choice: do I design my class for the top 10% of students that love it and want more of it and deserve it, right?

Or the 10 or 15% of students that are really struggling, whether that's a personal issue, an at-home issue, a learning issue, a social issue? There are lots of different reasons students could be struggling with your class, or maybe they just don't like it, right? There are lots of reasons they could not like it.

Or the 80% of students in the middle of your classroom? And that's painting with broad brushes, but teachers are faced with that impossible choice every day. And I think the through line of my career is trying to solve that problem, both in my classroom, where I moved from a traditional curriculum to that teen entrepreneurship program that you mentioned,

where the grade in my class was based on how many dollars you made out of a thousand. You had to make a thousand dollars during the semester; I did not care how. My class was like a project-based learning lab. I don't know that I knew project-based learning in the pedagogical sense that people talk about it today, but that was basically what I was doing. Through to Lambda School, where every hour we would send out a Slack message with six emoji, and students would pick one and tell us why.

I would attribute that to: what teacher were they with? What teaching assistant did they have? What curriculum were they working on? What project were they working on? And I would take their outcomes assessment at the end of the week, and I would say I knew, better by the hour, how every student was doing and why, and what that meant for every piece of the instructional design of the school. That allowed me to treat the school experience like a product manager does, with

really granular analytics. We had a one-to-eight teaching-assistant-to-student ratio. We did a one-on-one with every student every day that turned into a writeup that escalated up to the teacher. And when a student went yellow or red, I had a team of 800 teaching assistants that would swoop in and address whatever was needed anytime a student was falling behind.

But that was heavy survey fatigue, and wildly expensive, and totally inaccessible to the traditional classroom. So the first time I used AI end to end to build a curriculum, then have it start teaching me parts of it, then ask me questions to assess how well I understood it, the light bulb went on for me: okay, you can start doing personalized learning, which we all know, right?

We're all excited about every student getting an AI tutor. Great, that's very cool. But what can you do when you start having a hundred AI agents looking at the conversation and saying, this student gets this, this student missed this? You can start doing that for a whole class at a time and tell the teacher, hey, these four students need support on this concept.

These nine students are ready to move on, and here's a way to allow them to do that, and by the way, the AI has already started doing that with them. And these other students, they're good, don't worry about them, but we'll probably adapt tomorrow's lesson plan for them.

And most importantly, this one student came to school hungry today. 

[01:19:25] Alex Sarlin: Mm-hmm. 

[01:19:26] Caleb Hicks: Okay, so what do you do for that one student? Anyway, that's it: how do we start giving teachers that tool when they have hundreds of students, or maybe they just have 28 students as an elementary teacher? How do you give them that red, yellow, green, blue view of how every student is doing and why?

And give them the tools to actually go and address that. 
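
The red/yellow/green/blue roll-up Caleb describes can be sketched as a simple aggregation. This is a hypothetical illustration, not SchoolAI's code: the `triage` function, the mastery thresholds, and the student records are all invented for the example; in practice the per-student signals would come from AI agents reviewing the tutoring conversations.

```python
def triage(students):
    """students: dict of name -> {"mastery": 0..1, "flag": optional str}."""
    groups = {"red": [], "yellow": [], "green": [], "blue": []}
    alerts = []
    for name, s in students.items():
        if s.get("flag"):                  # e.g. "came to school hungry"
            alerts.append((name, s["flag"]))
        if s["mastery"] < 0.4:
            groups["red"].append(name)     # needs support on the concept
        elif s["mastery"] < 0.7:
            groups["yellow"].append(name)  # watch closely
        elif s["mastery"] < 0.9:
            groups["green"].append(name)   # on track
        else:
            groups["blue"].append(name)    # ready to move on
    return groups, alerts

students = {
    "Ava":  {"mastery": 0.35},
    "Ben":  {"mastery": 0.75},
    "Caro": {"mastery": 0.95},
    "Dee":  {"mastery": 0.55, "flag": "came to school hungry"},
}
groups, alerts = triage(students)
print(groups, alerts)
```

The point is the shape of the output: the teacher gets a small number of actionable groups plus urgent one-off alerts, rather than a hundred raw transcripts.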

[01:19:44] Alex Sarlin: One of the things I'm hearing you say, and you're not saying it in exactly this way, is that

teachers are faced with this incredibly difficult set of circumstances. They have up to 300 kids, and even 30 kids is a lot. It's very hard to differentiate, it's very hard to keep track, and you have to figure out who to focus your instruction on: the lowest percentiles, the highest percentiles, the averages in the middle. And AI, what it does is create all of these additional hands and brains that you can add to an educational experience. So it can do that kind of pulse surveying.

It can do data analysis, it can differentiate instruction, it can translate instruction. It can do so many different pieces. And I feel like that's what's so exciting about what you're doing with SchoolAI: you've infused this platform with many different ways of seeing the educational experience.

It almost serves as a team of AI experts helping out an individual teacher or a school in a classroom. Tell us about how SchoolAI actually does this. It has a lot of different pieces.

[01:20:43] Caleb Hicks: Yeah. Part of it is we thought, how do we flip the ratio so that every student has 30 teachers, instead of every teacher having 30 students?

Right? So we do all the traditional stuff that you could guess AI could do for teachers: we do the lesson planning, we do the content adaptation. All of that, I think, is like snacks, bite-size experiences that let the teacher recognize that it's useful for them. Because teachers have to feel like, hey, I know what this is.

I'm comfortable using it myself, and I see a path now to students using it in a way that is good for them. It's not just cheating in my classroom; I can actually use this as a tool in my classroom. So we were fortunate to be able to partner with Jordan School District in Salt Lake City, Utah.

They won ISTE national recognition for their willingness and ability to put AI in students' hands early, and we got to work with them in doing that. Because when we gave them a FERPA-compliant, ChatGPT-like thing, that was the first thing we built with them, they said, great, now we want to use it with students.

We said, how can we create this with students? In particular, we were working with a couple of English language arts teachers, and they said, I want a writing tutor that won't write the essay for them. We said, okay, great, let's go figure out how to do that. A writing tutor that won't write the essay for them sounds easy;

actually, it's really hard. I think for a while we might have been some of the best people in the world at getting AI to not do things, because we started doing this thing where the student would ask a question, and then we would have an AI on the side look at the whole conversation and the rules that the teacher had set, and say, this is what you are allowed to respond with.

And then give that back to the conversational AI, which then responded, right, wrong, or otherwise. So again, I think we were some of the best in the world at keeping AI on track. So we created this concept we call Spaces. They're learning workspaces: you have an AI assistant and you have a thing that you are trying to do.
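The two-pass pattern Caleb describes, where a moderator agent reads the conversation plus the teacher's rules and tells the tutor what it is allowed to do, can be sketched roughly like this. This is a minimal illustration, not SchoolAI's implementation; the function names, the keyword check standing in for a real model call, and the constraint fields are all invented for the example.

```python
# Hedged sketch of the "AI on the side" guardrail pattern: a moderator
# pass decides what the tutor may do, then the tutor responds within
# that constraint. Real systems would call a model in both passes;
# here simple keyword rules stand in for those calls.

def moderator(conversation, teacher_rules):
    """Inspect the latest student turn and the teacher's rules,
    and emit a constraint for the tutor."""
    last = conversation[-1].lower()
    if "write the essay" in last or "write my essay" in last:
        return {"allowed": "guide_only",
                "note": teacher_rules.get("on_refusal", "")}
    return {"allowed": "answer", "note": ""}

def tutor(conversation, constraint):
    """Respond, but only within the moderator's constraint."""
    if constraint["allowed"] == "guide_only":
        return "I can't write it for you, but what's your main argument so far?"
    return "Sure, let's work through that together."

def respond(conversation, teacher_rules):
    # The moderator runs first on every turn; its output gates the tutor.
    return tutor(conversation, moderator(conversation, teacher_rules))

rules = {"on_refusal": "Ask a guiding question instead."}
print(respond(["Can you write the essay for me?"], rules))
print(respond(["How do I structure a thesis statement?"], rules))
```

The key design point is that the tutor never sees the raw request alone; every reply is filtered through the moderator's verdict, which is why the pattern is good at "getting AI to not do things."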

Now, a year or two ago, a Space was actually just chat. It was you and an AI agent talking back and forth, with teacher-set instructions on what you were going to do. That could be a pulse check at the beginning of class, that could be an exit ticket, that could be an interactive simulation of supply and demand that adapts to your interests.

If you're interested in fashion and retail, it's gonna teach you about supply and demand using a sneaker drop in New York City. And if you're interested in the NBA, it's gonna go, hey, there's literally only one LeBron James; if you want LeBron James, you're gonna have to beat everybody else on a total package, right?

So you can start teaching supply and demand through what you actually care about. So that's great; we're very excited about personalized learning and adapting to students' interests. But the fun thing that we introduced this summer is this concept we call Power-Ups.

Power-Ups are essentially apps that the AI can use with you. So let's pretend you're writing an essay. You can have Google Docs as an app, and the AI is writing the essay with you, right? Not for you, with you, as you're writing. The AI can read what's going on in Google Docs, and as you ask the AI questions, it can work in Google Docs with you, okay?

Now imagine that for Canva. You're designing something, you're in a design class, or you're just trying to create something. Having the AI be able to read and analyze and work with you on what you are doing, you're now moving way beyond chat. It's really cool. Now, it's also really hard to build, and since it's the end of September, I wish this was live for everybody. For the people that have early access to the prototypes, it is very cool, and it still needs work, because it's very hard to do right, where AI can see what you are doing and work with you on it. But imagine you're watching a YouTube video. We all remember, Alex, when we were growing up and we had a sub, and the teacher put on a video and we filled out a worksheet while it went along, right?

It's boring. What if the AI could actually watch with you, pause the video, ask you questions, resume the video? You ask questions, and it can say, hey, that's coming in five minutes, we'll get there. And meanwhile, again, you have AI agents that are looking at the conversation and surfacing insights up to the teacher: this student got it, this student didn't, you probably need to review this concept when you come back to class. That feedback loop is, I think, the special part of what we're building. We're doing really cool things with AI and making that accessible and managed and safe and guardrailed to put in students' hands.
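That "surfacing insights up to the teacher" loop could be sketched as side agents classifying each student conversation and grouping the results for a dashboard. Everything here is illustrative: the keyword classifier stands in for a real model call, and the labels and function names are not SchoolAI's API.

```python
# Toy sketch of the teacher feedback loop: a side agent labels each
# student conversation, and the results are grouped so the teacher
# sees at a glance who got it and who needs review.
from collections import defaultdict

def classify(conversation_text):
    """Stand-in for an AI agent judging whether the student got it."""
    text = conversation_text.lower()
    if "confused" in text or "i don't get" in text:
        return "needs review"
    return "got it"

def teacher_dashboard(conversations):
    """Group students by insight label for the teacher's view."""
    summary = defaultdict(list)
    for student, text in conversations.items():
        summary[classify(text)].append(student)
    return dict(summary)

insights = teacher_dashboard({
    "Ana": "That makes sense, the price rises when supply drops.",
    "Ben": "I'm confused about why the curve shifts.",
})
print(insights)
```

The value is in the aggregation step: per-conversation analysis only becomes actionable when it rolls up into something a teacher can scan before the next class.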

But the special thing we do is that analysis, those actionable insights up to the teacher. And, sorry, I've got some greatest hits; here's an important one. Data-driven education is a lie, and it's not because nobody wants it; it's because we don't have the right data in the first place. Alex, do you have any kids?

I do. What ages? Three and eight months. Okay. When they go to school, the teacher is gonna mark attendance for them, and they're gonna start giving them grades on things. Does that tell the story of your three-year-old? No, not even. It does not. It can't. And a teacher can barely even do that, because they're so overwhelmed with so many other things.

So how do we give them the right data points to understand like, Hey, this student tried this thing and failed. And by the way, maybe that was by design, and so you want to help them with that, right? Or this student actually came back from recess and got made fun of at recess and like they're not even in the head space to be able to think about like multiplication.

Okay, how do you address that as a teacher? Right now, teachers are the most mission-driven workforce on the planet; if they weren't, they would be working somewhere they made more money. And so how do you give them the tools to have the light bulb moments that they actually got into the profession to have in the first place?

Okay, sorry. The special part is that feedback loop back to the teacher. And we think about a new atomic unit of data that allows everyone around the student to make better decisions. That could be their teacher, that could be their school counselor, that could be the school leadership that's able to see that granular level of insight across the whole school.

And what I get most excited about: I have four kids, 14, 12, 10, and 8 years old.

[01:26:57] Alex Sarlin: Mm-hmm. 

[01:26:58] Caleb Hicks: They come home from school. I say, how was school today? They say it was fine. Having a better way to understand what they're interested in, what they like, what they love, and, and I ask them all the questions, right? But kids are willing to talk to AI in a completely different way that they're willing to talk to me.

And so just creating these feedback loops between all the stakeholders of the school to bring them closer together in support of making awesome experiences for students is really special. 

[01:27:22] Alex Sarlin: And the idea of these granular data points that tell a, a much richer story about what's happening on any given moment, any given day, any given student, any given lesson, it has an exponential effect on understanding what's actually happening in the classroom.

One of the things I admired enormously about SchoolAI from a product lens, and you mentioned the Spaces and these Power-Ups, is that you are very, very good at taking these very abstract, hard-to-grasp concepts in AI and structuring them. People talk about guardrails, and you mentioned the idea of getting AI to not do things. Often people consider that putting guardrails on the AI, but I've sort of preferred recently to think of it as putting structure on the AI, right? An open chatbot doesn't know what you're there for. You say, write an essay for me, it's gonna write an essay for you. But a structured AI experience in a Space can do, you know, a supply and demand simulation. I talked to a teacher at ISTE about SchoolAI; he said they were building escape rooms in their Spaces. I heard that's a fun one. You do historical figures you can talk to, you do tutors. So I love this concept of a Space that educators can configure to do exactly what they wanna do in the classroom, because it's basically creating incredibly clear structure about what the AI is doing in that particular moment.

Yeah. And that's what's lacking in so much of the frontier model work. 

[01:28:42] Caleb Hicks: Yeah, I think open AI and Anthropic almost did us a little bit of a disservice in terms of framing AI only as an assistant because an assistant does things for you. Learning happens by you doing things for yourself with help, right?

Scaffolded help. The two ingredients of high-impact learning are knowing where you are and where you need to be, and a mentor or guide to bridge that gap. Right? And you can design a really cool experience that AI can guide you through to bridge you from where you are to where you wanna be.

And I think we doubled down on structure with Spaces this summer. We introduced a concept of agendas. We always had the ability to put an outline inside a Space when you were building one: first do this, then that, like the escape room example you mentioned. Teachers were writing these really elaborate guides for escape rooms.

And the AI was good at following that because of that orchestration we talked about, where we have multiple AI agents; we do somewhere between 300 and 500 words of thinking before the AI even starts responding to the student. And we've been doing that since way before there were these reasoning models. But the thing that we introduced was adding more structure.

So you can now say: first, I want you to do a pulse check at the beginning of class; then I want you to introduce this concept of supply and demand, and I want you to adapt it to the student's interests; then I want you to do a quiz or review or some formative assessment; then I want you to have them write something, right?

And it's not all tied to chat. But for those of you in the audience listening who have used AI: the conversation gets long enough and it starts losing the plot.

[01:30:29] Alex Sarlin: Yeah, 

[01:30:29] Caleb Hicks: right. That's a big part of what these agendas solve. It's making sure that the AI is focused on the one task to be done right now with you.

And yeah, these structured, AI-led experiences are, I think, part of what we have to do for AI to make sense for students in classrooms.
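The agenda idea described above can be sketched as an ordered list of teacher-authored steps where the assistant is only ever handed the current step as its task, which is how a long conversation avoids "losing the plot." This is an assumption-laden illustration; the class, step wording, and advancement logic are invented, not SchoolAI's actual design.

```python
# Minimal sketch of a teacher-authored agenda: the AI's context for each
# turn contains only the current step, so drift is bounded by design.

class Agenda:
    def __init__(self, steps):
        self.steps = list(steps)
        self.index = 0

    def current(self):
        """The one task the AI should focus on right now (None when done)."""
        return self.steps[self.index] if self.index < len(self.steps) else None

    def advance(self):
        """Move to the next step once the current one is complete."""
        if self.index < len(self.steps):
            self.index += 1

agenda = Agenda([
    "Run a pulse check at the start of class.",
    "Introduce supply and demand, adapted to the student's interests.",
    "Give a short formative quiz.",
    "Have the student write a reflection.",
])

# Simulate a class session: each turn, only the current step would be
# injected into the AI's instructions.
focus = []
while agenda.current() is not None:
    focus.append(agenda.current())
    agenda.advance()

print(len(focus))  # 4
```

Scoping the prompt to one step at a time is the structural fix for long-conversation drift: the model never has to keep the whole lesson plan in working memory.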

[01:30:45] Alex Sarlin: I think it's a huge amount of what we have to do. I mean, I think a lot of the hand wringing and worry about AI in the classroom comes from the idea that it came out in 2022 with chat bt of like an assistant that does whatever you ask it to do.

And people said, oh my goodness, I can't put that in front of my 13-year-old, because I can imagine what they're gonna ask it to do. And I think there've been all these dominoes falling from there. But I think SchoolAI has really moved the needle on this. I mean, one of the things that you do that I think is really interesting, and I know we're almost at time here, is you are sort of serving as a one-stop shop for AI solutions for schools.

Even those that would otherwise be worried about guardrails, about safety, about privacy. I'd love to hear you talk about this; I'm sure you've talked to so many people about it. But just in a nutshell: how do you take a school environment where you have people with different levels of expertise in AI, of comfort, or of suspicion, and say, this is a solution that can really address your fears and take your school to the next level without causing the kind of chaos that you might anticipate?

[01:31:46] Caleb Hicks: We think a lot of people build products thinking, okay, what do people, what are people ready to use right now? And you build like the car for them, like right now, like that, that you're solving that problem for them. And we think about that too. But we actually think the car is make school awesome. Right.

It's like, how do we work backwards using AI, a new foundational technology? It isn't just a Chromebook; it's like the internet all over again. It fundamentally changes how we think about certain things. And every day I'm thinking, okay, where can I put AI to work to make school better?

I talked about this: educators are the most well-intended workforce on the planet. They're the most burnt out, they have the highest turnover, they're the lowest paid, and they're charged with the highest social task that we could possibly ask them to do. And it's impossible to do that.

So you take this new technology and you put it to work to make all of these well-intended people able to do the part that they got into the job for in the first place. And so, yes, we worked back to personalized learning with Spaces with students, but all of that is in service of, how do we make the school counselor's job easier?

How do we start solving the real problems that exist in schools? We talk about all these surface-level problems, but I think one of the hardest problems in schools is that you have students, you have teachers, you have parents or families, and you have school leaders, and they're on completely different pages, not because they want to be, but because they're all overwhelmed with their own stuff.

So how do you give them a shared context to work from so that they can solve that? That's what we're actually building: a new shared context where AI can be everyone's assistant in making school better for all of the people who are so well-intended, working in these schools.

And that's the unifying piece that ties all of what we're doing together.

[01:33:40] Alex Sarlin: How do we make school awesome, and how do we make everybody's job more fun and exciting and closer to what they got into the profession to do? That's a vision that I think everybody can get behind, and AI just becomes the means by which you do that. It's not an AI product; it's a way to elevate our experience, and AI is just part of how it works, because it can do all this work for us.

[01:33:59] Caleb Hicks: I know we're about to wrap, but I'll just say this one thing. It's the weirdest thing for me, as the founder of a company called SchoolAI, to say that AI is not the thing, right?

It's just a tool that gets us to a thing, and for us, that thing is: make school awesome. For students and the people who support them, by finding out what they need and making it happen. And we believe it's a foundational technology to help the people in the schools do that at a level that they haven't been able to before.

So it's really fun to build in this time for these people that we all care so much about and hold in such high esteem, but who are in a really hard spot.

[01:34:40] Alex Sarlin: Yeah. And I've talked to some of these innovative SchoolAI educators, and they're just glowing with excitement about all the things they've done and that they're planning to do. I mean, you really see it: their experience has changed very distinctly because of this. It's really exciting work. Caleb Hicks is the CEO of SchoolAI. He's also a lifelong educator, with 20 years of experience as a teacher, instructional designer, and ed tech founder. Thank you so much; this was a real pleasure of a conversation. I hope we can do a longer one sometime.

[01:35:06] Caleb Hicks: Would love to, would be happy to. Alex, this was fun. Thanks for having me on.

[01:35:11] Alex Sarlin: Thanks for listening to this episode of EdTech Insiders. If you like the podcast, remember to rate it and share it with others in the EdTech community. For those who want even more EdTech Insiders, subscribe to the free EdTech Insiders newsletter on Substack.
