Edtech Insiders

Week in Edtech 1/28/26: SchoolAI in Classrooms, ChatGPT Blocked, OpenAI’s $30B Bet, Gemini vs Anthropic, China’s AI Literacy Push, Phone Bans Rise, Higher Ed Retention Challenges, and More! Feat. Jeremy Smith of pega6 & Stewart Brown of Code4Kids

Alex Sarlin and Ben Kornell

Join hosts Ben Kornell and Alex Sarlin, along with special co-host Mike Palmer, host of Trending in Ed, as they break down the biggest stories shaping AI, K–12 policy, higher education, and the global future of education.

Episode Highlights:
[00:03:34] SchoolAI study shows teachers using AI for reasoning and inquiry
[00:09:58] Denver Public Schools blocks ChatGPT over safety and privacy concerns
[00:12:20] SoftBank invests another $30B in OpenAI as ads roll out
[00:13:24] Gemini and Anthropic lead the race for AI in education
[00:20:36] China launches nationwide AI literacy for K–12
[00:29:58] Most U.S. states still lack formal AI guidance for schools
[00:33:13] Phone bans spread rapidly across schools
[00:38:44] Higher ed enrollment rebounds but retention remains weak

Plus, special guests:
[00:46:19] Jeremy Smith, CEO and Co-founder of pega6, on one-year AI-first career accelerators
[01:11:29] Stewart Brown, K–12 Computer Science and AI Literacy Leader at Code4Kids, on CS as a core elementary subject

😎 Stay updated with Edtech Insiders! 

Follow us on our podcast, newsletter & LinkedIn here.

🎉 Presenting Sponsor/s:

Every year, K-12 districts and higher ed institutions spend over half a trillion dollars—but most sales teams miss the signals. Starbridge tracks early signs like board minutes, budget drafts, and strategic plans, then helps you turn them into personalized outreach—fast. Win the deal before it hits the RFP stage. That’s how top edtech teams stay ahead.

Tuck Advisors was founded by entrepreneurs who built and sold their own companies. Frustrated by other M&A firms, they created the one they wished they could have hired, but couldn’t find. One who understands what matters to founders, and whose northstar KPI is % of deals closed. If you’re thinking of selling your EdTech company or buying one, contact Tuck Advisors now!

This season of Edtech Insiders is brought to you by Cooley LLP. Cooley is the go-to law firm for education and edtech innovators, offering industry-informed counsel across the 'pre-K to gray' spectrum. With a multidisciplinary approach and a powerful edtech ecosystem, Cooley helps shape the future of education.

Innovation in preK to gray learning is powered by exceptional people. For over 15 years, EdTech companies of all sizes and stages have trusted HireEducation to find the talent that drives impact. When specific skills and experiences are mission-critical, HireEducation is a partner that delivers. Offering permanent, fractional, and executive recruitment, HireEducation knows the go-to-market talent you need. Learn more at HireEdu.com.

[00:00:00] Ben Kornell: I feel like we're in this ping pong back and forth between the optimists and the skeptics, and somewhere in the middle is the truth. And the other dimension is time: some may be right for a period of time, others may be right on a different time horizon. But you know, when you're working in education, you kind of have to triangulate, because you've got real students now who need to be prepared for the world in the future.

So you almost have to acknowledge the skeptics' take on here's where things are today, but you have to at least do a probability-weighted guess at the future where the optimists are, because you're preparing: a kindergartner in elementary school today, in 2080, is gonna be in the prime of their career.

So it's a really tricky one. 

[00:00:50] Michael Palmer: What we saw in November of 2022, folks are now seeing heading into 2026, with the level of capacity and agency that non-technical users are getting with this stuff. So I think this reflects a little bit of that, where trust, I think, is paramount. And some of these brands, the whole move-fast-break-things sensibility, I think people approach that with caution, particularly in K-12.

[00:01:22] Alex Sarlin: Welcome to EdTech Insiders, the top podcast covering the education technology industry, from funding rounds to impact to AI developments, across early childhood, K-12, higher ed, and work. You'll find it all here at EdTech Insiders.

[00:01:38] Ben Kornell: Remember to subscribe to the pod, check out our newsletter and our event calendar, and to go deeper, check out EdTech Insiders Plus, where you can get premium content, access to our WhatsApp channel, early access to events, and back-channel insights from Alex and Ben.

Hope you enjoyed today's pod.

EdTech Insiders listeners, it's another Week in EdTech, and this week we have a very special guest co-host: Mike Palmer, founder of Palmer Media and the host of Trending in Ed. Welcome, Mike, to the podcast.

[00:02:17] Michael Palmer: Thanks, Ben. Long-time fan, first-time caller, so I'm very pleased to be here with you.

[00:02:23] Ben Kornell: Yeah, we're so glad to have you on.

For those of you who don't know, Mike's podcast, Trending in Ed, is a long-form podcast where he dives deep into all the sorts of issues we talk about here on EdTech Insiders. So we thought, let's do a joint Week in EdTech episode. You'll be able to find this podcast here on the EdTech Insiders channel, but we'll also cross-post it over on Mike's. In terms of what's going on with the pod, we have a ton of great guests coming up.

We've got Jeremy Smith from pega6. We have Karl Rectanus, who's starting a brand new CEO job. We have Stewart Brown, Brandon Smith, and Barak Glanz. We have so many great guests coming on to talk about the new and exciting products and companies they're working on. We also have our EdTech Insiders Happy Hour and Bay Area Summit coming up February 12th in San Francisco.

We hope to see you there. A special thank you to our main sponsor, Cooley, as well as our supporting sponsors, HireEducation, Tuck Advisors, and Starbridge. So without further ado, we are gonna dive into the headlines. Are you ready, Mike?

[00:03:33] Michael Palmer: I'm ready. Let's do it. 

[00:03:34] Ben Kornell: Alright. So our main story today is really about this new SchoolAI study that looked at 23,000 teacher-created AI learning experiences across English language arts, math, science, and social studies, from elementary through high school.

Really, we've been looking for data on how teachers are using AI and what's going on. Some of the key findings, just to give the highlights for our listeners (and you can check this out in our show notes): teachers are using AI to prompt students to reason, evaluate, and make decisions, not simply recall information.

So this idea that it's a shortcut to the right answer? Teachers are already hacking that. More than 75% of AI activities are anchored in core curriculum, layered with personalized, interactive, and interdisciplinary experiences. So this idea that this is living in some separate universe? It's anchored in the core curriculum. And teachers leveraged AI to increase interactivity and support student agency. This is a really promising report.

Now, obviously it's from SchoolAI, but 23,000 learning experiences is not a small sample size.

[00:04:46] Michael Palmer: Yeah. 

[00:04:46] Ben Kornell: As you dove into this, what were some of your takes?

[00:04:49] Michael Palmer: Well, I mean, I think it aligns a little bit with the backlash against AI that I know you've talked about a little bit on the podcast as well, where the reluctance, I guess, to put it directly in front of students is leading us to treat teachers as the main point of intersection with the AI.

And this reflects that they're actively engaging in it, that they have some agency in it. And since a lot of these tools are pretty open-ended, I think they're finding useful opportunities to kind of build on their pedagogy and actually make their classroom experiences better. I would be curious a little bit about who exactly is sampled and which sorts of adopters we're talking about here, but it does look promising in that there is a tip of the spear of implementation that's emerging here.

And the more we can listen to what these people are doing, the more we can sense and respond to what they're putting out in the world, and the entire practice should be moving in a positive direction with AI. And I think it's a place where, I think it's Roger Spitz who talks about the Black Mirror Effect, where we tend to think about science fiction futures, and the future in general, as dystopian.

And I'd say this is perhaps a note of optimism around some positive use cases of AI for teachers. 

[00:06:17] Ben Kornell: Yeah, and I think the fact that this report even exists is a big step forward, in that we're finding somebody who's been successful selling AI to schools, SchoolAI, saying that their outcomes are measured by cognitive demand, interactivity, student agency, and teacher roles across subjects and grade bands.

They're clearly saying it is not just, are kids getting the right answer. It's not just, are they scoring better on some test. They're actually going into the sausage-making of learning and saying, are things actually being cognitively offloaded, or are they being more rigorous? And I really would encourage anyone to read the report, because it gives a bunch of concrete examples of projects where this is happening.

My worry, and I think maybe where this report could get misinterpreted, is that when you get into it, these are really academic projects. It's clear that SchoolAI has taken something that's pure AI and made it really, really deeply about, you know, basically curriculum. And this idea that it's connected to the curriculum also allows it to really deepen the learning.

And so I think the one false takeaway from this might be, like, all generative AI is effective in schools. Yeah. It's this: curated by the company and then curated by the teacher, so that it really is a deepening experience.

[00:07:50] Michael Palmer: Mm-hmm. Yeah. It does feel like the AI trend is concurrent with other ones, like high-quality instructional materials, which has been a kind of buzzword that's been out there for a couple years now.

But, like, this idea of connecting the technology to emerging trends around content development and instructional design, and like the science of reading, is another place where I think it's really interesting to look at the intersection between AI and these other concurrent trends. But I've heard it described as the last infinite mile, which is, when you're at the implementation stage, what makes something successful is very multivariate and it's hard to disentangle.

Is it the technology? Is it the intersection between the technology and the content? And then in this case, I think there's a lot of teaching practice that is implicit in this research that I'd like to dig into more.

[00:08:44] Ben Kornell: It's almost like this study is making the case, even though it's about AI, it's actually making the case for teacher professional development and, like, implementation support.

It was interesting. One of the other takeaways was that science showed the highest levels of autonomy and inquiry, while in reading and social studies, where they found the greatest value was in perspective-taking and interpretation. I think just that takeaway alone tells you that these assignments are not perfunctory, simplistic task assignments.

This is not a multiple choice test that we're talking about, right? These are projects, and these are ways in which kids are actually using the AI to deepen their learning and personalize it, rather than cut to the chase. Relatedly, Denver Public Schools just this last week blocked ChatGPT on all student devices, and they may block it for teachers shortly.

I found that quite interesting, and the primary justification is that they are worried about group chats in ChatGPT, so this idea that multiple people could be, you know, having social conversations with AI, bullying risks, and then, to a lesser extent, data privacy. And at the same time, they are doubling down with Gemini as a safer alternative.

That's already embedded in the whole Google Classroom ecosystem, and they continue to partner with MagicSchool. So is this a story more about ChatGPT losing traction with K-12? Is it about, um, nuance in these bans? What's your read on this situation?

[00:10:29] Michael Palmer: I think privacy, safety, and trust are first and foremost for a lot of folks in decision-making around technology nowadays.

And I think there is almost like a tendency now towards caution when making decisions to adopt new technology. And that's why I think some of the bigger AI players need to be very conscious of sort of the public perception. This reminds me also of Apple, which just signed a deal with Gemini, for Gemini to be its AI firepower. And that to me is also reflective of maybe OpenAI losing a little bit on the brand side.

At the same time, you see Claude Code getting a lot of buzz nowadays as really the next breakthrough, if you wanna talk about vibe coding and agentic AI. I've heard it compared to sort of a ChatGPT awakening: what we saw in November of 2022, folks are now seeing heading into 2026, with the level of capacity and agency that,

[00:11:38] Ben Kornell: yeah, 

[00:11:38] Michael Palmer: non-technical users are getting with this stuff.

So I think this reflects a little bit of that, where trust, I think, is paramount. And some of these brands, the whole move-fast-break-things sensibility, I think people approach that with caution, particularly in K-12.

[00:11:56] Ben Kornell: Yeah, totally. And it does make you wonder, is OpenAI losing their mojo? There was almost this sense that move fast and break things really was working for them for a time.

Mm-hmm. 

[00:12:08] Michael Palmer: Yep. 

[00:12:08] Ben Kornell: And they haven't yet figured out how to pivot. You know? It's almost like the AI revolution is a 20-year process that's happening in the span of just a few years.

[00:12:19] Michael Palmer: Yes. 

[00:12:20] Ben Kornell: They are an incumbent; they should be acting in a more incumbent way, and yet they're burning cash at, like, a phenomenal level.

One of our headlines today: SoftBank is investing another $30 billion in OpenAI. They're launching ads on ChatGPT, and the reports are that these ad costs are as high as NFL broadcasts. And then you also have OpenAI getting caught up in a lot of the current politics of the moment. There's a way in which it just feels like they're losing their eye on the ball.

And meanwhile, Google is just this drumbeat. And for Apple, I mean, that's insane for Apple to cede so much territory to a long-hated rival. And then I think what we've all been admiring is that Anthropic has had the focus from the beginning, which is enterprise use cases. And now the education context is, if you wanna get a job within enterprise, you better learn how to use Claude Code.

And yeah, as a breakthrough...

[00:13:24] Michael Palmer: that'll be an interesting kind of battle. I think within education, I would say OpenAI, maybe on the enterprise side, is mildly relevant, but to me, the real battle for the future of AI in education is a two-horse race between Gemini and Anthropic, assuming Claude Code is like the killer app and is actually the unlock that some of the more recent Gemini versions have been for Google.

[00:13:47] Ben Kornell: Yeah, it's funny that Microsoft isn't playing as much of a role. Copilot is in industry; especially in more conservative industries like defense, Microsoft seems to be doing well. But they forged that landmark partnership with OpenAI in the early days, and now that seems to have lost a lot of mojo.

And then meanwhile we have Meta, where Yann LeCun, man, he was like on the front edge. Then he goes to work at Meta, and basically he's been crapping on the entire AI industry ever since. He's moved out of the US; basically, he's like, I don't wanna live in that political climate. And then on top of that, he believes that the work on large language models is actually a dead end for AGI, and basically his point of view is that we're going to be hitting an asymptote where the kind of energy cost to generate any meaningful change is totally not worth it, and we're not gonna break through. He's been saying this for, I would say, like a year. We have a New York Times article that we'll share in the show notes. But both for Meta and for Microsoft, and where we are headed with LLMs.

How do you interpret all of this through our education lens? 

[00:15:06] Michael Palmer: I mean, the age of slop is here. There's a lot of just not-high-quality content that's out there. There's a lot of misinformation, and I've heard it described as epistemic insecurity (I think that was Dwayne Matthews I heard that from), which is, like, it's just hard to tell what knowledge is nowadays.

There's so much pollution from these generative models. And to me, the idea that that ultimately needs to connect to something more neuro-symbolic, more kind of modeled after the way we understand human intelligence, that seems like a natural evolution to me, and it does feel almost parallel to a hype cycle.

There are funding rounds that are supporting that investment thesis, and the idea that we're gonna see AGI because we're gonna just continue to invest in compute, that's starting to seem a little too cute. The other guy I wanna shout out is Gary Marcus, someone who I've followed for a while. The AI skeptics are something I think you need to keep an eye on.

Even if you're somewhat bullish around AI, they're not necessarily saying that it's not going to really transform the world. Many are saying it is; it's more the speed, and the assumption of only rosy outcomes around rapid AI adoption, that they push back on. And your asymptote comment, I think, is a good one, where we were sort of on an upswing, and it's almost like we've hit a point of inflection. Now, rather than looking like this is exponential into the singularity, it's more like this model is leveling off.

And if you think about innovation theory, it's almost time to jump the S-curve and find a different innovation vector besides just bigger compute and more data centers. 

[00:16:53] Ben Kornell: Yeah. I feel like we're in this ping pong back and forth between the optimists and the skeptics. Yep. And somewhere in the middle is the truth.

Yep. And the other dimension is time. So some may be right for a period of time, others may be right on a different time horizon. 

[00:17:12] Michael Palmer: Right.

[00:17:12] Ben Kornell: But you know, when you're working in education, you kind of have to triangulate because you've got real students now who need to be prepared for the world in the future.

So you almost have to acknowledge the skeptics' take on here's where things are today. Yeah. But you have to at least do a probability-weighted guess at the future of where the optimists are, because you're preparing: a kindergartner in elementary school today, in 2080, is gonna be in the prime of their career.

Totally. So it's a really tricky one. I do feel like, while it may be true that large language models will hit a wall and neural networks are the long-term path, the level of investment, the level of talent, the level of focus also increases the likelihood that neural networks would make a breakthrough.

Almost like the LLM rising tide has lifted the boats of neural networks. Yes. And others working on other problems across AI. And we saw this with Grok too. Some of the more sophisticated, really, really hard-to-solve problems definitely need a lot of energy and compute, but there's also just a lot of low-hanging-fruit tasks that LLMs can do quite well, where on a less powerful model or a less electricity-intensive chip,

[00:18:31] Michael Palmer: Mm-hmm. 

[00:18:31] Ben Kornell: You can be quite effective. 

[00:18:33] Michael Palmer: Absolutely. I mean, it kind of reminds me of just the disruptions we've seen already. Like, if the LLMs were to stop improving today, it would take another five or 10 years to kind of fully reap the impact of what we've already gotten from them. Look at the disruption we've seen.

You know, I was just listening to your episode where you were talking about Duolingo, and the way they've been kind of pummeled in the beginning of this year. I think there's this idea, around technology for language learning (it was interesting seeing Preply in the notes here as well, getting a big investment), that gamified proprietary apps, for language learning and other learning use cases, are gonna get kind of blown up by just the models as they exist today.

So this idea of having sort of a moat around your differentiated learning experience? That's already gone. And then this technology, even if it is asymptoting, you know, it's already revelatory. And then, to your point, it reminds me of Nate Silver's big breakthrough back at FiveThirtyEight, which was like, what if I start aggregating these different models and seeing what those blends start to look like?

I don't see the generative models getting less good. They'll continue to develop, but it's more like we're gonna need to start layering in new innovation. And in some ways it's like smarter investment. This is where stuff like the NSF funding situation gives me palpitations. Like, we need to figure out how to end the competition with China.

But it does feel like a time where, rather than just somewhat lazily investing in OpenAI data centers, can we start to get smarter in how we fund the right level of investment in our learning infrastructure? 'Cause without it, 10, 20 years from now could be a much darker time than we really want it to be.

[00:20:36] Ben Kornell: Yeah, and it's funny that you brought up China in that construct too, because we covered previously that in China, AI literacy, AI education in K-12, like overnight, has become countrywide policy. And the new Chinese AI models are built for a fraction of the cost, right? They're not getting 20, 30 billion. They are just one step behind, if not surging ahead

Yeah. Of some of the US-based models. I think the claim that we often have is like, oh, Western countries should be on US AI models, because if you are on a Chinese model, they'll steal all your ideas and, like, ruin your IP or something like that. Well, one, to your point, we're basically seeing companies get crushed by generalized AI,

[00:21:20] Michael Palmer: Yeah. 

[00:21:21] Ben Kornell: First and foremost. But two, we're seeing a geopolitical landscape where trust in the US is declining, and the idea that the US might take advantage of what you feed an AI, versus what a quote-unquote bad actor like China would do, that differentiation is becoming a lot harder to make, given the current political climate.

One of the things that brought that up for me was Davos, and we have a great guest article in the Edtech Insiders Substack.

[00:21:54] Michael Palmer: Ben, I'm sorry. Is there an Edtech Insiders house at Davos yet? I haven't heard.

[00:21:57] Ben Kornell: Oh, we gotta get one. Okay. We gotta get one. Yeah, because that'll be a...

[00:22:01] Michael Palmer: I'm coming to that happy hour.

Yeah. 

[00:22:02] Ben Kornell: Yeah. We would have fondue and we would have some great white wine, and we would be talking about all things AI and education. But this idea of AI as an existential threat and opportunity, and the role of education, was front and center there in a way that was far more intentional, I think, than the conversations we have here in the US. And that actually rolled over into Bett.

So Bett, for those of you who don't know it, is basically the trade show event of the year in London.

[00:22:39] Michael Palmer: Mm-hmm. 

[00:22:40] Ben Kornell: And BET and GSV joined forces to become one company. Ah, okay. And BET has expanded. They have a bet. South America, I believe A bet Asia and essentially for adtech insiders, we could be traveling the globe every year if we wanted to travel to these things.

[00:22:56] Michael Palmer: Yeah.

[00:22:56] Ben Kornell: But it really, Bett UK is one of those landmark conferences. And Microsoft outlined a bunch of major initiatives, including work with their Copilot offering for schools and classrooms. Google is the leader, but Microsoft has 40% of school districts as existing customers.

[00:23:18] Michael Palmer: Right. 

[00:23:18] Ben Kornell: And at BET there was this like global focus on use cases.

I'm curious, as you're thinking about the geopolitics of how this is gonna roll out: is it China that's gonna be the leader? Is it the US? Is it going to be the UK and Europe? Where are you looking for the path forward, or for the most innovation?

[00:23:41] Michael Palmer: Yeah, I mean, China's tough, scary. Like, you know, I'm a parent of a second grader; he is seven now.

When I read that they were starting their AI literacy curriculum at age six, it definitely was a bit of a wake-up call for me. I feel like we're still catching up on the science of reading and just trying to, like, rebound from the pandemic in the US, and it's sort of like we're licking our wounds, while I think there's a command economy over there that is, you know, being intentional about how to build AI into the future of their economy, the future of their educational system.

I just wanna see the same type of support in the innovation space in the US that we have had historically. If you look at why Silicon Valley exists, it's because of DARPA and the agglomeration of resources that were kind of infused into that initiative, which allowed us to make a meaningful breakthrough and sort of establish a leadership position. And I think you're right: the fear of God should be in us coming out of Davos, and coming out of a more volatile and less consistent approach to foreign policy and treaties and whatnot. We seem much more by the seat of our pants, and I think that certainly can hurt us.

The challenge with China is that I think there still is a perception that they're not necessarily as open as the US has historically been to, like, innovation, and, you know, it's a command economy: the government is really leading. But you know, it is interesting if you look at the demographics. You mentioned South America; Africa, you know, population levels will be shifting. Yeah. That's why I do appreciate the work that you do, because it does feel like there's a tendency, even when talking to experts within the US education system, to be, you know, very navel-gazing about what's happening here and not really looking more broadly.

And I think your spidey sense is probably right: where does capital flow, and do we start to see some impact around, you know, where VC investment and some of this other stuff in the education space starts to flow into other countries? That's what I'd be more concerned about.

I think China almost feels like the Department of Defense, where there will be funding, where we will at least be looking to stay competitive. And that might be a place to look for growth and investment now: where does education intersect with defense and, you know, innovation around that vector? Because it does look like there will be funding flowing, and there will be friction kind of being removed from that space.

Yeah, it's certainly a tricky time to be thinking about the broader geopolitics as they relate to, you know, the future of education, in the US and beyond.

[00:26:49] Ben Kornell: Well, and from an absolute capital standpoint, we are seeing more in Europe. You know, I do think India, it seems like India is rebounding. But China has been pretty much off-limits since the government cracked down on for-profit private education.

I wonder, and it's actually a great point, Mike: if they opened it up again, think of the capital that would flow, given the command economy and their focus on AI. Mm-hmm. And their evolution of AI models. What a huge, like, financial opportunity that would be, and likely a zero-sum game with some of the investments in the US.

So, I hadn't thought about that before, but man, all China has to do is say, hey, we're open to investment, we welcome you to come, and you'd have some of the leading thinkers, funders, and developers, like, jump on that really, really quickly. One challenge whenever you look at absolute dollars is also, like, what you can buy in one country versus what it costs in another country is vastly different. But relative to cost of living, in a way the US is not flat.

It's actually down, because our cost of living has been going up. Whereas if you look at, like, developing countries, developing economies like Brazil, it's quite high. Mm-hmm. Because, you know, from an economic standpoint, the absolute dollars go really, really far.

[00:28:14] Michael Palmer: Yeah. I think it also speaks to, like, a cultural identity challenge that we're facing now, where there was a time when I think part of our cultural identity was Steve Jobs.

It was like, we think different. We are not, you know, Orwellian; we are actually about our ability to break molds and to innovate in new and surprising ways. Which kind of gets back to the data centers. Like, it's not really a breakthrough innovation to say, and we're gonna throw more and more compute at this thing.

The breakthrough innovation was the transformer technology, and, like, how do we keep that as a real differentiator for the United States? Which also ties, I think, to, you know, the intellectual capital that we get through international students, you know, founding companies in the US and getting H-1B visas.

There's plenty to talk about on that front as well, but I do think it is an interesting time to almost take a step back and maybe be a little more strategic about how we understand the long-term impact of things like cutting funding at the NSF, and making it harder for international students to study here and get visas and get on a pathway to start companies and differentiate what America can continue to do.

I think there's some real existential challenges that are out there nowadays. 

[00:29:38] Ben Kornell: Yeah. This idea of our identity is really, I mean, we're not just talking about edtech here; it's, like, up for grabs overall. Yeah. And then within education, we have had an identity that we're the best in the world, that we're the most innovative.

And I feel like that's just gotten crunched and crunched and crunched down. 

[00:29:58] Michael Palmer: Mm-hmm. 

[00:29:58] Ben Kornell: And you meet somebody who's from Singapore, or you meet somebody from the UAE, and they're like, we're building the greatest education system in the world. And there's the optimism, there's the excitement. Yeah. And they're willing to try things that maybe we aren't willing to try in the US.

So it does feel like a cultural paradigm. It relates to one of our stories too: Digital Promise. Our great friend Jeremy Roschelle over there has a new research report about the lack of AI guidance from state departments of education. It does call out Colorado and Louisiana as the kind of star states that have really laid out what innovation in education, and kind of a vision for the future of learning, looks like, and that provide practical implementation guidelines for AI in K-12 classrooms. But I think, you know, as I was looking at this, it totally was realistic that states would not have guidance on this stuff when ChatGPT launched.

Here we are, three and a half, four years later, and they still don't. I guess I kind of wonder: is that because we're making it a big deal, and it's actually not that big of a deal for K-12 systems, schools, and educators? Or is it that the technology has changed so fast that the governments don't wanna put a stake in the ground yet, because it's just moving too quickly?

Yeah. I mean, what's your read? 

[00:31:22] Michael Palmer: I think that last point is pretty close to where my head is at on this, where I feel like our K-12 systems were already pretty stretched, and they're not really structured in a way to manage technology that effectively. And then you layer on top of that this AI revolution coming before they're ready.

And it's changing so fast. This is where I always argue for, like, even if it's a very simple, high-level document, just the practice of creating an artifact that says, this is our AI policy, this is our acceptable use of LLMs: start somewhere. And I think folks are hoping, perhaps, that it just blows over.

I feel like there's a lot of people who are kind of like Danny Glover in Lethal Weapon, you know; they're like, I'm only a few days from retirement. You know, like, I don't really wanna have to go through the effort of shifting paradigms. As a parent of a youngster, neuroplasticity is real, and, like, you have to force yourself to unlearn and relearn more as you get older.

But I think there is a sense of just exhaustion, and like AI fatigue, I think, in K-12 leadership. At the same time, the more interesting conversation that's happening there is, like, how do we reimagine the structure of K-12? How do we think about developing leaders who are able to respond to the challenge and, you know, take up the gauntlet, so to speak?

But I think a lot of folks are kind of beleaguered, just 'cause I like that word, but I think folks are just getting a little weathered by the amount of change they have to assimilate. So they just figure, I'll stay away, and if it's a big deal, someone's gonna come back to me in a couple years.

[00:33:13] Ben Kornell: Yeah, I think that's true for AI, and it's becoming more and more true for tech in general.

This has been a drumbeat basically every week: screen bans, phone bans, all of that kind of stuff. We've been following it here, but there was one in the Intelligencer that stood out to me, about how the phone ban saved high school. And there's some really great thinking about what teaching and learning and engagement and interactivity looks like.

And this one is, they basically banned the phone outright. So you guys know, like, with Yondr pouches, you put it there, but then in the passing period everyone just looks at their phone. Yeah. So what they did at this high school, it's in New York's Intelligencer, they banned it, and then they created poker tables for kids to play poker.

[00:34:04] Michael Palmer: Nice. 

[00:34:04] Ben Kornell: And you, you know, kids don't know what to do if you're so acclimated to just go to your phone. Right. What do you do instead? Mm-hmm. And I just thought it was like a joyful pivot, uh, in that article. And then there's a couple things that came up at bat, but out of Europe where basically there's an unapologetics, you know, phone ban.

Yeah. It feels like we've kind of crossed the Rubicon here. More schools are convinced that they have to do some form of phone ban than think, well, maybe we should just have balanced utilization. That's a sea change in, like, less than a year.

[00:34:44] Michael Palmer: Yeah. To me it speaks to the power of research and communication around work like Jonathan Haidt's The Anxious Generation. 'Cause, like, I was blown away to be reminded that book came out in 2024, 'cause it feels like canon. It feels like it's been around forever, you know? It's kind of breathtaking to look at the level of global influence that those types of ideas can generate.

And, you know, it speaks again to the value of, kinda, our research apparatus in higher ed in the US, where at its best it is still sort of leading the charge in terms of our best understanding around the future of education. But it does beg the question: the pendulum swings both ways, and we tend to overcorrect in some instances.

I think it's the coupling of the device with pretty much unregulated social media over the last 10 years that is causing this backlash. And then, interestingly, AI is kind of coming in as a new wave on top of that. But I think that crisis of trust is still pretty foundational around this adoption. So I think it's gonna need to be, like, stealth, you know, zero UX.

But, like, the idea of technology sort of integrated into your life in more seamless ways: the new AirPods are interesting, wearable technology that kind of seamlessly gives you access to new smarts. The Internet of Things as it intersects with AI is gonna be interesting to look at. I didn't catch much from the Consumer Electronics Show, but that's the other place where I look for signals that frequently aren't seen in education circles, you know, consumer technology and sort of breakthroughs there.

If you look back, you know, the iPhone was really a revelation, and we haven't really seen something like that in our physical environment since. This Jony Ive AI medallion is on the horizon, and, you know, folks are gonna try to win that new race, I guess. I think that's maybe gonna transform how we think about some of the, you know, put-your-phone-in-a-box kind of thinking, which long term feels like it might be reflective more of the current hardware, software, and app ecosystem that's out there.

[00:37:08] Ben Kornell: Yeah. So maybe one way this plays out is that you get to where the phone is no longer the interaction layer, and it's all audio and voice.

Therefore, like, maybe the distractibility of that is less. I think people are starting to realize that the cost-benefit of having a phone around them all the time is starting to tip, and they're basically systematically starting to either dumb their phones down or be strategic about no-phone time.

[00:37:47] Michael Palmer: Yeah. I used to always say, as a podcaster, you can't always offer up, like, the affordance of your eyes, but you can pretty much give over what you're listening to. And if you have a back feed that is giving you additional information, that's interesting. And then similarly, like, a layered interface, where you can sort of see something that gives you a little bit of extra information, whether it's through glass or through something else.

I'm curious about that, 'cause, like, it does feel like 2008 is a long time ago, and we haven't really seen a breakthrough technology like the iPhone. You know, we're probably due. And then I think that may actually help with reimagining the classroom. Voice might be it, you know, and then how that actually impacts that experience.

It'll be interesting to keep an eye on. 

[00:38:37] Ben Kornell: Yeah, I feel like we are not done with that storyline by any means whatsoever. 

[00:38:43] Michael Palmer: Yes. 

[00:38:44] Ben Kornell: To wrap up, let's go to higher ed, where, you know, there's some actually good news around enrollments. Enrollments seem to be rebounding; enrollments are back up. Whenever economic times are hard, people seem to still believe that a college education, or higher education in general, has value. You know, disaggregating some of the schools, there's winners and there's losers. It still seems like a lot of the large state schools that have figured out how to do online learning continue to succeed, and the smaller independent private schools struggle with their cost-to-value.

But one of the new elements that's getting a lot of focus, as there were concerns about enrollments, is: if you get new freshmen in, what about retention? And a new report, this is coming from The Dispatch, talks about how dropouts, or stop-outs, continue to be very high, and disproportionately male undergraduates are dropping out.

Men make up 42% of all undergraduates, yet half of all dropouts, and they're talking about different stress factors that force people to drop out. But one of the challenges is that funding that was providing financial aid and dropout support is winding down. So one in four students who enrolled (the state of Michigan is one of the examples) were getting retention assistance, and now that's going away.

There was financial aid support that went disproportionately to student parents, folks who have kids and are trying to get their degree, and that's no longer there. It is a really, really tough nut to crack. What are your thoughts on how higher ed should think about retention, and specifically retaining their male students?

[00:40:36] Michael Palmer: You're talking about a major cultural issue that we're facing with our boys and young men. You know, there's a broader issue just around Gen Z entering the workplace, but I think specifically, through a more gendered lens, boys are more at risk perhaps than they have been ever before. And the role that technology plays in that is really interesting as well.

If you look at, you know, games and porn and gambling and, you know, all the things that really kind of prey on the adolescent male psyche, it doesn't end necessarily, but at least, you know, the first time kids are faced with that, there aren't really good supports, or scaffolds, out there.

That's why I think the broader movement towards, you know, mental health and sort of wellness innovation, and how that intersects with edtech and technology innovation as it relates to K-12, I think is really important. And it also speaks to, since we're talking about higher ed here, the importance of mentorship.

And I'm still hopeful that AI will get better at creating more positive, net-beneficial connections among humans. Kinda like the Match.com equivalent for, like, mentorship and the right type of scaffolding and emotional support that people need. Um, but I think right now, you know, kids are hurting and they're kind of isolated by technology, and then they enter into higher ed.

I think we need to be conscious of the problem and perhaps, like, refocus our efforts on addressing this. There's a lot of Richard Reeves and, you know, Scott Galloway and even Jonathan Haidt on this; like, I think there is increased awareness around this stuff, but I haven't necessarily seen how it connects to, like, edtech investment.

I'm a fan of, like, Uwill and some of these other platforms that are out there that are trying to kind of provide mental health in your pocket, which I think will resonate with Gen Z. Yeah, it's tough. What are your thoughts?

[00:42:49] Ben Kornell: Yeah, I mean, both of us are parents of boys.

[00:42:54] Michael Palmer: Yeah. 

[00:42:54] Ben Kornell: And you know, I think the parental concern here extends beyond just college dropouts and so on.

My sense is that there is, though, an economic challenge here of connecting the learning to career outcomes in a more tangible way. And, you know, I think the challenge that we have is, in many cases, the kind of upfront cost required for college has such a delayed payback that it makes it really hard, especially for men who have some sort of family obligation; as this report shows, both mothers and fathers show disproportionate dropout rates.

[00:43:37] Michael Palmer: Yeah. 

[00:43:38] Ben Kornell: And I don't think that's a new concept, but the supports were much stronger in the past. Mm-hmm. So I think for families and, and you know, if we're talking about an AI era where everyone's gonna have to be upskilling all the time, we need those supports. As we think about solutions, there's probably a lot that needs to even start in middle and high school to really help prepare young boys to be successful as they become young men and grow into college.

Well, there's so much more that we could have covered. I mean, EdTech Insiders listeners, we have this prep doc, and we have well over 150 stories this week. It's amazing, amazing how much is happening in the space. The new year has kicked off with a bang, and this first month of January has been amazing. If there is stuff happening in edtech, you're gonna hear about it here on EdTech Insiders.

Thanks so much for joining us, Mike Palmer, and if people wanna hear more of your podcast, what's the best way for them to reach you? 

[00:44:38] Michael Palmer: Check me out on LinkedIn, and then the podcast at trendingined.com or trendingineducation.com; those will both find what I'm doing, available on YouTube and anywhere you get your podcasts.

[00:44:48] Ben Kornell: It's been a real pleasure. Thanks so much for joining us today. 

[00:44:50] Michael Palmer: Thank you. 

[00:44:51] Ben Kornell: Talk to you soon. 

[00:44:53] Alex Sarlin: For our interview today, we are here with Jeremy Smith. He is a multi-exit tech startup executive and the CEO and co-founder of pega6 (that's pega, the number six), a new kind of higher education built for the AI age.

Prior to pega6, Jeremy was the COO of Lodos Markets, the president and COO of RiskGenius, which sold to Bold Penguin, and chief strategy officer of SecondMarket, which sold to Nasdaq. Exciting stuff. Jeremy Smith, welcome to EdTech Insiders.

[00:45:25] Jeremy Smith: Thanks for having me, Alex. I am super excited. 

[00:45:28] Alex Sarlin: I am excited to have you here as well.

So you are very passionate about the future of higher education, about how it's working and not working. Tell us a little bit about how you're thinking about higher ed and how pega6 is really designed as a superior alternative to college. 

[00:45:44] Jeremy Smith: Yeah, absolutely. So as you said, I am very passionate about this.

This is something I've been thinking about for 20 years now, literally, and it all essentially started 20 years ago, when I could see that the university system was broken: four years, over a hundred thousand dollars on average, and students were coming out, including myself, completely unskilled and unready for the workforce, which to me, after spending so much time and so much money, was nonsensical.

[00:46:16] Alex Sarlin: Yeah, 

[00:46:17] Jeremy Smith: So that's when I came up with the idea for pega6, and the category we're creating of career accelerators more generally, as a superior alternative. And I'll touch on what that is exactly in a second. But essentially, around 15 years ago or so, when I came up with this idea, I talked to people in the market, and the market just wasn't ready.

Yeah. They thought college was too expensive back then, but no one was questioning the value that college provided. And so I said, you know what? If I have to convince the rest of the world of what I see, this isn't going to work. And so I put the idea on the shelf, until about five years ago or so, when there was just this explosion of deep dissatisfaction with universities

Yeah. From parents, students, and employers saying, this just doesn't work for us as a system anymore. And that drumbeat's only gotten louder over the last five years. And so that's when I said, okay, now is the time. Now the market is ready for pega6 and career accelerators. Essentially, what we're doing is saying to graduating high school seniors: hey,

don't go to college and waste your time and money. Instead, go to a pega6 career accelerator, where each accelerator is focused on a single career path. It is 100% experiential, and it's one year, $15K. And the student graduates as an AI-first, entry-plus employee who doesn't need any employer training, because they come armed with the technical skills and the soft skills of a second- or third-year employee from day one of their first job.

And so for kids, they get to start their career three years earlier, you know, $250,000 or more ahead of the average college graduate. And for employers, they get the first-ever AI-first, job-ready workforce.

[00:48:25] Alex Sarlin: Yeah, and I wanna focus on the length of the program. 'cause I think that's a very specific choice.

You know, you mentioned it's a one-year program, $15,000, so that's longer than a bootcamp. In the five to 15 years you've been watching this space evolve, we've seen the entire sort of bootcamp movement come and then sort of ebb. We've seen associate degrees come and change in community college. One year is a very specific choice.

Tell us about that choice to do one year, and how you think that plays into this superiority, why that is a better length for a program.

[00:48:57] Jeremy Smith: Yes, great question, 'cause that is super intentional and super critical. Because what we have found to be universally true is that it takes one year to gain mastery.

To go from zero to one takes one year, no more, no less. And where we found that is really in the place that was one of the inspirations and models for the pega6 model that we've built, which is the workplace. More specifically, all of our first jobs. All of us joined our first jobs essentially as zeroes, because colleges have failed us. And it's around the end of the first year, for everybody I know, myself, my co-founders, anyone I've ever talked to in my life.

It's around the 12-month mark of our first job where all of us go, okay, I think I got this now. You know, not from like a director or VP level, but from an entry level, it's like, okay, now I am adding meaningful value and know what I'm doing here and how to do it well. And you know, you never see somebody after two months on the job just totally have it and be like, ah, got it, got it all, know how to do it, right? And also, you don't see anyone after three and a half years say, oh, now I understand, now I know what I'm doing. Right? It's a year. It just is. Anything less, and you cannot achieve mastery; anything more, and you're wasting these kids' time and money. And so, one, that's why we chose a year: because, for whatever reasons of human nature, that is the inviolable timeframe to go from zero to one.

But also, it is one of the key factors in why the bootcamp space wasn't able to deliver on its promise of a job-ready workforce, which is they weren't long enough. Six weeks, three months, a year but part-time: that doesn't work. You cannot graduate with mastery with anything less than full time for one year.

Now, in fairness to bootcamps, they had no choice. Their student body was career switchers. These are people who have lives and bills to pay, and so they can't not earn income for an extended period of time. And so bootcamps, given that they address career switchers, had no choice but to make these part-time or very short programs; otherwise they wouldn't have been able to get any students.

But for us, our audience, our student body, is graduating high school seniors. They were expecting to take four years off full-time to become skill-less. And we're saying: great news, you now only need one year, and you'll be able to just be crushing it at your job.

[00:51:51] Alex Sarlin: Yeah. You're mentioning the skills as really a core aspect of this model.

You say AI-first, and it's really skills building. That has also been a movement that has grown and accelerated a lot in parallel to the dissatisfaction with higher ed over the last few years: this idea that you need to graduate with skills, that you need skills to actually join the market. Tell us a little bit about how you think about skills within pega6, and what the experience does to make sure that students are building those skills towards mastery, so that when they come out, they are truly skilled and ready to get the jobs that they're hoping to get.

[00:52:26] Jeremy Smith: Yes. On that, I use an analogy that I think is powerful because it's very instructive, which is: for a long time, people have talked about education as a pipeline.

[00:52:38] Alex Sarlin: Mm-hmm. 

[00:52:38] Jeremy Smith: And sometimes people overemphasize the importance of an analogy or metaphor or what have you. But in this case, I think it is very important, which is when you have a pipeline, you're putting something in one end, and then it just comes out the other.

That's what a pipeline is, and that is not representative of what is going on, or what needs to go on, with the education system. The way I view it is more as a supply chain, where you have multiple steps in getting the product to the end customer, and at each step there's value that is added to make that product into what it is that the customer wants.

And one of the places, one of the many places, that universities go wrong is they view the student as the customer. And I don't believe that is a correct way to view it, and it leads, I think, to a lot of the brokenness (definitely not all of it, but a lot of the brokenness) of the university model. Because in the end, it really is the employer who is the customer, and the student is the product.

And the mindset there is incredibly informative, because in the end, parents will make their kids go to where employers hire from first, with the best, highest-paying jobs. They will. And that's how you know the employer is the customer: because it drives all the other decisions, particularly of the parent and the student.

And so with that, just like any supply chain, you need to start at the end and say, okay, what is it the consumer wants? What does the customer want? And then you work backwards through the supply chain to make sure that they get it. Well, as you work backwards, the last link in that chain before you get to the customer is higher ed.

And so higher ed needs to deliver what that customer, the employer, wants, which is a highly skilled employee. What they want is someone who provides the most lift the fastest, with the fewest amount of resources required by the employer to get them there.

[00:54:41] Alex Sarlin: Hmm.

[00:54:41] Jeremy Smith: And so that's what higher ed needs to deliver.

And that's why we are so skills-focused: because that is what employers want. And therefore, by delivering that to employers, we are benefiting the student, because we are making them ultra-employable by preparing them for the workforce. And the way that we do that is with our curriculum, which is entirely experiential.

There are no lectures, no textbooks, no tests. They learn to do the job that that accelerator is focused on by spending 50 hours a week, every week, doing that job. And so for the first accelerator that we're launching, which is for this new emerging role in the AI age called a product builder (essentially a melding of a software engineer, a product manager, and a product designer), for the curriculum that is part of that accelerator,

what that means is the students will spend 50 hours a week, every week, for a year, learning to build software by building commercial-grade software in cross-functional teams, because that is what is going to imbue them with the technical skills and the soft skills to be more than job-ready. Again, the equivalent of a second- or third-year employee from the moment they walk through the employer's doors.

And honestly, that's all that employers want.

[00:56:13] Alex Sarlin: I think this shifting of the framing is a really interesting positioning. You're saying that rather than a pipeline, it's a supply chain; the end customer of the education system is the employer, not the student. It's definitely a big mental shift. It's a really interesting way to think about it.

You've been very critical of the higher education system as it stands. They've had a lot of trouble over the last number of years sort of thinking about employment as a core goal of education. It depends on the type of school, of course, but that has been a long, hard shift for the higher education system to sort of get its head around.

I think some schools have been a lot faster than others. Tell us a little bit about why you think the traditional education system has really not wanted, or been able, to make that shift to thinking about career readiness as a sort of core goal of higher education, given especially that many parents and students have voted with their feet and in surveys.

I mean, they've said that career outcomes are the number one thing they're going to school for, for years, but higher education has had a lot of trouble with that. I'm curious how you reconcile those two issues and why you think it's happened.

[00:57:16] Jeremy Smith: Right. And the reason universities cannot adapt, and I don't mean right now, I mean ever, is because they both don't want to and can't adapt.

And what I mean by that is if you think about what it takes in order to create an institution that is focused on optimizing these kids for the workforce, for imbuing them with these job skills, universities are teaching the wrong things in the wrong way for the wrong amount of time at the wrong price.

So they have to change all of that, because let's say you want to imbue them with skills, job skills. You have to fire almost all of your professors, 'cause this is not an academic endeavor that we're talking about here. This is building the ideal entry-plus employee, and that takes someone who's in industry.

And so, for example, our teachers, who aren't really teachers, they're more like the proto-bosses of the students. They come from industry. First my two co-founders, and then more as we bring people on; these will be directors of software development and product management and design who are acting as the bosses of these students as they build software and deploy it on deadlines and so on.

Professors just, they either have never been in industry, or they have but it was a long time ago, or maybe they are now, but they're only a guest speaker for one course or part of a course. If you are trying to skill someone for the workforce, you need someone who has recently been in the workforce building teams, building people, coaching and mentoring.

They need to be the ones who take ownership of these students as they go through higher ed. Higher ed will not fire all of its professors. Culturally, that would be anathema, but there are also institutional structures that prevent that from actually happening. So one, they don't have the right staff to do workforce skilling.

They also don't have the right pedagogy. You cannot master any skill. It doesn't matter whether it's riding a bike, learning a foreign language, building software. You cannot master any skill through lectures and textbooks and tests. You can only master it through doing and just getting your reps in. And that's it.

Colleges are not structured that way. They are structured around lectures, textbooks, team projects, case studies. None of those are learning by doing the job, and they would never change their pedagogy in a meaningful way where they would shift all of it. But then lastly, they not only wouldn't shift, they can't shift.

And for anyone who's read the book The Innovator's Dilemma, the universities are right in the crosshairs of the innovator's dilemma, where they have this overbuilt product of which the market doesn't want most of the features. And through their continual seeking of higher and higher profitability, they have become so bloated now that they can't react to a leaner, meaner, better model.

So let's say our one-year, $15K model, let's say that's the right model. I could walk into every single university president's office, lay out our entire business plan right in front of them, and there is nothing that they could do, because if they tried to switch to a one-year, $15,000 program as the college program that you are going to go through, they would collapse under their own weight, because they've become so bloated.

And that's why you see any of the colleges who try to do something here, they just add it on. Now it's a five-year program, now it's a six-year program, and now it's a four-and-a-half-year program. But that's because they cannot afford to build and evolve into the right model.

[01:01:35] Alex Sarlin: That's really interesting.

Let's talk about the AI aspect of this. I think it's really interesting that you're doing single job functions. The career accelerators are designed around a single job function, but this first job function that you're really focusing on is one that's basically, you know, a function of the future.

It's where things are moving. This idea of combining software engineering, design, and product management into this concept of an AI builder, that's a really forward-thinking way to look at it. That's not only not a university major, it's not even a job description yet. It's getting there, but it's not there.

Tell us about that decision and how you think that the workforce is going to evolve to meet that vision. 

[01:02:14] Jeremy Smith: Yeah, so it's interesting, because initially, when we started to put the pieces together for pega6 and just become more public about it, we were going to be launching a software developer accelerator and a product manager accelerator.

And then we went out and we were talking to employers about employing the graduates of our first cohort, which is launching this year. And what we found from, I wouldn't say all of the employers, but many, if not most of the employers, is they were asking for this AI engineer, product builder. There's a number of names for it, because, as you were saying, it's this newly emerging role, and so not even the name has really settled for it.

[01:03:00] Alex Sarlin: Totally.

[01:03:00] Jeremy Smith: But as these employers were describing what it is they wanted, this ultra-capable, versatile, almost utility player, right, who was two parts engineer, two parts product manager, one part product designer, with a little bit of business operations maybe sprinkled in. We just kept hearing that again and again and again.

And so we call ourselves higher ed for the AI age, and one of the core features of higher ed for the AI age needs to be agility, because AI is going to make certain tools, certain skill sets, entire career paths disappear or emerge overnight. Overnight. And as an institution of higher education, it cannot take you decades or centuries to change.

You literally have to be able to do it in weeks or even days. And so as we started to hear from the employers we were talking to, and just sort of see out there in the ether the emergence of this role, we could see not only the emerging demand for it, but the logic for it. We said, okay, we need to pivot, not the business model, but the career path that we are going to be educating these students for and preparing them for.

And literally within a week, we made the pivot and said, okay, it's the product builder now, or AI builder, AI engineer, again, whatever name the industry ends up using. And we did not expect it, obviously.

[01:04:39] Alex Sarlin: Yeah.

[01:04:40] Jeremy Smith: But that's the whole thing: if you build an agile model, you don't have to be able to predict the future.

Because as you start to see the demand, and the demand starts to validate itself a little bit, boom, you can make that change again, whether it's a tool that they're learning or a skill set or an entire career path. And that's what we did with the product builder. And the truth is, it's also the same with the fact that these are going to be AI-first employees that we're creating, you know, AI natives, because that was not the intention, obviously, when I initially had this idea, you know, 15 years ago, but even a couple of years ago, as I was starting to say, okay, it's time to do this.

AI-first employee was not in my mind, but then it became obvious that is what this role needs to be, whatever role it is, whether it's product builder or, as we expand to other career paths, both in tech and then eventually outside of tech. All of these roles need to be AI-first.

So whether it's an AI-first product builder, or an AI-first cybersecurity professional, or an AI-first junior account executive in advertising, it needs to be AI-first, because that is how you are going to become 10x, a hundred x more efficient and effective. But you still need to understand the fundamentals. You still have to understand being a junior account executive or an engineer or a product manager, because the way you're able to 10x or a hundred x the effectiveness of AI is if you also understand the technical components of that job, whatever that job may be. And so we ensure, even though they're AI-first, the students are also incredibly technically proficient.

[01:06:31] Alex Sarlin: Yeah. Speaking of the technical proficiency and the skills base, I have one final question for you, and I think it's the billion-dollar question in this.

I know you've thought a lot about it, which is the signaling aspect of this, right? One of the things that has been true for many years about traditional higher education is that the college degree has been a sort of entry-level requirement for many different fields, many different types of jobs. That has shifted in various ways over the last few years.

Tech companies and the government have taken it out, but as you're working with high school graduates and saying, in one year you're gonna be at an entry-plus level, you're gonna be ready for the workforce, you're gonna have all of these skills. How can they signal, and how are you making sure that they can signal to the workforce, that they are ready to go if they don't have the college degree?

[01:07:14] Jeremy Smith: Right. One of the most important things for that was something that we weren't responsible for, that we didn't have any control over. And that is a shift in the mindset of employers.

[01:07:28] Alex Sarlin: Hmm. 

[01:07:29] Jeremy Smith: That degrees were either meaningless or even counterproductive. And that's why we're launching pega6 now and not 15 years ago, or even seven years ago, because that shift in employers' mindset had not happened.

And so no matter what we would say, it would not be an adequate signal for them, because the only signal that meant anything prior to five years ago was the stamp that universities put on students. Not skills, but still a stamp that said, hey, they got into and out of this institution, so you can consider interviewing them for a job.

But the reason we're launching it now is that that mindset shift has happened with employers. So the biggest signaling issue wasn't even the signal itself; it was receptivity to any signal other than a university degree. So once that happened, fantastic, now we're in business and ready to go. So then the specific signal that we can send, or enable those students to send, is that pega6 creates the best entry-level employees that have ever existed.

So much so that they're entry level in name only, and really, again, they're more like an experienced second- or third-year employee when they get there. But what that takes is track record. And so the way that we are going to market is we are starting out very small. Our first cohort's going to be half a dozen students, and what we've done is we're working with employers who are guaranteeing jobs for just one graduate.

So half a dozen employers who are guaranteeing jobs for just one graduate from the first cohort, because then we can take those guarantees and easily recruit the first cohort in a world where 50% of parents don't wanna send their kids to college. We're coming not only with a better model, but also with a guaranteed job.

It's a no-brainer. And then that first class graduates, and they are going to be the best entry-level employees that those employers have ever seen. And they'll say, you know what? We love that. We'll guarantee three for the next cohort. They graduate. Same amazement with the quality of the graduates. That continues for another cohort or two as we just build up the size slowly and incrementally for the first handful of cohorts.

And then we'll have the track record, too. We won't need employment guarantees after the third or fourth cohort, 'cause we'll have the track record, where those employers who have hired from us will want as many graduates as we can give them. And that word, through our own efforts and just natural virality, spreads to other employers who have seen the track record that our early employers have experienced.

They flock to our doors, and then more students come for those employment opportunities. And it's a snowball rolling downhill at that point. And so essentially the signal is slowly and incrementally building that track record over the first two or three years and then using those results to scale it massively at that point, because employers will see the results, and results will be their signal.

Not a stamp.

[01:10:55] Alex Sarlin: Fascinating model. I think, you know, authenticity, just cutting through all the proxies, the degrees or the tests or the group projects that have been the sort of staples of traditional education. You're sort of carving right through them and saying, we're gonna go right to employment all the way through.

It's really interesting. Jeremy Smith is the CEO and co-founder of pega6, a new kind of higher education built for the age of AI: career accelerators. You heard it here first. Thank you so much, Jeremy Smith, for being here with us on EdTech Insiders.

[01:11:26] Jeremy Smith: Thank you, Alex. It was truly my pleasure. 

[01:11:29] Ben Kornell: Hello, EdTech Insider listeners.

Today we have Stewart Brown, a K-12 computer science and AI literacy leader focused on fixing the foundational gap in elementary and middle school education. He works with districts nationwide to treat CS as a core literacy, helping students understand how the technology shaping their lives actually works.

Stewart works at Code4Kids, and we're excited to have you here on EdTech Insiders today. Welcome, Stewart.

[01:11:57] Stewart Brown: Thank you, Ben. Great to be here.

[01:11:59] Ben Kornell: So let's start with just a little bit of your background. How did you get into education, and then what was the genesis of Code4Kids?

[01:12:07] Stewart Brown: Yeah, yeah. So I come from a family of educators.

My father just retired after being in education for 41 years as a teacher of so many different subjects, and ultimately an elementary school principal. And my sister's a teacher; she's the head of art at her school. And I've kind of gone into education in a slightly different way. Ultimately, I was first in higher education, big into intercultural learning and competency and the career development that provides.

And slowly, as I now have young kids, I started to get very passionate and interested in how technology is shaping kids and childhood. And that led me to Code4Kids and really wanting to make digital literacy, and now AI literacy, a core foundation of what kids are actually taught in school.

[01:12:55] Ben Kornell: So one of the profound messages that you had that resonated with me and Alex is that computer science should be treated like reading and math starting in elementary school.

What foundational gaps do you see today when CS isn't introduced early, and how do those gaps show up later for students? 

[01:13:12] Stewart Brown: Great question. So to frame it firstly: unfortunately, computer science is not taught in most elementary schools. The average elementary school is not gonna be teaching much about technology at all.

And so the biggest gap I see is that kids grow up using technology. We're looking at kids as young as first grade, second grade being given a one-to-one device in the classroom, but they're not taught how to think about it or actually understand how that technology works. And what this does is, from a young age, students are interacting with algorithms, whether it's smartphones, whether it's on social media, eventually AI, and so many different apps that they're on.

They're seeing recommendation feeds, search results, automated feedback, and they don't understand how that's coming to them. And when that's not explained to them, technology feels powerful, but also invisible. And so by the time they're older, they become ultimately fluent users of technology, but they don't have language or understanding of what's actually happening under the surface.

And when it's invisible and mysterious like that, people tend to trust something or accept a system rather than question it. And ultimately, especially in the age of AI, that gap is widening, and I find it really important to be instilling human agency and critical thinking in this day and age.

[01:14:34] Ben Kornell: So in some ways it's really about competencies of critical thinking and evaluation, not necessarily the technical components of build this, don't build that, learn this coding language, et cetera. What resonated with me is it is a great counterargument to this idea that kids don't need to learn to code anymore and that the AI is gonna do all the coding.

And it's this idea that underneath the code there is a lot going on that kids deserve to understand, not only to understand how code is built, but as consumers of products. How might this have been built in ways that affect me explicitly or implicitly? 

[01:15:15] Stewart Brown: Absolutely. And I'll say the educational model, for the most part, has really been about content consumption and trying to memorize as much as we can and spit that back out. And AI has really exposed how important critical thinking is, how important it is for kids to look at content, actually digest it, interpret it, understand it, and define what they're seeing, in a way that's really important in this day and age.

So I think computer science is a core literacy. It's not about making every student a programmer, for example. It's ultimately about giving them that computational thinking that's important to give them agency in a tech-powered world, if that makes sense.

[01:15:59] Ben Kornell: Hmm. So trying to take computer science to elementary might be a daunting task on the student side, but on the educator side, it is even more daunting.

Yes. Most elementary teachers have zero experience with computer science, those concepts or constructs, and we're in this dawn of an AI era where all of us are kind of learning new things. When you work with districts, what does it actually look like in practice to treat computer science as a core literacy rather than a standalone STEM elective or something you do on, like, a Friday afternoon in robotics club?

[01:16:35] Stewart Brown: Well, firstly, let me define what it's not about. It's not about making every teacher a tech pro or a computer science expert. We have designed our curriculum at Code4Kids specifically for teachers who have never taught computer science before. And so the goal is not to make every teacher the expert in the room.

We want teachers learning alongside students, 'cause ultimately, in the age of AI, that's what we're all doing. Then I think the big thing is how it fits into the instructional time. So if computer science is an elective, sometimes it's taught as a standalone technology class or within STEM, and that's okay, but ultimately, computer science shouldn't be taught for the sake of just teaching computer science. It's about amplifying the core curriculum and showing how the dots are connected.

So in practice, the way in which we do it is about embedding it into the core curriculum. It's about making sure that computer science isn't seen as something that's a nice-to-have or an opportunity for enrichment for those kids that want it. It's something for every kid, because every kid is being influenced by and is consuming technology. So let's give them agency in that world.

And the second is who teaches it. It's about every teacher having the opportunity to teach this, and to teach it with cross-curricular integration. So for example, if a teacher is teaching about the solar system, we're gonna say, swap out some of those lessons from the textbook.

And why don't you teach this one computer science lesson about building an app about the solar system? We'll tell the social studies teacher, instead of teaching them about the geography of their hometown, let's get them to learn text-based coding for the first time and build a website about their hometown.

The same can be true of so many examples within our curriculum of how we bring this into the core curriculum in a way that doesn't compete with but actually amplifies the core curriculum, and can be done in a way that is not just one extra thing for the teachers to do.

[01:18:38] Ben Kornell: Yeah, it's really fascinating. I feel like this idea is almost like project-based learning, where it's an applied knowledge component.

We're also in this moment where AI tools are rushing into classrooms. Do you believe that true AI literacy needs to come first before AI usage? And what risks do schools face if they get that order wrong? 

[01:19:01] Stewart Brown: Yeah, I do. And I think we've seen this pattern before with technology coming into classrooms. We saw it with devices with no digital literacy, and with platforms and apps being adopted very quickly in classrooms.

And that happened with really good intentions. And I think we are adopting AI in classrooms with very good intentions too. But a lot of the time, ultimately, if we're not teaching kids what's behind the screen or under the surface of the technology tools or AI that's coming into their classrooms, we are just widening the gap, making them passive consumers of technology rather than giving them agency and actually helping them understand how that technology works and giving them the opportunity to ultimately be in the driver's seat of technology.

[01:19:46] Ben Kornell: So what's at stake if we get that wrong, then? Like what's the downside? 

[01:19:51] Stewart Brown: I think the downside is this: AI is giving us access to information at an incredibly rapid speed, and it's evolving very quickly. And if we are not teaching kids how AI works, how it's designed, how it's shaped, they are ultimately not gonna have much agency in this world.

They're going to blindly accept everything a system spits out at them. And we want to be teaching them that if you're seeing information, you need to learn how that information comes to you, and have the opportunity to understand how to interpret it, how to know when biases come in, how to know what data the system is trained on.

And I think when we don't teach kids that, they are just gonna assume everything AI spits out at them is the truth. And I see that as a huge risk in this day and age.

[01:20:43] Ben Kornell: So, you know, as you think about elementary and middle school students, a lot of people are putting computer science in the bucket of career preparation or career and technical education, CTE, but you see it as more about student agency.

How does computer science education help students build that agency?

[01:21:05] Stewart Brown: Yeah, so what computer science does, ultimately, is instill critical thinking. It instills problem solving. It instills computational thinking. It allows kids to grow their confidence. It allows them to develop a growth mindset, and I think it does that in a way that is unlike any other subject.

I think the way in which technology is influencing their lives, both in and outside of the classroom, is rapidly expanding, and in this day and age, human agency is probably one of the most critical skills that we can actually give our kids when AI is, you know, really flooding our daily lives. And so computer science is helping them understand the systems thinking behind how AI works. But not only AI; in the world, there's a system underlying almost everything that we see.

And computer science helps students learn how to unpack systems, how to question systems, and that's how I see it really giving them agency, because they're able to not just unpack the AI system that they might see today, but how it's gonna evolve in the future. And also not only AI, but, you know, all systems out there in the world.

[01:22:19] Ben Kornell: From the work that you've been doing, what's giving you the most hope and what's giving you the most concern? 

[01:22:27] Stewart Brown: Very good question. The hope is coming from many of the forward-thinking teachers and educators I speak to every day, the thought leaders in the space, the ones that know what real AI literacy is.

You know, I think a lot of the time there's a bit of a distinction in how AI literacy is defined. A lot of the time it's a buzzword; a lot of the time it's, you know, paying lip service to the AI tools that are flooding the market. And, you know, what gives me hope is that there are so many within the school system that actually know that it's about the critical thinking of teaching kids how to understand how these systems work.

And it's about the educators I speak to every day. They understand that kids are being so absorbed and influenced by technology by the time they're in high school, and we are not making this a priority in school. And so they are the ones really trying to change that, and that gives me hope.

Where I get a bit of a fear, and what kind of gets me down, if you will, is when there's two sides of the coin that I don't agree with. One side of it is the argument that technology is bad, AI is bad, and we should ban it and we should restrict it. And that is all with good intentions, but a lot of the time, you know, we are sometimes turning AI into a forbidden fruit of sorts, which is never a long-term sustainable solution.

And on the other side of the coin are those that are saying, you know, we need to be innovative, we need to adopt it. But they're adopting it without really critically thinking about how they're adopting it. And so they might be adopting technology and AI in classrooms in a way that completely overlooks the need for any form of digital literacy or AI literacy.

And so it's ultimately widening that gap between creating, you know, consumers of technology and those that actually know how it works and how to question those systems. So, you know, that gives me a bit of fear. It gets me down a little bit, but I think at the same time, there's more and more people waking up to the fact of what is actually needed and how to go about changing it.

And there's a lot of people in this space that are fighting the good fight and trying to bring about the much-needed change that our kids already need in this world.

[01:24:57] Ben Kornell: Yeah, I mean, that's almost like the driving why: that so much is at stake here. I do think that it's really interesting to focus on elementary and middle.

When you look at computer science standards, they're state by state, they're different, but they often frame computer science as career and technical education, as I said before, for high schoolers only. But what we're now realizing is that kids are interacting with AI from, like, age two or age three. And, you know, depending on what your specific definition is of what is AI and what is not, algorithmic technology is clearly endemic in basically every technology that kids touch.

You know, for the folks that say they want to ban these things, a lot of times the rationale is not banned forever, but is around what's developmentally appropriate, is kind of what they talk about. What do you say to that pushback or criticism, that maybe learning about AI, or even using these technologies, is not developmentally appropriate, and that it should wait until, you know, kids are more sophisticated in their intellectual understanding to introduce these new tools to them?

[01:26:13] Stewart Brown: You know, we're seeing a lot of AI tutors come into classrooms, and they look really great on the surface. You know, they're said to provide personalized instruction that the teachers just don't have the bandwidth to provide at such a scale to every student in the class. And it's all fantastic and well, but a lot of the time, if the teachers aren't onboarded and have no facilitating role in that, which a lot of the time they don't when the AI tutor comes into the classroom, it kind of takes away their role in the classroom of helping those kids grow.

And what I see so often is that the personalized instruction that AI can provide is fantastic, right? But it is a technology; ultimately, it's a data center giving information to a kid. It's not a human being. And so it doesn't take into consideration the context of what's actually going on in the classroom.

You know, is the kid upset today? Is their neighbor potentially bullying them? All of those in-school contexts that are so important, and that a lot of the time only a human can really pick up on. They don't take that into account unless you're feeding that information into the AI system.

And so I think a lot of the time that element is often overlooked. And what's unfortunate is that the school leaders sometimes, and the district leadership, we are all learning about AI and trying to keep up with it. But a lot of the time, those that are making the decision to implement it don't really understand what's at stake, and they're being sold something that they don't truly understand the repercussions of. And they might be bringing some AI literacy into it, but a lot of it's, you know, not necessarily AI literacy and more so just tool fluency, getting the kids to learn how to use AI, not necessarily how to think about it and understand it, if that makes sense.

[01:28:08] Ben Kornell: Yeah, totally. I feel like this is also one where adults underestimate, one, how much kids are ready for; two, how much, practically speaking, kids are already aware of; and three, what they're exposed to. Even if they're not ready or aware of it, they're exposed to it. And then also, on top of that, there's, like, what's going on at home versus what's going on at school.

You know, I think this approach of, like, building the foundational competencies to live in a dynamic world allows you to move away from focusing on the tooling and the career, which realistically, given how fast the space is changing, I'm not sure that a freshman computer science class is going to prepare that person for when they enter the workforce in four to eight years.

Whereas, like, the kind of core capabilities, the critical thinking that you mentioned, both as consumer and builder, could be really enduring. So, you know, really interesting topics that you've brought up today, and I'm sure very fascinating conversations you're having with schools and school communities, parents and students. If people wanna find out more about Code4Kids, what's the best way for them to learn more?

[01:29:18] Stewart Brown: Best way to learn more is to go to our website, c4k.io. That's the letter C, the number four, and the letter K, dot io. Also find me on LinkedIn, Stewart Brown, and, you know, I'm posting all the time about these sorts of things, so if anyone wants to follow along, they're welcome to do so.

And you know, would very much welcome anyone reaching out and would be very happy to expand a little bit more on all of these topics. 

[01:29:45] Ben Kornell: Fabulous. It's so great to have you on the podcast, Stewart Brown of Code4Kids. Thanks so much for joining EdTech Insiders.

[01:29:52] Stewart Brown: Thank you, Ben. 

[01:29:53] Alex Sarlin: Thanks for listening to this episode of EdTech Insiders.

If you like the podcast, remember to rate it and share it with others in the EdTech community. For those who want even more EdTech Insiders, subscribe to the free EdTech Insiders Newsletter on Substack.