Edtech Insiders

Josh Jones of QuantHub on Teaching AI Skills That Outlast the Tools

Ben Kornell


Josh Jones is the CEO of QuantHub, an edtech and workforce company advancing data and AI literacy across K–12, higher education, government, and enterprise. A serial entrepreneur and former data science consultant, Josh has spent decades helping organizations adapt to rapid technological change.

💡 5 Things You’ll Learn in This Episode:

  1. Why AI education should focus on durable skills, not specific tools
  2. How schools can adapt curriculum faster without sacrificing rigor
  3. The difference between AI literacy and AI fluency
  4. Practical guidance on student AI use, ethics, and assessment
  5. What institutions must change to stay resilient in an exponential tech era

Episode Highlights:
[00:02:00] Why QuantHub was built to keep pace with accelerating AI change
[00:04:30] The mismatch between slow curriculum cycles and fast-moving AI
[00:06:40] Teaching meta-skills like critical thinking, ethics, and change management
[00:09:00] Mapping AI and data skills into existing K–12 and higher-ed standards
[00:11:45] Why banning ChatGPT creates more problems than it solves
[00:13:40] Institutions leading the way on clear, fair AI policies
[00:15:40] What schools must rethink to stay relevant over the next decade

😎 Stay updated with Edtech Insiders! 

Follow us on our podcast, newsletter & LinkedIn here.

🎉 Presenting Sponsor/s:

Tuck Advisors was founded by entrepreneurs who built and sold their own companies. Frustrated by other M&A firms, they created the one they wished they could have hired but couldn't find: one that understands what matters to founders and whose North Star KPI is the percentage of deals closed. If you're thinking of selling your edtech company, or buying one, contact Tuck Advisors now.

Innovation in preK to gray learning is powered by exceptional people. For over 15 years, EdTech companies of all sizes and stages have trusted HireEducation to find the talent that drives impact. When specific skills and experiences are mission-critical, HireEducation is a partner that delivers. Offering permanent, fractional, and executive recruitment, HireEducation knows the go-to-market talent you need. Learn more at HireEdu.com.

[00:00:00] Josh Jones: On the other hand, if you think about concepts like AI ethics or tool selection or prompt engineering, if we back out a little bit and think about the approach we're taking to those concepts, critical thinking, leadership, change management, those are skills that you actually need in the AI usage process.

They're not really changing at that same pace. So what we try not to do is overemphasize a particular tool, its capabilities or lack thereof, but move in that direction while also knowing that whatever educational curriculum we produce, the days of it lasting for five years are over. We're producing content that we realize may have a year's shelf life at best.

[00:00:42] Alex Sarlin: Welcome to EdTech Insiders, the top podcast covering the education technology industry, from funding rounds to impact to AI developments, across early childhood, K-12, higher ed, and work. You'll find it all here at EdTech Insiders. 

[00:00:58] Ben Kornell: Remember to subscribe to the pod. Check out our newsletter and also our event calendar.

And to go deeper, check out EdTech Insiders Plus where you can get premium content access to our WhatsApp channel, early access to events and back channel insights from Alex and Ben. Hope you enjoy today's pod.

Hello, EdTech Insiders listeners. Today we have Josh Jones joining us. He's the CEO of QuantHub, an edtech and workforce company advancing data and AI literacy. A serial entrepreneur, former data science consultant, and TEDx speaker, he has led global ventures helping states, schools, and employers build future-ready talent for a data-driven economy.

Welcome, Josh Jones to EdTech Insiders. 

[00:01:47] Josh Jones: Hey Ben. It's great to be with you. 

[00:01:49] Ben Kornell: So as we talk a little bit about QuantHub, maybe you could just give us a little bit of the background, the foundation story. Like, why'd you start QuantHub? 

[00:01:59] Josh Jones: Absolutely. So Ben, I've been consulting for a couple of decades now in the space of data science and artificial intelligence, and a common trend we saw across the corporate and government clients we worked with is that the pace of change of technology is increasing at a faster and faster rate.

We realized that we wanted to build the educational tools needed to help people keep up with this fast-changing environment. So we focused on everything from data science and data analytics to AI, and of course right now, generative AI and all of the associated trends and concepts have really been a primary focus point for us.

[00:02:38] Ben Kornell: So we're kind of getting to this inflection point with technology, what Ray Kurzweil calls an autoregressive era, where tools can create new tools. So not only do you have these amazing tools, they're creating derivative products. Content that is source content is now creating derivative content, which feeds back on itself, and that's creating incredible acceleration.

What does this level of acceleration mean for educators, for students, for workforce, in your context? 

[00:03:11] Josh Jones: Absolutely. So we've been doing this, QuantHub as a company has been around for about eight years, but about three years ago we were able to join OpenAI's beta program, and some of those first models, GPT-3 and obviously ChatGPT, were big for us.

And we really saw that, you know what, there's a tremendous amount of opportunity here. In the early stages we also realized, hey, there's a lot to be perfected. So we wanted to use AI to move faster and leverage the speed benefits, while also thinking about how we actually qualify the content, validate the content, and deliver something that can be used not just in a chat setting, but in a curriculum setting where you have to map to state standards, or in higher education, thinking about accredited institutions, or working with government or regulated agencies.

So we've really been sitting at the crossroads of these two concepts: using AI to be faster, but also how do you validate that? And so while the technology and the skills that we need to be successful are changing fast, we've been seeing how we can actually use those tools to speed up the education delivery process as well.

[00:04:16] Ben Kornell: Yeah. It's almost like a physics problem, where the AI is moving so fast and education systems move so slowly. What mindset or structural shifts do schools and institutions need to make to keep up with this rate of change? 

[00:04:32] Josh Jones: So historically we've all had this five-to-ten-year cycle where we're okay making slower refreshes: I'm gonna build committees to think through curriculum and go at what I would call a deliberate or methodical pace. And I think it was needed and well intended, but we're in an environment now where we just can't move at that speed, because oftentimes, by the time you get through the process itself to approve certain curriculum or standards, the standards have already changed.

So we're really in a situation where we need to think more nimbly and think about it from a meta level. How can we actually continue to provide rigor while also being able to adapt in months as opposed to years? 

[00:05:09] Ben Kornell: Yeah. There is this irony, though. If I adopt today's technology and I go through all the pain of rolling it out, implementing it, training my team, and six months later it's a step change better, am I actually not better off waiting the six months to implement and to act?

It's kind of like the launch conundrum from that thought exercise about trying to get to Mars: are you better off launching today, or waiting 20 years for fusion energy? There are actually leaders who are thinking things through rationally and saying, actually, we're better off waiting. Now, as a parent, I hate hearing that, 'cause my child's only gonna be a fourth grader for a period of time, or my middle schooler's only gonna be a middle schooler for that period of time.

But from a rational standpoint, aren't there some rational actors that are saying, Hey, we actually need to slow down rather than speed up? 

[00:06:01] Josh Jones: No, that's a great question, and it's really at the heart of this whole matter. So what we do is we actually abstract a layer. We don't really focus on teaching, say, ChatGPT 5.2, because by the time you air this, maybe there's a new model, and now this sounds like it's already outta date, right? On the other hand, if you think about concepts like AI ethics or tool selection or prompt engineering, if we back out a little bit and think about the approach we're taking to those concepts, critical thinking, leadership, change management, those are skills that you actually need in the AI usage process.

They're not really changing at that same pace. So what we try not to do is overemphasize a particular tool, its capabilities or lack thereof, but move in that direction while also knowing that whatever educational curriculum we produce, the days of it lasting for five years are over. We're producing content that we realize may have a year's shelf life at best.

[00:06:53] Ben Kornell: Yeah. So maybe there are elements where what used to be high-stakes decisions are now much lower stakes, because the cost of rip-and-replace is so low that it's almost this constant refreshing. You know, you mentioned some skills. Do you have a framework, or have you identified which data and AI literacies matter most for students to be successful and thrive in the future?

And how should schools teach those with adaptability at the core? 

[00:07:23] Josh Jones: Yeah, so we actually created AI literacy and AI fluency curriculum before there were classes to put it in, and so we had a little bit of a conundrum there: okay, we've got the curriculum, how do we deliver it? It looks a little bit different in K-12 than it does in higher ed and corporate settings.

In K-12 settings, what we found is each state generally has its own state standards when you think about your STEM courses and so forth. But if you break down those fundamental data and AI skills and you think about things like building a hypothesis, data exploration, data analysis, visual storytelling, and so forth, those are skills that are on the ACT.

Those are being taught in current STEM classes. And so what QuantHub has done is built modules that teachers can map to their state standards, so that while they're teaching those foundational data and AI skills, they're actually teaching to their core courses as well. Students that are interested can then go further and explore careers in data science and AI.

It's a similar approach in higher education. So you think about, say, in the business school, what we're looking at is: AI literacy is our foundational certificate, covering those five to eight skills, again, AI ethics, prompt engineering, and so forth. With AI fluency, we start getting more into systems thinking, agentic flows, and AI in specific domains.

What does AI look like in the field of marketing? What does it look like in the field of management or supply chain? And so that's where we get very granular, and we try and keep all of our educational delivery, as opposed to textbooks or chapters, as digital modules that can typically be done in less than an hour.

So it gives the faculty member the flexibility to adapt our curriculum to whatever existing course they're teaching. And that helps us get past some of these long curriculum approval processes, because we can map it back to previously approved SLOs and learning outcomes.

[00:09:04] Ben Kornell: With that framework in mind, and the idea of longstanding skills that endure despite rapid technological change: the debate rages today around when it's okay for students to use AI, and where teachers should draw ethical or practical boundaries around student usage. What guidance do you have for them as you think about both building those skills and not over-anchoring on current levels of technology, and also the ethical use of said technology?

[00:09:39] Josh Jones: I think some of the fundamental things we need to do are teach students to be curious, to use critical thinking, to explore, and to expect that technology is advancing, like Kurzweil said, on an exponential curve as opposed to a linear curve. How do you think exponentially? How do you explore what those tools are?

And there are a number of ground rules and basic principles that we can teach that will really outlast whatever the technology du jour is. 

[00:10:04] Ben Kornell: Yeah. So you've got this group of well-meaning and very concerned educators, and then you have another group of innovators who are like, let's throw all that stuff out. Let's have ChatGPT be school. When you're thinking about the role of AI in education, in K-12 and in higher ed, what do you see as the prime value or need of institutions, given that so much is being automated or transformed by products like ChatGPT? What can AI not replace? 

[00:10:37] Josh Jones: Well, it really is those human skills.

If you think about those interpersonal communications, the EQ, the leadership, the teamwork, those are things that are not going away anytime soon. And in fact, the institutions that are really leading in this space have gone ahead and accepted that AI is changing the field. They're no longer, you know, forbidding it.

They're instead addressing it head on. We see a number of universities, for example, struggling with faculty that have not even defined what the rules of engagement with AI are, and those are some of the worst-case scenarios, because students really don't know what to do. The leading faculty are defining it upfront.

Maybe it's a traffic light system, green, yellow, red, in terms of how AI can be used in the class. They're really rethinking the way they're delivering education. They're assuming that some AI use is gonna be there, so it's no longer "don't use AI," but "if you're using AI, I wanna see your prompts, I wanna see your iterations, I wanna see your validations."

And in some cases they say, Hey, listen, I don't want you using AI. AI is great in other cases, but here's why you need to build these fundamental muscle memories. So really, the best educators are the ones that are just addressing this head on and going ahead and rethinking how they both deliver education and how they do assessment.

[00:11:44] Ben Kornell: Recently, Denver Public Schools banned ChatGPT, but they still allow purpose-built education AI tools, with the idea that generalized models either aren't accurate enough, aren't safeguarded enough, or aren't appropriate for learning processes: they basically give away the answers too fast. But AI tools that are purpose-built for education can be really valuable.

How do you see that distinction between generalized tools, which, let's be real, kids and students will be using in the working world, and even if you do ban them, they're using them at home, versus purpose-built tools that maybe have the LLM underneath but are guardrailed with prompt engineering, development, and design?

[00:12:32] Josh Jones: You know, I think we need to be really careful when we talk about banning things, because it is a bit of whack-a-mole. You ban ChatGPT, but you've got Gemini embedded in Google. And so I think sometimes when educators think that they've eliminated AI, they've really just eliminated one or two flavors of it.

And what you're actually doing is creating a widening gap between the haves and the have-nots, 'cause you've got students whose parents will pay for pro subscriptions for them, who will have access to all those things, and you'll have other students who won't. So in many cases, quite honestly, I think we're doing more of a disservice than anything.

And if you look at some of the bigger models, whether that's Claude or ChatGPT or Gemini, you know, what you're getting in some of these cases with some of the smaller companies is that they're essentially wrapping ChatGPT with their own tools. And I don't know that you're necessarily landing in a better place at this point.

[00:13:20] Ben Kornell: You mentioned some leading voices, how professors set really great guidelines in some circumstances. Is there an institution that you feel is a shining star, or a couple of organizations that you feel are really starting to figure this out, bright lights that we should all be aware of and studying?

[00:13:40] Josh Jones: So I've been really impressed with Dr. Ima Gupta at USC Upstate, the University of South Carolina Upstate. She's got some really good thoughts on this and has done a lot of research in the space of AI ethics. I'll also say Shani Robinson at Sam Houston State, and really the faculty we worked with at the University of Alabama. All three are great examples of higher education really addressing this and not just saying, okay, give me a template AI policy that I can slap on everything.

They're rethinking the entire process, thoughtfully establishing guardrails not only for the students but for the faculty, and really saying, hey, we need to be fair to our students. I'll give you one example: at USC Upstate, if faculty do not have an AI policy in place, they are not allowed to punish a student for using AI.

That sounds kind of simple, but being forward-thinking and saying, let's go ahead and address this upfront, really goes a long way with students, 'cause most of them want to do the right thing. They just may not know what the right thing is if you don't provide that leadership. 

[00:14:41] Ben Kornell: Yeah, and a lot of faculty don't actually know what best practices are either.

So actually having policies in place that faculty can respond to really helps them think through, in my context, for my course: what is appropriate AI use, and how am I going to govern it? How are we gonna make it transparent? That sounds to me like a very modern-world scenario that everybody's boss is thinking about, that every institution is thinking about.

It's interesting to see that coming to education. 

[00:15:12] Josh Jones: Yeah, it's definitely what we're grappling with in the workforce. 

[00:15:16] Ben Kornell: As you're looking at the trends that are coming down the pike, obviously you've been very early to understand the impacts of AI. Now, as we think of multimodal AI, as we think about robotics, as we think about potentially AGI, looking around the corner, what do you think is going to be most important for schools and institutions to be resilient, to support their students, and also stay on the front edge?

[00:15:43] Josh Jones: You know, I think the biggest thing they need to do is really rethink how they're approaching the course curriculum approval process.

As we start approving new courses and new standards, as I mentioned earlier, we need to back up one abstraction layer and be careful that we're not writing in specific tools or specific approaches. When we talk about establishing learning outcomes, let's think about that broader core capability, but leave educators the flexibility over the next couple of years to come fill in some of the gaps.

So in some cases, it really is sort of a regulatory question, to allow for that flexibility and the change. And then, educators: gone are the days that you rethink your courses every two years, five years, whatever. This is a semester-by-semester basis. We need to be looking for those different things that we can hot-swap into our courses.

Maybe remove a lesson here, add a lesson there, to continuously refine. And sometimes, you know, flip the classroom and be comfortable saying, hey, I don't know, we're gonna figure this out together. We're gonna explore together, 'cause that's what we're doing in the workforce as well. 

[00:16:48] Ben Kornell: Yeah, that makes a lot of sense.

And this idea that it's not binary, that it all has to change or all stay the same, but really this idea of modularity and hot-swapping in the parts that are most relevant and most actionable. That makes a lot of sense. All right, well, we're gonna wrap up with one last question. What's giving you the most hope or excitement, and what's giving you the most concern, as you look out on the education landscape today?

[00:17:16] Josh Jones: Ben, if you don't mind, I might actually split the difference and tell you about something that's just really intriguing to me, and maybe I'll address both of those. So QuantHub was honored by the Inc. 5000 this year as one of the fastest-growing private companies, and so I attended their annual conference and got to meet a lot of other CEOs of fast-growing private companies.

At the same time, we work with a number of publicly traded and larger enterprises, and I've seen really a tale of two cities. In the larger enterprises, what we're seeing is this: you know, what's a tool that I can deploy to 25 or 50 or a hundred thousand employees?

And so they're going with something like, you know, Microsoft Copilot, for example. Meanwhile, what these smaller companies are seeing is really this wild west: maybe it's five different LLMs or different tools, or, you know, use what you want. And obviously it depends on the industry, depending on whether they're regulated or not.

But I'm really gonna be watching in 2026 and 2027: how is this fortress mentality versus this guerrilla mentality gonna play out in terms of the innovations that we see and the advancements of industry? So, kind of going back to what I'm excited about and what I'm worried about at the same time: I'm excited to see some of the things that come when you've got these companies basically running as fast as they can, just trying a little bit of everything. And at the same time, you know, I'm also excited about the advances we're gonna see in places like healthcare, which just seem to be improving every single day. 

[00:18:40] Ben Kornell: That's really inspiring. And also, there's learning that can go both ways. I mean, you think about the larger companies: if they're sitting on a data advantage, having, you know, a consolidation of tools on top of that dataset will also be really interesting to watch.

Well, Josh Jones from QuantHub, thanks so much for joining. If people wanna find out more about QuantHub, what's the best way for them to learn more? 

[00:19:04] Josh Jones: They can visit QuantHub.com or find us on LinkedIn. 

[00:19:08] Ben Kornell: Awesome. Thanks so much for joining us today. 

[00:19:11] Alex Sarlin: Thanks for listening to this episode of EdTech Insiders.

If you like the podcast, remember to rate it and share it with others in the EdTech community. For those who want even more EdTech Insiders, subscribe to the free EdTech Insiders newsletter on Substack.