Edtech Insiders

The AI-Enhanced School of 2040 with Edtech authors Greg Toppo and Jim Tracy

October 17, 2022 Alex Sarlin Season 3 Episode 23

In this episode, we speak to Greg Toppo and Jim Tracy, co-authors of Running with Robots: The American High School’s Third Century from MIT Press, available at Amazon, Bookshop or wherever you get your books!

Greg Toppo is a journalist with more than 25 years of experience, most of it covering education. He spent 15 years as the national education reporter for USA Today and was most recently a senior editor for Inside Higher Ed. From 2017 to 2021, he was president of the Education Writers Association. 

Jim Tracy is Senior Advisor at Jobs for the Future (JFF) and Senior Scholar of the Center for Character and Social Responsibility at Boston University's Wheelock School of Education. He has been head of several independent schools, as well as a college president.

Recommended Resources:

Alexander Sarlin:

Welcome to Season Two of Edtech Insiders, where we talk to the most interesting thought leaders, founders, entrepreneurs, educators and investors driving the future of education technology. I'm your host, Alex Sarlin, an edtech veteran with over 10 years of experience at top edtech companies. Greg Toppo is a journalist with more than 25 years of experience, most of it covering education. He spent 15 years as the national education reporter for USA Today, and was most recently a senior editor for Inside Higher Ed. From 2017 to 2021, he was president of the Education Writers Association. He's the author of The Game Believes in You: How Digital Play Can Make Our Kids Smarter, from 2015, and most recently co-authored, with Jim Tracy, Running with Robots: The American High School's Third Century, which looks at automation, AI, and the future of high school, and which we'll be talking about today. Jim Tracy is a Senior Advisor at Jobs for the Future (JFF) and Senior Scholar of the Center for Character and Social Responsibility at Boston University's Wheelock School of Education. He has also been the head of several independent schools and a college president. Greg Toppo and Jim Tracy, welcome to Edtech Insiders. This book is so interesting, and I wanted to ask you both about it. Let's start with you, Greg. You know, you're a veteran education journalist, you've covered higher ed and K-12 for years. Tell us how you got interested in education reporting, and sort of your take on edtech throughout your career. What has been your overview of all the different things that happened in the last, you know, decade or two of education technology?

Greg Toppo:

Thanks for having us. Again, the way I got into education reporting was, I wouldn't say quite by accident, but it happened in this odd way. Before I got into journalism, period, I was a classroom teacher for about eight or nine years. And I made the switch almost all at once, or cold turkey. What happened was that I started at a little tiny paper in Santa Fe, New Mexico. And every time the guy who covered school boards was sick, or, you know, had to go take care of his kid or had something else going on, there would be some sort of meeting in the corner, and they would all look over at me and say, you know, wasn't Greg a teacher? Can he cover the school board tonight? So it just sort of happened by default that eventually I started covering education, and I sort of never looked back. So that's my very short version of that. In terms of edtech, you know, I really didn't pay much attention to it for a long, long time, until I started getting interested in writing my first book, which is about games and learning. And then I sort of just dove in headfirst, and was at once amazed and kind of delighted, but also sort of depressed by where things were going. And I spent a long span, several years, just trying to pick through the stuff that I thought was kind of dumb and very low-level, and to find the things that were interesting to me. And by the way, full disclosure, Alex, you were someone I met along the way as I was doing that reporting, and you were a great help answering that exact question. So I really valued the conversations we had.

Alexander Sarlin:

Yeah, you and I used to be at the same events around gaming and education. I know we'd always connect, and it was always so interesting hearing about your research and what you were doing.

Greg Toppo:

I was learning in real time. So that's really exciting. Yeah.

Alexander Sarlin:

There's a lot of filtering to do with games and education. There's a lot of stuff that is not worth the time and a few things that definitely are. So it's been really interesting. And Jim Tracy, you've been a college president and a head of schools, and you're now an advisor for Jobs for the Future, which is one of the leading organizations examining and shaping the future of work. Give us an overview of your journey in the education and workforce fields.

Jim Tracy:

Sure. Well, thanks for having us, Alex. It's a pleasure. I come from a working-class background, and I was a high school dropout with a young family, which wasn't particularly unusual in my circles. And I eventually had a sort of epiphany that led me to get my GED and take college courses at a commuter campus and work my way through while I was spending my days in factories. And it was a process of empowerment; the education to which I was being exposed was liberating, and gave me a sense of agency over my destiny. So I actually went on to do my doctoral work at Stanford, in American 20th-century history, looking at how people can transform social processes and, in that journey, ultimately empower themselves. How do groups of people arrive at a kind of agency in a situation that is maybe marginalizing for them, or even more overtly oppressive? So my dissertation really focused on freedom struggles from 1940 to 1970, and also other social protest movements during that period and how they intersected. That became my first book. So I started to become an educational leader heading schools, because I wanted to bring that as a vehicle for empowering others. And at the same time, for those who were already privileged, I wanted education to be a vehicle for them to become more empathic and understanding and supportive of those who were less privileged. So the role that education had played in liberating me and empowering me, I wanted to establish writ large. And I believed that it was, in many ways, not that I was going to be the only person with that sense of mission, but I felt that I was joining a series of institutions that were in fact dedicated to that mission-driven vision. What I found was that they were highly problematic, and so I became an education reformer, if you will. And I also, around the late '90s and early 2000s, began to feel that educators really needed to understand the coming tsunami of the digital revolution, if they wanted to have the potential to distribute those tools democratically and, more importantly perhaps, to understand the disruption that it was going to cause to everybody in every domain. We had to get ahead of that curve.

Alexander Sarlin:

That's a really interesting background. I love the concept of, you know, education as, ideally, a form of empowerment across different social strata. But as you say, it hasn't always lived up to that promise. So I'm sure that really informs what you've done in the book, especially as you combine it with the digital revolution and all the possibilities of empowerment that come with technology. So let's jump in and talk about Running with Robots: The American High School's Third Century, which is out now from MIT Press. Your backgrounds both run deep in understanding education from a lot of different angles. Greg, you've written about everything, you know, higher ed and K-12, for years, all over the country, with USA Today and others. What brought you to the idea of the future of high school and how it's going to be changed through technological disruption?

Greg Toppo:

I would say it's Jim's fault. We were talking kind of just very generally about what was happening in education for a couple of years; as it turned out, we just had these conversations every now and then about what was going on. And this topic kept coming up, you know, what was happening with technology and AI and how it was going to change the landscape of education. Around this time, maybe about five, six years ago, there were a lot of books that were, I think, taking a kind of a microscope to what was going to happen to the workforce, right? This idea of, you know, the age of robots, and, you know, the book by a couple of MIT economists, The Second Machine Age, or, you know, stuff like that. There was a real look at the future of work, and the future of workplaces and employment. And I think what we ended up feeling was that there was really no discussion at all in any of these books, or anywhere, about what that meant for education, and specifically what that meant for secondary education. So that's kind of how we got into this topic.

Alexander Sarlin:

Jim, Greg says it was your fault. I'm curious how you got to the idea of high school and understanding, you know, how the future of high school is going to be impacted by technology?

Jim Tracy:

Well, you know, some of it was the trajectory of the journey I mentioned. But also, part of the discussions that Greg and I were involved in together were around what we saw as a working-class wellspring of the politics of resentment and anxiety. Very objectively, we felt that a lot of this anxiety and a lot of this anger is coming from the fact that people are not seeing gains. You know, they're not seeing that their purchasing power is improving; in fact, it's decreasing over time. They're not feeling that they have viable pathways into an increasingly roboticized, digitized economy. They're not feeling empowered around retooling their skill sets. And so we felt that it was really absolutely critical that educational institutions, including, and perhaps especially, K-12 institutions, repurpose themselves to be able to speak meaningfully to the next generation of workers and citizens.

Alexander Sarlin:

One of the things that's interesting about the book is it really explores the idea of disciplines. We live in an age in which AI and the metaverse and digital marketing and the digital world have transformed so many different industries. But in most secondary schools, you still have the same old, you know, math, history, social studies, English, the exact same subjects in the exact same sort of format as you had 50 years ago, or longer. And I think one of the things you explore in Running with Robots is how schools could really be a lot more proactive in thinking about how to prepare people, not just for individual careers, but even for how to think about the world in a way that's closer to the way the world actually works.

Greg Toppo:

And I think, you know, part of that, if I can jump in, is thinking about things that schools don't need to do anymore. One of the things that kind of bubbled up to the surface was this idea of, well, in an age of instant translators, do schools need to spend all this time and effort and money teaching foreign languages? You know, after thinking and reading and talking about it, we kind of arrived at this place where you don't need to do it for the sort of practical purpose of being able to exist as a traveler in Paris. But you do need to, you know, think about it in terms of something that enriches people's lives, or that helps them understand the culture. So I think a lot of the question is, what are the things schools can offload? So that's a slightly different angle on your point.

Jim Tracy:

If I could just pick up on Greg's comment: I think that when people are discussing what needs to happen in schools, they tend to pile on too much. Something is always added, rather than recognizing that something really needs to be taken off the shelf if you're going to add any more, because there's a finite amount of time in any given day. And so there's been a lot of discourse in the last three decades around what should be done in our school systems, and it tends always to be a series of agendas. And I think we need to make some hard choices around what we are going to take off. And it's interesting, because for the last 20 years, maybe a quarter century now, surveys of educational leaders have consistently shown a skill set that is not being taught in the traditional curriculum, around creativity, collaboration, and, you know, the skills of resilience, iteration, and so forth. And we need to take something away from the traditional curriculum in order for teachers to have time to inculcate those skills. Those are tough decisions. Greg and I would certainly agree that there is a value to learning French, but the question is the trade-off between doing French at the expense of not being able to do these other skills that are more germane to the life experience of the students, who are going to be living in a different world.

Alexander Sarlin:

It's a really, really good point. And you see things, you know, turned into electives or after-school activities that are, as you say, potentially more germane, more relevant to students' lives, whereas core subjects stay the same year after year after year. Before we go any further, I really do want to talk about how Running with Robots is told in a really innovative way. It is very different from your sort of traditional nonfiction, you know, educational book; it's really told in the form of a time-travel parable, which sort of follows the exploits and the adventures of a school principal who has gone 20 years into the future to see a high school and how everything has changed. And you sort of see all the innovations through his eyes. It's such an innovative format, and it's really a joy to read. How did you get to this format? And how did you decide to give your predictions about the future of high school in this way? Let me start with you, Jim.

Jim Tracy:

I think I might be to blame for that.

Greg Toppo:

And I want to thank you for taking the blame.

Jim Tracy:

The current manuscript is so interwoven with Greg's voice and mine throughout that it's impossible to extricate them. But in the initial drafts, we did split up chapters and say, okay, you write this in the first draft, and I'll write this other chapter. So the introduction was one of my tasks. And I kept trying to write it to say, this is what we could be doing, but it kept coming out very dogmatically. When I tried it in a format that was, you know, sort of traditional expository, it kept coming out very dogmatically, as this is what you should be doing. It sounded like I was speaking rather arrogantly from on high. I went through literally 10-plus drafts that all ended up in the trash, because it just sounded very didactic. And I finally sat back and said, well, you know, this would be better as a fictional narrative, to just show the possibilities, be playful with it. And some of the models that came to mind were Edward Bellamy's Looking Backward, and even Galileo: when he wanted to bring up some issues that he thought the church might find a little difficult to take, he tried this format of the, I wouldn't say buffoon, but the neophyte speaking with the person who has lived the experience of the new world that he was trying to present in his astronomy. And so Bellamy, the high school principal from the future, becomes that authoritative, experienced source of this new world that the Rumbaugh character is waking up into. And so those are parallels that I thought of. It seemed to me that it was a gentler, less dogmatic way to present some ideas that we think would be perhaps startlingly new even to educators today.

Greg Toppo:

And, you know, just to follow up on that, in my mind, one of the things that it allowed us to do, going back a little bit in history, was to almost create a series of, like, Platonic dialogues, basically, with our narrator, this principal from 2020 who falls asleep and wakes up in 2040, basically asking a series of stupid questions of the new principal. And I found that to be a really not only effective, but also sort of energizing way to slice and dice this stuff. And that was kind of the only way we could do it.

Alexander Sarlin:

I think it speaks to both of your experience as educators, because it's really a show-don't-tell format. It's like, let's actually look at what one of these classrooms and what these schools would look like, rather than having to explain, in long dogmatic paragraphs or long didactic paragraphs, what it would be like. You just show it, and those questions, that sort of dialogue format, really allow you to get to the why behind everything, you know, why things have changed in the way that you predict. It's really an interesting read. Greg, I have a question for you, which is that, you know, the audience for the Edtech Insiders podcast is primarily: we have classroom educators who are currently in the classroom or in universities; we have educational entrepreneurs who are starting their own companies; and we have edtech investors who are always keeping an eye on the future and where things are going. I'm curious, for each of these groups of listeners who haven't yet, you know, read Running with Robots, give us an overview, of course, this is for both of you, but Greg, let's start with you, of some of the ideas in the book that you think are particularly relevant to those groups: for teachers, for entrepreneurs, and for investors.

Greg Toppo:

I mean, if I start with teachers, there's a lot to say to teachers, but I think the one thing I would say is, don't underestimate the ability of technology to totally mess with your world. You know, as a journalist, I'm seeing it now. We actually even have a chunk of a chapter about how, you know, there's AI journalism that is as good as the real thing. So I would say, in terms of that, just don't think your skill set is not automatable. I mean, in a way, getting back to Jim's earlier comment about political unrest, you know, one of the reasons I think we wrote this is because teachers basically see themselves as white-collar workers, right? They see themselves as professionals, and professionals, as a group, I think, are of the mind most days that they're doing high-level, high-prestige work, you know, almost bespoke down to individual tasks. And I think my one message to teachers would be: just you wait. I mean, I'm saying this as a former teacher myself. In terms of the second one, I guess it was investors?

Alexander Sarlin:

Yeah, and entrepreneurs. But our second one was education entrepreneurs and company founders.

Greg Toppo:

I guess I'd talk to maybe both of these groups. My take on this, and this in a way kind of precedes my work with Jim, I mean, the thing I've thought about for a long time, and I think the book bears it out, is that we place too much emphasis on the things, on the tools, and not enough emphasis on what they do, or what they allow us to do. I think about something like, you know, Kahoot, which is a really interesting platform that allows sort of this classroom engagement. And I remember when I saw it, you know, 10 years ago, I thought, this thing is amazing, this is amazing stuff. And now, you know, there are lots of Kahoot-like things out there, and it's almost, I wouldn't say it's passé, I think it's still really interesting. But to me, the focus needs to be on what it and all these other tools allow us to do. What I say is, focus on the verbs, not the nouns. So that, to me, is an important message: what does this technology allow us to do that we can't do now? And if it's not very interesting, then I wouldn't take it seriously. But if it's adding to the equation in an important way, then I would say it's worth looking at. The nouns will always go out of date and be obsolete, and they'll be replaced by something else. But the verbs, to me, are much more interesting. Anyway, that's my long take.

Alexander Sarlin:

No, it's great. And, Jim, what do you think? Are there some takeaways that you would highlight from the book specifically for educators, entrepreneurs, or investors?

Jim Tracy:

Yeah, just picking up on Greg's point, I think that we continue to educate people into a knowledge economy of the late 19th and early-to-mid 20th century. But it isn't going to be a knowledge economy for humans in the 21st century; it's going to be a different type of human value proposition, in collaboration with the real knowledge economy leaders, which will be the algorithms. And so we are in the process, if we haven't already, of ceding the knowledge leadership, the information leadership, to our own algorithms, and they're going to be faster, more savvy, and it's only increasing every single day. And so what is the human role? I think the fear that is driving a lot of people to despair, and I think it's happening with everybody, is this feeling that I'm being sort of marginalized from the economy, from the society, because of these developments. And that's only going to increase, as Greg said, when white-collar workers really start to feel the hit to their leadership and their roles. What I think we have to say to people, and it is absolutely true, is that that doesn't mean humans don't have an absolutely central role going forward; it's just a different role. It's going to be the role of empathy, it's going to be the role of creativity, it's going to be the role of screening values, to make sure that there are no unintended social consequences when an algorithm becomes effectively a black box to us, because it's crunching data that we can't handle in our own brains. And when it comes up with a solution, we're going to require humans at the decision point to be able to ethically ask, well, okay, this is a more efficient way to route traffic to the airport, but is that path going to disrupt a really, you know, cohesive community? Those are things that the algorithms can't answer. And so I think that humans need to pivot to thinking, working, and being educated into a different way of participating in the workforce and in society. And there will be roles for us.

Alexander Sarlin:

Yeah. So building on that answer, and the idea that, you know, it's about the verbs, it's about what can be done in education, not exactly what tools are there to do it. One of the main threads of Running with Robots is about artificial intelligence and robotics. It's called Running with Robots, which is not a coincidence; it's really about collaborating with the technologies that, as you say, will sort of run the knowledge economy. And you mentioned earlier some of the books like, you know, Martin Ford's Rise of the Robots, or McAfee and Brynjolfsson's The Second Machine Age, which tend to focus on exactly that: the replacement, the idea that so many jobs, blue-collar and white-collar, are going to be totally upended and replaced by artificial intelligence. And I think what's nice about your take is that even though you both sort of do see this coming, you're not in denial that this will happen, you think it will happen, you're not pessimistic about what it'll mean for education or for teaching. So I want to drill down into what you're both saying and ask specifically: what are some of the educational tasks, you mentioned empathy, but what are some of the things in the education world that will fit that criterion of, they cannot be outsourced to artificial intelligence, or at least they must be done collaboratively between human intelligence and artificial intelligence? We know in some fields that could be interpreting X-rays or medical ethics. In education, what is unique about what it'll look like when there is all sorts of technology supplementing or even replacing certain tasks?

Greg Toppo:

I guess my quick take on that is, I have no idea, because it's really all about just what we trust technology to do. I mean, a lot of people are very skeptical about things like automated graders or, you know, AI essay grading, right? And there's a lot of pushback against that. Just to take one example, I think, for most teachers, grading essays, and Jim writes this chunk of a chapter in the book about it, is just a terrible, awful, dehumanizing job, right? It's 30 students in five sections; you know, you've got 150 essays to grade. And a lot of the time it's, like, the first go that a student is trying, so it's just awful. And to me, the resistance to something like a robo essay grader will evaporate the moment we get a really good one. To me, there's no doubt; in that sense, the moment we get an essay grader that The New York Times or The Chronicle of Higher Ed stamps with its seal of approval, that's the end of that task for teachers going forward. So I really do think, I guess the short answer to your question is, it's constantly shifting. It's always, always, always going to be, what can the technology do? And then we'll proceed from there.

Alexander Sarlin:

It sounds like what you're saying is that some of the tasks of education are a little bit rote and repetitive, and maybe tasks that, you mentioned the word dehumanizing, to sit there and grade 100 essays? Well, if it's dehumanizing, maybe that would be something that would be better served by a machine. And, you know, we've seen AI being used in proctoring of exams a lot in the last couple of years, and that's something that is also similarly dehumanizing. You know, you're having a teacher sit in a room and watch that people don't cheat. Maybe that's something that a camera and an algorithm could do better, and it doesn't sort of rely on particular human intelligence. So what's the flip side of that coin? What are some of the things that you think could never be replaced by artificial intelligence? Or is that too crazy a question?

Jim Tracy:

That's actually a really great question. And it's spot on with how Greg and I approached this entire book project. We said, we're seeing a lot of research into what skills might be replaced by intelligent machines in the next 10 to 20 years, but we're not seeing anybody really ask the question: what is overwhelmingly unlikely to be replaced in humans? And aren't those the things that we should be emphasizing, so that humans can feel confident those will be the value proposition they'll be able to bring to society and to their work? And we saw these sets of skills that people were already talking about, right? Creativity, collaboration, iteration, values clarification, and so forth, and empathy. And we said, you know, those are the things that humans are going to be doing. There are going to be robots that will know the calculus; there are going to be robots that will know the biology. So why are we trying to skill up every single student in high school to reach a certain level of content proficiency, when really they don't need to be content-fluent in those areas, they need to be content-conversant? And it takes less time to skill them up to a conversational level in those topics than it does to bring them up to a high level of proficiency. And if we take the time that we've freed up and spend it on what we call process fluency, how to work in teams, how to be given a challenge and think it through, the engineering challenge, and the social challenge, and so forth, then they're going to be better prepared for the type of work they're actually going to do in society. You know, a nice example to me is if you take the way we train physicians today. Physicians are trained to be data centers, right? Clinicians are trained to be people who have an enormous amount of information inside their heads that they carry around in their three pounds of gray matter, and they look at symptoms and they make a diagnosis. That's all going to be externalized to algorithmic processes, and they're going to do it better than the best humans very soon. So what is the role of a physician? It's going to be to have a conversational understanding, not to be the premier expert on the planet; that's going to be in a computer somewhere. But they have to have a familiarity, certainly some expertise, around the information, so that they can troubleshoot what the algorithm is putting out and say, does this make sense? And they can then empathically guide the patient through their healing regimen. So then, if you reverse engineer that, what should we be doing in medical school? Right now, we teach knowledge and content exclusively, and only very tertiarily is there any emphasis upon being empathic and having a good bedside manner. Greg and I think that that's going to be completely inverted in terms of emphases as to how we train people in medical school going forward. And you really have that writ large; it's a very different training for people in the professions, to be collaborators with the algorithms that will be smarter in terms of content than we are. That's the shift from humans being the knowledge economy to humans being the collaborators with the knowledge leaders. And so if I think about, you know, K-12, we would be spending a lot of time teaching people to be ethical, empathic, and creative. I could see teaching them how to be savvy consumers of information. How do you recognize deepfakes? How do you recognize disinformation? How do you engage with differing opinions in a manner that is reflective of the best values of civility?
How do you protect yourself? How can you be cybersecure in a world where all of your information is going to be digital, including your most private and important information? How do you make ethical decisions? What do you want a machine to know, when it can move faster than you, before it actually does anything with the information and the conclusions it reaches? How do you put the values into a computer preemptively, not just after the fact? I'll give an example of that, and it is maybe harrowing, but it's an example that really drives it home. As we increasingly use lethal autonomous weapons systems militarily in the field, there's not going to be time for humans to make a battlefield decision; we have to put those values into the algorithms before the machines are let loose. We're going to see that increasingly in the next decade, on battlefields, and it's humans who have to be the engineers to design that. And so everybody, including software engineers and designers of military weapons systems, needs to be immersed in the ways in which we can embed human values and ethics into everything that we do. And that should be happening K-12.

Alexander Sarlin:

It's a really, really interesting take. So, just to synopsize what I'm hearing you both say: as machines take over a lot of the tasks that are very knowledge-centered, content-centered, diagnostic, you know, the role of human intelligence, and thus the role of education, should focus much more on ethics, you know, philosophy, understanding, creativity, collaboration, many of the skills that we know machines and computers and robots are furthest from. And, you know, you mentioned, Greg, the idea that journalism is increasingly AI-based in some ways. But from what I've seen so far, the things that can most easily be done through AI journalism are things like sports journalism or business journalism, which is really reporting facts and numbers, you know, sort of turning a baseball game's statistics into an article. But the idea of AI writing, you know, an investigative journalism piece that talks about, say, mental health institutions and how they should be reformed, that's a very, very different thing. So I'm hearing you both say, hey, you know, what we should be doing in the future of education, and this is obviously the focus of the book, is helping people become more and more human, in a world where machines will do a lot of the mechanistic tasks.

Greg Toppo:

Yeah, and I would say, just to your point that AI can't write a long-form piece about some ethical issue: just wait.

Alexander Sarlin:

Right, so it's a race, you know, as artificial intelligence gets wider and wider in its scope.

Greg Toppo:

To me, I think that's why I'm very, like, reluctant to predict anything. All I can say is, like the principal, as you kind of move through this book, he's thinking: as humans, what do we bring to the equation? Like, minute to minute, year to year, right? What is the value we're bringing here? And, you know, 50 years ago, the value might have been, I can help students learn how to calculate, you know, do long division, do a certain number of things, diagram sentences, whatever it is, things that are sort of out of fashion, in a way. And, you know, it's just constantly changing. And I think, more than anything, we have to be, as educators, aware of that question, and keep asking that question: today, this year, what do I bring to this equation that can't be done, or that we don't trust technology to do?

Jim Tracy:

I think a nice example of that: you know, instead of teaching students a language per se, even though there's a value in learning that intellectually, it's an enormous amount of classroom time to teach someone even to a level of just basic functionality in a foreign language, and nobody in the next generation is going to be remotely interested in doing that. I shouldn't say nobody; there will be students who will always find a passion for French, or German, or Japanese, and they should always be able to pursue that passion. But the vast majority of students are overwhelmingly going to vote with their feet, as they increasingly have a little device that will immediately translate into any number of hundreds of languages and dialects, and say, why should I spend years getting to basically being able to say, where's the bathroom, and I would like that plate, please? Instead, what we can do with that time is humanize them. We can be giving them project-based scenarios where they learn to be polite, sensitive, and considerate travelers in another person's culture.

Alexander Sarlin:

We recently interviewed Quinn Taber, the CEO of a company called Immerse, which is a metaverse-based language program. And it's exactly what you're saying: it's group-based, immersive, you're walking through scenarios with a group of other people, figuring out what to do in the foreign language. And it's designed to be exactly what you're recommending here. It doesn't take up any classroom time, because it's entirely extracurricular and virtual. So I think many of the predictions you've made in the book are already starting to take place, and this one, you're saying it, and it's really happening. It just launched a couple of weeks ago on the sort of Oculus platform.

Greg Toppo:

Jim and I have this sort of text thread going, where basically one of us will see something in the news, and we'll say, you know, I think we predicted this in the book in 2014. So anyway.

Alexander Sarlin:

Yeah, I mean, that's what's hopefully fun about writing a book that's all about prognostication about where education is going: you can then lean back and watch it go in that direction. I want to ask, you know, a lot of what you write about in Running with Robots is really about what secondary schools in particular could be thinking about in terms of the future, you know, how they might think about how to work with technology, how to incorporate artificial intelligence, how to, you know, change the curriculum so that it's more befitting of the future of work and the world. How have you seen high schools, or universities for that matter, sort of receive some of these messages? Do you feel like the schools are getting the point? Or are they sort of sticking their heads in the sand a little bit and doing what they've always done? Jim, let's start with you.

Jim Tracy:

I'm seeing a lot of, yes, yes, absolutely, that's right, but we can't do it for this list of reasons. You know, we're too constrained, we can't do it. I see that over and over again. What I feel, as a historian of social change, is that historically, institutions that are past their time and deeply entrenched actually become increasingly rigidified as their irrelevance becomes more pronounced, until they collapse from within. And I think that that's kind of the current state of our K-12 educational system, I'm sorry to say. But I think that that's part of how change transpires.

Greg Toppo:

At the end of the book, we imagine a video arcade where people are basically practicing skills that they need, or that they're interested in learning, or that they enjoy. And one of the suggestions is that if schools don't do this, private industry will, and it'll step in and do it in a more engaging and fun and, you know, frankly, sticky way than schools do.

Jim Tracy:

When things do implode, then people adopt the paradigmatic seeds that have been planted by innovators into those systems in the prior years. And so, for instance, when you had the Great Depression of 1929, all of these ideas that had been percolating in progressive circles suddenly were adopted into policy, because the crisis demanded it. And I see a lot of seeds being planted by innovative educators out there that will bear fruit, that will become mainstream and dominant, when the current hegemonic system, which we inherited from the 19th century, is no longer able to function.

Alexander Sarlin:

So we've talked a little bit about the really interesting and unusual structure of Running with Robots, but I'd actually like to go even a little deeper, because it really is something that makes the book so unusual. Jim, tell us about your characters, Rumbaugh and Bellamy, and sort of what they do and how you use them as a parable about the future of education and AI.

Jim Tracy:

Sure, and that's a nice term for it, a parable. We have a recurring story, obviously fictional, of a principal of a high school who falls asleep in the year 2020, the year the book came out, and wakes up in the year 2040. So it's a Rip Van Winkle sort of story, and it allows him to do a bit of Edward Bellamy's sort of Looking Backward, to see where education has gone in the previous 20 years; it's kind of the fantasy of every educator, to be able to see into the future. It's an optimistic scenario. So he wanders down to the high school he was principal of, and a former student is now the principal, and she gives him a tour of the pedagogy, if you will, the different classroom experiences that are taking place in the year 2040. The idea was to have that as a recurring narrative before each chapter. So each expository chapter, where we do sort of more of the history of education, or each chapter where we do more of the analytical treatment of our current predicament as a society and as educators, is prefaced by the next installment of this ongoing parable throughout the book. And they visit various classrooms and have various discussions about the reasons for different types of developments in education over those 20 years, as well as the actual presentation of what those classrooms look like, and they're quite different from today's classrooms.

Alexander Sarlin:

It's a great model, and it allows everything to be sort of explained in a really illustrative way. Greg, tell us about some of the places that Rumbaugh and Bellamy go within the schooling system.

Greg Toppo:

Oh, sure. They kind of wander through the library, which doesn't actually look like the library of today; it's got all kinds of science experiments sort of in midstream. They go to a classroom where students are crowdsourcing science experiments. They go to, one of my favorites, a humanities class where students are basically doing a group Q&A with a very feisty, AI-powered bot of Hemingway. And what we find out in short order is that some students built this thing, sort of a hologram that's powered by all of his writings, his letters, his novels, his journals, and so on. It's coded to respond to queries, kind of like what your Alexa does today, only in a much more sophisticated way, and it can basically have a sort of a conversation simply by accessing these pieces of prose. The teacher, actually, sorry, the principal, recalls that he got into a fight: one of the students asked him about bullfighting, and he got very defensive. That sort of thing.

Alexander Sarlin:

Yeah, it's really so interesting. And as they go into each scenario, you sort of have an opportunity to unpack how trends that are starting today, obviously, will be extrapolated out into really unusual, really unique opportunities for education. And I think it's just such a clever premise. In the last chapter of Running with Robots, they actually leave the school; I think they go to a bar, and they go to an arcade. And there's a really interesting scene where they go into a video arcade and see all these people doing these sort of simulated activities. Some are games, some are jobs. Greg, you've studied simulations and gaming in education for a while. I'd love to hear you talk about that chapter and how you think about sort of the game world in the future of education.

Greg Toppo:

The big point we were trying to make was, and by the way, you recall, right, at the end of the day, they kind of need a drink, so they sort of go down. As they go for their drink in the bar, the new principal says, let's dip into this establishment across the street, and they basically kind of do the same thing they've been doing all day, which is to sort of watch people at work. And one of the things they see is, as you said, people basically taking part in all kinds of crazy simulations. One's a sort of robotic pizza game where you're actually remotely making pizza. And the other one, my favorite, was the Godfather simulation, where you're basically sort of dropped into a full-length version of The Godfather, and you have to essentially take on a role. And, you know, the joke is that, by the end of the game, everybody's dead. It might take three hours. And the big idea behind this, for me, was that it's foolish to just think about the future of school as something that's going to take place in a building with four walls and three stories. And we're naive if we think that the education establishment has sort of a lock on helping people learn the things they want to learn, or helping people become the people they want to be, or do the things that are fun for them and instructive.

Alexander Sarlin:

There was a quote that jumped out at me, and it's exactly about what you're saying now. It's very short, but I'm just gonna read it, and Jim, I'd love to hear your response to this. It says, Rumbaugh thinks about all of this, sort of the role of education, as he sips his beer, and he says, "It strikes me that you're locked into a kind of arms race with places like the arcade, but instead of power, the prize is progress, personal meaning and self-actualization." And Bellamy says, "Yes, it's about who gets to say what learning is." I love that. So expound on that a little bit. How will the future give us this sort of arms race, or this idea of who gets to decide what learning is?

Jim Tracy:

Well, it's interesting. The way Greg and I actually first met is that I was engaged in retrofitting a school library in Massachusetts, and Greg heard about it and came to do an article on it. And what was happening at that school, Cushing Academy, was that I was interacting with the students about what their experience at this boarding school was. And specifically, we started talking about the library, which was getting very, very little traffic from the students. And what I found was that the students were really voting with their feet. They were doing all of their research for projects online, and they really weren't seeing the relevance of this 20,000-book edifice that had, for instance, a social science encyclopedia set that was from 1956. So, you know, I realized that we're in an arms race with the online resources. And what was problematic was that, yes, the students were voting with their feet by going to online resources for their research projects, but by the same token, they weren't imbued with the skill set to be able to discern what was perhaps more edifying, or reliable, or authoritative online from what was just somebody in their mother's basement posting a blog. And we decided to meet them in the worlds that they inhabited, you know, as digital natives. And so we digitized the entire library, but in doing that, we gave them the tools. We moved all the books out of the library, all 20,000 of them. No books were destroyed in the process; they went to good homes. But we gave them a digital commons, and we then had skilled librarians and others who would guide them, as learned guides, into the best resources, the peer-reviewed results, and so forth, that were available digitally. And we met them there. So that, to me, is an example of the arms race: we had to up our game in order to compete with the worlds that they were inhabiting. You know, they were in a digital world 16 hours a day that was completely searchable, gamified, and rewarding, with different types of social and other types of rewards. Then they went into a 19th-century world for six to eight hours a day, and we tried to convince them that that was relevant.

Greg Toppo:

By the way, you forgot the espresso machine.

Jim Tracy:

Yes, very nice. When I decided that, actually, it was with a committee, including the librarian, that we decided to digitize the library, I and some of my senior people took tours of libraries all over the country, the most cutting-edge libraries, to see if we could get some ideas. And what actually was the most influential paradigm for us was a visit we took to the Googleplex. They had a rule that no software engineer can ever be more than 50 feet away from food. So we put in a sort of café of our own, and we basically said to kids, in our new digital library, please talk, please make noise, and please eat in our digital commons. I should add, shortly thereafter, we did a survey and determined that, out of 43 buildings on campus, the library was getting more students than any other place on campus, including the student lounge.

Greg Toppo:

I guess you can make the case that the coffee is a 19th-century intervention, or innovation.

Jim Tracy:

A little bit of the Café Central in Vienna mixed in. Exactly.

Alexander Sarlin:

Steven Johnson has a book, Where Good Ideas Come From, where he talks about the coffee shop: basically, the advent of the coffee shop was also the advent of, like, philosophy, and where everybody would go. And it makes a lot of sense; that's really, really interesting. And it obviously comes through in exactly that piece of the book, where it's like, if students are spending that much of their energy and time online, you know, you could try to sort of pull them back to these, you know, old encyclopedia sets or microfiche or whatever, you know, that world, but is that really even ethical? I mean, that's not their future. That's not what they're going to be doing when they're out in the work world. So, you know, maybe it actually makes a lot more sense to upgrade and update.

Jim Tracy:

You can drag them back to that 19th-century world, but it will be under duress, and it will be largely irrelevant.

Alexander Sarlin:

One of the interesting themes in the book is that you talk about how the sort of economy of the future, of 2040, is marked by three C's: the creating, cyber-curating, and caring sectors. We've just mentioned the idea that schools do have a responsibility to sort of think about the future for their students, and in this future high school, they think a lot about the economy and what they're trying to prepare their students to be able to do. Talk to our listeners about those three C's, and how you sort of got to that view of what the economy might look like in 2040.

Jim Tracy:

Sure. Well, first of all, it's really important for people to remind themselves, and we all know this, but I think day to day we tend to reify the worlds that we inhabit and forget that what we see is actually just a sort of strobe-light image of a very fluid process, historically. In other words, people tend to think, okay, this is the way it's always been. But if we think about it even for a moment, we realize the type of economy that we inhabit today, the type of school systems we see, are just historically constructed. They had a long evolution to get to where they are, and they're going to continue to evolve. So it's all fluid. And so what we do in the book is we first look back and just remind people that, you know, we were a largely agrarian society as recently as 150 years ago, in America and around the world. And then we became a predominantly manufacturing society. And more recently, in the last half century or so, we've become a predominantly service-oriented society, where most people are giving each other haircuts or legal advice and not actually involved in physical labor to provide basic material needs. So what is the next step, as most of those service industry jobs start to be roboticized, or at least bot-ized, right? Even lawyers now are up against increasingly intelligent machines that are going to be doing more and more of their high-level clerical functions. So we think that the best way to approach this, in terms of looking at the future of work, is to project out 10 to 20 years and ask ourselves a very fundamental question: what is going to be the uniquely human contribution to collaborating with intelligent machines, what are going to be the things that only humans can contribute to that collaboration, and we do see it as collaborative, and then reverse engineer those skill sets to how we can educate people into the jobs that will be there in the future. And what we saw was a set of three core categories, and they were the three C's: the creating, cyber-curating, and caring jobs of the future. That will be the three C's economy. And just to go through those quickly: I don't think that the creating jobs need a lot of description. We already have a creating sector in our economy. Some of them are very highly paid and remunerative, you know, rock stars; others are very, very low paid. But there is a spectrum of people who make their living through their creativity. The cyber-curating maybe requires a bit of a description. We have that a little bit now with, for instance, cybersecurity; there are lots of openings for cybersecurity. I did a search on cyberseek.org recently, and according to CyberSeek there were somewhat north of a million people in the country currently employed in cybersecurity. And yet, what was interesting was that there were more than 700,000 current openings, so nearly 70% again in openings, compared to the number of people who are working in cybersecurity. So that's clearly within that cyber-curating realm. But we see a broader spectrum of cyber-curating roles beyond just cybersecurity; there will be people who will be pre-programming value inputs into algorithms before the algorithms are actually up and running.
And what they'll be pre-programming will be social values, right? Making sure that the algorithms, which are increasingly intelligent, ergo increasingly opaque to us in terms of how they crunch so much data and get to their conclusions, will be informed by structured values, so that they will approach the challenges that are given to them, in their black boxes, with certain parameters for what we want to be salutary social outcomes. On the other end, of course, there will be people who will be increasingly looking at the outputs from those algorithms, who will query them for whether there are implicit biases or undesirable unanticipated outcomes. And that will sometimes be a fairly subtle determination. You know, it might be that there's a better way to route traffic more efficiently for the city, but you'll need a human who will recognize that, oh, gee, this is going to disrupt a historically cohesive community. So I think that all of those are examples of how humans will continue to collaborate with these increasingly capable algorithms. And then the third area, which we think is probably going to be 80 to 90% of the workforce in the future, is the caring economy. There is an almost unlimited need for humans to care for other humans, and we think that this is going to define so much of the human-centered workforce of the future, including, for instance, doctors. Doctors will find that the best diagnostician 20 years from now is going to be a computer somewhere in the cloud. And so they're increasingly going to be trained to understand enough to be able to collaborate with that diagnostic computer, to be able to explain to people what the diagnosis is in plain English, but they're not going to need to be the content knowledge repositories that they are today. They're going to have to have literacy in the medical speak, but they're going to have to have fluency in the caring-for-humans category. So they're going to be selected from the perspective of, yes, they have sufficient intellectual capacity to get literate in the medical content, but what is really going to separate the people who get accepted into medical school from the ones who don't will be their empathic capacities, we think. And then the training will actually enhance and augment and underscore their empathic capacity, so that they are caring physicians who work between the computer that is doing the diagnosis and the patient who needs to be guided compassionately through their healing process. And that's the caring economy, from social workers to doctors.

Alexander Sarlin:

One interesting moment in Running with Robots is when you talk about the mental health counselors, and how in the school of the future there are both human and AI mental health counselors, which is clearly a profession that would be considered within that caring economy. But you really are nuanced about it, and you mention how some students prefer the AI guidance counselors and mental health counselors, because they're very objective, because there are no sort of interpersonal issues at all. And I thought that was a really interesting take on one of the sort of nuanced ways that this caring economy may work, between the AI and the human sort of facilitator or the human expert.

Jim Tracy:

It reminds me of a New Yorker cartoon I once saw, where the patient is on the couch, and you're kind of looking over the shoulder of the Sigmund Freud figure, and what he's really writing in his notes is, "really screwed up." And so, you know, Greg actually found an article that showed that a certain percentage of people who were interacting with a very nascent, you know, early-stage, primitive prototype of a bot psychologist, people who in this study were interacting with it as a test, some of them actually said that they always felt that human counselors were judging them, and they felt more comfortable with the bot.

Alexander Sarlin:

It's really interesting. And, you know, that's something we see in education as well. If people had a nickel for every time a student has said, my teacher hates me, or my teacher likes this person better... For all the amazing things that educators do, there are also always, sort of inevitably, interpersonal dynamics and conflicts. And I think it's really interesting to think about that caring, and that sort of combination of diagnostic AI plus a caring professional, a really sort of empathic human, in education.

Greg Toppo:

I want to add one more thing, actually two more things, to that, just to round out the conversation. In terms of the caring piece of this, you know, we always look at jobs from the point of view of, what are we going to need people to do? What's going to be our need as a society? And I think one of the things that, to me, is just as interesting to look at is, well, what's the job going to be like for the person doing it? You know, we very rarely think about work through the worker's eyes. And to me, the three C's economy, in a way, lays out a more satisfying paradigm. In terms of the caring thing, I was at the doctor a couple of weeks ago, and I can't imagine my doctor had a satisfying interaction with me. He basically sat me down, you know, on the paper, and said, How's your cholesterol medicine? That's good. Okay, great. How's your heart? Great. Let's listen to your heart. Okay. Good. Everything's fine. You're exercising? Yeah. You're eating well? Eating well. You know, this is somebody who spent years and years in medical school, and sure enough, it's like 98% case management. I can't imagine that was a satisfying, you know, four minutes with me, and it's something that he doesn't have to be doing. And by the way, on the flip side, when the book talks about AI psychology, to me it shows the real need for good cyber curating, because I think a lot of people would probably be fine with an AI shrink, but they are nervous as hell about talking to, you know, their phone and not knowing where this information is going, when they're talking very candidly about all the things we talk about in counseling. Just think about if you could get to a point where people trusted the AI implicitly, how amazing that would be.

Alexander Sarlin:

Yeah, it's a great point. You know, we're in a week where I saw a really interesting comment on Twitter about how, this week, Amazon bought One Medical, which gives them medical records for a lot of people, and bought iRobot, which gives them the home layouts of millions of people. And, you know, there's 23andMe. There is a little bit of a strange dynamic that comes with medical tech, how you give your information, your most personal information, I mean your DNA, your medical records, to companies, and then they can get bought up by bigger companies, and suddenly you're a cog in the machine. And I'm not the most privacy-oriented, paranoid person, but if I were even a little bit more so, I would see this stuff as pretty scary. You know, Google has my DNA records; maybe they're gonna clone me someday.

Greg Toppo:

Once they have a picture of your house, with your dog, courtesy of the upright vacuum cleaners?

Alexander Sarlin:

Exactly, exactly. And they have the Alexas and the Siri machines in the house. You know, it's a little bit spooky.

Jim Tracy:

It all becomes integrated, right? So then the Roomba will be able to tell if there are lots of children's toys, so they know that you have children, and then they market to you products geared toward your children. It all becomes fully integrated as a sort of consummate consumer push. In terms of the doctors, you know, I remember I was at an airport one time, I think it was LAX, and there was a big medical conference in town that people were leaving from. I was flying back to Boston, so there were lots of these doctors waiting for the Boston flight. And one guy who was sitting next to me was talking to somebody, bragging: I've gotten it down to five minutes per patient, although, you know, sometimes you get talkers. And I thought, those "talkers" are people who actually want to know something about their medical condition from their doctor. And I thought, you know, this guy was selected for a certain set of skills, but those skills did not include caring at all about his patients. Of course, we'd still have jerks in the future. But for the most part, what would it look like if we selected people to enter medical school based upon their empathic capacity, and then trained them to be even more robust in that regard?

Alexander Sarlin:

I can't help it, I always bring it back to education. But I extrapolate that out to college professors. I think, you know, we have an interesting system in higher ed in the US where we select professors based on the depth of their research, and their publications, and their tenure, and their value to the field, all important things. And then we put them in front of 500 students trying to learn the basics. And it's such a strange mismatch of skills; you know, the idea that professors are always trying to get out of their teaching load, and you're like, oh, right. But I want to ask one more question about something. It's pretty philosophical, but I thought it was really interesting in the book, and then we should finish up. This may end up being a two-part episode, because there's so much interesting stuff in here. There's a piece of the book that talks about orality and the oral tradition, and how, in the modern tech world, starting now but especially as you extrapolate out to 2040, text, you know, regular readable text, starts to take more of a backseat to people communicating orally, and in some cases visually. I'd love to hear you talk a little bit about that idea, and let's go from there, because it's subtle, but it's very interesting. So, Greg and Jim?

Jim Tracy:

I argue in the book that humans, in their social life, their social experience, were an orally based civilization until about 5,000 years ago. So however many hundreds of thousands of years Homo sapiens have been on the planet, it's only in the last 5,000 years that we became an increasingly textual civilization, because before that we didn't have texts. You know, we actually arbitrarily mark the beginning of what we call civilization at the development of the written word, the written hieroglyphs of Ancient Egypt 5,000 years ago. And so if you think about that 5,000-year period, we've been steadily moving away from an oral culture and more toward a text-based culture. And that reached new heights of textuality when we started to insist that everybody become literate, which is a much more recent phenomenon. Maybe the first time that any society reached more than 90% literacy was probably colonial New England, and that was in the 1600s. And then, of course, in the 20th century, America became the first society to require that effectively everybody continue through secondary education. So this textualization of society has been a very recent phenomenon. And what's interesting is that we think that, with the electronic revolution, we're going to revert back to a visual and oral culture that is much more immediate and organic, of people speaking to people. But it's going to be not quite identical to Demosthenes speaking to the people of Athens in the fourth century BCE; it's going to be Alex Sarlin having a podcast that reaches billions. It's a return to the kind of organicity, if you will, of an oral culture, I think.

Greg Toppo:

The idea, in terms of the implications for schools, is that they need to take that seriously. They need to think about, in a culture that's increasingly emphasizing orality, what are the things that students need to be able to do? You know, it's not enough to be able to just write a six-page term paper with footnotes in the proper citation style. They've got to be able to make presentations, cogent presentations that hold an audience for a certain number of minutes. And they've got to understand, you know, the beginning, middle, and end of a narrative, and so on. So the places that do this now, I think, are pretty exciting places. And it would be, to me, really promising if other schools learned from those folks' model.

Alexander Sarlin:

There are a couple of edtech tools that this conversation brings to mind. One is Flipgrid, which basically tries to make video, and speaking on video, the main medium of communication. Microsoft bought Flipgrid a few years ago, but it's really an interesting way to change the conversation, literally, between students and between students and teachers into an oral one. And the other is, you know, we recently spoke to the CEO of a company called Immerse that I just find so interesting. It's a metaverse-based language learning program, where basically people go in and walk around with other people who are learning the language. They're starting with Spanish and moving to other languages. And they go to the library, or the supermarket, or the DMV; they play games together, they work together, and they're all, you know, using Spanish the whole time, and it's in the metaverse. There's almost no text at all, probably a little bit, but compared to the textual way that much foreign language is taught, it's just so compelling. It's game-based, it's oral, and it feels much more immersive and similar to how you'd actually use the language. And, you know, that leads me to my last question. I know I promised the last one was the last question, but there's just so much to talk about here. One of the technologies that's growing enormously in interest right now, and nobody knows where it's gonna go, is this concept of an educational metaverse. We're really starting to see it sort of tip right now, and again, we'll look back in 20 years and see if it actually did. But Greg, I'd love to hear your thoughts on the educational metaverse. You've written a book on game-based education, which is part of that world. What are your predictions for the educational metaverse?

Greg Toppo:

To me, it's interesting you mention the language piece of this, because, you know, I think back on my junior high school and high school language classes, and what they were really trying to do, at the bottom of it all, was a simulation, right? I mean, it was, you know, "Hello, Greg, what's your name?" but in Italian, sort of simulating me meeting a new friend in Rome or whatever. And it was a terrible simulation. But it was a simulation nonetheless. You know, here we are in the cafe, here we are at the school...

Alexander Sarlin:

You could order pizza perfectly, and you wouldn't get any pizza. Right? So that's no fun.

Greg Toppo:

I mean, I guess just as background, we've been trying to do this, you know, for longer than the three of us have been alive. And I guess we're on that continuum of trying to just make it actually good. So to me, it's fascinating. I do think, you know, we've already seen lots of starts and stops and kind of weird efforts that went nowhere. I mean, one of the things that I was very conscious of when I wrote the book about games, which came out in 2015, was, you know, in five years, how many of these games are even going to be around anymore? And that was just sort of the cost of doing business; I had to write about products, and products have a sell-by date. I do think it's really cool and really promising. To me, the number one caution that I am always thinking about is just the privacy question. You know, how are we going to safeguard students' privacy and their data? I don't think anyone's figured that out yet, and I think that could really bring down a lot of these efforts. But I'm really excited by some of the stuff I've seen. I haven't seen this language learning metaverse app, but it sounds really cool.

Alexander Sarlin:

Yeah, it just excites me for exactly the reasons you said. It's like, oh, what if language is all about simulated experience? Why not simulate it in a way that's actually, you know, better, where you can talk to people who are actually native speakers? There's all sorts of opportunity there.

Jim Tracy:

Very, very cool.

Alexander Sarlin:

So we end the podcast with two questions. One is: what do you see as the most exciting trend in the edtech landscape right now that you think our listeners should keep an eye on? Let's start with you, Greg.

Greg Toppo:

I guess, to me, we're getting much, much more sophisticated about what actual learning looks like. I mean, when I think about the conversations that I was having six, seven, eight years ago, people were just starting to talk and think about what learning looks like, how to break it down, how to apply it; these are conversations I would have at places like Games for Change, and it all seemed very elementary in some ways. So I think we are much, much more sophisticated about that now, and I'm really excited about those conversations. In terms of leading to actual innovations, I'm very skeptical about them leading to something right away, but for the long term, I'm excited. I think in five years, ten years, ed tech will be in a better place, and we'll also be asking more interesting questions in education; the broader education world in general, I think, very rarely asks very interesting questions. And, going back to the privacy thing that I talked about earlier: I really do think if we don't figure that out, nothing will matter, because, and this is the bigger point of our book, nobody will trust the system. And no matter what incredible thing you create, nobody will want it.

Alexander Sarlin:

Jim, how about you? What is the most exciting trend you see in the edtech landscape right now?

Jim Tracy:

Going back to, you know, how we came up with the digital library at Cushing Academy: we did it by observing where the students were already going, and then we tried to meet them there and fortify what the students were intuiting in terms of their digital space, fortify it with what we felt was perhaps a more robust, educative context, right? So meet them in the digital realm where they were, with the tools they were already using, but give them perhaps more peer-reviewed databases, searchable databases that would be more reliable for their purposes. And so I would say, in answer to your question: don't ask me. Go observe students, see where they're going, and meet them there.

Alexander Sarlin:

I like that answer a lot. This is maybe not what you meant by it, but the first thing that comes to mind when I hear that is TikTok for education. You know, when you think about students voting with their feet right now, that's where their feet have been going. And it's been interesting to see this organic education arm of TikTok emerge, though I haven't yet heard of it going anywhere that exciting. Or maybe YouTube, you could say, too; there's really interesting stuff there, where the kids are hanging out virtually. And, you know, our last question is always: what is one resource you would recommend? But for this one, I'm just gonna go ahead and say the resource is Running with Robots: The American High School's Third Century. We also mentioned a couple of different books during this conversation; we'll make sure to have them all in the resources. That's the Brynjolfsson book, The Second Machine Age, and a number of different books. If you guys want to add one more resource, you can. Is there anything else, from the huge bibliography in this book, that you would recommend for people who want to go even deeper after they read Running with Robots?

Greg Toppo:

I mean, I can give a shout-out to Audrey Watters' book, Teaching Machines. I'm just a big fan of that. Actually, as it turns out, no overt promotion here, but it's also an MIT Press book. It's really wonderful. If you haven't read it already, it's a quick read: Audrey Watters, Teaching Machines: The History of Personalized Learning. That's kind of a must-read at this point.

Jim Tracy:

I would also recommend a really brilliant book that's seven years old now, but as relevant as ever. It's a remarkable book called The Game Believes in You, by a brilliant author, Greg Toppo. Genius. Still relevant. Really, really grateful.

Alexander Sarlin:

Fantastic. Someday I'm gonna get Audrey Watters on the podcast. I don't know if she and I agree on literally anything, but I think she's the most interesting, you know, edtech skeptic, and such a thorough researcher. I'd love to hear her perspective, so that would be a fun conversation, too. This has been such a fascinating conversation, and, you know, I feel like we could just talk for four more hours. There are so many aspects of this book we haven't even touched on. They go to math class in the book; we haven't even mentioned math. There's this Hemingway piece that by itself is a whole conversation. I can't recommend highly enough Running with Robots: The American High School's Third Century. Greg Toppo and Jim Tracy, thank you so much for being here with me on Edtech Insiders.

Greg Toppo:

Thank you for having us. It's been really fun.

Jim Tracy:

Thanks for all you do. Thanks for the service you give to the community.

Alexander Sarlin:

Thanks for listening to this episode of Edtech Insiders. If you liked the podcast, remember to rate it and share it with others in the edtech community. For those who want even more Edtech Insiders, subscribe to the free Edtech Insiders newsletter on Substack.