aiEDU Studios

Dan'l Lewin: Reinventing education in the AI era

aiEDU: The AI Education Project Season 1 Episode 15

Have you ever talked to an architect of the digital revolution that shaped our world?

We spoke with early-stage Apple alum Dan'l Lewin, who took us on a remarkable journey from the dawn of personal computing to our AI-powered present while offering rare insights as someone who helped bring the first computers into America's classrooms.

Growing up in upstate New York with a second-grade teacher who taught him binary math, Dan'l's path led him to Silicon Valley in 1976, where he ended up working next door to Steve Jobs and Steve Wozniak. After joining Apple during the development of the Lisa system (precursor to the Macintosh), he spearheaded their strategy to introduce Apple computers to universities before they conquered the broader market.

Our conversation with Dan'l explores how computing evolved over two distinct 25-year periods: 1975 to 2000, when computers optimized rational tasks with limited connectivity, and the post-1997 era, when the web transformed everything into interconnected systems. Dan'l talks about where AI fits in this historical arc and suggests that, much like early computing, AI's initial impact will mostly happen behind corporate firewalls before reshaping society as a whole.

Dan'l also examines what AI means for human learning and development. He presents AI as potentially "a personal GPS for every learner" that could reroute students when they make errors, but also worries about what happens to deep thinking in an era of instant, surface-level answers.

For educators, technologists, and anyone concerned about our collective future, our conversation with Dan'l offers a perspective from someone who has witnessed (and shaped) the way technology has transformed how we learn, work, and connect. 

Learn more about Dan'l Lewin:



aiEDU: The AI Education Project

Alex Kotran (aiEDU):

Oh is it? It's Daniel.

Dan'l Lewin:

Daniel is my given name, yeah.

Alex Kotran (aiEDU):

Yeah, yeah, an apostrophe-L, that's right. You're the only Dan'l.

Dan'l Lewin:

I know, my mother's gift to me is a talking stick. I like to say that I'm vowel-challenged. That's to the eye, though, not the ear. It's really just a contraction for Daniel and a family nickname that my father earned at one point.

Alex Kotran (aiEDU):

I'm not going to do a very good job giving your bio. You've accomplished and done a lot. What really sparked my curiosity to have a longer conversation with you was hearing some of your war stories from your time at Apple, where you, as I understand it, were leading their strategy to bring computers into the education sector, at a time when there were no computers in schools. I remember the first time my school had computers, it was iMacs. That was the initial impetus for this conversation. But before we dive into that story, and anything else that you're willing to share with us, maybe you can give us whatever your version of the Dan'l elevator pitch is, just your background, your interesting story.

Dan'l Lewin:

So yeah, long story short for me is I grew up in very western upstate New York and, with hindsight, look back very fondly on the educational system that I got to work through, the public school system, and in particular a second-grade teacher who, when I was seven, taught us binary, and it was easy for me, it just happened. When I came to Silicon Valley, there was a lot that occurred. I had a very atypical family life. I attended Princeton and found out very quickly that I was not a mathematician, like within a matter of minutes of arriving on the campus, and ended up studying and spending my time in the politics department, based upon some professors that I met who believed that politics, as they called it, as opposed to political science, was best defined as relationships between or among people, and that you could scale it structurally to organizations, and that organizations have power as individuals give up certain things to participate, but the collective, comparative advantage, if you will, of large clusters of people can accomplish great things. So I was moved by the social issues of the day and thought in the end that I would be a civil liberties lawyer. I focused on organizing principles. The 18-year-old vote was clicking in when I was in college; that dates me, if you want to look that one up. It was women's rights, civil rights and rock and roll that organized people toward social change and, as far as I was concerned, positive change. Anyway, through that educational system, in the end I became a very good student and decided to take time before I went to law school, lost a family bet and came to California, moved to Palo Alto in 1976.

Dan'l Lewin:

The people that I met as I arrived were all clustered in this large house on the top of a hill in Los Altos Hills overlooking the Stanford Research Park, and they were doing various and different things, eight or nine unrelated adults living in this large place. One was designing integrated circuits for Watkins-Johnson for guided missiles. One was in med school. Of the owners of the house, one was running the Kestrel Institute, which does a lot of crypto work for the NSA, among others, and a lot of math work, and the other was a woman named Peggy Karp, who was working on what became DNS. She worked with Larry Roberts. So, like, aha, where am I?

Dan'l Lewin:

And then, the long story short is, through a roommate in college I ended up going to work for Sony in Cupertino, and that happened to be a 600-square-foot office where the next-door neighbors, the very week that I started, became the people who left Apple's garage. The two Steves and a small scattering of others, Chris Espinosa and the rest of them that were just hanging around in the garage, showed up in that office space. I got to know them. I spent a couple of years working for Sony in the storage area. They were bringing out the three-and-a-half-inch floppy and storage kinds of things, and that eventually found its way into the Macintosh, as most people know, as I brought that stuff to Apple.

Dan'l Lewin:

But I became keenly interested in the broader implications of what was going on with microprocessors and computing, and in particular graphics. And at one point I interviewed to go to work for a company that had these half-million to three-quarter-of-a-million-dollar CAD workstations, because I was really interested in graphics and simulation and understood a little bit about microprocessors and all the rest. So the long story short there is I turned down that opportunity, just behavioral reasons on my part. They told me that I was the most qualified person that they interviewed, but I only had four years of experience and they needed someone with five years of experience.

Dan'l Lewin:

And I smiled and said, well, if that's the case, then I'm not interested in working for you or your company, because that makes no sense to me whatsoever. And they said, well, but, but. And I said, no, no, sorry. The good news is I went to work for Apple soon thereafter, and it was because you could kind of get your arms around what was happening. These personal devices were these worlds within which individuals were programming and doing interesting things.

Alex Kotran (aiEDU):

At this time, the vast majority of Americans do not have a computer, do not even have access to one, and maybe have never even seen one.

Dan'l Lewin:

No, this is exactly the case. I mean, you've got to realize, and I know that was a long-winded thing, there were 50 or so companies in Santa Clara County, more or less, that were doing microprocessor-based personal computers. Cromemco, and then Commodore and Atari, all the rest of them started to emerge. Apple was the one that was being run by real adults, if you will, in the sense that the board, the investors in Apple, brought in highly talented people from HP and other companies who knew how to build the infrastructure to scale that business. And Steve was, you know, the goose that laid the golden egg. He was this entrepreneur with all this energy that needed to be harnessed, and so they had adults harnessing him and doing all those things. And Apple became the golden child with the IPO in December of 1980.

Dan'l Lewin:

I went to work soon thereafter. I missed the IPO. So anyway, when I went to work at Apple, it was the Lisa system, which was the precursor to the Mac. Mac wouldn't exist without Lisa and all the work that was done by that team. That product schedule ran out, I don't know, another 12 or 15 months, because they made some fundamental changes after I started, and so I took my own time, with permission, and brought in people from the university community. I was posing the question to the executive staff of the company: why try and sell these systems to the Fortune 500 when it was all IBM mainframes and terminal emulation and color, because that was a new, new thing, these color terminals and all that other kind of stuff, and all FORTRAN programming, and Apple's all about Pascal and the rest? And I said, you know, the university and research community will buy these products. You won't have to sell them, because you're building what they invented into real products. But, you know, the company was not interested in that.

Dan'l Lewin:

The sneak preview program that I ran, we brought 90 corporations through. I wore a suit and tie and had a full-day briefing with corporate CEOs, it had to be the CEO and other officers, they all had to be corporate officers, and I ran about 90 of those programs over the course of a year. So I was in front of those people, interacting with them and then absorbing all that. In that window of time, Mary Kay Cosmetics came in. They were an early adopter of the Xerox Star, which is some of the first stuff that was kind of like a Macintosh and like modern-day computing, and Richard Rogers, who was Mary Kay's son, was the CEO. Steve Jobs came to that sneak preview and watched what I was doing and called me afterwards, and he hired me as well at Apple. So that was pretty straightforward. But he said, you should come to work for us. He had just taken over the Macintosh project and he said, we have a passion about what you're interested in, because I'd been writing these reports about, you know, the Fortune 500 is not interested.

Dan'l Lewin:

We should be thinking deeply about these other markets, for all these obvious reasons. The long story short there was I went and looked at what they were doing. There were about 10 people in the group in this little Texaco Towers office space that had been set up, and I looked at what they were doing, and they had a little blueprint to consider the higher education market. Joanna Hoffman had done that work. She deserves all the credit for thinking it through, but the work needed to be retooled and redone, because I was really smart about channels and distribution and how to figure those things out. Back in those days you could not mail-order a computer to an individual. You had to go to a retailer; it was against the law to mail-order. I mean, when you're asking, were there computers? No, there were no computers. There were very few in homes. There were hobbyists.

Alex Kotran (aiEDU):

They were kits, and that was basically it. I think right now about the exuberance around artificial intelligence. I was sort of working in AI like 10 years ago, and there was a similar level of exuberance, but it was very narrow, like here in Boston, or Cambridge actually, not Boston. And, you know, ChatGPT kind of massively expanded that, and almost everybody now is thinking and talking about AI, and there's almost a preconceived notion that the future is here. If you could put your finger on the moment where people started to realize, oh, computers are the future, this is what we need to be focusing on, it wasn't in 1980, it sounds like.

Dan'l Lewin:

Oh, it definitely wasn't. My interpretation of the evolution of personal computing to the mainstream, my view of that, comes from really desktop publishing and the notion of a graphics interface that human beings could reach into, and the idea that what you see on the screen you could print. So laser printing, if you will. So that's where Macintosh broke through and broke out. We had already left, Steve and I and others, Steve first, obviously, but then we left to start NeXT. But that was a moment.

Alex Kotran (aiEDU):

You went to NeXT? That's so cool.

Dan'l Lewin:

Yeah, and so that to me was really the beginning of it. The fundamental difference between what happened in the first 50 years of microprocessor-based computing and where we are today, in this window from 1975, that 50 years total, so the first 25 years and the second 25 years, is kind of the following: computers are really good at optimizing rational tasks, calculating numbers, putting characters on a screen, manipulating an image over time, those kinds of things. You put things into the system, and then they were stored and you could manipulate them. And there was no networking, there was no wireless, there was no way to connect. So communications were really nascent, all things considered. It wasn't until, you know, Macintosh kind of broke things out with desktop publishing. Then you had Microsoft and all the channel play through 1995, with the launch of Windows 95 and Office 95. That was a breakthrough. There was a cultural moment, all this stuff going on. Cultural, again, back to my earlier comments about women's rights and rock and roll. There were cultural movements that really occurred.

Dan'l Lewin:

1997 is when XML started to become a topic of conversation, along with browsers, going back to the NeXT machine on which Tim Berners-Lee wrote the browser and the web, if you will, on top of NeXT. It was '97 until the early 2000s, and that was where Yahoo and others started to take off, and then Google. You started to see the breakdown of the image of a page into these little components, XML components, and then you had these web standards to move those payloads around over a network that was basically ever-present. That started to turn the radio networks, the cell networks, into ways for people to connect. Then you started to get social media and those kinds of things to connect people. So that 25 years, from more or less 1975 to about the year 2000, ended with the Y2K run-up, with all the enterprise infrastructure and all the money flowing in, because that was the last time you could sell people things on a promise that, you know, the world would fail if not. So that 25 years was one thing.

Dan'l Lewin:

'97 with XML, yeah, rolling up into, you know, 2015, 2020, and today, that whole period has been about markets reducing to a unit of one, where I could market to you. But the way I marketed to you, the trade was your identity and your data in exchange for free access, other than your connectivity charge, basically. And it's the organization and the mining of all that data, and turning it around into a system using the techniques that you and others have been pioneering for 20, 25 years. But you now had cloud-based scale infrastructure, storage was effectively free, communications were ever-present, and everyone was walking around with a device, everyone had access to a device. You even had people like Warren Buffett buying into Apple. Why was it? See's Candies, Apple, you know, it's like, what do people buy when times are good and when times are bad? They buy chocolate. Do they have a cell phone? You betcha. So the cultural and social connections came about with the inverse of the business model.

Dan'l Lewin:

The challenge right now, in my estimation, is that, as has often been the case, the hobbyists are the early access point. Right now, that's the free-for-all of people running around on the web with AI and all the goofy stuff that's happening. The next phase, I think, will be behind the corporate firewall. There will be a lot of very focused and highly tuned and highly valuable uses of these AI techniques associated with enterprise value and efficiencies. I don't know what that's going to do to the job market and to people and all of that, but there's going to be a ton of that activity in the next five years or so. How that all turns into a use case for education and learning is an interesting challenge, because back in the day, the benefit of a Macintosh was in the university setting, which was the initial market entry, and then it went everywhere.

Dan'l Lewin:

After that, it was the vet professor, I mean, vet school is harder to get into than med school, and he was an evangelist for the company, because his point was, when he was in that school there was a small chapter in a large textbook about some feline distemper thing in cats, and here he was, 10 years later, practicing, and there's three textbooks on the topic, two journals, and this, that and the other thing. So the question is, how do you gain access to the information that you really need? So part of it was organizing information, right, and then obviously you had the web, and then you have access and all the rest. So breaking that down into, you know, what you trust and how individuals will find their way is going to be a big challenge when you start to think about the use of AI in education.

Alex Kotran (aiEDU):

Because it's interesting that you talk about this sort of digital transformation within these big bureaucracies. One of the questions that I sometimes get is, well, if AI is going to displace so many jobs and we have all this AI, I mean, the unemployment rate's at like 4%. It seems very clear to me that there's a disconnect between the capabilities of the technology and institutions' ability to deploy those technologies, because the bottleneck is no longer what can be done, what can AI do. It's more like, do we have the organizational structures, the people, knowledge and capacity, and also just the political capital?

Alex Kotran (aiEDU):

Yeah, and the regulatory framing and all the rest. And my instinct is that a big push will come whenever there's a recession. I'm not going to predict when that is, but whenever there's a recession, suddenly it becomes a top priority to figure out how do we do more with less, because we just RIF'd, you know, 10% of our staff. And I'm curious, what do you see as sort of the critical inflection points for computing and maybe the internet, where it pushed past all of that institutional morass in the past?

Dan'l Lewin:

Well, Y2K was, you know, one interesting moment for that, because people had sold systems into the enterprise, but there were competing stacks for the solutions, and so the interoperability of data became the big challenge. And there was a recession that occurred after the Y2K sort of glut, where everybody sold everything in to save the day. Then the enterprises turned around and looked at the vendors and said, now you need to make this stuff work together. And so the systems integrators did particularly well, organizing and bringing the data together so that it was interoperable. You didn't have the ISO stack; people were still fleshing that out, you know, in terms of the layers, and what Amazon was doing and what Microsoft was doing. And even in the PC industry in that period, there was what the ISA bus crew was doing, with Compaq in the lead and everyone else, all the clones and the BIOS that would allow Windows to run on a PC stack, and then IBM split and went with Micro Channel, a different architecture. That was crazy, and so they got left out in the cold for a little while, and that's when we started partnering with them, when I was at NeXT, because they were in left field. So these stacks just didn't work together, and that was a big challenge. The tools are much more sophisticated now, and the problems, I think, are more human-oriented today. They're more associated with individuals learning new and different ways to do work, and in most cases, using these tools, they're going to be highly optimized in what they can actually do.

Dan'l Lewin:

Just think, even back in the beginning of the PC industry, of the number of product managers required to bring a product to market, and the time it took, and the lead time for publications that were going to print on paper.

Dan'l Lewin:

I mean, you had magazines embargoed for six months because they had a 90-day print cycle to even get the magazines to market. So, I mean, the pace and scale of things is just lightning speed right now. And I think the bigger problem right now, at the broad level for society, is how will people learn how to live in those systems and use those systems and deploy those systems. And it's always generational. So there's the question of the technology taking a decade; back in the day, it would take a decade, and now you can see the same thing occurring in 24 months or less sometimes. How will people adapt? And yeah, unemployment is 4%, but the number of people who are unemployed, who are mid-level managers in the tech sector right now, it's huge, just from the qualitative data of being in the Valley and in San Francisco.

Alex Kotran (aiEDU):

You know, like two years ago, someone would get laid off from Meta and they would have two jobs lined up, and it's like, wait, I'm getting severance and then I'm starting a second job right away, and the question was, oh, how much funemployment do I get? And now I have friends who are eight months out of work, great resume, blue-chip big tech background.

Alex Kotran (aiEDU):

And that seems to actually be, I wouldn't say it's the rule, but it is sort of this thing that's percolating, and it isn't necessarily reflected in the data, and I don't know that it's AI. I mean, I think that was just, interest rates were really low for a while, and there was a lot of maybe overhiring. But there is this odd confluence of companies saying, oh well, you know, we're now AI first, and a quarter of the code that's written at our company is written by AI. So I think a lot of that is actually just trying to signal to investors who are chasing anything AI. But my instinct is there's something to it. AI is really good at coding, and then it's that much harder to get a job as a software engineer, like, there's something there.

Dan'l Lewin:

I totally agree. I have personal experience within my family, knowing highly talented kids and how long it's taken them in between jobs. And it isn't like I don't have access to people who will say, sure, I'll take their email and I'll hand it off to the most senior recruiters that we have. They're highly capable, but they're not hiring. And some of it is just a retooling, and some of it is, like you said, the run-up for a period of time where people overhired, right.

Alex Kotran (aiEDU):

One of the common refrains is that AI is going to create all these new jobs, and look back at past technology revolutions, industrial revolutions, they all coincided with increases in employment. I mean, surely computers displaced a lot of the tasks that people were doing. If I think about before Microsoft Excel, the fact that people were handwriting spreadsheets is wild to me. I think there's also a challenge now where companies are trying to figure out at what point do we start to raise the bar of expectation, given that we know that employees can now do more with less, but it's uneven in terms of which employees actually have access or have the training or even the capacity to learn.

Alex Kotran (aiEDU):

And I'm curious, with computers, how that was navigated, where, you know, as these tools were starting to become more common, they weren't necessarily in front of every single employee. At what point did companies say, well, you know, the bookkeeping stuff that used to take a week now takes a day, and that's just the expectation? Was there a moment, or did it just kind of happen gradually?

Dan'l Lewin:

I think it just happened gradually. You know, I remember the cover of a BusinessWeek magazine saying, you know, the paperless office of the future. That was 1975. It was on the verge of not using paper anymore because you'd do things electronically. I think a lot of this was just, again, generational. I think it was a slow and gradual and systemic process change, and then before you know it, you've got the next generation coming in. Spreadsheets and Wall Street, you had that whole run-up with Mike Milken. It's like, without spreadsheets they wouldn't have been able to do all that work. It wouldn't have been possible. So structural change, and the economy radically evolved. I mean, the financial services industry, go back to the 70s, was certainly sub 10%; it's 20% of the economy now. So the world's a bigger place now. The marketplace, Thomas Friedman, The World Is Flat, all of that allowed for structural change to occur, efficiency to occur, corporations to rise up to scale, and what we've witnessed, obviously, is the tech sector turning into this.

Dan'l Lewin:

As I said, you know, in my time at the Computer History Museum: life doesn't exist without computing, period. When did that happen? Slowly and suddenly. And what were the motivating factors for that? Communications, the beginning of cell phones. I mean, I was on a board with a guy who had been at Motorola, in the executive suite, who, when they brought in the first phone that had a camera in it, laughed and said, who would ever want a camera in their phone? He said, I was that guy. So everything built up as a result of these communicators, right? This goes back to Star Trek in the very beginning. Many of the people who built the industry as we know it today were Star Trek fans, right? And it was Alan Kay in the very beginning of it all, right, saying, okay, here's the way we'll likely evolve, and it did.

Dan'l Lewin:

I forgot about the education component. The Apple reaction to us leaving Apple was the creation of the Knowledge Navigator video, to show that the company had foresight for the way things were going to evolve, and that's pretty much what we have now. I think we're in for a very different place. I think the credentialing associated with higher education, just even the structural stuff that's going on in the world today and some of the pressures, US politics aside, but just higher education and credentialing and learning, those kinds of things. We're seeing more around badging and credentialing, where I can go learn these things and I can prove that I have these skills, and that's very, very doable. Some of the very early people in the computer industry that I knew didn't have college degrees, but they were really good at what they did because they dove in. I think this is just going to happen a lot faster.

Alex Kotran (aiEDU):

A lot faster, because past industrial revolutions, past technology revolutions, had these very slow physical bottlenecks. You had to literally make sure that you were buying computers, and then you had to hire these system administrators, and you've got to get the interoperability. Sometimes I think it's a bit of a vanity metric when people talk about ChatGPT reaching, I think it was, 100 million users in a month, or some crazy number. But I think it's also instructive that we have basically been building to this moment, the fact that we now have widespread access to supercomputing over 5G broadband speeds. I think the key question is, is education able to retool at pace, or ahead of industry? Based on what you saw in terms of the education system's ability to adapt, surely there was a lot of change that came with computing.

Dan'l Lewin:

What are going to be the big challenges? Because the educational system has been set up, like most regulatory systems, to resist change for the most part, and there's the generational evolution of how and what instruction is delivered and what information is presented. What AI presents is a personal GPS for every learner. Right, you make a wrong turn in your car, what happens? Reroute, right, because you want to get to a destination. If you get 90 on your algebra test, that's an A, but that last 10%, why weren't you rerouted, and why didn't you get a hundred percent? Because the system could easily reroute you to find out what those little things were that you should know, which maybe would make a difference five years from now, or at some period of time in your life, you know, or for your credentialing for some opportunity of work of some sort. So that's where the tools can come in. How the system will deploy them, and whether the teachers and the structure of the teachers' unions would allow for those things, that's where I'm concerned. I think that's my big worry. So you do see alternative approaches in some of these different types of schools, and I'm hopeful for some of those, but at a wholesale level.
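As an aside for readers, here is a minimal sketch of the "personal GPS for every learner" rerouting idea Dan'l describes, assuming a hypothetical mapping from quiz items to skills and an arbitrary mastery threshold; the names and numbers are illustrative only, not anything discussed in the episode.

    # Hypothetical sketch: instead of stopping at a 90% score, map each missed
    # item to a skill and "reroute" the learner toward skills below a mastery bar.
    from collections import defaultdict

    def reroute_plan(responses, item_to_skill, mastery_threshold=0.8):
        """responses: item_id -> correct (bool); item_to_skill: item_id -> skill."""
        totals, correct = defaultdict(int), defaultdict(int)
        for item, is_correct in responses.items():
            skill = item_to_skill[item]
            totals[skill] += 1
            correct[skill] += is_correct
        # Flag any skill whose accuracy falls below the threshold for review.
        return {s: round(correct[s] / totals[s], 2)
                for s in totals if correct[s] / totals[s] < mastery_threshold}

    # A 10-question algebra quiz scored 90%, but the single miss sits in one skill.
    responses = {f"q{i}": True for i in range(1, 11)}
    responses["q8"] = False
    item_to_skill = {f"q{i}": "linear_equations" for i in range(1, 8)}
    item_to_skill.update({f"q{i}": "factoring" for i in range(8, 11)})
    print(reroute_plan(responses, item_to_skill))  # {'factoring': 0.67}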

Dan'l Lewin:

When you break it down to the state level and how our systems are administered, it's hard to be really optimistic. Because, I mean, the last major structural change that strikes me goes back to 100 years ago, which took us to the microprocessor era. And what was that? Well, human beings traveled at one horsepower. A significant percentage of the agricultural production in the United States went into feeding horses. People cleaned up after them, cleaned up after the horses, right. And then we had structural steel, right, because of Ford, and then eventually high-rises, and then aerospace and all the things associated with that. So that took a long time, but that was transportation, moving people around, and that fed industry and war and all the rest, all the military stuff.

Dan'l Lewin:

And obviously Silicon Valley exists because of the military stuff, because the Russians had better rockets and we needed to reduce the weight of our payloads, and so semiconductors got fed, and Shockley lived here because his mother was in Palo Alto. So there you go. So we're seeing now a very different change, because it's ever-present. People's lives don't exist without it. If the computers go off, we all die, and the indigenous people somewhere survive and there's a new world. So it's a very different change now than what we've ever experienced before, because we have this ubiquitous access and requirement that these systems function. And on top of that, we're being fed the data that we've given away in exchange for free access; we're being fed that back. So who's jurying, filtering, organizing and presenting that, in a way, through the structural systems that society has created for itself? In a world of agriculture, which goes back to that hundred-year-old thing, it was farming and horses, and that's what the educational system was set up to support, that means and end game.

Dan'l Lewin:

So that's where I seriously do worry, as a society, especially in an affluent society, as opposed to one where people are hungry. So there's the opportunity for the Africans, perhaps, right, and people. You will see more and more of that over time, and we're seeing it in other countries as well.

Alex Kotran (aiEDU):

Yeah, that's an interesting, almost contrarian take. I'm not sure if we disagree on the potential for the intersection of AI and education. I think what you described is sort of, I'm going to say, utopian, because I think it's actually really plausible. If you had the right expertise and systems to be able to implement it, no doubt AI could significantly enhance teachers' ability to personalize learning. You know, there are questions of, like, in Harry Potter there's the Sorting Hat, and there are other sort of dystopian novels where your job is dictated for you when you're born. I think about the world that you described, where, let's say, the education system has enough of your data to be able to personalize learning in that way.

Alex Kotran (aiEDU):

It seems at some point we're going to say well, it doesn't really make sense for us to have students apply to college because, first of all, they're using AI to write the applications.

Alex Kotran (aiEDU):

We have all this data. In fact, AI probably can predict, to a far higher degree of accuracy than some human reviewer, not just which college you should go to, but, frankly, which career you should pursue. You know, perhaps we put our foot down and say, well, that's a bridge too far. But the way slippery slopes work is, you know, you eventually do get to this place, and maybe that's just sort of the inevitability of this. And I think this is hard, because I don't know what your take on AGI is, but sometimes I feel like there's a rhetorical challenge. If somebody's assumption about AGI is that we're going to achieve AGI in five to 10 years, then a lot of these conversations are a bit moot. I try to assume that, like, if it comes, it comes, but you can't plan around what might actually be a really difficult gate, you know, for us to achieve a breakthrough technology.

Alex Kotran (aiEDU):

So let's just say we don't achieve AGI. Am I missing something there, in terms of, you know, once you start relying on these tools, it just feels very hard to go back? It's like imagining going back and using a paper spreadsheet.

Dan'l Lewin:

Yeah, I think the human component. Answering that question is complicated because of the cultural differences and societal norms. Who's it tuned for? In that sense, that's the one thing that I kind of struggle with. It's the self-driving car routine.

Alex Kotran (aiEDU):

You know and can you explain that?

Dan'l Lewin:

Well, yeah, in certain societies, if the choice is to go off the road and kill yourself, or the choice is to kill the baby and the mom walking across the street, where do you go? And, I don't know, was the car programmed in some country, in Korea, where it's different than if it was programmed here? So what are the choices?

Dan'l Lewin:

So I do think it gets into the kinds of things that I studied and that I've been reaching back and reading more of again. The cybernetics stuff, Norbert Wiener, right, human interaction with machines, starting to look at, you know, the sense of Western society and humans. If you go back to one of my first experiences, I had a professor and advisor in college who was a politics professor, but he was in the State Department, and he was working on Iranian affairs at the same time that the US government was working with the Shah of Iran and aiming to implement a social security system in Iran, because EDS had done that in the United States. And you're giving a number to every individual in a society that has no sense of I or self, because of the nature of their culture and the emanational nature of their belief systems. So, as these systems get grown up around the world, my curiosity is around how they will be tuned and housed and guided in a world where Internet protocol takes information everywhere in real time.

Dan'l Lewin:

So that's the hard part. Even in the United States, down to the 50 states and the different schools and the districts and all that kind of stuff, how will they be deployed? Who will be the judge? And that's the stuff, you know, when you pose the question of the next five to 10 years, what is that going to look like? I see smart people trying to wrestle with these questions, I get that, but I don't have a crystal ball.

Alex Kotran (aiEDU):

This is the trouble when you talk to people who are really informed: they don't actually make predictions. I've actually learned you can basically discount somebody's expertise if they're too sure of what comes next. It used to be you might say, like, the next 10 years; I think five years is actually unknowable.

Dan'l Lewin:

Yeah, exactly, I think you're totally right. You might be able to extrapolate. I mean, AI has changed the equation, I would say not a little bit but a lot. It used to be that, inside of a large corporation, having spent 17 years as an officer of Microsoft in the end, and having access essentially to all information, a thousand PhDs doing research, and seeing the way the modeling gets done and everything, you could sort of look at it and go, I might be able to look out 18 to 30 months. But then something comes in and it adjusts, and the big machines take a while to realign. But the pace of change is stunning right now, because again, we've reached the stage where every individual functions as a result of computing. Everything they do all day long requires it.

Alex Kotran (aiEDU):

So to me, the real question is, what does it look like once you have capable agents and agentic AI? And the challenge of this is, I think, simultaneously people are underestimating the scale and scope of the change, because they're sort of distracted by the individual widgets, and yet I think there is also a lot of overhype, of course. So there's that balancing act.

Alex Kotran (aiEDU):

I mean, and I guess, going back to personal computers and the internet, there was a hype cycle, and it would still have been correct to predict that this is going to be the future. The goal might not necessarily have been, let's go build a website right now.

Dan'l Lewin:

Maybe that's what you need to do. But I mean, in those days, and this goes back to some of my earlier comments, we were still struggling with the stack and the communications infrastructure, the cost of storage and the cost of this and that.

Alex Kotran (aiEDU):

In addition to all of the institutional bureaucracies that had to be changed, you still had all the same challenges you have with AI, but on top of the physical and logistical ones.

Dan'l Lewin:

Yeah, the logistical challenges of the industry, and the competing stacks and the competing approaches and the way those things were going to work, and the lack of interoperability. Now the abundance of data is the thing; the data is out of the bag, the genie is out. How it gets restructured and placed into society is the big challenge. And there weren't any real regulatory issues back then. I mean, IBM was held accountable as a monopoly, and Microsoft as well, in those periods of time, but that was very small scale compared to what the world is facing right now.

Dan'l Lewin:

And the structures that need to be considered for the social implications of these devices and what they can do and will do, whether we like it or not.

Alex Kotran (aiEDU):

Can you give me an example? Because, going back to this, there's a lot of, well, we have to harness the upsides of AI and minimize the downsides, and I think sometimes there's almost a generalization of what that means, even with AI ethics. You can go as far as to say, okay, algorithmic bias, sure. But can you help paint a slightly higher-fidelity picture of the types of things where, if it's not regulation, at least having standards and systems in place is going to be important?

Dan'l Lewin:

Yeah, you asked me a question in one of our exchanges about some of the readings and things that I find are informing some of my gut reactions to this. There's a book that Verity Harding wrote called AI Needs You, which is a really good assessment of three different technologies, the internet being one, IVF, in vitro, being another, that took 20-year arcs for society to absorb them and for there to be regulatory oversight in some way, shape or form, net neutrality, et cetera, one could argue.

Dan'l Lewin:

But she looks at these three things that took societal change and structural change and courage for them to materialize and become part of the fabric of modern culture, at least in the West. There's another book: Jamie Susskind wrote a book called The Digital Republic. It's very well organized into bite-sized chunks about structural and political change, and how the technologies can fit into that, and what kind of courage we need to deploy new structures and new regulatory frameworks. That is, again, generational change in governance, and the skills of people in government to be able to appreciate that and put those things to work. Because back in the day, the motivator was life-threatening, it was world war, and so those were the motivators, when we actually had science advisors and things like that.

Alex Kotran (aiEDU):

When people really organized to save society as we know it, right. I love your emphasis on history, as someone who studied history in college. And, you know, it's funny, my background is not technical at all. I have never written a program. My background was in political science, arts and politics. I accidentally fell into AI.

Alex Kotran (aiEDU):

I was working for an AI company doing ML in the pharma tech space, and did policy and comms work for that. So, there's a long story, but basically I landed at this AI company, like the first company to basically build language models and linguistics and predictive coding for the legal sector, and the CEO was this Nicholas O'Connor. He was sort of this visionary, almost like a philosopher king.

Dan'l Lewin:

And he was sort of intuiting this risk?

Alex Kotran (aiEDU):

Yeah, he was sort of intuiting this risk. I saw the technology being experimented with and deployed, including by our company, and there was no sense of what competence or accuracy or quality is. There were no guidelines as to who should even be equipped to ask the right questions, and certainly judges weren't. And so I started actually building AI literacy for judges, and I then discovered that our schools weren't teaching about AI, and I was like, well, okay, the future of work is probably also part of this.

Dan'l Lewin:

No, I didn't know that about you. I really appreciate where your questions are coming from now as well. When we rolled out the Macintosh, I had a consortium of 24 institutions participating in receiving the systems under this centralized pricing and give-and-take relationship, and one of the requirements was that they encourage faculty to do interesting things with the computers and then to share those little mini-programs and things among themselves. And it was a guy from Boston College who organized a book of stuff, and we printed it all. But for the distribution of that software, we cut a deal with Kinko's, because Kinko's distribution strategy was to set up shop next to the major research institutions, of which there are 200. And they would build these books, it was called Professor's Press, of chapters from various different publications, and snap together the book for this professor's syllabus for his or her course, and manufacture the floppy disks with these little quant-based things for the humanities, and all these little programs and things, and a programming language that the physics professor at Reed College built called Rascal, Rascal because of Pascal, all these things. And it all went back to using the existing infrastructure, the distribution infrastructure.

Dan'l Lewin:

The challenge now is that the distribution and the infrastructure for sharing information are ubiquitous, and everyone is a market unit to attack, and that's the difference.

Dan'l Lewin:

There were filters by which these things were delivered before, and now there aren't any. So how we harness those filters, and what kind of leadership and structures are being proposed, that's going to be the trick. And it isn't that there won't be these capabilities where you can look at the bright side. There will always be the downside, and anything that can be used at scale can be weaponized, right? And this can be weaponized by a person or a small group of people, in a very different way than, obviously, other types of weapons that require, you know, nation-state action and things like that. So that's the worry that I have. And the good news is there are a lot of smart people who are putting energy into trying to solve, or at least point out, the areas that need work and some of the structures that could be put in place. I sort of try and hunt down that type of reading where I can.

Alex Kotran (aiEDU):

Yeah, I think that is the glass-half-full argument for putting language models into the public zeitgeist. I've heard it sort of talked about as reckless, but prior to ChatGPT, I founded aiEDU in 2019, and I was doing the work in 2018 even before that. It was a very small world, and a lot of people did not take meetings with me who are now banging on our door. I have no issues getting meetings. It's more about where do I spend my time, right?

Dan'l Lewin:

What do you exchange? What questions are you asking? What are you looking for?

Alex Kotran (aiEDU):

And so there's tremendous power there. I mean, if you think about presidential campaigns, yes, it's hard to mobilize a really disparate, decentralized system like the US education system, but we do it every four years, twice every four years. With presidential campaigns, you're mobilizing, you know, about half of the electorate to actually turn out, and you have a very short timeline to do it. And it is not just about hiring a bunch of field organizers; that's part of it, and, you know, the Obama campaign did that really well, the Trump campaign really didn't, they outsourced it. Where I think there are common threads is there is this sort of centrality to, and simplicity to, the message. And sometimes I think people confuse message and policy. It's not always policy, but it's, what is the reason that it's getting you to pay attention to this, to take

Dan'l Lewin:

an action.

Alex Kotran (aiEDU):

And obviously Obama had that, obviously Trump has that, and I think AI actually poses this. The fact that now, you know, teachers are filling conference centers, hundreds of teachers, hundreds of superintendents are taking time out of their day. And we did one big AI summit in Cincinnati, and the Ohio Department of Education was there, and they said, like, we've never seen this level of excitement.

Alex Kotran (aiEDU):

I love that, yeah. And so the question is, how do we channel that attention to what's actually important, as opposed to just trying to sell something?

Dan'l Lewin:

You're on it, and that's good to hear. What you just said is important. In my last role at Microsoft, I did campaign technology for both sides of the aisle for the 2016 election cycle. I started in 2012, and I had a red team and a blue team, and understood and studied everything that the Obama folks did, because it was all Microsoft underlying technology but no Microsoft data. So we watched all of that happen, including what happened with Cambridge Analytica and Ted Cruz and all that stuff. We watched all of that, and I think the harder part is, it's a great analogy, but registered voters is a known list and they're all technically adults. So it's the children, the kids, who will have access whether we like it or not, and that behavior pattern.

Dan'l Lewin:

So I love the idea that you're spending time on that regulatory, statewide education conversation to get people to be thinking about this, and there will be good models that emerge, and then people will share them, and ideally they'll share them easily and quickly because of the infrastructure that exists. The challenge will always be, and this is what I experienced in my personal life because my kids grew up with it: my oldest was 18 months old when I had a Macintosh prototype at home in 1982, and so he grew up like a duck to water with that menu system imprinted in his brain, and when he got into, you know, junior high or middle school, whatever, where they had a little lab, he was controlling the systems and was getting in trouble.

Dan'l Lewin:

But he wasn't really getting in trouble, he was just doing the things that he knew how to do, and the schools weren't ready for that. So that's the one question I will have: what will the schools do when the kids come in, yes, more empowered? What will they do?

Alex Kotran (aiEDU):

I'm obsessed with this. By far the number one question that we get, and request for help, is, help us deal with cheating; all the kids are using ChatGPT. And, you know, I've heard some people in the space whose response goes something like, well, it's not cheating, these are tools, and if they don't know how to use the tools, they're going to be left behind. So it's not cheating, you just need to change what you're doing. I actually don't agree with that, because I think what teachers, they can't quite put their finger on it, but what they intuitively understand, is that kids today are already running laps around us. The idea that teachers are going to teach students how to use AI is ridiculous, let's be very clear. I just had someone from Stanford who has an AI makerspace, and the students at Stanford are the mentors for the faculty. That will be the model to the extent that we're going to be trying to figure out how to use the tools.

Alex Kotran (aiEDU):

But there's something sort of fundamental about part of school. And, like, I really didn't do very much in alignment with what I studied.

Alex Kotran (aiEDU):

I mean, I studied politics, but I really just chased the interesting professors. So I studied the history of Brazilian politics, part two, right, and did the same thing. The one thing I learned in school was just persistence. I wrote a lot, and so I spent a lot of time, you know, often the night before a big essay was due, having to sit down and push through the writer's block and get something written. And I really just wonder what it's going to be like in a world where, replace AI with, like, a really helpful parent, and let's say the parent is really good about not giving you the answer, not writing the essay. It'd still be like something would be off if somebody went through college, went through high school, and they always had their parent sitting next to them, like, oh, are you having trouble with that? Can I help you? Let's talk about it. You'd be like...

Dan'l Lewin:

No, I totally agree, you need to be by yourself sometimes. I totally agree that the notion of deep immersion and reading and deep thinking, sort of deep structure as opposed to surface structure, because I think with what you get back, you learn how to ask good questions and you get back interesting information, but it's surface, it's a surface level. It's not the deep structure, the underlying, sort of the Chomsky language-and-mind stuff, right? It's just not the same, and that's the one thing

Dan'l Lewin:

that I worry about. You're pointing out the same thing, and I had the same experience, the same exact example. I got a degree from the politics department, but I took five courses out of the department. Everything else were cognates that were tied into things that I was focused on: sociology of the family, the politics of the relationship between men and women, all this other kind of stuff, because I was trying to figure out how do you organize, what are the organizing principles for driving change in society.

Dan'l Lewin:

So I got to apply them to the computer industry in the early, early phases, and so that's what I look back on now and say, okay, what are those structural things that we need to be thinking about? And the reality is that, like you, I took one test in college, and other than that I wrote papers, and it caused me to think. And the last piece of work that I did for this professor, which was the last thing I did in my quote-unquote college career, I can't call it an academic career, but I got my degree, was: ask yourself three questions and answer them in no more than five pages. We won't judge what you ask or how you answer, with anything that you read or discussed in the precept, which was a small gathering, or from the lectures. It took me a month to think that through, because I knew that was the end game.

Dan'l Lewin:

It's like, what do I ask and why, and how am I going to express that in a concise way, rather than 50 pages on really hard problems? And so that's the thing, the time, the contemplative time, that's the one thing, and I don't know what that will turn into. It will just change the nature of the human being, and this is generational stuff. I mean, it's just going to change us.

Alex Kotran (aiEDU):

So I worry about that, like the digital divide. I was just talking to Tony Wan from Reach Capital, who also, interestingly, is in venture capital now. He spent 10 years building EdSurge, a sort of background as a journalist, and so he has a very unique take as a venture capitalist that you don't always hear. But basically, the wondering I have is whether the digital divide will actually look something like, the poor kids get all the AI, and if you're in a private school, you're reading, you're writing pen to paper, you have a teacher in your classroom. And sure, I think there's still AI there. I think the teachers are still using AI to maybe personalize. I think personalized learning, as you alluded to, is such an obvious use, especially if you think about students' social needs, certain types of learning and certain types of content.

Dan'l Lewin:

Right, subject matter.

Alex Kotran (aiEDU):

But the reason I'm focused on that is because, to the point about what it looks like to have a very clear and crisp message, what I have been really pushing and fighting for is AI readiness, not AI literacy. Literacy is how do you use a technology; readiness is how are you ready for the world, and that might mean math and reading and writing. It might not actually include that much AI literacy. And I go back to 2007, which is not even nearly as far back as you've taken us, but it would feel quite silly in 2007 to say the thing that schools need to focus on is mobile phone literacy. It's also correct to say that mobile phones changed the world and you couldn't do a job without a phone, but it's, like, necessary but insufficient, right?

Dan'l Lewin:

No, I'm with you. It's a question. The book Writing to Learn... I mean, we as humans have learned from taking a stick and putting it in the dirt, you know what I mean. And whether there's a divide, an economic divide, whether you have access to someone who can help you, with the nature of humanity as it's currently embodied in most people... And yet you get the email from the school. These days (my fiancée's got a son in high school) there's going to be a policy about cell phones, that they can't be on. It's like, all right, we know they shouldn't be on in school, but they are. I don't know. We're at an inflection point that is unique, and on the global scale, again, I look at it at a macro level: the cultural differences and societal norms, the rituals, the symbols, and the way in which the world will evolve. We'll know when we know, but it's happening faster than ever.

Alex Kotran (aiEDU):

Do you feel like... to go back to your conversation about access to information, I feel like we're actually sort of past the peak. If I had to guess, it would be around the 2012 timeframe, when there was a really rich universe of high-quality content. The publishing and news industry hadn't collapsed yet; social media hadn't quite pivoted to information, it was still friendships and social networks and connections. And now, today, I read some mainstream publications, but the time I spend is really on YouTube, and maybe a little bit of Twitter, or X. But everybody I talk to, their information silos are staggering.

Dan'l Lewin:

They're real, yeah.

Alex Kotran (aiEDU):

And so I always wonder, do we... You said we'll know when we know, but I feel like some people will know, and for other people it will just have already happened, and they'll be oblivious to it having happened.

Dan'l Lewin:

I don't disagree with what you're saying, and I think the timeframe you point out is very rational, because it was in that period that I started asking questions of the people in the group I was looking after inside of Microsoft. They were of another generation than mine, and I was asking them about Facebook versus LinkedIn. I mean, I went to the first public developer event that Facebook had. I was at Microsoft, I was here, and we had a little programming tool to help.

Dan'l Lewin:

And obviously, back in that day, it was your enemy's enemy is your friend, and so we made the investment in Facebook and all of that relative to Google and all these things. And Facebook was where you'd communicate only with the people you would friend. This is what people said to me: you're at the beach with your family in your swimsuit, and you would send those photos to the people you had friended on Facebook. That was the use case, that level of sharing and community and some level of intimacy and filter. And then LinkedIn was a professional framework, and that was it.

Alex Kotran (aiEDU):

And I'd ask you, as someone who is informed and thinking deeply about this, and I don't want to make it sound like other people are sleepwalking and I'm somehow smarter than them, because I'll be the first to admit it, and I'm curious if this is the case for you: I have started to become more and more reliant on the Google search synthesis, where, I mean, it gets it wrong a fair amount of the time, but most of the time it's right. And even when there's potential for a hallucination, I'm sometimes just like, well, let me just try it out, like if I'm troubleshooting something. So even for someone like me who is really attuned to this, my behavior has changed quite a bit. I'm curious, have you taken any... are there any meaningful or intentional things that you've done to try to ensure that you don't sort of haphazardly slide into... This is a rabbit hole.

Dan'l Lewin:

You mean inadvertently placing too much trust or reliance on AI? I may be more of a Luddite, almost, in that sense, whether that's the right phrase or not, I don't know. I tend to read more than most people that I know, and I just find it... I have a list of books that I've been going through, you know.

Dan'l Lewin:

I picked one out the other day: The Revolutionary Self, about social change and the emergence of the modern individual from 1770 to 1800. Like, what was going on back then, how did the modern human of that era emerge, and what was propaganda back then? And what's propaganda now? The systems we have right now, it's basically propaganda. So I try to, like you... I think what you just described is, if I'm searching for something, I'll try Google, I'll try Bing.

Alex Kotran (aiEDU):

You're not using, like, deep research?

Dan'l Lewin:

No, I'm not, and maybe it's just me, and maybe I'm stuck, I don't know. I've got about 500, maybe 700 old albums that I have digitally, and I like that music. You know, there's a certain... Well, it isn't that I don't want to explore other things, but that's where I spend the majority of my time. And then when I'm with my kids and my grandkids and stuff like that, that's when I listen and learn and follow their leads.

Alex Kotran (aiEDU):

For the last couple of minutes, to touch on your music: you don't use streaming services? You actually have albums, maybe they're on iTunes?

Dan'l Lewin:

Yeah, I mean, I have them digitally. I was an early Sonos customer. I've got a whole bunch of zones and all that kind of stuff, so I got a little server and put all of it up there. And there are streaming services that Laura will put them on as well, but I'll just... I like jazz, and I've got a bunch of old stuff that I really like, things like that.

Dan'l Lewin:

And, you know, probably Bob Dylan, and classic charismatic leadership theory, so the new movie is an interesting one as well, about that.

Alex Kotran (aiEDU):

So, anyway, what I'm getting at is that I also collected CDs. The best gift I ever got was when my cousin gave me this giant binder of CDs and let me rip them. So I had all the CDs in iTunes, and I sort of immediately teleported into...

Alex Kotran (aiEDU):

You know, indie rock was sort of the general theme, and I by far had the coolest music taste of all my friends. I was listening to Radiohead and the Flaming Lips and John Coltrane, a lot of jazz, a lot of avant-garde, all of that.

Alex Kotran (aiEDU):

Looking back, that was by far the golden age of my music taste, and it still sort of informs my taste today. Today, my ability to conjure up a song or an artist that I like has diminished so much, because the unit of measure is no longer an album; it's a radio-style playlist that an algorithm creates for you. And it was sold to us under the premise that this is going to help you discover new music, but actually, and I've talked to enough people who have experienced the same thing, you're just tapping into a sound stream.

Dan'l Lewin:

Yeah, it's not the same. You don't have any...

Alex Kotran (aiEDU):

You're not invested in it, because you haven't... It's not even about buying the album.

Dan'l Lewin:

It's like listening to an album all the way through. It's the surface structure versus the deep structure, the same thing for me. That was, again, a takeaway; these are a couple of reads from school. Language and Mind, you know, Chomsky's stuff, was really fundamental. The Structure of Scientific Revolutions, sort of structural change, which is Kuhn's book, are you familiar with that? I'm sure most people are. I think he wrote that in '72 or something like that, and I was in school starting in '73, so those were fresh, and the Wiener stuff was older.

Dan'l Lewin:

But information theory, some of those things were really fascinating. What you're getting at is the notion that you actually invested enough time to go deep, and so, as a human, something locked in: you learned something. And that's a structure from which, and a filter through which, you see the world, and that's what's changing in a really unpredictable way. And so the question, and that's why I err on the side of caution, and why I want to learn a lot more about what you're doing as well over time, is how the societal structures are going to filter and what we will be trusting. And so your point: is it an economic divide, where for some there's a level of human engagement, a level of humanity, or will it be more dystopian? Or will we just be skittering around the surface like a water bug?

Alex Kotran (aiEDU):

Yeah, I don't know. Like, I...

Alex Kotran (aiEDU):

I spent enough time around, you know, the World Economic Forum, the UN, the World Bank, all these sorts of organizations, and there was a lot of thought leadership. There's a lot of thought leadership today, and there was a lot of thought leadership about AI when the World Economic Forum coined the Fourth Industrial Revolution back in 2016. And at a certain point, I realized that there's probably not a pathway to policy leaders and the thought leadership class actually solving this. Because, and I will give myself credit for this, and by the way, I don't think this framing has caught on yet, I really think that AI is, right now, the protagonist in the story, maybe an anti-hero, depending on where you sit. Most conversation about AI is about the opportunity it's going to bring, and yes, there are going to be changes and downsides. I think it will be the next recession; once job displacement gets to a certain point where it becomes unequivocal, I think public perception shifts. My worry is that if we don't build some foundational knowledge about what this is, it's very unpredictable how the negative sentiment and the backlash will be channeled. It's possible that there's a rejuvenation of, let's say, union membership, and people realize we need to retake power, to have more agency over our data. And especially in a world where there is so much AI that can manipulate, the one use case where I think AI performs best against humans is persuasion, more so than coding or writing or anything like that. So if you think about, okay, how do you educate the public to a place where they're at least interacting with the changes in a way where they may not have total agency, but they have more agency because they can at least make decisions? For me it was, well, where do you start? Well, you need...

Alex Kotran (aiEDU):

We actually started out as the American AI Forum, because the idea was that we need to educate everybody.

Alex Kotran (aiEDU):

And then I quickly realized that schools weren't teaching kids, and I was like, okay, well, that's an obvious place to start: the future workers. They're at least making a very clear decision, but they're going to be impacted. Even if you say, I really want to be a truck driver, or I want to drive for Uber, you should know that there are companies literally planning to displace those jobs. And it's not the entire solution. So when you talk about regulation, we really don't advocate for regulation. I mean, we have an opinion about what guardrails should be in place, but I think that's the hardest part. Regulation is going to require a lot of thought and strategy. But reaching real people, especially people who have never heard of NeXT, and there are a lot of those, people who just don't have that technological curiosity, which is completely justified... Sure, it's relatively narrow, but in this moment it feels quite urgent and large in scope.

Dan'l Lewin:

I would agree. I think people like Jaron, you know, it's important that they speak, and danah boyd does some other work in that area. Have you met danah?

Alex Kotran (aiEDU):

No, I actually haven't, but I like her work as well.

Dan'l Lewin:

She's also at Microsoft. She set up a thing in New York called the Data & Society Research Institute, and when the Obama administration did the AI reports, if you will...

Dan'l Lewin:

They had three centers, one at MIT, one at Berkeley or Stanford, I think, and one at NYU, and she helped facilitate the one at NYU. She set up a new one, or, well, she's in Colorado now, and I don't know exactly where the entity is right now. But there are a few other researchers, Crawford as well, who have good voices.

Alex Kotran (aiEDU):

Yeah.

Dan'l Lewin:

Anyway, I agree with you on this.

Alex Kotran (aiEDU):

I mean, yeah, part of my hope for this podcast, or this YouTube channel, whatever we're going to call it, is to connect some of these thinkers to educators and funders who right now, I think, are understandably very focused on the ball in front of them.

Dan'l Lewin:

Yeah, they have to be, in some senses, because it's like a fire; you have to attend to it.

Alex Kotran (aiEDU):

Well, it's like a house that was already on fire, right?

Dan'l Lewin:

Right. So now the question is, how do you rehouse it? Anyway, this has been interesting for me. I appreciate the invitation.

Alex Kotran (aiEDU):

Thank you so much for coming.

Dan'l Lewin:

Glad to do it.