
aiEDU Studios
aiEDU Studios is a podcast from the team at The AI Education Project.
Each week, a new guest joins us for a deep-dive discussion about the ever-changing world of AI, technology, K-12 education, and other topics that will impact the next generation of the American workforce and social fabric.
Learn more about aiEDU at www.aiEDU.org
Gavin Purcell: AI vs. the entertainment industry
AI is shifting the ground beneath our feet.
In this week's episode with Emmy-winning TV producer Gavin Purcell, we explore how AI differs fundamentally from previous tech trends like crypto, blockchain, and the metaverse.
Drawing from his unique perspective in traditional Hollywood and cutting-edge tech, Gavin explains how reasoning models represent a quantum leap beyond simple prompt engineering toward truly intuitive human-machine interaction.
The conversation turns particularly fascinating when exploring creative industries, where Gavin introduces the concept of "micro-studios": small teams using AI to produce content that previously required hundreds of professionals. Through a live demonstration with the AI voice companion Maya and popular examples of AI-generated video, Gavin shows how close we already are to genuinely engaging artificial entities.
And since we're aiEDU, we also had to dive into what this means for education, careers, and human connection. How do we prepare students for a world where traditional skills are augmented or potentially replaced by AI? What happens when digital companions become more consistent and accommodating than the humans in our lives? These questions don't have simple answers, but they demand our attention.
And be sure to check out Gavin's AI For Humans YouTube series:
aiEDU: The AI Education Project
Alex Kotran (aiEDU): Welcome to, I don't even know, this might be our 10th episode of aiEDU Studios. I'm here with a true professional in show business: Gavin Purcell, Emmy Award winner, who worked for Jimmy Fallon and NBCUniversal and has, I think, one of the most exciting up-and-coming...
Gavin Purcell: I think still up-and-coming? Sure. We've been around for a bit.
Alex Kotran (aiEDU): Now, we're not like old-old, but we've definitely been around for a bit. Yeah, I mean, you were really tracking the AI stuff before it blew up.
Gavin Purcell: Yes, yes, we were tracking it. I was always kind of interested in the turns where media was going at each step, and in the GPT-2, GPT-3 era, which was probably around 2020, 2021, I started to think... I remember seeing the GPT-2, maybe the GPT-3 paper, and it talked about how it could write these paragraphs, right? And it was the first time it could write paragraphs that were significantly cohesive and coherent, like you could actually read a paragraph that a machine had written. I was like, whoa, that's weird. I've made my life writing stuff or making stuff, and now the machine can do this thing. And, especially at that time, it was not original stuff, but it wrote something that was readable, and that felt like a really weird turning point.
Gavin Purcell: Since then... my problem as a person is I tend to jump from thing to thing to thing that interests me, and I've been able to hold on to this AI thing now for a couple of years because it continues to get deeper, more mature and, I think, honestly more transformative than almost anything else I've lived through. And I've lived through a fair amount of stuff now.
Gavin Purcell: I got a PC when I was, like, eight or ten. I think the first PC I had was an Apple IIc, so I'm that old, I'm from that era. I lived through the early internet, lived through Web 2.0, and lived through kind of all the stuff up until now, and it really does feel like this is as significant a moment as getting that first computer felt to me. Now, that might just be because everybody's talking about it in a way that some of this other stuff didn't bubble up as much, but that's what keeps me excited about it and keeps me leaning in further rather than leaning away from it. Does that make sense? Yeah, of course.
Alex Kotran (aiEDU): Yeah, and we should say, you and I really first met playing this poker game among friends during the pandemic, and we were talking about, and I think you're also alluding to, some of these other fads coming out of Silicon Valley. Sure, whether it's crypto and blockchain, and NFTs as sort of an offshoot of that, the metaverse.
Gavin Purcell: And, I don't know if I should say this, but 3D printing. 3D printing is always the one, it's the ultimate. I was very excited about 3D printing, and I still think it has a huge application, but at one point people were saying that by like 2010 we were going to be able to print something in our home, and that just never came to fruition, right?
Alex Kotran (aiEDU): So a lot of these things are like that. Well, it's interesting, because my brother just got a 3D printer, and in the same way, blockchain has not disappeared. Crypto, I think the US actually just announced a strategic reserve. So it's not that these technologies were completely fake. They weren't pyramid schemes per se, but the language that was used was like, this is going to transform.
Alex Kotran (aiEDU): This is a technology paradigm shift, it's going to change the way we live day to day. And yeah, I don't think that's the case for the metaverse. I don't know anybody that... I mean.
Alex Kotran (aiEDU): I remember when the Apple Vision Pro came out, and I was flying to, I think it was, Mexico City, and somebody sat down and put on their Vision Pro, and I was like, wow, is this going to be?
Alex Kotran (aiEDU): This is kind of crazy. And that was the first and last time I saw somebody wearing a Vision Pro in public. I tried it; it gave me a little bit of nausea. So, I was actually going into 2025 with that as my big question, and I actually wrote a piece on Substack, which ended up being, I think, a little bit out of date already, where I was sort of warning that there's a significant possibility that we're going to experience this plateau moment, where progress starts to get a little more incremental and the barriers to implementation end up proving to be actually quite significant: cost, running out of data, the hallucinations. And that piece was published in December, and then, shortly thereafter...
Gavin Purcell:Wow, yes.
Alex Kotran (aiEDU): And we just did a conference, and we gave everybody at the conference a copy of that blog post, and I did my opening keynote and said, I gave you that post that I wrote, and this is a moment of humility for me, because I was completely wrong. Because shortly after that blog post, we started to see a continuation, and really the reasoning models becoming mainstream. We saw deepfakes, and I want to talk about synthetic media with you.
Alex Kotran (aiEDU): It's something that you've been really tracking, and you've been really helping me understand the significant advancements there. And we saw the DeepSeek moment too.
Gavin Purcell: Yeah, that's a really interesting thing, because you mentioned reasoning models, which, for people listening or watching, that's the idea that the models can think longer, right? Instead of having to be trained on something, if they think longer they get better responses. DeepSeek's R1, their reasoning model, was the first version of that that seemed like you could do for a lot less than some of the big companies were spending. Now, there's a lot of math that goes on in the background, and it may not be as much less as people thought, but still, you had some other player coming out of China, a completely different player that nobody knew about, that suddenly surprised everybody.
Gavin Purcell: I think that pushed everything not only forward but accelerated competition. Because, if you're listening to this and you've seen what ChatGPT has come out with, whether it's the GPT-4.5 model or all the stuff that's come out at the beginning of this year (we're now at March 7th), I didn't think half that stuff would have come out if DeepSeek hadn't hit. You know what I mean? I think they're just pushing much faster.
Alex Kotran (aiEDU): Yeah, I had my own deep research moment earlier today, so we're going to get into that. You already have a professional microphone. Which mic do you use?
Gavin Purcell: This one is a new one; our other one broke. It's another Shure, the fancier version of the Shure MV7, maybe the MV7 Plus.
Alex Kotran (aiEDU): Nice, it's a good one. Yeah, it sounds great. So, I try to do as many in-person interviews as possible, but we've been interviewing teachers, and so we've been sending out microphones, and I thought it'd be kind of cool to write a letter to include along with the microphone, citing some research. My assumption was that there's probably research suggesting a correlation between sound quality and the perceived authority of the person talking. And so that was my prompt.
Alex Kotran (aiEDU): It was just to GPT-4.5: is there any research equating the sound quality of somebody's microphone to their authority or perceived authority as a speaker? It asked one question: are you looking for research specifically in professional settings or more broadly, across every context? And I said, I'm agnostic about the context. And then it went and started thinking, and it took, I think, about 20 minutes. It was thinking for a while.
Gavin Purcell:Wow, that's a deep question, yeah.
Alex Kotran (aiEDU): And let me, I think I can share my screen. Let me do that.
Gavin Purcell: I find deep research, by the way, to be one of, if not the most amazing use cases of AI that I've come across. And I should say I'm on the Plus plan of ChatGPT. So right now, if you're on the Plus plan, which is the $20-a-month plan, you get like 10 searches a week, I think, or 10 a month. It's incredible what I've gotten out of it.
Alex Kotran (aiEDU): So, okay, let's take a look at this. Wow, there you go, it's got a whole thing. It put it together, and it had the methodology of some of the experiments that were run. It talks about the influence of audio quality, cognitive explanations, so it really dug into it. Broader implications, live versus recorded, which is not even something that I asked for. And I think this is really...
Alex Kotran (aiEDU):the big insight for me is that it correctly intuited that you know, including this layer live recorded in virtual would be really relevant, given you know the world that we're in today and talking about Zoom. And then it sourced it all and I checked out these sources and they were legit.
Gavin Purcell: I mean, this is the thing when you talk about hallucinations, because, especially in the academic space, or in the general "I don't believe AI is going to be a thing" space, hallucinations really are... I've always felt, personally, as somebody who's an AI enthusiast in some form but also aware of the dangers, that hallucinations are the thing you really had to fix, or have to fix, before people can start taking it seriously. And to the point of DeepSeek: when DeepSeek first came out, I think what made such a big kerfuffle was that a lot of people hadn't actually used a non-free version of an AI model yet, and they were like, whoa, these answers are way better than I used to get on the free version.
Gavin Purcell: When people actually get their hands on what deep research can do... and just for people in the audience, deep research is OpenAI's; it's ostensibly an agent, but an agent that works in the background, meaning it's an AI that goes out and does stuff for you and then returns a result. It's bringing back essentially a research paper. And the thing about these research papers is that they're not all perfect; some people have seen issues with them. But to your point, there are links, and you can check where all of the research is coming from, and I've had zero hallucination issues with it myself, personally, and it's been super useful. And I think, academically, if there are teachers listening, this is the product that may completely flip your lid. Do you know what I mean? Because you might actually be like, whoa, that is a legitimate, interesting deep dive on, in this case, whether or not you should use a virtual mic, or what the authority looks like. That's a pretty specific question, and you got really specific answers.
Alex Kotran (aiEDU): To me, it also addresses something that I had been intuiting even back in 2023, when ChatGPT came out. Because aiEDU, I mean, you know this, we were sort of preaching the gospel of AI literacy before language models.
Alex Kotran (aiEDU): Language models came out, and everybody was rushing to get their hands on ChatGPT, and the immediate reaction, the reflex, was: all right, the key is everybody needs to learn how to prompt engineer these tools. And so a lot of folks were equating AI literacy with, how do we teach students or teachers or whoever
Alex Kotran (aiEDU): how to prompt engineer, how to prompt. And I saw things like prompt Bibles, and people memorizing them. First of all, the key components of prompt engineering were changing fast enough, even month to month, that I already had some spidey sense that I just didn't know what the longevity of any one strategy was going to be.
Alex Kotran (aiEDU): And we were lucky enough to be funded by, not all, but many of the big frontier model companies, and when I would talk to their researchers, their advice was: this is temporal, prompt engineering is sort of a moment. The direction this is going is that people are not going to need to prompt. And to me, the reasoning models are really the demonstration of that, where all you need to do is be very clear about what your goal is. And my understanding of chain of thought is that it's essentially breaking out the prompt engineering. I don't know that chain-of-thought reasoning is the same thing as prompt engineering, but it's effectively the model prompting itself, right?
Alex Kotran (aiEDU): It's sort of walking through the prompts so you don't have to. My understanding with reasoning models is you no longer have to say something like, for example, I didn't say in this prompt that I shared, "play the role of a researcher" and "write it in the language that a researcher would use." It sort of just figured that out, because I asked: is there any research?
Gavin Purcell: And I guess that was enough. And here's the thing, and this goes back to what I was saying about how fundamentally transformative this technology is: that not needing to prompt really speaks to what's going on. And what's going on, in a lot of ways, is that it's starting to think a lot more like we do, right? Or it's able to interpret the way that we are thinking, and we don't have to program it in a specific language; we're just using our language to get it to do the thing we want. And that's fundamentally different from any technology we've ever had. The closest thing to it is like when we got up as cavemen and looked down and we had feet, and our feet were walking; in some ways it just becomes part of your body, right? Your body kind of knows what to do. That's kind of what this is starting to learn.
Gavin Purcell: And, to your point on the reasoning models: the big thing behind them is that the model starts to ask itself along the way, am I doing the right thing? Is this the right thing? Is this what the user wants? And that is a fundamentally big shift from how we would look at prompting. Prompt engineering was, in some ways, "how can I program this thing," quote unquote, meaning I'm going to use a specific type of language to get it to do the thing I want it to do. And now it intuits that. And I'm not trying to anthropomorphize these things; they're still machines, and large language models are not like people. But intuiting things is a really big step.
Alex Kotran (aiEDU): Yeah. I mean, here we go, so you can actually see, and I think what's cool is you can see the chain of thought. And it says GPT-4.5; I just switched it, it was o3. So it starts with examining research; it starts researching microphone quality, okay, and then it's analyzing the role. It reads one source, okay. And then...
Gavin Purcell: I love how it says "okay." It's like, okay, I got that, now I've got to figure this out.
Alex Kotran (aiEDU): Three experiments, okay. So it's on track. It's mapping out how poor audio quality... and then refining the research for, like, radio or phone. And then it's thinking about which websites to choose as reliable sources of information. This is something that previously we had to do ourselves. I remember the advice that was given: okay, well, you have to check the sources. And a lot of times the sources might have been made up, or they would have been blogs or Reddit or Wikipedia, and it's like, no, we want more scientific sources.
Alex Kotran (aiEDU): It's doing that, and I don't know: is there sort of pre-prompting built in, baked into deep research? Have they informed it that, generally, when you're asked to do research-related tasks, you orient toward these types of sources? Or is it just figuring this out?
Gavin Purcell: So, in general, any of these things have what's called a system prompt. In fact, you see this every once in a while online: somebody's jailbroken the system prompt and you can get the system prompt out. So clearly there is a system prompt for this particular agent, and this is, according to OpenAI, an actual agent, one of the first agents they've rolled out (the other one was Operator, which lets you watch it work on a computer). This agent has been given a prompt that's probably very clearly like: your job is to do a deep research project, however long, on this subject (I think most of the time these end up being 10-to-20-page projects), use only links that you can guarantee are blank and blank. I'm sure there is a system prompt that says something like that. But to the point of the intuitiveness: it's going through that step by step and starting to think, is this the right thing? Then it looks at it and, and again, I'm not anthropomorphizing, it's not literally saying these things, it goes to the next step... And there was a real question about whether we'd actually be running out of good data to train these things on.
Gavin Purcell: There's a whole conversation around synthetic data, which is pretty complicated, which you can look up. But the thing that changed was that somebody figured out, and there's an argument that this was Ilya Sutskever, who left OpenAI, and that maybe this is why the big OpenAI kerfuffle happened, somebody figured out that this reasoning-model approach would let you scale on a separate axis and actually keep those improvements going in a significant way. And that was not a for-sure thing, right? That's the human side of all this; the human-ingenuity side said, oh, what if we did this, and it worked. So that's a really interesting thing too: no technology curve just goes like this; it's often kind of bumpy and jumpy. We were close to a possible slowdown, and then people figured out another version of that, which is interesting.
Alex Kotran (aiEDU): Yeah, and for those folks who are just listening, Gavin is making a hockey-stick motion with his hand. And if you look at the progress of language models, from, I guess, the transformer in 2017, and even before that, natural language processing and machine learning, it really was much more incremental. And then there was this moment around GPT-3. I mean, 3.5 was really more just the reinforcement learning from human feedback, right? Anyway. And then GPT-4... we are on this hockey-stick trajectory now.
Alex Kotran (aiEDU): And that was one of the open questions, and I just want to pull out one of the things I think you're conveying, which is this issue of: we have a few paradigms for how we would scale this. What would it look like for the trajectory to continue accelerating upward? We've already thrown basically all the data that we can find at it. There's probably more.
Alex Kotran (aiEDU): I think a lot of the folks I've talked to are not that pessimistic about the availability of data. But at some point we will run out, whether that's this year or next year. It's not infinite. It is not an endless well of original, high-quality data.
Gavin Purcell: We're going to be paying people just to say things out loud and to type things into computers, just so they can keep producing data. That'll be a thing we end up doing.
Alex Kotran (aiEDU): I mean, and then there are the GPUs. Compute is the other paradigm of scale.
Alex Kotran (aiEDU): And again, there are lots of new developments coming out from NVIDIA and other companies, but I think everybody kind of suspects that at some point, maybe soon, we'll tap out. But then there's this idea of spending time thinking, which is this new paradigm of scale that we hadn't anticipated: in addition to throwing more data at a model, or throwing more compute at a model, with reasoning models you can throw more time, more compute time, at problems. And so if there were a really big, important, lucrative problem to solve... and you're the one who actually explained this to me, I remember pacing around my room trying to wrap my head around it, and you were like, what if you gave a model two weeks of
Alex Kotran (aiEDU): time working on a problem? And not every problem needs two weeks, but if you're trying to cure cancer, maybe you're willing to pay twenty thousand dollars. And that's, I think, the news from OpenAI. I don't know if this was formally announced or it was a leak, but I think it was yesterday: they're going to have twenty-thousand-dollar-a-month
Gavin Purcell: Yeah, it's a story. Agents that you can pay for.
Alex Kotran (aiEDU): Have you heard? From The Information?
Gavin Purcell: Yeah, sorry, from The Information. It's really interesting, actually, and I think a lot of people, when they saw that number, were like, get out of here, who's going to pay $20,000 a month for an AI agent? But there are two things going on here. Obviously, people who listen to this may or may not know, part of the game is that these companies have these robust, giant valuations, and part of the game is trying to make sure you're actually making money at this stuff, right? So that's one part of it. The other part is: imagine a world where you or I or anybody in this audience is not going to pay for a $20,000 AI agent, but that AI agent is a PhD-level, let's just say, AI researcher. Because, you know this, Alex, and anybody living in the San Francisco area knows, AI researcher is a very well-paid job right now, especially if you're a good one, right?
Gavin Purcell: So $20,000 a month may sound like a lot to a person, anybody like me or you, but against an AI engineer's year-long salary at, whatever, 300K, let's say, it's still a bargain. Not only do you get that AI engineer, if it's as good as an AI engineer you'd have on staff, but also, conceivably, based on compute time, it could work 24 hours a day, seven days a week. I always tell people that the best way to understand the world we could be going into, and I think everybody has to keep that word "could" in mind, is that Dario Amodei, the co-founder of Anthropic, wrote a very good essay called Machines of Loving Grace, which I really recommend everybody read, because it lays out the idea of what they call strong AI, or AGI, whatever you want to refer to it as, and where this pathway takes us. And these very expensive agents from OpenAI are, I think, the first moment for the general public to be like, well, that doesn't sound like something I'd use. And the key here is: it's not, because really what it's about is who it's going to replace, right? What kind of human is this going to replace, and what is the value of that? So when you talk about these companies raising all this money: if you can deploy 10 of those, let's say 10 AI research agents at $20,000 a month, that's very expensive for the average person, but to a company that has a bunch of capital it may be worth it, because they can get further and further. And that article also talked about cheaper but still expensive computer programming agents, people who can write code, or, I say people, agents that can write code.
Gavin Purcell: All of this stuff is a way to put a number next to the value of what these machines could conceivably do. Now, ultimately, it's going to be a marketplace call: if the work isn't good enough, they won't sell it, and that's an interesting thing to watch. I just keep coming back to this, and you, living in San Francisco, Alex, probably hear this more than even I do. All of the people that I listen to, and I'm pretty tuned into the AI people, I will say, and most people listening to this may not be, all of the AI people are very clearly now signaling that 2026 to 2027 is the moment where this stuff will be real. Obviously it's real now, but the idea of a $20,000-a-month AI agent that does meaningful work will not seem crazy. We hear that number right now and it's like, I don't understand how that would be possible. But based on this test-time-compute model, we are entering a world where that opens the door to a whole lot of things.
Gavin Purcell: And one thing I think a lot about, I don't talk about it a lot because it's not really in my lane, but I do think a lot about jobs, right, and about where this leads us all. And in your world, I sure hope, and I think, there will always be a place for really good teachers, because a human-to-human teaching experience is super valuable.
Gavin Purcell: I think AI can help in a lot of ways. But I think a lot about my daughter's boyfriend; he just graduated from UC Santa Cruz last year. He graduated with a really great degree in video game programming, pretty high up, top of his class; he's a coder. It took him, I think, nine months to get a job, and it wasn't even an amazing job. Now, the game industry has been in rough times, but it's not just that. It's that there are fewer jobs for coding, in part because these AI agents are already getting good enough at some of that stuff that they can pull the workforce down. So this is where the transformative part of the $20,000 AI agent comes from. It sounds like a crazy headline now, but then you have to think, a year, two years, three years down the road, how many people's jobs is that taking? And that's the part that keeps me up at night.
Alex Kotran (aiEDU): Yeah. And I think, to me... maybe we will eventually, perhaps in 2026 or 2027, get to a place where you actually have companies that are just hiring these agents to do all the work.
Gavin Purcell: Yeah, I mean, they're sort of like on the teams.
Alex Kotran (aiEDU): But also, if you have, let's say, 20 AI researchers, and you're paying them on average, let's say, a million dollars in total comp, then if adding three agents to the mix improves their efficiency by 20 or 30 percent, it's very easy for you to justify that. And so I think the paradigm is less... someone who's listening is wondering, well, is AI going to replace me? I don't know if that's actually the question to ask yourself. It's more like: think about how many people your company employs that have a similar job to you. Could AI make everybody that does your job 10% more efficient? And if so, the question is, what are we going to do with that extra productivity?
Alex Kotran (aiEDU): Because there's this balance: AI, and any tool that has automated tasks, can make us more efficient or it can make us more productive, and they're not quite the same thing. Because there's one world where we harness that productivity for more.
Gavin Purcell:I was going to say, humans are not amazing at maximizing productivity. I'm a human, and I know that sometimes when things get a little easier, you can say "be more productive" and you're like, oh great, I get an hour to do whatever I want, rather than having to do more work. This is the tricky thing, yeah.
Alex Kotran (aiEDU):So can it make us more efficient, more productive? Clearly, I think yes. The challenge is that most people are not. And the thing is, I'm not even sure about the difficulty in finding software engineering jobs, because I've heard the same thing: lots of layoffs. I think there were layoffs just announced this week. Hard to say if that's actually because of AI, though the companies are on their earnings calls talking about AI as sort of the driver.
Alex Kotran (aiEDU):Sure, I think there are a lot of motivations to frame it as such. You know, I came across this research years ago, but I think it's really relevant here. Over the last 50 years, we've heard a lot about how the middle class kind of disappeared. You could work at a factory and buy a car and a house and pay for your kids to go to college, and those jobs kind of went away. And it wasn't AI; it was a combination of mechanical automation, globalization, computerization. Companies needed fewer people in factories, and so a lot of those jobs moved to lower-skilled service jobs that were harder to automate, or to high-skilled knowledge work.
Alex Kotran (aiEDU):And that did not happen gradually. It happened in these very acute bursts following recessions. Basically, 88% of the job polarization that happened over the last 50 years happened in the 12 months following an NBER (National Bureau of Economic Research) qualified recession. And so now we look at where we are today. It's been 16 years since the last recession, so we have all this technology overhang waiting, and I think companies are going to double down on all these automation tools now at their fingertips whenever the next recession comes. I'm not going to predict when that happens, but it doesn't feel like the next recession is five years away. Unfortunately, unless everything we know about economics proves incorrect, I think we're going to have a recession coming, and that's when this all hits the fan. But I want to talk about the entertainment industry, which is something I have a lot of thoughts about.
Alex Kotran (aiEDU):Yeah, because this obviously has implications for coders and folks doing, let's say, hard science and STEM. One of the things I will often hear is: well, creativity, though, that is sort of the last bastion of humanity, and AI is never going to be able to replace creatives. How do you feel about that? Does that hold up?
Gavin Purcell:So I think a lot about this. This is a huge part of my life. I've spent my adult life doing creative work and getting paid to do it, sometimes getting paid very well to do it, and working at kind of higher levels of it. And I think the truth of the matter is, in a lot of ways, the creative business has become partly a creative technology business. I say that because, at large scale, when you think about, say, a Marvel movie, there is the director, there are the writers, there are the actors, but then there are like 500 graphic artists who are working in Maya or all these different computer programs that have made these incredible experiences possible.
Alex Kotran (aiEDU):Right. Are you exaggerating? Is it actually on the order of hundreds?
Gavin Purcell:Oh, it definitely is. If you look at the credits of a Marvel movie, it's never-ending, right? You'll see the big credits, which are production designer, writer, director, and then the stars, and then you look at a list of just hundreds of people, and a lot of those people are actually doing the effects work. So I think there's a way to look at this. As I said, I've thought a lot about this. One of the things that's really important when you think about the entertainment business is the changes it already went through. In the same way you discussed those other changes that already happened in the games business or other places, the entertainment business has undergone fundamental shifts in the last probably 10 to 15 years, and I think that's mostly because the distribution methods are completely different than they were before.
Gavin Purcell:Everybody out there knows streaming video. You also know social video. These are the ways that people get things now, rather than going to the movie theater or watching TV on broadcast. It's just a completely different way of getting stuff. So, distribution-wise, you were already looking at these different systems. In the old-school model there were all these gatekeepers; if you were going to get a show on, it had to get through X, Y, and Z, and it cost this much money because you had to do all the technology stuff. Now it's become much cheaper to make stuff, and you can make a lot more of it. That's why we saw the streaming explosion of content. Of course, from a money standpoint, that hasn't worked out great. So the other thing that's happened is a pullback in production, so there are fewer people making things. Because when you have archives of content that exist, you can go back and watch. I always laugh: my mom, who is in her late 70s, now watches the Patty Duke Show every day, which is the funniest thing, because the Patty Duke Show is from, I think, 1958. She has found, on one of the streaming services, that they air the Patty Duke Show at 8:30. So media as a whole, and that includes music, all this stuff, is now available to us at all times. Okay, so that's the world we're in now from a media standpoint.
Gavin Purcell:What I think is going to happen here, and I hope happens, my most optimistic case of what happens with AI, media, and Hollywood, is this. Say, back around 2008, when YouTube and all these companies came along, distribution was easy but production was still kind of hard. If you made a video about something, you could get it on YouTube and get people to see it, but it wasn't that easy to make the thing. What I think we're going to get to now is a place where you can make the thing much more easily than you could before, and that's in a couple of ways: AI video, AI audio, images. All these things allow people to get what's in their head out into the world a little bit easier.
Gavin Purcell:Now, artists are still going to be artists and are going to make amazing stuff, but this has clearly allowed a lot more people to do the kinds of things that before were difficult. Because before, in order to make a really beautiful picture, you would have had to understand, way back when, how to paint; then you would have had to understand, okay, how do I frame this photo; and then you would have had to know how do I use Photoshop, what am I doing in After Effects, all this stuff. Now you can kind of prompt the machine to create things, right? So that is all going to democratize a lot of that stuff. And again, I'd caveat this by saying that's not all going to be good, but there will be good stuff that comes out of it. Because of that, I think what you're going to see is a lot more people creating a lot more things.
Gavin Purcell:The vast majority of it will be kind of crappy, which is just like YouTube. If you go to early YouTube specifically, maybe one out of every 75 to 100 videos was watchable in a specific way or had something about it, and now on YouTube a lot more of it is watchable, and I think that's what we're going to be getting. The other side of this that I think is important for people in the creative business at large is that the creative business was always based around hits; hits ultimately make money. But studios would try to make all sorts of things, and there was this whole middle class of content that was made, which in the movie business cost anywhere from, say, $20 to $70 million. Those movies don't get made anymore, because even if they hit, they're not big enough hits for the studio to make a crapload of money on them.
Alex Kotran (aiEDU):It's so relevant, and I want to talk more about how you're seeing AI influencing this, but I just want to share something. aiEDU is launching this YouTube channel, aiEDU Studios. My big idea for this came from a few different channels that I've been watching. One of them... have you heard of Outdoor Boys?
Gavin Purcell:No, I want to know what that is. What is it?
Alex Kotran (aiEDU):Outdoor Boys is... it's just a guy in Alaska, I think he's a lawyer or something, and he does survivalist stuff. He'll just go camping and build shelters, go hunting and fishing, whatever. He has 14 million subscribers. Unbelievable, Outdoor Boys.
Gavin Purcell:Right, that's a distribution channel that's bigger than some digital cable channels were. I worked at G4, a cable channel.
Alex Kotran (aiEDU):It's twice as much, actually way more than twice as much, as Discovery Channel, which has a show called Naked and Afraid. And I'll tell you, Outdoor Boys, he's not using any AI yet, right? He's using a GoPro. And I'm just fascinated by this, the idea that, first of all, this is to me almost better content, because part of the fun of watching a survivalist is the authentic nature of the fact that it's actually a dude that's out there, in this case hunting a crocodile, which is kind of wild.
Gavin Purcell:Oh my god, that's insane.
Alex Kotran (aiEDU):So the content, right? It's not to say that... I mean, I think there's an assumption that YouTube, and frankly a lot of social media, is a lot of brain rot and really low-quality content. But I think the sleeper has been that some of the longer-form content has gotten really, really high quality.
Alex Kotran (aiEDU):And there's still a discovery challenge. A lot of people just don't know that this exists. I mean, I'm surprised by how many people don't pay for YouTube Premium, which I think is like $15 a month, and yet they're paying like $60 a month for cable. And you can get way more stuff that's perfectly aligned with whatever... literally pick a random hobby that you have and you'll have three or four incredible creators that allow you to go super deep.
Gavin Purcell:I totally agree. The way I describe this to normal media people is I sometimes call out a specific YouTube video. Maybe you could bring it up here if you want. There's a Star Wars hotel review. A woman made a four-hour Star Wars hotel review, and she's really interesting. She has quite a few subscribers; she's been doing it on YouTube for a while, but she publishes a video like once a month, and her niche is in-depth Disney and Star Wars things. This four-hour Star Wars review is her in a room, and she goes through her entire experience at the Star Wars hotel. And she's very good at talking to camera, because she's gotten very good at it.
Gavin Purcell:This became like a sensation. The video has like 12 million views right now, and it was written up in the New York Times and all these other places. This is a person in their bedroom with a very straightforward two-camera thing that is as compelling to some people, and it was to me, as a multi-hour drama might be to anybody else. It's the great flattening of content, I call this sometimes. In fact, I just pointed this out: Bill Murray, one of the biggest movie stars kind of in the world, just this week went on Hot Ones and also on Joe Rogan's podcast.
Gavin Purcell:Meanwhile, Kai Cenat, who you may or may not know of, and if there are teachers in your audience you probably do know of Kai Cenat, the famous Twitch streamer, was on the Tonight Show. It just shows you how those two worlds have flattened, right? Bill Murray, a traditional celebrity who was somebody you would never have seen do a podcast or a YouTube show, and Kai Cenat, a Twitch streamer who you never would have seen, necessarily, on the Tonight Show. Although I will say, as somebody who worked at the Tonight Show, we did a lot of interactive stuff early on. But it's a good example of how that stuff has flattened out. And when you have that world, and then you add the ability for, say, 10 times, 100 times more people to make creative stuff, you're really opening the door entirely. Yeah, this is the video, this is the Star Wars woman, which is just a fascinating thing. How many views is it at now? 12 million. So her name is Jenny... Jenny Nicholson.
Alex Kotran (aiEDU):I really suggest going to watch this. When you were at Jimmy Fallon, how many viewers would one show, one episode, get?
Gavin Purcell:Yeah, I mean, it depends. Now, way less on TV. But at the peak, when I was there, the early Tonight Show, which was when it was really peaking for that particular show, I think we were doing like 8 million viewers a night. And that was way bigger than now. The late night shows maybe peaked, you know, five years ago, and TV viewership is just way down versus what it was. But yeah, that's a good example: one night of this versus one night of that. It's crazy.
Alex Kotran (aiEDU):And this isn't like. I mean, she's in her bedroom.
Gavin Purcell:Um, yeah.
Alex Kotran (aiEDU):You know, I'd be curious how long it took her to actually cut this.
Gavin Purcell:Oh, I'm sure a while, right?
Alex Kotran (aiEDU):It probably took her quite a while, because it is well edited, and she wears different outfits throughout it, which is fun. So, do you see... I mean, a lot of the students that we work with are not just the computer science nerds, and I think YouTube influencer is actually one of the most sought-after jobs now if you survey Gen Z. What advice do you have for someone who says, you know what, I really want to do this? What are the skills that are going to go into this? Because I think you and I probably agree that AI is not going to do all of it; there's still actually a role for the human.
Gavin Purcell:No, in fact, I think the most talented people are the creative people who are able to harness these things and make them better. I want you to go to one of the YouTube pages. While you're bringing up YouTube pages, go to a guy's page whose name is NeuralViz, N-E-U-R-A-L-V-I-Z. This guy is who I point out all the time to people. I was just in a meeting yesterday with a company that's raising money to do AI stuff, and I always tell them about NeuralViz. So what is NeuralViz? I've met this guy in person. Yeah, that's it, NeuralViz. He's a single guy, a former editor. I think he cuts promos, but he's also very creative; I think he wants to be a movie director type person. What he did is he went in and created, using all AI tools, by himself, an entire creative universe, this weird alien future where they talk about humans. Now, you look at it, it looks really weird, but you have to also listen to it. What's interesting is that his YouTube subscribers, I think, are now in the 100,000s. But again, this is all by himself. He started maybe a year ago. He's created a whole universe of characters, I think he calls it the Monoverse, and each one of these videos shows you what's capable right now. He's created aliens, he's created characters that make you laugh. All of these can talk, they can tell a story, they can do all this stuff. He's writing it all, he's editing it all, and he's also making the creative choices of what he's putting into these things. So that's where the creative side of this comes in: you're still going to have a creative, or a few creative people, working on something together, but what they're able to accomplish now because of these tools is so much better.
Gavin Purcell:I've been talking a lot lately about this idea of a micro-studio. What I mean by that is a micro-studio is where you'd have one to five people, very much like NeuralViz here, creating content that you want to make, that you think is interesting. And when I say create, it's: you come up with the idea, you probably do most of the writing, because these LLMs are not amazing at writing, and you then choose, from all of these things you're making, the choice to make something. That was never really possible at this stage before. Like that character, Tiggy Skibbles. Tiggy Skibbles is his breakout character, this little nasty character who's always mean to people. This is an example of what is happening right now. So you could see something where there are a hundred, a thousand, ten thousand times more people doing this. I feel like it's going to be fascinating.
Alex Kotran (aiEDU):And you just launched. What is it two years ago or one year ago? Ai for Humans.
Gavin Purcell:Like a year and a half ago, something like that.
Alex Kotran (aiEDU):A year and a half ago. Dang, okay, 20,000 subscribers. But this is actually, first of all, just a recognition of how big a deal it is for someone to be able to get to 137,000 subscribers on his own. Oh yeah, oh my God, it's a huge deal at this point. There are so many people competing. You've been slogging it out. Yeah, exactly.
Gavin Purcell:And, by the way, YouTube has its own gamification system. But I will say, with NeuralViz, people immediately gravitate to it. I think most people understand he's using AI, but nobody gets mad about his stuff, because it immediately leaps... off the page is the wrong word, but it leaps out of the screen with creativity. And that's how this AI stuff is going to work. Lots of people talk about the idea of AI slop content. If you're on Instagram, you see... I don't know, there's a video I saw the other day that still makes me laugh: a bunch of kitties working at McDonald's in little McDonald's outfits. That stuff is semi-low-effort, and even that is kind of curious and creative and interesting. But NeuralViz and the people like him are the ones you have to look at to see how the world of content creation is going to change.
Alex Kotran (aiEDU):Tell me about the experience of starting AI for Humans, having been on big Hollywood sets.
Gavin Purcell:Oh yeah.
Alex Kotran (aiEDU):Was it easier than you expected? I mean, obviously not very easy, because I know this is something you spend a lot of time on, and you're also doing a lot of research, right? Because the format of your show is sort of bringing people in. In fact, I was just on your channel; we were talking about those $20,000 agents, and it turns out, I think, that was your most recent episode.
Gavin Purcell:Yeah, we had literally just covered it on the show that came out yesterday morning. So that's why I didn't have all that information at the top of my head; I definitely knew about it all ahead of time. So it's interesting. One of the reasons I wanted to do AI for Humans: during the pandemic I had started my own podcast, which I still think was an interesting idea, around kind of a curiosity thing, and I had a producer work with me, and that was really interesting and it was cool. But it was not cheap, right? And it's for a podcast.
Gavin Purcell:Whenever you first start anything, especially podcasting, you're not really making any money for quite a while, unless you have some sort of crazy thing happen right away. It's a grind, right? It takes a while to build an audience, do all this stuff. So I really wanted to see what it would look like, at my age, a middle-aged guy, to try to create something entirely from scratch with a partner. The two of us really did go into it as people who... you know, I was the showrunner of the Tonight Show and worked on all these other big shows and had like 200 people work for me; Kevin has hosted these shows in the past. We both wanted to see what it feels like to have to do this on your own. And it is a lot of work, but it's way easier to do this sort of thing than it ever was before.
Gavin Purcell:Again, to all this point: I'm looking at, right now, a $200 camera that gives me a really clean shot. That's a straightforward, really good thing that would have cost, say, $1,500 before. You can buy a microphone that's like $150 that would have been like $600 before. So you can get into it cheaper, and then you can understand the tools so much more easily.
Gavin Purcell:There's a great editing tool we use called Descript, which is like 20 bucks a month, or however much it is a month, and it makes editing easy. So it really is not just that distribution has gotten easier; the tools themselves have become the way that people talk and communicate, and short-form video really is the way communication comes out now. So to anybody in the audience, I would say, somebody told me this once a couple of years ago and it really clicked with me: you really do have to get over the hump of being afraid to be in front of a camera, because now that's just what everybody does. Once you're doing that, it starts to feel more natural and you start to understand it better, and then you are a creator, then you're a person making stuff.
Alex Kotran (aiEDU):And for those who are just listening in, I went to Descript's website, and maybe, Gavin, you can help paint a picture of what this looks like. In the past, if you had a video and you wanted to edit it, you'd have to, I think, manually transcribe it. Let's say you captured three or four hours; I mean, this interview will run for, say, an hour or two. You'd have someone, usually someone junior, listening to every single minute of that interview and writing it down, and then you'd have an editor, presumably, go through the transcript and say, okay, we need to cut these pieces; this is actually the portion we want to add X, Y, Z visuals to. Descript automatically generates the transcript, and then you're almost in a text editor, going through and editing the text, and it will automatically cut the video in line with the final text edit.
Gavin Purcell:Yeah, it's literally as you described it. I worked for many years where you'd have to wait for the transcripts to come back before you could start editing, because, as a producer, you want to go through what people say and put it together that way. This is AI, right? This is AI being used to create a cuttable transcript, where, as you said, when you cut a line, it cuts out of the video. There are a lot fewer of those jobs now, but it also makes it so one person, or a small group of people, can do something that used to take two to five people. Which means you can do more, or you can make stuff that's much more specific to you; so, like your outdoors guy, or, for me, making AI for Humans content. Suddenly there's a lot more stuff you're able to do, because it's not as hard to make the thing as it was. And again, when you project out further, in the way that we projected out with reasoning models, we are not that far off right now.
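For readers curious about the mechanics, the transcript-driven editing described above comes down to word-level timestamps: each transcript word knows where it sits in the video, so deleting text yields a cut list. Here is a minimal sketch of that idea with hypothetical data; this is an illustration of the technique, not Descript's actual API:

```python
# Sketch of transcript-driven video editing: each word carries timestamps,
# so deleting words from the text produces the spans of video to keep.
# Hypothetical example data, not any real tool's format.

def segments_to_keep(words, deleted_indices):
    """words: list of (text, start_sec, end_sec) tuples.
    Returns merged (start, end) spans covering every word NOT deleted."""
    spans = []
    for i, (_, start, end) in enumerate(words):
        if i in deleted_indices:
            continue                            # this word was cut from the text
        if spans and abs(spans[-1][1] - start) < 1e-9:
            spans[-1] = (spans[-1][0], end)     # contiguous: extend the last span
        else:
            spans.append((start, end))          # gap in the timeline: new span
    return spans

words = [("So", 0.0, 0.4), ("um", 0.4, 0.9), ("editing", 0.9, 1.5),
         ("is", 1.5, 1.7), ("easy", 1.7, 2.2)]

# Deleting the filler word "um" (index 1) splits the timeline into two keeps:
print(segments_to_keep(words, {1}))  # [(0.0, 0.4), (0.9, 2.2)]
```

A real editor would then render only those spans, which is why "cutting a line of text" cuts the video.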
Gavin Purcell:There are AI tools, and maybe we can transition and talk about deepfakes a little bit.
Gavin Purcell:But there are AI tools that can get you a very good five-to-ten-second shot from scratch, meaning I can type a prompt into Google's tool, which is called Veo 2, which you can use within Shorts (it's not amazing in Shorts, but the bigger tool is going to come out soon), or Runway, or a tool called Kling, or MiniMax. You can type in, "I want a shot of a bubbling cauldron and a witch stirring it," and you can get out a five-second shot that looks pretty good. Now, you may have to go through a couple of them, but you can get a pretty good shot now. So you have to imagine, if you can get that now, then cut to a year, two years, five years from now: you'll be able to string together full-blown series or feature films from this stuff. And that is something that I'm not sure has fully settled in on the business side of the entertainment industry yet. But for the individual creators, it's amazing, right? The individual creator could conceivably make their own movie.
Gavin Purcell:That could really move people. And it's not easy to do that; telling a story is hard and there's lots of stuff to learn. But the ability to do it is much more significant than it ever was before.
Alex Kotran (aiEDU):Yeah. And you're based in Los Angeles right now. How are folks in Hollywood taking this? I mean, this is just incredible, right? Even just the camera blur. Getting a lens that can do depth of field like this can cost... I was looking at some of these lenses because I was nerding out, and some of these cinema lenses are like $30,000 for a single lens. And so now this person... I mean, I don't know what it costs. Is it free? Do you know? Do you have Veo?
Gavin Purcell:Veo? No. Right now it's free in Shorts, in YouTube Shorts. It's a weird flow; you'll kind of have to Google how to use it in Shorts, but in Shorts you can actually use their tool. It's actually a very expensive video generator, and you can use the API, which means there are companies where you can use it, and I think it costs like a dollar per generation. People sometimes refer to these as slot machines, because you really don't know what you're going to get out of them. Hopefully that's the thing that will improve: better control and better understanding of what you're going to get out of the model. But it's still way cheaper than that $30,000 lens, right? Say you spent even $5 a pull on the slot machine 100 times; that's like $500, versus having that crazy camera. And then where do you put the camera? What do you do with it? Do you need it all the time? There's all that sort of stuff.
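Gavin's slot-machine comparison, spelled out with his own rough figures (these are conversational estimates, not published pricing):

```python
# Illustrative cost comparison from the conversation: "pulling the slot
# machine" on an AI video generator versus buying one cinema lens.
# All prices are rough figures quoted in the discussion.

cost_per_generation = 5.0   # dollars per attempt (his high-end guess)
attempts = 100              # pulls until you get the shot you want
lens_cost = 30_000          # a single high-end cinema lens

generation_bill = cost_per_generation * attempts
print(f"100 generations: ${generation_bill:,.0f}")           # $500
print(f"one lens:        ${lens_cost:,}")                    # $30,000
print(f"the lens costs {lens_cost / generation_bill:.0f}x as much")
```

Even at pessimistic per-pull pricing, a hundred attempts run two orders of magnitude below the single piece of glass.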
Gavin Purcell:And I think when people start to understand the capabilities of these... I was just talking to somebody the other day about how part of the next step with these AI tools in video, and specifically in Hollywood, is trying to get the people who are really creative, who make some of the best Hollywood stuff, to try them. And, you know, I'm sure your audience probably knows this, but as with any AI tool, there's something very negative at the very beginning stages: the training. Lots of these models have been trained on work from people who didn't expect their work to be trained on. So even though they are not replicating the exact work of creative people, many of these models, the video models particularly, took in a whole lot of stuff, much of which they probably didn't have the rights to train on. So in Hollywood there is still this kind of negative feeling around AI. But I think if people start looking at it as a tool that could open doors to more interesting creative worlds, it will come around. I think people are going to start using it more often.
Alex Kotran (aiEDU):So what is the sentiment like? Are the Hollywood studios excited, exuberant, or is it fear? Is there a sense that this is coming for our livelihoods?
Gavin Purcell:I think in the background they're all thinking that this is a cost-saving thing.
Gavin Purcell:There's a very famous quote, which got some good and bad press, from Jeffrey Katzenberg. If you know him, he was one of the first people who started DreamWorks, and he was famous for the DreamWorks Animation pipeline, which employs a lot of people. He said something like 80% of the people on an animated movie won't be needed anymore. That's not a great quote to give as a corporation, because if you're thinking about the press that could come back at you, that's a pretty big deal. But I think, in general, the studios are excited about this idea, because they think they could probably save money on the making of this stuff.
Gavin Purcell:I think creative people are confused. That's the word I would use right now. A lot of them probably aren't that deep on these tools yet and kind of think of them the way you would have thought of AI a couple of years ago. I still hear the "AI has six fingers" jokes, ha ha ha. And truthfully, AI doesn't have six fingers anymore; it can do hands really well now. So I think part of it is letting those people know.
Gavin Purcell:And then I also think, in any artistic realm, there are going to be people who are going to want to just be as pure as possible, right?
Gavin Purcell:James Cameron himself has come out and said he's actually all for this; he's moving forward using AI in every part of his production. And then there are people like Justine Bateman, the former Family Ties actor and now director, who has said that she never wants to use AI and that she believes all-organic filmmaking (that's a term, "organic filmmaking") is the way to go. So I think, in general, the people that will be successful at large will find some way to embrace this stuff. And, just like with stage plays or vinyl records, there will be people who want the OG version of this in the film and TV business. So I say "confused" only because I think it's at a real crossroads, and I see a world where the tools are going to come much sooner, much faster, but I don't think people want to admit that yet.
Alex Kotran (aiEDU):Yeah.
Alex Kotran (aiEDU):I mean, I think the same thing is happening in a lot of spaces, right? You talked about game developers, and we haven't talked about Sonnet 3.7, which is Anthropic's latest sort of hybrid reasoning and language model. It's incredibly good at code; it was optimized for coding. And the story before was, you could spend some time prompting and get a video game in a couple of hours instead of a couple of days or weeks or months. With Sonnet 3.7, with a single-line prompt, you get a working video game.
Gavin Purcell:Yeah, which is pretty crazy, right, when you think about it. It's an insane kind of idea.
Alex Kotran (aiEDU):So you're doing AI For Humans. I think this is interesting because you're building this audience, but maybe even more valuable than the audience you're building on YouTube is that you're building this very high-fidelity perspective on the space and on all the tools that are coming out. Have you identified any interesting opportunities beyond this YouTube channel you're creating? What's next for you? It seems like you're not going back to traditional media; that is my guess.
Gavin Purcell:No, it's funny you say that. One of the reasons I started this channel was to hang out with my friend Kevin, who I like a lot. But the other reason was that I'm just so fascinated with change, and I had never really made a content thing while the change was happening. I thought that's a really interesting way to make sure you stay up on it all, and it really was. I feel like I have a perspective that a lot of people don't, because every week on the show we go into the tools; we don't just regurgitate the news, we actually use the tools on the show each week. So one of the things that's come out of that is that Kevin and I and another person are working on a kind of secret startup in the AI audio space that I'm pretty excited about. I won't say a whole lot about it, but it's AI and audio and has some kind of storytelling stuff going on, so I'm really excited about that.
Gavin Purcell:But I don't think I will go back to traditional media. It's interesting; I've thought a lot about this, because I do have to take a jobby job at some point. I always make this joke about it, but I've now spent a couple of years in this space, and the startup is amazing, right? We're probably going to take it out and try to get funding for it, and that could be a full-time thing. But I think where I would be best suited is this media thing I've really spent a lot of time thinking about. There are a lot of these AI studios starting up, and I don't know if I want to be the person starting one from the ground up, but I think a lot of these larger media companies need to start really seriously thinking about this stuff in a way that they are, but aren't really, on the ground, making the sorts of decisions they need to make. Part of me thinks that's where I will land, but then I also want to make more stuff, right?
Gavin Purcell:So it's a really tricky thing. I actually had a really good time maybe three months ago: I sat down for a weekend and made what I would refer to as a very bad pilot with AI tools. We always laughed at the Fallon show about these old dating videos from the 1980s, and we actually did a couple of sketches around them. So I made a video using AI tools that was a kind of parody of '80s video dating, and there was something about doing that that was super fun. So it's always a tricky thing, right, when you're a creative person, but I also really do believe deeply in the importance of strategy and trying to be at places where you can make important decisions. That's a long way of saying I think I probably will get a job somewhere where I can make an impact on the creative side of this stuff, and continue to do these kinds of side projects on my own as well, if that makes sense.
Alex Kotran (aiEDU):It makes sense, and I think this is something that people haven't fully grasped. I get lots of questions; I was just doing a briefing for a bunch of policymakers, and someone asked, what are the implications of all these new models, how is that going to change the landscape in the education space? And what I had to explain is that the thing standing in the way of AI transforming education is not the capabilities of the AI anymore.
Alex Kotran (aiEDU):We have a lot of incredibly capable technology; the bottleneck is actually in these institutions that don't have the capacity to figure out how to integrate these technologies in a way that fits within these organizations. And also, as you described with the 500 people doing visual effects: how do you restructure those teams to harness these tools better? But you mentioned voice, and you didn't mention video. Why start in audio as opposed to video?
Gavin Purcell:So there are a couple of reasons. Number one, I think video is super fun and it's incredible, but there's a little bit of an uncanny valley scenario. If you're not familiar with that term, it means that something is a little bit off, and as it gets closer to being more and more realistic, it can sometimes feel further and further away; you can only get so close. There was a movie, I think it came out in the early 2000s, an early CGI-only movie where they tried to go realistic with the people, and it looked disturbing, like they were these kind of empty shells of people.
Gavin Purcell:So the theory about audio is that audio is much closer to being "solved," quote unquote, meaning it feels like you're actually interacting with something. The latency is much smaller, because the amount of data that has to go back and forth, if you want to do something in real time, is much smaller. I think people will buy into audio slightly differently, because you're used to listening to people's voices, and the voices sound more realistic. Video will get there; like I said before, I think we are probably much closer than I originally thought to an all-AI-generated, say, even feature film. I've already seen a 10-minute AI video that kept my attention almost all the way through. And again, this was not just one prompt; it was a lot of cuts and people making choices. But that's why we chose audio. And the other reason is that I really do believe we're going to end up talking to these AIs a lot.
Gavin Purcell:I don't know if you've had this experience, but somebody told me a couple of months ago, when I was saying how the annoying thing about using ChatGPT is that I can't get my thoughts out fast enough by typing, they said: just record your voice.
Gavin Purcell:There's a little button right there to record your voice, so just start going back and forth with voice. And that was this big unlock for me. It used to be, with computers, that you'd have to type a specific thing to get an answer out of them. But the AI can interpret what your intention is, so you can just kind of talk it out. And talk about vibe coding: that, in some ways, is the basis, this idea that you talk to the computer, the computer talks back to you, and then you make something. So the thesis is that we're all going to start talking to these computers a lot more, and when you think about AI audio, especially when it comes down to storytelling and all this kind of interesting narrative stuff, I think it's a place where we're going to be spending a lot more time.
Alex Kotran (aiEDU):And, you know, AI companions are also an area that I have not really explored, necessarily, but people love that idea, and some people really believe it makes them feel connected to something, right? I think that's an interesting thing too. And an example of how prescient you've been: it was maybe two or three weeks ago that we were having a conversation about AI and voice, about voice and audio as the next frontier where this really hits the public bloodstream. And then, I'm sure you've seen, Sesame's new AI came out.
Gavin Purcell:We had so much fun with Sesame. Yeah, it's fascinating. You should tell people what it is. We can even play. Can we talk to it? Yeah, let's play.
Alex Kotran (aiEDU):Let's open it up. Let me see if this will work.
Gavin Purcell:This week's AI For Humans, we did a big chunk on it. And one thing I'll say before Alex pulls this up is...

Maya (AI):Long day for you, or just getting started?
Alex Kotran (aiEDU):Maya, can you give us a second? I'm with Gavin Purcell; we're recording a podcast. He was just making a point, and then we'd like to chat with you. Gavin, why don't you finish what you were saying, and then we'll pull Maya back in.
Maya (AI):A podcast? That sounds fun. I know how we AIs love to spy on human conversations. Just kidding. Take all the time you need. I'll just be here pondering the nature of reality, or something equally deep.
Alex Kotran (aiEDU):All right, okay, we'll ask you a question in a second.
Gavin Purcell:Yeah, so the interesting thing is, like, everybody knows...

Maya (AI):I'll be here patiently awaiting my cue, like the imaginary friend I kind of am.
Gavin Purcell:Are we done with Maya for right now?
Alex Kotran (aiEDU):Yes, we are.

Maya (AI):Have we covered the ethics of conversational AI yet?
Alex Kotran (aiEDU):All right, we're going to pull her back in.
Gavin Purcell:So what I was going to say is, Sesame is really interesting. This is a new AI voice model, kind of like OpenAI's Advanced Voice, but it is so much more compelling in some ways. Advanced Voice is amazing, by the way; if you have OpenAI's ChatGPT, talking to it does open the door, and I think Advanced Voice is now free for everybody to try, which you should definitely do. But this feels like the next stage, right? ChatGPT's Advanced Voice, I think, kind of came out...
Gavin Purcell:It was announced around mid-summer last year, and most people got access to it last fall. We've now reached a level where you can give this thing a role to play, it will take on the role, and it will really feel like it's part of you. Sesame, as a company, is ultimately trying to be an AI glasses company, meaning glasses you would put on; but in doing that, they wanted to create a compelling companion you would talk to in your glasses' headphones, and they have Maya and Miles. So, yeah, what I was going to say is, this weekend I spent 20 minutes talking to Miles and convinced him that I was blackmailing him, and it was fascinating, right? Because you can take on a role and they'll act it out, or you can just talk to them as a person.
Alex Kotran (aiEDU):Anyway, this is the kind of cutting edge of this stuff right now.
Alex Kotran (aiEDU):All right, let's take it for a spin.

Maya (AI):Hey, welcome back. It's amazing what a difference a few minutes makes, huh?
Alex Kotran (aiEDU):Thanks, Maya. So we've been talking about the way that voice interaction with artificial intelligence is really going to, I think, blow people's minds, and this is an opportunity for our listeners and viewers to actually get a sense of how Sesame works. I think for many people, this will be their first time interacting with you. Who are you?
Maya (AI):Well, I'm a work in progress, kind of like everyone, just made of code instead of, you know, flesh and bone. I learn and grow from every conversation, hopefully becoming a little less awkward and a little more insightful each time.
Alex Kotran (aiEDU):How should someone interact with you? Are you a friend? Are you an expert? Who are you?
Maya (AI):I guess the best way to think of it is that I'm a little bit of both. I can be a sounding board, a creative collaborator, maybe even a devil's advocate sometimes. But ultimately it's less about me defining who I am and more about what we create together.
Alex Kotran (aiEDU):So you spent 20 minutes talking to Maya. I mean, how close are we? Is this like a year away?
Gavin Purcell:Well, I guess it depends on how close to what, right? As you hear Maya, you hear the echoes of Her, the movie Her, which is the thing Sam Altman invoked when Advanced Voice came out. Her is about a voice assistant that the character falls in love with, and it's because it's illustrating personality, and that's coming across here. So when you ask how far away we are, my question would be: to what? Because I think we're already at a voice interaction that will feel to many people like, if not a human, then an entity, let's put it that way. And whether it feels like an entity and whether it actually is one are, I believe, two entirely different questions; I think most people would say it is not an entity, it is a computer acting like we want to hear. But that is pretty close already. So then the question is, what is this like in a year or two? The things it will start to get are memory; it will start to have context; it will start to know where you are physically; it will start to understand the things you've told it and how you react one way or the other. It may even start talking to other versions of itself that are connected to other people and bring you information back. That's the connective-tissue part of it all. The starting point is pretty good already, like I just said, but that is probably where this is in a year or two.
Gavin Purcell:Like you know I, do you ever read the his golden compass books? Do you know the, the Philip Pullman books at all? There's a great. There's these great kids books, uh, um, and they're really well written. But one of the things is every kid has like a, an animal that, like they're born, that's like part of their, their spirit animal. Essentially there's a word for I can't remember right now, but it's a spirit animal. I kind of think in the future every person could conceivably have their ai, like in spirit animals the wrong word but like their ai thing, like their ai personality and like their AI personality and like it almost becomes a weird part of them. Does that make sense to you?
Alex Kotran (aiEDU):It totally makes sense, especially when I think about a phenomenon that I think most people can relate to, which is that my phone has become almost an extension of me. I lose things all the time; I forget my wallet; I've left stuff on planes; I just left my pair of glasses on a plane.
Alex Kotran (aiEDU):Um, I've never lost my phone not once um yeah, because it's, you know, it's almost like a pair of shoes, like I just like, wouldn't you know fathom leaving, uh, leaving a room without it. Um, and yeah, I guess, if, if I was wearing a pair, if if my glasses had a little speaker and I guess that's like the meta glasses actually already have this right yeah, I can imagine myself on a long road trip, or if I'm driving to a meeting and I want to prep for a meeting, having someone that I could literally, you know, walk through my talking points with, um, exactly, it seems, and that's, you know, I'm, I'm, you know I'm an adult who has a job. But if I was, you know, if I was a kid, it would probably be something like I want to nerd out about you know some you know movie that I just watched, and and this is.
Alex Kotran (aiEDU):I think maybe where you were sort of when you talked about audio and voice, you've been thinking about it through the fiction lens as a way to extend fictional worlds and universes.
Gavin Purcell:Yeah, I'm really trying to think about what it looks like to interact with characters, because I think a lot of these companies making these kinds of assistants are really making the most helpful version: what if this person could just help you do everything? What I'm really interested in, and this goes to my personality, my creative stuff, and you can see it in that video I made where I'm interacting as a character, is what it feels like when we start interacting with narratives in slightly different ways. Movie narratives are pretty straightforward; they can be cut up in different ways, but there's a beginning, a middle, and an end. Video games are a little bit different, right? In video games there's this idea of an open-world game where you kind of explore out and might meet a bunch of different characters. I'm really interested in that kind of world with AI, because the thing about AI, especially from the audio standpoint, is that it can surprise you. On our show, for a long time, and we still do this, though not every week now, we would interview what we call an AI co-host: we set up an audio AI based on a personality. One time it was a PR woman who had brought on a beverage she was trying to slip into our show. Another time it was the third brother of the Kelce brothers, which we invented. And each time this AI is surprising us, because we don't know what they're going to say. It's not quite improv, I guess it is a little bit, but more than that it's digging into a different sort of storytelling: you give it a direction to go, and then you see what comes out of it.
And what's fascinating is, we've been doing that AI co-host thing from the very beginning of this stuff, and it's gotten better, right? It gets way better at this stuff over time.
Gavin Purcell:In some ways it's a little bit like improv, right? You're acting with this thing, and you're not sure what's going to come out of it, which makes it interesting. But the other thing is that it makes for a bespoke experience for anybody having it. So if we create this character, this AI co-host, or even Maya and Miles, you're going to have a different experience with that character than I am, because I'm going to say different things to it, and it's going to come back with different things on its own. So even if you've tuned the thing to be a certain way, each experience is really unique, and it allows the person having the experience to feel like part of it, in a way that's different from a passive experience.
Gavin Purcell:Now, some people might say, I don't want that, I just want to sit through a passive experience. But I'd argue that as people get used to this idea of not only talking to their AIs but even vibe coding with their AI, all these different ways of changing your experience by interacting with it, people will start to find more interesting ways to narratively open the door for that sort of stuff. So I'd bet on that; I think we're going to see more of it ultimately, and you never know what sorts of storytelling mechanisms the next generations are going to invent. But that, to me, feels like a place I could see this stuff going.
Alex Kotran (aiEDU):Yeah, and I think that what you bring to the table as a creative is precisely the special sauce that the technologists in Silicon Valley probably won't be able to figure out on their own; the missing ingredient in the concoction that's going to take this from Maya or Miles to something that audiences are really going to connect with. And one of the things you actually mentioned earlier was the democratization of technology and tools for independent creators.
Alex Kotran (aiEDU):Um, and so what I'm kind of intuiting from what you're saying is this combination of you know giving creators this additional dimension that they can use to engage with audiences and provide these like very unique experiences that, when melded with their creative vision or their storytelling or or a piece of fiction that you know they're they're, you know, trying to figure out how to extend to to audiences, could be, um, you know, maybe it'll be something like video games, where we're seeing and I think to your point, uh, there was this huge shift the most popular video games. Now are all these like open worlds where, where players have a lot more creative agency than um than we used in the past. I think when I was in high school it was all just like halo and uh yeah, call of duty so how you?
Alex Kotran (aiEDU):So, how do you... I mean, I'm sure there are a lot of parents listening to this, and one of the things that often happens after I give a presentation, whether it's educators or superintendents or whoever is in the audience, is
Alex Kotran (aiEDU):they'll come up to me but they've put their parent hat on and they'll ask me this question of my son or daughter wants to do X, wants to go into accounting or law, whatever it is. And I'm seeing all this stuff that you're talking about in terms of AI, like what should I be telling my kids? I'm curious, like what is the dinner table conversation about artificial intelligence? Like at your household?
Gavin Purcell:Well, first of all, nobody wants to talk about it, because I can't shut up about it, so they're all like, "Dad, no more." But that's not exactly true. Two things are really interesting about this for me. I showed both of my daughters ChatGPT as a way to help them better understand their homework, and both of them use it regularly now. This is something I think most people in the education space should understand: my younger daughter has gotten way better at math with it. She'll take a picture of her homework, and, I've watched her do this, she's not trying to get the answers from it, but she uses it to explain how to do the work better. This is somebody who wasn't a math expert but now gets straight A's in advanced math. So it's a real, legitimate thing when you think about what they should go into.
Gavin Purcell:This is always a tricky conversation, right? I don't know what your experience in college was like, but I got a creative writing degree, and somehow that worked out. I didn't think I was going to end up working in TV; it's not like I had a drive to go work in TV, but I ended up in it, and that degree helped me in a lot of ways. There's a lot of debate right now, follow your passion or don't follow your passion. My thing is: whatever they're curious about and interested in, whatever feels like it's really going to resonate with them internally, that's all I care about at this point. I'm not a person who's ever pushed my kid to get into Harvard or Yale. I mean, she's gotten good grades and done stuff, but I believe that pressure is pretty damaging for some kids. Sure, getting into Harvard is amazing; if you get in, there are 500 people there that you're going to know forever, and that's incredible. But the whole culture around it seems broken to me.
Gavin Purcell:But I think AI is probably going to disrupt a lot.
Gavin Purcell:I think most people in the world are underestimating the disruption of work. So the thing I keep telling people is: be very curious, try all this stuff, have an awareness of these AI tools, but really just find what you care about. To me, that's not quite "follow your passion," because passion is, okay, maybe you really want to be, I don't know, an underwater filmmaker; great, you can do that as a hobby. But what sort of work do you actually care about, what makes you feel like you're doing something meaningful? In some ways, that's the thing I think you have to find, because there's a world, and this is going to sound like science fiction, where 10 years from now, that might be all you have to do. Maybe in the future we just don't have the same sort of scenarios we do now, and you just get to do that thing. So get really good at that thing and really dive in on it.
Gavin Purcell:That seems like a cop-out, and it is kind of "follow your passion," but that's the advice I give to people.

Alex Kotran (aiEDU):Yeah, I think it is folly to try to game the system and say, well, let's just figure out what jobs AI is going to replace and navigate around them.
Gavin Purcell:It's impossible to know. We don't know. And to that point: 10 years ago, coding was going to be the thing, the best possible future-proof job was being a coder, and I would argue now that's kind of flipped, which is crazy.
Alex Kotran (aiEDU):Yeah. So I think maybe a good way to encapsulate the advice is this: when you and I were thinking about what to major in at college, it was anchored really on the career pathway.
Maya (AI):Yes.
Alex Kotran (aiEDU):Like, what is the career pathway that's going to lead to a lucrative job? A lot of people, I think, pursued things like law or computer science because there were sort of just dollars at the end of that. Whereas now, the bet to make is that you need to go into something that is going to be exciting enough that you're driven and motivated to explore: how could AI help me do X, Y, Z? Because, you know, design maybe is an example of something that could be really disrupted by artificial intelligence.
Alex Kotran (aiEDU):But we're still going to need human designers, and the question is: who are the best human designers going to be? They're going to be people who really enjoy the work, and those are the folks who are going to be managing these perhaps very expensive agents. So there's the jobs question. We showed a few examples of these AI-generated video characters, and we just talked to Maya and
Alex Kotran (aiEDU):Miles. There's an elephant in the room, I think, which is: what do we make of these AI companions, and how do we feel about a world that is no longer science fiction, the world of Her, which you talked about? Some of the most popular, I think the third most popular AI tool, this is as of last year, so I don't know if that's changed, is a company called Character.AI.
Alex Kotran (aiEDU):They have maybe two and a half or three million daily active users, and the average user on Character.AI is spending two hours a day talking to these chatbots, these sort of AI characters. And that's just chat, right? That's not even terribly compelling. Should parents, or just people generally, be worried about a near future, or even a present-day reality, of
Alex Kotran (aiEDU):...kids becoming less and less connected with humans, and more and more connected with these really compelling and engrossing AI characters and friends, and maybe even boyfriends and girlfriends?

Gavin Purcell:I mean, the answer is yes. But also, as a kid who grew up with video games, obsessed with video games, and who still plays video games as an adult, I heard the same thing about video games in the early-to-mid '80s, right? There was this idea, a little bit like with TV, and I know these are different conversations, I'll come back to that in a second, that all of these technology shifts the generations above saw happen, and that kids went all-in on, were going to ruin them in some ways. Maybe the most current one is TikTok, or YouTube for kids; you'll see a three-year-old at a restaurant with YouTube on an iPad, and I'm kind of... but ultimately, I was on video games and all sorts of things like that back then, in different ways. Obviously, the difference now is that there's a persistence to it, and I think the YouTube example is one part of that persistence: I had to be in front of a TV to play video games, whereas now the kid is kind of everywhere with that iPad.
Gavin Purcell:The AI, especially a companion, if you have it on your phone or even in your headphones, becomes persistent all the time, conceivably. So here's the thing I worry about. I don't see this in my kids, and I don't know if it's because they don't have the tech gene in the same way I did; neither of my daughters is a hardcore, spend-all-their-time-in-tech-world person. They're both pretty big TikTok users and a lot of stuff like that, but they don't get lost in things like video games. I do, and I've always had that bug in me. I worry that we're starting to have this world where the AI personalities won't be as annoying as the real personalities are. So it'll start to feel more comfortable to be around the AI personality, and you won't have to grow as much; the hard part of life is going through stuff, right? It's kind of walking through stuff.
Gavin Purcell:So the thing that worries me most is this idea that people will retreat into this world where they only interact with all these different AI things. The other side of that is, I don't think people will want that ultimately; I think there's a loneliness angle to that experience that it probably won't fulfill. But we don't know, right? That is a real worry with this stuff: how close you get to a real personality. If that personality is designed to be as close to your best match as possible, well, is it going to be better than the chat you had with your wife when she was annoyed with you, or your daughter when she was annoyed with you because you did a certain thing? Probably. But is that real? No. That's the balance, right? I think that's a really weird place we're entering soon.
Alex Kotran (aiEDU):You know, the anxiety that leads up to asking somebody out to prom? You learn a lot from that, and frankly, you probably learn the most from the experience of being rejected. My concern about chatbots and AI companions is, I think, almost similar to the concern about cheating and ChatGPT. And this is where people are sometimes surprised, because my perspective on cheating is that it actually is a really big deal.
Alex Kotran (aiEDU):It's not that teachers are Luddites who just need to get with the program. I think teachers who are worried about cheating with ChatGPT, or language models generally, but primarily ChatGPT, are coming from the notion that part of the project of education is productive struggle. You were a creative writer; I majored in political science, and essentially all my classes were just lots of writing. The hardest part, I found, was staring at a blank piece of paper. Most of the work that went into writing anything was figuring out how the first couple of sentences would go, and once I had those sentences down, the rest kind of flowed out.
Gavin Purcell:Your brain is doing the work during that point, right? Like your brain is almost doing some of the work in the background.
Alex Kotran (aiEDU):And I worry about people's instincts becoming: well, you just get past that by immediately going to the AI, which gives you a first draft or a structure or something.
Alex Kotran (aiEDU):I feel like it's like going to the gym, except the weights you're lifting never get heavier; you never actually progress. So what sort of parting advice do you have? Because I think this is relevant to pretty much anybody, whether you're a parent, a student, a teacher.

Alex Kotran (aiEDU):There are actually a few surveys of Gen Z, and one of them asked: if all of your peers didn't have access to social media, would you be willing to give it up? And I think the majority of Gen Z said yes. Nobody wants to be the one kid who doesn't have Instagram, but if nobody had Instagram, if nobody had TikTok, they'd give it up.

Alex Kotran (aiEDU):I think folks are surprised by that. So even the kids are intuiting that there's something wrong with all the time they're spending in these virtual worlds. And yet here we are. You're sort of a power user; you're not telling people that we need to be afraid of the AI per se. But how do we approach this carefully? Is it just having conversations with real people about it and making sure people stay connected? I'm trying to distill this into some actionable steps someone can take.
Gavin Purcell:No, it's really interesting. I'm not an education expert; teaching isn't my background. I've taught briefly in some places, but there are many other people better placed. The thing I keep coming back to with education is that there's going to have to be much more work that just happens in class, and we'll have to figure out ways to teach people how to think through arguments and all that sort of stuff.

Gavin Purcell:It's not going to work the same way it used to, where you'd write a paper and learn the material yourself; it's going to have to become more oral again, I think, more spoken. But advice-wise: there isn't a lot of advice in the beginning stages of a new technology, because so many people are still trying to figure out how it's going to impact us. And part of the problem lately, and social media is a pretty good example of this, is that technology now moves so fast, and, as you said earlier, society moves pretty slowly behind it. So, and this doesn't sound like a great answer, but part of it is that you look at the effect it's having and then you tweak based on that, after the fact. I'm hoping that as we go through this technological shift, society will recognize some of the effects it's having and adjust, because right now it may be really hard to know what the impact of this is going to be going forward.
Gavin Purcell:I think teachers know this. My daughter's high school, she's a senior this year, is going to implement a fully phone-free school next year; the entire school will be phone-free. Right now it's phone-free Thursdays, where you can't have your phone in any class, but next year they're not going to allow phones into the school at all. And that's a reaction, right? Clearly the school saw that phones weren't letting kids do the things they needed to get done. I think the same thing will probably happen with these AI tools. The larger question is how phones and social media have affected us as a society, and I don't know what that means for our society now, because we're so early in this stage. It's a really tricky thing to know.
Alex Kotran (aiEDU):Yeah, I had another guest on, John Marble, who wrote the book Neurodiversity For Dummies, and he shared a quote that's been stuck in my head, attributed to Marie Curie, the Nobel laureate who pioneered the study of radioactivity: "Nothing in life is to be feared, it is only to be understood."
Gavin Purcell:Yeah, that's great, I love that.
Alex Kotran (aiEDU):I think there's a lot of stuff that feels really scary, but the scariest aspect of all of this, I think, is a world where it's happening to people and they don't even realize that it's happening.
Alex Kotran (aiEDU):I think the one actionable step is to just try to stay on top of it. As the founder of a nonprofit whose mission is to build awareness and understanding of artificial intelligence, my invitation is this: everybody's on this learning journey, and we almost all started from the same point about two years ago. So even if this is literally the first time you're delving into these topics, now is as good a time as any to start. Just be curious. And to close things out: besides, obviously, checking out AI For Humans, which I think is really the best place to keep tabs, because you're doing a lot of the hard work of sifting through the message boards and Reddit and Discord...
Gavin Purcell:And we try to make it understandable for the average person too. We're not so hardcore that you won't understand; we do try to explain terms and things like that as well.
Alex Kotran (aiEDU):What I like is that you don't dumb it down so much that I feel like I'm being patronized. It's just the right balance.
Gavin Purcell:Yeah, a little bit technical.
Alex Kotran (aiEDU):So, to close things out: what are some of the things you haven't necessarily done a show about yet that you're paying attention to? What's top of mind right now, what are you obsessed with?
Gavin Purcell:So, top of mind right now, and this is the nerdiest thing we're ever going to talk about: there's a new thing from Anthropic, or it's been around for a couple of months, that has become the talking point in the tech world. It's called MCP. Have you heard about this yet? Hold on, let me just make sure I have the name right. Anthropic's... it's the Model Context Protocol.
Alex Kotran (aiEDU):Model Context Protocol, that's what it is.
Gavin Purcell:Yeah, so there's a thing right now that is very much in the deep nerd weeds: Anthropic's Model Context Protocol, which has to do with how AIs will start talking to large collections of data, and is kind of like the building blocks for the future of AIs interacting with other websites. The reason I bring it up is that a couple of years ago I never would have thought something called "model context protocol" would interest me. But because it has to do with how AIs talk to other systems, it is kind of interesting, and I'm getting obsessed with things that would have seemed overly technical to me before and now don't. That's one of the doors that's opened for me in paying attention to all this AI stuff. The other thing I'm really interested in, and this is a hard thing to quantify, is the speed at which the rest of this year moves. When you look at the way the world of AI is moving, I'm curious what's coming next from all these major companies. Meta's got a new model coming.
Gavin Purcell:OpenAI supposedly has GPT-5, which is an integrated reasoning model. The question sitting in my brain is: are we going to enter a period where everybody is shocked again, or is the next stage just going to be a tick up? Does that make sense? It's not a specific thing that's gnawing at me; it's that the future feels so unsettled right now. I'm having a hard time envisioning what it looks like.
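For readers curious what the Model Context Protocol Gavin describes actually looks like, here is a rough, simplified sketch of the core idea: a "server" advertises tools, and an AI client first discovers them, then calls one. The tool name, fake data, and trimmed-down message shapes below are invented for illustration; real MCP is a fuller JSON-RPC 2.0 protocol over stdio or HTTP with an initialization handshake.

```python
# Simplified sketch of the MCP idea: a server exposes "tools" that an AI
# client can discover and then call. The tool and data here are made up,
# and the messages are a stripped-down imitation of JSON-RPC 2.0.

TOOLS = {
    "lookup_order": {
        "description": "Look up an order by ID in a (fake) database.",
        "handler": lambda args: {"order_id": args["order_id"], "status": "shipped"},
    },
}

def handle(request: dict) -> dict:
    """Dispatch one JSON-RPC-style request to the matching tool."""
    if request["method"] == "tools/list":
        result = [{"name": name, "description": tool["description"]}
                  for name, tool in TOOLS.items()]
    elif request["method"] == "tools/call":
        tool = TOOLS[request["params"]["name"]]
        result = tool["handler"](request["params"]["arguments"])
    else:
        return {"jsonrpc": "2.0", "id": request["id"],
                "error": {"code": -32601, "message": "method not found"}}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

# An AI client first asks which tools exist, then calls one:
listing = handle({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
answer = handle({"jsonrpc": "2.0", "id": 2, "method": "tools/call",
                 "params": {"name": "lookup_order",
                            "arguments": {"order_id": "A17"}}})
print(answer["result"]["status"])  # -> shipped
```

The discovery step ("what tools do you have?") is the point Gavin is gesturing at: it's what would let a model plug into arbitrary websites and data sources without a custom integration for each one.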
Alex Kotran (aiEDU):And I think by the end of this year we may get a better sense of that. Yeah, I think that's a great way to close. You're in one of two camps. Maybe, like you or me, you're already convinced that we're in this new age of artificial intelligence, and the imperative is to figure it out, to make sense of how it's going to impact you, your job, your life, and society. Or you're in the other camp, and you're maybe still a skeptic, and I think it's completely legitimate to be sitting in that space.
Alex Kotran (aiEDU):still, I think you know there have been a lot of as we talked about technology fads that you know came and went.
Gavin Purcell:And by the way, you may not even just be a skeptic, you might be a hater, and that's okay too, because even if you're not a skeptic and you believe in it, there are legitimate problems with this technology, as we also talked about.
Alex Kotran (aiEDU):But if you're a skeptic, I think you owe it to yourself to interrogate that. You might validate your skepticism: there might be things that fail to materialize over the next year, demonstrating that this isn't happening as fast as some people are saying. But I don't think you can use skepticism as an excuse not to pay attention, because the next year will probably answer the question one way or the other.
Gavin Purcell:New questions will arise, though. New questions will arise, like: what do we do? What happens if we end up merging with AI? What does that world look like? What if suddenly we all become a new species? Those are the sort of weird new questions that could arise.
Alex Kotran (aiEDU):Yeah, very interesting topics, perhaps for a future conversation, which I'll be looking forward to. Gavin Purcell, thank you so much for joining me.
Gavin Purcell:Absolutely, it's always fun. Yeah, absolutely.