Raising Kids in the Age of AI

Preparing kids for careers in an AI world

aiEDU: The AI Education Project Season 1 Episode 6

Are you worried about preparing your kids for jobs that don’t exist yet? 

In this episode, we dig into the changes that AI is bringing to work and school. First up, materials scientist Ashley Kaiser reveals how AI is powering “self-driving labs” to offload repetitive tasks, which gives her more time for creative planning and scientific analysis. 

Next, Google’s Ben Gomes explains why the next era of education must emphasize concepts over mechanics. He also discusses why curiosity, problem-solving, and cross-disciplinary thinking will define future-ready talent. 

Across both conversations, we talk frankly about the shift from jobs to tasks and why routine work is most exposed to automation. But that does not make human workers less important — it actually makes human strengths more valuable. Critical thinking, clear writing, ethical reasoning, and the ability to frame problems will become the core skills of employability in AI-driven workplaces. 

We also hear practical AI guidance for parents and students:

  • Build real experience through internships and authentic projects.
  • Use AI to accelerate learning while double-checking outputs.
  • Blend STEM with humanities to strengthen judgment and communication.

If you’re wondering what to study, how to break into a first job, or how to keep your skills relevant as technology evolves, this episode offers a clear and optimistic roadmap for thriving alongside AI. 

 


SPEAKER_03:

We think about the future of jobs. We think about different types of jobs that don't even exist yet. And there are so many new avenues that haven't even been defined yet. I think that the world is truly open.

SPEAKER_01:

I do think it's a particular kind of worry that parents have about their kids' futures. It's the thing that comes up in almost every conversation, no matter who it is, whether it's a superintendent or a policymaker or a teacher. We'll be talking about the future of education, and you can see the look on their face change. And they also ask, "But what about my kids? What should I be telling them?" And as parents, you have a role in helping them prepare, right?

SPEAKER_02:

Yeah. I mean, it's the whole thing about raising good, adaptable, capable humans. You're raising them to one day walk into the world to do their own work, to build their own lives, to have autonomy. It is about getting them ready for that. So it does bring up a lot of questions about what our kids' adult lives will look like and how we can best prepare them for those lives.

SPEAKER_01:

And that's exactly what we're going to be talking about today: how do we prepare kids for the future of work? You're listening to Raising Kids in the Age of AI, a podcast from aiEDU Studios created in collaboration with Google. I'm Alex Kotran, the founder and CEO of aiEDU, a nonprofit helping students thrive in a world where AI is everywhere.

SPEAKER_02:

And I'm Dr. Aliza Pressman, a developmental psychologist and host of the podcast Raising Good Humans. On this episode of the podcast, we're exploring how you can help prepare your kids for careers in an AI world. And it's something really close to my heart because my kids are teenagers. One of my kids just started college. And so it's really the next phase. I'm going to be listening very closely.

SPEAKER_01:

Later in the show, we'll hear from Ben Gomes, Google's chief technologist for learning and sustainability. He's going to share his thoughts about what expertise is actually going to be valuable in a future job or career in the world of AI. First, we're going to hear from Dr. Ashley Kaiser.

SPEAKER_03:

My name is Ashley Kaiser, and I am currently a scientist at Lila Sciences. My field is essentially chemical engineering, and I'm really interested right now in building new systems with AI for science.

SPEAKER_01:

So imagine laboratories that are self-operating: a combination of lab equipment, hardware, and software, but also scientists who are overseeing and managing all of those technologies working in concert. As a scientist specializing in materials characterization and testing, Ashley currently focuses on chemistry and nanomaterials. But her passion for science started at a very young age.

SPEAKER_03:

When I was a young kid, I would go around my yard and smash rocks and collect them. I thought they were pretty. But as I got older, I started to question, you know, well, why are certain rocks blue? Why are other ones gray? Well, that's really because of what they're made of. And I think that the passion for rocks has really led me to where I am today, you know, in this type of field.

SPEAKER_02:

Ashley went on to get her bachelor's in chemical engineering and later earned her PhD from MIT in materials science. And she says she'd take just about the same path all over again if she were to start out today, maybe just adding a class in coding or learning a little bit about AI. Because recently Ashley started working at a company that applies AI to all stages of the scientific process to increase the speed and scope of discovery.

SPEAKER_03:

So the ultimate goal is that we will have an AI that will be able to run our labs, and that the AI will essentially conduct every single step in that wheel of science: we ask a question, we have a hypothesis, we run the test, we look at the data, we feed that back into the closed loop. And then we as humans are kind of watching this whole thing, guiding it, which basically means that we can design labs that will run themselves so that we can experiment faster and faster.

As a scientist, I'm so excited about data. I think all people say, oh, I have a question, I want to answer my question, I'm gonna run this test, and then you wait, and then you want to see that data come back to you. In the olden days, when I would work in labs, it would really be just me: I'm moving pipettes left to right, I'm moving my samples around, I'm carrying trays around the lab myself. And I can do that. But now we can do it so much faster, because we are designing systems where I don't have to do every single step in that workflow. I can design the workflow and then it can just happen.
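
To make the closed loop Ashley describes a little more concrete, here is a minimal, hypothetical Python sketch of one self-driving-lab cycle: propose conditions, run the experiment, record the data, and feed the results back into the next proposal, with a human reviewing the outcome at the end. The function names, the toy yield model, and the simple proposal strategy are all invented for illustration; a real system like the one Ashley works on would use far more sophisticated AI planning and actual lab hardware.

```python
# A hypothetical sketch of a closed-loop "wheel of science" workflow.
# Nothing here reflects a real lab's code; it only illustrates the loop.
import random

def propose_conditions(history):
    """Pick the next experimental conditions.

    A real self-driving lab would use an AI planner (e.g. Bayesian
    optimization); here we simply perturb the best conditions seen so far.
    """
    if not history:
        return {"temperature_c": 25.0, "concentration_m": 0.5}
    best = max(history, key=lambda run: run["yield"])
    return {
        "temperature_c": best["conditions"]["temperature_c"] + random.uniform(-5, 5),
        "concentration_m": max(0.01, best["conditions"]["concentration_m"] + random.uniform(-0.1, 0.1)),
    }

def run_experiment(conditions):
    """Stand-in for automated lab hardware executing the experiment."""
    # Toy response surface with an optimum near 60 C and 1.0 M.
    t, c = conditions["temperature_c"], conditions["concentration_m"]
    return max(0.0, 1.0 - ((t - 60) / 100) ** 2 - (c - 1.0) ** 2 + random.gauss(0, 0.02))

history = []
for cycle in range(20):                        # the closed loop
    conditions = propose_conditions(history)   # hypothesis / plan
    result = run_experiment(conditions)        # automated execution
    history.append({"conditions": conditions, "yield": result})  # data fed back

# The human scientist reviews the loop's output and decides what to build next.
best = max(history, key=lambda run: run["yield"])
print(f"Best run: {best['conditions']} -> yield {best['yield']:.2f}")
```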

SPEAKER_01:

This is like the big dream that we often hear about in terms of the promise of these AI tools: that they're gonna take over the boring tasks and give us more time back to think bigger, more deeply, and basically do the higher-order work that we tend to like a lot better.

SPEAKER_03:

AI allows me to do more of the creative tasks, for sure. I can spend more time thinking about the next step or the scope of that next build, brainstorming it, vetting it out, talking to other people about it, because I don't have to be in the lab moving plates around all day. I truly do have more time to think. And I would say it's relaxing, but we spend so much time doing this and we're so excited about it that it's still not really relaxing.

Bringing AI into our work does a lot of good things for us. It really makes the effort that we put in go farther. It allows me to spend less time on tasks that would just take a long time. If I have to go through a bunch of information, in the past I would just do that myself, and it would take forever to open up a bunch of files, copy this from here to there, and then put it in a table. There are a lot of really basic tasks like this that AI can do so well. So it really increases how efficient I can be while I'm working, and that's really across the board. That's been one of the biggest changes: it's such a time saver. And in the way that AI helps me streamline my work, it allows me to be more of a co-creator for what I'm doing. It can help me brainstorm new ideas, and at the end of the day, I'll take in all the insights and then I'll make my own choice. But it's very cool. It almost feels like I have a co-pilot next to me that helps me work faster and better and smarter than in the past, when it was just me and my brain.

I don't think that humans working in labs at every scale is gonna go away anytime soon. Humans will always have to be there to build that next system. So I don't think that we will ever run out of problems to solve or things to optimize. Scientists, engineers, pretty much everybody in STEM, I think we would all agree that even when you wrap up part of a project, you're not really done. You're done for now, until you improve it, upgrade it. It's akin to how, in society now, we would rather fly on a plane than walk if we have to travel across the whole country. So as we build our AI and find more problems to solve, instead of walking through science, a human physically walking through the lab, we will truly fly through science at a faster scale, and it's gonna be awesome. Amazing. No, not awesome. You can't say awesome on the podcast.

SPEAKER_01:

It is awesome. I think you can say awesome on the podcast, Ashley. And we're hearing a lot of the tenets of what I think is exciting to people about the role that AI is going to have in the future of work. If you go back to the dawn of computers, there was a time when people literally wrote out spreadsheets by hand. So I think there's a lot of low-hanging fruit here. And I think what's powerful is this vision of AI not replacing people entirely, but creating more space for brilliant people to spend more time on the things that we want people to be spending time on. I don't know that anybody really believes that it's good to just waste time filling out spreadsheets by hand. And in science and in law and medicine, in so many different fields, I think there are a lot of analogs to that. But Aliza, I'm curious, what are some of the routine things that you had to do back when you were getting your PhD that AI might be able to replace?

SPEAKER_02:

I connect with this so much, because even just thinking about graduate school, you're doing a lot of coding and inputting, and it's just tons of hours of work. There was probably benefit to spending some hours on that, but not the thousands of hours that you put into it. In retrospect, it seems pretty exciting that you could just get rid of that. But even now at the hospital, there is so much low-hanging fruit. There are still hours and hours spent filling out charts by hand, constantly. I think it's really exciting to think about how AI can take all that off the table. What are the burning research questions? You need a human being to figure that out. So AI is not taking away the thinking, but boy, does it sound like the slow parts of research can be sped up so much, which is just really exciting.

SPEAKER_01:

I think you're describing the ideal. This is why it's so important to have people who are being really intentional about when I should be playing the role of the human expert, and when it's the right place for me to automate. It's worth doubling down on this point: it's not so much that AI is going to replace entire jobs, it's gonna replace tasks. And the more tasks you do in your job that can be or will be replaced by AI, the more vulnerable you are and the more you're gonna have to adapt. So for the question of what is the future-proof career pathway, I don't think there really is an answer. Anybody who tries to answer that question, I don't think we should take terribly seriously. I think what matters for kids, if we think about advice that a parent needs to give, is that you need to pursue something that you're truly interested in, maybe not passionate about, but something that you really want to learn about, where you're gonna apply yourself in the ways that will develop the critical thinking skills and the problem-solving skills that companies and organizations are really gonna be valuing. And I think in a way that's actually freeing, because it's a shift away from the very vocational posture that, when I was in school, was doctor, lawyer, engineer. If you get a humanities degree, it's just gonna be a lot harder to find a job or career path.

SPEAKER_02:

I got a humanities degree.

SPEAKER_01:

Same, yeah.

SPEAKER_02:

I read recently that this is kind of the era of the revenge of the humanities, and I was so excited by that.

SPEAKER_01:

You know, I think people are starting to realize that it's not just about going to school for four years and being really good at writing code, or really good at writing a legal memo or creating a business plan. We're seeing the results of technology already starting to shift which skills and backgrounds are preparing kids for the future. And I think anything where they're building this skill to think more broadly about challenges or problems, that's the type of stuff that organizations are really starting to place more and more of a premium on.

SPEAKER_02:

For sure. My daughter is doing physics and English literature because she wants both the part of her brain that's thinking about problem solving and the part of her brain that's thinking about the beautiful synthesis of the human condition, combined with writing. And I think it's very exciting, because there isn't that worry anymore that there had been in years past of, "But what will I do with that?" Next, we'll hear from Google's chief technologist for learning and sustainability, Ben Gomes.

SPEAKER_01:

Ben began his career as one of the first principal engineers at Google, and he's been with the company for over 20 years. Today, he works closely with teams using technology and AI to develop tools to combat climate change and improve the way we teach and learn.

SPEAKER_02:

And I was kind of surprised to hear how much he thinks not just about those things, but also about how jobs will change in the future and about building tools to ensure young people today can carve out their own path.

SPEAKER_00:

You know, very often I talk to people about what they've chosen to do or what they want to do. And very often they want to do that specific thing because they know one person who has done that specific thing and succeeded. But the world is a lot broader than that. So I do think there's the ability, first of all, to dream more broadly. Now, what's also true is that the tools in many of these areas have become a lot simpler to use. So there are many ways in which I think new areas are opening up. People might have thought, oh, I'm not good at math and therefore I can't do this. But the mechanics of arithmetic, and math in general, should not be a barrier to many fields today. Much of that can be done by machines. Mathematics is a very particular example because a lot of people are afraid of math, and therefore they stay away from whole fields because they're afraid of dealing with math. And AI can be a supplement in a very interesting way, because it unlocks a pathway that might not otherwise be possible for you.

Of course, deep expertise in some areas will continue to be important. There are complicated concepts in every field that you need to understand. If you're going to do medicine, you're going to have to understand the human body, the biology and the chemistry of what happens in people, the ways in which diseases progress and initiate and don't initiate. That doesn't go away. But perhaps less important is the ability to do manipulation of various kinds. Let me give you an example. When I was studying calculus, I spent about a month on limits and then years on the mechanics of calculus, on integration and differentiation. I was able to solve all the problems and actually do well on the exams without a deep understanding of what exactly I was doing. That needs to change. Education needs to emphasize far more getting the underlying concept of what you're doing, while machines will be better able to do the mechanics of the detail. But you still need that concept.
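
To make the contrast Ben is drawing a bit more concrete, here is a small worked illustration of our own (not an example he gives in the episode): the concept behind a derivative is the limit of difference quotients, while the mechanics are the memorized rules a machine can readily apply.

```latex
% The concept: the derivative defined as a limit of difference quotients,
% worked here for f(x) = x^2.
\[
f'(x) \;=\; \lim_{h \to 0} \frac{f(x+h) - f(x)}{h},
\qquad
\frac{(x+h)^2 - x^2}{h} \;=\; \frac{2xh + h^2}{h} \;=\; 2x + h \;\longrightarrow\; 2x \quad (h \to 0).
\]
% The mechanics: the memorized rule a machine can apply without understanding.
\[
\frac{d}{dx}\, x^n \;=\; n\,x^{n-1}.
\]
```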

SPEAKER_01:

As we talked about in our last episode, AI is forcing us to think about the mission and purpose of education. What do students need to be able to do at the end of the day? For Ben, thinking about the big picture, understanding the larger pieces and how they fit together rather than just getting hung up on the mechanics, that's the key to being ready for a job market that requires flexibility, problem solving, and a broad inclination toward curiosity and learning.

SPEAKER_00:

Because the tools are getting so much easier to use, and access to information is getting so much easier to have, people are able to operate across multiple disciplines a bit more easily. I think the workplace has been very siloed into specialties over time, and there will be people who take the time to build expertise in those concepts. But as time goes on, I can see more of a role for generalists. There is the need for critical thinking, for analysis: first of all, for identifying the right problems, for breaking down a problem, for monitoring the solution to it, for having the right curiosity to approach something new. All of these are really going to be important skills in the future. But beyond that, I think the meta-properties of learning, the curiosity to learn, the knowledge of your own ability to learn, your confidence in learning, are going to become more important than the details of exactly what you learn. Because things are changing pretty rapidly.

SPEAKER_01:

I think for a lot of us, when we're talking about the future of work, our minds go to the beginning of it all. That first job out of high school or college. What is AI going to do to those entry-level jobs?

SPEAKER_00:

There is this concern that getting into that first level of a job becomes more complicated. One of the things I would say, and I can speak for computer science with a bit more depth, is that you're now able to do a lot more while you're an undergraduate to learn about that first level of a job. I do think things like internships will become ever more important: engaging with industry, engaging with your future workplace, so that when you join the workplace, you're able to contribute at the level that is expected of you.

I actually see a world where AI takes on some of the more mechanical aspects of work as a world in which the human interaction becomes even more important. If I think about teachers, many people who go into teaching do so because they really want to inspire kids to learn. And then they get into teaching and find that their days are taken up with a lot of paperwork and things they have to do in the process of teaching, and there's a lot of burnout among teachers. You might hear the same thing from physicians as well. They're spending their time on things that were not why they got into that profession. It's not what makes them get up in the morning with excitement about what they're doing. And I think here AI can take on some of those tasks and leave the person more free to focus on the human aspect of it, which is often why they came to that profession in the first place. How does a teacher actually inspire a student? How do they ignite that bit of curiosity in them, that spark that will last a lifetime?

At the end of the day, we are solving problems because of other people, for other people, not in the abstract. There's no shortage of problems in the world to solve; there's an almost infinite number of them, whether we think about things like healthcare or poverty or access to resources or the climate. But we need to be able to analyze those problems, identify them well, figure out ways to approach them, and then collaborate with other people to solve them. And bringing all of that together is going to require very human skills that are going to be critical for us to solve these problems well.

SPEAKER_02:

The idea that AI is going to allow us to help each other more seems like the ideal scenario. Curiosity, critical thinking, problem solving: we keep talking about them so much in the context of how to deal with AI, but I also think those skills, and empathy, and the very human skills that we can keep growing, are going to be so valuable. Right? Hopefully. Maybe.

SPEAKER_01:

Yeah, I mean, that's sort of the mantra. It's our mantra at aiEDU. We talk about critical thinking and the human advantage. It's interesting to hear from someone in a leadership position at one of the big AI companies what is, I think for many, a counterintuitive take. A lot of folks are saying, okay, AI is going to be doing all of this stuff, and so we need to focus on the other things. And what Ben is actually saying is that you also need to understand all that stuff that AI is going to be helping you with. Because even if AI is really good at coding or physics or chemistry or accounting or whatever application you're using it for, companies are also going to want employees who themselves understand those domains. And it makes sense, right? If you don't understand how to write, how could you possibly evaluate an output from AI and determine whether it's good? And that kind of brings us to this idea that maybe education doesn't change as dramatically as people might think. What should change is, let's take computer science: you probably don't need to go two or three years until you create your first application. You still need to learn how to write and read code, but we can also fast-track some of the more interesting hands-on components.

SPEAKER_02:

I also think this goes back to the individual student and learner. How deep do they want to go?

SPEAKER_01:

This is actually a really big gap that we have in schools right now. With many of the schools we work with, if you were to ask them, okay, who's your career exploration teacher? Who's even responsible for just keeping tabs on how career pathways are changing? Most schools will say, we don't actually have anybody dedicated to that. And so I think anything that can help students ask themselves this question, and really dig into both "What do I want to be when I grow up?" and "What is the world gonna be when I grow up?", is valuable. If you're a parent and you're not sure that conversation is happening in schools, to me, this is precisely the kind of conversation that you can bring to your dinner table.

SPEAKER_02:

I think I know what we're talking about at the dinner table tonight. Thank you so much for listening. Join us again next week when we take off the kid gloves and answer parents' hard questions: What about cheating, critical thinking, misinformation? Tune in next week to find out if your pressing questions about AI get answered.

SPEAKER_01:

Find out where AI will take us and future generations next on Raising Kids in the Age of AI. Until then, don't forget to follow the podcast on Spotify, Apple Podcasts, YouTube, or wherever you listen so you don't miss an episode.

SPEAKER_02:

And we want to hear from you. Take a minute to leave us a rating and review on your podcast player of choice. Your feedback is important to us. Raising Kids in the Age of AI is a podcast from aiEDU Studios in collaboration with Google. It's produced by Kaleidoscope. For Kaleidoscope, the executive producers are Kate Osborne and Lizzie Jacobs. Our lead producer is Molly Sosha, with production assistance from Irene Bantiguay and additional production from Louisa Tucker. Our video editor is Ilya Magazanen, and our theme song and music were composed by Kyle Murdoch, who also mixed the episode for us. See you next time.