MEDIASCAPE: Insights From Digital Changemakers

Navigating the Between Times: Education in the Age of AI with Ben Tasker

Hosted by Joseph Itaya & Anika Jackson | Episode 73

"We're in the between times – not fully in the future, but no longer in the past." These words from Ben Tasker, Dean of AI at a leading university, capture the transformative moment we're experiencing as artificial intelligence reshapes education and work.

In this eye-opening conversation with host Anika, Ben shares his remarkable journey from healthcare administration to data science to educational leadership. Through compelling stories about using data to improve cancer screening in rural Maine and developing algorithms to boost student success, Ben illustrates how AI can solve human problems while emphasizing human connection.

The traditional educational model faces unprecedented disruption as information once disseminated in classrooms becomes instantly accessible through AI tools. This reality forces a fundamental shift from time-based degrees toward skills-based learning focused on what you can do rather than how long you spent studying. According to the World Economic Forum, one billion people will need to upskill or reskill by 2030, developing both technical AI capabilities and uniquely human qualities like empathy and leadership.

For institutions, this means creating flexible learning pathways through micro-credentials, badges, and shorter courses that stack into larger qualifications. For individuals, success requires developing a personal learning plan and building your brand. The good news? Getting started with AI doesn't require complex mathematics – simple experiences using free LLM tools can build familiarity and confidence.

Whether you're a student, educator, or professional navigating these "between times," this episode offers practical guidance for thriving in an AI-transformed world. Take your first step today by creating your learning plan – when written down, goals become 76% more likely to happen – and begin your AI journey with simple experiments that demonstrate the technology's potential.

This podcast is proudly sponsored by USC Annenberg’s Master of Science in Digital Media Management (MSDMM) program. An online master’s designed to prepare practitioners to understand the evolving media landscape, make data-driven and ethical decisions, and build a more equitable future by leading diverse teams with the technical, artistic, analytical, and production skills needed to create engaging content and technologies for the global marketplace. Learn more or apply today at https://dmm.usc.edu.

Speaker 1:

Welcome to Mediascape: Insights From Digital Changemakers, a speaker series and podcast brought to you by USC Annenberg's Digital Media Management Program. Join us as we unlock the secrets to success in an increasingly digital world.

Speaker 2:

I'm super thrilled to have this conversation because this is a topic I'm super passionate about. Ben Tasker, you are focused on AI-driven education and, as an educator, this is something I see a huge need for right now. We have professors who have students writing papers using chatbots, which I think is the worst use case for AI. We have teachers, such as myself, who go AI first, and I'm teaching tools and best applications to my students, and we have teachers who just don't care; they've been teaching for so long they don't want to learn new technology. So thank you, first and foremost, for being here to do a deep dive into this conversation.

Speaker 3:

Well, thanks for having me, Anika. Like you said, my name is Ben Tasker. I'm an AI expert. I've worn many hats: I've been an AI engineer, a data scientist, a team lead, and now I'm a dean of AI at a top university in the United States. I know many listeners might be feeling scared and uncertain, and I understand that. I consider this moment the between times. We're not fully into the future.

Speaker 2:

Exactly.

Speaker 3:

But we're not fully out of the past yet either, and that may make people feel anxious or uncertain, but it's also a great time to learn how to upskill and reskill and revolutionize and evolve with AI.

Speaker 2:

Yeah, well, let's talk about your trajectory going from being, you know, a data analyst and data scientist and moving into education. What sparked that interest for you?

Speaker 3:

So my upskilling and reskilling journey actually comes from a lack of data literacy. My undergraduate degree is in healthcare administration. I was really passionate about running a hospital. One of my first tasks as soon as I was employed in a hospital was to project patient volumes at doctor's offices. And maybe I was a little naive, I was younger at the time. I thought it'd be the latest, greatest technology, the data would be available, it would be a relatively non-complex project. And it was the total opposite. Some of the data was still written down by hand. I had to drive to the doctor's offices in my car to figure out patient volumes. Sometimes I had to meet with physicians to get accurate numbers. The data was in different types of Excel sheets or some other data file types, so nothing was consistent. And that's where I realized the real power of data and why I should focus on that. Data is the oil of any AI algorithm. Data science was around then, but not AI; people weren't talking about it as much as they are today.

During that same time period, data science programs and data analytics programs started to emerge. So I decided to get my master's in data science. I learned how to code. That helped me load the data more quickly. It made me understand the data more accurately. It also taught me the theory behind all these AI algorithms, all the math computations, and that's where I really saw how data modeling and all these algorithms that we talk about in math class are really important, because, yes, they seem abstract, but you can really have an impact on people. The physician volume prediction might seem like a small, low-bar project, but it's really impactful for humans. If they can't go to a doctor's office when they need to go, that could impact patient care. So you want to be as accurate as possible, or I want it to be as accurate as possible. And data science is a unique domain because it encompasses, I think, every other domain. So in this program I was learning management, communication, math, project management. All those aspects then plug into AI. So I graduated with my master's degree.

Speaker 3:

With that background, I went into academics. I had to create an algorithm to help predict students' success and persistence. Persistence is making sure students move from one course to the next. Success is when they actually complete the course at a certain level. Both are incredibly important. It was for a community college system, so these individuals may have been out of school for a long time. They may have been upskilling and reskilling themselves. Each program is a little bit different, but we took data features like distance from the community college they were enrolled at, the program they're in, the specific sections of courses, the time of day those courses were running, even down to the faculty number, to create this algorithm to predict success and persistence. If it was below a certain threshold, we deployed resources like tutoring to make sure that these individuals could cross the finish line. And in that case I really did see how AI and data science can also be a human interaction. I helped more than one person, but just helping one person complete their degree felt very powerful. During that time I was working with some faculty members, and those faculty members were starting a healthcare project. So I transitioned from academics into healthcare again, this time as a data scientist. In that role, again, it became a public health campaign, so very human-centered.
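To make the threshold idea concrete, here is a minimal sketch of the kind of persistence model Ben describes: a classifier trained on enrollment features, with each student scored and anyone below a cutoff flagged for tutoring outreach. The feature names, toy data, logistic regression, and the 0.6 cutoff are assumptions for illustration, not the actual system he built.

```python
# Minimal sketch of a persistence model like the one described above.
# Feature names, the model choice, and the 0.6 threshold are hypothetical.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

# Historical enrollment records: 1 = persisted to the next course, 0 = did not.
students = pd.DataFrame({
    "distance_miles": [3.2, 41.0, 12.5, 55.8],
    "program":        ["nursing", "it", "business", "nursing"],
    "section":        ["A", "B", "A", "C"],
    "time_of_day":    ["evening", "morning", "evening", "morning"],
    "faculty_id":     ["F101", "F214", "F101", "F330"],
    "persisted":      [1, 0, 1, 0],
})

features = ["distance_miles", "program", "section", "time_of_day", "faculty_id"]
model = Pipeline([
    ("encode", ColumnTransformer(
        [("cat", OneHotEncoder(handle_unknown="ignore"),
          ["program", "section", "time_of_day", "faculty_id"])],
        remainder="passthrough")),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(students[features], students["persisted"])

# Score current students and flag anyone below the cutoff for outreach.
current = students[features]          # stand-in for this term's roster
probs = model.predict_proba(current)[:, 1]
needs_support = probs < 0.6           # hypothetical cutoff for a tutoring referral
print(list(zip(probs.round(2), needs_support)))
```

In practice the flagged list would feed an advising or tutoring workflow rather than a print statement; the point is simply that a probability plus a threshold turns a model into a human intervention.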

Speaker 3:

But we had to figure out which cancer and which types of patients to focus on for the highest impact. So pre-screening, post-screening: are there abnormal levels of cancer in a specific population? To do this I had to use a data science technique and data visualization, so I took the patients that were above a certain threshold and I plotted them onto a map. When you look at the map (this was in Maine), a lot more of the patients that were coming in for cancer screenings, and ultimately with a cancer diagnosis, were coming from rural areas. This is not the data science part, this was just intuition, but looking at the map, what's in rural Maine? Well water. So if you don't clean it regularly, if you don't get your pumps flushed regularly, if you don't get the water tested, you could end up with stomach cancer. And so we saw that. And then we looked at the stomach cancer rates and they were 1.5 standard deviations above the rates for the nation and for other parts of the same state. What that means is that it was significantly above where it should have been.
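For readers who want to see the comparison Ben mentions expressed in code, here is a minimal sketch of flagging regions whose rates sit 1.5 or more standard deviations above a national baseline. The region names, rates, and baseline numbers are invented placeholders; only the method mirrors what he describes.

```python
# Minimal sketch of the "1.5 standard deviations above the nation" check.
# All rates and region names below are hypothetical placeholders.

# Stomach cancer rates per 100,000 by region.
regional_rates = {
    "Coastal County": 6.1,
    "Urban County": 6.8,
    "Rural County A": 11.9,
    "Rural County B": 12.4,
}

national_mean = 7.0  # hypothetical national rate per 100,000
national_sd = 3.1    # hypothetical spread across comparable regions

for region, rate in regional_rates.items():
    z = (rate - national_mean) / national_sd
    if z >= 1.5:
        print(f"{region}: {rate}/100k (z = {z:.2f}) -> prioritize screening outreach")
```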

Speaker 3:

So for the public health campaign, we had to go out to the community and get these people in for screening. We had to go to churches, get ministers involved, go to schools, get principals involved, hand out all these brochures. As a data scientist, yes, we talk about communication and the importance of that, but when you're actually doing it, it becomes super, super human. There's a lot of connection involved in that. Just because you tell someone they need to come in for pre-screening or they need to get their well cleaned, you might think that that would get people in there, but that's not the case. I had to build relationships, and that relationship-building skill is what then brought me back into academics.

Speaker 3:

I was brought into an online university. I was asked to help create their data science program, and then, in 2022, data science evolved rapidly into AI. It's potato, potahto; to me they're very similar. But with AI, I saw the impact that it will have on education. Typically, education is a laggard, so when some innovation happens, usually we'll teach about it later, once we've made sure we can work with the technology.

Speaker 3:

But with these LLMs, all the information that we would disseminate to the students is now at your fingertips. You can interact with these LLMs and they can teach you things if you're moderately skilled at using an LLM, so what's the value of school at that point? So I made a playbook, I brought it to the president, the president disseminated it to the C-suite, and we started building our AI strategy and plan. Over the next couple of years I then transitioned into my dean role, and that's kind of how I got to the spot I landed in now. Not all of it was with intent; I didn't plan out to be a dean of AI. It was more that I saw it and understood it from studying it and enabling it, and then I said, well, if LLMs can produce everything that's in a course in 15 minutes, how are we going to develop our courses?

Speaker 2:

Yeah, no, that makes so much sense, and I love that you shared your full trajectory. I've worked on campaigns for public health as well, just getting kids back in for routine assessments and vaccinations. We didn't use a lot of data, except for the focus groups we did with different constituents and the main focus areas, but it was very much centered around what's the information we need to provide and who are the stakeholders in the community, as you mentioned. So we had, whether it was tribal leaders, whether it was, you know, pastors, ministers, educators, the same kind of concept. And then we built out a public health campaign that was primarily social media, you know, PSAs over radio, things like that. We didn't have that other component, and this is also before people really understood. I mean, artificial intelligence and machine learning have been around for a very long time, but this was before we started using these specific names for it and changing the vernacular and then having it become something that was more mainstream. So, to your point, it does take a long time to get things to shift.

Speaker 2:

I teach in programs that were created before Gen AI, and now, you know, even though some of the material for the courses is older, I have to figure out how I'm going to teach my students new approaches and new ways, because this is the information they really need to be able to be successful in the outside world. So I went back and I'm finishing up my MBA with a specialty in AI/ML right now, so that I could do a deeper dive and really learn how to apply it. And that's why I have such an interest now in AI-first pedagogy, because I'm seeing it as a student and as a creator of curriculum for high school students and at the collegiate and graduate level. So I think what you're doing is so vitally important, thinking about how we help the next generations, and actually even adult learners, upskill so that they have jobs in the future, because we know that there are a lot of jobs that are task-oriented that will be replaced.

Speaker 3:

Absolutely, and replacement is a vague word. I think jobs are going to shift. AI touches all jobs. Even if you don't think it touches your job, it will eventually. It might take years, but it will be there, I promise you.

Speaker 3:

But 1 billion people, according to the World Economic Forum, need to upskill and reskill by 2030. So what is upskilling and reskilling? Upskilling is, if you're a data scientist, maybe you learn deeper strategies or you learn about algorithms that you might not know. If you're in marketing, maybe you learn more about AI. Reskilling is, if I'm a data scientist, maybe I become a clinician or something completely different from the field I'm in, learning those new skills to maintain market relevancy. But I want to express that AI doesn't need to be complex to learn. It can be very fun. Just by developing a learning plan, which is short objectives that can be long-term or short-term, creating an account on any free LLM website, learning how to do basic prompting, maybe making a dinner plan, maybe making a fitness routine, making a simple image. It doesn't have to be complex. Individuals can get into AI anytime. You eventually may need the deep theory, but to get into it you don't need all the math and theory behind it.

Speaker 2:

Yeah, and what's interesting is I'm finding even at the graduate level, there are a lot of students who've been told not to use AI as undergrads.

Speaker 2:

There are students who have been in the workforce for, you know, many, many years, and also they have been trained in certain methodologies for digital media management, and so now they're learning how to integrate more tools into their toolbox and create better, easier workflows. So it's all levels of students, and a lot of people still think of AI as the scary, you know, ephemeral thing that's just out there and that's going to be like Terminator. Instead of thinking that, we just have to take baby steps, learn one thing at a time, and by doing that you can get comfortable, you can understand the ethics, the biases, all of that part of this tool as well, so that you're going in with open eyes into the AI world and figuring out how to make sure that you are using your ethics, that you're considering what kind of data you're getting. Is it the correct data? And these are things that I think of when I'm thinking about the AI journey, just making sure people are coming in, learning the basics, but then also looking at it through that lens of ethics.

Speaker 3:

Yeah, responsible AI and AI ethics are increasingly important. I think as more individuals go into AI and use AI, the ethical risk also increases with that usage. They go hand in hand. Unfortunately, I think responsible AI and AI ethics are laggards compared to the LLMs that were released. So the LLMs were released, impact happens, and more impacts could happen. I consider them uh-oh or what-do-we-do moments. We're still in the between times, so we haven't seen a major catastrophe yet, but we're probably pretty close to that. Once one of those aspects happens, then I think we'll see a huge focus on AI ethics and responsibility, maybe even a dial-back on the adoption and revolution rate. But unfortunately, I think it's kind of like playing with fire.

Speaker 3:

You have to get to that point and realize the harm before implementing frameworks and solutions to help mitigate that risk and bias.

Speaker 2:

Yeah, so how are you bringing AI into the classroom as the dean of AI? What does that entail? What does that look like?

Speaker 3:

I consider myself an individual that's helping individuals navigate the between times. So how are we navigating this ambiguous time to make it more known? Really, that comes down to three different domains and this concept called skills-based learning. So I'll talk about the three domains and then I'll go into skills-based learning. The three domains, and this is the World Economic Forum's classification, so folks listening might call it something different: first, there's trainers.

Speaker 3:

The trainers are the individuals programming the AI: AI engineers, prompt engineers, data scientists, the more technical AI roles. Then we have explainers, individuals that understand the theory behind these tools and know how to use the tools. They teach organizations how to use the tools, so they can be AI coaches or strategy individuals. They could even be AI enablement folks. There's a bunch of different job descriptions and job titles for those individuals, but they're becoming more and more popular. And then there's sustainers, individuals that just use the tools, like content creators, marketers, analysts, project managers, any other job, and they use the tools. They're not as experienced in the training, but they keep consuming and sustaining the technology. So the trainers and the explainers have to keep current and make sure that, to your original point, the ethics and responsible AI behind the tools are being maintained, that they're using AI the way it's supposed to be used, following a framework, and that the outputs make sense. I attach this all to what I consider skills. A skill is something that is attainable, that you're good at. So, Anika, you're good at doing podcasts. That's a skill. It's communication; that's the root skill.

Speaker 3:

Other skills the World Economic Forum describes are AI skills and human skills. AI skills are those technical, traditional hard skills: systems thinking, coding, analytical thinking, prompt engineering. What's interesting about why they call them AI skills is because AI is going to get really good at these things. But it's going to get really good at those skills, which could cause a softening in the market, which leads to human skills. Human skills are uniquely human. That's interesting to me. They could have just called them soft skills, but human skills are what humans are really good at: empathy, communication, leadership, management. AI is going to get good at those things, but probably not as good as a human. So it's important to focus on both with your learning plan. So I'm making sure that our training programs are skills-based, so they're teaching individuals relevant skills. Communication, for example: the medium might change, maybe we're interacting with a chatbot in the future, so you have to teach the communication and the prompt engineering. But we're not calling that something different; we're not calling communication prompt engineering.

Speaker 3:

It's distinctly its own currency, so to speak. So you can collect the skills and the currency. They're like coins. You can fill up a piggy bank, and when you have to go and break the piggy bank, you have that collection of skills, that money, to be able to spend in this AI economy.

Speaker 3:

Trickling this all back to education: in a traditional sense, education is based on time. So in the United States, it takes four years to get a bachelor's degree. Typically, on average, it takes five to six years to get a master's degree, and it takes even longer than that to get a doctorate. All of those are great concepts, but with AI, the time that a skill holds its value diminishes. Typically, when you get a degree, it's relevant for around five years on average.

Speaker 3:

With AI, since it's at your fingertips, this is just my opinion, but skills are probably going to be relevant for two years.

Speaker 3:

So you have to do things more quickly and stay more relevant.

Speaker 3:

You have to explore new areas that you might not have explored before, just to keep up with economic changes.

Speaker 3:

That means colleges and universities are going to have to change with that. Yes, we'll still have degrees. Yes, we'll still have programs. Yes, those things are still very important. But with shorter-form learning, micro-credentials, badges, six-week courses, two-week courses, live on-demand training, you can acquire these skills, practice them, and then adapt them to your job and quickly not only upskill and reskill but move up in your company, which increases your value. It shows the impact of not only responsible AI and generative AI, whatever the AI is, but also shows the importance of the other skills of communication, leadership, and systems thinking. If you can create that return on investment, organizations are going to not only invest more into individuals, but I think they're going to value these shorter-form learning aspects. And then those shorter-form learning aspects and credentials can stack into a program and a degree. It's a little bit of an ecosystem change, but fundamentally it's no longer based on time. It's based on what you can do and how you can showcase that you can do those things.

Speaker 2:

Yeah, that's such a good point, because one of the programs I teach in is digital media management, and for that program we have 12 courses that are outlined, and you can do a one-year, really fast-track program or a two-year program.

Speaker 2:

I get my students for 90 minutes once a week for about eight weeks, so they're very shortened programs, but in that time I can teach them. You know, we have the concepts of whatever the topic is, but that class time is when I can bring in other applications and bring in AI tools. So it is fast-track, but there's not enough time to fully embrace everything. That's why I always recommend: here's a great training or here's a webinar about this topic so you can go more in depth, or if you're interested in learning about this, here's where to go, so that they can get that. It's not necessarily part of the degree that they're going to get from USC, but it is still important to their learning to get that extra knowledge. And, as you said, certifications, right, short-form programming, skill stacking, things that, with the foundation of all of the different digital media areas and aspects and knowing the foundational information, they can then use to really level up.

Speaker 3:

Absolutely, and the leveling up is important. But it's also that continuous learning and it's a mindset change. So continuous learning doesn't have to be completely academic. You might not even have to enroll into a university or a college. You could go on your own and try to acquire some of these skills.

Speaker 3:

So what might that look like in your own organization? Can you sign up for an additional project? If I'm a data scientist, can I take on some project management tasks? Gain organization skills, gain people-management skills, communication; those are relevant and important skills that could also position you for more of a leadership role. So it's kind of thinking outside of the box a little bit, and linking this all back to a learning plan, so that you plan the work and then you work the plan. Even writing it down might sound silly to some folks listening, but when you write something down, it's 76% more likely to occur. So write down where you want to go and the steps it's going to take for you to get there. Spend some time on it.

Speaker 2:

Yeah, yeah. So, being dean of AI, but also talking about the future of education and not necessarily being in the classroom traditionally, what do you think your role is going to lead into, and have you seen any shifts yet? I mean, I know we're in that in-between phase; we're at the early stages still. Some people think they're way behind. They really aren't necessarily. So what do you see as the next step, the next vision of education, and will your role still be relevant?

Speaker 3:

So education has been time-based. I talked a lot about that today, so I'm not going to repeat all that, but I think that probably will change. It will still require a certain amount of credits and you'll still need to put time in, but if this more adaptable ecosystem approach occurs, it won't be four years in a seat. You'll be out in the ecosystem, so to speak, learning, and you can transfer this currency back and forth.

Speaker 3:

We're going to bring in the workforce. The workforce is usually adjacent to education, but I think those two worlds are now merging. More and more employees need to be upskilled and reskilled. These individuals also need to understand not only the impacts of AI but how to use AI for their roles. I think academics is good at teaching some concepts of that, but a traditional university or college can't teach you AI for every single role. It's just not scalable, and it's confusing as an offering. So with those two worlds merging together, you get that workforce aspect of it. Maybe there are mentors; maybe some of the courses are offered through an HR training program. Once you complete those courses, then you learn more theory back at the university. It becomes more of a flipped classroom scenario, so to speak. Then a traditional student, if they haven't had a job before and they're going to college for the first time, those students can be matched with individuals in the workforce. So they get a mentor, they become a mentee, you're more likely to get a job, and it's probably more fun to learn because you're learning from an expert, right? You're learning from a professor and you're learning from an expert in industry, probably two completely different mindsets, but now the content's relevant, it's more current, it's more fun. I think learning can be fun.

Speaker 3:

But then what happens with AI? How does that work with all of this? I think that's one of the main questions. With AI, we need to really understand that the world's still changing and that there needs to be some flexibility. Linking this back to everything I just said, I think the system needs to become more flexible, but new roles are going to exist, and we don't know every single role yet. So I don't think we need to focus on roles. We need to focus on skills.

Speaker 3:

Skills is a little bit different of a concept from the way a university creates programs now, so universities can start tagging their courses to skills. They can figure out what skills taxonomy they want to use (the World Economic Forum's, for example, but there are many) and which ones work for your university. How can you start changing the language so that, when you interact with your partners and the workforce, you're making sure that you're teaching the relevant skills? And then how can you create that ecosystem of a back-and-forth environment? I honestly think that with AI, education is going to be more accessible. It will ultimately lower costs if more people are going to universities, right? Quality, cost, access. So access increases, quality will increase, volume should increase.

Speaker 2:

Yeah, fantastic. I love that vision of the world where things are more accessible to people and that you're not just hearing the theoretical perspective but you're also getting practical application and inroads, because sometimes students can graduate from any university and they think the job's going to be waiting for them, but it might not be if they don't know how to apply those skills and interview appropriately based on those skills.

Speaker 3:

Yeah, absolutely. So, degrees: I know people upskill on the side now, but traditionally a degree is one and done. You go and get your bachelor's, then maybe you get your master's, and maybe you get your doctorate, or some combination of those, hoping to get a job. With the skills-based economy, in an AI economy, we're shifting our mindsets to curiosity and learning agility. We're always trying new things. Learning doesn't always have to be in the workforce or in an institution. For example, I'm doing more podcasts to learn more about the communication skill. It's something I wanted to do, I'm interested in doing it, and there's no credit for that. I'm just experimenting with it, right?

Speaker 2:

Well, you're doing a great job. So how has being dean of AI and putting so much into AI first pedagogy for your university helped shape and transform the university? Are you seeing more students who are interested, engaged, more press?

Speaker 3:

So, well, first, you asked the question: do you think a dean of AI is market-relevant, and do you think that that job is going to continue to grow? I think that it might not be called a dean of AI. It could be called a dean of AI enablement or a director of AI enablement, but I think that you're going to see it in the workforce and I think you're going to see it in universities. All the worlds are merging. It's extremely market-relevant. I don't think AI is going away.

Speaker 3:

Linking this back to what you just asked about which skills and how universities are transforming to this: I think some institutions were in a lot of trouble before AI. COVID, for example, I think, changed a lot of the playing field. A lot of institutions went online. At that time, AI was kind of released, we were kind of getting out of COVID but still in it, and that kind of turned up the heat. You're going to need revenue to successfully implement AI, and a lot of colleges don't have access to that, so you're going to see mergers. I think there are probably going to be fewer colleges. That doesn't necessarily mean it's a bad thing, but it could also change the dynamics. It can make things less expensive or more expensive; I think less expensive. But with this focus on AI and AI enablement, it's just going to change the nature of the game. So you need to be thinking about your AI strategy and your AI implementation plan. SEO is still a thing, but we're moving away from that.

Speaker 3:

Search engine optimization: an LLM is a recommendation engine. So does your university or college show up inside ChatGPT, Gemini, Claude? Is it accurate? Does it list the proper sequence of courses? You have to redesign your websites to make them accurate. You have to have good search engine optimization to feed these LLMs. Are people thinking about that? I know I'm thinking about that, but I'm a sample of one. Some universities are ignoring it completely, some are playing in the middle, and some are being more innovative. I think the middle and the more innovative are going to get closer together, and then there are going to be the colleges and universities that work against it, and I guess we'll see how that plays out.
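One concrete way a university can feed accurate course information to search engines and LLM crawlers is structured data on each course page. Below is a minimal sketch using the public schema.org Course vocabulary, generated from Python as JSON-LD; the course code, title, and institution are placeholders, and this is one illustrative approach rather than anything the speakers prescribe.

```python
# Minimal sketch: machine-readable course metadata for a course page, using
# the schema.org "Course" vocabulary. All details below are placeholders.
import json

course = {
    "@context": "https://schema.org",
    "@type": "Course",
    "courseCode": "AI-101",                      # placeholder course number
    "name": "Introduction to Applied AI",        # placeholder title
    "description": "An eight-week survey of practical AI tools and responsible use.",
    "provider": {
        "@type": "CollegeOrUniversity",
        "name": "Example University",            # placeholder institution
        "url": "https://www.example.edu",
    },
}

# Embed the output in a <script type="application/ld+json"> tag on the course page
# so crawlers pick up the same description the catalog shows to humans.
print(json.dumps(course, indent=2))
```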

Speaker 2:

It's so interesting that you say this, because I teach this to my students in terms of personal brand. The whole thing is, right, Google, not as much now, but Google and your LLMs are the basis for how people understand us. Now, as more and more people use LLMs for search, we need to make sure that, as individuals, we have our own website, which you do have, that kind of is the hub for all of the information we want people to know, so that the LLMs aren't just out there scraping the internet guessing at who we are, or taking information about somebody with the same name who might be doing something very different, or taking really old information that isn't relevant to who we are now. Applying that to the university level, and to how businesses in general need to be doing the same thing, I would say that's one of the biggest changes that's needed right now.

Speaker 3:

Exactly, and it goes all the way down to the course level. So if someone Googles your university, the course, and the course number, does it give an accurate description of that course and accurate course content? Because it will pull up everything that it finds. People may be more or less interested in, you know, the course, the program, the program links to enrollment. They might go elsewhere.

Speaker 2:

Yeah. I mean, we've talked about a lot, and I think we have a clear vision for where you see the world going; I agree with you on all of it. What is one piece of advice that you would leave for one of our students or one of our faculty, or even some of the directors who listen to the program, for them to continue this change in mindset and this move towards being really AI-first and making sure we're implementing everything really effectively for our students?

Speaker 3:

I would focus on a learning plan. Organizations can focus on a learning plan. Individuals can focus on a learning plan. A professor can have their class focus on a learning plan for what they want to learn in that class. So you could do a classroom scenario.

Speaker 3:

But my point is, by thinking and getting those thoughts out there, if it's at the organization level, then employees know what to attach to and what exactly they need to learn. And then at the individual level, yes, I need to attach to these things, but you can take any type of course, or do I want a mentee, or do I want to pick up a project within the organization? You can solidify it more. So really start investing in your personal brand and your learning plan. Those are the two biggest takeaways. Because your personal brand, even though you don't think it links to a learning plan, it does, because if you're trying to go get a mentee, or you're trying to upskill, reskill and then switch into a new organization, you're going to need a portfolio to fall back on. All these things interconnect. So really focus on those aspects.

Speaker 3:

I know skills and upskilling and reskilling might be scary. To some folks it's probably just as scary as AI. But it's nothing that's unattainable, and you can start small. Log into an LLM and have it make a three-day vacation plan or getaway for you. See if it's accurate. It's summer; I mean, people are having fun right now. If you paddle, have it plan a route along a river or a lake. It's completely plausible. And if it's something you do all the time, you can understand the trade-offs and benefits. And then, right, you check off one aspect of your learning.

Speaker 2:

Yeah, fantastic.

Speaker 2:

And, ben, thank you so much for coming on the show and bringing your perspective Again.

Speaker 2:

Again, we're so aligned, and this is something I've really been thinking about: how to create new plans for student learning and also for teaching teachers how to teach effectively at every level, whether it's upskilling the adult workforce, the university level, or even K-12 education. So I think we need to have more conversations like this out in the world, and thank you for trusting me with this conversation. And, of course, BenTaskerAI.com is the website we'll lead everybody to. I appreciate your time, and I want to thank everybody in the audience as well for listening to this episode. Start thinking about the ways that you can incorporate more AI strategies and tools into your lives, even if it's learning prompt engineering, planning vacations, meal planning, or looking at universities for your child, you know, and figuring out, based on their criteria, what the good fits are, which is something I've been doing as well for mine. So, with that, thank you for joining us today, to the audience, and thank you, Ben, as well.

Speaker 3:

Thank you so much for having me. This was a lot of fun. I am very passionate about this and, once again, I hope everybody starts their AI learning quickly.

Speaker 1:

To learn more about the Master of Science in Digital Media Management program, visit us on the web at dmm.usc.edu.
