What's Up with Tech?

Universities Must Evolve: Preparing Students for an AI-Driven Workforce

Evan Kirstel

Interested in being a guest? Email us at admin@evankirstel.com

What happens when artificial intelligence meets higher education? The answer might surprise you. It's not just about new technology—it's about a complete cultural transformation.

Professor Angela Virtu from American University's School of Business reveals how their Institute for Applied Artificial Intelligence is reimagining business education from the ground up. Their revolutionary approach? Infusing AI into every single course while simultaneously strengthening the human skills that machines struggle to replicate. "Knowledge has become a commodity," she explains. "The differentiator now is critical thinking and communication."

The university's strategy goes beyond simply teaching students to use AI tools. They've created a community-driven learning environment where faculty receive extensive training first—bringing in industry experts to show professors exactly how AI is transforming their respective fields. This "train the trainer" approach has been crucial for faculty buy-in, especially when redesigning courses they may have taught for decades.

As entry-level jobs disappear and middle management structures face potential collapse, American University is preparing students for a radically different professional landscape. Their focus remains on developing capabilities that AI currently struggles with: handling ambiguity, navigating chaos, and facilitating cross-functional collaboration. While many universities still debate whether AI use constitutes cheating, American University has moved forward with a partnership with Perplexity that gives every business student access to enterprise-level AI tools.

Want to understand how education is evolving in the age of artificial intelligence? This conversation provides a fascinating glimpse into the future of learning—where the most valuable skills might not be what you expect.

Support the show

More at https://linktr.ee/EvanKirstel

Speaker 1:

Hey everyone. Fascinating chat today as we dive into the world of AI and education. Angela, how are you?

Speaker 2:

I'm doing well. Thanks for having me today, Evan.

Speaker 1:

Thanks for being here. Really excited for this chat around AI, in particular your role at the Institute for Applied AI. We're going to learn all about what that is, what the Institute does, and why it was created. First, maybe introduce yourself, your role as a professor at American University, and a little bit about your journey.

Speaker 2:

Great. So again, my name is Angela Virtu. I'm a professor in our business school at American University, where I teach mainly our machine learning, business analytics, and artificial intelligence courses. Before I joined academia full-time, I worked at a whole bunch of startups in the DC metro area, where I owned and operationalized all of our machine learning and artificial intelligence products and tools.

Speaker 1:

Fantastic. And what is the Institute for Applied AI? Sounds like a fascinating entity.

Speaker 2:

So we just launched the Institute for Applied Artificial Intelligence this past spring at the university, and we have three core pillars. The first is education: incorporating and continuing to infuse AI throughout the entire business school curriculum through learning outcomes, and really making sure all of our students get that AI infusion and are prepared for their future careers. The second pillar of the institute is all about AI research, and the third pillar is all of our business and policy community engagement pieces. So we're starting to branch out into the greater DC community and even expand AI exposure on campus. Kogod was one of the first schools to make this AI push, and now we're going to all the other colleges on our campus to make sure that all of our students, regardless of whether they're a business major, have that AI touch.

Speaker 1:

Brilliant. So you've worked in both education and industry. How has your industry experience shaped the way you're teaching AI today?

Speaker 2:

So I think there are two key components that have really been helpful in shaping our AI approach. The first is that industry moves at such a rapid pace, especially when it comes to technology, so we've been able to lean into that agile framework and accept, hey, AI is brand new, uncharted territory, especially when we apply it to the education space, but let's roll with it, let's move fast and break things, as Mark Zuckerberg always says. So we've really been able to approach it in that manner. And the second thing that's been super helpful is that because I come from the business world, I know what artificial intelligence is, I've built these tools, I know how they get adopted. So we've been able to reach students regardless of whether they're going into marketing or finance or accounting or whatever their individual discipline is, and make sure they're AI literate and fluent.

Speaker 1:

Brilliant. One thing you've said is that AI is more about culture than technology, which is interesting, since I spend most of my time talking about tech on this show. So what do you mean by culture? Why is it so fundamental?

Speaker 2:

Yeah.

Speaker 2:

So I think the interesting thing about artificial intelligence, and more specifically generative artificial intelligence, is that the way we interact with it is very unintuitive.

Speaker 2:

Right, the way you actually want to interact with these generative AI technologies is to treat them more like a human, right?

Speaker 2:

But when you pull it up on your screen and you just have a little chat box, it feels weird to speak to it like a human. There's something in our brain that can't quite process that this should be talked to like an intern or a peer or a colleague, instead of thinking we need to code or find the exact right word to get it to do this thing for us. And with that kind of design thinking comes a much larger cultural shift that has to happen in businesses and organizations to adopt it, right? It's not like switching from Tableau to Power BI, where you can just plug the thing in, everyone knows what to do, and it's just a few different buttons. It's really rewiring our brains and adopting the mindset that things that used to be impossible are actually possible now. So that's why I frame it as a culture shift.

Speaker 1:

Definitely would agree with you there. Let's talk about how schools are teaching AI firsthand. You can tell us, of course, but what can we learn from teaching AI in academia versus training teams in the enterprise, training people in business, et cetera?

Speaker 2:

So what we're doing at the Kogod School of Business is infusing AI into every single class, every single course. That way every single student, regardless of their major or minor or application, is going to get these core AI literacy skills. And the interesting thing is that we're coupling those more technical AI skills with very human communication skills, so all of our students are going to be able to talk to each other, collaborate, communicate, and do those human things that are going to be much harder to replicate in a technological sense. I think the one thing that's been super successful with our AI integration is that we've been very community driven, and that speaks back to that culture piece a little bit, because learning is a very social activity. You can't just tell people to go do this thing on their own; it just doesn't work. You need to talk, interact with each other, and learn from each other about what's going on, and I think that's the one thing businesses can maybe take from our academic setting.

Speaker 2:

One thing that's been super helpful in our approach within our community is that we actually ran a train-the-trainer experience. About a year and a half ago, before we said, hey, let's put this all into our curriculum, we understood that if our professors, the individuals who are actually teaching our students, don't know what AI is, how to use it, and how to use it responsibly, it would be really hard to expect our students to learn that skill set. So a year and a half ago we brought in a whole bunch of alumni and people from industry. For example, we brought in someone from the banking and financial industry to showcase and demo exactly how they're using artificial intelligence in their field.

Speaker 2:

And our finance faculty, their jaws were on the floor; they were floored by what AI could actually do from a finance perspective. We did the same thing with marketing, with accounting, with all of our individual programs. We brought in people from industry to basically say, hey, this is how this is completely changing our world. What that was really able to do is get buy-in from the professors: hey, I know I might have been teaching this for 20, 30, 40 years, but companies are changing today. Having that buy-in, having that community, has been super helpful in figuring out what to do with all of this.

Speaker 1:

Fantastic. We're all a bit overwhelmed by the need to upskill and learn new AI skills, especially younger professionals. What do you recommend as the top AI skills younger or older folks should be focused on right now?

Speaker 2:

Yeah, so I think the interesting thing with this generative AI push is that knowledge has become a commodity, right? It's completely at your fingertips. There's no longer this hard barrier to getting knowledge; it's just there, in vast quantities. So there are two skills I'd focus on if I were either starting school right now or trying to upskill. The first is communication, because, as I talked about before, the more naturally you can speak to these generative AI systems and the clearer your instructions are, the better your results will be. So having really good communication skills, not just with each other but with the AI systems themselves, is a big piece. The second is curiosity: can you ask really good questions, and do you have an insatiable thirst to get to the final answer? All of that combined really creates your critical thinking. If you can sift through what's right and what's wrong, and go from asking whether something is possible to building the correct thing, that will be the differentiator.

Speaker 1:

That's fantastic. And what's a big myth or two about AI that you wish more people understood, from reading the press or the headlines we're all seeing today?

Speaker 2:

So the first one, and I think this comes more from my teacher hat than my business hat, is that AI can be wrong, and that's okay. I have a lot of students in my very first class, where I teach freshmen, who say, oh, I got this from AI, it must be right, and they don't even think about it. So I think the first thing is just to test the boundaries. Pick one area where you're an expert, maybe it's ballet, maybe it's football, maybe it's golf, I don't know, and have a small conversation with the AI where you start testing the boundaries and fact-checking. Once you understand it's not always right, come up with your own system: does this always have to be right? What's the risk if it's not? And if it is in fact incorrect, how would I know?

Speaker 2:

So that goes back to that critical thinking piece. The second thing, which is a little bit of a myth or maybe a hope for me, is that I really want us to get to a new user experience with AI. I don't think the chat functionality, whether typing or voice, is the best way to do it, and I think we're going to start seeing a new interface for how we interact with computers and technology soon. I don't know what that will look like, but that's what I'm most excited and maybe aspirational about.

Speaker 1:

Oh, well said. So, as you know, there's growing concern about AI replacing white-collar jobs in particular, especially the roles that business students are training and studying for, and lots of entry-level jobs. How do you prepare students to compete with AI, or to work beside it, or both?

Speaker 2:

So I think right now what we're seeing is the immediate collapse of entry-level jobs.

Speaker 2:

Right, there's been lots of conversation and lots of numbers supporting that narrative. I think the next stage is a collapse of the entire middle management structure, because AI can do that work. So we're going to have lots of large organizations that shrink and become really flat, and at the same time the jump from a higher education institution into your first job is going to widen significantly. Look at a firm like KPMG, where first-year analysts are now doing the work that second- and third-year analysts typically do. So as we think about preparation and the future of work, it really comes down to: can you handle the messiness and the chaos that these AI systems aren't great at just yet, and can you do that cross-functional collaboration? I think those are the two areas where, if you succeed, it will be harder to argue why you shouldn't have that job.

Speaker 1:

Yeah, it's going to be an interesting one to watch. You're clearly moving at a fast pace at American University, but do you think universities are at risk of falling behind in adopting and changing with these tools? What's your state of the union, as it were, for higher education?

Speaker 2:

So it's still very much a mixed bag, I would say. Every single day I still have conversations about why a college student using AI isn't necessarily cheating. And when you have reports coming out of MIT, where they did a study that a lot of people summarize as saying, oh, using AI is horrible for you and it makes you not think, things of that nature make that conversation a little harder. That said, I see lots of promise. Obviously I'm going to plug American University and what we're doing at Kogod, really infusing and embracing AI and making all of our students AI literate, but there's also really good work going on at places like Babson College doing the exact same thing.

Speaker 2:

So there are pockets of hope, I would say, where we're really on board and we understand AI is not something we can hide from or just shove under the rug and forget. But at the same time, there are lots of institutions where the conversation is still we're banning AI, we can't use it, it's horrible, and a lot of that stems from the fact that it's a lot of work. We have to redesign the way we think about teaching, redesign our pedagogy, rethink our assignments and the way we deliver curriculum. The student experience and the expectations of how to work through the learning process are changing, and there's no clear, definitive picture of exactly how it's supposed to look. So it's challenging.

Speaker 1:

For sure, and I guess everyone's questioning everything, including their majors. There's some evidence that computer science majors are having a tough time with all of the new AI coding tools on the market; again, those entry-level roles are being hit. Are you seeing that in terms of choices around majors and areas of study, based on what might or might not happen with AI?

Speaker 2:

I'm in the business school, so I can't speak to our computer science numbers or enrollment too much; I don't know what the full metrics look like on our campus for that.

Speaker 1:

Yeah, well, that makes sense. What about business and finance, where increasingly learning Excel and the old-school tools won't be enough to compete? How will data science and the suite of tools you use in the future look and feel different from the tools we've all learned, even in business and finance, for managing the business, P&L, forecasting, et cetera?

Speaker 2:

Yeah, we just announced a partnership with Perplexity, and for the finance space in particular they have a really impressive financial analysis tool that they've launched, so we're starting to get into that. We also have an FSIT lab where our students have access to Bloomberg terminals. So I would say the tooling has always been one piece of the educational puzzle, right?

Speaker 2:

Because as you start to prepare students for industry, obviously it's not just what you know, it's the tools and how you actually get the work done. So we're always revisiting that next three-to-five-year horizon of what the tools should be and what our students need in terms of core competencies. We have really good partnerships with industry, and we have panels that give us feedback: hey, we're all using this thing now, or this is the new tool or the new skill set our students are going to be expected to know. But again, I just want to underscore that our approach to AI has always been about the core fundamental skills, so that regardless of what tool we're using, students can navigate their way through the functionality or the tooling.

Speaker 1:

Well said. And of course, there are a lot of AI ethics landmines out there to navigate. How do you think about teaching students to build and use AI responsibly?

Speaker 2:

So that runs throughout the entire curriculum; we really want them to be responsible stewards. I think that comes from an understanding of how AI actually works and what its limitations are, including some of the biases in the underlying training materials. We have an entire business ethics class with a lot of AI incorporated into it as well. So our students are having conversations about examples of good AI, bad AI, and everything in between. We also have a really strong business and entertainment program that has been talking a lot about the intersection of AI and entertainment.

Speaker 2:

So the whole writers' strike, a year ago now, my calendar is a little questionable, but I think a year ago, a lot of that had to do with AI. Is it okay for us to clone actors' voices? And what about voice actors, if you can just use something like ElevenLabs to do all the vocal work for a video game or a movie or something along those lines? So we have lots of room for those conversations.

Speaker 1:

Fantastic. So what are you looking forward to in the new school year coming up? What's on your radar? What are you excited about?

Speaker 2:

We have a lot going on; we've been working really hard this summer. The first really big thing is that we're introducing AI learning outcomes for all of our individual classes and programs. We've taken a really hard look at how we've traditionally approached learning outcomes, what core skill sets have been needed, and how we would measure that skill set and its progression, and now we're going through with a fine-tooth comb: what AI skills do we need, and what in the existing skill set changes now with AI? So I think we're really excited about the progress we've made there. We also announced a partnership with Perplexity this past spring. We launched that in March, about halfway through the spring semester.

Speaker 2:

So this will be our first full semester where every single student, faculty and staff member within our Kogod School of Business has access to enterprise-level artificial intelligence. We're going to see a much deeper infusion of AI being used as a tutor, a thought partner, and a learning vehicle in all of our classes this fall, which I think is super exciting.

Speaker 1:

Fantastic. Well, it may be time for me to go back to school. That would be a scary thought, but wonderful stuff. Congratulations on the program and all the success; onwards and upwards. Thanks, Angela. Thanks, Evan. Thanks, everyone, for listening, watching, and sharing the episode, and check out our new TV show, now on Fox Business and Bloomberg, at techimpacttv. Thanks, everyone.