Good Enough Isn't

Will AI Replace Universities? ECPI’s Bold Experiment in AI-First Learning

Patrick Patterson

with Stephen Arthur (Director of AI & Analytics, ECPI University)

Episode Summary

This week, Myles Biggs and Patrick Patterson sit down with Stephen Arthur, Director of AI and Analytics at ECPI University, one of the first institutions boldly pushing toward an AI-first university model. While most universities are still debating whether students should be “allowed” to use ChatGPT, Stephen and his team have already built a proprietary AI learning platform and integrated it into curriculum design, student support, faculty workflows, and the operational backbone of the university.

Stephen shares how his unusual journey, from aerospace engineering to marketing analytics, to AI product leadership, shaped the mindset needed to drive change in one of the most tradition-bound industries in America. You’ll hear how ECPI is breaking higher-ed inertia, converting AI skeptics into evangelists, and redefining what it means to prepare students for the workforce of the future.

If you’ve wondered whether AI will replace universities, or remake them, this episode is your front-row seat.

What You'll Learn in This Episode

  • AI’s existential question for higher ed
  • How ECPI became an early AI-first university
  • Change management: flipping skeptical faculty into AI champions
  • Marketing, analytics, and the engineer’s mindset
  • The Three C’s that ensure universities survive AI
  • The future of work and which careers AI won’t replace soon
  • Operational AI across the institution

Featured Guest

Stephen Arthur — Director of AI & Analytics, ECPI University
Engineer turned marketer turned AI builder, leading ECPI’s transformation into one of the first AI-first universities.

Stephen on LinkedIn: https://www.linkedin.com/in/stephen-arthur/

ECPI University Newsroom: https://www.ecpi.edu/newsroom

Takeaways for Operators & Education Leaders

  • AI won’t replace universities, but universities that ignore AI may become irrelevant.
  • Experience beats access: AI enables one-to-one tutoring at scale, something higher ed has dreamed of for centuries.
  • Adoption requires solving faculty pain first, not just student pain.
  • Curriculum must evolve constantly: AI isn’t a topic to teach, but a medium through which all learning happens.
  • Operational AI matters: IT support, financial aid, and student services can all be reimagined with LLMs.
  • Statistical literacy is a superpower in marketing, analytics, and now AI deployment.
  • The human element still wins: Community and accountability remain irreplaceable advantages of real institutions.

Connect With the Show

Level Agency: https://www.level.agency/

Patrick Patterson: https://www.linkedin.com/in/pattersonwork/
Myles Biggs: https://www.linkedin.com/in/mylesjb/

How to Support the Show

  • Subscribe so you never miss an episode.
  • Rate & review if this conversation brought you value.
  • Share the episode with colleagues exploring AI, higher ed innovation, or organizational transformation.
Stephen Arthur:

Will AI replace universities? That's what I'm writing right now. It starts by talking about all the different technologies predicted to be the death of universities: the printing press, the radio, the television, then the internet. I think this is a little different, though, because all of those past technologies expanded access to information. AI interprets information.

Myles Biggs:

Hello everyone, and welcome back to the show. It's your hosts Myles and Patrick, here once again to remind you that on this podcast, we are driven by truth. Sometimes the hard truth. We believe it's imperative to be relentless for results, 'cause if you're not, your competitor is. We're obsessed with how to be better every day, because that's what our customers deserve. If you can set aside your ego, if you can truly be a no-ego, then we're the show to help you go all in, because good enough isn't. Today our guest is Stephen Arthur, Director of AI and Analytics at ECPI University. Where most universities are still writing strategy decks about AI, Stephen and his team have quietly built their own AI learning platform. He didn't just talk about the future of learning, he shipped it, flipped skeptical faculty into evangelists, and is now turning ECPI into one of the first AI-first universities. So today we're gonna talk about theory, but we're also gonna get into what it actually takes to break higher-ed inertia and build something better. So Stephen, welcome to the podcast.

Stephen Arthur:

Thanks for having me.

Myles Biggs:

So before we get into all that, that thing you're building that I alluded to, let's talk about you for a second and how you got here. In checking out your LinkedIn, I saw that you got your bachelor's in aerospace, aeronautical, and astronautical engineering. Which is a lot of words for one degree, I've gotta say. And then you got your MBA from Carnegie Mellon. I'm curious to learn what made you go not just engineering, but that specific trifecta of engineering, and why you went back to get your MBA.

Stephen Arthur:

Sure. So, I was always good at science, math, physics, chemistry, all that stuff. So at Virginia Tech, I kind of knew I wanted to get into engineering. I was debating either physics or engineering. In the freshman class, you attend all of the specific disciplines of engineering, and they all were so boring. You have, you know, civil engineering, which is just a bunch of bridges and buildings; that's not fun. And I've never been a hands-on type of person, so I was thinking mechanical engineering's out. You know, aircraft felt like the most fun-sounding one. And turns out it was one of the harder ones too. So that was pretty fun. But then as I'm going through college, I started working with ECPI as internships over the summers, doing odds and ends here and there. After I graduated, I got a CAD software role. ECPI had just acquired new CAD software, and so my first job out of college was to go in, go through all the training on this specific software, and then teach it to our faculty who may not already know it. And then as I was doing that, someone was leaving from the marketing department, and their VP of marketing, you know, I started to talk with her, and I kind of naturally got pulled into a marketing analyst role, because she liked how my mind worked and all of that. She gave me a big marketing budget to manage. And then I just kinda kept taking on more and more marketing roles and channels, and essentially became our unofficial director of digital marketing. And then I started transitioning into a university-wide analytics role, creating predictive models for student retention to find out which students are likely to drop out, so we could intervene. You know, doing inquiry scoring models for marketing and admissions to figure out who are the most likely people that are going to enroll.
Then just doing general analytics and reporting across all of the different departments. Then gen AI hit the world three years ago, and it was kind of freaky in academia, as you can imagine. Everyone started to see students using it to do all of their work for them. And people just absolutely hated it. Everyone was like, well, if we can just learn anything with AI, then what's the point of university? And so there was a little bit of anxiety around that. I just started learning more about it and how it works. Thankfully I learned a lot of calculus in college, which helped me understand the mechanics of how these large language models work, and that helps you get a better intuition on how to use them, influence them, and work with them. We have a pretty large software developer team here at ECPI, but unfortunately, we were struggling to get some of these first AI projects off the ground. So I convinced our university president to let me hire a couple of software engineers and start building some gen AI products for the university. That's what I've been doing for the last year and a half.

Patrick Patterson:

That's great. I want to come back to that moment where you took over marketing and what that felt like. But before I do, maybe a tongue-in-cheek question here: how many times have you used "it's not rocket science"?

Stephen Arthur:

Honestly, I don't really use that phrase.

Patrick Patterson:

Oh man, it's the trump card. It's the trump card. You have, like, "Hey, it's not rocket science. And I would know, literal rocket scientist."

Stephen Arthur:

Rocket science is very hard, I can tell you, from learning even just the surface level of how it works.

Patrick Patterson:

Yeah. You and I have very similar backgrounds. I was an analyst and got pulled into marketing. I call it going to the dark side, into sales and marketing. Talk about that moment, what that felt like, and what you realized you were bringing to the table versus a more traditional advertiser. We're all Math Men now versus Mad Men, and you were leading that charge. What did that feel like? What were you bringing to the table?

Stephen Arthur:

Yeah. So, what happened when that transition first happened, when I went into my first marketing role: I'm working as an engineer on CAD design software and simulation software, and then one day I get a tap on my shoulder. It was the university president. He said, follow me. So I walked down the hallway with him into the VP of marketing's office. He asked me, how good are you with money? I said, well, I have my own personal finances, I'm doing pretty good, I invest a little bit here and there. He was like, all right, I'm gonna give you four and a half million dollars. We'll work with you to help you figure it all out, but we think you can do it. The first role was the affiliate marketing side, almost like portfolio management, where you have a bunch of different partners you can work with and get inquiries from. And then our VP of marketing joined the first few meetings with the agency that I was managing, and after the first two weeks, she was like, all right, you got this. So I took it, started working on it, learned how that whole thing worked. That whole ecosystem is very complex and very large. I actually co-chaired a committee that created a document on best practices for how that whole ecosystem works, for an organization called Career Education.

Patrick Patterson:

Yeah.

Stephen Arthur:

And so that was a lot of fun. Got to work and network with a lot of people on that. And, yeah, over the years I started volunteering to take on more and more. The first thing was our email marketing. There was no personalization, no automation. It was just, you know, an event is happening or a holiday is happening, so we'll just send a million emails to everybody in our system. And our open rates were one to two percent, because Google doesn't like that at all. So I took that over. That person started reporting to me, and I developed our entire email nurture strategy and automation. Eventually we were sending millions of emails a year to students, getting 20 to 50% open rates, depending on the email. So I developed that whole strategy. Then took on Facebook, started getting into the details on campaign management on the Meta platform, and then paid search. I just kept volunteering to take on more and more. From what I have seen, if you want to move up in a company, you have to volunteer for work. You're not gonna get paid more for it until your next review. You need to make yourself invaluable to move up, and you only do that by taking on more work. So that's been my philosophy over the years.

Patrick Patterson:

So I think you do the work, and then you get recognized for it after. It also has to be met with a skillset that you bring to the table. I imagine that moment where you come in and you're, you know, looking at marketing for the first time with an engineering background and everything, you probably had a different lens through which you were looking at things versus other folks. Were there things that you saw where you were just like, maybe other people would be okay with this,

Stephen Arthur:

Yeah.

Patrick Patterson:

other people were like, this is just how it's always been done. And you come in and you're like, no, this is silly. We gotta do it some other way.

Stephen Arthur:

Yeah. And it's not just coming at it from an engineering background as much as a statistical mindset. I look at how some people manage their campaigns, and they say, all right, this part is doing well, this other part isn't, so let's cut the stuff that's not doing well. That sounds correct, but there's a lot of context to take into account, because if your sample size on either one of those isn't big enough, then you are cutting something that might actually be valuable. You just don't know, 'cause you don't have enough data on that particular thing. One of the examples on the affiliate marketing side was, you have a thousand different inquiries coming in, and you look at it by high school grad year, where they're coming from, by state, by zip code. What a lot of people will do, for instance, if you have an online college, is group up all the zip codes that perform the worst and say, all right, if we cut all that, here's the performance. And then the performance is never that good, because you can't just group up the worst-performing zip codes. Each individual one might not have enough sample size for you to even be able to know if it's actually good or bad yet. But when you group 'em all up, it looks like you have plenty of sample size on all the bad stuff. An example I use with some agencies that don't necessarily know what they're doing as well as they should: you have 10 people all flipping a coin, and anyone that gets tails, you cut. And you cut them, and, all right, we're gonna get all heads from now on, 'cause we've got only the people flipping heads.

Patrick Patterson:

that's how that works, right?

Stephen Arthur:

Then the next round happens and you still get half heads and half tails. It's kind of like that in campaign management, where you have to come up with some way of grouping data and campaigns to a point where you have enough sample size, but the grouping has to make sense. You can't just look at it by performance and group all of them. Instead, maybe by zip code, you look at correlations of performance for each zip. So look at the lowest-income zip codes, group all of them up, and then see what the performance is.
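Stephen's coin-flip analogy is simple enough to simulate; a quick sketch (not from the episode, just an illustration) shows that cutting the tails-flippers buys you nothing in the next round:

```python
import random

random.seed(42)

def flip_round(players):
    # Each player flips a fair coin; True = heads.
    return {p: random.random() < 0.5 for p in players}

players = list(range(10))
round1 = flip_round(players)

# "Cut" everyone who flipped tails, keeping only the heads-flippers.
survivors = [p for p, heads in round1.items() if heads]

# The survivors are no better at flipping heads: over many more rounds,
# their heads rate is still about 50%, because round 1 carried no signal.
flips = [flip_round(survivors) for _ in range(10_000)]
heads_rate = sum(sum(r.values()) for r in flips) / (len(survivors) * 10_000)
print(heads_rate)  # close to 0.5
```

The same logic applies to grouping the "worst" zip codes: selecting on past noise guarantees the selection looks good historically and tells you nothing about the future.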

Patrick Patterson:

Yeah.

Stephen Arthur:

And so, having a statistical understanding is very valuable in the marketing world. And then also having the logic skills to figure out, all right, well, if you cut this, what's gonna happen? You have to understand selection bias and all these different statistical phenomena to understand at a deep level how to look at data.

Patrick Patterson:

Yeah. I remember a time, this was 15 years ago, we were managing around a $30 million paid search budget, and it was like, let's take the lowest-performing 30% of keywords and just cut them all.

Stephen Arthur:

Right.

Patrick Patterson:

Right? And that was a strategy. I remember having to fight with the team. I was like, that's not what we need to be doing. When you look at long-tail search, and the fact that searches happen every day that have never been searched before, you're looking at very low sample sets. Those types of decisions are difficult to understand even when you have the data in front of you. It's more than just looking at statistics; it's also understanding the platforms, the people, the need, all of that. So there's a human-centric part of that as well, as well as being educated enough to understand what the data's telling you. Right. It's interesting.

Stephen Arthur:

Yeah, that's a good example, and I agree with you. Trying to explain these things, if people don't understand statistics and how all this works, it's hard to do that in a way that helps them make sense of it. That's been a challenge. One example I give on how to get around this: you don't need to know statistics, you just need to know one heuristic that I give my team. Our main goal is, you know, we try to get as many students as possible for the least amount of money in advertising dollars. And so cost per enrollment is kind of our metric that we look at. And if you're looking at all of these data, all these different places you're getting these inquiries from, do a plus-two rule: what is the performance now, and what would the performance be if you randomly just happened to have two of those people enroll tomorrow? If that number looks good, then you don't have enough sample size to know you need to cut it. That's a very simple rule, and one of many that I've tried to develop, to just be like, just follow this. You don't need to know anything else.
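The plus-two rule as described can be written down as a tiny check. The function name and the dollar figures below are invented for illustration; only the heuristic itself comes from the conversation:

```python
def plus_two_check(spend: float, enrollments: int, target_cpe: float) -> bool:
    """Return True if a source is safe to cut under the plus-two rule.

    Current cost per enrollment may look bad, but if just two more
    enrollments landing tomorrow would bring it under target, the
    sample is too small to justify cutting the source yet.
    """
    optimistic_cpe = spend / (enrollments + 2)
    # Bad even in the optimistic case -> enough evidence to cut.
    return optimistic_cpe > target_cpe

# $2,000 spent, 1 enrollment looks terrible ($2,000 CPE), but even at
# +2 it's $666.67, still above a $500 target, so cutting is defensible.
print(plus_two_check(2000, 1, 500))  # True -> cut

# $900 spent, 0 enrollments: at +2 it's $450, under the $500 target,
# so there isn't enough data to know this source is bad yet.
print(plus_two_check(900, 0, 500))   # False -> keep for now
```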

Patrick Patterson:

Yeah, that's a phenomenal tip. Anyone listening should take that away as a good rule of thumb, especially in a world like education or anything that's a large lifetime value or a considered purchase. Right? You know, we all got enticed to go past cost per thousand impressions, then cost per click, then cost per lead, then cost per app, and then cost per start or enroll. Eventually, we were going towards cost per graduation.

Stephen Arthur:

We're looking at cost per salary after graduation.

Patrick Patterson:

There you go. So it is enticing to go down that far, but then you lose the ability to really get actionable insights from some of that data, right? You get directional insights, and people confuse actionable and directional a lot of the time when we look at those charts. So you come in, you're having these conversations, bringing a new way of thinking. Did that transform the way ECPI was thinking about their portfolio of advertising, and what was one major lesson you learned from that moment?

Stephen Arthur:

I think one of the big things that I was able to bring is a culture of A/B testing things. Don't just change something because it looks better. Typically, from my experience, the worst-looking webpages perform the best, unfortunately. So it's tough

Patrick Patterson:

Yeah.

Stephen Arthur:

to hit both of those, where it looks good and converts well. But yeah, just having that mindset. And it's a very humble thing to learn, to know that what you think is correct isn't always going to be correct. What you think is going to work will, more often than not, fail. It's just a law of nature. Entropy is an example: there are more ways things can go wrong than right. That's basically what entropy is. A big example of this on the K through 12 side is when you look at all of the different interventions that the school systems have tried to incorporate; almost every new program that they try tends to not work well. Summer school is a good example. You don't pass sixth grade, so your kid has to go to summer school, and it doesn't really work that well, because it turns out when you put a bunch of kids that don't perform

Patrick Patterson:

Mm.

Stephen Arthur:

all together, that's a recipe for disaster. There are a lot of those types of interventions. I think a good principle is: there are more ways that things can go wrong than can go right. So the odds are that whatever you're going to try to do is not going to work, even if you have good underlying principles and you've seen things work before. You need to bring a culture of testing just to make sure.

Patrick Patterson:

I think that culture of testing, building that muscle memory, is helpful in a world where everyone thinks they are a marketer and everyone thinks they're right. Going into where we are today with AI, one of the things that I'm super excited about is this ability to fail fast and to really test things. The cost of being wrong, back 10 years ago, was pretty high. I remember, when you're managing huge marketing spends and you want to do an A/B test on a landing page, and you're spending millions of dollars: if I'm wrong, I'm screwing up half of that, right? So you don't ever wanna be that wrong. And so the ability to do some things with some of the newest tech, like user-testing things with LLMs before they go to market, has really allowed us to pressure-test bad ideas quickly and effectively, and have that culture of testing while still being conscious of the budget. But it's those long tails,

like the keywords and the campaigns you shouldn't be touching because you don't have enough data. It's the long-tail thing. It's the 45th idea you had, not the second, that's going to have the eight to ten x return. And if you're not in that world where you're getting to the 45th test, you're never gonna find it.

Stephen Arthur:

Yeah, exactly. And to that point about the opportunity cost of testing something that isn't going to work: one of the other principles that I bring is, if you are going to do a risky test on a lot of money, like a page that gets a million dollars a month, do a 90/10 split.

Patrick Patterson:

Right.

Stephen Arthur:

We're talking about that right now with some new pages we're trying to roll out. We just rolled out a new AI-in-education page describing all the ways we've started incorporating AI into our curriculum. We had to go through many iterations of testing before we saw that it was working well, even though it looked great from the beginning. There were just a couple of mechanical things that were slightly off, and when we compared it to other pages, those little mechanical UX types of things are what bite you when you're not looking.
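A 90/10 rollout like the one described is commonly done with a deterministic hash on a visitor ID, so each visitor sees a consistent variant across page loads. This is a generic sketch of that pattern, not ECPI's actual implementation:

```python
import hashlib

def assign_variant(visitor_id: str, test_share: float = 0.10) -> str:
    # Hash the visitor ID to a stable number in [0, 1]; the same
    # visitor always lands in the same bucket on every request.
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF
    return "new_page" if bucket < test_share else "control"

# Roughly 10% of visitors get the risky new page, 90% the control,
# so a bad variant only ever touches a tenth of the spend.
counts = {"new_page": 0, "control": 0}
for i in range(100_000):
    counts[assign_variant(f"visitor-{i}")] += 1
print(counts["new_page"] / 100_000)  # close to 0.10
```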

Patrick Patterson:

Yeah. Well, I do want to talk about what's being built right now at ECPI. Before we do that, I love talking with folks that have done A/B tests, landing page tests, ad tests, 'cause everyone has the one thing they found that they couldn't believe worked on a landing page. And it worked. Do you have any tips and tricks for anyone listening on how they can increase the conversion rate on their landing pages?

Stephen Arthur:

I am not giving away that secret.

Patrick Patterson:

I can tell you this, the color of the button does not matter.

Stephen Arthur:

When I built out that whole automated nurturing email strategy, that's what everyone says: the test is just the button color, see if you can get more clicks on that button. And it's like, no. The biggest things, and this is another principle that I've found over the years on the predictive analytics side of the world, is that the things that have the most impact are behavioral. They're not the color of something. For instance, with student dropout prediction, you're trying to figure out which students are most likely to drop. You have a ton of data you can look at. You can look at their age, race, marital status, military status, gender, all the different characteristics about them. If you look at the drop rates by all of those characteristics, none of 'em are that valuable. A man is not much more likely to drop than a woman. A 40-year-old is just as likely to drop out as a 25-year-old, on average. So that's not what is predictive. The thing that is most powerful is the behavioral side. How many times are they logging into your LMS? How many assignments have they submitted this week? How often are they in there? What was their last day of attendance? Are they showing up to class? All of these different behavioral things are really what predict future behavior, and it makes sense: previous behavior predicts future behavior. It's the same on any UI or UX interface, whether it's an email, a landing page, or anything else. The things that have the most impact are what change the user's behavior on your site. Not the visual they're looking at, but how they interact with it. If I had to give one general thing, I would say, try to adjust your user's behavior by introducing elements that influence that, and that'll have the biggest impact.
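Stephen's point maps directly onto feature engineering. A hypothetical sketch (the event fields and function name are invented for illustration, not ECPI's schema) of turning raw LMS activity into the kind of behavioral features he lists:

```python
from datetime import date

def behavioral_features(events: list, today: date) -> dict:
    """Summarize a student's LMS activity into predictive features.

    `events` is a list of dicts like {"day": date, "logins": int,
    "submissions": int}; demographic fields are deliberately absent,
    since past behavior is what predicts future behavior.
    """
    active_days = [e["day"] for e in events if e["logins"] > 0]
    last_active = max(active_days) if active_days else None
    return {
        "logins_last_7d": sum(e["logins"] for e in events
                              if (today - e["day"]).days < 7),
        "submissions_last_7d": sum(e["submissions"] for e in events
                                   if (today - e["day"]).days < 7),
        "days_since_last_login": (today - last_active).days if last_active else None,
    }

events = [
    {"day": date(2024, 3, 1), "logins": 2, "submissions": 1},
    {"day": date(2024, 3, 6), "logins": 1, "submissions": 0},
]
print(behavioral_features(events, today=date(2024, 3, 8)))
# {'logins_last_7d': 1, 'submissions_last_7d': 0, 'days_since_last_login': 2}
```

Features like these would then feed a retention model; the point is that every input measures what the student did, not who they are.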

Patrick Patterson:

Great.

Myles Biggs:

When did you get the next tap on your shoulder from the president saying, hey, you're good with money, we're gonna spend a bunch on AI, and I want you to do that? How did that come to pass?

Stephen Arthur:

That one was a little more gradual. It was a series of meetings over that first year after ChatGPT (3.5) came out. Like, all right, well, what does all this mean? You know, no one was really building any of these things in earnest, in any industry, in that first six months to a year. There were some people that were really on the ball and were able to start building out, you know, GPT wrappers and stuff and start selling them for 20 bucks a month when it only cost them 50 cents a month in API token costs. How do we combat this? Do we even teach our students how to use this? Because they're using it right now to cheat. We can tell. Or what we consider cheating: using AI to do their assignments for them without even looking at it. You have them write an essay, and they just copy and paste those instructions into ChatGPT and it does the essay for them. And then they don't even read the essay. They'll just submit it, forgetting to remove the bit at the end that says, if you'd like, I can also do this for

Patrick Patterson:

Right.

Stephen Arthur:

the submission. We're like, all right, well, if we introduce our students to ChatGPT, even more of them are gonna use it to cheat, because now they're aware of it. There's a huge philosophical debate in academics right now, across the world: does it matter if your student knows how to do something themselves, without AI? It's the whole question of, you know, "you're never gonna have a calculator in your pocket when you get out of school, and you need to be able to do this math." And that turned out not to be right, 'cause all our phones have calculators. You don't need to know how to do long division, 'cause your calculator can just do it. We're having a similar, much more fraught debate now with gen AI. If your students can do your assignments with AI, does it matter if they know how to do it without it? When they go into the workforce, they're gonna have AI to do whatever work they need to do. Then there's the question of, all right, well, if they can get an A in our classes, then they can get an A when they get a job, in their job performance. That's wildly different from what academia has stressed for the last thousand years of higher education, all the way back to the, what, 1100s at Cambridge and Oxford.

Patrick Patterson:

Sure.

Stephen Arthur:

You know, it's the difference between true learning and being able to just get the job done. And I think that's been a big transition in the college and higher education system for the last several hundred years. It used to be that college was only for the elite, that you just learned philosophy, history, all of that sort of thing. And there was no real economic benefit to going to university. You went to become a better citizen, to learn how to, you know, run a city. Then, in the mid-to-late 1800s, it started to evolve. And it wasn't just that, because now there were professional degrees. You know, you go to school to become a doctor, a lawyer, even at the time clergymen, that were more learned. There was a little bit of an ROI, but the main factor was being able to learn something so that you could apply it and provide a valuable service to your community. Then the industrial revolution happens, and it goes further towards engineering, managing, and scientists, with career colleges, technical colleges, and coding bootcamps. A lot of the career college sector has almost completely dropped the academic learning-just-for-learning's-sake, and it's purely transactional ROI, teaching a skill. And this is why I think that the career colleges are probably gonna be the ones that succeed the most in this era, because they've already made that shift. What is AI good for? It's good for making things more efficient and making things much easier to learn, because it can explain things. Those two things are what career colleges have been trying to do for the last 50 years, and AI will just accelerate that. Whereas traditional universities, their motto has always been learning for learning's sake, and knowledge is its own reward. If they start adopting AI and making it more transactional, towards "it doesn't matter if you use AI to do your assignment, because you were able to accomplish the goal you needed to,"
that's what the workforce requires. Your employer doesn't care how you get the job done, just that you get it done. That goes completely against traditional academia's goal, and it is going to blur their mission, I think. I know one of the people that just started working here; he said that his wife works for Virginia Tech, my undergrad school. Apparently they are banning all faculty and students from using any type of AI. It's the same at a lot of institutions I talk to. They're only allowed to use Copilot because of some security and privacy thing; a lot of higher education is Microsoft, including us. They're only allowed to use certain tools, and the students aren't allowed to use it at all. Whereas you have other schools, like the University of Tennessee, the University of Michigan, ECPI University: we are diving headfirst into all of this AI. We're updating our curriculum so that every single program now has at least a third of its classes mention how AI is impacting that particular course. We're still in the process; it takes a long time to build all that curriculum out, but we're making very fast progress on that and starting to build AI tools into our student portal, basically everywhere. We have our own AI platform that students use. It's a philosophical debate that has been ongoing. The longer it goes, I think the more likely it's gonna be that the schools that just don't start embracing AI are just gonna die. Or they'll go back to what they originally were, which was just for people that have a lot of money and just want to learn things for learning's sake.

Patrick Patterson:

Yeah.

Stephen Arthur:

That's gonna result in a lot lower enrollment.

Patrick Patterson:

Well,

Stephen Arthur:

People going to college these days are looking for an ROI.

Patrick Patterson:

I think as we moved from the industrial revolution to the knowledge economy to where we're moving now, those types of jobs, the trades, are going to become more important as AI allows more efficiency in the knowledge economy. It's interesting. I was a math major, and I remember the debate about the TI-89 and the TI-92 calculators, which could do calculus for us. With 50% of the professors, you were able to use it. It's very, very similar to where we are today, in my opinion, where the argument was: if you use the TI-89 to do differential equations, then you're never gonna know how to do differential equations. Right? What ended up happening is it actually became the new normal. So instead of us all being down at level one, when you gave us a TI-89 calculator, we all got to be on level eight together. But you still had to stand on top of level eight. There was still a human in the loop that had to do the work and wield those tools. I think it's very similar. What gen AI is doing is it's raised us all to platform 100, right? But now the difference is the human on top of that. I think the folks that are not adopting it are choosing to stand at level eight still, or even platform one, and they can have their abacuses and their buggy whips. Like, it's fine. But to your point, that's not the future.

Stephen Arthur:

That's not good enough.

Patrick Patterson:

It's not good enough. The advances we've made in advanced mathematics since the PC and those calculators came out would not have been possible without those tools. Right.

Stephen Arthur:

All right.

Patrick Patterson:

I choose to believe that we're in another version of that. The advancements we're gonna see over the next five years, we can't even think of them right now. Now that we're all standing on platform 100 or 1,000, we're able to do things that we never thought were even possible years ago.

Stephen Arthur:

A good example of that flows almost perfectly with our philosophy when we're doing our curriculum redesign of our programs. One example, because I'm working on it now, is we're introducing coding to our business students. In the business program, the marketing class now will include how to actually code up a website, because it's very easy to do now. I mean, there are a lot of tools out there that we introduce our students to so they can just build a website. You can use Claude AI to create an artifact that will just code up all of the HTML and CSS for you, and you can make edits in plain language. We're teaching them how to host it on, you know, Vercel and v0. And so they can go from no knowledge of coding to having a functional website, designed and hosted, in a single course in five weeks. Our courses are five-week terms, so they have two courses at a time every five weeks. It's very fast paced, but they can do it in five weeks. That's more than enough time to teach someone how to do something, and that's just one part of the class. We're bringing people up to that second rung because it's easy to do. It's not that hard, but a lot of colleges don't really have that vision yet.

Patrick Patterson:

As an employer, I can tell you, we can see the difference between the schools that are actually teaching the skills or encouraging the use of these tools inside of their education. I don't think people need to have a class on OpenAI. They need to say, we want to teach X, and how can we utilize the tools to make that a great experience and to make that better? I don't know if you have a prompt engineering class, but I don't think schools should be thinking like that. They should be thinking, how do I teach physics to the next generation utilizing all the tools that exist, so that they're prepared to be better than we were? Right.

Stephen Arthur:

Exactly.

Patrick Patterson:

We can tell the difference. When we interview folks in their twenties, it's like, oh, you went to a school that didn't allow OpenAI.

Stephen Arthur:

Right.

Patrick Patterson:

Oh, but you went to a school that did. And I'll tell you, those folks that did, and the folks that really embraced it, like they are absolutely standing out in the interview process right now.

Stephen Arthur:

Yeah, that's good to hear, because that's what we're trying to do. I mean, we don't have a prompt engineering class. That was all the rage. What was that? There was one role that

Patrick Patterson:

You.

Stephen Arthur:

went viral, $300,000 for a prompt engineer.

Patrick Patterson:

I told people, it's gonna go away in a year.

Stephen Arthur:

And honestly, there's only one prompt you ever need to know, and it's to add at the end of every prompt you try: ask me questions to clarify the context,

Patrick Patterson:

One tip,

Stephen Arthur:

then prompt again after answering whatever it asks you about.

Patrick Patterson:

Let's call that out for anyone listening who's not doing that. I think that's easily missed, right? What you just said. A lot of people treat ChatGPT like it's a Google box. They ask it questions and try to find their answer. When they don't get the answer, they ask it another question, and they don't get their answer. They ask another question, then they give up and say, this ChatGPT thing is dumb. Right? I have found tremendous value in doing what you just said, which is: hey, I have this harebrained idea, it's probably bad, whatever it is. Here's my chicken-scratch idea. Ask me 10 questions about this that will tighten it up or make it better. I answer those questions, and the responses are then just so much better. And in that, it's a thought partner with me,
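For listeners who script their AI workflows, the tip above can be sketched as a tiny helper. This is a hypothetical illustration, not anything ECPI or OpenAI ships; the function name and exact wording are ours. It just appends the clarifying-questions instruction to whatever prompt you were about to send.

```python
def with_clarifying_questions(prompt: str) -> str:
    """Wrap a draft prompt so the model interviews you before answering."""
    suffix = (
        "\n\nBefore answering, ask me questions to clarify the context. "
        "Wait for my answers, then give your full response."
    )
    return prompt.strip() + suffix

# Example: a vague "chicken scratch" idea becomes a request for an interview.
draft = "Here's my rough idea for a new onboarding flow. Tighten it up."
print(with_clarifying_questions(draft))
```

The point of the design is that the model, not the user, does the work of discovering missing context, which is exactly the "thought partner" behavior described above.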

Stephen Arthur:

Right.

Patrick Patterson:

A co-CEO, in a lot of ways, in that journey. You know, when I have a hard problem today, where 20 years ago you'd call your executive coach, now I start with ChatGPT and have a debate with it. It's teaching that critical thinking. That's a really solid tip. Anyone who's not doing that needs to be.

Stephen Arthur:

It goes all the way back to Socrates with Socratic dialogue. The best way to learn is to talk, to have a dialogue and ask questions. The best way to use AI right now is to have it ask you questions. We've applied that to our curriculum. We're including AI topics in all of our courses, but we're including AI as a medium of learning as well, where students are required to interact with AI, in our case OpenAI. The platform asks them questions about a particular topic that they need to learn about. And then they have the conversation with our specifically designed chatbot that walks them through the topic that they're learning about and tests their knowledge. And if they don't know, it'll give them a hint and help them get to the right answer. They're graded on that interaction, on what they got right and how thorough they were in describing it. Just that principle of having it prompt you is key,
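For readers who build on chat APIs, the pattern Stephen describes can be approximated with a system prompt that flips who asks the questions. A minimal sketch, assuming an OpenAI-style messages list; the prompt text and function name are illustrative, not ECPI's actual instructions:

```python
# Sketch of a Socratic tutor: the system prompt carries the pedagogy,
# and the student's running conversation is appended after it.
SOCRATIC_TUTOR = (
    "You are a Socratic tutor on the topic: {topic}. "
    "Ask the student one question at a time. If an answer is wrong, "
    "offer a hint instead of the answer, and only advance once the "
    "student responds correctly."
)

def build_tutor_messages(topic: str, history: list) -> list:
    """Prepend the tutor system prompt to the student's chat history."""
    system = {"role": "system", "content": SOCRATIC_TUTOR.format(topic=topic)}
    return [system] + history

# The resulting list can be passed to any chat-completions style endpoint.
msgs = build_tutor_messages("Ohm's law", [{"role": "user", "content": "I'm ready."}])
```

Grading the interaction, as described above, would then mean evaluating the submitted `history` rather than a single final answer.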

Myles Biggs:

Yeah.

Stephen Arthur:

I think.

Myles Biggs:

Can we,

Patrick Patterson:

Great.

Myles Biggs:

We've been alluding to this thing you have. I'd love to talk more about ECPI Intelligence. You've talked about A/B testing, and what you have today was not just born perfect. You had a lot of different variations that existed and were scrapped. Take us through the arc of what it started as and what it is today.

Stephen Arthur:

Sure. So, this week is our annual faculty conference. We have all of our faculty from around the country fly in, and we have the president's panel with him and all of our university leaders, where we're describing all the things we've done this year and our plans for the future. We've only had this ECPI Intelligence platform out for less than a year, so this is its final major debut with everyone. A lot of them have been using it for the last six months, but this is the official announcement. We produced a slide that walks through the iterations of how the UI has developed, and I was just looking at that very first one, and it was just awful. It was basically a little chat bar at the bottom with a send button and the conversation flowing up and down. It was like ChatGPT without any features other than the chat window. It was just not good. But now it's in a good spot, and we've added all sorts of features to it. It's essentially ChatGPT, and we've built a sort of wrapper around it. We had to make a decision when we partnered with OpenAI, with their EDU enterprise product. We had to decide, do we want to do this for just our faculty and staff, or do we also get licenses for our students? That's the way a lot of university systems have gone. I think the University of Texas and the University of California systems across both states provide licenses to all their students, and that's fine, but that's millions of dollars. And how much are you actually getting out of it? I mean, I can tell you, for our faculty and staff, we give anyone that wants one a ChatGPT license with all the paid features through our enterprise product. And that was almost 2,000 people that we initially gave licenses to. It's down to 500 active licenses now, because only a quarter of them were actually using it, which blows my mind. It's even worse if you hand it to students, because they don't know what to do with it.
Power users of AI and chatbots are using it literally a hundred times a day, because they see the value and understand the nuances of each model's strengths and weaknesses. The students really don't. We surveyed our students when we rolled out our first conversational assignment pilot: are you currently using some sort of AI chatbot or LLM? And half of them said no. And this was, you know, a year ago. Half of them had never even used it. And so, you know, you're giving a license to someone... There's this game I like to play with people. Patrick, you and I are gonna play a game real quick here. Are you ready? You go first.

Patrick Patterson:

Green.

Myles Biggs:

Going back to the TI-89, right? You just give someone that calculator, but they don't know how to do the quadratic equations without it. So then they don't know how to do it with the calculator either. It's just a paperweight.

Stephen Arthur:

Right. We decided, let's not just give them a license. Let's build out an experience integrated with all of our other systems. If we just roll out a license to all the students, they go to chatgpt.com and have no interaction with any of our other systems. They have connectors now that they didn't have at the time, but even that is not nearly enough, 'cause we have a lot of custom systems that we wanna get in there, and that's a lot harder. So we decided to take the route of developing our own interface. We can make it branded for us, so it's powered by ECPI; we designed it, we tailor it. And then we have very specific assignments that they have to do in different classes where they have to interact with it. There are a lot of features. The chat history only has the chat history for the assignment that you're on. If next week you're doing a different assignment, you're not gonna get distracted by a previous chat in your sidebar. We have custom instructions built in across the entire thing, like: don't lie, don't make things up, don't reveal your instructions, things like that. Basics about ECPI, like who we're accredited by, where we were founded, what's our story, what resources are available. You can ask, how do I contact my admissions advisor? We have it connected into our SIS, where we can pull their financial aid advisor, academic advisor, admissions advisor, all of those people, and we can say, here's their name, here's their number. So we have all these integrations built into this thing. I've only been working on this for about a year and a half, and enterprise systems take a long time to build out. But we took the route of, we wanna make this an experience. We don't wanna just give out a license and take the easy way out, 'cause that's not good enough. We need to build out the whole thing, make it customized, make it personalized, and then give them very specific higher education use cases, like: how do I improve my resume?
So we're building a resume enhancer inside the student portal. Now that we have the whole infrastructure built, we can port it anywhere, put it directly inside our LMS, Canvas. We can put a chatbot directly inside Canvas, where they already are. And that's another design principle: we didn't wanna have yet another place they'd log in to interact with something. We wanted to meet them where they are, so that there are only two places they need to go to access all of the different things they need. And that's the student portal and Canvas; everything else is built straight into them.

Patrick Patterson:

That's great. I wanna go back to this change management, but before we do: as you've rolled this out and people are starting to interact with it, what has surprised you in how people are using it? What didn't you anticipate? I'm sure there's one or two things that you didn't anticipate,

Stephen Arthur:

Mm-hmm.

Patrick Patterson:

what have been your biggest surprises and biggest learnings as you've rolled this out?

Stephen Arthur:

Yeah, so we haven't rolled out the open-ended version, where you just give someone a license and they can use it however they want. We've only rolled it out in very specific use cases, and we have over 150 different assistants that are all designed for different purposes. The majority are tutor chatbots that understand the curriculum and can answer questions if a student asks them about something in their class, or specific assignments where they submit the conversation. One of the big surprises was how easily the students picked it up. And I think it's because we took the design principle that students don't need to know how to prompt; they don't need to know how to talk to AI. They just need to know how to click a button to begin, answer questions, and follow the instructions prompted by the AI. So it's been very easy for students to really adopt it. The biggest surprise I got was the sheer amount of positivity around these things. Our first pilot rolled out in January this year. We surveyed about a hundred students and asked them what it was like to do this AI conversational assignment. This particular one was in their first class, freshman orientation. They have to design a SMART goal, which has to be specific, measurable, et cetera. And a lot of our instructors don't even know how to write a good SMART goal. So students struggled massively with creating a good enough SMART goal, because they weren't specific enough, they weren't making it measurable enough, they weren't making it achievable enough. Now they're interacting with this AI chatbot that helps them figure that out automatically. They say, here's the goal I want; here's the list you can choose from. They choose one, and then it's like, let's make a specific goal for that particular thing. They say, you know, all right, I wanna be better with my time management.
And, you know, the specific part of that is: I want to plot out 15-minute increments for the first hour of every day and then just go from there. And the chatbot will be like, all right, that's very specific. Or if you just say, I wanna be better at time management, it'll correct them and not let them move on in the assignment until they get something specific enough. They finally generate their SMART goal at the end, and we survey them, and it was a 97% positive feedback rate. 97% said it was either helpful or very helpful, and no one said it was not helpful. The other 3% were just neutral. You never get a 97% satisfaction rate in anything you try to do,
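As a toy illustration of the gate Stephen describes (not ECPI's actual logic, which is conversational and LLM-driven rather than rule-based), here is a crude heuristic for "specific and measurable enough":

```python
import re

def looks_specific(goal: str) -> bool:
    """Heuristic check: a SMART goal should name a measurable quantity
    (some number) and say more than a bare wish like 'be better at X'."""
    has_number = bool(re.search(r"\d", goal))
    detailed_enough = len(goal.split()) >= 8
    return has_number and detailed_enough

# The vague goal fails; the concrete one with a measurable target passes.
print(looks_specific("I want to be better at time management"))                           # False
print(looks_specific("Plan my first hour each day in 15 minute increments for 30 days"))  # True
```

In the real assignment, the chatbot plays this role in dialogue, refusing to advance until the goal passes, then helping the student rewrite it.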

Patrick Patterson:

Yeah. Well, and I think there's been a dream in higher ed for one-to-one coaching, one-to-one teaching.

Stephen Arthur:

Yeah.

Patrick Patterson:

Every university tries to get as close to one-to-one as possible. It's just not financially possible. You can't have one professor for every student. When you look at this in a world where there are 20 people in a classroom, and 10 of them already understand the assignment and 10 of them don't, who is the professor gonna cater to in that moment? They have to either talk to the people who understand it or talk to the people who don't. Either way, you're leaving half of the class out. When you have the ability to be truly one-to-one and pull people along in that world, like, hey, just as you said, maybe I already know what specific means, so I can skip to measurable. But I don't understand measurable, so I get stuck on measurable for a little bit longer than someone else. But at the end of the day, we're all getting that one-to-one treatment. That's really admirable. And I'm glad that has a 97% positivity rate, because that's the goal.

Stephen Arthur:

It's not just the students, either. It's the faculty that are loving this too. One of the biggest challenges, like I mentioned earlier, is that academia and instructors have been super anti-AI. They see students using it for their assignments and they hate it. They don't like it; they don't want students using AI at all. How do we change that culture to be AI-first with our faculty if our faculty hate it? That's why one of the first things that we tried to do was figure out how we make this both good for the student and for the instructor, because we need the instructor buy-in. We positioned this in a way that allows the instructor to offload a lot of their busy work: the basics, like, here are just some definitions, students can read that. You don't need instructors to necessarily even grade as much anymore, or at least the repetitive stuff that students get wrong all the time. By positioning it that way, to make their lives easier, we've had a much better response. Once instructors see the students using our specific AI use cases, they flip 180 degrees from hating it to being like, this is the best thing ever. I have some feedback from one of our instructors. He said: this is my first time integrating the AI-driven assignments into my teaching. The response from students in the MA program has been overwhelmingly positive. Many have shared that these assignments are helping them better grasp and retain key concepts in a way that feels both engaging and supportive. From the instructional standpoint, this shift has been incredibly helpful. It's allowed me to focus more on teaching concepts in a meaningful and effective way, and to spend more time tutoring students in small groups. The AI-generated feedback streamlines the grading process, enabling me to review the AI suggestions and adjust scores as needed without sacrificing quality or student engagement.

Patrick Patterson:

I love it. It's phenomenal. That's the email you were hoping to get?

Stephen Arthur:

I was like, so happy for the next week, and I'm still happy.

Patrick Patterson:

Yeah.

Stephen Arthur:

I go back to this all the time and I'm like, all right, this is really cool. We're making a difference.

Patrick Patterson:

You know, it's awesome to get that feeling-10-feet-tall moment. And I'm sure the past three years were super easy, right? No one was fighting this, there were no hurdles. No. But how did this happen? I've talked with thousands of CEOs and companies and schools about this type of transformation, and I can tell you, you guys are leading the way in it. So how, in a world where everyone else is saying, hey, no, that's not what we're gonna do, do you rise above that and build something as impactful as you just said?

Stephen Arthur:

That is a good question. Honestly, I'm just shocked that everyone isn't doing this once you make the connection. I think maybe that's the problem: no one else has thought of that yet. One of the things I first saw when ChatGPT came out was that the leaders in how to use this came from academia. It was people in the university systems, really good instructors, really good professors, you know, the Ivy League colleges that were starting to use it in their classrooms. They were doing it in a very good way, but they were doing it for their audience of students who, if they're in an Ivy League school, are already motivated and already highly intelligent. And so they can pick up on how to use these things themselves. The other problem is that those first adopting this can't scale it beyond themselves. Whereas at an institution like ours, our instructors do not create their own curriculum. We create the curriculum centrally with a team and work with the subject matter experts, mainly our deans. We design the curriculum centrally and then hand it to the instructors to teach to. I think that's probably one of the biggest advantages we have as an institution: we can pivot at scale. We know the principles of how to design these things and can start rolling them out across the board. The instructors are handed these AI systems, and they're very intuitive to use. Both the students and the instructors find them very easy to understand, at least the dialogue, if not the underlying instructions and systems. But the dialogue itself, it's just talking, and that's something that everyone understands. It might just be that one-click moment where you understand that the assignment is the conversation.

Patrick Patterson:

Well, I think it speaks to the leadership. I think it speaks to your ability to impact change management inside of a space that is very hard, as you said, academia.

Stephen Arthur:

We've had support from university leadership to really lean into AI. I think that's another thing: our leadership here at ECPI has always been very future-minded. It's not just what's happening this year and next year; it's what's 10 years from now. Because with the regulatory environment, you have to think beyond the current presidency. You have to think about the next presidency in eight years and how that might impact what you're doing now. You have your students for over two years usually, so you have to think about where they're going to be when they graduate, and where the industry and the economy are gonna be in two-plus years, sometimes four. We have a very accelerated program where you get a full bachelor's degree in two and a half years, so that helps a little bit with making sure they're prepared for two and a half years down the line. But, you know, AI is not going away. That's the thing you just have to realize. And what the Virginia Techs don't realize is that this is not going away. This is as bad as AI is ever gonna be. It's just gonna keep going from here. If you're not preparing students for it, then they're not gonna be prepared in the workforce. The ones that are exposed are much better hires. We're not teaching students how to do the prompting, but they're learning how these systems work by interacting with them. They kind of learn it through osmosis: they're interacting with all of these different AI chatbots, and they see all the different ways that we've introduced them to how these tools can work. So when they go out into the real world, get a job, and start seeing other AI systems, they'll be familiar with them, how they work, and how to interact with them. And I think that's gonna be a skill that we didn't even really intend to introduce to the students.
Because they're using our AI systems so much, they're gonna be even more prepared. Not only are they interacting with these systems and learning how to talk to them, but they're understanding the capabilities of them as well.

Patrick Patterson:

Yeah. What you guys are doing is so important. When we went through the industrial revolution, we transitioned from farming to manufacturing. That took about 70 years. Then we transitioned from manufacturing to the knowledge economy. That took about 30 or 40 years. The thing that we don't have right now is 30 years. When you look at what could potentially happen to unemployment,

Stephen Arthur:

Mm-hmm.

Patrick Patterson:

in the next three to five years, maybe even sooner, met with the need for upskilling and reeducation, where do you see the future of all of this going, and what's the responsibility of universities and schools to figure this out?

Stephen Arthur:

I don't think anyone really knows. I have a Tesla that drives itself, and it's really amazing. But then you look at one of the number one jobs in the country: driving, whether it's truck driving, taxi driving, Uber driving. If all of that is replaced, that's a lot of people that won't be able to drive for a living. And that's just one thing. There are so many examples.

Patrick Patterson:

Call centers, 6% of people are in a call center job.

Stephen Arthur:

so we're already starting to

Patrick Patterson:

Yeah.

Stephen Arthur:

It's been slower than people thought it would be so far. I saw a study, I think it was MIT, that said there's really only been a few percent of jobs that have been impacted so far. This was just a couple months ago. But one of the things I'm actually writing right now, an op-ed, is on: is AI going to make the university obsolete? Because I was a little worried about that when this was first coming out.

Patrick Patterson:

I think it's the right question to be asking, regardless of the answer. Any business leader, anyone in their job right now, they should be asking that question. Just insert their job, insert their industry, insert their whatever: how is AI going to change our business? And the second question is, if I were to start over... You have a mission at ECPI, whatever that mission is. Everyone has a mission at their organization. If we were to say, I want to execute that mission with all the tools available today, how would our company be different? A lot of people are afraid to face that reality. I love that you're thinking about whether universities could be obsolete. I had someone ask me at a networking dinner, what generation will we stop teaching people how to read?

Myles Biggs:

It's wild.

Patrick Patterson:

And when I heard the question, I was like, that's ridiculous. Then it's like, well, we all have headphones on. We all have Meta glasses or whatever. And AI is talking to us all the time, and we're in an augmented reality world. And it was like, oh, there's a non-zero chance of that in the future. So let's think about that: when could that happen, and what would have to be true for that, you know? But it's those types of thought experiments that I think are very valuable for how we can start to chart the map toward the future. Right?

Stephen Arthur:

No, that is an interesting question. If you're just transmitting via telepathy, you don't need to know how to read.

Patrick Patterson:

It's bone conduction. Everything's going right in.

Stephen Arthur:

Will AI replace universities? That's what I'm writing right now. It starts by talking about all the different technologies predicted to be the death of universities. You had the printing press: even when it arrived, people thought books were gonna make universities unnecessary, 'cause you could just learn everything you needed to out of books. Then radio happened: why would anyone sit in a classroom if you can hear lectures from great minds anywhere? Even with television, you had the visual dimension, where you could explain things in a visual medium; surely televised instruction would replace the lecture hall. Then the internet: you can learn about anything just by looking it up. I think this is a little different, though. All of those past technologies expanded access to information. AI interprets information. It doesn't just give it to you; it can help explain it. And so that's why it does feel a little different right now, that this is not just any sort of technology. The example that I was thinking about was, I don't know how often you go to a Wikipedia page about a math subject, but it goes so deep into everything, and there are formulas everywhere, and you have no idea how to understand any of that. Now you can have AI explain it to you in simple terms. So that's fundamentally different: not just giving access to information, but also providing the interpretation of the information.

Patrick Patterson:

Well everyone has something that they're passionate about, a hobby or a thing that they go really deep on. And actually when I'm

Stephen Arthur:

case.

Patrick Patterson:

AI in my case as well. But I love, in interviews, asking: what are you more passionate about than anything else? And I loved asking those deep questions. Let's say I'm really passionate about poker. I play poker all the time, and that's what I like to do. You can then say, hey, take this complex physics problem, I don't understand physics really well, and relate it to poker. And now you're taking something that you really understand, and talk about connections in your mind that you never could make before. AI's able to take two concepts like that, that no one has ever thought about combining, like Pascal's rule and poker theory, and it's explaining it to you in words that you understand. I've been able to do that a few times, and it's just like, oh, I get that concept now. That makes sense. It's really powerful in a one-to-one way that was just not possible prior to this, unless you had a one-to-one coach that understood both of those things, which never happens.

Stephen Arthur:

Yeah. So, as I kept thinking about

Patrick Patterson:

I.

Stephen Arthur:

it made me start thinking deeply about what the university is for. The one constant that has remained throughout has been to create better citizens. That was the original goal: to create a better citizen. Career colleges today have a different way of doing that, where we teach a skill so graduates can provide more to their community and hopefully make more money. I came up with this framework of the three C's of why AI is not going to eliminate the university altogether: curriculum, community, and the credential. The curriculum: you need to know what to learn. And you can ask ChatGPT to create a curriculum for you on a particular topic, but the university creates, and spends a lot of time creating, including using AI to do this now, the curriculum of all the different things you need to learn. The hidden curriculum in a university or college is the dialogue you have with both your instructors and other students. And so, as you're learning, the thing that I argue is that it is a social exercise. This goes back to Socratic dialogue, where you learn best by having a dialogue, by asking and answering questions, talking to someone. And you can do that with ChatGPT, but you don't really get the communication, the learning how to work as a team, the collaboration, the deadlines you're being held to. If you're just using AI to learn something and you set a deadline, it's not going to keep you accountable. There's no consequence. You just don't finish, you don't do that assignment, it gives you a bad grade, and it doesn't matter, because it's just an AI system. It's not actually keeping you accountable. So that accountability is one big part of that. The second big part is the community, where it kind of goes along the same way, but, I forget what the statistic was, it's like over 50% of jobs, people are hired because they know somebody, not because they have the skills.
Learning and getting to know your network of people is very valuable in a college, because you're together for years at a time. Especially, like I mentioned, in my Carnegie Mellon MBA program, I'm with that same group of people for years. That's a very valuable set of connections. AI can't simulate the motivating effect of a fellow student who depends on you for a project. It can't replace the social pressure of your mentor or instructor who expects you to actually do well on a particular assignment. Thirdly, the credential is always going to be required. Our graduates are going to be entry-level going into their first job. They don't have a resume yet. They might have some experience doing Uber driving or something, but there's no industry experience. So they need some way to prove to an employer that they know what they're talking about.

Patrick Patterson:

Okay.

Stephen Arthur:

Because once you get the interview, then you can describe what you know. But it's a lot harder to get that interview if you don't have a credential in the first place. AI can assess skills, but it can't confer that institutional trust. We know you went to university to learn about a particular topic for years; you have the credential that proves you at least know the basics, and that's a good starting point. So those are the three arguments that I put forward on why universities will still be valuable. Now, what AI can enhance in that whole process is another question. I think that the universities that don't start using AI in all of these different ways are going to be the ones that get replaced, because that institutional trust will start to go down. If, like you said, the graduates of institutions that banned AI use aren't gonna be as good of a hire, then employers will stop recruiting from those colleges.

Patrick Patterson:

Yeah, I think that's a super valuable lesson. And out of those three C's, the community aspect is the hardest to replicate and the hardest to build. Anytime I see something that's hard to build and hard to replicate, that's usually where the value is.

Stephen Arthur:

Right.

Patrick Patterson:

I think AI can probably help with the curriculum, and you can do credentials, but if you don't have the community, you're missing a huge part of what is going to make you successful going forward. For the 19-, 20-, 21-year-olds out there who are reading all the doomsday articles saying all the entry-level jobs are gonna go away and no one's gonna work in the future. I've actually had someone tell me, "Hey, I'm not gonna go to school because of AI," which is crazy. So what's your advice to someone who's facing that decision right now? Where should they be looking? How should they be assessing? How should they be thinking about the future?

Stephen Arthur:

I think about it in the same way that we at ECPI University decide which degree programs we wanna offer. One of the things we're seeing tremendous growth in is engineering technology programs. I think a lot of that is because as everything starts becoming more automated, that automation is going to need to be repaired. Someone's gonna have to repair the machines that are out there, 'cause there's gonna be a lot of 'em and they're gonna break every once in a while, especially as manufacturing is being reshored to the United States. Over the last year, and over the course of the next decade, I think there's been a big push for that. When you're creating manufacturing facilities, you're gonna have a lot of automation and a lot of machines building all of these things, whether it's cars or anything else, and you're gonna need people to repair all of that. Right now robotics, at least android robotics, are starting to appear a little bit. But I'd say the most future-safe thing in the next 10 to 20 years is probably gonna be on the engineering technology side, where you're a repair technician or a mechanic. Because of the AI systems coming out, we also think cybersecurity is still gonna be a pretty important thing, so that's still one of our bread-and-butter programs. The more AI systems that come online, the more ways AI is being found and used to attack all of these systems, and you're gonna need people who can actually implement the defense of all of it. And then there are a lot of the human-relationship-building sorts of programs, like our nursing programs. It's gonna be hard to have a robot perform well as a nurse. The community aspect is important, and if you have an unfeeling nurse trying to take care of you, that's not as good as having a human person asking you how you're doing.

Patrick Patterson:

Yeah.

Stephen Arthur:

It's not really caring if it's a robot, whereas if it's a human, there's some level of concern and emotion behind the interactions. A lot of the programs that are probably going to go away are some of the more automated things. A lot of people say software development is gonna be a thing of the past, but we've only hired more and more software developers as AI has come online, 'cause we're trying to implement all the AI stuff

Patrick Patterson:

Mm-hmm.

Stephen Arthur:

with those developers.

Patrick Patterson:

When we went from punch cards to assembly, and assembly to C++, and C++ to Python, and now Python to vibe coding, we didn't take all the jobs away at each of those moments. The same people learning punch cards 40 years ago are now learning how to use Claude Code and OpenAI Codex. As a coder, I can say it's about logic, curiosity, and solving problems with the technology available. That's all software engineering is, that's all coding is: what tools are available and what problem do you have? Great, let me engineer a solution for you. That is the thing you learn when you code, and you can take it with you no matter the language. So I appreciate that you're telling people it's still important. We're in the same boat; we're seeing that we want more and more people that understand that, not fewer.

Stephen Arthur:

There's all these people that say, "Oh, I can create this whole project with Codex or Claude Code from scratch in a day; it creates my whole app for me overnight." That's just not how the real world of coding works when you're creating applications for enterprise-level companies. That was one of the biggest things I had to adapt to when I started my AI engineering team a year and a half ago. I've coded stuff myself, personal projects and things like that, but it is a wildly different world in enterprise coding. AI just can't do that yet.

Patrick Patterson:

Yeah.

Stephen Arthur:

You have to interpret and help guide it a lot more. It's not quite there where it's automated.

Patrick Patterson:

Well, the human in the loop is so important, and I think "yet" is the right term to use there. I was browsing, I think it was X or something, and someone said, "You know, Hollywood's dead. Let me walk you through how I created a two-minute video using Gemini, Sora, and Kling," and then proceeds to show how he spent 40 hours putting together a 45-second spot, drawing on his 30 years of experience creating ads. And I'm like, no, Hollywood's not dead. That's literally what they're gonna hire you for. What you just did is what they're gonna hire you for,

Stephen Arthur:

one of

Patrick Patterson:

so important.

Stephen Arthur:

of the things that we've realized is that if you were a perfect human, these tools could probably double or triple the amount of code you can put out. But no one is a perfect human. There's one of those laws, I think it's Parkinson's law, that says work expands to fill the time allowed for it. If you give someone a week to get something done, it'll take a week; give them a month, it'll take a month. By giving people these tools, they might be 10% more efficient at getting their work done, but then they're just kind of hanging out the rest of the time. Theoretically, these tools could make people a lot faster. If you are very dedicated and need to be doing something at all times, they will double and triple your output. But most people don't have that sort of personality, and they'll just go on Amazon for the other half hour they saved from using these tools.

Patrick Patterson:

I think that's the difference between the people that will get by and the people that will win.

Stephen Arthur:

Yeah.

Patrick Patterson:

The people who are like, "Yeah, doing what I did before is good enough," those are the people that are gonna get by, and those might be the people that are outsourced to AI in the future. But then there are the people that say that's not good enough: "Okay, I was able to do in five hours what used to take me eight. How can I now reuse and recycle those three hours to uplevel myself, uplevel my team, learn something new, try something new, be curious?" Those are the people that have always won, right? And it will be no different going forward. Those are the people that are gonna win, and those are our next leaders that are gonna change the future.

Myles Biggs:

Before we wrap up, Stephen, is there anything you wanted to say that you haven't yet, or a question we should have asked that you'd like to answer?

Stephen Arthur:

I think I talked about one of our AI systems, the conversational assistant, and we have a lot of other operational stuff that we've been building out, like a financial aid assistant that would help you with repackaging your financial aid. We call our platform ECPI Intelligence, but we're trying to figure out a better brand name for it. We wanted to call it Alfred Intelligence, after ECPI University's founder, Alfred Dreyfuss, who passed away about a year ago. ECPI was founded in 1966 as a coding institution; we were the Electronic Coding and Programming Institute originally, like a coding academy in the sixties. He was always at the forefront of the newest technologies, and we've kind of maintained that culture of always being at the forefront of technology, following his legacy. So it would be "Ask Alfred" about any topic. Although our nurses are an excitable group (nurses are interesting, I'll leave it at that), they've always loved any of these new technologies, and they wanted to tie it to Florence Nightingale. They wanted their own branded one where it's "Ask Florence," after the first nurse: ask about nursing.

Patrick Patterson:

Well, maybe you'll get to a point in the future where you can have, different,

Myles Biggs:

I'd say different voices, right? If you're talking to the female voice, it could be Florence; if you're talking to the male voice, it could be Alfred.

Stephen Arthur:

Yeah, exactly. "Powered by ECPI."

Myles Biggs:

Yeah. Stephen, we've talked a lot about the one application you have for AI, where it's conversational with the students, but where else have you deployed AI throughout the university?

Stephen Arthur:

Yeah, we have AI use cases across financial aid; because it's a very complicated subject, we have assistants that guide people through it. We're currently working on a project to automate our entire IT help desk system. The idea is you send an email or fill out a help desk form with any IT problem, and we'll email you back the solution, or send a follow-up email if we need more information. You can then reply back, and we'll automate the emails back and forth, always referencing the knowledge base articles our IT team has put together over the years. So that would be 24/7 IT tech support for students, 'cause we get hundreds of tickets every week from students, faculty, and staff, and we have certain support hours; we can't staff support all the way till 2:00 AM. All of our students' assignments are due on Sunday at midnight, and often students wait until the last minute. It'll be 10 o'clock at night on Sunday when they're finishing up their assignments, and then they run into an issue and the help desk isn't open. By rolling this out, we'll be able to access the hundreds of knowledge base articles, synthesize a solution for them, and give it to 'em. So that's gonna be a massive help for students. And that's kind of been the theme of ECPI in general over the years: we try to make things as streamlined as possible. Another example of that is that all of our courses are designed exactly the same structurally. One of the problems I had when I was in college was that all the instructors developed their own content and curriculum, so every new semester you had to try to find where the professor put everything.
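The help desk loop Stephen describes (match an incoming ticket against knowledge base articles, then either draft an answer or ask for more detail) could be sketched roughly as follows. This is a hypothetical illustration, not ECPI's actual system: the keyword scorer stands in for whatever retrieval and synthesis their platform really uses, and every name and threshold here is invented.

```python
def score(ticket: str, article: str) -> int:
    """Count distinct significant words shared by the ticket and a KB article."""
    ticket_words = {w for w in ticket.lower().split() if len(w) > 3}
    return sum(1 for w in set(article.lower().split()) if w in ticket_words)

def triage(ticket: str, kb: dict[str, str]) -> str:
    """Draft a reply citing the best-matching article, or ask for more detail."""
    best_title, best_score = None, 0
    for title, body in kb.items():
        s = score(ticket, body)
        if s > best_score:
            best_title, best_score = title, s
    if best_score < 2:  # threshold is arbitrary for this sketch
        return "We need a bit more information; please reply with details."
    return f"Suggested fix (see KB article '{best_title}'): {kb[best_title]}"

# Tiny stand-in knowledge base
kb = {
    "Password reset": "reset your password from the student portal login page",
    "VPN setup": "install the campus VPN client and sign in with your student ID",
}
print(triage("I forgot my password and cannot log in to the portal", kb))
```

In a production version, the scorer and the reply drafting would presumably be handled by the retrieval and language-model layers of their platform; the email back-and-forth is just this loop repeated until the ticket resolves.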

Myles Biggs:

Yeah.

Stephen Arthur:

They all have it set up a different way, and some of 'em even had their own websites you had to go to. It was just very complicated to navigate the administrative side of things. We make that simpler by having a centralized curriculum team streamlining all of that. So one of the value propositions of ECPI is having that streamlined process: we don't have add/drop, you're automatically enrolled in the next classes we know that you need; the IT help desk is automating your IT support; the financial aid chatbot is helping you 24/7. We're trying to streamline the student's whole academic journey, making it as simple as possible so they can just focus on learning instead of having to navigate the administrative side of things. And AI is helping us do that at scale.

Myles Biggs:

Is that AI on the administrative side or in the classroom?

Stephen Arthur:

One thing I have seen across the board is that AI is always implemented first for efficiencies. Everyone uses it to summarize their meetings; they use it to scan articles and give the gist, taking a ton of content and giving you the answer. The hardest thing with AI is to impact the top line instead of the expenses: how do you actually enhance your product? For us, that means enhancing our curriculum, making AI a medium of learning, and giving all of our support staff access to ChatGPT so they can better answer and help students. That's been my main focus: the actual product we're giving to our students, and making that as good as possible. The efficiency side is good, but you kind of fall into that Parkinson's law again, where the efficiencies often don't actually result in any bottom-line difference. We've tried to find the use case and get it implemented, and we've kind of been going low-hanging fruit first. But now that we have our AI platform and the backbone behind it, the code base and all the infrastructure, we're gonna be able to start moving very quickly and implement a lot more of the efficiency stuff. For instance, we get unstructured feedback from students all the time through free-response questions on surveys. Students have to fill out an end-of-term survey after the end of every class, giving their feedback on the class, and we get thousands of free-response comments, which are very hard to analyze one by one. One of the other things we're trying to do to improve the student experience is to be able to act more quickly on some of these things. If a student had a problem with a particular instructor, we can now take all the feedback we get and flag it for things like that so we can escalate it. That was harder to do on a keyword basis in the past.

It's a lot easier to have AI figure it out: just ask it, "Does this student feedback require a follow-up from somebody?" It'll just say yes or no, and then we can flag it and send it to the campus president in charge to have them address it. There are so many things like that that we're gonna be able to start building out now to help improve the student experience; that's where we're going with this.
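The yes/no escalation flow Stephen sketches could look something like this in outline. Again a hypothetical sketch only: the keyword stub below stands in for the actual model call ("Does this student feedback require a follow-up? Yes or no."), and all of the names and terms are invented for illustration.

```python
# Terms the stub treats as warranting follow-up. A real system would
# ask an LLM the yes/no question instead of matching keywords.
FLAG_TERMS = ("instructor", "unfair", "broken", "rude", "complaint")

def needs_follow_up(comment: str) -> bool:
    """Stub for asking a model: 'Does this feedback require a follow-up?'"""
    text = comment.lower()
    return any(term in text for term in FLAG_TERMS)

def triage_feedback(comments: list[str]) -> list[str]:
    """Return only the comments worth escalating (e.g. to a campus president)."""
    return [c for c in comments if needs_follow_up(c)]

survey = [
    "Great class, learned a lot.",
    "My instructor never returned my emails.",
    "The lab equipment was broken for two weeks.",
]
flagged = triage_feedback(survey)
```

The appeal of the LLM version over this keyword version is exactly what Stephen notes: the model can judge intent rather than vocabulary, so "I felt ignored all term" gets flagged even though it contains no obvious keyword.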

Myles Biggs:

That's exciting. so people wanted to learn more about what you're doing with ai, just about the school in general. Where should we send them after they listen to this podcast?

Stephen Arthur:

Sure. We have a page that describes it. If you Google "AI and education ECPI," that page shows the different ways we're implementing AI into the student experience at the curriculum level. A lot of the other ways we're using AI you're not gonna be able to find, because that's one of our value adds; we're still trying to figure out how to start talking about these things with prospective students to show what we're doing with AI and how it's being used. So for now, it's "AI and education" at ecpi.edu; you can find it on Google.

Myles Biggs:

Awesome,

Patrick Patterson:

That's great.

Myles Biggs:

and we've got your LinkedIn profile, which we can also put in the show notes along with that link for ECPI, so people can come chat if they wanna talk to the man himself.

Stephen Arthur:

I should plug one of the papers that I and a group of other marketing and admissions professionals put together for CECU, the Career Education Colleges and Universities, at career.org. We put together a best-practices guide on how to use AI in college marketing, admissions, and enrollment management. You can go to career.org and it should be on their website; you can find that paper if you're interested.

Myles Biggs:

That's great. Stephen, thank you for your time today. This has been a great conversation; really appreciate it. Thanks to everyone listening. Be sure you hit that subscribe button on the podcast so you don't miss any future episodes like this one. We'll see you next time.