Admit It, An AACRAO Podcast
Join the conversation with Admit It, an AACRAO podcast that serves to educate, amuse, and inspire professionals in college admissions and enrollment management.
Re-Evaluating Program Success: A New Model for Non-Traditional Program Evaluation
In this live episode of Admit It!, host Dr. Alex Fronduto sits down with Dr. Michael Williams, Executive Director of Strategy, Innovation, and Effectiveness at Embry-Riddle Aeronautical University, to explore his SEM Conference presentation: “Re-Evaluating Program Success: A New Model for Non-Traditional Program Evaluation.”
Michael dives into how institutions can rethink what “successful” programs look like—especially those serving non-traditional, adult, and online learners. He introduces a new evaluation model that moves beyond traditional metrics to incorporate mission alignment, learner outcomes, market demand, and long-term sustainability. The conversation highlights why this shift is necessary, how Embry-Riddle developed its framework, and what lessons other institutions can apply when examining their own program portfolios.
Host:
Dr. Alex Fronduto
Faculty Lead, M.Ed in Higher Education Administration & Associate Teaching Professor
Northeastern University
Guest:
Dr. Michael Williams, Executive Director of Strategy, Innovation, and Effectiveness, Embry-Riddle Aeronautical University
Welcome to Admit It, an AACRAO podcast, the podcast where we dive into the people, ideas, and strategies shaping the future of enrollment management. I'm your host, Doctor Alex Fronduto, the faculty lead of Higher Education Administration at Northeastern University. And today we're talking about the Strategic Enrollment Management Conference that AACRAO held back in November 2025. I'll be interviewing a variety of presenters so we can discuss the findings they shared at the conference, for those that did not get to go to their session, or for those that weren't able to take time to go to the conference at all. I hope you enjoy hearing all of these different stories over the coming months, and as always, if you'd like to be on the podcast, don't hesitate to reach out.

Hello and welcome to the Admit It podcast. This is Doctor Alex Fronduto, your host. I'm live at the SEM Conference, excited to speak with Michael Williams about his session on re-evaluating program success, a new model for non-traditional program evaluation. Thanks so much for being here. Thank you so much for having me. So, obviously we have people listening to this that may not have been able to come to SEM, or maybe they weren't able to attend your session. So I would love to talk about your session, but first, let's talk a little bit about you. Who are you? Sure. Well, that's actually a very loaded question. Um, so I usually go by Doctor J or Doctor Williams, whatever fits people's fancy. Um, my rise through education has been a very interesting one. My original path was supposed to be a medical doctor, so I tell people I became a different type of doctor. That's how I saw it. Um, but I came to education watching my mom, who worked in higher ed for over 30 years, and it got me into that mode of service, and I found that I had this passion for seeing other people succeed.
Um, we always say you don't enter education because you want to become rich. You enter education because you really have a passion for the work, um, helping people achieve their goals, and that kind of led me through the path at Embry-Riddle. I've been there just over 17 years. Oh, just a little while, then. No, just a small drop in the bucket. And during that time, we've kind of pushed the boundaries at the worldwide campus in terms of defining what the student looks like today, the needs of the student, and ensuring that the institutional model meets the student's needs where they are. You know, it's kind of the opposite of how we typically see things, where we usually look at education, we push the degree out, we tell them this is what they need and this is what they do, versus the other way around, where we take into account what they're trying to accomplish throughout the process. Um, and that led me down the path of exploring and researching this whole dynamic between traditional and non-traditional, and I know AACRAO loves to use post-traditional. I kind of tossed all of those out the window, so I went down this path of years of researching and decided that I think we should call them modern students. We're in modern times, so let's call them modern students. So my drive for education has been looking at how we align resources to meet the needs of the modern student, um, but then I had to take a step back and say, wait a minute, I need to define what I mean by the modern student. So that has kind of been the framework of my existence in the higher ed space.

And so, uh, talk to me a little bit more specifically. You've been at your current institution for 17 years, all within similar departments. Where have you been, you know, prior, and where are you now?
So I've been in the worldwide campus, which we call the non-traditional campus, the global campus of the institution, with just over 24,000 students. I came in, I believe, as the manager of online recruitment under the AERO unit, which was Academic Enrollment Recruitment Online. We love acronyms in education, right? So from there, I kind of elevated through enrollment management as they saw the need to really formulate what enrollment management meant at that time. So I've evolved from online to enrollment management, to administration and business operations, to, um, now my current home under student success. So, still within the worldwide campus of Embry-Riddle, but I've moved around, and my title has grown to be somewhat that of a paragraph. Don't you love how that happens?-- I love how that happens-- and so, uh, are you thinking about SEM? Of course, you started in the recruitment area, so thinking enrollment. Now are you thinking mostly retention in your world, or still everything? I say I serve at the pleasure of the chancellor. So I love that term. You know, they say never, ever get the title "special projects," but sometimes I feel like that's what I am. I've done things from taking programs that needed some structure and giving them structure. I've done creation of departments, structured them and handed them off to someone to run. I've done enrollment analytics. Um, so, a little bit of everything. I call myself the Swiss Army knife of higher ed. OK, hey, that's a great skill set to have and very common for people that work at the same institution for a long time. Absolutely. That is the paragraph title. It is the "other duties as assigned."-- So-- and it's the institutional knowledge that you just can't make up on day one. Uh-huh. I definitely understand that. So what led you to this specific presentation?
And so again, you're looking at re-evaluating program success, and you say non-traditional program evaluation. So talk to us a little bit about what made you want to present. Sure. So I put the period on the last sentence of what I define the modern student to be in 2023, and I said, OK, what's next for this? Now that we've defined what the modern student is, it's time to look at the tools we use to evaluate students and ask, do they work or not? And what I've done for the last two years is kind of refine what I call the degree progress report, which is the core of the presentation that I'm giving here at AACRAO, um, and it takes into account how modern students move through degree programs. Uh, 'cause when you think about it, if we look at how IPEDS, how NCES, defines students, it's a coin toss as to whether they're included or not. I mean, over 50% of students are excluded from the definitions that are defined there. So I say let's move into a realm of functional analytics where we torpedo this concept of what we need for IPEDS. Obviously we still have to provide those, of course, but what do we need to be effective in day-to-day decision making at the institutional level? So after all of that research was combined, I started writing the paper, and I was like, maybe I should try to submit this to AACRAO as a presentation. It was accepted as a presentation, and then it'll also be published in the summer (July) issue of SEM Quarterly. Awesome. Congratulations. Thank you. Thank you. Lots of work leading up to that point. I'm excited about it. I'm hoping that this modern student concept will catch on, and maybe I'll be in history books. I love that. OK. And so you said you were looking at different data points potentially, so you structured a different report. So talk a little bit about that. What was it? What was included in it? For someone that's listening to this podcast and didn't go to your session,
like, what are some key things that they should know? So first, I structured this to look exactly like a report card. I said, what is one thing that we understand in education outside of anything else? We've all seen report cards: we've gotten them, we've given them, we agree with them one way or the other or somewhere in between. So one of the first metrics there is how effective are we in moving from an admitted student to an enrolled student, and that is 15% of the overall grade in the model. And what that does is tell us how attractive this program is in converting people. And if we're doing good there, the next metric that I have in there is called eligible to active. Um, now, this is a little bit different from the traditional side, where if they deposit, they're on campus, they're good. But when we're talking about the non-traditional student, or the modern student, they're not depositing. They're not showing up to a dorm. So we can move them from admit to enroll, but then once they're active in our system and can take courses, how many of them are actually taking courses? So I call that metric eligible to active, and the weight for that is 25% of what becomes the overall grade. That tells us the student engagement and the persistence component of the process. And then there's course load. Some people will call it take rate, or the number of courses that were taken in a year. I just collapsed it and called it course load. That's 10%. That tells us essentially how many credits they are taking in an academic year. And the interesting thing I did, probably written on many whiteboards, was figuring out, for an undergraduate student and a graduate student, what should be the minimum number of courses they're taking in an academic year to effectively move through a program at a rate to complete. And I'm not considering it on a 4-year model, um, and that's the beauty of it.
Institutions can take the same model and say, we want our undergraduate students coming out in 6 years. Fine, you can then adjust that component of the measure to account for a 6-year rate, and then you can see the minimum number of courses they need to take in a year to complete against the number of credits in your degree program. Same thing for graduate programs. We just did the work of converting all of our graduate programs from 36 credit hours to 30 credit hours. Wish it was that way when I was going through my master's program. And I was thinking, if other institutions use this and their model is not exactly the same as ours, is this modular enough to allow them to account for their timelines and the credit hours involved in their program? So course load is 10% of the overall grade. Um, then we get to the exit points, what I call negative attrition, or what we just say is attrition in general: the discontinuation rate. These are people that are leaving the program. They're not graduating, they're just stopping out, or not telling us anything until we get to the date that we finally discontinue them. And that accounts for 20% of the grade. So the lower that percentage is, the better you're doing at retaining students in the program. And I'm trying to also escape that model of saying "retain." I went to a conference, an AACRAO one, one year, where the presenter said we want to retain good faculty and staff; we want students to persist. So ever since then, that's kind of been in the back of my mind: OK, we're not retaining students, we're retaining good workers, and we are persisting students through the process. I haven't been successful in changing the industry's understanding of those terms. Uh, maybe that's the next assignment. I mean, that's a whole other conversation. It's a whole other conversation; then I'll get into retention versus graduation rates, which are also two different terms. Exactly.
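The course-load calculation Michael describes above (the minimum number of courses a student must take per year to finish a program of a given size within a target timeline) can be sketched in a few lines. This is an illustration only, assuming 3-credit courses and a hypothetical function name, not Embry-Riddle's actual implementation:

```python
import math

def min_courses_per_year(total_credits: int, target_years: float,
                         credits_per_course: int = 3) -> int:
    """Minimum whole courses a student must complete each academic year
    to finish total_credits within target_years."""
    credits_per_year = total_credits / target_years
    return math.ceil(credits_per_year / credits_per_course)

# A 120-credit bachelor's on a 6-year timeline:
# 120 / 6 = 20 credits/year, or 7 three-credit courses per year
print(min_courses_per_year(120, 6))   # 7

# A 30-credit master's on a 3-year timeline:
print(min_courses_per_year(30, 3))    # 4
```

Swapping in a 4-, 5-, or 6-year target, or a different credits-per-course figure, adjusts the threshold, which is the modularity Michael points to.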
A whole other study, worthy of a Nobel Peace Prize, I'm sure. And then the final measure, what I call positive attrition, is graduation rate, right, because that's what it's about. If we're not graduating students, then what are we doing? So it is the largest metric, at 30% of the weighted score. Um, but I tell people, it is not the only component. Sometimes we look at, oh, how many are graduating, but we forget about everything that leads up to that point that is a predictor of graduation. And so we want everyone to have successful program completion, and what that means for your institution is also modular, just like the other components, like course load. So if a student should be coming out in 6 years at the undergraduate level, you can adjust the scale for that based off of your institution. This does not mean that traditional campuses can't use the model, because they can back that down to a 4-year or 5-year as well. Um, so that's one of the things we kind of stress-tested with this particular model, to see if it can work across different arrangements or setups, institution types.

And so, really, while some of the enrollment data is there in the beginning of what you just went through, you're still really looking at success defined as the students that come in and the students that leave, and everything that goes in between. Some would argue that program health in general also involves market data, market research; the number of students you're enrolling will also predict whether your program is successful, right? Because if you have 5 students and they all graduate, in this scorecard they would be great, they're at 100%, right? So how do you take that into account, or is there a completely separate model for how you're looking at that kind of market data and enrollment data to, you know, essentially predict whether your program is successful? Sure.
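The five weighted metrics Michael walks through above (15% admit-to-enroll, 25% eligible-to-active, 10% course load, 20% discontinuation, 30% graduation) combine like a report-card grade. Here is a minimal sketch, assuming each metric is a 0-to-1 rate where higher is better (so discontinuation is inverted before weighting), with made-up letter-grade cutoffs and sample numbers; the published model may define all of these differently:

```python
WEIGHTS = {
    "admit_to_enroll":    0.15,  # conversion: admitted -> enrolled
    "eligible_to_active": 0.25,  # engagement: enrolled -> taking courses
    "course_load":        0.10,  # credits taken vs. minimum needed per year
    "discontinuation":    0.20,  # stop-outs (lower is better; inverted below)
    "graduation":         0.30,  # positive attrition: completions
}

def program_score(metrics: dict) -> float:
    """Weighted 0-100 score; discontinuation is inverted so that a low
    stop-out rate contributes a high sub-score."""
    total = 0.0
    for name, weight in WEIGHTS.items():
        rate = metrics[name]
        if name == "discontinuation":
            rate = 1.0 - rate
        total += weight * rate
    return 100.0 * total

def letter(score: float) -> str:
    # Illustrative cutoffs only; the article may define its own scale.
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if score >= cutoff:
            return grade
    return "F"

# Fabricated rates for one hypothetical program:
example = {"admit_to_enroll": 0.60, "eligible_to_active": 0.75,
           "course_load": 0.80, "discontinuation": 0.20, "graduation": 0.65}
s = program_score(example)
print(f"{s:.1f} -> {letter(s)}")
```

Because the weights sum to 1.0, the result reads directly as a 0-100 grade, and an institution could re-weight the components or add new ones (for example, a prospect-pool metric) without changing the structure.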
So even though there's no component right now in the model for prospects, it does take into account all of those that have applied and then been admitted. It just happens to pick up from the admit point, but it looks at the total number available in each of those categories against who's actually doing something, who's going from potential to kinetic, of course, inside of that. So that is one measure. And then I've also kind of batted back and forth: do we then bring in industry data? So what are they doing after graduation, or what were they doing before coming in? Those have not been incorporated, but I do have a future work section inside of the study where I suggest that those things be explored where applicable. No, I mean, that makes complete sense. I think it's always about finding the balance between too much and too little, right, and that's a delicate dance. And so, you know, I've seen a lot of similar types of scorecards for programs, and it's hard to determine that difference, because you want to do the entire funnel, right, all the way through graduation, and are they getting jobs within their field, is it within 6 months, you know, all of these different data points. But like I said, you could skew your report if you only have 5 students, right? And so, you know, I do think looking at that piece is crucial for institutions to think about; you kind of have to have this and that. Absolutely, absolutely. And so have you looked at any of that, or have other people in enrollment? Like, how are you comparing: OK, we're retaining these students, they're enrolling at a high rate, which is great, but is the volume essentially accurate, right? Are you meeting the need of what the world is saying? Like, if there's supposed to be this many students in these programs, are you taking a good chunk of that? Right now, no, the model does not consider that.
So the next phase for me is bringing in the prospective student pool, awesome, um, and seeing how we weight that in the scale and how we rebalance the other metrics to account for it. So prospective students would be next. I would love to move into kind of the post look, which is, where did they go after? Are they effectively placed or not afterwards? Um, I think the thing that kind of skews it for us at Embry-Riddle is that the majority of our students, about 90% of them, are employed. In fact, 50% of the population we serve is military. So I think what would be appropriate is if we could get an institution that is not balanced that way, that serves more non-military students, and look at how we build a scale for that, and then I think that will bring it full circle and we can stress-test the elements that you're talking about. That's awesome. No, I think, of course, every model is gonna go through a lot of iterations. And, as you said, you've kind of looked at it and graded. So have you actually used it on all your programs at this point? Like, have you tested the criteria? Yes, we've run it against 44 of our degree programs, looking at two different academic years, and we did things like run an ANOVA and t-tests on them to see whether these things are statistically significant or not, and four of the five measures showed strong Cohen's d values above one. So we're kind of excited about that, but I'm cautiously optimistic, because I want to see other institutions take this and test it, to see if it yields the same output for them as well. Um, so I would say yes, across the 44 programs we looked at, it has been a strong discriminator of high-performing programs against low-performing programs. In fact, those that performed well showed almost double the graduation rates of those that were not performing well under the model.
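The effect-size check Michael mentions (Cohen's d above one for four of the five measures, comparing high- and low-performing program groups) can be reproduced with a pooled-standard-deviation calculation. A sketch with fabricated sample rates, purely to show the arithmetic:

```python
from statistics import mean, stdev

def cohens_d(group_a: list[float], group_b: list[float]) -> float:
    """Cohen's d with a pooled standard deviation: the standardized
    difference between the two group means."""
    na, nb = len(group_a), len(group_b)
    sa, sb = stdev(group_a), stdev(group_b)
    pooled = (((na - 1) * sa**2 + (nb - 1) * sb**2) / (na + nb - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / pooled

# Hypothetical graduation rates for high- vs. low-scoring programs:
high = [0.58, 0.62, 0.55, 0.60, 0.64]
low  = [0.30, 0.28, 0.35, 0.31, 0.27]
print(round(cohens_d(high, low), 2))
```

By the usual convention, d around 0.8 counts as a large effect, so values above one, like those reported for the 44-program test, indicate clear separation between the two groups.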
Um, now, one of the contentions with the deans and academics is they feel like they're being graded, of course. But what we tell them is this is not about grading academics. In fact, no component in here looks at what the faculty or academics are doing. What this is designed to be is an advising tool, to help advising leadership figure out how they need to support and approach students, but then also work with the colleges to figure out, hey, our students are leaving your program because of the stats course, right? Like, why are they leaving? So it opens the door to look at, well, what's going on with the stats course, that this is the exit point for our students. So it's meant to be a conversation driver versus a punishment point to say your program's got to go. In fact, most programs are performing at a C, which we would say is average. No program got an F. No program got an A. Oh wow, OK. Yeah, so there are some that are very close to an A, but they have not hit the A mark. OK, that's interesting. I mean, again, you know, there are many different models, especially thinking of academic quality assurance, like AQA type of stuff, and so it's interesting to see. You can imagine that curve, that bell curve most likely, I'm assuming, that most of them sit in, and like you said, there's really no tail at either end. Yes. I would love to see more programs move into that B and move into that A, and then those programs we evaluate and say, what's going right with them that we can template to other programs? Um, but then the other component is, when we're talking about leveraging institutionally funded scholarships or doing other programs to support students, does it actually have an impact, right? So it's a way to look at, when we spend money, are we spending it in the right place? OK, wow.
Well, I assume that especially when the written article comes out, that will be a way for people, as you said, you want people to test this. So is that gonna be a way that people can figure out how they can test it? Yes, so it'll include more details, and I'm trying to figure out a way to package the calculations. Um, this is built on top of Microsoft Power BI, OK, great, so what I want to do is figure out how to package that so that other institutions can just do a plug and play and make minor tweaks as they need. So I haven't quite figured out that part, and how to structure it properly,-- but I love that the intent-- is there.-- The intent-- is there, yeah. And we know in this field sometimes not everything is always sharing and caring, and so I appreciate and-- applaud you wanting to share that, because-- I hope to see this continue. Um, I would not be sad if I read an article one day where someone completely refuted the work that I did. That's par for the course, right? I love that. Well, again, thank you so much for sharing all of this. As I've been asking everyone: obviously you're at SEM right now, we're recording in someone's presentation room. So, what is valuable about SEM to you? You've obviously come before. So if someone's thinking about coming to SEM, what would you say to them? I would say do it. Find the money in your budget to do it, because what I've discovered in coming here since, I think, my first AACRAO SEM in 2013, is that above and beyond the interaction with people from different institutions, you learn that you're not alone in trying to solve some of these problems. We love to think that we're on an island by ourselves, but you get to hear different perspectives around the same challenges. You get to hear how some people have attempted to solve them, good or bad or in between. It's just that opportunity to connect with like-minded individuals and really see what's being done in the industry, beyond just opening a journal article and reading about it, or going to the Chronicle and reading about it. It's that opportunity to have a real-time dialogue about topics that are important to the education space. I love that. Well, thank you, Doctor Williams, for being on here, and I can't wait to see your article. I'm excited for you to read it.