In the Field: The ABA Podcast
Welcome to In the Field: The ABA Podcast, hosted by Allyson Wharam. This podcast is a resource hub for Board Certified Behavior Analysts (BCBAs), business owners, training coordinators, individual supervisors, and graduate students accruing fieldwork in ABA.
Allyson, the creator of Sidekick, an innovative online curriculum and learning portal for behavior analysts, dives into the nuances of ABA with a focus on quality supervision, which she believes is the cornerstone of the field. Each episode offers information on topics relevant to ABA professionals, ranging from effective strategies for supervision, innovations in the field, to practical advice for improving service quality and outcomes for clients.
In the Field: The ABA Podcast is not just a show; it's a community for those who are passionate about enhancing their knowledge, skills, and practices in ABA. The podcast features interviews with experts, discussions on emerging trends, and actionable tips to help listeners invest in their professional growth and the advancement of the field.
Whether you are driving to an in-home session, taking a break in your busy day, or seeking inspiration and guidance, this podcast is your companion in fostering excellence in ABA. Join us as we explore, learn, and grow together in the field of Applied Behavior Analysis.
For more resources and information, visit our website at www.sidekicklearning.net.
Common Training Mistakes in ABA Organizations (and What to Do Instead)
Podcast Episode: Training in ABA Organizations with Shannon Biagi
In this episode of In the Field: The ABA Podcast, I sit down with Shannon Biagi to unpack common training mistakes we see across ABA organizations and beyond.
We talk about why training is often treated as the default solution, how content-heavy approaches fail to build real skills, and what it actually takes to design training that leads to performance. From “sit and get” learning to one-and-done onboarding, we break down where things go wrong and how to approach training in a more effective, sustainable way.
Key Topics:
- Why training isn’t always the problem or the solution
- Designing for performance, not just content
- The limits of passive learning
- Moving beyond one-time training
- Evaluating whether training actually works
- Why seniority doesn’t equal training ability
Key Takeaways:
- Training should be driven by performance needs, not assumptions
- Designing for behavior change requires more than delivering content
- Active practice, feedback, and actual application are essential for skill development
- Training should be ongoing, not a one-time event
- Evaluation should measure what learners can do, not just what they experienced
- Building a strong training system requires both instructional design and performance analysis
Resources Mentioned
- Performance Diagnostic Checklist – John Austin
- Behavioral Engineering Model – Thomas Gilbert
- Kirkpatrick’s Four Levels of Training Evaluation
- The Mager Six Pack – instructional design and performance objectives
Connect with Shannon Biagi
- Website: chiefmotivatingofficers.com
- Training and CEUs: motivate-u.chiefmotivatingofficers.com
- Social Media: Instagram | LinkedIn
Subscribe to the Podcast:
Don’t miss more conversations on supervision, training, and leadership in behavior analysis. Subscribe to In the Field: The ABA Podcast and explore more resources at www.sidekicklearning.net.
Disclaimer:
BCBA®, BACB®, and any other BACB® trademarks used are registered to the Behavior Analyst Certification Board® (BACB®). This website and its products are not in any way sponsored by the BACB®.
All information and products are for educational purposes only.
Shannon Biagi
===
[00:00:00]
Shannon Biagi: Dr. Binder said this quote: saying your organization has a training problem is like saying someone with a headache has an aspirin problem. Isn't that nice? What he's referring to is that oftentimes consultants like me are brought into organizations because, oh, we need a training.
There's a deficit in our training. And I'm like, well, is there?
Allyson Wharam: Hello. Welcome everyone. I'm here with Doc... are you doctor yet? No. I'm also in the, like, very final throes of that process, and so my brain is just, uh, yeah, I think you're in a similar spot too. Okay. All right. Not yet, but I'm here with Shannon Biagi of Chief Motivating Officers. Shannon, I'll let [00:01:00] you introduce yourself a little bit, but today we're gonna be talking about training and some of the common pitfalls and mistakes that happen in ABA organizations. But, as we were talking about, this is not exclusive to ABA; these are broader training mistakes that we see happening as both behavior analysts and people with a background in instructional design and OBM.
Shannon do you wanna go and introduce yourself?
Shannon Biagi: Sure. For those of you who don't know me, hey, my name's Shannon Biagi. I am originally an organizational behavior management practitioner, so the science of human behavior, but we take that and we apply it in workplaces. And I had the opportunity to actually get a dual master's degree in both applied behavior analysis, the clinical application, and organizational behavior management.
And what I did, what you were referring to, Allyson, is when I went to get my doctorate, which I'm still working on, I decided to go into [00:02:00] instructional and performance technology. So I'm currently a doc student in instructional and performance technology. I've completed the data collection for my dissertation, so we're on our way, folks.
Anybody who's gone through a doctorate program knows that it feels like it's never ever going to end, but we'll get there.
Allyson Wharam: Yeah, I am in the exact same phase and I'm very much feeling the same way, so I'm with you. Thank you so much for being here. I'm excited to talk about these things. And as we were talking, it's funny that we had both generated this list of, like, training pitfalls that we see.
And there wasn't a one-to-one correspondence, but I think if you were to take IOA, it was pretty high in terms of what we're both seeing in terms of some of these. Let's just jump into the first one, and we had identified, I think, two sides of the same coin.
So the first one is assuming that training is always the problem or the solution. So defaulting to training, and this could look one of two ways. One is, saying we [00:03:00] have a training problem without actually pinpointing the performance or identifying what the actual issue is. And the other side is maybe you've pinpointed that performance, you know that there's a specific gap, but you're assuming that training is going to solve that gap.
So talk to me a little bit about this one.
Shannon Biagi: Yeah I always go back to a few summers ago it was probably, it's probably longer than I think because COVID in the middle of everything just makes like a weird time warp. But I went to Dr. Carl Binder's summer Six Boxes, summer Institute, one of the best experiences I've ever had. And Dr. Binder said this quote that saying your organization has a training problem is like saying someone with a headache has an aspirin problem. Isn't that nice. I just what he's referring to is oftentimes consultants like me are brought into organizations and they're bringing me in because, oh, we need a training.
[00:04:00] Like, there's a deficit in our training. And I'm like, is there? Because maybe training isn't the problem that you're having; why are we talking about this? That forces them to actually think about, why am I saying training is the problem? Where is that performance deficit? Like you said, what's the actual pinpoint that we're concerned about?
And then that opens us up to actually do some kind of assessment and determine, is training an issue? And as instructional designers, one of the first things that you learn really early on is that a lot of the time training's not the solution. It's also not the problem, but it's not the solution. Because there are so many other things going on in a system within an organization where there could be gaps.
So from your experience, do you see that as well?
Allyson Wharam: Oh yeah. And I think it's ironic, like you said: people think instructional designer and they think, oh, we're [00:05:00] just focused on the training aspect. But that really is the first piece: how does instructional design fit into this broader human performance technology system and way of looking at things?
Because training is sometimes the solution or part of the solution, but we shouldn't default to it. And actually that's one of the most common things that I see is that we're just defaulting to training for any sort of problem regardless of what the training looks like. That's what we'll get into next, but it's just assuming that training will solve that problem or address that need or that gap.
What are some of the frameworks that you use the most in terms of, or that you would recommend an organization would focus on or use during that performance analysis process?
Shannon Biagi: Yeah. So first, making sure that you've pinpointed what the issue is, because it's great to have an assessment tool, but if you don't know exactly what you're assessing... we're not assessing the entire system, we're assessing a specific component of [00:06:00] performance. So, okay: my clinicians aren't turning in their session notes on time.
Alright, let's focus on that. You can't just say broadly, performance stinks, because there are gonna be all kinds of different recommended solutions, potentially, if you don't know what it is that you're working on. Then, when it comes to actual structures and things that we use, the Performance Diagnostic Checklist by Dr. John Austin is a classic and is probably the most referenced and peer-validated within the field of ABA as far as assessment tools. So for listeners who maybe aren't yet familiar, go get familiar. It's basically a series of yes-or-no questions divided into four categories of potential variables that influence performance.
So it's like antecedents and information, knowledge and skills, which is training.
It's equipment and processes, and then we have consequences, [00:07:00] oversight, things like that. There have been modifications for human services specifically, so ABA folks, you might go find Dr. Wilder's version. Wilder and Carr did the PDC for human services.
And it just forces you to think about all of those other variables and not just, oh, training is where we need to go with this. So that tends to be, for ABA folks, where I go. But there are others. Like I mentioned, Dr. Binder... I keep his little Six Boxes, which is just a riff off of Thomas Gilbert's behavioral...
Allyson Wharam: I think most of them are...
Shannon Biagi: Dr. John Austin took the behavioral engineering model and was like, how can I make this user friendly? Yes or no questions? Consolidate a few things. So a lot of these are riffs off of the classic behavioral engineering model from Thomas Gilbert. Feel free to pick up a copy of Human Competence.
He will completely trash behavior [00:08:00] analysis in the first chapter, but just push through that. He believes that outputs and outcomes are more important than behavior itself, which is valid when you start getting into actual performance analysis. Dr. Carl Binder's Six Boxes is another good one.
Allyson Wharam: Yeah. No, that's great. And yeah, I think, like you said, most are variations of the same thing. There are a couple of other little ones, but if you look, you'll see consistency across all of them. And I am glad that you said that, 'cause I don't wanna dive into it, but even as we're looking at pinpointing a specific performance, without getting into all of the details of, like, systems analysis and things like that, but really considering: is that performance important?
Is it relevant? Is it related to something? Is it worth addressing? And will addressing that performance actually lead to something meaningful? Is that performance the actual cause of [00:09:00] the problem that you're experiencing? And I know I, we had also talked about the fact that, any one of these questions or things could be an hour plus.
So we're just doing a high-level overview, but the essence is that performance and training are a lot more complex than just, hey, let's talk about this specific topic or content. Which leads to the next one, which is designing the training itself around content versus a performance.
And what this tends to look like, what I've seen it happen in my own experience is you're asked to provide a training on a specific topic for a specific audience. So I've done a lot of school-based work, so it'd be like, oh, I want you to talk about behaviors, do a training on behavior for teachers.
Which again, for all of the reasons we just discussed is what do you do with that information? But then it can lead to this problem where you'd then just think about okay, what is everything that I know that [00:10:00] is interesting about behavior that I could talk to these teachers about?
So I'm given this general topic, let me just think about the content that's related to that. And then that's my training. So what do you see happening around this, and what's your experience there?
Shannon Biagi: Yeah, I would say what I see a lot of, especially 'cause I'm coming from a consultant perspective, is people come in to share the things that they feel like sharing about, not necessarily what the end user, the performer, actually needs to know. And in ABA we are so enthusiastic about all these niche little nuanced things.
We love to get into the nitty-gritty details of language, the way that we use language, like positive and negative reinforcement, and we stress this and, like, grind it into people. Is that the important thing about it? Like, for somebody's, let's say, first two weeks on the job.
Taking [00:11:00] jabs at them every time they mix up a positive and a negative reinforcement.
Is that the valuable thing? Or are we just saying you did a thing and the client's behavior increased. That's awesome. We love our language. We love our acronyms. We love to share what we want to share, not necessarily what the performer needs in the moment.
Allyson Wharam: Yeah, we are ultimately, I think most of us got into this because we love the science and it's interesting and there are so many amazing things you can do with it. And when we're thinking about training itself, we have to really think about, again, what is the actual need that I'm trying to address?
So based on that gap that we've identified, that pinpoint, who are the relevant learners, who are we actually designing this training for? What do those learners look like? Where are they orienting to the training from? What is their background in this? As we're thinking about the topic for teachers, have they already been exposed to a lot of those things?
Am I just being redundant? Is this relevant to the role and how they're actually gonna use the information? And why am I [00:12:00] covering this? And one of the most important pieces of that is objectives, which you had noted. A lot of people don't take them seriously, and I agree, and I think that is because of how people use them or think about them.
So what have you seen in regards to objectives?
Shannon Biagi: I think people treat objectives like a formality, especially because in the field of behavior analysis specifically, we're required to have objectives for our CEUs, let's say. Then it becomes a, let me just check this box, 'cause really I just wanna talk about this cool thing that I'm interested in.
I am not thinking about what I actually want them to do with the information. And I think that's critical, especially when you were talking about teachers. If a teacher goes into a training and they leave and they cannot do anything differently... they've got a definition, they've got information, but I still go back to my classroom and I struggle to manage what's happening [00:13:00] in my classroom.
Why? Why did we do all of that? When you give an objective and you're like, okay, our objective is that after this training, this person will have the skills to deescalate behavioral excesses in their classroom, and all of the sub-skills that go with that, now they've actually got something that they can use. And because we've turned it into this checkbox thing, people just don't think of objectives as guidance for everything that comes in your design after that. It should be guiding your content. It should be guiding your evaluations. And we'll talk about this later: how are you evaluating whether your training's effective? If you didn't have an objective for that training in the first place, it's like, why did we do all of this?
Right?
Allyson Wharam: Yeah, exactly. And I think it's the idea of backwards design, is technically what we call it: we start with the objectives [00:14:00] and then we design everything around those objectives. And it makes sense when we talk about it in terms of students or clients. We would never just do a bunch of activities and then write goals for that client based on the activities that we've decided to do, or things that we've decided to structure our session with.
We start with what we are hoping to help this learner, this client, achieve, and then, how do I actually teach and train that? But so often with training, it's flipped. It's, what do I want to teach? And then, let me design the goals around what I've decided I wanna teach. And so then everything else from there is skewed, because you don't have that focus of where you're actually going.
And it actually makes it much harder, I think, to design the content because then you're left with the breadth of everything. So again, back to that, if I'm designing a training on behavior for teachers, that leaves like the whole [00:15:00] world of possibilities for things to talk about. And then it's really overwhelming for me to design a training and feel like I'm doing anything meaningful.
And then it's overwhelming for them, because it turns into this spray-and-pray approach: I'm just gonna cover all of this content and talk about all of these different things that, again, you might leave and go, okay, that was interesting, but what do I do with this? And I think I've experienced this during CEUs.
I've been guilty of this in the past. And so I, yeah, I think this is one of the simplest things though, of everything that we're gonna talk about. If you start with this, I feel like it's the anchor for everything else, like evaluation. So yeah. Anything else around this sort of gap that you see?
Shannon Biagi: I would say solution. 'cause as I was going through, am I just...
Allyson Wharam: Talking about the problems...
Shannon Biagi: Am I just whining about all these things? And I was like, okay, for this one, know your audience first. As a consultant and I need to know the [00:16:00] audience. Am I training teachers? Am I training technicians or these first years?
And what are they experiencing? Because that also builds the MO for them to actually pay attention. So if I say, Hey, I'm gonna help you solve this problem that you're having. I'm gonna make your classroom management so much simpler because I'm teaching two things that they need.
So that goes back to that concept of doing an actual needs assessment: learn about who the performers are and what they need to be successful, not what I feel like sharing. So you could do a little survey. Like, I'm doing this for a conference later in the year. It's for an association.
And I was like, okay, can I send a survey out to your association to talk to them and get their challenges? What are your top three challenges with this thing, that I'm gonna talk about so that I know that audience needs to know this, and this, so we can actually solve that problem.
Allyson Wharam: Yeah, absolutely. Surveys are a great thing [00:17:00] in terms of a quick and easy needs assessment. Or, if you are able, sitting down to do a quick interview with the actual performers. Yeah, a focus group. It does not have to be fancy... you can certainly do a very formal process, but the gist here is to really think about and tailor what you're doing to the person that is gonna be receiving the training. And part of the needs assessment process might be the actual performer, but also who has identified the need... again, going back to that broader performance system. If it's the principal that's asking for the training, why is the principal asking for the training?
What is the gap that they're seeing, such that they've come to you and said, hey, they need this training?
Okay, the next one, this is a really big one. So once you have your actual training, your objectives, the next problem that I think we both identified is treating exposure as training.
I don't know if you've ever heard the phrase "Telling isn't training," but that's what I'm thinking about here. This looks like [00:18:00] set, sit and get training. So you're just you have a PowerPoint up, you're talking about the content. You're sharing all of this information, and then the learner is this like passive vessel who's just taking in everything that you're talking about over the hour or however long that you're training.
So talk to me a little bit about this. What led this to be one of the things that you had identified?
Shannon Biagi: Yeah, so I talked to a lot of organizations. For, what was it, 2019? I went to 62 different clinics and audited their systems and processes and their practices. And one thing that I would always ask them is, what does your onboarding look like?
So many of them were like, they watch modules on the computer and then they come back. And I'm like, okay, and what are you doing with them before they're going to be live with a client?
And they're like, they got the modules, they watched four hours of modules. And I'm like, that's not... and [00:19:00] ABA CEUs are so guilty of this, like, online: I'm gonna push play, I'm gonna go do my dishes, I'm gonna come back, hopefully catch whatever the lecture verification is, which I didn't need to actually pay attention for.
I needed to click a button or get a code word or whatever else. That is not learning, that's not training. That's, again, playing this are-we-checking-the-box-of-a-regulatory-body thing. We're not learning anything by doing that. And I understand, with a lot of ABA companies, they're limited on time.
So we need to get somebody billable as soon as possible. They don't necessarily want to create an entire training system. We know that it's a lot of work to do this, and to do it well. So they subscribe to a CentralReach, a Relias, something that's got these video modules, and, look, we're good to go.
They can have those and then we'll put 'em in with a client. It's a disaster, [00:20:00] honestly. It's not great. Or they'll say an hour of their training was them just reviewing policies. I'm like, them reading a manual is also not training. What are... we would never do this with clients, like clinical clients.
We would never be like, here...
Allyson Wharam: Here's the behavior.
Shannon Biagi: Here's your PowerPoint, kid. Watch this. Now do some matching-to-sample for me. No, that's not how we teach. So why, so quickly, do we take off our behavior analyst hats as soon as we start interacting with staff?
Allyson Wharam: And I think, even within... and I'll be the first one, as someone who has designed some of these modules, to say that they are just a piece of the puzzle. But even within that, so much depends on the modality, whether you're online, whether you're in person: the affordances, things that the modality allows, and things that are constraints of the modality.
So I'm not with you; if you're doing an online module, [00:21:00] I can't view your performance in real time and give you feedback. But what something like an online module could do is give really frequent opportunities to respond to questions, or watch a video and apply things. There are different things that you can do.
Again, acknowledging that's one piece of the broader puzzle. But if you are going to use a system like that, thinking about it again in terms of what is the actual performance and the modules are one piece that might be more of the instruction and the modeling of some of it, but we need to look at what this looks like for our actual client population and then I need to actually give you feedback and make sure that you have retained this content.
But even within the modules themselves, so often I have seen... I think it's getting a little bit better, but ours are not an hour of just watch this video and then answer a multiple-choice quiz, and that was a very intentional decision. Most are like, here's five minutes, and [00:22:00] let's engage in some sorting or matching.
For data collection, again, thinking about what the actual performance is for data collection: let's watch this video, and you collect data and enter the data that you got. Or actually labeling the pieces of a graph, or whatever it happens to be. We can't just use the fact that it's online as a way to say, oh, sorry, this is the only thing I can do.
Even during a live CEU, for example, there are ways to ask questions, whether it's using technology like Nearpod or Pear Deck, or even the polls within Zoom, or a waterfall within the chat if you wanna be really simple and straightforward, but getting some active student responding.
And then also thinking about, again, what do I want them to do with it? And have I given them some of the tools that are leading them closer to that performance? Rather than, again, me just talking about it, am I doing something that helps to facilitate moving from just hearing it to [00:23:00] putting it into practice?
Shannon Biagi: That's where those objectives come in.
Objectives are providing that framework: how can I get at a component of this objective? How can I validate that they're moving in that direction? So that's active, meaningful practice, rather than rote, I'm-just-gonna-click-a-next-button-and-that-counts-as-my-verification that I'm doing something.
Allyson Wharam: Yeah, I really struggle with the seat-time aspect of a lot of our compliance-based trainings, because it's not only not a great proxy for learning, it is not any proxy for learning in terms of what someone is actually able to do with that training. And so I do think that's a limitation and something that's a little bit tricky with a lot of these things, which, again, comes back to the objectives. What do I want someone to actually do with this? Rather than just saying, I need to fill this hour of time. And then even within the objectives themselves, there was something that you said that made me think about [00:24:00] the levels of performance that you're talking about.
Yes, I need them to be able to recall, which is gonna be, like, a quiz or something like that. But if I want them to actually engage in this more authentic performance, what are those levels of putting it into practice: just basic facts, then building to concepts, and then actually applying it in some sort of meaningful way. One other thing that you had mentioned was BST, behavioral skills training, and that often we're told that instruction is a piece of that, but then we're not really told what that instruction should look like. So talk to me a little bit more about the didactic piece of a training, or those videos, or whatever it happens to be, a live PowerPoint or no PowerPoint, I don't know, just any sort of didactic instruction.
What are some ways that we can make that content less passive?
Shannon Biagi: Sure. This to me is why there's an entire field of instructional [00:25:00] design.
Even though we think of BST as instructions, modeling, rehearsal, feedback, on-the-job evaluation (which we'll talk about later; everybody drops that step), instruction is an entire thing in itself: how we design that didactic, I'm not gonna say passive, but that component of the process.
There's so many different models that we can look at for that, and almost all of them have things like, we have to build motivation. So why should this learner be paying attention to this? Do they understand where this is gonna fit in the big picture of, how it's gonna make their life easier?
Again, going back to the understanding who your learners are, which all of this like connects together,
Allyson Wharam: Yep.
Shannon Biagi: and then understanding not overdoing the PowerPoints and so much information on slides and. Talking for, 30 minutes without any kind [00:26:00] of pause for reflection or taking questions and, you have to actively, consciously integrate pauses and content examples, like vivid examples of, and they're not models exactly.
'Cause a model should literally be showing them like performance. But an example being like, when somebody learned this skill, this is how things got easier for them. The storytelling part of instruction, I think we, we tend to lose a little bit and the story is what keeps people invested and interested in what we have to say.
Allyson Wharam: I think that linking it to that like emotion, like you said, that is in most of the different trainings or motivation or both of those things is a huge one.
Thinking about how much information you're giving someone at one time in terms of scaffolding, or gradually releasing that information. Instead of telling them, giving them an opportunity to [00:27:00] retrieve or to make connections to things that they already know or have experienced. All of those things are ways that you, even as the one delivering content, can make connections so that people are more engaged in the process.
One of the smallest shifts that I made in designing our RBT training was... because it's so much content, we sequenced it very carefully so that you're starting from the basics and then things get more complex. But then, as things get more complex, we would go back and reference, okay, here's what you learned about this before.
And when I first started doing it, I would do a quick review: here, let me give you this review of this thing. And it's such a simple shift, but I just shifted to, okay, take a second and think about what we learned about the ABCs of behavior and what those are, instead of me saying, okay, remember, [00:28:00] ABCs are... blah, blah, blah. Just taking that second to pause, have them retrieve and recall, and then moving on. There are just so many little micro shifts that you can make. But I think the biggest takeaway here, even outside of yes, use BST, yes, use modeling and rehearsal and things like that.
But any sort of training should be an active process in some way. Whether that's through ASRs or... there are a lot of ways that can look, but I think that is the biggest or most straightforward solution to some of this problem.
Shannon Biagi: Yeah. And just as a general rule, don't go for more than 10 minutes without stopping for something. Whether it's, now we're gonna put in a video model; now we're gonna have you take a couple minutes to reflect on a thing; we're gonna ask you a multiple-choice question; we're gonna do a poll... usually the guidance is around 10 to 15 minutes before doing something different.[00:29:00]
Allyson Wharam: Yeah. And it's interesting because like we will even get feedback even still within our trainings. Like the videos were a little long and there are mostly like five to 10 minutes or so. And it's still valid. Like it's still it's just, it's a lot of content. And yeah, the more that you can break it up and make it interactive and people like surprisingly ask for more interactivity, which you would think, sometimes, and we tend to be pretty poor proxies or pretty poor assessors of our own learning because and I don't have the citation for this off the top of my head but they talk about it in the book, powerful teaching in relation to like metacognition and making changes based on our learning and things like that.
But we tend to prefer the things that are easier. And so it's easier to sit and nod along to a training for an hour, or to reread the thing that you already read, or rewatch the video. I talk about this with students as they're studying for the BCBA exam: it's easy to tend towards the things that are more passive because it [00:30:00] feels good, versus actually having to engage in some of that practice.
But the practice itself is what leads you towards retaining over time.
Shannon Biagi: Right.
Allyson Wharam: Which leads us to the next one, which is treating training as one and done. And you gave a couple of really nice examples of this. One is we have the onboarding, we're rapid-firing everything during this initial onboarding period, trying to get everything out there, and then we just turn people loose into the field and hope that it sticks. Training on a specific topic but then not following up is another one that I see: let's deliver this one-hour training on this thing, and then again, hope that something comes of it. Or really reactive training, so training only when someone is struggling.
So what do you see in regards to this training as a one and done sort of problem?
Shannon Biagi: Yeah, so often what I see with organizations is when I'm sitting with a team and we're [00:31:00] developing an onboarding training, they're trying to get everything that learner will need to know for their entire career as a technician into the span of two weeks, tops. Like, that's being generous. Sometimes they're like, we need them in two and a half days.
And I'm like, that's not gonna happen. So no offense to any of my clients who are listening to this, but if you're trying to teach, let's say, the verbal operants in detail in that person's first two weeks on the job, and they've never been in the field of ABA before... there are BCBAs that barely grasp the differences between the verbal operants, and you're expecting this fresh high school graduate who knows nothing about this field to understand the MOs for a mand compared to the reinforcers for a tact.
And I'm like, you can't. Do they need to know that in their first two weeks on the job? Or within those first two weeks, are they still developing rapport with their client? Are they [00:32:00] learning how to interact with their peers and caregivers? Like, what do they truly need to know in that span of time?
And let's create phased training where, okay, in their first two weeks, these are the essentials, we need them to know this. By the end of three months, we've got some additional things; maybe now we're starting to get into the verbal operants so that they understand that. And then at nine months, maybe they're doing something more, so they can grow into their roles. But we try so hard to be like, we're gonna train them and never talk to them about this ever again.
And that's not great.
Allyson Wharam: Yeah, it is. And I have said this a number of times, but for the 40-hour training as an example, it's both too much content at once, it is so much to take in, and it's also not enough. It's treated as this: you have this training, you're good to go. And yet at the same time, it's also just so much for someone to take in.
And so I love the idea of mapping out like what [00:33:00] are the essentials that someone needs to know at these different sort of milestones? And in those first two weeks, like really, what are they doing? They're really focusing on pairing, understanding the role like navigating some of the basic like employee systems, things like that.
And that also helps you think about what responsibilities or expectations are realistic for this person as well, in addition to the training itself. I haven't talked about it on here, but in the background we've been working on a competency tracking app. And part of that is being able to break the role down into levels that could be associated with an actual job role, but could also just be a phase of where they're at in training.
And one thing that we've done with that is not just associating a specific competency with that level, but also associating the level of proficiency for that competency. Do they just need to be able to do it in [00:34:00] a simulated setting for the first two weeks? Then in the first three months they need to be able to actually do it with a client, and then maybe at six months they need to be able to show that they can generalize across different clients.
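The phased-expectations idea described here, each competency tied to a required proficiency per milestone, can be sketched as a small data structure. Every name below is a hypothetical illustration, not the actual Sidekick competency app:

```python
# Ordered proficiency scale, lowest to highest (illustrative labels).
PROFICIENCY = ["recalls", "describes", "simulated", "independent", "generalizes"]

# Each onboarding phase maps a competency to the proficiency required
# by the end of that phase. Phases and competencies are made up.
PHASED_PLAN = {
    "first_2_weeks": {"pairing": "simulated", "session_notes": "describes"},
    "3_months":      {"pairing": "independent", "session_notes": "independent"},
    "9_months":      {"pairing": "generalizes", "verbal_operants": "describes"},
}

def meets_expectation(phase, competency, observed):
    """True if observed proficiency meets or exceeds what the phase requires."""
    required = PHASED_PLAN[phase].get(competency)
    if required is None:  # competency not expected yet in this phase
        return True
    return PROFICIENCY.index(observed) >= PROFICIENCY.index(required)
```

The design point is the one made in the conversation: the expectation is explicit per phase, so a brand-new technician is not held to (or overwhelmed by) the full end-state repertoire on day one.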
But really thinking about what our expectations are for learners, and then making that clear for us, but also making that clear to them, because you also have to think about how they're receiving all of this information. And coming back again to learning objectives, there's some controversy around sharing those learning objectives.
And is that meaningful? Most of the utility is for the designer, but it can also be helpful for learners to know, again, what is actually expected of me when I'm done with this thing. And so I think that's the other benefit, as you're talking about this as an ongoing process: even if we're touching on more content, what does the learner actually need to be able to do? And do they know that? Do they feel like they need to be an expert in all of these 50 things that you've just talked [00:35:00] about? Or are they really clear that right now, for this first two weeks, this is what you should be focused on?
Shannon Biagi: And I think sharing the objectives, the critical objectives of what they're gonna be expected to do, and keeping the content to what do I need for this phase of my little career that I'm starting, reduces their overwhelm, if you think about cognitive load and how much they're trying to take in. And in the field of ABA, all the business owners will resonate when I say:
People quit before they even get outta training. So often they don't even make it through the training process. And I think this is a big reason why, actually: a lot of the things that we've already talked about. It's spray and pray, they're telling me all these really advanced things, they're getting on me when I mix up positive and negative reinforcement.
There are these 85 different things that I think I need to know to do my job, and I don't understand any of them within the first couple of weeks. Why am I gonna do this when I could [00:36:00] go take a less response-effortful job somewhere else that's not gonna have all of these demands, and is probably gonna pay more?
So if we could get that down and say, okay, this isn't that overwhelming: here are the five behaviors I want you to be able to do by the end of the week. And that includes pairing, that includes navigating our online pay system. It includes these feasible, nice things, where at the end of that week they can feel, I can do these five things pretty well.
You're gonna keep people so much better that way.
Allyson Wharam: Yeah, I totally agree. And part of that is that culture of training, which you had brought up as we were talking and brainstorming beforehand: not only are you lessening what's expected of them now, but you're also setting the stage that we are all continually learning. Here's your focus right now, and this isn't a process where you've learned how to do ABA and now you don't need any training ever again.
We're [00:37:00] always learning, we're always improving, we're always building these skills. And the other thing you said about that is it keeps training from being a reactive sort of piece. So I would love to hear some of your thoughts about that, and more about the culture of training and creating that ongoing learning environment.
Shannon Biagi: We should always be learning, and that goes from the top to the bottom. Everybody should be seen participating in ongoing learning and training, because right now in so many organizations that I work with, what's happening is a performer got through their training, they're struggling to perform, and they keep having more training thrown at them.
Whether or not it's the solution. To go back to our analogy: they've got aspirin being thrown at them constantly, but you haven't figured out why they have a headache to begin with. You haven't fixed the fact that they're dehydrated, or that they haven't had something to eat. There are so many different things that could be going on that we're not resolving, but we're just like, [00:38:00] here's some more aspirin.
And eventually that becomes super aversive. Even beyond our industry, there's this pairing of: if I'm receiving retraining on something, I suck at my job.
Like we should all love to learn. And what we're doing is teaching people that when I struggle, I receive training.
Therefore, when I receive training, I'm really bad, and we take this on as I'm just not good at this. And again, we leave, we take off, we go do something else. So if everybody is seen training, and we're receiving training and additional support even when we're doing well with things, we avoid that contingency that so many different organizations have made.
One of the classic things I was brought in to create a training for was session notes at this organization, where they had just shifted to electronic data collection and their [00:39:00] wifi could not handle the number of devices on the network. So they were trying to do their notes, and the system would bomb out and delete everything, and they're just like, I'm just gonna do this at home, and then they wouldn't get back to it.
I had that resolved by the end of the day, but I could have trained on that until the cows came home. We never would've gotten performance up. So that training would've been so punishing for them, because they're just helplessly going, this isn't it, I know how to write a note. And one of the questions that I always go to as a classic is: if I offered them a million dollars right now...
Allyson Wharam: yeah.
Shannon Biagi: could they do the thing I'm asking them to do?
If I offered them a million dollars to get their session note in, they would have tried their darndest to get that done, but they wouldn't have been able to do it, because the wifi was not working. So all that would've told them is: you're sucking at your job, we're not gonna solve the problem, and here's some more things to do on top of what you're already not able to [00:40:00] get done.
Carl Binder, plus W. Edwards Deming, plus Rummler: I think the average between those models is that something like 14% of performance problems are...
Allyson Wharam: hmm. Mm-hmm.
Shannon Biagi: ...actually a training issue.
Allyson Wharam: Wow. Yeah. I think you're highlighting a really helpful point in terms of the organization itself being willing to change and evaluate, and not always putting the onus on the learner, as if it's the learner's problem. If you have that overall culture of, we're all making changes and improving as an organizational system, and we're always training and talking about new things, and building in this culture of feedback, however it's structured, then it changes the overall experience and changes the receptiveness to training. It's not this reactive strategy when you're doing something wrong, or a punisher because [00:41:00] you're getting the same training on the same thing for the fifth time.
I think the other thing is spaced training. And I know we just talked about not repeating the training, but that's more in the case where the performer isn't actually the source of the problem.
In terms of repetition, there is actually a lot of research showing that repeated and spaced practice matters. First off, not just listening to something, but even doing something one time, tends not to maintain. And I think we know that obviously as behavior analysts, if we think about how we structure our teaching, but often when we think about training, it's here's this one hour on this topic rather than, again, having that repeated and spaced practice. And so even within our 40-hour training, for example, and I would love to have done this way more than we did, we embedded maintenance checks throughout, because they're learning, again, so [00:42:00] much content at one time. So instead of it being, you learned this thing 30 modules ago, hopefully you retained it in that one hour, and then you're never gonna see it again...
Let's actually not just cover it once, and not just have that in-video recall this thing, but general maintenance checks. And there are some interesting systems, for example texting questions, having some sort of prompts throughout. There are technology solutions that might help with some of this stuff.
But if you're training on a topic, thinking about how does it show up in supervision, how does it show up in feedback? All of those different things rather than just assuming someone gets it one time and then they've got it.
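One way to operationalize the spaced maintenance checks described above is an expanding-interval schedule. A rough sketch, assuming you simply want each topic to resurface at growing delays after initial training (the specific intervals are my illustrative assumption, not a cited protocol):

```python
from datetime import date, timedelta

def maintenance_check_dates(trained_on, intervals_days=(2, 7, 21, 60)):
    """Schedule follow-up retrieval checks at expanding intervals.

    `trained_on` is the date the topic was first trained. The returned
    dates are when the topic should resurface (in supervision, a quiz,
    a texted question) instead of appearing once and never again.
    """
    return [trained_on + timedelta(days=d) for d in intervals_days]
```

For a topic trained on January 8, 2024, the first check would land two days later, with each subsequent check spaced further out.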
Shannon Biagi: And along a kind of similar line is expecting fluency without actually teaching to fluency.
Allyson Wharam: Yes.
Shannon Biagi: So I'm gonna have you run a trial with me in a role play, and now I expect you to do 30 of those in an hour with your client, [00:43:00] immediately. So that repetition, that practice, is so important. And I don't think most organizations are currently thinking about fluency versus mastery in their training design.
Allyson Wharam: In that competency system, we have different scales depending on whether it's a knowledge-based competency or skill-based. The skill-based aspect of it is based on Miller's pyramid, or competency matrix. And it looks at going from I can just recall what this thing is, to I can describe how to do it, to I can do it in a simulated environment.
So that's shows how, to does, so doing it independently. And then we added a fifth level, which was adapting and generalizing, trying to get to more of the fluency piece. Because there's the simulated environment, there's okay, I can do it on my own, but maybe I can't do it on my own in really demanding or varied conditions.
But yeah, I don't see that a lot. I see us doing a fidelity check based on maybe one [00:44:00] individual point in time. But okay, did that check take place during a really controlled training session, or did it take place in vivo with a client who is pretty easy to work with? Or did it take place with a client where maybe there are a lot of other competing demands, or more challenging behavior, whatever it happens to be?
And so yeah, it's really like thinking about how does that skill carry through over time.
Let's talk about the next one, which is lack of evaluation. What I see this look like most is that people do think they're evaluating the training, but the evaluation is a survey at the end that just says, how did you like this?
Like, how would you rate this training experience? And don't get me wrong, we have that in our training, because that is one little piece of this process. But often it stops there. It's just, how did you like this thing? So talk to me about evaluation and what it tends to look like, or should look like.
Shannon Biagi: So I just felt like that meme from We're the [00:45:00] Millers, where you've got the graduated levels, and right now I'm in the one of: wait, they send surveys? At all? Like, I don't even see them evaluating did I like this training, let alone any of the other pieces.
Allyson Wharam: Yeah.
Shannon Biagi: But what I do see them do is say, hey, check, this person got the seat hours, and now they're good to go.
That's their evaluation of their training. So yeah, you're a level up even if you're just doing the did-you-like-it survey.
Allyson Wharam: Maybe I'm thinking more of CEUs, and that's more of the built-in compliance piece. But for actual staff training, I think you're right. I was thinking more specifically of that CEU aspect.
Shannon Biagi: We're checking that box because the BACB says there has to be some kind of something. For example, big platforms right now will have you give a star rating of how much did you like this training, and that's the extent of what we're doing. So if we're actually evaluating [00:46:00] training... most folks will say the person got the seat hours, and now we're gonna put them with a client.
And they don't actually check to see: do they have any skills? Did we actually take, say, a task analysis that we presented with an objective in our instruction, and did we go back to that and ask, are they doing the things in the task analysis from the objective? Again, that has to do with not taking those objectives seriously.
And this is where it comes in to say, hey, I said that they should be able to perform a certain thing a certain way. Did it happen? What we should be using that information for is tenfold, but the main ones being: one, is this person ready to be independent, to move out of training?
And two, does our training suck? Like we just taught this thing for a certain amount of time and they still can't do it. Is that [00:47:00] a reflection of us? Did we teach and train this correctly? And usually that's the one that I'm more focused on as a consultant to say, okay, you believed that you provided sufficient information, but we've got these performers who still can't do it.
We need to change the training, not change the performer. Which is the opposite of what they usually assume, which is, oh, this person is just not good at this thing. It's no: the rat is always right.
We did not do the right things during this process. And most companies, they just don't have any data collection to indicate either that their training is effective or that their learners are actually competent to be on the job.
Allyson Wharam: I think once you do have that dialed in... and all of this is such an iterative process, so you're always gathering data, you're refining your training based on those data. One of the nicest compliments someone gave us about our training was that, when they switched over, it was a lot easier for them to tell whether that person wasn't going to be a good fit, or [00:48:00] if the training was just tripping them up.
Because I guess there was more consistency: they trusted that training process to give them accurate information, and trusted that the training itself wasn't also creating more issues. Not that it's perfect; we're always iterating, we're always evaluating, we're always looking at those data and things like that.
But in an ideal world, and I'm also not saying that you don't then also individualize, or say, hey, for this person we might need to make these changes. Ideally you're iterating such that you have systems in place that let you evaluate on multiple different levels: is it actually the training that needs to be changed, or does this person maybe need additional support, or additional mentorship, or additional feedback, whatever it happens to be?
And you brought up a point about collecting data during the training and also after the training. There are a few different models, but there's also this general idea of formative and summative [00:49:00] assessment. The formative assessment lets us, and the students, know how they're doing.
It also lets us tailor our instruction in the moment so that we can change things and support that learner. Versus the summative assessment, which is more like: the training phase is done, not that it's ever totally done, but this training is done, and we're evaluating more holistically the training as a whole, but also that learner's skills and where they're at.
And did this also make the changes that we're trying to make? So talk to me a little bit about... there was one model we both noted, Will Thalheimer's LTEM, the Learning Transfer Evaluation Model, but the better-known model is typically Kirkpatrick's four levels. Talk to me a little bit about that.
Shannon Biagi: Yeah, Kirkpatrick tends to be the one that I use most often. And I use it for training evaluation, and sometimes just as general evaluation for any intervention that I put in place. [00:50:00] So our four levels are gonna be: reaction. Did the students like the training? Was the training environment comfortable for them?
Did they leave feeling competent, confident? It's all about that learner in that moment. So that's our first level. Then we've got the learning level, which is: in the learning context specifically, have they acquired the knowledge, the skills, the abilities to be able to perform eventually on the job?
So that would be things like quizzes, that's a learning check. Role plays with checklists, even though this is gonna sound like another level because it's in the training context, we would still consider that the learning level of evaluation. Having them recite things and say things and define things.
That's all demonstrating that they've got the content, they've got the basic skill. So we've got reaction, we've got learning, then behavior. That's where people get tripped up and they're like with role play, we've got them [00:51:00] to do a behavior. Sure, but it's not in context. So behavior level is, did I take that skill and now I can actually do it with a client?
Am I transferring that? That's our generalization bit. And I alluded a little bit earlier to how in behavioral skills training, we think instructions, model, rehearsal, feedback. In the original work from Sarakoff and Sturmey, there was a fifth step that nobody talks about, and that is transfer to the natural environment.
We just got rid of that step and we don't do it. Did they take the thing that we gave them instructions, model, rehearsal, and feedback for, and actually do it with a client? That's everything. Why did we lose that? That's literally the entire thing. Probably because a lot of the research being done on BST is lab-based studies, where they're learning a skill that they're not actually transferring to the natural environment.[00:52:00]
That's where Kirkpatrick's third level, behavior, is really important. That's the generalization: did we actually see them able to do it with a client in non-ideal situations? So it's great that you can do a multiple stimulus without replacement preference assessment with a peer during our training; that's the learning level.
But if your client sits down and immediately swipes all that stuff off the table, what are you doing? Where do you go? You weren't trained on that. So we should also be doing that at the learning level: let's train for non-ideal situations too.
Allyson Wharam: Yeah.
Shannon Biagi: You don't want them to go out there and immediately freeze like a deer in the headlights as soon as something doesn't go right, because things are not gonna go right. These children, these adults, whoever we're working with, they're not gonna do the same thing as your very kind peer who just wanted to help you get through training, so they didn't resist or throw a fit. We've all been [00:53:00] there.
We don't wanna be that guy who's gonna make somebody struggle through their role play; we're all gonna be as nice as possible. But that's not reality. That's not where we're at. So, behavior level: transfer. And then the last one we don't really talk about a whole lot, which is the outcomes, the results, the business aspect. The skill I usually use for this example is teaching supervisors to give feedback.
So we did a feedback training. Reaction: they said that they enjoyed it and they're feeling confident enough to use it. Learning: during training, we got them to role play and check off some key behaviors of feedback. Behavior: we observed them and saw that they actually used those feedback skills with their supervisees in a couple of meetings. And then, do we retain employees better now that they have these skills? Now we're getting to a business result that really influences how the organization operates. Do we have greater retention of [00:54:00] our staff because of this? Is staff satisfaction up? Are we even measuring staff satisfaction?
So you've got that like greatly lagging indicator on that end. So Kirkpatrick creates this cascade from like in the moment to generalization to on the job to actually getting a business result. And the reason that last one exists we alluded to a little bit earlier, which is why are we training this thing?
If we did all of that work, but our staff are still really angry about everything and nobody wants to stay at our business, did we train the right thing? Or did we just try to put a bandaid on it, and none of this actually made a difference for the organization? And part of why training folks do that last level is to justify their existence.
Like, why are we putting all this effort into this? Training is expensive, it's time-consuming, it's all these things. If we don't get a business result from it, then we're just training to train, and not training to actually [00:55:00] improve what's happening in the business.
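The four levels Shannon walks through can be summarized as a simple record. The field names and helper below are purely illustrative, just making explicit that each level asks a different question and that most organizations stop at level one:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class KirkpatrickEvaluation:
    """Evidence collected for one training at each of Kirkpatrick's four levels."""
    reaction: Optional[str] = None  # L1: did learners like it, feel confident?
    learning: Optional[str] = None  # L2: quiz / role-play results in the training context
    behavior: Optional[str] = None  # L3: observed transfer to the job, with real clients
    results:  Optional[str] = None  # L4: business outcome, e.g. retention, satisfaction

    def deepest_level_evaluated(self):
        """Highest level with any evidence recorded (0 if none)."""
        levels = [self.reaction, self.learning, self.behavior, self.results]
        return max((i + 1 for i, v in enumerate(levels) if v is not None),
                   default=0)
```

A star-rating survey alone fills only `reaction`, so `deepest_level_evaluated()` returns 1; the payoff the episode argues for lives at levels 3 and 4.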
Allyson Wharam: That takes it full circle beautifully from where we started: what is the actual gap, what problem are we trying to solve? And then ultimately, did the training not just teach this skill, but did it close that gap we were noticing, and make the impact we were trying to have it make?
So just to end on a quick note, we don't need to do a deep dive on this one, but the last one you mentioned, which I thought was great and hadn't thought about, was assuming that seniority equals competence. And one piece of this specifically is assuming that someone is able to train, and to train well, just because they're at a senior level.
But training is a skill. Hopefully we've talked about that enough today: it's its own pivotal repertoire. It was interesting, as we were developing that competency matrix, the initial fifth [00:56:00] level I had, instead of adapts and generalizes, was teaches and trains this skill.
I backtracked, because yes, that is a level, but really it's a pivotal skill. I don't necessarily need to see that you can teach and train every single discrete skill; I need to know that you have that skill, and then that you have the pivotal skill of training, which you can then adapt to the different skills.
All this to say, in a very long-winded way, that training is its own skill repertoire with many different sub-skills and elements. And so just because someone has been with an organization for a while doesn't mean they know how to do this. So I'd love to hear you talk about that.
Really briefly. And then I would also love your thoughts on where someone should start, if they're listening to this and interested in diving deeper but maybe don't have a training background. Where would you point them?
Shannon Biagi: Sure. You explained it really beautifully. [00:57:00] It's pervasive that when we've got a high performer, or a senior performer, we're like, oh, we're gonna have them go train people. One: senior folks who have been with organizations for a long time are often the ones with the most drift, and they will teach all the shortcuts, because they've been there for a while.
And they'll be like, oh, we don't actually do that particular stuff. And two: folks who have been with organizations for a long time are sometimes less likely to get feedback, because they've got those underlying relationships with their supervisors and things like that.
You create an issue there. And training is a skill: we need to be teaching people how to train people, and not just adding it in as a task-list item without addressing how to do it effectively. And with BST, just being able to recite the four or five steps is not enough.
Like you can say I know how to train people. I give them instructions. I model for them, I rehearse and I give them [00:58:00] feedback. Nope, that's not half of what you need to know about training. So for resources, one of my favorite things is I'm a reader. Clearly there's books literally everywhere around me.
Mager. Robert Mager.
Allyson Wharam: I never know how to say it.
Shannon Biagi: So I've got this six pack. There's actually another six pack right here. I've got two, six packs.
Allyson Wharam: Nice.
Shannon Biagi: I just can't help myself when I see it, I just have to adopt it. These are old; you can get them used online for two or three bucks per book. To me, this is the most elaborate set. The fact that they've got an entire book just on writing learning objectives... my heart...
Allyson Wharam: It's both the most elaborate, in that they're broken up into these little sub-areas, and so digestible and easy to read. It's not like you're reading six little mini textbooks. They're very actionable, well-explained books.
Shannon Biagi: And one thing he does fantastically well is the analysis [00:59:00] beforehand. In the Analyzing Performance Problems book, he'll walk you through a decision tree of: do I need a full training for this, or are there seven other things we need to do first before we start putting training in place? That's another reason I really love this set.
Allyson Wharam: Yeah.
Shannon Biagi: That's usually my go-to. And I believe they've got trainings online through the Mager Consortium specifically, that you can get pretty easily if you'd rather not read. So that's my primary recommendation: go back to some of the OG, original stuff. Read Kirkpatrick's four levels of training evaluation.
Even just take a glance; he's got an entire book dedicated just to the behavior level. And, behavior analysts: just look at the instructional design literature. ABA is the science of learning; let's figure out how other people are talking about it.
That's why me going in as an OBM practitioner into an instructional design and performance program, I had to do some translation, [01:00:00] but wow, did it add so many tools to my toolkit as a professional. So you all can do that too.
Allyson Wharam: Yeah. And the other piece I would add to that, because I totally agree, is not shutting down if you hear a word like metacognition or self-efficacy, or the big one, cognitive load. I had put it in here; I'm like, I think we should add a disclaimer not to write us off if you hear us say cognitive load.
But really, in all honesty, these things can be broken down into behavioral systems and processes, so just be open to understanding what someone is saying. They're often things that, despite the fact that we can break them down and make them conceptually systematic, we don't actually necessarily talk about within ABA. It is a deeper dive into this aspect of learning. And yeah, I really appreciated this conversation. [01:01:00] I don't necessarily always feel like I'm on an island, because there's so much overlap, but then I also do sometimes, because there is that navigation and translation and all of that.
So this was really fun.
If folks wanna connect with you, learn more, see more about what you're doing at Chief Motivating Officers, how can they get in touch with you?
Shannon Biagi: Yeah, so I'm on all of the weird plat... not all the weird platforms. I'm on Facebook. I have an Instagram, I've got all that stuff. You can look me up by name too. I do have online CEUs that you can check out some of them on training at https://motivate-u.chiefmotivatingofficers.com/.
So if you wanna learn things from me, and not overly critique my design now that I've talked about all these things we should be doing, which makes me feel a little weird about my own online CEUs, but...
Allyson Wharam: It's all iteration, right? Like you look back and you realize that yeah oh, I could do this differently. And we're always learning and growing like we talked about. Yeah.
Shannon Biagi: Something that I'd like to make available to your folks. I've got a little training audit tool, so if [01:02:00] folks who are listening, you've got a training within your organization and you just wanna know whether something is in place or you wanna add something based upon all these things that we've talked about I'll just provide a little link where you can get a copy of a little training audit tool that might be helpful for looking at what you've already got.
Allyson Wharam: Perfect, and we'll link all of that in the show notes as well as your social handles and things like that if people wanna get in touch. Thank you so much.
Shannon Biagi: You're very welcome. This was fantastic.
[01:03:00]