Enhancing Leadership Training: The Importance of Post-Training Evaluations
Discover the significance of post-training evaluations in leadership training, their impact on continuous improvement, and methods to gather valuable feedback.
Post-Training Evaluation Assessing the Effectiveness of Training
Added on 09/30/2024

Speaker 1: Hi, welcome to LeaderFeeder. I'm Suzanne Jakes, the Marketing Coordinator at Unique Training and Development, and I'm here with our manager, Kirk Langford today. Hello. Hi, Kirk. How's it going?

Speaker 2: I'm good. How are you, Suzanne?

Speaker 1: Good. Excited to be doing another LeaderFeeder. For sure. They're always a fun day.

Speaker 2: They are.

Speaker 1: Today, we were going to talk about post-training evaluations. What do we mean by post-training evaluation when it comes to leadership training?

Speaker 2: Good question to start it off. Basically, when we're talking about a post-training evaluation, it's essentially evaluating the training: what did people like about it, what did they not like about it, what do we need to change? It's just getting feedback on the actual training itself. That's what we're talking about.

Speaker 1: Okay. Feedback on the training, how did it go? Why is that so important for organizations?

Speaker 2: Good question again. It's important because it's easy to assume that the training, the way it is, is just good enough. But if we're not asking the actual people who are going through the training, what did you get out of it? What did you like? What did you not like? Then I think we're missing a really important component, which is the learner experience. Because at the end of the day, the training isn't for the trainer. The training is for the participants. So if they have feedback that can help us make the training better, we want to know about that. I can give you a quick example. One of our customers that we do quite a lot of work with does a post-training evaluation at the end of each of their programs. And some of the feedback they got back was stuff like, the chairs were uncomfortable. Now you can hear that and go, oh, get over it. The chairs were uncomfortable. We're adults, we can figure it out.

Speaker 1: But how was the training?

Speaker 2: Right, sure. But keep in mind that these are people who are working on construction sites regularly. So they are used to being on the move for their eight, 10, or 12-hour shift. They're on the go most of the time. They're not used to sitting in a classroom for two hours a day, let alone two eight-hour sessions back to back. That's a lot of sitting. And so for learners like that, we do have to take into account what they are used to and what they are currently experiencing. And that's why for them, a comment like the chairs were quite uncomfortable, especially if you see multiple people making it, is something you probably need to address. And again, as the trainer, first of all, you're usually standing up for most of the day, so you wouldn't notice. And second of all, if you're more used to delivering training than being on site, you wouldn't think about how comfortable the chairs were. So it's a piece of feedback that's really important, and yet one you may never have thought about unless you actually asked your participants for it.

Speaker 1: Right, yeah, that makes sense. Good point, thank you. So when we talk about post-training evaluation, you kind of brought up this point of, you know, for next time. Can you speak to how this leads into continuous improvement and how that'll benefit organizations?

Speaker 2: In the long run. Yeah. So yeah, with any training that you do, generally speaking, you're not just doing it in a vacuum, which is to say you're usually gonna do, you know, that same training session, maybe it's a monthly training session, maybe it's a weekly training session, maybe it's three times a year. But in any case, if it's gonna be repeated, you wanna find out what went well and what could go better next time. And so that's where the post-training evaluation comes in. It's not just a chance for the trainer or the organization to look at the feedback and go, well, that was good, okay, good for us,

Speaker 1: and then move on. Super great, yeah, super great, we don't have to do it again. Yeah, that's right.

Speaker 2: There should actually be a moment or a debrief where you sit down and you go, okay, based on this feedback, what do we need to change or do differently next time? And that feedback could be related to the instructor, ways that they can be maybe more engaging, ways they can deliver their message more effectively. It could be about the content itself. This content isn't very engaging, it's not very interesting. How do we make it more engaging and more interesting? I think of things like in manufacturing, we do a lot of health and safety. We do things like WHMIS training, hazardous materials, some diversity, equity, inclusion.

Speaker 1: And those can kind of just be like point-by-point training things too, right? They can be, that's right. So trying to make it more engaging, and understanding whether it is engaging, will make it better for the company.

Speaker 2: That's right. So you can be looking at not just the trainer and how they delivered and whether they are the right person to deliver that training. You could be looking at the content or like I brought up in that other question, you could be looking at things like the venue. Was it an appropriate venue for what you're trying to do? Did it offer, for example, was everybody sitting in kind of theater style where they couldn't really engage in chat or were they at tables where they could have small group discussions? So there's lots of different things you can think about and your post-training evaluation is where you get all that really, really good information that will allow you to make a more effective training program next time.

Speaker 1: Can you let us know, maybe some of our listeners might wanna know what different methods there are for post-training evaluations, right? Like, is it usually just a survey that goes out? What are some methods you can use?

Speaker 2: Yeah, really good question. There's lots of different methods, but I'm gonna share a couple. So there's what we would call qualitative evaluation and quantitative evaluation. Qualitative is asking things like, what did you like? What did you get out of it? So questions with free-text answers, if you will, someone saying, well, I learned this, or I liked this about it, or any kind of question that's more like, just provide your thoughts on this. Quantitative feedback is more like, on a scale of one to five, how comfortable were the chairs? On a scale of one to five, how good was the instructor? On a scale of one to five, how relevant was this training to your position? That's all quantitative. So let me talk for a second now about the value of each of those, okay? If you're trying to bring together a bunch of results and show, for example, management, or just get a better sense of overall what people thought about it, or how satisfied people were, then a quantitative evaluation is great. A bunch of numbers, rank the following things on a scale of one to five, because then I can pull everybody's answers together and say, for question number one, how relevant was this content? We got an average score of four out of five, right? You can do that with numbers. If you ask the question, what did you like about this training? As you can imagine, it's a lot harder to easily, quickly, and efficiently compile all those answers.
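As an aside: the compiling of quantitative scores that Kirk describes can be sketched in a few lines of Python. This is a minimal illustration, assuming responses are stored as lists of 1-to-5 scores keyed by question; the question names and scores here are made up for the example.

```python
from statistics import mean

# Hypothetical survey responses: each key is a question,
# each value is the list of 1-5 scores from participants.
responses = {
    "content_relevance": [4, 5, 3, 4, 5],
    "instructor_engagement": [5, 4, 4, 5, 5],
    "venue_comfort": [2, 1, 2, 3, 2],
}

# Quantitative feedback compiles easily: one average per question.
averages = {q: round(mean(scores), 2) for q, scores in responses.items()}

for question, avg in averages.items():
    print(f"{question}: {avg} / 5")
```

A low average on a single question (like venue comfort above) is exactly the signal Kirk says should prompt a follow-up qualitative question or focus group.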

Speaker 1: You get a lot of opinions.

Speaker 2: Yeah, instead it's basically, hey, here's everybody's answer. You can maybe do a bit of, I found some commonalities, right? A few people mentioned the chairs were uncomfortable. A few people mentioned that the instructor was engaging. A few people mentioned they felt the activities had a lot of relevance to their day-to-day. But it's really hard to pull those into quick and easy little tidbits of information. So when it comes to methods, you want to have a bit of a balance. What we have tended to do is start with a quantitative question, basically, how engaging and knowledgeable was the instructor on a scale of one to five? And then, do you have any comments on that? Because the reality is that if somebody gives a score of one out of five, with one being the lowest and five being the highest, that doesn't tell me very much. It tells me they didn't like the instructor, that they didn't feel the instructor was knowledgeable and engaging, but it doesn't tell me why, right? It doesn't tell me, oh, they didn't tell relevant stories. It doesn't tell me, oh, they were late and unprofessional. It doesn't tell me that they spoke at the audience but didn't really engage with them. I don't get any of that. All I get is a low score. So qualitative allows us to get a little bit more info so that we can then better improve the training. And I guess I haven't quite answered your actual question, which was also about the methods in terms of a survey or whatnot. I think a survey is most commonly what we see. Now, the biggest-

Speaker 1: It was the only thing I could think of.

Speaker 2: Yeah, well, and honestly, we struggle a bit with that here at Unique Training and Development. As you know, we normally do a survey at the very end of the session while the participants are still there and sort of captive, if you will. Because we know that when the frontline leaders we're training get back out onto the floor, they're flooded with emails and fires to put out and people who want to ask them questions. So it can be a challenge for them to take the time to fill out a survey that's emailed to them after the fact. So we ask them a few short, just quantitative questions at the very end, because they're still with us, and that way we get almost a 100% response rate. Sometimes if a participant has to leave early, we may not get their answers, but we get a pretty high response rate, probably 90% and above. That's great, right? Now we're toying with the idea of using, for example, a QR code that leads to a survey that's a little bit longer. So participants pull out their phones and scan the QR code, still in the session. Then they can either complete the survey right then and there, or finish it afterwards if they want to take a little more time. So we're experimenting with that right now. I suspect what we'll see is a lower response rate. It won't be 90%. It'll probably be, I don't know, 80 or 70%, because some people might say, okay, I've got it on my phone in the browser, I'll go do it later, and inevitably they might get busy and not be able to. However, that survey we're sending via QR code is more in-depth. So we're going from less information but a higher response rate to a lower response rate but more information.

Speaker 1: Interesting. Which would you recommend? I guess it's kind of just dependent on...

Speaker 2: Well, it depends. I mean, if you're getting a lot of high scores with a shorter survey, then you might say, well, we don't need to change a lot, because people are liking the content, they're liking the instructor, they're liking the venue, for example. If you're getting low scores in post-training evaluations, then I think you really want to be asking the question of why. And so, to end off my very long answer here: if you're getting low scores and you're thinking to yourself, I need to dig into this, but for whatever reason, maybe your survey was really short or it's just hard to get people to answer a longer survey, you could do what's called a focus group, where you might pull in some of the participants and say, hey, we got some pretty low scores on this. I'm not gonna ask people to identify who gave low scores. I'm gonna pull you all in, but I'd love to pick your brains for 10 minutes and find out: you rated the venue one out of five, can you help me understand where that rating came from? Or can you help me understand why you said the content isn't very relatable to your position? So you can have those conversations, those focus groups. Sometimes that's challenging; some people may not feel comfortable giving that feedback.

Speaker 1: That was gonna be my next question. Like, do you think you'd get actual real responses if you pulled people into a focus group?

Speaker 2: Sometimes you don't, but sometimes you do because some people might say, well, I recognize that the training's not gonna get any better for the next person who has to take it or if I have to take it again, like if it's something you have to renew every year, it's not gonna get any better if we don't get realistic feedback. So some people end up giving really nice helpful answers that make you go, oh, that's what you didn't like. Or that's what we can do.

Speaker 1: So post-training evaluation will help you with continuous improvement, like we've discussed. It's gonna help that. I imagine organizations would also be able to compile all this information to assess the impact that the training had on their whole organization.

Speaker 2: Yeah, I mean, depending on-

Speaker 1: The processes, I suppose I should say.

Speaker 2: Yeah, so depending on how in-depth it is and depending on what they're looking for, for sure. Keep in mind, though, the post-training evaluation is really meant to be more just about what did you think of the training? It's hard to ask someone right at the end of a training, how has this impacted you on the job, right? That's a tricky thing, because if they've just finished the training, either the same day or the next day, they're really commenting more on what they thought of the training.

Speaker 1: Right, you're not gonna see a return on anything until further down the road. So would it be a smart idea to set up another evaluation down the road or is that completely a different whole subject?

Speaker 2: I'd say it's a whole different subject because then it's gonna be completely different questions because it's not so much about what did you like about the training, what did you not like, what would you do differently? It's more about how have you been able to apply what you learned to the job? So it's a little bit of a different beast.

Speaker 1: So this is more about the trainers, or, pardon me, the training. Yeah, the trainer, the training, the participants and how they felt about it.

Speaker 2: Yeah, that's right. Another neat thing, if I could share: you can start aggregating results. Say, for example, you run a session every six weeks in your organization, and you keep the questions pretty consistent. You can pull in that data and add it to existing data to say, okay, are we getting better over time? Are we getting worse over time? Are people enjoying the training more? Are they saying they're getting more out of it than three months ago or six months ago? The other thing you can do is you could even say, well, we have three different instructors that deliver this training. We can actually look at, okay, this instructor is getting this level of engagement and these results, and this instructor is getting this level and these results. So then we can ask the instructor that's getting the higher results or the better evaluations, what are you doing differently? And that's something we think about in our organization, of course: how do we get all of the instructors delivering at the same level? Now, we're really lucky that our instructors are all really great and they tend to deliver really great training, but we all know they have different anecdotes, different approaches, different strategies. So we can learn from that and use it to our advantage to try and give everybody an equally great experience. We can look at the different results for each instructor and say, what are you doing differently that you're getting better results or a better experience from the participants?
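As an aside: the aggregation Kirk describes, comparing evaluation scores across sessions and instructors, is straightforward once the questions stay consistent. A minimal Python sketch follows; the session dates, instructor labels, and scores are all hypothetical.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical evaluation records from repeated sessions with
# a consistent "engagement" question scored 1-5.
records = [
    {"session": "2024-01", "instructor": "A", "engagement": 4.1},
    {"session": "2024-03", "instructor": "B", "engagement": 3.4},
    {"session": "2024-05", "instructor": "A", "engagement": 4.5},
    {"session": "2024-07", "instructor": "C", "engagement": 4.8},
]

# Group scores by instructor so results can be compared side by side.
by_instructor = defaultdict(list)
for record in records:
    by_instructor[record["instructor"]].append(record["engagement"])

# One average per instructor; the highest scorer is the one whose
# approach the others might learn from.
instructor_averages = {
    name: round(mean(scores), 2) for name, scores in by_instructor.items()
}
best = max(instructor_averages, key=instructor_averages.get)
```

The same grouping keyed on `session` instead of `instructor` gives the over-time trend Kirk mentions.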

Speaker 1: Yeah, that's great, thank you. So I guess on a final note, is there anything that maybe we didn't talk about that our listeners would wanna know about post-training evaluations, surveys?

Speaker 2: Yeah, good question. I suppose what I would say is: if it's something you don't do, consider it. I think organizations do these small trainings, maybe it's an hour-long training on a new policy that's been put in place or new health and safety measures. We sometimes think, it's a one-hour training, whether they like it or not, we just have to go through it. And I think that's the wrong mentality, right, Suzanne? I think what we need to say is, anytime we're training people on something, we need to take into account how engaging we are, whether they're getting anything out of it, whether they're listening and paying attention. And so a post-training evaluation, even if it's just a couple of questions, it doesn't always have to be a 15-question, 30-minute survey. It might be three or four questions, but that can give us a lot of information about what learning happened, what takeaways the learners got, and what we need to change for next time. So I think no matter what the training is, you should be implementing some sort of post-training evaluation.

Speaker 1: Benefits and challenges and all of the above that go along with organizations trying to build their teams better, right?

Speaker 2: Yeah, exactly.

Speaker 1: Thanks so much. Thanks for joining us, everybody, to talk about post-training evaluations today. I hope you all benefited from it and learned something from Kirk. I know I always do. We just wanted to take this moment to ask you: if your company does post-training evaluations, what do they look like? Do you have a different method that maybe wasn't brought up today? We would love to hear any feedback that you have to share. So until next time, thanks so much. Bye.
