Speaker 1: If you've been working as an instructional designer or researching the field, you've probably realized by now that a lot of the training out there is either boring or downright ineffective and unnecessary. The biggest reason is that a lot of learning designers and training teams completely skip analysis and design training just because someone says they need it. So in this video, we're going to look at five different types of analysis that you can use on your next instructional design project.

Alright, so analysis is the name of the game today. And I'm not even kidding when I say that if we as instructional designers could get this right, and had the opportunities to conduct these analyses, we would probably contribute a massive amount to the global GDP, just because of how much more we would be helping the people and companies we support. So again, we're looking at five of the main types of analysis. These are the traditional types you would learn about in a master's program, but there are also some more streamlined approaches to creating efficient and effective training, which we'll talk about later in the video. For now, we're just going to take a look at these five. You can take a look at them on screen if you want, but we're going to get to each and every one of them.

Now, needs assessment. This is probably the one you've already heard of. The main question we're asking and answering here is: why aren't people performing at the desired level? We can explore this by interviewing people, by observing them, and by talking to the star performers as well as the people who are really struggling. There are many different approaches. It might just be asking the client or internal stakeholder a series of why questions until you feel like you've really drilled down into what's causing the problem. Sometimes the client or internal stakeholder won't know, and you'll need to speak to the actual audience you support.

And I'll do a little disclaimer here and say that this is what analysis is all about. We want to make sure we're supporting our audiences with what they actually need to perform better. Instead of assuming, oh, it's a course on effective communication, or it's a course on this, we want to really speak to our people, observe them, empathize with them, and figure out: how are we really going to support this person in the best way possible so they can perform better on the job? That's the overarching question and theme when it comes to analysis. So: why aren't people performing at the desired level, and what will help them perform at the desired level? Not just why aren't they doing it right, but also what will help them do it better.

In a lot of cases, when we take the time to ask these questions, talk to our people, and see where they're having trouble, we might find the problems are being caused by environmental factors, like the technology they're using or the reward systems in place. People might be getting rewarded for doing the wrong thing, or they might not be properly incentivized to do the right thing. So there are a lot of things at play that may not mean, oh, we should design training for this specific audience. Maybe you're designing training for their managers instead, or maybe the organization should be investing in new technology or new systems. We can't throw training at every problem. Now, training can help when there's a gap in skill or knowledge.
So that's the main problem we solve as learning designers, right? People are at point A, and we need to get them to point B. They currently know how to do something, and we need them to be able to do that same thing more efficiently, or to do other things, so they can properly contribute to the organization. If you're going to take away one thing from this video, it's that we only want to design training when it's solving a gap in skill or knowledge. We don't want to assume that the audience has a gap in skill or knowledge, and we don't want to assume that that gap is what's causing the problem. Usually it's a combination of multiple things: there's a gap in knowledge or skill, there are environmental factors, and there are inappropriate reward systems in place. We just want to be clear about exactly how we're helping solve that problem as learning designers.

So again, this is the first and most important form of analysis you can conduct. It tells you whether you even need training. If we're experts in designing learning experiences, you can see why this piece is so important. You know the saying: when you're a hammer, everything looks like a nail. We don't want to look at every problem as a learning problem. We want to be very picky and do this diagnosis to determine whether our efforts, the many, many hours we'll spend speaking to subject matter experts, writing content, and developing that content, will actually make an impact and make a difference.

And again, this is in an ideal world. The reality is there are a lot of cultural issues working against instructional designers, where we're not being incentivized to put in this upfront time. But for those of you who are newer, or who fell into your ID role and didn't learn much about analysis, this is the proper way to determine if training is even necessary. On some teams there are learning consultants, or performance consultants as you might also hear them called, who do just this. When they determine that, yes, training will help solve this problem in this way, that's when it gets turned over to an instructional designer. So it's okay to just design the content or the practice activities if there's someone upstream doing that hard upfront analysis work. And it doesn't have to be that hard or that difficult, but it does take a little bit of time to dive deeper into these problems. This right here is the most important one. We could do a whole video on it; just let me know in the comments if you'd like that. We could even do some live events with mock needs assessments. We could have some fun with this. It's such an important skill set, and it really does not have to be very difficult. It's just digging deeper and asking questions.

So the next one is job task analysis, which you may also see abbreviated as JTA. My impression is that this was much more popular back in the day. When I was doing my master's program, one of our professors, a veteran practitioner, used to say, "I paid for my first house doing JTAs." Companies would hire him just to come in and conduct these. The output here is a list of all the tasks we should prioritize training for. So let's talk a bit about what it actually is.
This is where we would interview or observe people to determine which tasks they actually have to perform. Some of you may already be familiar with action mapping, and this is definitely wrapped into that in some ways. But the way we were taught it in our master's program, this was an exhaustive inventory of every single task someone would perform in the context of achieving a given goal. Say we need someone to operate a machine. We would document every single thing: walk up to the machine, press this button to turn it on, check this mirror, check this gauge. A lot of the time, if you just sit down with a star performer and ask, "Tell me every single thing you have to do to get this done," it's only natural that they're going to skip some steps. And star performers, the people performing at the desired level, are exactly who you'd want to interview for this. But experts might be doing things without even realizing they're doing them. That's where, as an instructional designer, it's important to be very meticulous and drill very deep into these tasks. If they skip steps, you would ask, "Do you do anything in between those steps?" Observation can help a lot too. There's also a form of this called cognitive task analysis, where you dig into the decisions and mental processes the employee is going through in their head. So look into those.

Again, I get the sense that these are a bit more traditional. I have seen some opportunities come up in the freelance or contract space, but over the past four years, I've maybe seen one or two people ask for someone who's really good at job task analysis. So I don't know if this is a dying art or what. And again, there are more modern approaches that let us focus on the right actions, action mapping in particular. If you've been following me for a while, you know I love that process, and we've done some content on it. We have a whole action mapping playlist if you want to see a more modern, streamlined approach to focusing on which actions people need to perform.

Once you have this exhaustive list of tasks, you would ask the employee to rate each and every one of them in terms of frequency, difficulty, and importance. How often do you do this? How difficult is it? And how important is it that you do it? Another way to ask that last one: how severe are the consequences if you do not do this? The idea is that you would do this with maybe just one star employee, but more likely three to five, so you get a pretty thorough set of ratings on these tasks from multiple employees. Then you, as the instructional designer, can identify which of those tasks are the most important. If we see a set of tasks that are very frequent, very difficult, and very important, clearly that's where we're going to get the best return on investment when it comes to focusing our training efforts. I'll sketch that tallying step below. So look into this if the approach sounds appealing to you, but again, I would probably recommend action mapping as the modern equivalent. Action mapping wraps up a little bit of everything: some needs assessment, some of this. Still, the JTA may help just from seeing where the field has come from and what the proper, traditional way to conduct these analyses looks like.
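To make that prioritization step concrete, here's a minimal sketch of how the ratings might be tallied. To be clear, this is just my illustration, not a standard formula from the field: the 1-to-5 scale, the example tasks and numbers, and the choice to sum the three averaged dimensions into a single priority score are all assumptions.

```python
# A minimal sketch of aggregating job task analysis (JTA) ratings.
# Assumptions: a 1-5 scale, three raters per task, and a simple
# sum of the three averaged dimensions as the priority score.

from statistics import mean

# Each task maps to a list of (frequency, difficulty, importance)
# ratings, one tuple per star performer interviewed.
ratings = {
    "Power on and inspect the machine": [(5, 2, 5), (5, 1, 4), (4, 2, 5)],
    "Calibrate the cutting head":       [(2, 5, 5), (3, 4, 5), (2, 5, 4)],
    "Log output in the tracking sheet": [(5, 1, 2), (5, 2, 2), (4, 1, 3)],
}

def priority(task_ratings):
    """Average each dimension across raters, then sum the averages.

    Tasks that are frequent, difficult, AND important float to the
    top, which is where training effort pays off the most.
    """
    frequency = mean(r[0] for r in task_ratings)
    difficulty = mean(r[1] for r in task_ratings)
    importance = mean(r[2] for r in task_ratings)
    return frequency + difficulty + importance

# Rank tasks from highest to lowest priority score.
scores = {task: priority(r) for task, r in ratings.items()}
for task, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{score:5.2f}  {task}")
```

You could just as easily do this in a spreadsheet. The point is simply that averaging each dimension across a few star performers and ranking the results gives you a transparent, defensible shortlist of tasks to focus training on.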
All right, so learner analysis. This is a very important one too, and it's often overlooked. You can go to different levels of depth with it, but what it's really about is understanding our audience's wants and needs. Basically, we want to be able to empathize with them really, really well, so we can wrap that into the design decisions for the solutions we're designing and developing. Mostly this looks like interviews and surveys. We're talking to our people. We want to determine their preferences in relation to technology and delivery mechanism, and we also want to know where they're at with the task or topic at hand, their background knowledge. How comfortable are they with this content? What's their comfort with the tech? With the topic? What are their preferences for art style and modality? Do they really prefer videos? Have they taken e-learning before, and are they comfortable with those navigation patterns? Or have they never used the technology before? What are their language preferences? Are they bilingual? Are they very fluent, educated English speakers? How educated are your audiences, for that matter? How do they respond to an illustrated art style? What are their favorite websites? How do they already use technology? What are some of their favorite training experiences? Sorry, I'm rattling all of these off, but I hope you get the idea. We're asking questions so that when it does come time to do the actual instructional design, development, and graphic design, we can build something we're very confident our audience will be comfortable with and able to navigate with ease.

It will also help us make decisions around scaffolding. For example, if our audience isn't very comfortable with technology, or if they've never taken an e-learning experience before, we're probably going to want to devote more time to navigation instructions. Maybe we'll use bigger buttons, and instead of a hamburger icon like you'd find on websites, maybe we'll just use plain text that says "open menu." When you know more about your audience, you can design more for that person. That's probably clear by now, but the end result is a learning experience that's more tailored, appealing, and effective.

So get to know your people, right? A lot of the time we get stuck working with our subject matter experts or our clients, and we can complete entire projects without once hearing from someone in the actual audience we're supporting. And yeah, sometimes we ask for feedback, so we get some survey information, but that's not really what this is. We're not asking for specific feedback about a learning experience. We want to learn who that person is and what their daily life is like, and get a really strong sense of that, so our learning solution will hit the mark when it gets delivered.

Okay, so these last two are maybe not as popular; I don't see them talked about a lot. But I remember when I learned about instructional context analysis, and about job context or performance context analysis, which is the one after it, I thought, that makes so much sense, and I can see how skipping these can lead to some very big problems. Essentially, this is the question we're answering: what context will people be in when they engage with the learning experience? Where am I going to be?
Am I going to be in front of a desktop computer in an office building, clicking through an e-learning experience? Or am I going to be sitting in a conference room with five of my coworkers, listening to a facilitator? Think about when people are actually going through the learning experience you designed, whether that's face-to-face or e-learning: what is the context in that environment?

For face-to-face learning experiences, we want to look at that context and figure out: how many seats do we have available to us? Do we have enough seats for the number of people we need to accommodate? Do we have a projector if we need to show slides? Is there a computer there to even load up a PowerPoint? We basically want to see what we have available in this space so we can design accordingly. We don't want to design a PowerPoint for the facilitator if there isn't even a computer or a projector in the room. It just doesn't make sense, and that's how a lot of money gets wasted.

For e-learning, this is still very, very important, because we want to know what devices people are using. Are they going to be on cell phones or on computers? What's the screen size? Should we design a nice full-screen, widescreen experience, or do all of the monitors in the office have a vertical orientation? What's the processing power? What's the internet speed? We want to figure out whether it's okay to include very high-quality media that needs to be downloaded onto their computers, because if they don't have the hardware to run it, the training isn't even going to work, or it's going to be so frustrating that people just dip out of it.

Sound cards. There's a big example with this one. If we're going to include sound in our learning experience, we want to make sure the computers it runs on actually have sound cards. Maybe that's standard these days, but one of our professors in the master's program told us a story about a company that invested millions into a really slick, high-production-value, narrated e-learning course, and when it came time to deliver it, they found out that none of the computers in the office had sound cards. They would have had to re-outfit every single device in the space just to experience it. So you can see these things are important. Maybe we can take sound cards for granted now, but I hope this illustrates the point: we need to be careful with this context analysis and make sure we're designing for what's actually available in the spaces where people will be engaging with the training.

And then performance context is very similar, but instead of asking which context people will be in when they go through the training, we want to know what context they'll be in when they have to apply what they've learned. What is it like on the actual job? We want to make sure the solution we're designing works for that context. Maybe people have desktop computers back in the office where they go for training, but if they're working in the field, out of their cars and out of clients' houses or businesses, maybe all they have with them is a cell phone.
So if we design this nice widescreen learning experience that we want people to be able to access on the job, it's not going to be a very good time if they're trying to refer back to it on a mobile device. Similarly, maybe we think, oh, it would be great to design a job aid that lists all these important communication principles. But if the person on the job is constantly in front of clients, it might not make sense for them to pull out a paper job aid in the middle of a conversation. And if people are working from a computer on the job, we want to know the specs of that computer and what it can handle, so that any tools we design work with the technology our people actually have available.

This one should be clear, but again, let's make sure we have a very, very clear picture of how people are applying what they've learned. Not only do we want to make sure the solution works; we can also emulate this context in the learning experience. What are the common things that can go wrong on the job? How are people feeling while they're doing it? What resources do they have available? We can bring the answers to all of these questions into our learning experience. The more realistic we make it, and the more we prepare our learners for what the actual job site will be like, the more effective your e-learning or face-to-face learning projects will be.

So let us know if you have any questions about these ones. Have you heard of them before? I'm curious about that too. Do you use them? How good of an understanding do you have of the people you support, of the context they learn in, and of the context they work in? When we have a really clear picture of that, our learning design goes to a whole new level. And you will hear it from your audience; they'll tell you how happy they are with the relevant training and solutions you're designing.

So overall, proper analysis can really save our audience time. And they'll be happier, because the training they're going through is actually helpful, instead of something they just click Next through to get it over with as soon as possible. Proper analysis can also save the organizations we support countless hours and countless dollars. I really don't think it's an exaggeration to say that billions of dollars a year are wasted on skipped or improper analysis. There are so many programs out there that whole teams have poured months of effort into. Then we roll them out to 50,000 people, who get taken away from their work to go through them, and nothing happens or changes. Skipping analysis is, I would say, hands down the biggest cause of all that wasted money. And you're wasting your time and your audience's time.

So, you know, it may have sounded like a lot, and this video is probably getting a little long, but we can create templates to make this process easier and faster. I'm sure there are templates out there already, but maybe we can create some together.
So if you want me to create some freebie templates, or do a live event where we build them together, I'd totally be happy to do that. Again, the more we can conduct analysis and learn about our audiences, the better off we'll be as an industry. And we can do a lot more content about how to have these tough conversations with stakeholders who maybe don't see the value in it. Just let me know in the comments what kind of analysis content would be useful to you.

And if you've made it to this point and you don't know about action mapping, I think that would probably be the best next step for you, because Cathy Moore laid out a very streamlined approach to creating results as learning designers. Especially now that you've learned about these traditional analyses, you'll see how a lot of them are baked into Cathy's approach, just in a simplified, easier-to-implement form. So check out action mapping; I'll link that playlist in the description.

If you did enjoy this video, please go ahead and give it a thumbs up, just to let me know that me talking to you all for however long this has been isn't missing the mark completely. And I'll see you in the next video. Again, if you did make it this far, I really do appreciate you. Please subscribe, because there's a lot more content like this coming your way. Bye-bye.