Mastering E-Learning Quality: From Design to Execution in Testing
Conclude the series on enhancing e-learning quality by exploring best practices for executing test scripts, ensuring thorough testing, and avoiding common pitfalls.
QA Script - Managing the Testing Process
Added on 10/02/2024

Speaker 1: Welcome to SkillQ Advisor. I'm Jeff Harris. In this post, I conclude my three-part series on improving the quality of e-learning and avoiding mistakes that can lead to embarrassing and sometimes reputation-diminishing review cycles. In the previous two posts, I discussed the importance of quality, and I introduced you to the concept of adapting software test scripts to e-learning projects. I also explained how to design your test process and test script. Now, we'll move from design to execution, where we'll cover best practices for using your test script to structure and organize the testing process. Make sure to stick around until the end of this video, and I'll explain how to download a summary of this series that includes notes and a test script template. Before we get started, let's recap the processes and documents you should have ready before you begin testing. As we discussed previously, your design document is critical to the testing process. I also explained that good quality control requires well-established and documented standards. Last and certainly not least, you need a well-defined and specific test script. There's just one more very important thing you need, and that's a completed course. I know it seems obvious, but note that I say completed course. My point is that you don't want to begin the testing process with a course that you know has serious gaps and defects. Doing so will cost time and tempt your team to slip back into design mode. Of course, there is a judgment call to be made, most likely by the team leader or project manager, about the level of completion necessary before testing begins. I prefer that testing be completed before the project is submitted to the project sponsor or company leadership. You may be thinking, should testing be completed before sharing the course with the SME? My answer is, it depends. On some projects, the SME works as an integral part of the team and may even be asked to execute testing. 
On other projects, the SME is the project sponsor and works more as an advisor to the project. The project manager should make the call about exactly when in the process testing will begin. Okay, we've recapped the input documents and covered the issue of when testing should be completed. Now let's talk about who. If you read or listened to my previous post, you already know that the testing analyst should not be the developer who created the course. I also avoid using the instructional designer who designed the course. Instead of identifying a specific role that should execute the test, it may be better for me to list the ideal traits of the person acting as the testing analyst. A testing analyst is a person who is comfortable and competent at reading and interpreting design documents, familiar with project standards, detail-oriented, and focused even when dealing with tedious tasks. And did I mention detail-oriented? That's a really important one. Let's be honest, testing is not the most enjoyable part of the development process. It often gets assigned to the most junior member of the development team. This may be a mistake if the junior team member is not qualified for the task. Whoever you choose, make sure they have the right traits for the job and are trained on the overall testing and documentation processes. We now know when and who, so let's discuss what to test on. In other words, what is the appropriate testing platform? Previously, I used the term target deployment environment. Sounds pretty fancy, right? Not exactly. The test environment should reflect the low end of the computers used by learners. Ideally, you have already defined minimum specifications for your courseware. If so, refer to the minimum requirements document to determine the low end, and then use a machine and browser that resemble the minimum configuration. Ten years ago, testing on a low-end machine running Internet Explorer version 6 was all you needed.
In the modern computing environment, the problem is more complicated. Ideally, you need to determine the operating systems, browsers, and devices you will support and then test using those devices. If you don't have the resources to cover all possible platforms (and really, who does?), I suggest picking the most common desktop and mobile systems that your courseware supports. You may want to have your testing analyst use both a desktop and a mobile platform as they execute the script. Alternatively, you can complete a separate test cycle for each platform. A common mistake is testing on your development computers. In our environment, we generally develop with high-end Macs on a high-speed network. Our customers, however, generally use two- to three-year-old PCs. More than once, a developer has concluded that a design is working well only to learn that functionality is not working on the target platform. Therefore, we keep a variety of older PCs on hand for testing. Before we conclude our discussion of the testing environment, let's not forget to consider how the test course is hosted. A common mistake is to test the course outside of the hosting environment used by learners. For example, let's say you want to test a new course, but you don't want to bother going through the process of loading the course into your corporate LMS. You decide to conduct testing by publishing the course for the web, then loading the course onto a web server you control. Testing this way will certainly simplify the process, but it is not a true test of your course. To fully and completely test your course, you really have no choice but to host it in the same or a similar environment to the one that will be used for deployment. When you don't test on the target LMS, there are simply too many variables that may mislead you or invalidate your results. If testing on your target LMS is not possible before deployment, you should use an alternative LMS for your testing.
This approach does not always uncover the full range of issues you'll encounter when testing with your internal LMS. It will, however, be a closer approximation than hosting on a web server without an LMS. Okay, let's assume you followed all the steps and successfully completed your test script. What's next? I recommend that you hold a meeting with your team to review and discuss the results. Be sure to include representatives from instructional design, development, and media. Review the defects and come up with a plan for addressing the issues. Don't allow this meeting to become a platform for recrimination. There are a thousand reasons courses have defects, and many have nothing to do with the person who implemented the design. I find that many defects are the result of unclear design documents, incomplete standards, differing design interpretations, and, frankly, other process issues. This meeting is a great opportunity to identify where the process got off track and ensure corrections are made for the next course. We have reached the end of this series on improving the quality of your courses by planning, designing, and implementing a testing process. Quality assurance and testing are an often-forgotten but very important part of creating professional courseware and e-learning products. Thank you for joining me for this series and taking the first step to implement a quality assurance process or improve your current process. What do you use in your organization? Leave me a comment and let me know. If you enjoyed this video series, be sure to subscribe to the SkillQ Advisor newsletter, where my team and I share knowledge and insights on e-learning, media production, strategies, tools, and processes. The newsletter will also keep you up to date on new blog and video posts as well as live webinars and events.
If you subscribe to the newsletter, you'll also receive an email with a link to download a summary of this series that includes notes and a useful test script template to get you started. Until next time, have a great week and keep stretching your skills.
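As a concrete illustration of the platform advice in the transcript, the supported operating systems, browsers, and devices can be recorded as a small test matrix, with one test cycle generated per platform. This is a minimal sketch, not part of the original series; the specific platforms listed are illustrative assumptions, so substitute the systems your own courseware actually supports.

```python
# Minimal sketch of a platform test matrix, assuming you run a separate
# test cycle for each supported platform. The platforms below are
# illustrative assumptions, not a recommendation from the series.

PLATFORMS = [
    {"os": "Windows 10", "browser": "Edge", "form_factor": "desktop"},
    {"os": "Windows 10", "browser": "Chrome", "form_factor": "desktop"},
    {"os": "iOS 17", "browser": "Safari", "form_factor": "mobile"},
]

def plan_test_cycles(platforms):
    """Return one named test cycle per supported platform."""
    return [
        f"Cycle {i}: {p['os']} / {p['browser']} ({p['form_factor']})"
        for i, p in enumerate(platforms, start=1)
    ]

for cycle in plan_test_cycles(PLATFORMS):
    print(cycle)
```

Keeping the matrix explicit makes it easy for the project manager to see, at a glance, exactly which configurations a course has and has not been tested on.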
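The defect-review meeting described in the transcript is easier to keep constructive when each logged defect records a likely root cause (unclear design document, incomplete standards, and so on), so the discussion targets process fixes rather than people. Below is a hypothetical sketch of such a defect log; the field names and example defects are my own illustrations, not from the series.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Defect:
    defect_id: int
    description: str
    root_cause: str  # e.g. "unclear design document", "incomplete standards"

def cause_summary(defects):
    """Tally defects by root cause so the review can target process fixes."""
    return Counter(d.root_cause for d in defects)

# Hypothetical defect log from one test cycle.
log = [
    Defect(1, "Quiz feedback text truncated", "unclear design document"),
    Defect(2, "Heading font off-standard", "incomplete standards"),
    Defect(3, "Branch returns to wrong slide", "unclear design document"),
]
print(cause_summary(log).most_common(1))  # the most frequent process issue
```

A tally like this turns the review meeting's "where did the process get off track?" question into a quick read of the data instead of a debate.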
