The End-User Review: Questions to Ask
Even with a quality course that subject-matter experts, instructional designers, editors, and technical testers have approved, you still don’t know whether the students who actually take it will enjoy the course, be able to use it, or get what they need from it! That’s why one of the most critical elements of your test plan is the End-User Review.
Here are the most common questions I’m asked about end-user testing.
When do I test?
This is a two-part answer. For starters, the earlier in the process you can have end-users test, the better off you’ll be – preferably at the prototype stage. You’ll want to run the course through all the other testing first, so users won’t get caught up on typos and obvious functionality issues. Early testing will help you find issues that could affect how you design the rest of the course. It can also feed into the load testing mentioned in my post about technical testing.
The second part of this answer is that you could conduct end-user testing once the course is complete and before it goes live. This is often called a pilot.
Questions you might ask during these tests are:
- Do they like or enjoy the course?
- Do they understand the material?
- Can they meet the objectives?
- Do they find the material helpful?
- Can they operate the course?
Who should be part of the test group?
You’ll want test subjects who represent the full cross-section of your audience in terms of computer proficiency, age, cultural background, existing knowledge of the subject matter, role in the company, and any other variable that relates to your program.
How should I conduct the test?
In a perfect world, you would have a facilitator coordinate the tests, observe the students, and gather feedback as the tests happen. During the test, the facilitator should encourage each student to think aloud while taking the course, then conduct a final interview afterward to get more details and clarify comments.
Another option is sending a link to the course with a built-in survey. This can give more limited feedback (because you aren’t able to view the students and hear their in-the-moment comments), but can be easier for people – especially if your workforce is dispersed or works different schedules.
What do I do with the data I collect?
This is probably the most difficult question to answer because you will get differing opinions. For example, imagine you are reviewing the summary of the results and you see that about a third of the test subjects say the course was too slow and a third say it was too fast. The other third says it was just right. Really?!? What in the world do you do with that?
Review and consider all comments, and then make your best judgment. You’ll have comments that contradict each other. Some suggestions you’ll want to implement; others would be too expensive or time-consuming to make, some would result in a worse product, and some might seem unhelpful or off-topic. Try to look at each comment objectively and get additional information whenever possible.
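If you collect pacing feedback through a survey, even a quick tally can make a contradictory split like the one above visible at a glance. Here is a minimal sketch, assuming hypothetical answer labels like "too slow", "too fast", and "just right" (your survey tool and wording will differ):

```python
from collections import Counter

def summarize_pacing(responses):
    """Tally pacing feedback and return each answer's share of responses.

    `responses` is a list of answer strings; labels are illustrative,
    not tied to any particular survey tool.
    """
    counts = Counter(r.strip().lower() for r in responses)
    total = len(responses)
    return {answer: count / total for answer, count in counts.items()}

# Example: the split described above -- roughly a third in each camp.
feedback = ["too slow"] * 4 + ["too fast"] * 4 + ["just right"] * 4
print(summarize_pacing(feedback))
```

A breakdown like this won’t decide for you, but it shows whether a complaint comes from a lone voice or a meaningful share of your test group.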
Be sure to build time and money into your project plan for the re-work that will come from your testing, both at the prototype phase and during the testing of the final course. It’s nice to think that everything will work out as planned, but experience tends to prove otherwise.