Tips for Browser Testing Your E-Learning Courses
We recently undertook a rather large browser testing project. It involved testing 16 courses on a total of 9 browsers. Our schedule was aggressive: test everything in two weeks. If you do the math, that’s 144 passes through the various courses. It took some trial and error, but we built an efficient model that I’m certain we can use again in the future. If you find yourself in a similar situation, consider the following steps:
Gather a team.
This was clearly too much work for one or two people; we needed a small army. We started by asking our contractors to help out, then asked them for referrals, and we also asked our own friends and family. Our testers needed to be comfortable with a computer but didn’t need to be computer “experts,” so our pool of potential testers was fairly large. Complicating matters, though, was the fact that our testing took place during the holiday season. Because we had to train each tester on how the courses should work and answer all of their questions, we asked up front for a time commitment of at least 10 hours; we wanted to make sure that training investment had time to pay off. Another option would have been to contact a temp agency. We considered that route but were ultimately able to fill the slots ourselves.
Organize your tracking system.
We developed an online spreadsheet using Google Drive. One sheet served as a sign-out sheet where each tester entered his or her initials to indicate the course and browser being worked on. Letting testers choose their own assignments reduced the coordination needed on our end. The other sheets, one per course, were where the testers logged their feedback.
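If you'd rather generate the sign-out grid than build it by hand, a short script can produce one row per course/browser pair as a CSV you can import into Google Drive. This is only a sketch of the idea; the course and browser names below are placeholders, not the ones from our project.

```python
import csv
import itertools

# Placeholder names -- substitute your actual courses and browsers.
courses = [f"Course {n:02d}" for n in range(1, 17)]          # 16 courses
browsers = ["IE8", "IE9", "IE10", "IE11", "Firefox",
            "Chrome", "Safari", "iPad Safari", "Android"]    # 9 browsers

# One row per course/browser pair, with a blank "Tester" column
# that each tester fills in with their initials to claim that pass.
rows = [{"Course": c, "Browser": b, "Tester": "", "Status": "Open"}
        for c, b in itertools.product(courses, browsers)]

with open("signout_sheet.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["Course", "Browser", "Tester", "Status"])
    writer.writeheader()
    writer.writerows(rows)

print(len(rows))  # 16 courses x 9 browsers = 144 passes
```

Because every pass is a row, a quick filter on the blank "Tester" column shows exactly what work remains at any point in the project.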
Create a training video.
We made a training video in Storyline for our testers, walking them through the browser testing process from beginning to end and showing them what to look for and how to log their feedback. Although we could have provided a “live” training session via a tool like Adobe Connect, the video worked out even better. Not only did we avoid coordinating everyone’s schedules, but we also gave our testers a resource they could refer to over and over again. I know for a fact that several of our testers re-watched segments of the video as they began the testing process themselves.
Put it in writing.
In addition to providing our testers with a movie, we also provided them with an Excel spreadsheet that served as a “browser testing plan.” This step-by-step document not only served as a written reminder of each task that had been shown in the video but also gave testers an easy way to keep track of which tasks they had completed.
Make sure the “help desk” is open.
No matter how clear your instructions are, testers are bound to have technical problems, questions, or misunderstandings. On our first full day of browser testing, we had one person available to field all of these issues. Because it was important to keep the browser testing moving and to keep our testers from getting too frustrated with what was a new process for many of them, we were careful to answer questions quickly and to keep an upbeat tone. We also watched the Google Drive spreadsheet for possible misunderstandings, whether testers were filling out the spreadsheet incorrectly or misreading how the courses were supposed to function. Although this was a large initial time investment, it paid dividends almost immediately: browser testers quickly became self-sufficient and produced high-quality work.
What tips do you have for making the browser testing process more efficient and successful?