Sitting focused and fascinated in the meeting room, you realize that this is the best training you have ever attended. You can't wait to tell the trainer how much you appreciate her hard work and how useful the information will be. But at the end, she simply hands out a long, pre-coded evaluation form. You fill in your address and check boxes about what is running on your desktop. Bored with the chore, you answer a few of the questions and leave, perhaps a little disappointed. Why hasn't she asked for your reactions or suggestions?
Without a doubt, the trainer has missed an opportunity: the chance to learn trainees' opinions of content, the trainer, and the value of the event. Next quarter, when she needs to update her presentation, there will be little "eyewitness" input to consider.
The Needs of Many. Many companies miss their opportunity for insightful evaluations as a result of corporate politics rather than simple neglect. Many of my trainings involve multipartner efforts in which several large companies develop a solution (typically hardware, software, and networking) for a market or channel. With such large investments, all involved are anxious for data about the attendees. Each partner feels that they "must" have their questions represented on the evaluation form. As a result, the form can quickly become unwieldy, dominated by profiles and demographics. Multiple-choice questions are used to save space and ease tabulation. Little room, if any, is left for comments.
There are ways to reconcile the need for data with the need to evaluate content and delivery. Here are a few approaches:
Gather Once, Use Often. Take a hard look at your evaluation form. How much space is devoted to recording facts? Is this data also gathered during registration? Instead of asking for the same information twice, adopt a system that allows you to match registration data with the completed evaluation form. Attendee codes or confirmation numbers can be used. Better yet, use registration information to print personalized evaluation forms and distribute them at check-in. Have a few empty forms on hand for those who prefer to be anonymous or for late registrants.
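For teams that track registrations electronically, the "gather once" idea amounts to a simple join. The sketch below, in Python, assumes a hypothetical attendee code shared between the registration record and the evaluation form; the field names (`attendee_code`, `title`, `company`) are illustrative, not a real form schema.

```python
# Sketch: match registration records to completed evaluation forms by a
# shared attendee code, so the form never has to re-ask for facts.
# All field names here are illustrative assumptions.

registrations = [
    {"attendee_code": "A101", "name": "Dana", "title": "Network Engineer", "company": "Acme"},
    {"attendee_code": "A102", "name": "Lee", "title": "IT Manager", "company": "Globex"},
]

evaluations = [
    {"attendee_code": "A101", "content_rating": 5, "comments": "Great pacing."},
    # A102 preferred an anonymous form, so no code appears for that response.
]

def merge_registration_with_evaluations(registrations, evaluations):
    """Attach each evaluation to its registration record when codes match."""
    by_code = {r["attendee_code"]: r for r in registrations}
    merged = []
    for ev in evaluations:
        reg = by_code.get(ev.get("attendee_code"))
        # Anonymous forms simply carry no registration fields.
        merged.append({**(reg or {}), **ev})
    return merged

merged = merge_registration_with_evaluations(registrations, evaluations)
```

Note that anonymous or late-registrant forms pass through unmatched rather than being dropped, mirroring the "few empty forms on hand" advice above.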
Look at the remaining questions. Are there other ways to obtain this information? Every partner wants to know if the invitations are attracting the right audience. Again, use the registration lists. By analyzing the titles and companies, you can get a pretty accurate picture of the effectiveness of your efforts.
To the Point. I have found that only two quantitative questions are necessary to gather information "from the seats": a rating of the content and a rating of the presenter. If there is more than one major section or trainer, of course, add a question for each. With these two simple yet powerful measurements, you can get an instant picture of the quality of the course and the level of satisfaction with it.
Next, add qualitative or open-ended questions that will give texture to the quantitative ratings. Ask if the training met expectations, if attendees were satisfied, and if they would recommend it to others. In all cases, ask "Why?" and leave plenty of room for answers. If attendees feel that their opinions are valued (and the form is short), they will respond.
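For those tabulating by hand or spreadsheet today, the two-question approach reduces to a pair of averages plus the verbatim "Why?" answers. A minimal Python sketch, with illustrative (assumed) field names:

```python
# Sketch: summarize a short evaluation form with two quantitative
# questions (content, presenter) and one open-ended "why" question.
# Field names are illustrative assumptions, not a real form schema.

responses = [
    {"content": 5, "presenter": 4, "why": "Clear examples."},
    {"content": 4, "presenter": 5, "why": "Presenter answered every question."},
    {"content": 5, "presenter": 5, "why": ""},
]

def summarize(responses):
    """Average the two ratings and collect the non-empty comments."""
    n = len(responses)
    return {
        "content_avg": sum(r["content"] for r in responses) / n,
        "presenter_avg": sum(r["presenter"] for r in responses) / n,
        "comments": [r["why"] for r in responses if r["why"]],
    }

summary = summarize(responses)
```

The point of keeping the comments alongside the averages is exactly the argument that follows: the numbers say how satisfied attendees were, and the comments say why.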
Inevitably, someone will press for "statistical significance" in the analysis of each question, including the qualitative ones. Ask them, "For our decision making, how much precision is really needed?" Isn't a 95 percent satisfaction rating for both content and presenter, coupled with comments, more useful than a statistically precise rating of 94.25 percent with no understanding of why the rating was so high?
So make the most of each opportunity to understand your training event. At the end of the day, isn't that the real value in an evaluation form?