Essential to any training course is the survey at the end asking for feedback. Evaluations are key to instructional designers’ understanding of how well the learners received information, how well they will retain it, and whether they will recommend the course to colleagues.
However, evaluations are not always designed to show which part of the training team excelled and which fell short. And when a negative review comes in, it is only human to take it personally. Unfortunately, reacting out of fear and hurt rather than reason can lead to departments pointing fingers instead of sitting down to figure out how to actually improve the course.
The next time a trainer returns from the field with lackluster ratings for your course, follow this 5-step plan to move forward:
- Take a Break
Whenever you receive negative feedback, step away before you begin to process it. Even if you don’t feel you are reacting emotionally, go get a cup of coffee, take a walk, or set the feedback aside until the next day so that you can return to it with a clear head. You will want to address the negative aspects of the course eventually, but there is no need to respond immediately. The extra time will also help you come up with new ideas to improve the course design.
- Break It Down
The next step is to analyze the evaluations to pin down where exactly the course is going wrong. A good evaluation form will do some of this work for you, with separate questions about trainer performance, presentation style, and course materials. But that is not enough. Read between the lines in the free-form fields, and be honest about whether learners are griping about the content or the presentation. And take it all with a grain of salt: sometimes a cold room is enough for a learner to give low marks throughout the evaluation.
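If your evaluations are collected digitally, much of this breakdown can be scripted. Here is a minimal sketch of averaging ratings by category; the field names and sample responses are hypothetical, not from any particular survey tool:

```python
# Hypothetical evaluation records: each response rates three aspects
# of the course on a 1-5 scale, plus a free-form comment.
responses = [
    {"trainer": 4, "presentation": 4, "materials": 2, "comment": "Slides felt dated."},
    {"trainer": 5, "presentation": 3, "materials": 2, "comment": "Too much detail in module 3."},
    {"trainer": 4, "presentation": 4, "materials": 3, "comment": "Room was freezing."},
]

def average_by_category(responses, categories=("trainer", "presentation", "materials")):
    """Return the mean rating for each category, rounded to two decimals."""
    return {
        cat: round(sum(r[cat] for r in responses) / len(responses), 2)
        for cat in categories
    }

print(average_by_category(responses))
```

A "materials" average that lags well behind "trainer" points at the content rather than the delivery, which is exactly the distinction the next two steps dig into.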
- Look for Miscommunication
Next, review the preparation process to see what went wrong. Was your design actually weak, or was the trainer simply not familiar enough with the materials to deploy them properly? Track down any weak links in the communication chain between your team and the on-the-ground trainers to save yourself the headache of redoing a course.
- Design vs. Content
If the course really did fall flat because of a problem in instructional design, then do some further analysis to see which you need to redesign: the format of the course or the content. Do you need to add more interactivity or e-content options, or even just update the look of the trainer’s PowerPoint? Or do you need to update the substance of the course, such as going deeper into a subject, eliminating a section that is too detail-oriented, or adding a new section?
- Make a Plan
The last stage of dealing with a bad evaluation is to make a plan for moving forward. Whether it is improving communication with the trainers or revamping your design, make sure that once you have identified the problem, you have a solid outline of how to correct it.
This is also an opportunity to improve your evaluation system. If the evaluations weren’t easy to analyze, rewrite them to include questions that break down satisfaction by trainer presentation, class format, and class content.
Your evaluation forms may also be measuring the course experience rather than the course’s effectiveness. Your program is designed not only to keep learners interested all day but also to increase long-term retention and application of skills. Decide which of these outcomes matters more to you, and then examine whether the evaluation form actually measures it.
If you had trouble collecting enough evaluations to establish a reliable data set, talk with your team about incentivizing the forms. For example, you could enter everyone who submits an evaluation into a gift-card raffle, or tie course credit to a completed evaluation.
Finally, it may be time to look at the other learning metrics available to you. For print-based instructor-led training (ILT) courses, evaluations remain the only way to gather feedback from attendees. But a course that uses e-content offers many more data points. Depending on the distribution solution you use, you can monitor performance indicators such as:
- Who opened what content
- How often they opened that content
- How long they viewed that content
- What notes, highlights, or annotations each user made
Using these metrics, you can pinpoint exactly where learners are engaging with your content, see their questions in real time, and build a clearer picture of whether your design is fulfilling its potential.
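As a rough illustration of the kind of roll-up these platforms perform, raw view events could be summarized per content item like this. The log format, names, and numbers here are invented for the sketch, not taken from any specific distribution solution:

```python
from collections import defaultdict

# Hypothetical view-event log from an e-content platform:
# (user, content_item, seconds_viewed)
events = [
    ("alice", "module-1", 120),
    ("alice", "module-1", 45),
    ("bob", "module-1", 300),
    ("bob", "module-2", 30),
]

def engagement_by_item(events):
    """Summarize who opened each item, how often, and total viewing time."""
    summary = defaultdict(lambda: {"viewers": set(), "opens": 0, "seconds": 0})
    for user, item, seconds in events:
        entry = summary[item]
        entry["viewers"].add(user)   # who opened what content
        entry["opens"] += 1          # how often they opened it
        entry["seconds"] += seconds  # how long they viewed it
    return dict(summary)

report = engagement_by_item(events)
```

A report like this shows at a glance that "module-1" is drawing repeat visits while "module-2" is barely touched, the kind of signal an end-of-course survey alone cannot give you.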
To explore the subject in more depth, listen to Connie Malamed and Will Thalheimer, PhD, discuss performance-focused evaluations in this podcast.
At the end of the day, even the best instructional designer will get a poor evaluation on one or two of her courses. What makes you stand out is how you handle that evaluation.
How have you handled receiving a bad course evaluation?