This is adapted from Mimeo’s Spotlight Report: Training Measurement 2017.
Our State of L&D 2016 report revealed that only 9% of learning professionals use standard industry benchmarks, such as the Kirkpatrick model, to measure their training. In response, we wanted to better understand how corporate learning practitioners across industries measure the results of their work.
Jessica Coburn has overhauled her measurement strategy, moving from utilization metrics to a robust rubric that evaluates her team’s training at each level of the Kirkpatrick model.
Jessica joined Social Solutions four years ago as the second person on the training team. Social Solutions Global (SSG), developer of the Efforts to Outcomes (ETO®) and Apricot® software, is the leading provider of outcomes management software for human services. The software equips nonprofit and government agencies to drive performance by making data useful at all levels, from frontline staff to executive leadership. The training team was part of the customer support team until January 2016, when it became its own department.
Jessica has grown the team in both size (it is now a department of six) and function. The team spends its time creating content and developing courses, while Jessica focuses on developing metrics that ensure efficiency and accurately reflect the team’s success.
Jessica and her team deliver training to customers on how to use the two Social Solutions platforms. Using a blended approach, she and her team map out learning objectives to address the training needs for each user role within the software.
In this model, software administrators attend a “boot camp,” which is an intensive, instructor-led onboarding class conducted either on-site or online.
In addition, Jessica’s team produces “labs,” which are 60-minute courses on specific features or functions. Labs can be “refresher” training, training on advanced features, or training on new features following a new release. These are designed as interactive, instructor-led courses.
Jessica’s team includes an instructional designer, Lei Robinson, who crafted a self-paced e-learning library using Storyline. This library includes interactive lessons that guide learners through content in the form of animations, tutorials, and quizzes. The team is currently building a new microlearning library that end users can access as an in-the-moment reference while they learn how to use the software. For microlearning, Jessica encourages her team to break concepts down into the smallest possible segments.
The training team facilitates certification for each software product; each exam includes both practical and written portions.
They also respond to client requests for custom training, which is designed to teach users how to use their unique site and its setup.
Finally, they create the documentation that supports each product, such as user guides.
The Old Metrics:
As the training department has grown and evolved, so too has their measurement strategy.
When Jessica first joined, training was embedded in the Customer Support division, and metrics were focused on revenue generation. However, that measurement ran counter to the goal of the training department, which is to develop independent, self-sufficient software users. The metric also depends heavily on the number of customers acquired by the sales team, over which the training department has no influence.
As Social Solutions evolved, there was an organizational shift to focus on utilization. The training team was given a utilization target, measured as the ratio of customers who participated in training. However, utilization does not accurately represent the team’s achievements. For example, the training team focuses on developing courses during the “slow” months, which produces low utilization metrics despite a high level of productivity.
Throughout its history, the team has largely relied on customer satisfaction surveys to determine the success of each class. These surveys asked how likely the customer was to recommend the training on a scale of 1-10. Jessica pointed out that her trainers are gregarious, charming people, so they frequently receive high scores based on personality. Those scores don’t necessarily illuminate whether the customer actually learned how to use the Social Solutions software.
The New Metrics:
To develop a better measurement system, Jessica first identified the populations her team intends to serve, determining who benefits from effective training. The first group is the customers who participate in the training. The second is Social Solutions itself, specifically the Support and Professional Services teams: effective training makes customers savvier software users, which reduces the demand on those teams.
Next, Jessica questioned the definition of success. Success in Jessica’s department is when customers are more active participants during implementation so that they can complete a more appropriate configuration in the intended timeline. In addition, success is when customers are self-sufficient. They know the software and how to find answers to their questions.
Jessica then set about creating a measurement strategy that accurately depicts her team’s success in serving both customers and Social Solutions. Her new strategy addresses each level of the Kirkpatrick model:
- Level 1 – Grading Matrix for Each Learner
- Level 2 – Change in Learner Perception Towards the Software
- Level 3 – Customer Call Support Metrics
- Level 4 – Correlation Metrics such as Length of Implementation and Customer Retention