in this way, he would have asked questions about what results sales executives were trying to accomplish, and he would have known what metrics to build into his program plan. Ann Montgomery, a senior U.S. military officer at a military training development organization, explained how her team avoids this pitfall. “In our arena of developing computer-based training to be used around the world for aircraft maintenance, we face significant challenges to collect continuous feedback on the effectiveness of our training,” she said. “By ingraining continuous evaluation throughout our processes, both during development and post-product delivery, we can bridge that gap and ensure our customers’ expectations are met. Identifying the leading indicators to be monitored and evaluated early on enables us to ensure our training is satisfying the customer’s need by delivering effective and efficient training with quantifiable results.”

Don’t Rely on One Source for All Data

Some believe in the existence of a miracle survey that will provide all necessary training evaluation data. Don’t buy into it. For mission-critical programs, it is important to employ multiple evaluation methods and tools to create a credible chain of evidence showing that training improved job performance and contributed measurably to organizational results.

FIGURE 1: THE KIRKPATRICK MODEL

Level 4: RESULTS

The degree to which targeted program outcomes occur and contribute to the organization’s highest-level result.

Level 3: BEHAVIOR

The degree to which participants apply what they learned during training when they are back on the job.

Level 2: LEARNING

The degree to which participants acquire the intended knowledge, skills, attitude, confidence and commitment based on their participation in the training.

Level 1: REACTION

The degree to which participants find the training favorable, engaging and relevant to their jobs.

Source: Kirkpatrick Partners LLC

For less important programs, you will want to be equally careful about selecting the few evaluation items you require. Surveys, particularly those administered and tabulated electronically, are an efficient means of gathering data. However, response rates tend to be low and there is a limit to the types of information that can be gathered. It is so easy to disseminate these surveys that they are often launched after every program, no matter how large or small. Questions are not customized to the program or the need for data, and people quickly pick up on the garbage-in, garbage-out cycle. This creates survey fatigue and makes it less likely that you will gather meaningful data for any program.

For mission-critical programs in particular, gather both quantitative (numeric) and qualitative (descriptive) data. Open-ended survey questions can capture qualitative data to some degree, but adding another evaluation method provides better data. For example, a post-program survey could be administered and the results analyzed. If a particular trend is identified, a sampling of program participants could then be interviewed and asked open-ended questions on a key topic.

Donna Wilson, curriculum manager for NASA’s Academy of Program/Project and Engineering Leadership, or APPEL, explained how her team implements this approach. “We use a multidimensional blended approach and focus on the quality of our evaluation data versus quantity,” she said. “Using diverse methods like surveys and focus groups to collect data from multiple sources, we are able to capture more truth and provide a comprehensive picture of training impact across all four Kirkpatrick levels. Equally important, the recommendations we develop from those multiple sources are more credible than they would be if we relied on surveys alone. Using a blended approach means that when I’m reporting to our customers and stakeholders, I am confident I am providing them validated and actionable insights.”

A source of evaluation data that is often overlooked is formative data, or data collected during training. Build touch points into your training programs for facilitators to solicit feedback during delivery, and ask facilitators for their own feedback via a survey or interview after the program.

