How do we measure value creation from training? Part 2


In a previous article, we looked at the five levels of impact in an exhaustive training evaluation process, which reflect the key challenges involved in high-quality evaluation. We also looked at the limits evaluations run into because of non-measurable (intangible) factors. In this second article, we'll look at the different aspects of training evaluation represented in diagram form, the three key principles that will help you evaluate effectively, and a few tips for getting around the issue of intangibles.

Training evaluation at a glance

Evaluating training is no easy task. If you restrict yourself to sending out a satisfaction form, that won't be enough for a complete evaluation. Evaluation is a complex, ongoing process that involves a number of different people:


(Quality evaluation cycle, designed by Mickael Amorim, Florent Depoorter and Adrien Souci)

The process that leads to high-quality evaluation begins with senior management asking some key questions: What will allow us to say that our training strategy will be a success? What qualitative and quantitative indicators will we use to measure how successful it’s been in one year?

Then the evaluation cycle begins with the traditional satisfaction questionnaire, written by the training company or the training department.

Next, knowledge acquisition is measured a few weeks later, to assess how effectively knowledge and skills have been passed on to the learners. This involves measuring the gap between pre- and post-training skills levels.
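As a minimal sketch of this pre/post comparison (the names and scores below are hypothetical, not from the article), the average skill gain across a group of learners can be computed like this:

```python
# Hypothetical pre- and post-training assessment scores (0-100) per learner.
pre_scores = {"Alice": 55, "Bob": 62, "Chloe": 48}
post_scores = {"Alice": 78, "Bob": 70, "Chloe": 71}

def average_skill_gain(pre, post):
    """Average point gain between pre- and post-training assessments."""
    gains = [post[name] - pre[name] for name in pre]
    return sum(gains) / len(gains)

print(average_skill_gain(pre_scores, post_scores))  # 18.0 points on average
```

In practice the assessment instrument matters more than the arithmetic: the pre- and post-tests need to measure the same skills, at the same level of difficulty, for the gap to be meaningful.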

Now we measure how effectively the acquired knowledge has been applied in the workplace. This should take place over a period of time (minimum 2 months, according to Kirkpatrick (1998)) and aims to measure how the learner’s behavior has changed since he or she completed the training. This evaluation has to be carried out by someone who is able to observe such changes, so it’s often the learner’s line manager, guided by the HR team.

The outcome of these behavioral changes should be increased performance: Have they achieved their objectives? Has their productivity improved? Here again, it’s often the line manager who is best able to evaluate the performance of his or her subordinates. The company generally gets this type of information from annual appraisals.

Last but not least comes return on investment, which brings us back to the questions asked at the start: Are the indicators positive or negative? Are they satisfactory given the amount of money we invested?
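The financial side of that question is usually expressed with the classic Phillips-style ROI ratio: net monetary benefits attributed to the program, divided by its fully loaded costs. The figures below are hypothetical, purely for illustration:

```python
def training_roi_percent(monetary_benefits, total_costs):
    """ROI as a percentage: net benefits divided by fully loaded program costs."""
    return (monetary_benefits - total_costs) / total_costs * 100

# Hypothetical figures: 150,000 in benefits attributed to the training,
# against 100,000 in total costs (design, delivery, learners' time, etc.).
print(training_roi_percent(150_000, 100_000))  # 50.0 -> a 50% return
```

As the principles below stress, the hard part is not this division but deciding, before the training starts, which benefits will be counted and how they will be attributed to the program rather than to other factors.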

3 key principles for a successful evaluation cycle

  1. Evaluating training ROI involves more than simply calculating a financial ratio. It means assessing how much value has been created using key performance indicators (KPIs) that have been defined before the training begins. These KPIs can be results-oriented (quantitative: performance) or process-oriented (qualitative: human capital). (Dunberry & Péchard, 2007)
  2. Training evaluation is a cycle that involves a number of key people, both before and after the training takes place. Before it starts, the training strategy is defined with top management. Training should be seen as an HR lever aligned with corporate strategy: sharing the same vision allows you to set KPIs that are meaningful for senior management, HR, line managers, and staff/learners.
  3. Instead of evaluating “application in the workplace” all at once, treat it as a support period carried out with the line manager. Acting as a “coach”, he or she can:
  • Give the staff member opportunities to apply the content of the training.
  • Provide on-going feedback to help the staff member to develop the new skill over time. (Pottiez, 2013)

Intangibles have an impact on measurable factors

Even though certain things are hard to measure reliably, Phillips (2015) suggests a way around this when evaluating the impact of a training course. When dealing with an intangible, the idea is not to try to evaluate it at any cost, but instead to look for the impact it has on things that are measurable.

For instance, if a course is expected to improve wellbeing at work, it might be tempting to come up with a “before-and-after” questionnaire. But this wouldn’t be a viable way of evaluating the course because there are so many unrelated factors that can affect your analysis. Working conditions, relationships with colleagues, the equipment you use, the quality of what you eat for lunch, or even personal factors have an influence: don’t transportation problems or traffic jams affect your wellbeing?

In a nutshell, this type of analysis is influenced by so many factors that it’s hard to isolate the effect training has by itself. On the other hand, an intangible such as wellbeing at work has an impact on corporate performance, the employer brand, staff turnover, sick leave, and productivity… All these things produce measurable information that you can correlate with wellbeing at work—and thus with training initiatives or strategy.

To conclude, we can see that evaluating training is a long, complex process that involves a number of different people. A lot of careful thought is required to set up a true evaluation strategy that will allow you to obtain the resources you need and get everyone on board.

To find out more, download our free e-book

References/To find out more:

  • Dunberry, A. & Péchard, C. (2007). L’évaluation de la formation dans l’entreprise : état de la question et perspectives.
  • Gilibert, D. & Gillet, I. (2010). Revue des modèles en évaluation de formation : approches conceptuelles individuelles et sociales. Pratiques Psychologiques, Elsevier Masson, 16, pp. 217-238.
  • Kirkpatrick, D. L. (1998). Evaluating Training Programs.
  • Phillips, J. & Pulliam Phillips, P. (2015). Handbook of Training Evaluation and Measurement Methods.
  • Pottiez, J. (2013). L’évaluation de la formation. Paris: Dunod.
  • Wargnier, J. (undated). CrossKnowledge White Paper: Evaluating and Demonstrating the Value of Training.
