
Return on Investment (ROI) in Training

November 2010

Facilitated by

Faranani Facilitation Services (Pty) Ltd

The views expressed in this document are not necessarily those of Fasset.


TABLE OF CONTENTS

ACRONYMS AND ABBREVIATIONS

SECTION 1: UNDERSTANDING ROI
1.1 CONCEPTS FOR ROI
1.2 MODELS FOR ROI
1.3 TYPES AND LEVELS FOR ROI
1.4 PROCESSES AND PRINCIPLES FOR ROI

SECTION 2: APPLYING ROI METHODOLOGY
2.1 ROI IMPLEMENTATION METHODOLOGY
2.2 DEVELOPING A PROBLEM TREE FOR ROI
2.3 DEVELOPING A RESULTS CHAIN FOR ROI
2.4 DEVELOPING INDICATORS FOR ROI
2.5 DEVELOPING A MEASURING FRAMEWORK FOR ROI

SECTION 3: DATA COLLECTION AND ANALYSIS FOR ROI
3.1 BASIC RESEARCH DESIGN AND METHODOLOGY FOR ROI
3.2 DATA COLLECTION OPTIONS FOR ROI
3.3 DATA ANALYSIS OPTIONS FOR ROI
3.4 EVALUATION TECHNIQUES FOR ROI

SECTION 4: ROI CALCULATIONS
4.1 VARIOUS APPROACHES TO CALCULATING ROI
4.2 TOOLS AND TEMPLATES FOR CALCULATING ROI

SECTION 5: ROI IMPLEMENTATION CONCERNS AND MILESTONES
5.1 IMPLEMENTATION OF ROI – THE STEPS
5.2 CONCERNS AND CHALLENGES IN IMPLEMENTING ROI
5.3 IMPORTANT MILESTONES FOR ROI IMPLEMENTATION

ROI in Training (2010) 1


SECTION 6: ROI LESSONS FROM PRACTICE
6.1 CASE STUDIES RELATED TO ROI

REFERENCES

RECOMMENDED READING

HANDY RESOURCES

Useful links


Acronyms and Abbreviations

CEO Chief Executive Officer

CFO Chief Financial Officer

CIPP Context, Input, Process, Product

CIRO Context, Input, Reaction, Outcome

CRM Customer Relationship Management

EEO Equal Employment Opportunities

ERP Enterprise Resource Planning

GST General Systems Theory

HR Human Resources

ICT Information and Communication Technology

MBA Master of Business Administration

OEM Organisational Elements Model

OSHA Occupational Safety and Health Administration

PDSA Plan, Do, Study, Act

RBM Results Based Management

ROI Return on Investment

SDF Skills Development Facilitator

SOE Standard Operating Environment

UCLA University of California, Los Angeles (evaluation model)


Section 1: Understanding ROI

“The ROI methodology is one that is replicable and scalable so that it may be collected and reported in a cost-efficient and timely manner, and such that it is comparable and benchmarkable both within an organisation and external to the organisation.”

– Global Learning Alliance, 2004

In the agricultural and industrial economies, companies needed "hard" workers, but the New

Economy is putting a premium on "smart" workers. The explosion of knowledge and technology

and the shortage of skilled workers have spawned an abundance of books and articles on the

topics of lifelong learning, knowledge capital, and intellectual capital. Accountability is a key issue

for Human Resources as well as other business units. Consequently, the idea of being able to

calculate the return on investment (ROI) of training is enticing. An absolute number in a neat

package, a trainer's dream! In some cases, it can be obtained. In other cases, while seductive, it

may not be worth the effort.

The more money a company spends on employee training, the greater the concern that these

highly skilled people will leave and take their knowledge somewhere else. This results in a loss of

knowledge and a poor return on the organization's investment in training. However, research has

shown that training actually reduces turnover and absenteeism. Employees will stay where they

can grow and develop.

In many organisations, training is not a popular activity. Why? HR and training managers continuously have to justify the money spent on training, which is often seen as a waste of time, money and effort. The value added through training is questioned, and rightly so. Assumptions are made about the value training adds to overall organisational effectiveness. How do we know the value of training if we do not measure it? ROI in training is about measurement: measuring the return on the money invested in training. ROI is about accountability and responsibility for that particular investment in training.

The ultimate aim of any training program is to improve organisational performance that will add to

organisational effectiveness and profitability. In order to measure performance a person needs to

determine the monetary value of the performance in its current status. After the training

intervention has taken place, the learners' performance needs to be measured again, thus

determining if there was an improvement. Training is not a once-off event, but a continuous

process in achieving organisational effectiveness. Employees need to receive training on an

ongoing basis to be able to apply the learning acquired to their daily activities. The information


acquired to reinforce learning will be converted into applied knowledge once the specific task is performed more efficiently and effectively. Only then can the ROI be calculated through a cost-benefit analysis: determining the cost (investment in training) versus the benefit of the learning that has taken place, i.e. the benefit of learning as a result of training.

Legislation has an impact on the way organisations view training, the purpose of training, and the

value of training delivered. South Africa has been investing vast amounts of money in training.

More so over the past eight years, as a result of legislation in the form of the SAQA Act, the Skills Development Act and the Skills Development Levies Act. Training budgets are growing. Additional training and development programmes are being implemented as part of organisations' employment equity plans and workplace skills plans.

The King Report on Corporate Governance states clearly that accountability must also be

reported in the area of human capital management. Training is an essential component of human

capital management. The good news is that ROI can be determined through a scientific method.

ROI is the measure of the monetary benefits obtained by an organisation over a specified period

in return for a given investment in a learning programme. In other words, it is the extent to which

the benefits (outputs) of training exceed the costs (inputs). The cost of training is the money

invested in the training programme. The benefits of the training programme would be the

measurable output, i.e. the ability to answer more telephone calls, or manufacture more nuts and

bolts.
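As a worked illustration of this cost-versus-benefit definition, the arithmetic can be sketched in a few lines. The figures below are invented for the example, not taken from the text:

```python
# Hypothetical figures: a training programme costing R50 000 whose measurable
# output (e.g. extra telephone calls answered) is valued at R80 000.
training_cost = 50_000      # the investment in the training programme (rands)
training_benefit = 80_000   # monetary value of the measurable output (rands)

net_benefit = training_benefit - training_cost          # benefits minus costs
roi_percent = net_benefit / training_cost * 100         # ROI as a percentage
benefit_cost_ratio = training_benefit / training_cost   # BCR: outputs / inputs

print(f"Net benefit R{net_benefit}, ROI {roi_percent:.0f}%, "
      f"BCR {benefit_cost_ratio:.1f}")
```

With these assumed figures the programme returns 60% on the money invested, i.e. every rand spent brings back the rand plus 60 cents of benefit.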

Measuring ROI is not a one-person show. It is a powerful tool that enables training managers to prove the value of training, gain credibility for the value added, and contribute to achieving organisational effectiveness. It enables them to report to management in quantifiable terms, i.e. rands and cents! Measuring ROI is about accountability and taking responsibility for measuring the impact of training. Measurement is about becoming a strategic business partner that adds value and provides integrated business solutions. Measuring ROI can be done!

1.1 Concepts for ROI

If one looks at any facet of business the concept of 'return on investment' (ROI) is always a

relevant business topic. ROI can have many connotations depending upon the user’s perceptions

and motivations. In reality, ROI is really a measure of perceived value. Value can be different for

different stakeholders. Let's look at some examples:

A training coordinator arranges training for a group of participants. This person wants to know the satisfaction levels of the participants.


A course designer creates an e-learning module. This person wants to know if the

module did its job in transferring new knowledge or skill to the learner.

A business unit manager sends two employees to training. This person wants to know

the impact the training has made on the job.

A senior executive measures performance by the business objectives that drive the

company. This person wants to know the degree to which training has helped drive key

business results.

The finance group manager views benefit relative to cost on every decision. This person

would want to know the benefit to cost ratio, payback period and ROI percentage from

training.

Value is inherent in each of the aforementioned examples. So the first question one should ask

when contemplating an ROI solution is 'How does my user of this information define value?'

Having said that, there is a strong need to ensure a balanced approach to learning measurement, one that can address all stakeholders' perceptions of return on investment.

ROI as process

ROI measurement is the process of collecting and analysing performance data, and translating it into a measure of real financial benefit to the organisation. This benefit is then compared to the cost of creating it through training and measurement.

In many cases, ROI measurement can be linked to data collected and analysed for the purpose

of Training Needs Analysis (TNA). If detailed TNA studies are done prior to the training, the data

from these studies can be compared to the feedback and performance data acquired after the

training takes place. In addition, the TNA is likely to highlight the expected benefits and results

from the training. In this case, the change in performance may be more accurately determined.

ROI as perception

So, what actually is ROI on training? It can be considered to be a perception on the part of the

client of how valuable the training has been in achieving their perceived goals; and these

perceptions will vary depending on whom you talk to. For example:

The Board may see a big picture of how the training affects the company’s ability to

achieve its corporate goals

The finance department may be looking to see how training stacks up financially against

other ways to invest the company’s money, and whether the training, as carried out, was

financially more effective than alternative forms of development


The business unit manager may be solely concerned with the impact on performance and

productivity in achieving the goals of their department

The training and development manager may be concerned with the impact training programmes are having on the credibility and status of the training function within the company, and on its ability to secure investment in the future to drive further business performance enhancements

Benefit and cost aspects of ROI

It is also important to understand that ROI can be realised in different ways. IBM has published

research on ROI, which points out the following factors to take into consideration. ROI can be

realised as:

“Cost avoidance”, which reflects the reduction in costs and overheads in various forms,

but which does not necessarily “improve” the value of the individual’s efforts to the

business (for example, training may reduce the time taken to perform a task, or attending

an online class may avoid travel expenses)

“Business benefits”, which reflect what new and additional value (expressed as business improvements) an individual brings to their work, or to the performance of their department or team, by virtue of their learning (for example, learning may enable people to perform tasks they were unable to perform previously, and to provide valuable new services)

By combining the two, you can predict or calculate a total ROI on learning.
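Putting numbers to this, the two benefit streams can be combined as in the sketch below. The amounts are assumptions made for the example, not figures from the IBM research:

```python
# Illustrative amounts only; both benefit streams are assumed for the example.
cost_avoidance = 12_000    # e.g. travel expenses avoided by an online class
business_benefit = 30_000  # e.g. value of new services staff can now deliver
training_cost = 20_000     # total cost of the learning programme

total_benefit = cost_avoidance + business_benefit   # combine the two streams
total_roi_percent = (total_benefit - training_cost) / training_cost * 100

print(f"Total benefit {total_benefit}, total ROI {total_roi_percent:.0f}%")
```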

Why do we want to measure ROI?

ROI, and the evaluation of training, has always been an important topic in the computer industry, where it has always been difficult to convince customers, and your own product sales and marketing departments, that training is vital to the customer's success with your product. For the IT industry, training on new or updated software and systems is critical to successful implementation.

Most people believe this, and intuitively accept it. Training is usually built into plans and proposals

alongside software purchases and other services. On the other hand, training is also often

regarded as one of the aspects of a project that can be cut back or even removed from a project,

without causing the project to fail completely; we are all familiar with the attitude that training is

often the last item to be included in the project plan, and the first to be cut back when money is

tight.

“Show me the value”

Many corporations, and corporate executives, are now more demanding in wanting to see, before

approving expenditure, a financial value justification for what training brings to the business, and


also wanting the actual benefit to be measured after implementation. Demonstrating ROI has

become a major point of emphasis for corporate HR development executives, who are facing

increasing pressure to justify their expenditure on people development, in terms of financial

benefit to the business.

“ROI” is a measurement technique common to many projects and investments, and can be

applied to training. It is necessary to be able to estimate, before the project starts, the potential

and expected ROI (using previous experience wherever possible), as well as be able to measure

and report on the actual return after the project has finished.

Looked at another way, ROI modelling also helps to support the argument that serious

businesses cannot afford NOT to train; the cost of not training (measured in various ways) can be

much higher, and much more damaging, than the cost of training. For example, various studies in different situations and circumstances have found some interesting statistics:

Untrained users take up to six times longer to perform the same tasks.

Training enhances employee retention. A Louis Harris and Associates poll found that among employees who say their company offers poor or no training, 41% plan to leave within a year; of those who say their company offers excellent training, only 12% plan to leave.

Studies show that in-house training costs 73% more than outsourced training.

A four-year study by the American Society for Training and Development shows that firms that invest $1500 per employee in training, compared to those that spend $125, experience on average 24% higher gross profit margins and 218% higher income per employee!

Just a 2% increase in productivity has been shown to net a 100% return on investment in outsourced, instructor-led training.

These examples are all well and good, but these are generic and non-specific indicators; how

well do these apply to any specific customer situation, or to a specific industry? If a customer is

going to make a decision to invest in training, they may want more specific supporting evidence

than this; hence the need to be able to offer a pre-training ROI analysis to give the customer

some comfort that this might in fact work in their situation.

1.2 Models for ROI

The measuring of return on investment originated in the USA, where the American Society for Training and Development (ASTD) has been championing the process for decades. The groundwork was done by Donald Kirkpatrick, who designed an evaluation system that addresses evaluation at four levels: reaction, learning, behaviour and results. Reaction evaluation refers to the typical evaluation forms we complete after a training course, the so-called smile sheets in


which learners indicate the extent to which they enjoyed the course. The second level refers to

learning evaluation, in other words, the extent to which the learners have learned certain

concepts, principles, knowledge and skills. This type of evaluation is normally done by means of tests that assess the knowledge level of a learner. The third level of evaluation is

called behaviour evaluation, and this type of evaluation seeks to determine whether people can

practically apply their knowledge and skills. Here we ask ourselves the questions: Is there a

change in behaviour? Can we measure the application of skills in the workplace? The last level

according to Kirkpatrick is results evaluation. This level of evaluation attempts to measure the

results in terms of the impact of the training intervention on the organisation. For example, are

there more sales as a result of a sales training programme? Is there an increase in productivity as

a result of a training programme? What is evident from all these questions is that there must be

some form of pre- and post course measurement in order to be able to answer these questions.

Jack Phillips, a training management consultant, is now considered to be the main expert in the

field of ROI in America. He visited South Africa in 2002 to share his ideas about ROI with local

training and human resource managers.

More detail on the Kirkpatrick model….

Knowing there is a definite need to measure the impact of a large corporate cost like learning, it is fitting to have an industry-accepted model for doing so. This model has in fact been in existence since the 1950s, but continues to be accepted today, using technology and creativity to maximize its benefits for the modern corporation.

In 1959, Donald L. Kirkpatrick, author, PhD, consultant, past president of the ASTD and

KnowledgeAdvisors Advisory Board Member published a series of four articles called

"Techniques for Evaluating Training Programs." The articles described the four levels of

evaluation that he had formulated based on his work for his PhD dissertation at the University of

Wisconsin, Madison. Kirkpatrick later expanded this work into a book, Evaluating Training Programs: The Four Levels (2nd Edition, Berrett-Koehler Publishers, San Francisco, 1998), which was a source for the information on the following pages related to Levels One through Four.

Kirkpatrick's goal was to clarify what evaluation meant. The model clearly defined evaluation as

meaning "measuring changes in behavior that occur as a result of training programs." The model

itself is composed of four levels of training evaluation. A fifth level, ROI, has since been added. The fifth level was the brainchild of Jack J. Phillips, PhD, author, consultant and KnowledgeAdvisors advisory board member and strategic partner. The grid below and the subsequent commentary summarize Kirkpatrick's four levels and Phillips' fifth level.


This grid illustrates the Kirkpatrick structure: for each level, the evaluation type, a description, examples of evaluation tools and methods, and its relevance and practicability.

Level 1 – Reaction
Description: reaction evaluation is how the delegates felt about the training or learning experience.
Tools and methods: 'happy sheets' and feedback forms; verbal reaction; post-training surveys or questionnaires.
Relevance and practicability: quick and very easy to obtain; not expensive to gather or to analyse.

Level 2 – Learning
Description: learning evaluation is the measurement of the increase in knowledge – before and after.
Tools and methods: typically assessments or tests before and after the training; interview or observation can also be used.
Relevance and practicability: relatively simple to set up; clear-cut for quantifiable skills; less easy for complex learning.

Level 3 – Behaviour
Description: behaviour evaluation is the extent of applied learning back on the job – implementation.
Tools and methods: observation and interview over time are required to assess change, relevance of change, and sustainability of change.
Relevance and practicability: measurement of behaviour change typically requires the cooperation and skill of line managers.

Level 4 – Results
Description: results evaluation is the effect on the business or environment by the trainee.
Tools and methods: measures are already in place via normal management systems and reporting – the challenge is to relate them to the trainee.
Relevance and practicability: individually not difficult, unlike for the whole organisation; the process must attribute clear accountabilities.

Level One - Reaction

Per Kirkpatrick, "evaluating reaction is the same thing as measuring customer satisfaction. If

training is going to be effective, it is important that students react favorably to it."

The guidelines for Level One are as follows:

Determine what you want to find out

Design a form that will quantify the reactions

Encourage written comments and suggestions

Strive for 100% immediate response

Get honest responses

Develop acceptable standards

Measure reactions against standards, and take appropriate action

Communicate reactions as appropriate

The benefits to conducting Level One Evaluations are:

A proxy for customer satisfaction

Immediate and real-time feedback to an investment

A mechanism to measure and manage learning providers, instructors, courses, locations,

and learning methodologies

A way to control costs and strategically spend your budget dollars

If done properly, a way to gauge a perceived return on learning investment
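The Level One guidelines above ("quantify the reactions", "develop acceptable standards", "measure reactions against standards") can be sketched in a few lines of code. The ratings, class size and standard below are all made up for illustration:

```python
# Made-up 1-to-5 ratings from returned feedback forms ('happy sheets').
ratings = [4, 5, 3, 4, 5, 2, 4]
attendees = 8                 # strive for 100% immediate response
acceptable_standard = 3.5     # the standard the organisation has developed

average_reaction = sum(ratings) / len(ratings)
response_rate = len(ratings) / attendees
meets_standard = average_reaction >= acceptable_standard  # measure vs standard

print(f"Average {average_reaction:.2f} (standard {acceptable_standard}), "
      f"response rate {response_rate:.0%}, meets standard: {meets_standard}")
```

If the average falls below the standard, the guideline "take appropriate action" applies: review the course, the instructor, or the venue before the next delivery.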

Level Two - Learning


Level Two is a 'test' to determine if the learning transfer occurred. Per Kirkpatrick, "It is important

to measure learning because no change in behavior can be expected unless one or more of

these learning objectives have been accomplished. Measuring learning means determining one

or more of the following."

What knowledge was learned?

What skills were developed or improved?

What attitudes were changed?

The Guidelines for Level Two are as follows:

Use a control group, if practical

Evaluate knowledge, skills and/or attitudes both before and after the program

Use a 'test' to measure knowledge and attitudes

Strive for 100% response

Use the results to take corrective actions

The benefits to conducting Level Two Evaluations are:

Learner must demonstrate the learning transfer

Provides training managers with more conclusive evidence of training effectiveness

Level Three - Behaviour

Level Three evaluates the job impact of training. "What happens when trainees leave the

classroom and return to their jobs? How much transfer of knowledge, skill, and attitudes occurs?"

Kirkpatrick questions, "In other words, what change in job behavior occurred because people

attended a training program?"

The Guidelines for Level Three are as follows:

Use a control group, if practical

Allow time for behavior change to take place

Evaluate both before and after the program if practical

Survey or interview trainees, supervisors, subordinates and others who observe their

behavior

Strive for 100% response

Repeat the evaluation at appropriate times

The benefits to conducting Level Three evaluations are as follows:


An indication of the 'time to job impact'

An indication of the types of job impacts occurring (cost, quality, time, productivity)

Level Four - Results

Per Kirkpatrick, Level Four is "the most important step and perhaps the most difficult of all." Level

Four attempts to look at the business results that accrued because of the training.

The Guidelines for Level Four are as follows:

Use a control group if practical

Allow time for results to be achieved

Measure both before and after the program, if practical

Repeat the measurement at appropriate times

Consider costs versus benefits

Be satisfied with evidence if proof is not possible

The advantages to a Level Four evaluation are as follows:

Determine bottom line impact of training

Tie business objectives and goals to training

Level Five is not a Kirkpatrick step. Kirkpatrick alluded to ROI when he created Level Four, linking training results to business results. However, over time the need to measure the actual value impact of training became so important to corporations that a fifth level was added by Dr. Phillips.

Dr. Phillips outlines his approach to Level Five in his book Return on Investment in Training and Performance Improvement Programs (Butterworth-Heinemann, Woburn, MA, 1997). He has written extensively on the subject, publishing or editing dozens of books on

the topic of ROI.

The Guidelines for Level Five are as follows:

Use a control group, if practical

Allow time for results to be achieved

Determine the direct costs of the training


Measure productivity or performance before the training

Measure productivity or performance after the training

Measure the productivity or performance increase

Translate the increase into a dollar value benefit

Subtract the cost of training from the dollar value benefit

Calculate the ROI
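Taken together, the measurement steps above amount to a short calculation, where the net benefit is the dollar value benefit minus the direct training cost. A minimal sketch with hypothetical numbers (none of these figures come from the text):

```python
# Hypothetical figures for one programme, following the step list above.
units_before = 400            # productivity measured before the training
units_after = 460             # productivity measured after the training
value_per_unit = 250          # dollar value of each unit of output
direct_training_cost = 9_000  # direct costs of the training

increase = units_after - units_before          # performance increase
dollar_benefit = increase * value_per_unit     # translate into a dollar value
net_benefit = dollar_benefit - direct_training_cost
roi_percent = net_benefit / direct_training_cost * 100   # Level Five ROI

print(f"Benefit ${dollar_benefit}, net ${net_benefit}, ROI {roi_percent:.1f}%")
```

With a control group and before/after measurements (the earlier guidelines), the increase can more credibly be attributed to the training rather than to other factors.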

ROI calculations are being done by a few world-class training organizations. They help these

organizations:

Quantify the performance improvements

Quantify the dollar value benefits

Compute investment returns

Make informed decisions based on quantified benefits, returns, and percent return

comparisons between learning programs

Dr. Phillips has created an ROI Methodology, on which he conducts certifications and workshops, and has helped training organizations use the right tools to measure the ROI on organizational learning. The methodology is a comprehensive approach to training measurement. It begins with planning the project (referred to by Dr. Phillips as an Impact Study), then moves on to the tools and techniques to collect, analyze and finally report the data. The end result is not only a Level Five ROI but also measurements on Kirkpatrick's four levels, yielding a balanced scorecard approach to the measurement exercise.

While Kirkpatrick's model is not the only one of its type, for most industrial and commercial

applications it suffices; indeed most organizations would be absolutely thrilled if their training and

learning evaluation, and thereby their ongoing people-development, were planned and managed

according to Kirkpatrick's model.

For reference, should you be keen to look at more ideas, there are many to choose from...

Jack Phillips' Five Level ROI Model

Daniel Stufflebeam's CIPP Model (Context, Input, Process, Product)

Robert Stake's Responsive Evaluation Model

Robert Stake's Congruence-Contingency Model

Kaufman's Five Levels of Evaluation

CIRO (Context, Input, Reaction, Outcome)


PERT (Program Evaluation and Review Technique)

Alkin's UCLA Model

Michael Scriven's Goal-Free Evaluation Approach

Provus's Discrepancy Model

Eisner's Connoisseurship Evaluation Models

Illuminative Evaluation Model

Portraiture Model

and also the American Evaluation Association

A detailed model of ROI, according to this thinking and development process, is illustrated below:


Diagram illustrating the ROI model

Source: Measuring the Return on Investment in Training and Development Certification Materials, Jack J. Phillips, Ph.D., 2002


Another model that relates to ROI is based on the cost and complexity of the training provided. It is a 3x5 learning measurement model that captures a balanced scorecard of learning metrics, with approaches ranging from a low-cost, simple solution to a higher-cost, complex one. Each may be applicable for different needs, and each is explained briefly below.

Learner-Based

A measurement model that captures data from training participants at two distinct points during the learning process. The first point is directly after the learning intervention (Post Event), where the main measurement focus is on Kirkpatrick's Levels 1 and 2 to gauge satisfaction and learning effectiveness. Because there is a high response rate to these data instruments, it is also critical to capture indicators for the advanced levels of learning: Level 3 (Job Impact), Level 4 (Business Results) and Level 5 (ROI). These indicators in effect forecast or predict the future impact the training will have on the participant and the organization.

The second data collection point is a follow-up survey conducted some time after the participant has returned to the job. This survey is meant to true up the forecast indicators for Levels 3, 4 and 5 by gathering more realistic estimates now that the participant is applying the learning at work.

The approach is low cost if standard data collection instruments are used across all training, and technology and automation are used to capture, process and report the collected data. It can therefore be applied to all training, each time a participant takes a class, to yield continuous measurement.
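The two collection points can be combined programmatically. The sketch below assumes the survey answers are stored as simple mappings and that a follow-up answer, where present, supersedes the earlier forecast; both the field names and the supersede rule are illustrative assumptions, not part of the methodology described here:

```python
# Post-event forecasts of the advanced levels (Levels 3-5), later trued up
# by a follow-up survey. Field names and values are invented for illustration.
post_event = {"level3_job_impact": 0.70, "level4_business_results": 0.60,
              "level5_roi_estimate": 0.50}
follow_up = {"level3_job_impact": 0.55, "level4_business_results": 0.65}

# Where a follow-up estimate exists it replaces the forecast; otherwise the
# original post-event forecast is retained.
trued_up = {key: follow_up.get(key, forecast)
            for key, forecast in post_event.items()}
print(trued_up)
```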


Manager-Based

This method has the same data collection points as the learner-based solution but adds a manager-based dimension. The manager of the participant attending training is another important data source. Managers can be sent an evaluation instrument timed for when the participant receives a follow-up. The manager survey focuses on Levels 3, 4 and 5 of the Kirkpatrick and Phillips models, thereby gathering estimates of job impact, business results and ROI from the manager's perspective. The manager survey also asks 'support'-type questions to understand the on-the-job environment in which the participant applied the training.

Due to the increased effort it takes to conduct and analyze manager surveys, the cost and time to measure at this level are higher than for the Learner-Based approach. But with automation and technology to facilitate the dissemination, collection, processing and reporting of the data, the cost and time can be minimal. In principle it could then be used on a continuous basis for every training event a participant attends; more realistically, it will be used on a periodic basis for more strategic programs where manager data is more relevant.

Analyst-Based

This approach uses significantly more comprehensive post-event, follow-up and manager surveys, and it also uses other analytical tactics that go beyond surveying. For example, to measure Level 2 (learning effectiveness) analytically, a detailed test is designed and administered to participants. Due to the time commitment of conducting such a detailed data collection and analytical exercise, the Analyst-Based approach is only used for about 5% of all training programs in the organization. Typically these programs are the more strategic or visible ones and have the budget to afford a more costly and time-consuming measurement exercise.

Other ROI Models to be aware of:

Know that there are other models for describing ROI, such as:

Kaufman's Five Levels of Evaluation

The CIRO (context, input, reaction, and outcome) Approach

Stufflebeam's CIPP (context, input, process, and product) Model

Alkin's UCLA Model

Please note: it is not the intention of this document to provide an in-depth analysis of the various

models and their comparative merits. Suffice it to say that there are alternative ways to calculate

or describe ROI, and it is useful to know that they exist, and some of their characteristics. We

have selected the Phillips / Kirkpatrick model to work on, as it is well understood, and does lead

to specific financial numbers.


Stufflebeam – CIPP

Stufflebeam considers evaluation as the process of delineating, obtaining and providing useful

information for judging decision alternatives. The CIPP model of evaluation was developed by

Daniel Stufflebeam and colleagues in the 1960s, out of their experience of evaluating education

projects for the Ohio Public Schools District. Stufflebeam, formerly at Ohio State University, is

now Director of the Evaluation Centre, Western Michigan University, Kalamazoo, Michigan, USA.

CIPP is an acronym for Context, Input, Process and Product. This evaluation model requires the

evaluation of context, input, process and product in judging a programme’s value.

CIPP is a decision-focused approach to evaluation and emphasises the systematic provision of

information for programme management and operation. In this approach, information is seen as

most valuable when it helps programme managers to make better decisions, so evaluation

activities should be planned to coordinate with the decision needs of programme staff. Data

collection and reporting are then undertaken in order to promote more effective programme

management. Since programmes change as they are implemented, decision-makers' requirements will change, so the evaluation activities have to adapt to meet these changing needs, while ensuring continuity of focus where appropriate in order to trace development and performance over time.

The CIPP framework was developed as a means of linking evaluation with programme decision-

making. It aims to provide an analytic and rational basis for programme decision-making, based

on a cycle of planning, structuring, implementing and reviewing and revising decisions, each

examined through a different aspect of evaluation – context, input, process and product

evaluation. Stufflebeam viewed evaluation in terms of the types of decisions it served and

categorised it according to its functional role within a system of planned social change. The CIPP

model is an attempt to make evaluation directly relevant to the needs of decision-makers during

the different phases and activities of a programme.

In the CIPP approach, in order for an evaluation to be useful, it must address those questions

which key decision-makers are asking, and must address the questions in ways and language

that decision-makers will easily understand. The approach aims to involve the decision-makers in

the evaluation planning process as a way of increasing the likelihood of the evaluation findings

having relevance and being used. Stufflebeam thought that evaluation should be a process of

delineating, obtaining and providing useful information to decision-makers, with the overall goal of

programme or project improvement.


There are many different definitions of evaluation, but one which reflects the CIPP approach is

the following:

‘Programme evaluation is the systematic collection of information about the activities,

characteristics, and outcome of programmes for use by specific people to reduce

uncertainties, improve effectiveness, and make decisions with regard to what those

programmes are doing and affecting’

Stufflebeam sees evaluation’s purpose as

establishing and providing useful information for judging decision alternatives;

assisting an audience to judge and improve the worth of some educational programme or

object;

assisting the improvement of policies and programmes.

The four aspects of CIPP evaluation (context, input, process and product) assist a decision-maker

to answer four basic questions:

1. What should we do?

This involves collecting and analysing needs assessment data to determine goals, priorities

and objectives. For example, a context evaluation of a literacy program might involve an

analysis of the existing objectives of the literacy programme, literacy achievement test

scores, staff concerns (general and particular), literacy policies and plans and community

concerns, perceptions or attitudes and needs.

2. How should we do it?

This involves the steps and resources needed to meet the new goals and objectives and

might include identifying successful external programs and materials as well as gathering information.

3. Are we doing it as planned?

This provides decision-makers with information about how well the programme is being

implemented. By continuously monitoring the program, decision-makers learn such things as

how well it is following the plans and guidelines, conflicts arising, staff support and morale,

strengths and weaknesses of materials, delivery and budgeting problems.


4. Did the programme work?

By measuring the actual outcomes and comparing them to the anticipated outcomes, decision-

makers are better able to decide if the program should be continued, modified, or dropped

altogether. This is the essence of product evaluation.

The four aspects of evaluation in the CIPP model support different types of decisions and

questions – this is illustrated below in the table:

The CIPP model of evaluation

Aspect of evaluation   Type of decision         Kind of question answered

Context evaluation     Planning decisions       What should we do?

Input evaluation       Structuring decisions    How should we do it?

Process evaluation     Implementing decisions   Are we doing it as planned? And if not, why not?

Product evaluation     Recycling decisions      Did it work?

Kaufman's Five Levels of Evaluation

Kaufman promotes an assessment strategy called the Organizational Elements Model (OEM)

which involves four levels of analysis:

1. audit or cost-cost - based solely on inputs, looks at reductions of cost between old and

new resources

2. products - the building blocks of a product or service within an organization that

contribute to the overall product or service (e.g., automobile fenders)

3. outputs - products or services that are delivered to external clients

4. outcomes - the value of the outputs (the aggregated products or services) delivered to

external clients and their clients and ultimately to society

Since the introduction of Kaufman's four-level OEM model, many researchers have used it as a

viable framework for evaluation. Others, though, have found it restrictive and have attempted to

modify and/or add to it. Kaufman later added levels of impact that go beyond the traditional four-level, training-focused approach, which he felt did not adequately address substantive issues an organization faces. Such modification to the model resulted in the addition of a fifth


level, which assesses how the performance improvement program contributes to the good of

society in general as well as satisfying the client.

Kaufman's Five Levels of Evaluation

Level  Evaluation             Focus                                                             Suggested Levels*

5      Societal outcomes      Societal and client responsiveness, consequences and payoffs.     Mega

4      Organizational output  Organizational contributions and payoffs.                         Macro

3      Application            Individual and small group (products) utilization
                              within the organization.                                          Micro

2      Acquisition            Individual and small group mastery and competency.                Micro

1b     Reaction               Methods', means', and processes' acceptability and efficiency.    Process

1a     Enabling               Availability and quality of human, financial, and
                              physical resources input.                                         Input

*Based on Kaufman's Organizational Elements Model (1992, 1995)

1.3 Types and levels for ROI

Delivery and accountability – two terms that one hears mentioned with great frequency of late, and perhaps for good reason. For too long now many of our organisations and institutions have

failed dismally to deliver results in line with expectations and to hold themselves accountable for

promises made or lack of performance. As the spotlight on performance intensifies universally,

the contribution made by the human resource development or training function will not be

excluded from this scrutiny.

In its most elementary form, the training responsibility in an organisation is not a complex

process. All the trainer needs to do is adhere to the following framework:

1. Identify the current and future key business imperatives impacting the organisation

2. Assess employee capability to meet these requirements or needs in terms of Knowledge,

Skill and Attitude


3. Design a training process to equip the employees with the necessary competencies to

perform at the desired levels

4. Implement the process using the most appropriate resources and training and performance

management technologies

5. Evaluate the outcome in terms of business results, reaction of the delegates involved, and the

cost of the training technology utilised.

Considerable organisational resource can be wasted if (1) the training is not directly linked to the key business needs of the organisation, and (2) there are no meaningful measures in place to evaluate the efficacy, business impact or financial contribution of the training process, including a Cost:Benefit or ROI analysis if appropriate. Every training intervention needs to be evaluated, but not every programme requires the evaluation intensity of an ROI analysis. Only the high-cost, organisationally pervasive training programmes would warrant this depth of scrutiny. However,

adhering to a set of basic ROI principles when designing any training process will always help in

ensuring positive results.

Training, like any key organisational activity, must be performed with professional discipline and

requires executive management stewardship if the full organisational contribution is to be

realised.

A model that specifically focuses on the LEVELS of ROI evaluation is the CIRO model.

This is explained below:

CIRO (context, input, reaction, and outcome)

The CIRO four-level approach was developed by Warr, Bird and Rackham.

The four components of evaluation

Adopting the CIRO approach to evaluation gives employers a model to follow when conducting

training and development assessments. Employers should conduct their evaluation in the

following areas:

C - Context or environment within which the training took place

I - Inputs to the training event

R - Reactions to the training event

O - Outcomes

A key benefit of using the CIRO approach is that it ensures that all aspects of the training cycle

are covered.

Context

Evaluation here goes back to the reasons for the training or development event or strategy.


Employers should look at the methods used to decide on the original training or development

specification. Employers need to look at how the information was analysed and how the needs

were identified.

Inputs

Evaluation here looks at the planning and design processes, which led to the selection of trainers,

programmes, employees and materials. Determining the appropriateness and accuracy of the

inputs is crucial to the success of the training or development initiative. If, for example, the wrong

types of learners were chosen to attend a customer care National Vocational Qualification

programme this would be a waste of time and money for the organisation.

Reactions

Evaluation methods here should be appropriate to the nature of the training undertaken.

Employers may want to measure the reaction from learners to the training and to assess the

relevance of the training course to the learner’s roles. Indeed assessment might also look at the

content and presentation of the training event to evaluate its quality.

Outcomes

Employers may want to measure the levels at which the learning has been transferred to the

workplace. This is easier where the training is concerned with hard and specific skills - this would

be the case for a train driver or signal operator but is harder for softer and less quantifiable

competencies including behavioural skills. If performance is expected to change as a result of

training, then the evaluation needs to establish the initial performance level of the learner.

In addition to evaluating the context, inputs, reactions and outcomes to training and development,

employers must continuously measure the costs. A cost / benefit analysis is usually conducted

prior to committing to any training initiative. Costs must be monitored to ensure that they don't run over budget.

An alternative way to look at ROI is through the different levels of ROI analysis. This is explained below by the Alkin UCLA model:

Alkin's UCLA model

The UCLA Evaluation Model, developed by Alkin, parallels some aspects of the CIPP model (all but number 4 are alike). The Alkin model includes:

1. Systems assessment – Provide information about the state of the system.

2. Program planning – Assist in selection of particular programs to be effective in meeting

specific education needs.

3. Program implementation – To provide information about whether a program was introduced

to the appropriate group in the manner intended.


4. Program improvement – to provide information about how a program is functioning, whether

interim objectives are achieved, and whether unanticipated outcomes are appearing. (Not

similar to CIPP)

5. Program certification – To provide information about the value of the program and its potential for use elsewhere.

Alkin's view of evaluation – the process of ascertaining the decision areas of concern, selecting appropriate information, and collecting and analyzing information in order to report summary data useful to decision-makers in selecting among alternatives.

1.4 Processes and principles for ROI

ROI is based on the following assumptions and principles – ROI needs to deliver on each of

these in order to be successful:

Cost effective to measure

Resource efficient to measure

Provides indicators of all 5 levels of evaluation

Is scalable and replicable across all learning classes, courses, curriculum and programs

Is benchmarkable for both internal and external comparisons

Uses reasonable assumptions based on industry proven principles and methodologies

Provides valuable business intelligence to a variety of stakeholders

Provides quantitative, financial evidence of return on investment

The key is deriving a monetized benefit from training. The benefit can be derived by calculating it

for a specific result such as sales, quality, productivity or cycle time. However, it may also be

derived by linking it to the known monetary value that is placed on human capital: an employee's salary. Let's use an example: if one buys a computer for R3,000, the expectation is that the

company will get R3,000 of value out of the computer. The computer may help a salesperson

increase sales or help a plant floor operator increase quality but the goal is to improve the user’s

job performance through the technology. The expectation is that at least R3,000 will be of benefit

in exchange for paying a cost of R3,000 to acquire the computer.

Compare this analysis to a person, (i.e. human capital). If the fully loaded salary (wages, benefits,

and overtime) of a newly hired employee is R50,000, the organization paying that expense

expects R50,000 of value from the employee. This value could come from their contributions in

one or more key business objectives such as sales, quality, productivity, cycle time, customer


satisfaction etc. But, in general, the organization expects a return of at least R50,000 from the

employee.

Now, say in our computer example our IT department added a R500 upgrade to it. The upgrade

is intended to make the machine faster, more resistant to bugs, and more accurate in its

processing computations. The business result is more productive employees, higher quality and

reduced cycle-time for a user of the computer. The expectation is that the R500 spent on the

upgrade will result in at least R500 returned in various benefits.

Compare this analysis with training. We use training to upgrade our people just as we add

components to a computer to upgrade technology. Training and organizational development are

proven tools to add knowledge and skills to our workforce. So, if an employee goes to a R1,000

training event over a week long period, the goal is that the employee will leverage the training to

help achieve various business results back on the job. Such results include increased sales,

quality, customer satisfaction, productivity etc. The expectation is that the R1,000 spent on the training will result in at least R1,000 returned in various benefits.

Important to consider as part of the process of ROI are the factors that might inhibit ROI:

Inhibitors to ROI measurement

When proposing to carry out an ROI exercise, it is worthwhile bearing in mind that in some cases,

there may be reluctance, in a training or HRD group, to commit wholeheartedly. There are a

number of reasons why this might be the case:

Fear of a negative ROI, and the implications this might entail

Reluctance to commit the necessary budget to carrying out the exercise

The fact that this process takes time, and only really demonstrates results some months,

or even longer, after the completion of the training

One of the keys to gaining commitment has to be to allay these fears and concerns, by

demonstrating a process that can be carried out, and by taking a realistic approach to the costs

and overheads, and the expectations of the client. In the right situations, the insights into the

effectiveness of the training can provide important strategic benefits to the company such as:

Measuring contribution of HRD

Setting priorities

Focusing on results

Altering management’s appreciation and perceptions of training


By carrying out some pre-training ROI analysis, it should be possible to demonstrate that there is a likelihood of a positive ROI on training, and thus allay some of the fear, as well as

encourage the investment in the training.

Important process guidelines for a successful ROI implementation:

For an ROI study to be successfully implemented, it should be designed with the following in

mind:

The ROI measurement must be simple

The ROI process must be economical to implement

The assumptions, methodology and techniques must be credible, logical, methodical and

practical

The ROI process must be theoretically sound, without being over-complex

The ROI process must account for other factors, which can influence the measured

outcomes after training

The ROI process must be appropriate in the context of other HRD programs

The ROI process must be flexible enough to be applied pre and post training

The ROI process must be applicable with all types of data collected

The ROI process must include the costs of the training and measurement program


Section 2: Applying ROI Methodology

2.1 ROI implementation methodology

The ROI Process is a comprehensive process that provides a scorecard of six measures.

These measures represent input from various sources during different time frames. The

measures include:

1. Reaction and satisfaction

2. Learning

3. Application and implementation

4. Business impact

5. Return on investment

6. Intangible benefits

In addition, the ROI Process utilizes at least one technique to isolate the effects of the program

from other influences. This comprehensive measurement system requires success with many

issues and must become a routine part of the learning development cycle.
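One isolation technique described in the Phillips literature is participant estimation: participants estimate the share of an observed improvement that is due to the training, and that estimate is discounted by their confidence in it. A minimal sketch, with invented figures:

```python
# Isolating the training's contribution via participant estimation.
# All figures are invented for illustration.
total_improvement = 120000.0   # total annual benefit observed (rand)
share_from_training = 0.60     # participants attribute 60% to the training
confidence = 0.80              # participants are 80% confident in that estimate

# Discounting by both factors yields a conservative, training-only benefit
# to feed into the ROI calculation.
isolated_benefit = total_improvement * share_from_training * confidence
print(f"Benefit attributable to training: R{isolated_benefit:,.0f}")  # R57,600
```

Other isolation techniques (control groups, trend-line analysis) substitute for the two discount factors but serve the same purpose: crediting the programme with only its own share of the result.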

There are good reasons why return on investment is such a hot topic. Although the viewpoints

and explanations may vary, some things are very clear. First, in most organizations education

and training budgets have continued to grow year after year (Training, 1999). As expenditures

grow, accountability becomes a more critical issue. A growing budget creates a larger target for

internal critics, often prompting the development of an ROI process.

Second, Total Quality Management and Continuous Process Improvement have brought

increased attention to measurement issues. Today organizations measure processes and

outputs that were not previously measured, monitored, and reported. This measurement focus

has placed increased pressure on the education and training function to develop measures of

program success.

Third, the proliferation of new hardware and software has created a need for accountability with

technology. For years, the implementation of new technology has escaped accountability.

Organizations have literally been willing to buy new technology, not knowing if the application

would actually generate an appropriate return on investment. Project sponsors have been

burned by inappropriate and improperly designed technology implementations. Today, that

situation is changing, particularly in education and training. Administrators and executives who

fund new projects are asking for a process to demonstrate accountability, with a commitment to

measure the return on investment after initial implementation. This situation has caused more


interest in accountability for any project utilizing technology, particularly in the education and

training field.

Fourth, restructuring and reengineering initiatives and the threat of outsourcing has caused

education and training executives to focus more directly on bottom-line issues. Many education

and training processes have been reengineered so that programs are more closely aligned with

business needs, and maximum efficiencies are required in the training cycle. These change

processes have brought increased attention to evaluation issues and have resulted in measuring

the contribution of specific programs.

Fifth, the business management mindset of many current education and training managers

causes them to place more emphasis on economic issues within the function. Today’s education

and training manager is more aware of bottom-line issues in the organization and more

knowledgeable of operational and financial concerns. This new “enlightened” manager often

takes a business approach to education and training and ROI is a part of the strategy.

Sixth, there has been a persistent trend of accountability in organizations all over the globe.

Every support function is attempting to show its worth by capturing the value that it adds to the

organization. From the accountability perspective, the education and training function should be

no different from the other functions. It must show its contribution to the organization.

Seventh, top executives are now demanding return on investment calculations from departments

and functions where they were not previously required. For years, training and education

managers convinced top executives that training cannot be measured, at least to the monetary

contribution level. Yet, many of the executives are now aware that it can and is being measured

in many organizations, thanks in part to articles in publications aimed at top executives (William

and Mary Business Review, 1995). Due to this increased awareness, top executives are

subsequently demanding the same accountability from their training and education functions. In extreme cases, these functions are being asked to show the return on investment or face significant budget cuts (Gerber, 1994). Others are just being asked for results. The CEO of a

global telecommunications company recently described it this way: “For years we have evaluated

training with measures such as number of participants, number of programs, length of programs,

cost of programs, and content of programs. These are input-focused measures. Now, we must show what these programs are doing for our company and speak in terms that we can understand. We need output-focused measures" (private interview). These no-nonsense

comments are being repeated throughout major organizations.

The Balanced Scorecard

As discussed, ROI is really value to your stakeholder and can mean different things for different

stakeholders. Merely presenting a financial metric to training managers will not meet their measurement needs: they also need feedback on instructor performance, courseware quality and so on. Hence the need for an ROI scorecard with a balanced set of metrics that provides indicators on all five levels of learning, not just a financial ROI.

ROI in Training (2010) 29

What should these measures be? A suggestion is to have a small set of measures that are

comprised of data gathered in a consistent manner on a continual basis. If done, a scorecard with

such metrics can be generated in a real time manner for any learning event or combination of

events you choose.

Below are the key performance components that comprise the scorecard.

Level 1 - Satisfaction

Level 2 - Learning Effectiveness

Level 3 - Job Impact
    - Time to Job Impact
    - Barriers to Use
    - Post-Training Support

Level 4 - Business Results
    - Job Performance Change
    - Business Drivers Impacted by Training

Level 5 - Return on Investment
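One lightweight way to hold such a balanced set of metrics is a simple record per learning event, so a scorecard can be generated for any event or combination of events. This is a sketch only; the field names and the averaging approach are illustrative assumptions, not prescribed by the methodology itself:

```python
from dataclasses import dataclass

@dataclass
class ScorecardEntry:
    """Balanced scorecard metrics for one learning event (illustrative fields)."""
    event: str
    satisfaction: float            # Level 1, e.g. mean participant rating out of 5
    learning_effectiveness: float  # Level 2, e.g. mean post-test score (%)
    job_impact: float              # Level 3, estimated performance improvement (%)
    business_result: float         # Level 4, monetized benefit
    roi_percent: float             # Level 5, (benefit - cost) / cost * 100

def combined_scorecard(entries):
    """Average each metric across a chosen combination of events."""
    n = len(entries)
    return {
        "satisfaction": sum(e.satisfaction for e in entries) / n,
        "learning_effectiveness": sum(e.learning_effectiveness for e in entries) / n,
        "job_impact": sum(e.job_impact for e in entries) / n,
        "business_result": sum(e.business_result for e in entries) / n,
        "roi_percent": sum(e.roi_percent for e in entries) / n,
    }
```

Because the data is gathered in a consistent format for every event, a combined scorecard for any subset of events is just a call to `combined_scorecard`.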

The benefits of this implementation methodology for ROI are:

Measure and Improve Job Impact

It allows you to streamline the learning evaluation process, measure training performance, and

ultimately, improve job impact.

Drive Superior Business Results

Access to real-time learning and performance data gives organizations the ability to increase performance and drive superior business results.

Improve Return on Learning Investment


Because organizations can't manage what they don't measure, it is important to establish the

right performance measures for all key investments. Learning is without a doubt one of the most

important investments any company will make.

The global economy and the rapid advancement of technology have made today's workforce

more mobile than ever before. Increased competition in a worldwide marketplace has forced

companies to tighten their belts and find ways to value engineer everything, including learning. To

that end, today's world-class learning organizations are finding innovative ways to design and

deliver training better, faster and cheaper. These organizations are then monitoring the effects of

these changes through comprehensive measurement systems, which helps them improve their Return on Learning Investment.

Industry Benchmark Comparisons

By leveraging ROI results, you can capture data on a wide array of learning interventions and

provide extensive reporting capabilities to the organization. Many value-added services use such benchmarks to help the organization improve the performance of its learning operations.

Increase Shareholder Value

It helps organizations increase their shareholder value. By leveraging market leading models

such as the ROI Process, it provides thought leadership in the corporate learning industry.

Accelerate Adoption of E-Learning Programs

ROI captures learning performance data on many different learning modalities. Because learning

evaluation data is captured from online learning events in addition to traditional instructor-led

learning interventions, one can capture valuable data that helps corporations successfully adopt

and implement e-learning solutions.

Accountability on Training Dollars

Many corporate learning professionals have difficulties measuring their performance and

demonstrating value to senior management. Increasingly, corporate learning professionals are

being asked to justify budgets. ROI helps solve this problem by providing measurement data for

all training rands spent and helps training professionals determine what initiatives are working to

drive better business results.

Actionable Intelligence


ROI provides organizations with actionable intelligence. The data provided, and the comprehensive ways in which it is displayed, enable organizations to gauge quickly how effective learning is and to make decisions accordingly.

2.2 Developing a Problem Tree for ROI

In order to really apply ROI methodology insightfully, you as the practitioner need to

understand the context of the organisation within which you are applying it, and the specific

context (or problem) for which the training intervention was designed / implemented. In other

words – what was the training supposed to change or solve or make better in the

organisational environment.

The Problem Tree method is a planning method based on needs; however it is not a

mechanical translation of problems into objectives. While going through the process, taking

the different steps, there is continuously room for opportunities, new ideas and contributions

from the involved parties. Problem Tree Analysis should be followed by actual project

planning, e.g. with the Logical Framework approach. Alongside, or interwoven with the steps

of Problem Analysis (at target group level) and project planning (for the target group), one

should analyse the capacity and intentions of stakeholders and the wider institutional context,

so that relevant and realistic choices can be made on who does what.

The problem tree is a visual problem-analysis tool that can be used to specify and investigate

the causes and effects of a problem and to highlight the relationships between them. It is a


tool for the identification and analysis of the relevant causes of the main problems, which will

later form the basis for formulating solutions and objectives for the strategy. A discussion of

the causes can help to identify the segments of the organization that are most affected and

who should be specifically interested in participating in activities aimed at removing the

causes of the problem. Remember that each cause of the problem is also a problem in its

own right.

As the name implies, this tool resembles a tree. The roots of the tree, in the lower part of the

drawing, metaphorically represent the causes of the main problem. The tree trunk at the

centre of the drawing represents the main problem and the tree branches, on the upper side

of the drawing, provide a visual representation of the effects of the main problem.

The whole purpose of the problem tree is to define the main problems present in the

organisation in order to analyse and prioritise their causes as the first step towards effective

sustainable solutions. Probably the most important tool to keep in mind throughout this

process is a single question or rather a single word: 'WHY?' It is amazing how this short word

can generate unexpected insights, which greatly help in developing an effective solution or

strategy. Never be afraid of asking or wondering why something is happening, even if it

seems obvious.

An example of a problem tree:


2.3 Developing a results chain for ROI

The Results Chain – planning for and measuring impact of training interventions

Training is implemented in order to achieve certain changes or results in the work process. The impact of the training, when it is applied back at work, is the RESULTS that we are looking to measure in ROI. Learning how to create and populate a results chain is therefore important when planning and calculating the ROI of the training interventions that have been selected.

The Simple Results Chain

The results chain expresses the cause-and-effect relationship between a project and results. It’s

a way of representing the statement: “Projects cause results.”


We use the arrow to show causality; that is, to show that

something causes something else. So whenever you see the

arrow in the results chain, you can insert the word “cause” or

“lead to.” If you prefer, you can express it using “if / then”: If we

do this project, then we’ll see these results.

The Expanded Results Chain

Of course, if you’ve seen or worked with RBM frameworks, you know that things get a bit busier

than the simple results chain. Each of the two elements –

projects and results – in the chain above gets expanded, giving

us different components to consider. Even so, the results chain

remains a way of expressing the concept that projects cause

results.

Projects: Inputs and Activities

To begin, we expand the “projects” side of the results chain into two components: inputs and

activities.

Inputs are the resources that we will need to carry out the planned activities. They include things

like people, money, goods, materials, infrastructure, and technology.

Activities are the things that we do. Some familiar project activities include digging wells,

conducting training sessions, distributing seeds and tools, forming savings groups, and setting up

clinics.

In the results chain, and in RBM, projects include inputs and activities. According to the basic

principles of this management approach, the projects cause the results that are important.

Results: Outputs, Outcomes, and Impact

Now we expand the second half of the results chain: the results side. Results are divided into

three big categories called outputs, outcomes, and impact.

Outputs are the immediate results of activities. For example, the output of digging wells would be

the number of functioning wells in a community. Or the output of a training session would be the

number of trained individuals.


Outcomes are medium-term results caused by outputs. Examples of outcomes might be

households with access to clean drinking water or the percentage of trainees who start a

business or find a job.

The impact is the long-term, broad societal change that the outcomes lead to. For example,

improved longevity or decreased malnutrition might be the impact of a project.

Putting it all together

Our results chain now reads something like this:

Inputs lead to activities, which cause outputs, which cause outcomes, which cause the impact.

Another way of reading it would be:

If we have the inputs, then we can do the activities;

if we do the activities, then we will achieve the outputs;

if we achieve the outputs, then we will achieve the outcomes; and

if we achieve the outcomes, then we will contribute to the impact.
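Read as a chain of conditionals, the logic above can be sketched as an ordered sequence of stages, where each stage is only reached if the previous one is achieved. The stage names follow the text; everything else in this toy illustration is an assumption:

```python
# The results chain as an ordered sequence: each stage leads to the next.
RESULTS_CHAIN = ["inputs", "activities", "outputs", "outcomes", "impact"]

def furthest_stage_reached(achieved):
    """Walk the chain in order and stop at the first stage not achieved.

    `achieved` maps a stage name to True/False. Returns the last achieved
    stage, or None if even the inputs are missing.
    """
    last = None
    for stage in RESULTS_CHAIN:
        if not achieved.get(stage, False):
            break
        last = stage
    return last

# If we have the inputs and do the activities but the outputs are not
# achieved, the chain stops at "activities".
status = {"inputs": True, "activities": True, "outputs": False}
print(furthest_stage_reached(status))  # activities
```

The point of the sketch is the ordering: outcomes cannot be claimed unless the outputs that cause them were achieved first.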

Illustrated below are the basic components of a results chain and the indicator / measure

terminology that is used to measure impact through the results chain.


The basic components of a results chain:

The terminology used in a results chain:


2.4 Developing indicators for ROI

Indicators are measurable or tangible signs that something has been done or that something has

been achieved. In some studies, for example, an increased number of television aerials in a

community has been used as an indicator that the standard of living in that community has

improved. An indicator of community empowerment might be an increased frequency of

community members speaking at community meetings. If one were interested in the gender

impact of, for example, drilling a well in a village, then you could use “increased time for

involvement in development projects available to women” as an indicator. Common indicators for

something like overall health in a community are the infant / child / maternal mortality rate, the

birth rate, nutritional status and birth weights. You could also look at less direct indicators such

as the extent of immunisation, the extent of potable (drinkable) water available and so on.

Indicators are an essential part of a monitoring and evaluation system because they are what you

measure and / or monitor. Through the indicators you can ask and answer questions such as:

Who?

How many?

How often?

How much?

But you need to decide early on what your indicators are going to be so that you can begin

collecting the information immediately. You cannot use the number of television aerials in a

community as a sign of improved standard of living if you don’t know how many there were at the

beginning of the process.
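The point about baselines can be made concrete: an indicator is only meaningful as a change measured against a value recorded at the start. A small sketch, with invented aerial counts:

```python
def indicator_change(baseline, current):
    """Percentage change in an indicator relative to its baseline value."""
    if baseline == 0:
        raise ValueError("No baseline measurement - the change cannot be interpreted")
    return (current - baseline) / baseline * 100

# Invented example: 40 television aerials counted at the start of the
# process, 50 counted now.
print(indicator_change(40, 50))  # 25.0 -> a 25% increase on the baseline
```

Without the baseline of 40 there is no denominator, and the current count of 50 on its own says nothing about improvement.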

Some people argue that the problem with measuring indicators is that other variables (or factors)

may have impacted on them as well. Within a project it is possible to identify other variables and

take them into account. It is also important to note that, if nothing is changing, if there is no

improvement in the measurement of the key indicators identified, then your strategy is not

working and needs to be rethought.

Developing indicators

Step 1: Identify the problem situation you are trying to address. The following might be problems:

Economic situation (unemployment, low incomes etc)

Social situation (housing, health, education etc)

Cultural or religious situation (not using traditional languages, low attendance at religious

services etc)

Political or organisational situation (ineffective local government, faction fighting etc)

There will be other situations as well.


Step 2: Develop a vision for how you would like the problem areas to be / look. This will give you

impact indicators.

What will tell you that the vision has been achieved? What signs will you see that you can

measure that will “prove” that the vision has been achieved? For example, if your vision was that

the people in your community would be healthy, then you can use health indicators to measure

how well you are doing. Has the infant mortality rate gone down? Do fewer women die during

child-birth? Has the HIV / AIDS infection rate been reduced? If you can answer “yes” to these

questions then progress is being made.

Step 3: Develop a process vision for how you want things to be achieved. This will give you

process indicators.

If, for example, you want success to be achieved through community efforts and participation,

then your process vision might include things like community health workers from the community

trained and offering a competent service used by all; community organises clean-up events on a

regular basis, and so on.

Step 4: Develop indicators for effectiveness.

For example, if you believe that you can increase the secondary school pass rate by upgrading

teachers, then you need indicators that show you have been effective in upgrading the teachers

e.g. evidence from a survey in the schools, compared with a baseline survey.

Step 5: Develop indicators for your efficiency targets.

Here you can set indicators such as: planned workshops are run within the stated timeframe,

costs for workshops are kept to a maximum of R 2000 per participant, no more than 160 hours in

total of staff time to be spent on organising a conference; no complaints about conference

organisation etc.

With this framework in place, you are in a position to monitor and evaluate efficiency,

effectiveness and impact.

Some specific and popular ROI indicators, from a finance perspective, include the following:

Benefit to Cost Ratio

ROI Percentage

Payback Period

The benefit to cost ratio is probably the most relevant of the three. It is simply the monetized

benefit divided by the costs of the training. The costs should also be fully loaded for


conservatism. Typical costs need to include cost items such as needs assessment, design,

delivery, materials, overhead, evaluation, lost work time of participants and travel expenses of

participants. The benefit to cost ratio will then be a conservative view on the financial

ramifications of your training program. A ratio greater than 1 indicates a positive ROI, a ratio less than 1 a negative ROI, and a ratio equal to 1 break-even. So, for example, a benefit to cost ratio of 2.5 means the training program returned R2.50 for every rand spent on it.

Another financial ratio is the ROI percentage. This is the benefit less the cost, divided by the cost, expressed as a percentage. Although the ROI percentage is the more common measure in general finance, the benefit to cost ratio is the more typical measure in training because it is easier to interpret and is less likely to be compared with other ROI projects that are not human capital based.

Finally, there is the payback period. This is a time-based financial metric. It tells you how many months (or whatever time period you use) are required before you break even on the investment, after which the return is positive. It is good to provide time-based metrics to balance out your scorecard.
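All three metrics can be computed directly from the monetized benefits and the fully loaded costs. A minimal sketch; the figures are illustrative, not taken from the text:

```python
def benefit_cost_ratio(benefits, costs):
    """Monetized benefits divided by fully loaded costs."""
    return benefits / costs

def roi_percentage(benefits, costs):
    """(Benefits - costs) / costs, expressed as a percentage."""
    return (benefits - costs) / costs * 100

def payback_period_months(costs, monthly_benefit):
    """Months required before cumulative benefits cover the costs."""
    return costs / monthly_benefit

# Illustrative figures: R250 000 in monetized benefits against R100 000
# in fully loaded training costs, with benefits accruing evenly over a year.
benefits, costs = 250_000, 100_000
print(benefit_cost_ratio(benefits, costs))           # 2.5 -> benefit to cost ratio
print(roi_percentage(benefits, costs))               # 150.0 -> ROI percentage
print(payback_period_months(costs, benefits / 12))   # about 4.8 months to break even
```

Note how the same inputs give a ratio of 2.5 but an ROI percentage of 150%: the two measures carry the same information but invite different comparisons, which is why the ratio is often preferred for training.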

2.5 Developing a measuring framework for ROI

Assessment of training may be conducted at 5 different levels, as illustrated in the matrix in

previous sections. These include (1) Evaluating the reaction of participants, (2) Measuring the

learning that occurred, (3) Assessing the on the job behavior, (4) Identifying business results of

training, and (5) Calculating the return on investment (ROI).

A comprehensive evaluation of ROI only occurs at Level 5. Most organizations compute the value

of training using data collected at the other four levels. These computations are based on a

number of assumptions and estimates; the lower the level of evaluation, the greater the use of

estimates and assumptions. For instance, at Level 1, a trainee's action plan might indicate that he

/ she estimates a 10% increase in productivity as a result of the training. This productivity gain

could be adjusted by a confidence rating and then translated into increased revenue or profit.
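That adjustment is a simple multiplication: the trainee's estimate is discounted by his or her confidence rating before being converted to money. A sketch with invented figures:

```python
def adjusted_benefit(estimated_gain, confidence, annual_value):
    """Discount a self-reported gain by the trainee's confidence rating,
    then translate it into a monetary benefit.

    estimated_gain: e.g. 0.10 for an estimated 10% productivity increase
    confidence:     e.g. 0.70 if the trainee is 70% confident in the estimate
    annual_value:   monetary value of the output the gain applies to
    """
    return estimated_gain * confidence * annual_value

# Invented figures: a 10% estimated gain at 70% confidence, applied to
# R500 000 of annual output attributable to the trainee.
print(adjusted_benefit(0.10, 0.70, 500_000))  # about R35 000 of claimed benefit
```

This is the conservative convention: the lower the confidence, the smaller the benefit that may be claimed at this level of evaluation.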

A thorough ROI analysis is typically conducted for only 10-20% of all training programs. The

collection and analysis of data can be time consuming and expensive. Computing the ROI of

training at Level 5 involves these four steps: (1) isolating the effects of training, (2) converting

these effects (benefits) into monetary values, (3) calculating the costs of the training, and (4)

comparing the value of the effects to the incurred costs.


(1) Isolating the effects of training

To determine the ROI of training you must be able to measure the changes that occur as a result

of training. Consequently, you must know what the performance or level of knowledge was before

you began the training initiative. Pre-training data measurement might include frequency of

errors, labor hours per unit of production or service, dollars of scrap materials, number of returned

or defective products, volume of lost sales, absentee rates, turnover rates, or survey ratings

indicating customer dissatisfaction. Unfortunately, many companies do not maintain detailed

records, and therefore, the pre-training data is not available. Delaying training in order to

accumulate pre-training data may not be a wise decision.

An alternative to pre-training and post-training data comparisons is to use two sample groups of

employees. One test group receives the training while the control group does not. Productivity or

performance of the two groups is measured and compared. The challenge for HR managers is to

make sure that the two groups are similar except for the training variable. This requires

assessment and equal distribution of the skills in each group as well as a careful matching for

age, race, gender, educational level, years of experience, previous training, and other variables.
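With a matched control group, the effect attributable to training is estimated as the difference between the two groups' changes in performance over the same period. A minimal difference-in-differences sketch; the group figures are invented:

```python
def isolated_training_effect(trained_before, trained_after,
                             control_before, control_after):
    """Estimate the effect of training as the trained group's improvement
    minus the control group's improvement over the same period."""
    return (trained_after - trained_before) - (control_after - control_before)

# Invented example: units produced per labour hour, measured before and
# after training for the trained group and a matched untrained control group.
effect = isolated_training_effect(trained_before=20.0, trained_after=26.0,
                                  control_before=20.0, control_after=22.0)
print(effect)  # 4.0 units/hour attributable to training
```

Subtracting the control group's improvement removes the gains both groups would have made anyway (new equipment, seasonal demand, and so on), which is exactly the isolation problem this step addresses.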

Isolating the effects of training requires identification of all of the key factors that impact employee

performance and business outcomes. Focus groups, questionnaires, surveys, and observations

will facilitate collection of this data. Possible contributors include employees, senior managers,

supervisors, customers, vendors, and training and human resource development specialists.

(2) Converting the effects of training into monetary values

The effects, or benefits, of a training program should always be identified, quantified and converted to monetary values with input from management. Trainees' supervisors, department or division heads,

senior level executives or even the board of directors are in excellent positions to observe

changes in performance or impact on the bottom line. Their decisions and data may be far more

objective and credible than if the HR manager makes all of the decisions regarding the scope,

impact and duration of training benefits. Costs are known up front. Benefits may accrue slowly

over time. Accurately estimating the number of times a course will be used, the number of

employees that will be affected, the dollar impact of changes in quantity or quality, and the extent

to which training will affect results requires skill, insight, and clearly defined objectives.

Effects can be tangible or intangible and are frequently referred to as "hard data" and "soft data."

Hard data is quantitative, statistical, number oriented and easily translated into monetary benefits.

Soft data is qualitative and refers to intangible benefits that are subjective and thus are more

difficult to measure and translate into monetary benefits.


Samples of both kinds of data are listed below.

Sample "hard" data for determining the effects of training

Productivity measures (quantity or market value)

Quality measures (number of rejects or cost of rejects)

Materials costs (amount per unit of production or amount of waste or scrap)

Labor hours per unit of production

Labor costs per unit of production

Hours of "down time" due to equipment failure, etc

Absenteeism and tardiness rates

Turnover rate

Workers compensation claims - nature and number of injuries or illnesses, days of lost

work or "light duty" work

Number of grievances / legal claims / lawsuits

Time required to fill vacant positions

Time required to fill an order, respond to a telephone call, resolve a complaint, etc.

Number of sales or dollar value of sales per customer

Percent of market share

Customer satisfaction rating or index

Number of repeat customers

Number of accounts or dollar value of accounts more than 30, 60, 90 days past due

Sample "soft data" effects or benefits of training

Improved job satisfaction

Improved teamwork

Increased organizational commitment

Improved succession planning

Increased communication regarding career paths

More clearly defined promotion opportunities

(3) Calculating the cost of training


HR managers generally are able to establish the costs of a training program. When calculating

costs, remember to carefully consider all indirect costs, such as staff time, use of existing

materials, equipment, classrooms etc. The term "fully loaded costs" is sometimes used to

designate that the costs of a program include both direct and indirect costs.

Sample training costs:

Course Development - needs analysis, design, writing, illustrating, validating tests and

evaluation instruments

Wages and Salaries of HR staff, managers and employees involved in course design and

development

Wages and Salaries of instructor and any onsite support staff

Wages and Salaries of trainees (sometimes referred to as "seat time")

Wages for temporary or contract workers hired to maintain productivity / service during

regular employees' training time

Loss of revenue while trainees are involved in training activities

Payments to outside trainers, outside training companies, or consultants

Instructional Materials - printing, copying, or reproducing manuals, videos, computer

disks, and flip charts; pens, pencils, paper, etc.; and / or purchase of materials from

outside vendor

Equipment and Facilities - including all types of AV equipment, computer hardware and

software, classroom overhead or rental of training facilities or equipment

Administration - marketing, scheduling, registration, testing, documentation, copying,

collating, long distance phone calls, postage, (administrative staff time and costs

associated with administrative duties)

Logistics - lodging, meals, tips, refreshment breaks, and shipping costs

Travel time - to and from training

Tuition reimbursement
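Summing every direct and indirect item in the list above gives the fully loaded cost used in the ROI calculation. A sketch with invented figures; a real program would itemize each line from its own records:

```python
# Invented cost items (in rand), grouping the direct and indirect
# categories from the list above.
training_costs = {
    "course_development": 40_000,
    "instructor_salaries": 25_000,
    "trainee_seat_time": 60_000,
    "instructional_materials": 8_000,
    "equipment_and_facilities": 12_000,
    "administration": 5_000,
    "logistics_and_travel": 15_000,
}

fully_loaded_cost = sum(training_costs.values())
print(f"Fully loaded cost: R{fully_loaded_cost:,}")  # Fully loaded cost: R165,000
```

Note that trainee seat time is often the largest single item, which is why omitting indirect costs understates the denominator and overstates the ROI.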

(4) Comparing the value of effects to the incurred cost

It may not always be feasible to measure ROI. It is doubtful that companies are setting up web

sites and developing e-commerce initiatives after a careful and exhaustive analysis of their ROI.

These organizations realize that rapid changes are occurring, and they will lose future business if

they fail to adapt business methods to new technologies.


Similarly, if HR becomes focused on getting every cost down to a line item, and then comparing

the costs with the benefits, every line item could be debated. Are you able to clearly show that it

was the training that made the difference in a customer's satisfaction? Are you sure you will

provide training for (x) number of employees before this training program becomes obsolete? Are

you sure that the training is responsible for the improvement in employee performance rather

than a new manager or a change in the compensation plan? When did you decide to measure

training results, in 3 months or 6 months? What time frame is logical or reasonable for your

industry? What would happen to your ROI calculations if turnover increased or decreased?

Sometimes you may need to focus on the big picture, the changes that are occurring, rather than

a line-by-line item of cost benefit analysis or ROI.

New employee orientation and diversity training are examples of types of training where it may be

difficult to measure ROI. New employees have varied backgrounds. For some employees, this is

their first job, and they might find it difficult to absorb all of the information given to them. Other employees will have years of work experience and have attended several new employee

orientation programs. They already have a base of knowledge and are looking for the kind of

information that is specific to your company. Diversity training may also be an initial experience or

a review. It may be very difficult to hold all variables constant when you are trying to look at the

effect of training. However, few would argue that new employee orientation or diversity training

should be eliminated simply because you cannot easily demonstrate an ROI. Orientation programs

set the tone for the entire employer / employee relationship. Turnover rates or productivity

measurements of new employees using a test group and a control group could provide data for

estimating the value of orientation programs if implemented throughout the entire organization.


Section 3: Data Collection and Analysis for ROI

3.1 Basic research design and methodology for ROI

It is necessary to collect data at various levels to build up a proper picture of the influence of

learning (as per the Kirkpatrick model, or any other model). In order to determine which data

items should be collected, and when, you need to agree with the customer on your approach to the

questions as suggested below. These questions describe some of the potential areas where

returns can be achieved; it is down to you and the client to decide exactly which of these factors

apply in any given situation.

Beware that, sometimes, benefits can be hidden; for example, consider the business benefits of

“certification for regulatory compliance” in the ROI calculation; it may be that compliance is seen

as a pure overhead cost of doing business; but in this case, training can reduce the cost of

compliance (and the costs of non-compliance), and provide benefits in enabling new business

activities, or better quality of customer service resulting from the compliance-training etc.

Examples of such factors may be:

How can you quantify “competitive advantage” resulting from a more highly trained and

competent workforce, resulting in shareholder value?

o Competitive advantage is probably seen differently by every organisation; what

constitutes competitive advantage may perhaps be better seen as a compilation

of factors, such as quality, cost, customer satisfaction; but there may be

intangible elements such as “company image in the market” which amounts to

more than the sum of the individual components of competitive advantage.

How can you quantify the value of improvement in managers’ long term decision making

resulting from improved techniques in situational analysis and decision making?

o This may require some quite sophisticated analysis; maybe this can be

determined by offering simulation exercises, or role / game playing, and deriving

a score from that.

Note: you should try to estimate beforehand what a reasonable ROI is. Carrying out an expensive exercise to show a 300% ROI is great, unless comparable exercises for equivalent training in the client's industry typically show 750%! Better still, how does the client view 300%? Is this good in their circumstances, or is it lower than the ROI typically determined from their previous training?

ROI in Training (2010) 45


3.2 Data collection options for ROI

It is important to note that it is critical to gather data from ROI participants at no fewer than two points in time. The Post Event instrument gathers data immediately after a learning intervention. It is at this point that the participant estimates, via forecasting, their job performance, isolation and confidence ratings. This is important because much of the value of ROI lies in making it a predictive tool rather than a reactive or historic one. Forecasting ROI before time has passed can be a very valuable aid to business decisions, just as sales forecasts drive sales decisions and accounting forecasts drive accounting decisions.

The Follow-Up is a second exercise that re-collects data once the participant is back on the job and time has passed. Here the data on job performance, isolation and adjustment is no longer a prediction but a realistic estimate of what has actually occurred. This is critical for understanding reality: just as sales people review actual versus forecast, so too should training personnel compare Post Event data with Follow-Up data.
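The Post Event versus Follow-Up comparison can be sketched as a simple variance check, much as sales teams compare forecast with actual. All figures below are illustrative assumptions, not taken from the text:

```python
# Illustrative only: a forecast ROI captured immediately after training
# (Post Event) compared with the estimate made later, back on the job
# (Follow-Up). All figures are assumed.
forecast_roi_pct = 250.0   # participant's Post Event prediction (%)
followup_roi_pct = 190.0   # realistic estimate at Follow-Up (%)

variance_pct_points = followup_roi_pct - forecast_roi_pct
variance_of_forecast = variance_pct_points / forecast_roi_pct * 100

print(f"Variance: {variance_pct_points:+.1f} points "
      f"({variance_of_forecast:+.1f}% of forecast)")
```

Reviewing this gap over successive programmes shows how well participants' forecasts can be trusted as a predictive tool.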

Sources of data can include:

Organisational Performance Records, showing outputs and measurements taken as part

of the business’ normal reporting process

Testing and certification assessment records

Participant feedback

Instructor feedback

Feedback from participants’ supervisors / managers

Feedback from participants’ subordinates

Team / group peer feedback

Feedback from other internal or external groups (e.g. HR training departments)

Data Collection methods

There are various methods of collecting data; some of these, such as surveys and tests, can be automated to reduce the time and resources required for data collection:

Questionnaires, knowledge tests and certification programmes:

To capture subjective or objective information about participants’ performance changes

Comparison of pre-training and post-training assessments (“compare apples with

apples”)

Surveys


To capture subjective feedback on attitudes, beliefs and opinions

Comparison of pre-training and post-training assessments (“compare apples with

apples”)

On-the-job observation

To capture objective data on performance in carrying out specific task(s) in the normal

work environment

Interviews

To capture in depth subjective or objective information about participants’ performance

changes, and / or to capture subjective feedback on attitudes, beliefs and opinions

Focus groups

An extension to the Interview technique, involving in depth discussion as well as

individual feedback; especially valuable for subjective quality judgments covering a group

of participants

Action plans (or Performance contracts) and Program assignments

To measure the actual performance of a participant in carrying out a pre-determined

task(s), or to assess the participant’s understanding of what actions would be taken in

performing the task(s)

Performance data monitoring

To capture specific performance of the participants against measured performance

criteria

3.3 Data analysis options for ROI

Following the planning process, implementation begins. Data collection and analysis is central

to the ROI process. Both hard data, representing output, quality, costs, and time; and soft

data, including work habits, work climate, and attitudes, are collected. Data are usually

collected during two time frames. During the training process, Level 1 and Level 2 data are

collected. Following the training, Level 3 and Level 4 data are collected.

A variety of methods are used to collect the post-program data used in the ROI analysis and evaluation. It is important to note that the method selected to collect the data will in turn affect the way the data can be analysed, compared and integrated into the ROI calculation, so selection is very important:


Follow-up surveys are taken to determine the degree to which participants have utilized

various aspects of the program. Survey responses are often developed on a sliding

scale and usually represent attitudinal data. Surveys are useful in collecting Level 3 data.

Follow-up questionnaires are administered to uncover specific applications of education

and training. Participants provide responses to a variety of open-ended and forced

response questions. Questionnaires can be used to capture both Level 3 and 4 data.

On-the-job observation captures actual skill application and use. Observations are

particularly useful in customer-service training and are more effective when the observer

is either invisible or transparent. Observations are appropriate for Level 3 data.

Post-program interviews are conducted with participants to determine the extent to

which learning has been utilized on-the-job. Interviews allow for probing to uncover

specific applications and are appropriate with Level 3 data.

Focus groups are conducted to determine the degree to which a group of participants

have applied the training to job situations. Focus groups are appropriate with Level 3

data.

Program assignments are useful for simple short-term projects. Participants complete

the assignment on the job, utilizing skills or knowledge learned in the program.

Completed assignments can often contain both Level 3 and 4 data.

Action plans are developed in programs and are implemented on the job after the

program is completed. A follow-up of the plans provides evidence of program success.

Level 3 and 4 data can be collected with action plans.

Performance contracts are developed where the participant, the participant’s

supervisor, and the instructor all agree on specific outcomes from education and training.

Performance contracts are appropriate for both Level 3 and 4 data.

Programs are designed with a follow-up session which is utilized to capture evaluation

data as well as present additional learning material. In the follow-up session, participants

discuss their successes with the program. Follow-up sessions are appropriate for both

Level 3 and 4 data.

Performance monitoring is useful where various performance records and operational

data are examined for improvement. This method is particularly useful for Level 4 data.

The important challenge is to select the data collection method or methods appropriate for the

setting and the specific program, within the time and budget constraints of the organization.

Isolating the Effects of the Program

An often overlooked issue in most ROI evaluations is the process to isolate the effects of

education and training. In this step of the ROI process, specific strategies are explored that

determine the amount of output performance directly related to the program. This step is

essential because there are many factors that will influence performance data after education and


training programs are conducted. The specific techniques utilized at this step will pinpoint the

amount of improvement directly related to the program. The result is increased accuracy and

credibility of the ROI calculation. The following techniques have been utilized by organizations to

address this important issue:

A control group arrangement may be used to isolate impact. With this technique, one

group participates in the program while another, similar, group does not. The difference

in the performance of the two groups is attributed to the program. When properly set up

and implemented, the control group arrangement is the most effective way to isolate the

effects of education and training.

Trend lines are used to project the value of specific output variables, if the program had

not been undertaken. The projection is compared to the actual data after the program

and the difference represents the estimate of the impact. Under certain conditions this

strategy can be an accurate way to isolate the impact of education and training.

When mathematical relationships between input and output variables are known, a

forecasting model is used to isolate the effects of a program. With this approach, the

output variable is predicted using the forecasting model with the assumption that the

program is not conducted. The actual performance of the variable after the program is

then compared with the forecasted value to estimate the impact of education and training.

Participants estimate the amount of improvement related to education and training.

With this approach, participants are provided with the total amount of improvement, on a

pre-and post-program basis, and are asked to indicate the percent of the improvement

that is actually related to the program.

Supervisors of participants estimate the impact of education and training on the output

variables. With this approach, supervisors of participants are presented with the total

amount of improvement and are asked to indicate the percent related to the program.

Senior managers estimate the impact of education and training. In these cases,

managers provide an estimate or “adjustment” to reflect the portion of the improvement

related to the program. While perhaps inaccurate, there are some advantages of having

senior management involved in this process, such as senior management ownership of

the program.

Experts provide estimates of the impact of education and training on the performance

variable. Because the estimates are based on previous experience, the experts must be

familiar with the type of training and the specific situation.

In supervisory and management training, the subordinates of participants identify changes in the work climate which could influence the output variables. With this

approach, the subordinates of the supervisors receiving training determine if other

variables changed in the work climate that could have influenced output performance.


When feasible, other influencing factors are identified and their impact estimated or calculated, leaving the remaining unexplained improvement attributed to education and training. In this case, the influence of all the other factors is developed, and the program remains the one variable not accounted for in the analysis. The unexplained portion of the output is then attributed to the program.

In some situations, customers provide input on the extent to which training has influenced their decision to use a product or service. Although this strategy has limited

applications, it can be quite useful in customer service and sales training.

Collectively, these ten techniques provide a comprehensive set of tools to isolate the effects of

education and training.
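As a minimal numeric sketch of the first technique above (the control group arrangement), with assumed improvement figures:

```python
# Assumed figures: percentage improvement in an output measure for a
# trained group versus a similar untrained control group.
trained_group_improvement = 14.0   # % improvement after the program
control_group_improvement = 5.0    # % improvement with no training

# The difference between the two groups is attributed to the program.
isolated_effect = trained_group_improvement - control_group_improvement
print(f"Improvement attributed to training: {isolated_effect:.1f} percentage points")
```

The same subtraction logic underlies trend-line and forecasting approaches; only the source of the "no training" baseline differs.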

3.4 Evaluation techniques for ROI

Evaluation is different to monitoring and collecting data for the ROI analysis process. Evaluation

is understood as the end result of a process of action – this evaluation can take place at different

levels. Here we refer again to Kirkpatrick's four levels and use his model to illustrate the evaluation techniques that can be used at each level of intervention:

Kirkpatrick's four levels of training evaluation in detail

This grid illustrates the Kirkpatrick structure in detail, and particularly the modern-day interpretation of the Kirkpatrick learning evaluation model: its usage, implications, and examples of tools and methods. It uses the same format as the one above, but with more detail and explanation:

For each level, the grid covers the evaluation description and characteristics, examples of evaluation tools and methods, and the relevance and practicability of evaluating at that level.

1. Reaction

Description and characteristics: Reaction evaluation is how the delegates felt, and their personal reactions to the training or learning experience, for example:
- Did the trainees like and enjoy the training?
- Did they consider the training relevant?
- Was it a good use of their time?
- Did they like the venue, the style, timing, domestics, etc.?
- Level of participation.
- Ease and comfort of experience.
- Level of effort required to make the most of the learning.
- Perceived practicability and potential for applying the learning.

Tools and methods:
- Typically 'happy sheets'.
- Feedback forms based on subjective personal reaction to the training experience.
- Verbal reaction, which can be noted and analysed.
- Post-training surveys or questionnaires.
- Online evaluation or grading by delegates.
- Subsequent verbal or written reports given by delegates to managers back at their jobs.

Relevance and practicability:
- Can be done immediately the training ends.
- Very easy to obtain reaction feedback.
- Feedback is not expensive to gather or to analyse for groups.
- Important to know that people were not upset or disappointed.
- Important that people give a positive impression when relating their experience to others who might be deciding whether to experience the same.

2. Learning

Description and characteristics: Learning evaluation is the measurement of the increase in knowledge or intellectual capability from before to after the learning experience:
- Did the trainees learn what was intended to be taught?
- Did the trainees experience what was intended for them to experience?
- What is the extent of advancement or change in the trainees after the training, in the direction or area that was intended?

Tools and methods:
- Typically assessments or tests before and after the training.
- Interview or observation can be used before and after, although this is time-consuming and can be inconsistent.
- Methods of assessment need to be closely related to the aims of the learning.
- Measurement and analysis is possible and easy on a group scale.
- Reliable, clear scoring and measurements need to be established, so as to limit the risk of inconsistent assessment.
- Hard-copy, electronic, online or interview-style assessments are all possible.

Relevance and practicability:
- Relatively simple to set up, but more investment and thought required than reaction evaluation.
- Highly relevant and clear-cut for certain training, such as quantifiable or technical skills.
- Less easy for more complex learning, such as attitudinal development, which is famously difficult to assess.
- Cost escalates if systems are poorly designed, which increases the work required to measure and analyse.

3. Behaviour

Description and characteristics: Behaviour evaluation is the extent to which the trainees applied the learning and changed their behaviour; this can be assessed immediately and several months after the training, depending on the situation:
- Did the trainees put their learning into effect when back on the job?
- Were the relevant skills and knowledge used?
- Was there noticeable and measurable change in the activity and performance of the trainees when back in their roles?
- Was the change in behaviour and new level of knowledge sustained?
- Would the trainee be able to transfer their learning to another person?
- Is the trainee aware of their change in behaviour, knowledge and skill level?

Tools and methods:
- Observation and interview over time are required to assess change, relevance of change, and sustainability of change.
- Arbitrary snapshot assessments are not reliable because people change in different ways at different times.
- Assessments need to be subtle and ongoing, and then transferred to a suitable analysis tool.
- Assessments need to be designed to reduce the subjective judgement of the observer or interviewer, which is a variable factor that can affect the reliability and consistency of measurements.
- The opinion of the trainee, which is a relevant indicator, is also subjective and unreliable, and so needs to be measured in a consistent, defined way.
- 360-degree feedback is a useful method and need not be used before training, because respondents can make a judgement as to change after training, and this can be analysed for groups of respondents and trainees.
- Assessments can be designed around relevant performance scenarios, and specific key performance indicators or criteria.
- Online and electronic assessments are more difficult to incorporate; assessments tend to be more successful when integrated within existing management and coaching protocols.
- Self-assessment can be useful, using carefully designed criteria and measurements.

Relevance and practicability:
- Measurement of behaviour change is less easy to quantify and interpret than reaction and learning evaluation.
- Simple quick-response systems are unlikely to be adequate.
- Cooperation and skill of observers, typically line managers, are important factors, and difficult to control.
- Management and analysis of ongoing subtle assessments are difficult, and virtually impossible without a well-designed system from the beginning.
- Evaluation of implementation and application is an extremely important assessment; there is little point in a good reaction and a good increase in capability if nothing changes back in the job, therefore evaluation in this area is vital, albeit challenging.
- Behaviour change evaluation is possible given good support and involvement from line managers or trainees, so it is helpful to involve them from the start, and to identify benefits for them, which links to the level 4 evaluation below.

4. Results

Description and characteristics: Results evaluation is the effect on the business or environment resulting from the improved performance of the trainee: it is the acid test. Measures would typically be business or organisational key performance indicators, such as:
- Volumes, values, percentages, timescales, return on investment, and other quantifiable aspects of organisational performance, for instance: numbers of complaints, staff turnover, attrition, failures, wastage, non-compliance, quality ratings, achievement of standards and accreditations, growth, retention, etc.

Tools and methods:
- It is possible that many of these measures are already in place via normal management systems and reporting.
- The challenge is to identify which measures relate to the trainee's input and influence, and how.
- Therefore it is important to identify and agree accountability and relevance with the trainee at the start of the training, so they understand what is to be measured.
- This process overlays normal good management practice; it simply needs linking to the training input.
- Failure to link to training input type and timing will greatly reduce the ease with which results can be attributed to the training.
- For senior people particularly, annual appraisals and ongoing agreement of key business objectives are integral to measuring business results derived from training.

Relevance and practicability:
- Individually, results evaluation is not particularly difficult; across an entire organisation it becomes very much more challenging, not least because of the reliance on line management, and the frequency and scale of changing structures, responsibilities and roles, which complicates the process of attributing clear accountability.
- Also, external factors greatly affect organisational and business performance, which can cloud the true cause of good or poor results.


Section 4: ROI Calculations

4.1 Various approaches to calculating ROI

ROI is a key financial metric of the value of training investments and costs. It is a ratio of net

benefits to costs, expressed as a percentage. The formula can be expressed as:

[(monetary benefits – cost of the training) / cost of the training] x 100

If the ROI from a training programme is calculated at 355%, it means that for every rand spent, there has been a return of R3.55 in net benefit, after all costs are factored in. The exact form of that benefit depends on the objectives of the learning programme, e.g. better telephone technique and communication skills after training. Suppose the total cost of an employee to the organisation, including floor space, PC rental, etc., is R7 000 per month. If the employee answers 3 000 calls per month, the cost to the organisation is R2.33 per call. After training, the employee answers 3 180 calls per month, so the cost per call is now R2.20. The value of the training programme is then calculated as per the above formula to determine the monetary value of the benefit.
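The formula and the call-centre figures can be worked through as follows. The benefit and training-cost amounts at the end are illustrative assumptions, since the text does not state them:

```python
def roi_percent(monetary_benefits, training_cost):
    """ROI = [(monetary benefits - cost of the training) / cost of the training] x 100."""
    return (monetary_benefits - training_cost) / training_cost * 100

# Call-centre figures from the text: R7 000/month total employee cost.
cost_per_call_before = 7000 / 3000   # ~R2.33 per call before training
cost_per_call_after = 7000 / 3180    # ~R2.20 per call after training

# Illustrative assumption: the efficiency gain is valued at R4 550 over
# the evaluation period, against an assumed training cost of R1 000.
print(f"{roi_percent(4550, 1000):.0f}%")  # 355%, i.e. R3.55 net benefit per rand
```

Note that the ratio uses net benefit (benefits minus cost), which is why 355% corresponds to R3.55 over and above each rand invested.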

Some factors to take into account when calculating the returns on the investment

Return (Benefit) factors

Reaction and plan

o What is the immediate reaction as to whether the training / learning has been, or

is likely to be, useful

o How that will translate into immediate / short-term identification of how the

learning will be applied to the job

o What are the immediate expectations of how the learning will positively affect the

individual’s performance

Measuring Learning


o How has the learning affected the individual’s skills and knowledge level required

to perform effectively; what is the expected change in performance levels and

productivity?

o Has the learning positively or negatively affected the individual’s attitude to their

work?

Assessing application to the job

o How has the training actually changed the individual’s previous job performance

against measured performance standards and criteria?

o Has the individual developed an ability to perform new tasks?

o Has the individual been able to save costs which would have been incurred

without the training?

Business results (benefits and cost savings)

o What is the actual numerical change in the measured metrics for the specified

data items?

o What are the actual monetary values of the changes achieved?

o What other identifiable cost savings have resulted from the training methodology

used, compared to alternative methodologies (e.g. costs saved by using

elearning vs. classroom)?

Investment (Cost) factors

Some of the factors to take into consideration when calculating the total cost of a training

programme or course

Money spent on the training / learning

o Equipment upgrades (e.g. for an advanced e-learning package requiring

additional hardware and software, servers etc, or for additional equipment in a

classroom) ***

o External content development and maintenance resource (for training and testing / certification)

o Classroom costs (space costs, equipment rental, catering, materials, promotional

items, stationery)

o Cost of infrastructure overheads for elearning and virtual classroom facilities (a

share of the ongoing costs of having access to these facilities) ***

o Communications costs to reach internal and external audiences

promotional programmes

mailers and flyers

poster design and production

conference calls and meetings

PR


o Costs incurred in the evaluation exercise

Time and effort used

o Who

Training Project team

Training programme managers

Instructors

Senior management time and attention

Training administration, records administration

Learners / students

Learners’ managers (e.g. time spent contributing to the evaluation)

o What activities

Training Needs Analysis and Programme design

Materials development and testing

Accreditation design and development

Communications and training

Evaluation and ROI calculation

Both of these items are amenable to financial cost calculation. Note: Time and effort should take

into account both:

Actual salary and expenses of staff

Opportunity cost of not performing other tasks

*** Note, some items of infrastructure costs, such as implementations of Learning Management

Systems, may be so large as to make for great difficulty in allocating costs to a specific

programme. In these cases you should decide whether to:

allocate a “nominal share” of shared facilities or infrastructure cost to any particular

programme of learning

calculate an actual cost of shared facilities or infrastructure cost where it has been in

operation for long enough to make a reasonable calculation

allocate the full cost of this infrastructure, if it is implemented purely for the purpose of

supporting the project

In all these cases, the basis of calculation should be clearly stated and understood.
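The allocation choices above can be sketched numerically; every figure below is an assumed placeholder for a shared Learning Management System:

```python
# All figures are assumed, for illustration only.
lms_annual_cost = 600_000.0     # shared infrastructure cost per year
programmes_per_year = 40        # learning programmes using the LMS

# Option 1: allocate a nominal share of the shared facility per programme.
nominal_share = lms_annual_cost / programmes_per_year   # R15 000

# Option 3: if the LMS exists purely to support this one programme,
# the full cost is allocated to it instead.
full_allocation = lms_annual_cost

print(nominal_share, full_allocation)
```

Whichever option is chosen, the basis of calculation should be stated alongside the ROI result so the cost side can be audited.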


Steps for calculating ROI

Step 1: Create an ROI Measurement Plan

Achieving the ROI measurement requires that the ROI process is planned early, and that two

planning documents are prepared:

The Data Collection Plan

State the objectives of the training / learning

State the objectives of each phase of data collection at each evaluation Level

Identify any previously used metrics, values or methodologies used by the

client, and determine their suitability for the current exercise

Select the appropriate evaluation methods

Identify the audiences who will be surveyed for data

i. The learner

ii. The manager (focused on levels 3 and 4)

iii. The analyst (for analytically determined performance-change data based

on available performance measurement data)

Set the timing for the data collection

i. Pre-training data collection (benchmarking the current situation)

ii. Post-training data collection (comparative data (“apples to apples

comparison”))

1. immediately post-training (initial reaction and assessment)

focused on levels 1 and 2

2. at a later date when the effect of the training has had time to

make itself felt in the learner’s job performance – focused on

levels 3 and 4

Allocate responsibilities for data collection and analysis

The ROI Analysis Plan (This is a continuation of the Data Collection Plan, capturing

information on the key items needed to develop the actual ROI calculation.)

List Significant Data items (usually Level 4 or 3) to be collected

i. Benefit Factors

ii. Cost Factors

Methods to isolate effects of the learning / training from other influences

Methods to convert data to numerical values

Intangible benefits

Other influences

Communication targets


Step 2: Collect Data

As described above, data will have to be collected at various points in the training process, to be

able to carry out effective ROI calculations. The following notes may be helpful in planning your

data collection exercise.

1. Identify the purposes of the evaluation. State clearly what the evaluations are to

measure and what the goals of the training are intended to be. Be as specific as

possible about the goals of the training, and make sure these goals address the

performance enhancement, business improvement or cost savings expectations.

2. Select the evaluation instruments and methodology. Identify how the data will be

collected and analysed (see below for different methods of data collection)

3. Establish the timing for the data collection. Decide whether pre-training analysis is

required, or post training analysis, or both. (E.g. pre-training and multiple post-

training assessments may be necessary to effectively identify the skills changes in

Levels 2, 3 and 4.)

4. Carry out the data collection at the levels 1-4 indicated above

Step 3: Isolate the effects of training

Once the training program is complete, it is essential to identify which performance improvements

have resulted from the training, and which improvements are coincidental and may not be directly related to the training.

Tools to isolate the effects of training can include:

Control Groups; comparing the change in performance of a group which has undertaken

training, to the performance change of a similar, untrained group, can provide a factoring

value by which the total measured change can be adjusted.

Trend lines; these show the trends in performance change, which would have been

expected if the training had not taken place; these can be compared to actual

improvement measurements. Draw a trend line, derived from previous performance data,

and extend into the future; then compare this later with the actual data derived from the

results of post-training evaluation.

Mathematical forecasting. This may be appropriate where there are several factors to

consider in the change in performance (e.g. sales increase due to training, and an

increase in marketing expenditure). It may be possible to calculate the expected trend

due to the marketing intervention and calculate the difference as being due to the training

(see Phillips “ROI in Training and Performance Improvement Programs”).

Participants’ estimates of the impact.

Supervisors’ or managers’ assessments of the impact of training on measured

performance improvement.


Senior management estimates of the overall value of training programs on the business,

taking into account other factors they are aware of.
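The trend-line tool above can be sketched with assumed monthly output figures: fit a simple linear trend to pre-training data, project it forward, and treat the gap between projection and actual as the training effect.

```python
# Assumed monthly output for the six months before training.
pre_training = [100, 102, 104, 106, 108, 110]

# Simple linear trend: average month-on-month change over the period.
slope = (pre_training[-1] - pre_training[0]) / (len(pre_training) - 1)

# Projection for the first post-training month, had no training occurred.
projected = pre_training[-1] + slope   # 112.0

actual = 118                           # assumed measured post-training output
training_effect = actual - projected   # improvement attributed to training
print(training_effect)  # 6.0
```

A real exercise would use more data points and check that no other intervention (marketing spend, seasonality) explains the gap, as the mathematical forecasting bullet notes.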

Having isolated the effects of training, it may be necessary to adjust the numeric results of the

data collection exercise to reflect the fact that estimates may be “inflated”; on average, individuals

may believe that training has resulted in a 25% performance improvement; in practice it may be

less than that, as the group concerned may have a tendency to over-estimate the effects of the

training. Taking a conservative approach, and applying an adjusting factor, may help to reduce

bias in the survey data.
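One common conservative adjustment, associated with the Phillips ROI literature, discounts each participant's estimate both by the share they attribute to the training and by their confidence in that attribution. The figures here are assumed:

```python
def adjusted_benefit(improvement_value, attribution, confidence):
    """Discount an estimated improvement by the fraction attributed to
    training and by the estimator's confidence in that attribution."""
    return improvement_value * attribution * confidence

# Assumed: R10 000 total improvement, 25% attributed to the training,
# and the participant is 75% confident in that estimate.
print(adjusted_benefit(10_000, 0.25, 0.75))  # 1875.0
```

Using the discounted figure, rather than the raw estimate, errs on the side of understating the benefit and so protects the credibility of the final ROI number.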

Step 4: Convert data to monetary value

In this stage it is important to estimate the financial value of the various changes resulting from the training, and to identify the total costs incurred in implementing the training program.

There are various strategies for estimating the value of performance changes:

Output data converted to profit contribution or cost savings
o Direct costs saved
o Increased volumes of output produced
o Timeliness of output

Cost of quality calculated, and quality improvements converted to cost savings or increased profitability

Cost savings (salaries and overheads) from reductions in participants' time to complete projects

Internal or external experts may be able to estimate the value of the performance improvements gained

Participants or their supervisors / managers can estimate the cost savings or the value of increased productivity
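The first and third strategies above reduce to multiplying a measured change by a standard unit value. This sketch illustrates that arithmetic; the unit values and figures are hypothetical, not standard values from the text.

```python
# Hypothetical standard values an organisation might hold on record.
PROFIT_PER_UNIT = 25.0    # profit contribution per extra unit of output
HOURLY_COST = 180.0       # loaded salary + overhead per participant-hour

def value_of_extra_output(extra_units):
    # Output increase converted via its unit contribution to profit
    return extra_units * PROFIT_PER_UNIT

def value_of_time_saved(hours_saved, participants):
    # Time saved, valued at the participants' loaded labour cost
    return hours_saved * participants * HOURLY_COST

# 400 extra units produced, plus 2.5 hours saved by each of 30 participants
total = value_of_extra_output(400) + value_of_time_saved(2.5, 30)
print(total)  # 400*25 + 2.5*30*180 = 23500.0
```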

Having calculated the direct financial value of the performance enhancements, it is also necessary, wherever possible, to estimate the value of the more "intangible" benefits, such as:

Increased job satisfaction, and the benefits of increased staff retention and reduced recruitment costs
Increased organisational commitment
Improved teamwork
Improved customer service
Reduced problems and complaints
Reduced conflicts


To calculate the cost of the training program, ensure you include:

Cost of external training services purchased
Cost of training materials supplied
Cost of internal training staff involvement
Cost of facilities used for the program
Travel, accommodation and other incidental costs
Salaries and benefits of the participants
Administrative and overhead costs of the training function
The costs of carrying out the ROI study on the training program

Step 5: Calculate the ROI

ROI can be expressed in three different ways.

1. Benefit / Cost Ratio

BCR = Program Benefits / Program Costs

The BCR uses the total benefits and the total costs, and is expressed as a ratio (e.g. 10:1).

For example: if you gain a benefit of $1M in 12 months, and the cost of training is $250K for the same period, the BCR is 4:1.

2. ROI %

ROI (%) = (Net Program Benefits / Program Costs) x 100

In ROI, the costs are subtracted from the total benefits to produce the net benefits, which are then divided by the costs. This shows the net benefit to the company after the costs are covered.


For example: if you gain a benefit of $1M in 12 months, and the cost of training is $250K for the same period, the net benefit is $750K; divided by the costs of $250K (and multiplied by 100), this equates to a 300% ROI.

3. Break-even time

Break-even time in months = (Investment / Benefit) x period in months

For example: if you gain a benefit of $1M in 12 months, and the cost of training is $250K for the same period, the break-even time is $250K divided by $1M, times 12 months = 3 months.

Business benefits can last more than one year but, in most cases, the effect of training is harder to assess over longer periods, so benefits are typically calculated over one year. Longer-term programs can be measured over multiple years.
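The three expressions can be reproduced with a short calculation. This sketch uses the worked-example figures from the text ($1M benefit and $250K cost over 12 months); the function names are illustrative, not part of the methodology.

```python
def benefit_cost_ratio(benefits, costs):
    # BCR = Program Benefits / Program Costs (e.g. 4.0 means a 4:1 ratio)
    return benefits / costs

def roi_percent(benefits, costs):
    # ROI (%) = (Net Program Benefits / Program Costs) x 100
    return (benefits - costs) / costs * 100

def break_even_months(costs, benefits, period_months):
    # Break-even time = (Investment / Benefit) x period in months
    return costs / benefits * period_months

benefits, costs, months = 1_000_000, 250_000, 12
print(benefit_cost_ratio(benefits, costs))          # 4.0  -> a 4:1 BCR
print(roi_percent(benefits, costs))                 # 300.0 (%)
print(break_even_months(costs, benefits, months))   # 3.0 months
```

Note how BCR and ROI % convey the same underlying figures: a BCR of 4:1 always corresponds to an ROI of 300%, because the ROI formula subtracts the costs before dividing.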

Calculating ROI requires that business results data be converted to monetary benefits. It is important to be able to allocate financial value to results such as:

Improved productivity
o Time saved
o Output increased
Enhanced quality
Reduced employee turnover
Decreased absenteeism
Improved customer satisfaction

ROI calculations can be used in two ways:

To identify the returns on a specific training programme, and compare these to expected results

To examine alternative approaches, and their respective expected costs and benefits, e.g.:
o blended vs. purely classroom
o multiple-course programme vs. single course


o limited scope vs. extended scope (e.g. addressing a few goals, or addressing a larger number of goals)

The first use is a relatively straightforward analysis of costs and benefits, using the models and tools provided.

The second use can also be accomplished with the same tools. In this case, use the tools to produce a number of different views of the ROI data for the different approaches, and compare the results by applying the estimated data to the final ROI calculations.

4.2 Tools and templates for calculating ROI

Phillips' guiding principles include elements of what he refers to as estimation, isolation and adjustment. These are the cornerstones of monetizing a benefit (the numerator in our ROI equation) and linking it to training.

Estimation is a process commonly used in business today: sales people estimate their future sales, and accounting staff estimate the cost of a warranty or claim expected in the future. So too can training personnel ask participants to estimate the job performance impact a training program will have. Participant estimation, as it is commonly referred to, does not estimate the performance change related solely to training; it asks participants to estimate job performance changes in general, arising from training among other factors.

For example, someone who attends sales training might estimate an increase in job performance, but that increase could be related to other factors, such as a competitor going out of business, that lift sales performance more than the training does. Estimates of performance change therefore need to take many factors into account, not just training: process changes, people changes, marketplace changes, technology changes and, of course, training itself.

When estimating the increase, participants should think carefully about all the factors mentioned. They may want to review historical and forecast data in order to weigh these factors reasonably in their overall estimate of performance change.

Logically, the training department is keenly interested in the effect training had on the performance improvement. The next step is therefore to isolate the estimated increase in performance to training alone. In this part of the process, the participant estimates how much the training has influenced, or will influence, job performance relative to the other factors, and assigns a value to it. If the sales person felt that training was the strongest factor behind the change, or will be the driving force behind future change, it would receive a correspondingly higher value.


Finally, because participant estimation and isolation are participant driven, any resulting ROI calculation must be adjusted for the estimate. Again, this is commonly done in other facets of business: analysing most-likely, optimistic and pessimistic cases adjusts estimates for estimator bias and flawed assumptions, and you will often see sales forecasts reported in this manner.

In training, the adjustment is made for two reasons. The first is conservatism, one of Phillips' guiding principles; stating that your assumptions are conservative also builds integrity into your ROI model. The second is bias: self-reported estimates by participants are typically inflated. In fact, studies by organizations such as the Tennessee Valley Authority (TVA), and separate studies by KnowledgeAdvisors, suggest that respondents tend to overestimate by around 35%. To this end, when computing an ROI calculation, one might reduce the inputs by 35%, or a similar confidence rate, as the adjustment factor for conservatism and bias.

Taken together, the principles of estimation, isolation and adjustment form a powerful basis for tabulating a systematic, replicable and comparable ROI model for human capital.

The result of the process is a monetized benefit factor that, when multiplied by the salary (i.e. the human capital), yields a monetized benefit from training. The model is easily adaptable, leveraging automation and technology, to drill deep into a specific business result such as the ROI on sales, quality, productivity, cycle time, customer satisfaction or employee retention.
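A minimal sketch of the estimate-isolate-adjust chain described above: a participant's self-reported performance gain is scaled by the share isolated to training, discounted by a conservative adjustment (the 35% overestimation figure cited in the text), and multiplied by salary to yield a monetized benefit. The salary and percentages below are hypothetical.

```python
def monetized_benefit(salary, est_gain, training_share, adjustment=0.35):
    """
    salary          annual salary, the human-capital base
    est_gain        participant's estimated performance improvement, e.g. 0.20
    training_share  portion of the gain isolated to training, e.g. 0.50
    adjustment      conservative discount for self-report bias (default 35%)
    """
    # Estimation x isolation x adjustment = monetized benefit factor
    factor = est_gain * training_share * (1 - adjustment)
    return salary * factor

# A participant on a $60,000 salary estimates a 20% performance improvement
# and attributes half of it to training: 60000 x 0.20 x 0.50 x 0.65, i.e.
# about $3,900 of monetized annual benefit.
print(monetized_benefit(60_000, 0.20, 0.50))
```

Summing this figure across participants gives the program-benefit numerator for the BCR and ROI % formulas in Step 5.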

In order to measure ROI, it is necessary to build a picture of the effectiveness of the training activities at various levels. The measurements then provide the detailed data needed to calculate the value of the training program, which in turn leads to the ROI calculation. A classic example is an ROI model based on Kirkpatrick's 1959 model of training evaluation, extended by Phillips. Kirkpatrick defined "evaluation" as "measuring changes in behaviour that occur as a result of training programs."


Evaluation Levels (modified from the Kirkpatrick 4-level model)

Level 1 ("Did they like it?"): Measuring Reaction and Identifying Planned Actions. Measures participants' reaction to the program, and outlines specific plans for implementation of learning to the job.

Level 2 ("Did they learn?"): Measuring Cognitive Learning and Retention. Measures skills, knowledge, or attitude changes as a result of the training.

Level 3 ("Do they use it?"): Assessing Application of the program training on the job. Measures actual changes in behaviour on the job, and specific applications of the training material.

Level 4 ("Did it impact the bottom line?"): Identifying business results from the training. Measures the business impact of the training (e.g. changes in output, quality, costs, time, productivity or other business metrics).

Level 5 ("What is the return on learning investment?"): Calculating Return on Investment. Compares the monetary value of the results with the costs of the program.

In most organisations, measuring training effectiveness at all of Levels 1-4 is not feasible for all participants and all projects. In practice, the organisation must set targets for the scope of evaluations at each level, and for the critical training courses or programmes which have most importance to the success of the organisation. Typical targets for measurement activities at each level are:

Level 1: 100% of participants / courses provide effectiveness data
Level 2: 50-70% of participants / courses provide effectiveness data
Level 3: 30-50% of participants / courses provide effectiveness data
Level 4: 10-20% of participants / courses provide effectiveness data
Level 5: 5-10% of actual ROI is measured, and extrapolated to the overall program

From this, we can see that most organisations will initially calculate ROI in full detail only on a sample of courses and participants. It is therefore essential to identify those courses and participants who are:


Representative of the overall population

Taking part in important or high impact programs

Able to accurately assess the impact of training on their jobs

Amenable to providing the full depth of data required

Note: this model describes ROI as "Level 5". Bear in mind that it is not always appropriate to carry out detailed analysis at every level up to this point in order to derive an ROI; it may be enough for the client to know that an 80% pass mark on a technical test (Level 2) is "good enough" to justify the investment, assuming some numerical value can be put on passing the tests.

During the lifetime of the training programme and its evaluation, data can be collected at various times:

Immediately after training, Level 1 feedback should be easily accessible.

Within a reasonably short time, it should be possible to measure the Level 2 performance change.

It will take longer for Level 3 data to be meaningful, as this typically requires a period of practical experience, applying the learning to the job and gaining some fluency with the skills learned.

Once Level 3 data can reasonably be collected, Level 4 data can sometimes be derived from it immediately. In other cases a further waiting period is needed, especially where the new level of business performance can only be measured after the new skills have been applied for some time. For example, a helpdesk technician may be able to apply new customer care skills after some practice (Level 3), but it may take further time before the impact of these skills on customer satisfaction can be gauged (Level 4).

Converting Data to Monetary Values

To calculate the return on investment, data collected at Level 4 is converted to monetary values for comparison with program costs. This requires a value to be placed on each unit of data connected with the program. Ten approaches are available to convert data to monetary values; the specific technique selected usually depends on the type of data and the situation:

Output data is converted to profit contribution or cost savings. With this approach,

output increases are converted to monetary value based on their unit of contribution to

profit or the unit of cost reduction. These values are standard values, readily available in

most organizations.

The cost of quality is calculated and quality improvements are directly converted to

cost savings. These values are standard values, available in many organizations.


For programs where employee time is saved, participant wages and benefits are used as the value of time. Because a variety of programs focus on reducing the time required to complete projects, processes, or daily activities, the value of time becomes an important and necessary issue.

Historical costs and current records are used when they are available for a specific

variable. In this case, organizational cost data are utilized to establish the specific value

of an improvement.

When available, internal and external experts may be used to estimate a value for an

improvement. In this situation, the credibility of the estimate hinges on the expertise and

reputation of the individual.

External databases are sometimes available to estimate the value or cost of data items.

Research, government, and industry databases can provide important information for

these values. The difficulty lies in finding a specific database related to the situation.

Participants estimate the value of the data item. For this approach to be effective,

participants must be capable of providing a value for the improvement.

Soft measures are linked, mathematically, to other measures that are easier to

measure and value. This approach is particularly helpful when establishing values for

measures that are very difficult to convert to monetary values, such as data often

considered intangible like customer satisfaction, employee satisfaction, grievances, and

employee complaints.

Supervisors and managers provide estimates when they are both willing and capable of

assigning values to the improvement. This approach is especially useful when

participants are not fully capable of providing this input or in situations where supervisors

need to confirm or adjust the participant’s estimate.

Education and Training staff estimates may be used to determine a value of an output

data item. In these cases it is essential for the estimates to be provided on an unbiased

basis.

This step in the ROI model is very important and is absolutely necessary to determine the

monetary benefits from education and training programs. The process is challenging, particularly

with soft data, but can be methodically accomplished using one or more of the above techniques.

Tabulating Program Costs

The next step in the process is tabulating the costs of the program. Tabulating the costs

involves monitoring or developing all of the related costs of the program targeted for the ROI

calculation. Among the cost components that should be included are:

the cost to design and develop the program, possibly prorated over the expected life of

the program;


the cost of all program materials provided to each participant;

the cost of the instructor / facilitator, including preparation times as well as delivery time;

the cost of the facilities for the program;

travel, lodging, and meal cost for the participants, if applicable;

salaries plus employee benefits of the participants for the time they attend the program;

and administrative and overhead costs of the education and training function, allocated in some convenient way to the training program.

In addition, specific costs related to the needs assessment and evaluation should be included, if

appropriate. The conservative approach is to include all of these costs so that the total is fully

loaded.

Calculating the ROI

The return on investment is calculated using the program benefits and costs. The benefit / cost ratio is the program benefits divided by the program costs. In formula form:

BCR = Program Benefits / Program Costs

The return on investment uses the net benefits divided by the program costs, where the net benefits are the program benefits minus the costs. In formula form:

ROI (%) = (Net Program Benefits / Program Costs) x 100

This is the same basic formula used in evaluating other investments where the ROI is traditionally

reported as earnings divided by investment. The ROI from some programs is high. For example,

in sales, supervisory, leadership, and managerial training, the ROI can be quite large, frequently

over 100%, while the ROI value for technical and operator training may be lower.

Identifying Intangible Benefits

In addition to tangible, monetary benefits, most education and training programs have intangible, non-monetary benefits. Data items that are identified but not converted to monetary values are considered intangible benefits. While many of these items could be converted, they often are not, for example because the conversion process is too subjective and the resulting values would lose credibility. These intangible benefits are the sixth measure reported in the ROI impact study report and may include:


increased job satisfaction

increased organizational commitment

improved teamwork

improved customer service

reduced complaints and

reduced conflicts

For some programs, these intangible, non-monetary benefits are extremely valuable, often

carrying as much influence as the hard data items.

Operating Standards and Philosophy

To ensure consistency and replication of studies, operating standards must be developed and

applied as the process model is utilized to develop ROI studies. It is extremely important for the

results of a study to stand alone and not vary with the individual conducting the study. The

operating standards detail how each step and issue of the process will be handled. Figure 4

shows the ten guiding principles that form the basis for the operating standards.

The guiding principles not only serve as a way to consistently address each step, but also provide

a much needed conservative approach to the analysis. A conservative approach may lower the

actual ROI calculation, but it will also build credibility with the target audience.


Ten Guiding Principles

1. When a higher level evaluation is conducted, data must be collected at lower levels.
2. When an evaluation is planned for a higher level, the previous level of evaluation does not have to be comprehensive.
3. When collecting and analyzing data, use only the most credible sources.
4. When analyzing data, choose the most conservative among alternatives.
5. At least one method must be used to isolate the effects of the solution.
6. If no improvement data are available for a population or from a specific source, it is assumed that little or no improvement has occurred.
7. Estimates of improvements should be adjusted for the potential error of the estimate.
8. Extreme data items and unsupported claims should not be used in ROI calculations.
9. Only the first year of benefits (annual) should be used in the ROI analysis of short-term solutions.
10. Costs of the solution should be fully loaded for ROI analysis.


Section 5: ROI Implementation Concerns and Milestones

5.1 Implementation of ROI – the steps

Although progress has been made in the implementation of ROI, significant barriers can inhibit

the implementation of the process. Some of these barriers are realistic; others are actually myths

based on false perceptions. The key implementation issues are briefly discussed in this section.

Discipline and Planning

A successful ROI implementation requires much planning and a disciplined approach to keep the

process on track. Implementation schedules, evaluation targets, data collection plans, ROI

analysis plans, measurement and evaluation policies, and follow-up schedules are required. Only

a carefully planned implementation will be successful.

Responsibilities

There are two areas of responsibility when implementing ROI. First, the entire training and

education staff is responsible for measurement and evaluation. Whether they are involved in

designing, developing, or delivering programs, these responsibilities typically include:

ensuring that the needs assessment includes specific business impact measures;

developing specific application objectives (Level 3) and business impact objectives (Level

4) for each program;

focusing the content of the program on performance improvement, ensuring that

exercises, case studies, and skill practices relate to the desired objectives;

keeping participants focused on application and impact objectives;

communicating rationale and reasons for evaluation;

assisting in follow-up activities to capture application and business impact data;

providing technical assistance for data collection, data analysis, and reporting;

designing instruments and plans for data collection and analysis; and

presenting evaluation data to a variety of groups.

The second area of responsibility lies with those involved directly in measurement and evaluation, either on a full-time basis or as a primary duty. Their responsibilities involve six key areas:

1. designing data collection instruments;

2. providing assistance for developing an evaluation strategy;

3. analyzing data, including specialized statistical analyses;

4. interpreting results and making specific recommendations;


5. developing an evaluation report or case study to communicate overall results; and

6. providing technical support in any phase of the ROI process.

Staff Skills Development

Many training staff members neither understand ROI nor have the basic skills necessary to

apply the process within their scope of responsibilities. The typical training and development

program does not focus on business results; it focuses more on learning outcomes.

Consequently, staff skills must be developed to utilize the results-based approach. Ten skill sets have been identified as necessary for implementing the ROI process.

1. Planning for ROI calculations.

2. Collecting evaluation data.

3. Isolating the effects of training.

4. Converting data to monetary values.

5. Monitoring program costs.

6. Analyzing data including calculating the ROI.

7. Presenting evaluation data.

8. Implementing the ROI process.

9. Providing internal consulting on ROI.

10. Teaching others the ROI process.

Building the Process

Building a comprehensive measurement and evaluation process is best represented as a puzzle

where the pieces of the puzzle are developed and put in place over time. Figure 1 depicts this

puzzle and the pieces necessary to build a comprehensive measurement and evaluation process.

The first piece of the puzzle is the selection of an evaluation framework, which is a categorization

of data. The balanced scorecard process (Kaplan and Norton, 1993) or the four levels of

evaluation developed by Kirkpatrick (1974) offer the beginning points for such a framework. The

framework selected for the process presented here is a modification of Kirkpatrick’s four levels to

include a fifth level: return on investment.


Next, an ROI process model must be developed showing how data is collected, processed,

analyzed, and reported to various target audiences. This process model ensures that appropriate

techniques and procedures are consistently utilized to address almost any situation. Also, there

must be consistency as the process is implemented.

The third piece of the puzzle is the development of operating standards. These standards help

ensure the results of the study are stable and not influenced by the individual conducting the

study. Replication is critical for the credibility of an evaluation process. Operating standards and

guiding principles allow for replication, so that if more than one individual evaluates a specific

program, the results are the same.

Next, appropriate attention must be given to implementation issues as the ROI process becomes

a routine part of the education and training function. Several issues must be addressed involving

skills, communication, roles, responsibilities, plans, and strategies.

Finally, there must be successful case studies describing the implementation of the process

within the organization, the value a comprehensive measurement and evaluation process brings

to the organization, and the impact the specific program evaluated has on the organization.

While it is helpful to refer to case studies developed by other organizations, it is more useful and

convincing to have studies developed directly within the organization.

The remainder of this chapter focuses on the individual pieces of the evaluation puzzle in developing a comprehensive ROI process.

Evaluation Framework

The ROI process described in this section adds a fifth level to the four levels of evaluation

developed by Kirkpatrick (1974) to measure the success of training. The concept of different

levels of evaluation is both helpful and instructive to understanding how the return on investment

is calculated.


Evaluation levels and measurement focus:

Level 1, Reaction, Satisfaction and Planned Action: measures participants' reaction to, and satisfaction with, the program and captures planned actions.

Level 2, Learning: measures changes in knowledge, skills, and attitudes.

Level 3, Application and Implementation: measures changes in on-the-job behavior and progress with planned actions.

Level 4, Business Impact: measures changes in business impact variables.

Level 5, Return on Investment: compares program monetary benefits to the costs of the program.

Level 1, Reaction, Satisfaction, and Planned Action, measures the level of satisfaction from

program participants along with their plans to apply what they have learned. Almost all

organizations evaluate at Level 1, usually with a generic, end-of-program questionnaire. While

this level of evaluation is important as a customer-satisfaction measure, a favorable reaction does

not ensure that participants have learned new skills or knowledge.

Level 2, Learning, focuses on what participants learned during the program using tests, skill

practices, role plays, simulations, group evaluations, and other assessment tools. A learning

check is helpful to ensure that participants have absorbed the program material and know how to

use it properly. However, a positive measure at this level is no guarantee that what is learned will

be applied on the job. The literature is laced with studies showing the failure of learning to be

transferred to the job (e.g., Broad, 1998).

At Level 3, Application and Implementation, a variety of follow-up methods are used to determine whether participants applied on the job what they learned. The frequency and use of skills are important measures at Level 3. When training covers technology applications, Level 3 measures the utilization of the technology. While Level 3 evaluations are important for gauging the success of a program's application, they still do not guarantee a positive business impact on the organization.

The Level 4, Business Impact, measure focuses on the actual results achieved by program

participants as they successfully apply what they have learned. Typical Level 4 measures include

output, quality, costs, time, and customer satisfaction. Although the program may produce a

measurable business impact, there is still a concern that the program may cost too much.

Level 5, Return on Investment, the ultimate level of evaluation, compares the monetary benefits

from the program with the program costs. Although the ROI can be expressed in several ways, it


is usually presented as a percentage or a benefit / cost ratio. The evaluation chain is not complete until the Level 5 evaluation is conducted.

While almost all education and training organizations conduct evaluations to measure

satisfaction, very few actually conduct evaluations at the ROI level. Perhaps the best explanation

for this situation is that ROI evaluation is often characterized as a difficult and expensive process.

When business results and ROI are desired, it is also very important to evaluate the other levels.

A chain of impact should occur through the levels as the skills and knowledge learned (Level 2)

are applied on the job (Level 3) to produce business results (Level 4). If measurements are not

taken at each level, it is difficult to conclude that the results achieved were actually caused by the

program. Because of this, it is recommended that evaluation be conducted at all levels when a

Level 5 evaluation is planned. This practice is consistent with the practices of benchmarking

forum members of the American Society for Training and Development as well as best-practice

corporate universities as identified in a study conducted by the American Quality and Productivity

Center.

The ROI Process Model

The calculation of the return on investment in education and training begins with a model in which a potentially complicated process is simplified into sequential steps. The ROI process

model provides a systematic approach to ROI calculations. A step-by-step approach helps keep

the process manageable so that users can address one issue at a time. The model also

emphasizes the fact that this is a logical, systematic process which flows from one step to

another. Applying the model provides consistency between ROI calculations. Each step of the

model is briefly described below.

Planning the Evaluation

One of the most important and cost-saving steps in the ROI process is planning the

evaluation. By considering key issues involved in the evaluation process, time, money, and

frustration can be significantly reduced. There are four specific elements of the evaluation

process that should be considered during the initial stages of the planning process.

1. Evaluation purposes are considered prior to developing the evaluation plan. The

purposes will often determine the scope of the evaluation, the types of instruments used, and

the type of data collected. For example, when an ROI calculation is planned, one of the

purposes is to compare the cost and benefits of the program. This purpose has implications

for the type of data collected (hard data), type of data collection method (performance

monitoring), the type of analysis (thorough), and the communication medium for results (formal

evaluation report). For most programs, multiple evaluation purposes are pursued.

2. A variety of instruments are used to collect data. The appropriate instruments are

determined in the early stages of developing the ROI. Questionnaires, interviews, and focus


groups are common instruments. When deciding which instruments to use, those most fitting

to the culture of the organization and most appropriate for the setting and evaluation

requirements should be considered.

3. Training programs are evaluated at five different levels as illustrated in Figure 2. Data

should be collected at Levels 1, 2, 3, and 4, if an ROI analysis is planned. This helps ensure

that the chain of impact occurs as participants learn the skills, apply them on the job, and

obtain business results.

4. A final element in planning evaluation is the timing of the data collection. Sometimes,

pre-program measurements are taken to compare with post-program measures. In some

cases, multiple measures are taken at different times throughout the process. In other

situations, pre-program measures are unavailable and specific follow-ups are taken after the

program. The important issue is to determine the timing for the follow-up evaluation. For most

education and training programs, a follow-up is usually conducted from 3 to 6 months after the

program.

These four elements - evaluation purposes, instruments, levels, and timing - are all

considerations in selecting the data collection methods and developing the data collection

plan. Once this preliminary information is gathered, the data collection plan and ROI analysis

plan are developed.

The data collection plan - After the above elements have been considered and determined,

the data collection plan is developed. The data collection plan outlines in detail the steps to be

taken to collect data for a comprehensive evaluation. The data collection plan usually includes

the following items:

Broad areas for objectives are developed for evaluation planning; more specific program

objectives are developed later.

Specific measures or data descriptions are indicated when they are necessary to explain

the measures linked to the objectives.

Specific data collection methodologies for each objective are listed.

Sources of data such as participants, team leaders, and company records are identified.

The time frame in which to collect the data is noted for each data collection method.

Responsibility for collecting data is assigned.
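The rows of a data collection plan can be captured in a simple structure. A sketch in Python; the field names and the sample row are illustrative, not prescribed by the text:

```python
from dataclasses import dataclass


@dataclass
class DataCollectionItem:
    """One row of a data collection plan (field names are illustrative)."""
    objective: str        # broad area for objectives
    measures: list        # specific measures or data descriptions
    method: str           # data collection methodology
    sources: list         # e.g. participants, team leaders, company records
    timing: str           # time frame in which to collect the data
    responsibility: str   # who is assigned to collect the data


plan = [
    DataCollectionItem(
        objective="Apply new supervisory skills on the job",
        measures=["frequency of skill use"],
        method="follow-up questionnaire",
        sources=["participants", "team leaders"],
        timing="3 months after the programme",
        responsibility="training coordinator",
    ),
]
print(plan[0].method)  # follow-up questionnaire
```

One row per objective keeps each measure explicitly linked to a method, a source, a time frame and an owner, which is the point of the plan.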

The ROI analysis plan

The ROI analysis plan is a continuation of the data collection plan. This planning document

captures information on several key issues necessary to develop the actual ROI calculation.

The key issues include:


Significant data items, usually Level 4, business impact measures, but in some cases

could include Level 3 data.

The method for isolating the effects of the training and education program.

The method for converting data to monetary values.

The cost categories, noting how certain costs should be prorated.

The anticipated intangible benefits.

The communication targets – those to receive the information.

Other issues or events that might influence program implementation.

These two planning documents are necessary to successfully implement and manage the

ROI process.

5.2 Concerns and challenges in implementing ROI

Concerns with ROI

Although much progress has been made, the ROI process is not without its share of problems

and drawbacks. The mere presence of the process creates a dilemma for many organizations.

When an organization embraces the concept and implements the process, the management team

is usually anxiously waiting for results, only to be disappointed when they are not quantifiable.

For an ROI process to be useful, it must balance many issues, including feasibility, simplicity,

credibility, and soundness. More specifically, three major audiences must be pleased with the

ROI process to accept and use it.

Practitioners - For years, education and training practitioners have assumed that ROI could not

be measured. When examining a typical process, they found long formulas, complicated

equations, and complex models that made the ROI process appear to be too confusing. With this

perceived complexity, these practitioners could visualize the tremendous efforts required for data

collection and analysis, and more importantly, the increased cost necessary to make the process

work. Because of these concerns, practitioners are seeking an ROI process that is simple and

easy to understand so that they can easily implement the steps and strategies. Also, they need a

process that will not take an excessive amount of time to implement. Finally, practitioners need a

process that is not too expensive. With competition for financial resources, they need a process

that will not command a significant portion of the budget. In summary, from the perspective of the

practitioner, the ROI process must be user-friendly, time-saving, and cost-efficient.

Senior Managers / Sponsors / Clients - Managers, who must approve education and training

budgets, request programs, or cope with the results of programs, have a strong interest in

developing the ROI. They want a process that provides quantifiable results, using a method

similar to the ROI formula applied to other types of investments. Senior managers have a never-

ending desire to have it all come down to ROI calculations, reflected as a percentage. They, as

do practitioners, want a process that is simple and easy to understand. The assumptions made


in the calculation and the methodology used in the process must reflect their frame of reference,

experience, and level of understanding. They do not want, or need, a string of formulas, charts,

and complicated models. Instead, they need a process that, when necessary, they can explain to

others. More importantly, they need a process with which they can identify; one that is sound and

realistic enough to earn their confidence.

Researchers - Finally, researchers will only support a process that measures up to their scrutiny

and close examination. Researchers usually insist that models, formulas, assumptions, and

theories be sound and based on commonly accepted practices. Also, they want a process that

produces accurate values and consistent outcomes. If estimates are necessary, researchers

want a process that provides the most accuracy within the constraints of the situation, recognizing

that adjustments need to be made when there is uncertainty in the process.

The challenge is to develop acceptable requirements for an ROI process that will satisfy

researchers and, at the same time, please practitioners and senior managers.

Criteria for an Effective ROI Process

To satisfy the needs of the three critical groups described above, the ROI process must meet

several requirements. Ten essential criteria for an effective ROI process are outlined below.

These criteria were developed with input from hundreds of education and training managers and

specialists:

1. The ROI process must be simple, devoid of complex formulas, lengthy equations, and

complicated methodologies. Most ROI attempts have failed to meet this requirement. In an

attempt to obtain statistical perfection, several ROI models and processes have become too

complex to understand and use. Consequently, they are not being implemented.

2. The ROI process must be economical, with the ability to be easily implemented. The

process should have the capability to become a routine part of education and training without

requiring significant additional resources. Sampling for ROI calculations and early planning

for ROI are often necessary to make progress without adding new staff.

3. The assumptions, methodology, and outcomes must be credible. Logical, methodical steps

are needed to earn the respect of practitioners, senior managers, and researchers. This

requires a very practical approach for the process.

4. From a research perspective, the ROI process must be theoretically sound and based on

generally accepted practices. Unfortunately, this requirement can lead to an extensive,

complicated process. Ideally, the process must strike a balance between maintaining a

practical and sensible approach and a sound and theoretical basis for the procedures. This is

perhaps one of the greatest challenges to those who have developed models for the ROI

process.


5. An ROI process must account for other factors that have influenced output variables. One

of the most often overlooked issues, isolating the influence of an education program, is

necessary to build credibility and accuracy within the process. The ROI process should

pinpoint the contribution of the program when compared to the other influences.

6. The ROI process must be appropriate for a variety of programs. Some models apply to

only a small number of programs, such as productivity training. Ideally, the process must be

applicable to all types of education and training and other programs such as career

development, organization development, and major change initiatives.

7. The ROI process must have the flexibility to be applied on a pre-program basis as well as a

post-program basis. In some situations, an estimate of the ROI is required before the actual

program is developed. Ideally, the process should be able to adjust to a range of potential

time frames for calculating the ROI.

8. The ROI process must be applicable to all types of data, including hard data, which is

typically represented as output, quality, costs, and time; and soft data, which includes job

satisfaction, customer satisfaction, absenteeism, turnover, grievances, and complaints.

9. The ROI process must include the costs of the program. The ultimate level of evaluation

compares the benefits with costs. Although the term ROI has been loosely used to express

any benefit of education or training, an acceptable ROI formula must include costs. Omitting

or understating costs will only destroy the credibility of the ROI values.

10. Finally, the ROI process must have a successful track record in a variety of applications. In

far too many situations, models are created but never successfully applied. An effective ROI

process should withstand the wear and tear of implementation and prove valuable to users.

Because these criteria are considered essential, an ROI process should meet the vast majority, if not all, of the criteria. The bad news is that most ROI processes do not meet these criteria. The good news is that the ROI process presented below meets all of them.

There is resistance to measuring the ROI in training, and the following are some of the questions

asked in the industry:

How can I measure ROI to determine the value added to the organisation?

What are the steps to measure ROI?

How often should ROI be measured? 

In which areas of learning and change in behaviour can ROI be measured?

How do I implement the ROI process in the organisation?

How do I attach a rand value to learning?

What are the benefits of measuring ROI?


How do managers overcome resistance to measuring ROI?

Training managers know that ROI must be measured, yet very few are actually doing it. The

following are some of the reasons for not calculating ROI that have been provided by training

managers during recent conferences organised by ASTD Global Network South Africa: 

There is a lack of a measurement culture in training departments. Managers do not know where to start, or they don't have the resources to calculate the ROI. It is perceived as too difficult, too complicated, too much effort or too time-consuming. They are so busy with all their training programmes and SETA requirements that there is no time to calculate ROI. They fear that if ROI is calculated, it will show that their training adds no value and that they will be at risk of losing their jobs!

Difficulties in calculating ROI

One of the common difficulties encountered in calculating ROI is determining the "shelf-life" of the

training program. How many times will a course be used or "cycled" before business

strategies or budgets dictate its demise, technology renders it obsolete, or all eligible employees

have received the training? Changes fueled by technology, global competition, legal and

regulatory agencies, employee turnover and other factors make it increasingly difficult to

accurately predict the life cycle or shelf life of a training program. A conservative approach should

be used in evaluating the shelf life of a training program due to the rapid changes occurring in the

workplace.
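One conservative way to handle an uncertain shelf life is to prorate one-time development costs over the expected number of offerings, so that a low (conservative) estimate of the shelf life raises the cost charged to each offering. A sketch with assumed figures:

```python
def prorated_cost_per_offering(development_cost: float,
                               expected_offerings: int,
                               delivery_cost: float) -> float:
    """Spread a one-time development cost over the course's expected
    shelf life (number of offerings), then add the per-offering delivery
    cost. A conservative (low) estimate of expected_offerings charges
    more cost to each offering, keeping the ROI estimate conservative."""
    return development_cost / expected_offerings + delivery_cost


# Assumed figures: R120 000 to develop the course, 8 expected offerings
# before it is retired, R15 000 per delivery.
print(prorated_cost_per_offering(120_000, 8, 15_000))  # 30000.0
```

Halving the assumed shelf life to 4 offerings would raise the cost per offering to R45 000, which shows why the shelf-life estimate matters to the final ROI figure.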

It is difficult enough to get managers to send employees to training without imposing additional

requirements to collect pre- or post-training data to document the effects of training. HR

managers should make data collection as simple and effortless as possible and provide forms

and administrative assistance whenever possible. Collaboratively work with IT, HRIS, Production,

Engineering, Sales, Payroll, and Accounting personnel to "mine" existing data or develop ways to

collect, compile and extract the necessary data for a cost / benefit analysis of training.

The presence of unions could also create data collection problems. Frequently, unions are

concerned that the employees or union members are being evaluated. It is a good idea to get

union leaders to "buy in" to the idea of a ROI assessment. At the very least, a careful review of

the union contract is recommended along with clear communication regarding the goals and

scope of the training assessment.

Some behaviors or skills may be useful for only short periods of time while others, such as time

management, may continue for a lifetime and affect numerous workplace and personal activities.

Frequency of use of the newly acquired skills is also a factor in determining how long the skills


are retained. Therefore, determining or estimating the scope and duration of the benefits of

training can be a daunting task.

Measuring the impact of training for new products, technology, and services is difficult for two

reasons. First, if no prior training existed or no prior performance measures are available, it is

very difficult to ascertain the impact of training. Second, if prior measures of performance are

available, how does one determine the amount of improvement that is attributable to training as

opposed to how much is attributable to the new innovation? Frequently, this determination is a

judgment call or an intuitive decision rather than one based on factual data.

Fear and Misconceptions

Education and training departments often do not pursue ROI because of fear of failure or fear of

the unknown. A concern may exist about the consequence of a negative ROI. Staff members

may feel threatened. The ROI process also stirs up the traditional fear of change. This fear is

often based on unrealistic assumptions and a lack of knowledge of the process and is so strong

that it becomes a realistic barrier to many ROI implementations.

False assumptions about the ROI process will keep training and education staff members from attempting ROI. The following are typical faulty assumptions.

Managers do not want to see the results of education and training expressed in monetary

values. In many cases, managers do want to see the results of education and training

expressed in monetary terms. But for years, they were told it could not be done. As

mentioned earlier, there is a greater awareness among executives that education and

training can be measured like other business practices. This awareness is increasing

interest and requests for these measures.

If the CEO does not ask for the ROI, he or she is not expecting it. This assumption

captures the most innocent bystanders. As CEOs become more aware of the process to

calculate the ROI in training, they want to begin seeing results immediately. The ROI

process is not a quick fix and takes time to learn and fully integrate into a training

organization. With this in mind, many organizations are developing ROI impact studies

long before senior management asks for them.

”As the manager of education and training, I have a professional, competent staff;

therefore, I do not have to justify the effectiveness of our programs”. This may be true for

some organizations – at least at the present. But with the ongoing changes in the

business community, it is becoming less and less common to see any function not

required to show its bottom-line impact. When budget allocations take place, the

functions with a “place at the table” are usually the ones that prove their contribution.

The training and development process is a complex, but necessary, activity;

consequently, it should not be subjected to an accountability process. The training and


development process is complex and, like other complex processes, it can be quite costly

to the organization. With this in mind, many organizations are holding the training and

education function to the same standard of accountability as other processes.

These false assumptions and others must be addressed and analyzed so they do not impede the

progress of ROI implementation.

5.3 Important milestones for ROI implementation

It is important to follow proper guidelines for the effective application of the ROI measurement

process by adhering to the following steps: 

Create awareness for ROI in the organisation.

Build capacity for ROI by training staff to understand ROI.

Quantify information before the training in order to obtain a baseline (e.g. number of calls

answered, number of customer complaints, etc.).

Convert this data to monetary value (e.g. the cost of answering a call).

Allocate resources for ROI.

Develop a culture of measurement and accountability among training staff.

Start with only one course as a pilot programme to practise ROI skills.

Communicate results to training staff and the whole organisation.

Design improvement plans for training programmes in order to increase ROI.

Once ROI results are available, use the data to market future learning programmes.
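The two quantification steps above (obtain a baseline before the training, then convert the improvement to monetary value) can be sketched as follows; the call-centre figures and the 48-week working year are assumptions for illustration only:

```python
def annual_value(baseline_per_week: float,
                 post_per_week: float,
                 unit_value: float,
                 weeks_per_year: int = 48) -> float:
    """Monetary value of an improvement in one business measure:
    (improvement per week) x (value per unit) x (weeks per year)."""
    return (post_per_week - baseline_per_week) * unit_value * weeks_per_year


# Hypothetical: calls answered rise from a baseline of 400 per week to
# 450 per week after training, and each answered call is worth R20.
print(annual_value(400, 450, 20))  # 48000.0  (R48 000 per year)
```

Without the pre-training baseline of 400 calls there is nothing to subtract from, which is why the baseline must be captured before the training runs.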

HR managers can improve the actual ROI of training by utilizing the following tips:

Understand that business needs will change over the duration of the project. The longer it

takes to do a project, the more likely that change will occur. Be prepared to switch gears

or re-tool. Stay alert for indications of learning gaps or obsolescence of course materials

created by changes in technology, economic factors, or employee demographics.

Use the experience of senior managers and employees in estimating the results of

training. Estimating requires a little math and lots of experience.

Anticipate and plan for glitches. They will happen. Do not get discouraged. Deal

realistically with unexpected events and surprises. Some of them will be problems that

may reduce expected outcomes. Others may stimulate creativity and produce very

positive results.


Keep an eye on costs and monitor any deviations from estimated or budgeted

expenditures. Quick action or adjustments may enable the HR manager to curtail or

reduce costs and keep the project within or near budget.

Balance short term and long-term goals. The pressure to deliver consistent short-term

financial performance may limit the resources available for investment in growth

opportunities, such as the "development" of employees for future roles.

Secure managers' and supervisors' support for training. Effective training depends on

three persons: the trainer, the trainee and the supervisor or manager. All three must

agree on expected outcomes and how and when they will be measured.

Align training initiatives with strategic business plans. Training should improve your

organization's competitive edge, which may be efficiency, first class service, or creativity

and innovation.

Track training costs by employee for career pathing and succession planning.

HR must shift its emphasis from being the "keeper and distributor of people information" to the

more critical role of "developer of people and productivity." Training benefits must be greater than

the cost of the training. Training must ultimately improve the profitability of an organization. The

four-step process for calculating ROI enables HR managers to determine the productivity effect of

training or its impact on the bottom line. Reducing training to save money may be like stopping a

clock to save time.


Section 6: ROI Lessons from Practice

6.1 Case studies related to ROI

This section contains two case studies related to ROI in training, which can be used to help develop

your understanding and application of the tools and processes involved in this new field.

Case study 1: Reliance Financial Services

At the end of a monthly staff meeting, Frank Thomas, CEO of Reliance Financial Services,

asked Margaret Bongani, Manager of Training and Development, about the Effective

Interpersonal Skills Workshops that had been conducted with all supervisors and managers

throughout the company. The workshop featured the Myers-Briggs Type Indicator (MBTI) and

showed participants how to interact with, and understand, each other in their routine activities.

The MBTI classifies people into one of 16 personality types.

Frank continued, “I found the workshop very interesting and intriguing. I can certainly identify

with my particular personality type, but I am curious what specific value these workshops have

brought to the company. Do you have any way of showing the results of all 6 workshops?”

Margaret quickly replied, “We certainly have improved teamwork and communications

throughout the company. I hear people make comments about how useful the process has been

to them personally.” Frank added, “Do we have anything more precise? Also, do you know how

much we spent on these workshops?” Margaret quickly responded by saying, “I am not sure

that we have any precise data and I am not sure exactly how much money we spent, but I can

certainly find out.” Frank concluded with some encouragement, “Any specifics would be helpful.

Please understand that I am not opposing this training effort. However, when we initiate these

types of programs, we need to make sure that they are adding value to the company’s bottom

line. Let me know your thoughts on this issue in about two weeks.”

Margaret was a little concerned about the CEO's comments, particularly since he had enjoyed the workshop and had made several positive comments about it. Why was he questioning the

effectiveness of it? Why was he concerned about costs?

These questions began to frustrate Margaret as she reflected over the year and a half period in

which almost every senior member of staff had attended the workshop. She recalled how she

was first introduced to the MBTI. She attended a workshop conducted by a friend, was

impressed with the instrument, and found it to be helpful as she learned more about her own

personality type.


Margaret thought the process would be useful to Reliance managers and asked the consultant

to conduct a session internally with a group of middle and senior staff members. With

favourable reaction, she decided to try a second group with the top team, including Frank

Thomas. Their reaction was favourable. Then she launched it with the entire staff, using

positive comments from the CEO, and the feedback has been excellent.

She realized that the workshops had been expensive and that over 60 staff had attended. However, she thought that teamwork had improved, but there was no way of knowing for sure.

With some types of training you never know if it works, she thought. Margaret believed that she

had done her homework on this one, but was still facing a dilemma. Should she respond to the

CEO or just ignore the issue?

Reliance Financial Services ~ Part B

After some intense discussions with Frank Thomas, the CEO, around a structured approach to

training, Margaret conducted a detailed needs assessment in the organisation. The assessment

uncovered four key problems that could be corrected with improved supervisor and managerial

leadership skills. These included excessive employee turnover, excessive absenteeism,

deteriorating quality of proposals processed, and sagging productivity per professional clerk

employed. Each of these four measures is monitored in the work unit under the control of a

supervisor or manager. Quality and productivity are monitored daily and reported weekly.

Absenteeism is reported weekly, and turnover is reported monthly, using annualised figures.

As a result of the needs assessment, a three-day leadership workshop for all supervisors and

managers was developed and implemented. The workshop focuses on building specific

leadership skills to enable supervisors and managers to correct the problem areas identified in

the needs assessment and secure significant improvements in all four key output variables.

The specific objectives of the leadership program are as follows. After attending this program,

participants should:

Be able to identify and discuss various leadership models and theories.

Be able to select the appropriate leadership style for a particular situation.

Serve as a positive role model for appropriate leadership behaviour.

Apply specific leadership skills.

Reduce employee turnover from an average annual rate of 12% to an industry average of

7.3% in one year.

Reduce absenteeism from a weekly average of 3% to 1% in six months.

Improve quality by 15% in six months.

Increase productivity by 12% in six months.
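The first numeric objective (turnover down from an annual 12% to 7.3%) can be converted into an expected annual saving once a headcount and a fully loaded cost per departure are assumed; both figures below are hypothetical and do not come from the case:

```python
def turnover_saving(headcount: int,
                    baseline_rate: float,
                    target_rate: float,
                    cost_per_departure: float) -> float:
    """Expected annual saving from reducing annualised turnover:
    avoided departures x fully loaded cost of one departure."""
    avoided = headcount * (baseline_rate - target_rate)
    return avoided * cost_per_departure


# Hypothetical: 200 employees in scope, R50 000 fully loaded cost
# (recruitment, induction, lost productivity) per departure.
saving = turnover_saving(200, 0.12, 0.073, 50_000)
print(round(saving))  # 470000
```

Doing this arithmetic for each of the four objectives, and then comparing the total against the workshop's fully loaded cost, is what turns the needs-assessment targets into a Level 5 evaluation.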


Case study 2:

ROI Profile: IBM Mindspan Solutions: IBM Basic Blue

Bottom Line

IBM's Basic Blue e-learning initiative brings training to more than 5000 IBM managers annually.

Using a blended approach of e-learning, electronic community, coaching, and simulation, IBM

was able not only to reduce costs by moving training to the Web, but also to take advantage of the

flexibility of the electronic medium to provide a richer learning environment for students.

The company

IBM's name is synonymous with computing. From the establishment of its predecessor, the Computing-Tabulating-Recording Company, in 1911 to its launch of the personal computer in 1981,

IBM's goal has been to lead in the creation, development, and manufacture of advanced

information technologies. IBM sales of computer systems, software, networking systems,

storage devices, microelectronics, and services earned revenues of nearly US$90 billion in 2000.

In the turbulent IT market where vision and flexibility are key, IBM has recognized the importance

of training its managers both on IBM's strategy and culture and on leadership and people

management.

The challenge

As part of its ongoing management development program, IBM trains more than 5000 new

managers each year. Traditionally, managers were brought together for a 5-day event to

learn the basics on IBM culture, strategy, and management. As the complexity of their jobs

increased, IBM recognized that five days was not enough time to train managers effectively

- and that, to help managers evolve with the industry, training needed to be an ongoing

process instead of a one-time event.

In 1996 Nancy Lewis, Director of IBM Management Development, began to develop IBM's

global management training program. Recognizing that more than one session was needed

- but that bringing together 5000 people from around the world was costly and time-consuming - IBM looked at e-learning technology and, specifically, IBM's own Mindspan

solution, to support its manager training. The goal was to find appropriate technology to

support different parts of the manager training process - and engage and teach people who

were used to face-to-face training. "We like saving money and a quick payback, but that's not why we did this," said Nancy. "We did it for the effectiveness of our managers around the globe."


The strategy

IBM developed a 4-tier e-business learning approach, Basic Blue for Managers, which blends

technologies to support regular classroom sessions. The management training program

includes a Web-based learning infrastructure, virtual collaborative tools, content references,

and interactive online simulators to complement face-to-face instruction. Managers

participate in 26 weeks of self-paced on-line learning delivered through Lotus LearningSpace

modules. At the end of the self-paced learning, managers meet for their 5-day face-to-face session to address higher-level issues and skills - and continue with on-line collaboration with other managers as new needs or issues arise.

IBM's 4-tier model focused on information and awareness through "QuickView" reference

guides, understanding and practice through interactive modules and simulation, learning

from experts through same-time awareness and collaboration, and higher-order skills through virtual learning labs and face-to-face meetings. IBM's Basic Blue for Managers is now

a 12-month training process with 5 times as much content as the previous new manager

training program. The program has grown to include approximately 6400 managers.

Key benefit areas

Basic Blue for Managers, now an IBM Mindspan solution, enables IBM to train managers to

lead high-performance teams - without the expense of on-site meetings and travel. Key

benefits Nucleus measured in calculating the ROI from the solution include the following:

• Direct savings such as reduced program, travel, and manager off-site costs. Because

75 percent of the training is delivered through distance learning and 25 percent is in a

traditional classroom, IBM has been able to significantly reduce the costs associated with

training while increasing the amount of content taught.

• Reduced the direct cost of content development. In addition to the savings with the Basic Blue project, reusable templates have enabled other IBM divisions to develop content without redeveloping the training platform.

• Indirect savings in the form of increased manager productivity. The self-service nature of

Basic Blue allows managers to better utilize their time. IBM also estimates managers are

able to reduce the time needed to learn new material by 25%.

IBM also tracks a number of additional benefits from Basic Blue. Measurement of the impact of

the enhanced training on managers' performance has shown that IBM's blended approach to

initial and ongoing training has enabled its managers to make sustained behavior changes, which then lead to significant business performance improvements.


Key cost areas

Infrastructure costs for IBM Basic Blue included the cost to purchase software, hardware,

hosting services, and IT support. As with any e-learning initiative, the cost of the infrastructure is

only a part of the total cost. IBM invested additional resources to create the training and

simulation modules that would match the needs of the company. In doing so, IBM ensured a

smooth transition from the traditional learning approach to the new blended e-learning approach.

Nucleus believes the company's success with its Basic Blue initiative is due in large part to the

commitment the company made early in the project to invest proper resources to develop high-quality learning modules. The success of the Basic Blue application shows the value of the

combination of both technology and the proper level of training services.

Lessons learned

By combining a variety of media with traditional classroom teaching, IBM found it was able to

reduce training costs - and that managers preferred and retained more from the "blended"

approach than from the classroom alone. IBM learned that longstanding guidelines for successful

innovations in other industries held true for e-learning: (1) simplicity — in navigation and usability

— enhanced the student experience; (2) compatibility with existing technologies sped student

acceptance; and (3) trialability — "safe" opportunities to try online learning, and positive, engaging

experiences early on — increased adoption speed. Anytime / anywhere access to performance support was perceived by students as the key advantage of IBM's blended

learning.

IBM also found that to be effective, the technology had to closely match the needs of the users.

Even though its manager group was technology-literate, rather than forcing them to adapt to the technology, IBM took steps to model the application to match the managers'

learning styles. This continues to be an ongoing process as IBM adjusts the learning

environment based on feedback from users and the evolution of technology.

Bob MacGregor, Manager, IBM Management Development, advises that when venturing into e-

learning for the first time, teams can get bogged down in studying what to do. He says that,

"equipped with the vision of what would work best for students, the Basic Blue team jumped in

and did it."

Calculating the ROI

IBM's goals were both to improve the training process for better-performing managers and to manage training costs. Reduction in costs that would have been required to increase training levels

without distance learning made up the lion's share of the return from IBM's investment.


When estimating the ratio of direct and indirect benefits, Nucleus assumed all savings in

employee time or increased productivity were indirect. Direct savings included reduced travel and

teacher expenses, along with other direct savings.
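The arithmetic behind a calculation like this is the standard benefit-cost ratio and ROI percentage used throughout this guide. A minimal sketch with invented figures (none of these numbers are IBM's):

```python
# Standard training-ROI arithmetic:
#   BCR  = program benefits / program costs
#   ROI% = (program benefits - program costs) / program costs * 100
# All monetary figures below are invented for illustration.

def benefit_cost_ratio(benefits: float, costs: float) -> float:
    return benefits / costs

def roi_percent(benefits: float, costs: float) -> float:
    return (benefits - costs) / costs * 100

direct_savings = 1_200_000    # e.g. avoided travel and delivery costs (assumed)
indirect_savings = 400_000    # e.g. manager time saved (assumed)
program_costs = 500_000       # infrastructure + content development (assumed)

benefits = direct_savings + indirect_savings
print(f"BCR: {benefit_cost_ratio(benefits, program_costs):.1f}")
print(f"ROI: {roi_percent(benefits, program_costs):.0f}%")
```

Note how the split between direct and indirect benefits does not change the ROI figure itself; it changes how defensible the figure is to a sceptical audience.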

6.2 Lessons from practice related to ROI

Training ROI & effectiveness: How can I measure training effectiveness or its return on

investment? One of the keys is to have a defined plan / process when you attend or send a staff member to training: a plan for before training, during training, and after training.

Training should be managed so that it can be planned and linked to business results. Goals must be clear. Mechanisms should be in place to reinforce the learner's efforts to implement what he / she has learned.

Using the PDSA (Plan-Do-Study-Act) process is one way. For example:

PLAN

• How does this training relate to the aim of the organization? To the aim of the division or business unit? To the person's developmental plan?

• How does the training relate to the customer?

• Where does it fit into my long-term growth? Why is it important?

• Determine the expectations during training and, more important, after training. What are the expected outcomes?

• How will I practice or demonstrate my new skills and knowledge?

• Create understanding of the linkage between the training and the day-to-day activities.

DO

• Attend training.

• Monitor progress throughout the training.

• Participate.

• Develop an ideas list.

• Take notes.

• Network with classmates.

• Develop a plan for application of ideas.

• Teach or share your new skills with someone else.

• Maintain contact with network buddies.

STUDY

• What was learned?

• How will the learning that took place be applied?

• Review the linkage to the job, the aim of the organization and the customer.

• Analyze / track to see to what extent learning actually created the intended business results.

ACT

• Integrate the learning into the day-to-day activities in order to achieve positive business results.

• Continue to monitor or study the results of training over time (critical data for justifying future training investment). For example: improved service levels; increased morale and job satisfaction; improved performance / production; improved efficiency (greater production / less staff).

• Continue the PDSA thought process…
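One way to keep the PDSA cycle from stalling after the "Do" stage is to track each trainee's cycle as a simple record. A sketch of that idea (the class and field names are illustrative assumptions, not something this guide prescribes):

```python
# Illustrative only: one way to track a trainee's PDSA cycle as data.
# The stage names come from the guide; the class and fields are assumptions.
from dataclasses import dataclass, field

@dataclass
class PDSARecord:
    trainee: str
    course: str
    plan: dict = field(default_factory=dict)   # expected outcomes, linkage to goals
    do: list = field(default_factory=list)     # notes, ideas, contacts from training
    study: list = field(default_factory=list)  # what was learned, observed results
    act: list = field(default_factory=list)    # changes integrated into daily work

    def incomplete_stages(self) -> list:
        # Stages with no entries yet still need follow-up.
        return [name for name, value in [("plan", self.plan), ("do", self.do),
                                         ("study", self.study), ("act", self.act)]
                if not value]

record = PDSARecord("A. Dlamini", "Leadership Skills")
record.plan["expected_outcome"] = "Apply coaching conversations weekly"
record.do.append("Developed ideas list; shared new skills with a colleague")
print(record.incomplete_stages())  # stages still needing follow-up
```

A report over such records would show, per trainee, whether learning ever reached the "Act" stage - exactly the data the guide says justifies future training investment.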

Remember, when compiling an approach to ROI in learning, keep these best-practice concluding thoughts top of mind:

Best Practice #1: Plan your metrics before writing survey questions.
First and foremost, never ask a question on a data collection instrument unless it ties to a metric you will utilize. As simple as this sounds, organizations often create questions with no purpose in mind.

Best Practice #2: Ensure the measurement process is replicable and scalable.
Organizations tend to spend thousands of dollars on one-off projects to measure a training program in detail. This information is collected over many months with exhaustive use of consultants and internal resources. Although the data is powerful and compelling, management often comes back with a response such as 'great work, now do the same thing for all the training.' Unfortunately, such one-off measurement projects are rarely replicable on a large scale, so don't box yourself into that corner.

Best Practice #3: Ensure measurements are internally and externally comparable.
Related to best practice #2 is the concept of comparability. A one-off exercise is significantly less powerful when you have no baseline for comparison. If you spend several months calculating a 300% ROI on your latest program, how do you know whether that is good or bad? Surely a 300% ROI is a positive return - but what if the average ROI on training programs is 1000%?

Best Practice #4: Use industry-accepted measurement approaches.
Management is looking to the training group to lead the way in training measurement. It is the job of the training group to convince management that its approach to measurement is reasonable. This is not unlike a finance department that must convince management of the way it values assets. In both cases, the group must ensure the approach is based on industry-accepted principles that have proof of concept externally and merit internally.


Best Practice #5: Define value in the eyes of your stakeholders.
If you ask people what they mean by 'return on investment' you are likely to get more than one answer. In fact, odds are you'll get several. Return on investment is in the eye of the beholder. To some it could mean a quantitative number; to others, a warm and fuzzy feeling.

Best Practice #6: Manage the change associated with measurement.
As some of these best practices suggest, they may be doomed to failure if you fail to manage the change with your stakeholders. Successful organizations spend considerable time and energy planning for the change. Assess the culture and the readiness for change. Plan for change or plan to fail.

Best Practice #7: Ensure the metrics are well balanced.
Although you want to understand the needs of your stakeholders and have them define how they perceive value, you also need to be proactive in ensuring that your final 'measurement scorecard' is well balanced.

Best Practice #8: Leverage automation and technology.
Although this goes hand in hand with a measurement process that is replicable and scalable, it is worthy of separate mention. Your measurement process must leverage technology and automation to do the heavy lifting in areas such as data collection, data storage, data processing and data reporting.
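Best Practices #1 and #8 combine naturally: if every survey question is tied to a named metric, aggregation can be fully automated. A minimal sketch of that idea (the question IDs, metric names and ratings are hypothetical):

```python
# Hypothetical sketch: each survey question maps to exactly one metric
# we will actually use (Best Practice #1), and aggregation is automated
# (Best Practice #8). IDs, metric names and ratings are invented.
from collections import defaultdict
from statistics import mean

QUESTION_TO_METRIC = {
    "q1": "relevance_to_job",
    "q2": "relevance_to_job",
    "q3": "intent_to_apply",
}

responses = [                  # (question_id, rating on a 1-5 scale)
    ("q1", 4), ("q2", 5), ("q3", 3), ("q1", 5), ("q3", 4),
]

scores = defaultdict(list)
for qid, rating in responses:
    scores[QUESTION_TO_METRIC[qid]].append(rating)

# Automated rollup: average rating per metric, ready for a scorecard.
report = {metric: round(mean(values), 2) for metric, values in scores.items()}
print(report)
```

Because every question is declared in the mapping up front, a question with no metric simply cannot enter the pipeline - which enforces Best Practice #1 by construction.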

Best Practice #9: Crawl, walk, run.
When designing a learning measurement strategy it is nice to have a long-term vision, but don't attempt to put your entire vision in place right out of the blocks. The best approach is to start with the low-hanging fruit that can be addressed in a reasonable time frame to prove the concept, demonstrate a 'win', and build a jumping-off point to advance to the next level.

Best Practice #10: Ensure your metrics have flexibility.
The last thing you want to do is roll out a measurement process that is inflexible. You will likely have people who want to view the same data in many different ways. You need to architect your database to accommodate this, thereby creating measurement flexibility.



Recommended Reading

The Consultant’s Scorecard: Tracking Results and Bottom-line Impact of Consulting

Projects, Jack J. Phillips, McGraw-Hill, New York, New York, 2000

The Human Resources Scorecard: Measuring the Return on Investment in Human

Resources, Jack J. Phillips, Ron D. Stone, and Patricia P. Phillips, Gulf Publishing, Houston,

Texas, 2000.

ROI on a Shoestring, Patricia P. Phillips and Holly Burkett, ASTD InfoLine Series, American Society for Training and Development, Alexandria, Virginia, 2000.

InfoLine Series on Evaluation, Jack J. Phillips, (Editor), American Society for Training and

Development, Alexandria, Virginia, 1999.

Volume 1 – Level 1 Evaluation: Reaction and Planned Action Issue # 9813

Volume 2 – Level 2 Evaluation: Learning Issue # 9814

Volume 3 – Level 3 Evaluation: Application Issue # 9815

Volume 4 – Level 4 Evaluation: Business Results Issue # 9816

Volume 5 – Level 5 Evaluation: ROI Issue # 9805

In Action: Implementing Evaluation Systems and Processes, Jack J. Phillips, (Editor),

American Society for Training and Development, Alexandria, Virginia, 1998

Evaluating Training Programs (2nd Edition), Donald L. Kirkpatrick, Berrett-Koehler

Publishers, San Francisco, California, 1998.

Return on Investment in Training and Performance Improvement Programs, Jack J.

Phillips, Gulf Publishing, Houston, Texas, 1997.

Evaluating the Impact of Training, Scott B. Parry, American Society for Training and

Development, Alexandria, Virginia, 1997

Handbook of Training Evaluation and Measurement Methods (3rd Edition), Jack J.

Phillips, Gulf Publishing, Houston, Texas, 1997.

In Action: Measuring Return on Investment, Volume 2, Jack J. Phillips, (Editor),

American Society for Training and Development, Alexandria, Virginia, 1997.

Accountability in Human Resource Management, Jack J. Phillips, Gulf Publishing,

Houston, Texas, 1996.

Evaluating Training Effectiveness, Second Edition, Peter Bramley, McGraw-Hill Book

Company, London, 1996.


Evaluating Human Resources, Programs, and Organizations, Byron R. Burnham,

Krieger Publishing Company, Malabar, Florida, 1995.

In Action: Measuring Return on Investment, Volume 1, Jack J. Phillips, (Editor),

American Society for Training and Development, Alexandria, Virginia, 1994.

Return On Quality, Roland T. Rust, Anthony J. Zahorik, Timothy L. Keiningham, Irwin

Publishers, Chicago, Illinois, 1994

Make Training Worth Every Penny, Jane Holcomb, Pfeiffer & Company, San Diego,

California, 1994.

The Training Evaluation Process, David J. Basarab, Sr. and Darrell K. Root, Kluwer

Academic Publishers, Norwell, Massachusetts, 1992.

Training Evaluation Handbook, A. C. Newby, Pfeiffer & Company, San Diego, California

1992.

Handy Resources

http://www.fasset.org.za

http://www.businessballs.com/

http://www.ibmweblectureservices.ihost.com/servlet/Gate/Component?action=load&customer=ibm&offering=sngl&itemCode=ltu2153f

http://www.learning-designs.com/page_images/LDOArticleBottomLineonROI.pdf

http://www.learningcircuits.org/2001/jan2001/cross.htm

http://hub.col.org/2002/collit/att-0073/01-The_CIPP_approach.doc

Useful links

http://www.franklincovey.com/jackphillips/index.html

http://www.franklincovey.com/jackphillips/satest.htm

http://www.glaworld.com/documents/2F1D3F14-1968-4539-9DC1213B0B4DB724/Human%20Capital%20ROI.pdf
