Evaluating learning

Last updated: 14 Aug 2012

This page is: archived

The Australian Public Service Commission has used its best endeavours to ensure the accuracy of this publication.

The legislative framework and the other information covered by the guide may change from time to time, particularly where principles are based on court or AIRC cases. We will endeavour to notify agencies of significant changes through our website, when they come to our attention. The material does not constitute a comprehensive outline of the policy or legislation on every aspect of the APS Values and Code of Conduct, relevant to every situation.

For these reasons, we cannot guarantee the material is complete, correct, up to date, or relevant for your purposes.

The guide should not be relied on as a substitute for detailed advice as a basis for making decisions. In any important matter you should seek professional advice relevant to your particular circumstances.

Foreword

In 2004 the Australian Public Service was recognised by the United Nations as a world leader in public sector reform. The reforms of recent years have provided the flexibility and agility needed for a more responsive public service with a strong ethos of accountability and professionalism.

A critical part of this is the quality, capability and capacity of our people.

In the times ahead the Service will face much change and many challenges. If we are to continue delivering a high standard of public service to achieve the results we want, we need to ensure our people are supported to develop the skills and knowledge they need.

Recent State of the Service and Management Advisory Committee reports highlighted the significant impact on the Service of the changing shape of Australia’s workforce. In the future we can expect our workforce to be more mobile, increasingly multigenerational and smaller. We will need to compete to attract the best and brightest, and work to retain them.

Learning and development will be an increasingly important aspect of our strategies to build the capability of our organisations. It represents a significant resource investment within agencies and across the Service, and we need to understand how well that investment is working to achieve the desired ends.

In 2002, in conjunction with the Australian National Audit Office, the Commission developed Building capability—a framework for managing learning and development in the APS, to assist agencies as they design and implement learning and development initiatives that meet their individual needs.

Evaluating learning and development—a framework for judging success completes the picture. It gives agencies a practical guide to assist them in developing effective approaches to evaluating their learning and development initiatives. It is supported by a series of user-friendly evaluation tools and additional resources, available from the Commission’s website.

I particularly wish to acknowledge the practitioners from a wide range of agencies who gave their time, energy and expertise to the development of this resource. To all who have contributed to this effort, my sincerest thanks.

Lynelle Briggs
Australian Public Service Commissioner

June 2005

Introduction

Purpose

The booklet Evaluating learning and development—a framework for judging success is one part of the APS learning and development evaluation guide, which has been developed to provide practical support and guidance for agencies evaluating their learning and development. It provides a framework for ‘getting started’:

  • for making the key decisions about what and how to evaluate your agency’s learning and development
  • for assessing your agency’s maturity or organisational capacity to evaluate
  • for planning your agency’s overall evaluation strategy.

Working through the underlying key decisions set out in this framework will provide the basis for agencies to then choose appropriate evaluation processes and tools. The other part of the guide is a collection of practical resources (tools, templates, checklists etc.) which is available on the Commission website. Practitioners can either adopt these resources ‘as is’ or adapt them to suit their agency’s specific circumstances and practices.

The guide is based on the model for evaluating learning and development contained in the better practice guide, Building capability—a framework for managing learning and development in the APS, published by the Australian Public Service Commission and the Australian National Audit Office in 2003.

Who is this guide for?

The guide is primarily for human resource and learning and development practitioners. Practitioners can also use it when working with the senior executive and relevant line managers to inform the development of the agency’s overall evaluation strategy or an evaluation plan for a particular programme.

The framework can be used by practitioners at a number of levels:

  • to work with and involve the senior executive to develop the agency’s overall evaluation strategy—to review and inform the whole-of-agency approach to developing capability
  • to work with the relevant line managers to design an evaluation strategy for a broad programme of learning and development, e.g. leadership development across the agency
  • to design the evaluation of a specific learning and development activity, e.g. a half-day seminar introducing a new business system.

Components of the guide

The APS learning and development evaluation guide comes in two parts:

  • Part 1: Evaluating learning and development—a framework for judging success (this document)
  • Part 2: The practitioners’ website at www.apsc.gov.au/learn

The companion practitioners’ website is a collection of practical resources and tools that can be adapted and customised for use in agencies to best meet each agency’s specific business requirements.

Objectives of evaluating learning and development

  1. Assess whether intended learning and development objectives have been met
  2. Support continuous improvement of learning and development
  3. Assess whether resources are used wisely
  4. Assess the value for money of learning and development

Importance of evaluating learning and development

  • Effective evaluation is part of an accountable, professional and ethical public service. It is fundamental to good governance, good practice and good management.
  • Effective evaluation is an important and useful management tool to help facilitate and promote effective learning and development—vital for an increasingly interconnected, complex and contestable public sector environment.
  • Effective evaluation will ensure that an agency’s approach to people development aligns with its business goals and is good value for money.

Perspectives on evaluating learning and development

Evaluation of learning and development can reveal different levels of information, each with its own emphasis and contribution to overall evaluation results.

It is important to consider and clarify the overall purpose of the evaluation before starting to plan the evaluation strategy. The purpose can range from high-level strategic issues, such as assessing the impact on the agency's capability, to operational issues, such as improving the quality and efficiency of delivery of specific programmes and improving participants' capabilities.

Diagram shows the relationship between increasing evaluation outputs, the levels of evaluation information and the emphasis of each level

The APS model for evaluating learning and development

The model for evaluating learning and development, as set out in the better practice guide, Building capability—a framework for managing learning and development in the APS (published by the Australian Public Service Commission and the Australian National Audit Office in 2003), covers six elements of evaluation across three phases of the life cycle of a learning and development intervention.

The six elements are:

  1. relevance
  2. appropriateness
  3. reaction
  4. capability acquired
  5. performance on the job
  6. outcomes.

The three life-cycle phases are:

Line of sight phase (Before intervention)

Is the learning and development relevant and appropriate to the learning need, goals, context, culture, funding arrangements etc.?

Learning and performance phase (During intervention)

Is the learning and development well conducted and managed, and does it help learners gain and transfer the necessary capabilities?

Outcomes phase (After intervention)

Does the learning and development produce tangible and intangible results, and what impact do these have on individuals and the organisation?

Model for evaluating learning and development

1. Line of sight phase (Before intervention)

Relevance

  • Business need
  • Individual need
  • Agency context
  • Other HR processes

Appropriateness

  • Scope
  • How much
  • How long
  • What cost
  • What benefit
  • What risk
  • What alternative

2. Learning and performance phase (During intervention)

Reaction

  • Learner
  • Facilitator/presenter
  • Management

Capability acquired

  • Knowledge
  • Skills
  • Competency

Performance on the job

  • Learner
  • Supervisor
  • Next level manager

3. Outcomes phase (After intervention)

Outcomes of learning and development

  • Positive outcomes
  • Negative outcomes
  • Ambiguous outcomes
  • Value for money

Source: Building capability—a framework for managing learning and development in the APS.

Critical factors for success

Agency commitment and engagement

  • Extract and promote information from evaluations that assists managers to decide on:
    • the nature and type of learning needed for their people and teams
    • where to invest time, money and energy
    • how to improve the workplace culture to better support transfer and application of skills
    • the areas where succession and contingency planning need to focus
    • priorities for action, such as to develop new programmes, improve/modify existing programmes, reallocate resources.
  • Get line management input and sign-off for clearly stated learning outcomes and designs. Directly involving line managers in the planning and design stage will build their interest and clearly show their accountabilities for ensuring the desired results.
  • Provide feedback in ways that engage and involve the relevant manager/executive.

Adequate investment of time, money and resources

  • Make prudent decisions on evaluation an integral part of learning and development work plans. Be selective in what you choose to evaluate.
  • Allocate resources wisely to provide real results in areas that matter to the organisation.
  • Allocate the right amount of funds—proportionate to the end result desired.
  • Work with service providers to ensure that evaluation processes are built into the design of learning and development. Help them to help you.
  • Use existing data sources and existing data gathering methods to lower costs. Add value and use data in more than one way to decrease repetitive bureaucratic processes.

Sustainable practice

  • Embed learning and development evaluation processes to make them standard business processes within the organisation.
  • Make important evaluation information a standard feature in executive and management reports.
  • Ensure learning and development evaluation principles, strategies and practices are integrated within the agency’s HR practices—build a consistent approach to people management and avoid duplication.

Effective data, record and information management systems

  • Record and track minimum data sets such as expenditure and activity-related data* (an illustrative sketch follows the note below).
  • Build in data entry tasks as an integral component of any learning and development activity.
  • Design data requirements and collection methods during the design phase of learning and development.
  • Match data collection methods to the purpose of the evaluation.
  • Use multiple sources of data to provide a holistic picture.
  • Integrate and use human resource management information systems, learning management systems and financial management systems to support evaluation.

* Refer to the better practice guide: Australian Public Service Commission and the Australian National Audit Office 2003, Building capability—a framework for managing learning and development in the APS, for details on the recommended minimum data set.
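To illustrate what tracking such data might look like in practice, the sketch below records hypothetical expenditure and activity-related data for a single learning and development activity and derives a cost per participant. It is a minimal illustration only: the field names and figures are assumptions made for this example, not the recommended minimum data set, which is detailed in Building capability.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class LearningActivityRecord:
    """Hypothetical activity and expenditure record (illustrative field names only)."""
    activity_name: str      # e.g. a half-day seminar or a residential programme
    business_unit: str      # division or branch that sponsored the activity
    start_date: date
    end_date: date
    participants: int       # number of staff who attended
    delivery_cost: float    # provider fees, venue and materials
    salary_cost: float      # estimated cost of participant time away from the job


def cost_per_participant(record: LearningActivityRecord) -> float:
    """Total expenditure divided by the number of participants."""
    total = record.delivery_cost + record.salary_cost
    return total / record.participants if record.participants else 0.0


# Example usage with made-up figures.
record = LearningActivityRecord(
    activity_name="Introduction to the new business system",
    business_unit="Programme Division A",
    start_date=date(2005, 3, 1),
    end_date=date(2005, 3, 1),
    participants=18,
    delivery_cost=5400.0,
    salary_cost=3600.0,
)
print(f"Cost per participant: ${cost_per_participant(record):,.2f}")
```

Kept consistently across activities, records like this can later feed the reporting and value-for-money assessments described in Section 1 without additional data gathering.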

The framework consists of three sections

Section 1: Decide and plan—key decisions in evaluating learning and development

Provides guidance in thinking about, making decisions about and planning for the evaluation of learning and development.

Section 2: The learning and development evaluation maturity self-assessment

Provides guidance on how organisational capacity to evaluate can be measured and incrementally developed.

Section 3: Case study

Provides a structured process for the practitioner to anticipate, think about and prepare for typical challenges and issues that may arise in evaluating learning and development.

Together, the three sections are designed to help agencies to consider the relevant issues, make decisions and plan the evaluation of their learning and development.

Once the focus of the evaluation is clarified through this planning process, practitioners can then select relevant tools and resources from the companion practitioners’ website to assist them in implementing the evaluation. Resources are available to support evaluation of each of the six elements: relevance, appropriateness, reaction, capability acquired, performance on the job, and outcomes.

Section 1: Decide and plan—key decisions in evaluating learning and development

Chart shows key decisions at the centre of the topics covered in the following sections

Why is evaluation required?

To:

  • Determine success
  • Assess if objectives have been achieved
  • Make improvements
  • Ensure quality
  • Ensure accountability
  • Meet external requirements
  • Account for activity
  • Assess value or merit
  • Assess risk
  • Justify investments
  • Facilitate decisions whether to continue/discontinue the activity
  • Ensure appropriateness and alignment
  • Identify strengths and weaknesses

Tips

  • Be clear on the primary purpose of the evaluation
  • Make sure the purpose is understood, shared and signed off by stakeholders

Points to consider

  • Importance of the intervention: activities which are important to key business goals or are key to meeting critical skill needs
  • Profile of the intervention: high profile activities which are sponsored by key influential players in the organisation who may have particular or strong interests in the activity and the outcomes
  • Relative cost of the intervention: expensive, resource-heavy or time-consuming activity
  • Risk profile: programmes which have novel or risky features or which have OH&S implications
  • Relative age of the intervention: new and untested programmes and activities
  • Certified programmes and courses: competencies which require assessments and tests for certification or accreditation
  • Mandated: activities which are deemed compulsory by the organisation

Planning focus: Intent of the evaluation

What will be evaluated?

  • Relevance
  • Appropriateness
  • Reaction
  • Capability acquired
  • Performance
  • Outcomes

Tips

  • Not all areas need to be evaluated
  • Use The learning and development evaluation maturity self-assessment (see Section 2) to gauge your organisation’s capacity to evaluate the intended areas

Planning focus: Scope of the evaluation

The examples below show, for two illustrative agencies, the percentage of activities and programmes to be evaluated at each evaluation level.

Agency A example

This could be a policy agency with established highly experienced staff but significant change occurring in their relevant industry/sector.

Agency B example

This could be a fast growing agency with a significant number of new high profile responsibilities where service delivery must comply with complex legislation—programmes based on national qualification framework.

Percentage of activities and programmes to be evaluated:

  • Relevance: Agency A 90%, Agency B 100%
  • Appropriateness: Agency A 90%, Agency B 100%
  • Reaction: Agency A 70%, Agency B 50%
  • Capability acquired: Agency A 20%, Agency B 100%
  • Performance on the job: Agency A 20%, Agency B 80%
  • Outcomes: Agency A 10%, Agency B 10%

How will information obtained be used?

To:

  • Improve the learning process
  • Improve relevant business areas
  • Improve decision making
  • Improve investment decisions
  • Improve organisational culture
  • Engage with stakeholders
  • Meet internal and external reporting requirements
  • Manage risk
  • Market activities

Tips

  • Use the results proactively to inform and assist managers and individuals to better use learning and development.
  • Use the information to educate the organisation on the factors which help or hinder learning and development.

Planning focus: Outputs of the evaluation

Who will have an interest in the information?

  • The executive management group
  • Senior executives
  • Line managers and supervisors
  • Learners
  • Facilitators/coaches
  • Programme managers
  • HR managers and learning and development practitioners

Tips

  • Focus on groups that are relevant and who will or can have a positive influence
  • Don’t wait to be asked for the information
  • Use the information in business cases and for business planning

Planning focus: Target audience for evaluation findings

What data will need to be collected?

  • Relevance assessments
  • Appropriateness assessments
  • Investment and expenditure
  • Participant and facilitator reactions
  • Capability acquired
  • Performance on the job
  • Business outcomes
  • Work performance data
  • Activity-related data

Tips

  • Data requirements are driven by evaluation purpose
  • Create data gathering processes which easily become routine and simple
  • Use alternative and existing data sources such as staff surveys, performance management data etc. where possible
  • Design data gathering tools that can provide information for more than one purpose

Planning focus: Data requirements

How will this data be collected?

From:

  • Relevant planning documents
  • Reaction sheets
  • Assessments and tests
  • Participant and supervisor surveys
  • Success and failure case studies
  • Individual learning action plans
  • Individual impact maps
  • Performance management processes
  • Interviews
  • Structured workplace observations
  • Work outputs
  • Workforce performance data
  • HRMIS or LMS

Tips

  • Record and track data diligently
  • Use the human resource management information system (HRMIS) or learning management system (LMS) for data recording and tracking where possible
  • Use simple data collection methods
  • Use appropriate sampling techniques
  • Collect data from multiple sources
  • Uphold privacy requirements

Planning focus: Data sources and collection methods

When will this data be collected?

During the different phases of the whole cycle:

  • Line of sight phase
  • Learning and performance phase
  • Outcomes phase

Tips

  • Collect data in a timely manner
  • Build evaluation into learning and development design

Planning focus: Timeframes

How will information be reported?

As:

  • 'Dashboards' and scorecards
  • Activity measures
  • Performance measures
  • Cost benefit analysis
  • Returns on investment (a worked sketch follows this list)
  • Impact assessments
  • Qualitative reports
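As a minimal sketch of how two of the report formats listed above, cost benefit analysis and return on investment, can be derived, the example below uses entirely hypothetical figures. The formulas shown (benefit divided by cost, and net benefit divided by cost) are common conventions rather than methods prescribed by this guide.

```python
# Hypothetical worked example of cost benefit and return on investment (ROI)
# figures for a learning and development programme. All numbers are invented;
# the guide does not prescribe a particular formula.

programme_cost = 120_000.0      # delivery, venue and participant time costs
estimated_benefit = 150_000.0   # e.g. valued productivity or quality gains attributed to the programme

benefit_cost_ratio = estimated_benefit / programme_cost
roi_percent = (estimated_benefit - programme_cost) / programme_cost * 100

print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")   # 1.25
print(f"Return on investment: {roi_percent:.1f}%")       # 25.0%
```

The harder task in practice is estimating the benefit figure credibly; the tips that follow apply equally to how such figures are framed and reported.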

Tips

  • Report in a timely manner
  • Report in a constructive manner
  • Speak the language of the audience

Planning focus: Report format

Use each type of evaluative data as follows:
Relevance data
  • Demonstrate alignment of learning and development strategy with organisational goals and needs
Appropriateness data
  • Position learning and development as a lever for change
  • Highlight needs which should be met by other (non-learning and development) means
Reaction data
  • Influence service providers for better outcomes
  • Market the programme and encourage employee interest
  • Improve delivery and administration of the programme
Capability acquired data
  • Account for and demonstrate bench strength for succession planning, contingency planning and workforce planning purposes
  • Encourage (planned) employee mobility
Performance on-the-job data
  • Help managers and supervisors better support their staff in applying learning and development
  • Obtain and build 'buy in' and support from line managers and supervisors
  • Highlight structural and process aids and barriers to on-the-job performance
  • Inform learning and development procurement processes
Outcomes data
  • Demonstrate value of learning and development to the organisation
  • Assess linkages between positive outcomes and employee commitment
  • Enhance organisational branding as employer of choice
  • Inform learning and development procurement processes

Section 2: The learning and development evaluation maturity self-assessment

The maturity self-assessment is a guide for organisations to develop their maturity in the practice of good learning and development evaluation.

It covers seven elements, six of which are based on the elements of the APS model for evaluating learning and development. The additional element relates to evaluation strategy and planning which is an essential component for any mature evaluation capability.

Indicators

Each element is defined by several descriptors and illustrated by examples of observable outputs that can act as indicators of successful practice. These indicators are designed to enable concrete application of each element. Indicators can be viewed as desirable better practices.

The maturity self-assessment

The five-level maturity self-assessment is mapped against these indicators and is designed to help gauge, characterise and articulate the maturity of your agency against each indicator.

The maturity self-assessment can also be used to set incremental targets allowing each indicator to be applied with greater sophistication as may be required.

The five maturity levels are grouped into three broad categories to allow more refined maturity assessments and the tracking of incremental improvement.

For each relevant indicator, identify the level that best describes the current state of play in your agency, and then the level that best describes the desired state of play.
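A minimal sketch of how self-assessment ratings might be captured and summarised is shown below. It assumes a simple list of (indicator, current level, desired level) entries rather than any particular system; the indicator wording is abbreviated from this section and the ratings are invented for illustration.

```python
# Illustrative sketch: record current and desired maturity levels (1 = Ad hoc
# through 5 = Optimised) for each indicator, then list the largest gaps first.
# Indicator wording is abbreviated and the ratings are invented examples.

LEVELS = {1: "Ad hoc", 2: "Managed", 3: "Defined", 4: "Integrated", 5: "Optimised"}

# Each entry: (indicator, current level, desired level).
ratings = [
    ("Evaluation guided by an agency evaluation strategy", 2, 4),
    ("Programmes chosen for evaluation against accepted criteria", 3, 4),
    ("Evaluation data used to improve programmes and decisions", 1, 4),
]

# Sort by gap size so the biggest improvement targets appear first.
for indicator, current, desired in sorted(ratings, key=lambda r: r[2] - r[1], reverse=True):
    print(f"{indicator}: {LEVELS[current]} -> {LEVELS[desired]} (gap of {desired - current})")
```

Summarising the gaps in this way helps set the incremental targets mentioned above and gives the senior executive a concise view of where evaluation capability most needs to develop.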

The APS learning and development evaluation maturity model

1. Evaluation decisions and planning

Element

Decide what to evaluate, the level to which evaluation is to occur and plan for successful evaluation.

This entails (descriptors)

  • Developing an evaluation strategy and plan
  • Deciding what activities to evaluate based on accepted criteria
  • Deciding what phases to evaluate based on evaluation objectives
  • Ensuring evaluation activities are supported by the agency
  • Deciding how to use evaluation data effectively

Tracking the maturity of learning and development evaluation in your organisation

Rate each of the indicators below (what successful practice of this element would look like) against the five maturity levels:

  • Level 1: Ad hoc. Practice is applied poorly or inconsistently and has an uncertain level of acceptance.
  • Level 2: Managed. Practice is performed and managed with some skill for compliance reasons.
  • Level 3: Defined. Practice is defined, familiar, shared and skillfully performed.
  • Level 4: Integrated. Practice is embedded and seen as part of daily work and as adding real value.
  • Level 5: Optimised. Practice is continuously improved and adapted for agency outcomes.

Indicators:
1. Evaluation activities are guided by an agency evaluation strategy.          
2. The evaluation strategy describes the intended outputs and purpose of evaluation activity.          
3. Decisions on which programmes to evaluate are based on accepted criteria.          
4. Decisions on what to evaluate are based on the objectives of the evaluation.          
5. Evaluation and data collection decisions for programmes are made at early stages of programme design.          
6. Evaluation activities are supported by the agency leaders and managers.          
7. Data gathering and information management systems are in place and used to support evaluations.          
8. Evaluation data is used actively within the agency to improve programmes, decisions, processes and people development strategies.          

2. Relevance

Element

Assess how well proposed learning and development interventions address business needs, capability needs and individual needs within the agency.

This entails (descriptors)

  • Clarifying the purpose of any learning activity
  • Identifying linkages to business goals
  • Ensuring learning meets identified capability needs
  • Situating learning within an organisational capability framework
  • Assessing the potential of intended outcomes to contribute positively to organisational goals or needs

Tracking the maturity of learning and development evaluation in your organisation

Rate each of the indicators below (what successful practice of this element would look like) against the five maturity levels defined under element 1, from Level 1: Ad hoc to Level 5: Optimised.

Indicators:
1. Agency capability requirements at organisational, group and individual levels are understood and used as the basis for learning.          
2. Learning objectives have line of sight to workforce planning needs and business goals.          
3. Learning activities are approved by the agency.          
4. Intended outcomes from learning have reasonable potential to contribute to improved performance at individual and/or group and/or organisational level.          
5. Business cases are built for learning activities where appropriate.          

3. Appropriateness

Element

Measure the degree to which the allocation of resources to learning and development is appropriate to identified needs and priorities.

This entails (descriptors)

  • Identifying the extent of integration of learning and development with other HR strategies and business practices
  • Describing the desired benefits
  • Describing the scope of each intervention
  • Obtaining quantitative and qualitative information about the level and nature of investment
  • Assessing if the design of the intervention matches the desired culture and the needs of the target audience
  • Identifying risks
  • Identifying alternatives

Tracking the maturity of learning and development evaluation in your organisation

Rate each of the indicators below (what successful practice of this element would look like) against the five maturity levels defined under element 1, from Level 1: Ad hoc to Level 5: Optimised.

Indicators:
1. Learning is used to meet needs which can be met appropriately through development.          
2. Alternative strategies are considered in meeting needs.          
3. Learning activities are appropriate to the needs they are designed to address and the audience.          
4. Learning is linked to business planning, workforce planning, performance management and career development processes.          
5. Learning activities are appropriate to actual or desired organisational culture and context.          
6. Allocation of resources is proportionate to the need that the learning is designed to address.          
7. Adequate data sets on costs and activity are established and maintained.          
8. Cost benefit assessments or value for money assessments are conducted where necessary.          

4. Reaction

Element

Measure participants' and facilitators' immediate reactions to aspects of the intervention or learning and development activity.

This entails (descriptors)

  • Measuring participant reactions
  • Measuring facilitator and programme manager reactions
  • Using the data to make improvements

Tracking the maturity of learning and development evaluation in your organisation

Rate each of the indicators below (what successful practice of this element would look like) against the five maturity levels defined under element 1, from Level 1: Ad hoc to Level 5: Optimised.

Indicators:
1. Participant reactions are obtained when appropriate.          
2. Facilitator and programme manager reactions are obtained when appropriate.          
3. Data is used appropriately to improve activity-related aspects in a timely manner.          
4. Data is reported to relevant stakeholders including supervisors and activity sponsors.          
5. Data is used appropriately to generate organisational interest.          
6. Data is used in time series comparisons where appropriate.          

5. Capability acquired

Element

Evaluate the success of the learning and development activities by measuring whether the individual(s) and/or the agency have acquired the capability, knowledge, attitudes or competency required.

This entails (descriptors)

  • Acquisition of skills, knowledge and attitudes occurs in a logical and sequential manner
  • Measuring if performance standards have been achieved through valid and appropriate tests or assessments

Tracking the maturity of learning and development evaluation in your organisation

Rate each of the indicators below (what successful practice of this element would look like) against the five maturity levels defined under element 1, from Level 1: Ad hoc to Level 5: Optimised.

Indicators:
1. Standards for desired skill performance are identified in learning activities where appropriate.          
2. Standards are based on national frameworks where required or appropriate.          
3. Appropriate assessments or tests are used to measure the acquisition of skills, knowledge or attitudes.          
4. Opportunities for re-tests are built into learning activities where appropriate.          
5. Assessments appropriately reflect the context in which the skills, knowledge or attitude is to be performed.          

6. Performance on the job

Element

Assess individual performance on the job following learning and development activities.

This entails (descriptors)

  • Identifying what learning has been transferred and applied in the workplace
  • Identifying how transfer and application of learning has occurred (or why transfer and application of learning did not occur)
  • Strengthening transfer and application of learning processes and mechanisms

Tracking the maturity of learning and development evaluation in your organisation

Rate each of the indicators below (what successful practice of this element would look like) against the five maturity levels defined under element 1, from Level 1: Ad hoc to Level 5: Optimised.

Indicators:
1. Transfer and application of learning mechanisms are devised at early stages of programme design.          
2. Reasons for successful or unsuccessful transfer and application are understood by all stakeholders.          
3. Transfer and application of learning is measured where appropriate and at appropriate times.          
4. Contribution of learning (and other variables) to overall performance is identified.          
5. Data is used to improve transfer and application of learning processes and mechanisms in a timely manner.          
6. Organisational variables which impact on transfer and application are identified and managed.          

7. Outcomes

Element

Assess outcomes achieved at individual, group and/or organisational levels. They can be positive, negative, or at times, ambiguous.

This entails (descriptors)

  • Measuring value for money
  • Identifying accrued tangible and intangible results which lead to better business outcomes
  • Assessing how the same or better outcomes could be achieved in a cheaper and/or quicker manner
  • Identifying areas for improvement within the organisation and its culture

Tracking the maturity of learning and development evaluation in your organisation

Rate each of the indicators below (what successful practice of this element would look like) against the five maturity levels defined under element 1, from Level 1: Ad hoc to Level 5: Optimised.

Indicators:
1. Outcomes or results of transfer and application of learning (or lack of it) are identified and quantified in ways that are meaningful to the organisation.
2. Factors other than learning which contribute to outcomes are recognised and identified.          
3. Outcomes or results are analysed and framed in relation to business goals.          
4. Outcomes or results are analysed for their value or benefit in relation to cost, or as returns on investment where appropriate.          
5. Outcomes or results are reported and communicated to stakeholders.          
6. Data is used to improve or guide future decision making.          
7. Data is used to shape learning strategies.          
8. Data is used to shape organisational context and culture to better support learning.          

Section 3: Evaluating learning and development—a case study

This four-part case study is intended to help the practitioner anticipate, think about and prepare for typical challenges and issues that may arise in evaluating learning and development.

Use this case study:

  • to reflect on the issues raised and to think of how you would respond to the key questions
  • to generate discussion with your team or manager on how similar challenges and tasks in your organisation could be tackled
  • to identify, anticipate, prepare and plan for potential issues and activities that may arise during learning and development evaluation
  • to inform your stakeholders of evaluation-related issues so that they can better support your evaluation effort
  • to help new human resource and learning and development practitioners understand and plan for some of the complexities involved with learning and development evaluation in the APS.

Case study

Part 1: The big picture

Rebecca is an Executive Level 2 working in the HR branch of a medium-sized Australian Public Service agency which has policy and programme responsibilities. She is currently Director People Development and was promoted into the job after a three-year stint as a line manager in one of the agency’s more prominent programme divisions.

As line manager, Rebecca made it a point to actively support her three teams in their professional development and paid much attention to building their technical capabilities as well as their skills in working well within and across teams. A teacher by background, she has always valued learning.

Part of her vision for her new role is to ensure that operational areas within the agency are well supported with a variety of business-related learning and development opportunities. These would provide practical help to people working with the many complexities of policy development and programme implementation. Learning and development in the agency, according to Rebecca, is supposed to be about practical skill development in areas relevant to the agency’s core business.

While the branch has lead responsibility for many HR-related functions, processes and policies, the branch also works in close collaboration with two HR support cells that service two of the largest divisions in the agency.

These cells essentially provide operational support to divisional staff including personnel administration, recruitment, case management and co-ordinating learning and development activities in highly specialised technical areas.

The branch has a good reputation within the agency and most divisional heads have been appreciative and supportive of branch initiatives. Of late however, the branch has been under increasing pressure to get the agency’s HRMIS in order so that it provides useful workforce performance reports for the executive, and to streamline its menu of learning and development activities.

Some division heads have also mentioned, on more than one occasion, the need for the branch to demonstrate the returns that the agency’s expensive middle-management programme is making on its investment. This programme is paid for by all the divisions based on their quota of participants. The larger divisions have felt the burden of having to invest more, given the higher proportion of their participants.

This pressure, together with the recent focus on learning and development evaluation by the Government and central APS agencies, has compelled the branch head to think about the need for an agency learning and development evaluation strategy. To ignore these pressures would be to risk divisional support and goodwill and the relevance of the learning and development function. Divisions would simply develop their own learning and development agendas if they felt their needs weren’t being met by corporate HR.

As Director People Development, Rebecca has primary responsibility for the design, development and implementation of this strategy. She knows that this is a daunting task.

The agency runs over 30 learning and development activities including:

  • competency certification courses
  • postgraduate programmes in policy development
  • ‘study bank’—agency-approved time for formal study
  • supervisor development programmes
  • IT training
  • E-learning for OH&S and diversity issues
  • a middle management residential programme
  • an action learning set in a policy area
  • study visits of community stakeholder groups
  • self-paced study manuals
  • external attachments
  • occasional lunchtime seminars
  • basic APS procedural and administrative development programmes
  • executive coaching services for the senior executives
  • brokering external programmes.

Key questions to consider:
  1. What can be done to begin developing the strategy?
  2. What will be the key objectives of this strategy?
  3. What criteria can be used to determine which programmes will be evaluated?
  4. What criteria can be used to help identify the level of evaluation that would be required?

Part 2: Issues around evaluating relevance

Rebecca does realise that any good learning and development evaluation begins with an assessment of the programme’s relevance. After all, it's stated in the APS evaluation model! But she is not sure how to actually test each programme’s relevance, especially now that processes like skills audits and needs analyses seem to have fallen out of favour with many learning and development practitioners and employees.

The branch did recently attempt to collate training needs data from its performance management processes, but as this is still paper-based and confidential, obtaining this data has been tedious with patchy results.

In the absence of reliable needs data, the need for learning is therefore very much shaped by a variety of interests and agendas. This has led to a purely reactive approach on the part of the branch and an unsatisfactory sense that the agency’s learning and development strategy is a patchwork of unintegrated events.

Key questions to consider:
  1. How can relevance of a programme be tested for existing programmes and how can it be guaranteed for new programmes?
  2. How can relevance of learning and development to organisational goals/objectives/outcomes be achieved?

Issues around evaluating appropriateness

The agency has been brave in being an early adopter of novel learning and development initiatives. For example, it was one of the first agencies in the APS to use an e-learning system for its OH&S training and to employ executive coaching for its senior executives.

On the face of it, the agency has a rich menu of formal and work-based learning activities. Rebecca is, however, uncomfortable with the ad hoc nature of this ‘menu’, as are some division heads. They have started to question the appropriateness of some learning and development activities which they have to fund.

As well, increasing budgetary constraints, time and workload pressures throughout the agency mean that HR is increasingly under pressure to justify each learning and development activity it proposes.

Key questions to consider:
  1. How can appropriateness of a programme be evaluated?
  2. How can its appropriateness, once determined, be communicated?

Part 3: Issues around evaluating learners' reaction

Reaction level evaluation of learning and development is carried out within the agency. Reaction sheets are religiously handed out at the end of every activity and the HR branch has used the results to make judgements on content, service providers and administrative arrangements. Findings are generally discussed within the branch and are often used to drum up participant interest within the agency.

Key questions to consider:
  1. What is the purpose of the reaction sheets?
  2. How can you use this reaction data in a more strategic or creative way?

Issues around evaluating capability acquired

The agency corporately funds several competency-based programmes which result in certifications at the certificate IV and diploma levels. Successful participants are generally considered technical experts within the agency. Participants seem to value the certification they receive at the end of their learning and development assessment.

Rebecca has felt for some time now that many other programmes would benefit from having some form of assessment to help ensure participants actually master the skills being taught. She is aware of people in the agency, for example, who can’t complete tasks even after attending a course. She is not quite sure how to go about introducing such assessments and is unsure how employees would react to the idea of assessments in non-certified programmes.

Key questions to consider:
  1. What activities are best suited for capability assessments?
  2. What assessments can be introduced to ensure that skills are actually gained from these activities?

Issues around evaluating performance on the job

There is little that the agency does formally at the moment to gauge if employees actually apply what they’ve learned from learning and development activities. Application on the job has always been seen by HR as being the responsibility of line managers, except where HR has helped implement work-based activities, such as action learning sets and coaching, which seemed to have obvious and visible work-based applications. Some information on application of learning, Rebecca assumed, would be captured in the agency’s performance management processes. However, as this is still paper-based and confidential to the individual, such information would be hard to obtain.

Rebecca does know anecdotally that many participants do seem enthusiastic after taking part in learning and development activities. But she also realises that this enthusiasm and energy dissipates quickly once participants return to their workplaces, and is keen to learn why this is so and how it affects the transfer of learning.

Key questions to consider:
  1. How can your agency find out if learning has been applied on the job?
  2. What would some barriers to application be?
  3. What would encourage application of learning?
  4. What can the branch do to more actively encourage the application or transfer of learning?

Part 4: Issues around evaluating outcomes

At the moment, there is very little that the agency does to gauge formally the difference learning has made to its outputs and outcomes.

There is a commonly held view that the learning is generally valuable and beneficial to individuals and the organisation.

There has, however, been no formal effort to actually describe and report what these benefits might be and how valuable they actually are. Recent reporting has been limited to learning and development activity reports and Rebecca is not even confident that these are accurate, as the HRMIS does not yet capture such data well.

Current reported data is often compiled from doing the ‘ring around’ with branches to find out how many of their people participated in formal learning and development activities over the previous quarter. The branch did recently create an Excel database for each division to record, cost and report their learning and development activities, but this is used inconsistently.

The agency’s last performance management cycle does seem to report an increase in performance ratings especially for the executive-level feeder groups. Rebecca and her colleagues are unsure if this is a result of participation in learning and development activities.

Several division heads have also queried the cost of the middle management programme and say that similar programmes conducted elsewhere are less expensive.

While there is a general consensus that this programme is well conducted and received, and that the external service provider is an excellent partner, some senior executives do think that the agency is paying more than it’s worth. As a former participant on the programme, Rebecca does not agree with this view and would like to demonstrate that the programme is making an impact on individuals and their commitment to the agency. She does feel strongly that the investment made by divisions is worthwhile.

However, she is unsure how to go about demonstrating this with meaningful data. Concepts such as value for money, returns on investment and impact statements all seem useful, but she is unsure which would be most appropriate.

Key questions to consider:
  1. How would you show that the programme is worthwhile?
  2. What data would be useful to report to senior management? How can you start to get hold of this data?
  3. Which evaluation approaches would be useful in helping demonstrate the programme’s impact?

Appendix 1: List of web-based resources

Decide and plan

Decisions about what and how to evaluate any aspect of learning and development should generally be made as part of the broad evaluation strategy for your organisation.

  1. Read and use the 'Prioritisation matrix' to help you make decisions about what aspects of learning and development you wish to evaluate. Also scan the 'Tips for engaging stakeholders' and references and resources available in this section of the website to gain a broad understanding of issues around planning your evaluation.
  2. Select the most appropriate tools for your purpose and context or use these to trigger other ideas.
  3. Adapt the tools for your own use.
  4. Apply the tools to assist you in making decisions about what and how to evaluate selected learning and development interactions, and create an evaluation plan.
  5. Gain key stakeholders' endorsement and commitment to the evaluation plan.

Resources:

  • Prioritisation matrix
  • Evaluation plan template
  • Tips on engaging stakeholders
  • Types of learning and development failures
  • Quality checklist
  • References and resources for evaluation planning

Line of sight phase

The decision to conduct an evaluation of the relevance and appropriateness of learning and development should generally be made as part of the broad evaluation plan for your organisation.

  1. Read and use the planning guides to plan for your evaluation of relevance and appropriateness. Also scan the references and resources available in this section of the website to gain a broad understanding of issues around relevance and appropriateness.
  2. Select the most appropriate tools for your purpose and context or use these to trigger other ideas
  3. Adapt the tools for your own use
  4. Apply the tools to assist you to evaluate the relevance and appropriateness of selected learning and development interventions

Relevance

  • Planning guide for evaluating relevance
  • Relevance index

Appropriateness

  • Planning guide for evaluating appropriateness
  • Appropriateness index
  • Output potential index

Relevance and appropriateness

  • Tips when evaluating relevance and appropriateness
  • Relevance and appropriateness scorecard
  • References and resources for evaluating relevance and appropriateness

Learning and performance phase

The decision to conduct an evaluation of the learning and performance resulting from learning and development should generally be made as part of the broad evaluation strategy for your organisation.

  1. Read and use the guiding table to plan for your evaluation of learning and performance. Also scan the references and resources available in this section of the website to gain a broad understanding of issues around learning and performance.
  2. Select the most appropriate tools for your purpose and context or use these to trigger other ideas
  3. Adapt the tools for your own use
  4. Apply the tools to assist you to evaluate the learning and performance resulting from selected learning and development interventions
  5. Report your evaluation findings to key stakeholders

Reaction

  • Participant reaction to learning and potential for application
  • Participant reactions
  • Sample participant reaction evaluation questionnaire for a complex programme

Capability

  • Techniques to evaluate learning and performance
  • Accountable learning agreement

Performance

  • Planning guide for evaluating learning and performance
  • Individual learning impact map
  • Learning survey (for use on desktop)
  • Learning survey (for use as hardcopy)

Capability and performance

  • Structured observation log
  • Reporting on learning and performance evaluation
  • Tips when evaluating learning and performance
  • References and resources for evaluating learning and performance

Outcome phase

The decision to conduct an evaluation of the impact resulting from learning and development should generally be made as part of the broad evaluation strategy for your organisation. Use the following suggested tools to assist you in evaluating impact when required.

  1. Read and use the guiding table to plan for your evaluation of impact. Also scan the references and resources available in this section of the website to gain a broad understanding of issues around impact evaluation.
  2. Select the most appropriate tools for your purpose and context or use these to trigger and prompt other ideas.
  3. Adapt the tools for your own use.
  4. Apply the tools to assist you to evaluate the impact of selected learning and development interventions.
  5. Report your evaluation findings to relevant stakeholders.

Resources:

  • Planning guide for evaluating impact
  • Success case method
  • Lessons learned method
  • Returns on investment
  • Impact statement
  • Value for money assessments
  • Tips when evaluating impact
  • References for impact evaluation

Appendix 2: References and resources

Australian Public Service Commission and the Australian National Audit Office 2003, Building capability—a framework for managing learning and development in the APS, Australian Public Service Commission and the Australian National Audit Office, Canberra.

Australian National Audit Office 2002, Managing people for business outcomes, Audit Report No. 61, ANAO, Canberra. www.anao.gov.au

Australian National Audit Office 2003, Managing people for business outcomes, year two, Audit Report No. 50, ANAO, Canberra. www.anao.gov.au

ACT Chief Minister’s Department 2003, ACT public service learning and development framework, CMD, Canberra. www.psm.act.gov.au/publications/L&D_framework-final.pdf

Commonwealth Department of Health and Aged Care 2001, Evaluation: a guide for good practice, Commonwealth Department of Health and Aged Care, Canberra. www.health.gov.au/internet/wcms/publishing.nsf/Content/mentalhealth-resources-evaluation

Centrelink 2004, Centrelink evaluation handbook: a guide for planning and conducting evaluations in Centrelink, Centrelink, Canberra.

Finance and Public Administration References Committee 2003, Recruitment and training in the Australian Public Service, The Senate, Canberra. www.aph.gov.au/senate/committee/fapa_ctte/completed_inquiries/2002-04.htm

Management Advisory Committee 2003, Organisational renewal, MAC, Canberra.

State Services Commission 2001, A framework for measuring training and development in the state sector: working paper no.12, State Services Commission, New Zealand. www.ssc.govt.nz

Appendix 3: Glossary of terms

Alignment

Vertical agreement of strategies, structures, capabilities, technology and processes with corporate goals, cascading to lower level plans and activities. These components are interdependent and must be viewed in relation to one another to ensure they are congruent with the overall corporate directions and desired culture.

Coaching

The practice of instructing, demonstrating, directing, prompting and encouraging participants. Generally concerned with methods rather than concepts.

Evaluation

A systematic, objective assessment of the appropriateness, efficiency and effectiveness of a programme or part of a programme.

The process of gathering information in order to make good decisions. It is broader than testing, and includes both subjective (opinion) input and objective (fact) input. Evaluation can take many forms including value for money assessment, portfolio assessment, 360° feedback and self-reflection.

Governance

Encompasses how an organisation is managed, its corporate and other structures, its culture, its policies and strategies, and lines of accountability.

Human resource/ People management

A series of organisationally approved initiatives designed to facilitate the effective management of people to achieve agency outputs and outcomes. This includes specific practice areas such as organisational development, workforce planning, recruitment and selection, performance management, learning and development, reward and recognition, workplace diversity, employee relations and occupational health and safety.

Integrate

To make strategies and processes horizontally compatible—share common practices and data sets etc. to achieve efficiencies, avoid duplication and maintain a congruent image and seamless process for users.

Learning and development

'Learning and development' refers to all processes associated with the identification of agency and individual requirements in relation to capability development, and the design, delivery and/or brokering of opportunities to develop the capability of individuals and groups within the agency.

Learning and development activity

Any activity specifically designed to build the capability of individuals. This includes a wide range of on-the-job and off-the-job processes, such as: structured programmes, formal education, networks and forums, job rotation, coaching and mentoring. A list of such interventions is found in Appendix 2, Building capability—a framework for managing learning and development in the APS, 2003, published by the APS Commission and the ANAO.

Mentoring

Three mentoring roles can exist in a work context:

  • mainstream mentor—someone who acts as a guide, adviser and counsellor at various stages in someone’s career
  • professional qualification mentor—someone required by a professional association to be appointed to guide a student through a programme of study, leading to a professional qualification
  • vocational qualification mentor—someone appointed to guide a candidate through a programme of development and the accumulation of evidence to prove competence to a standard.

The mentor’s role includes:

  • acting as a sounding board
  • sharing benefit of experience and perspective
  • highlighting opportunities
  • providing opportunities where the individual could showcase their talent.

Performance indicators

Information that can be used as the basis for determining the outcome, or impact, of particular learning and development activities or programmes—generally agreed with stakeholders ahead of time.

Workforce planning

A continuous process of shaping the workforce to ensure that it is capable of delivering organisational objectives now and in the future. The desired outcomes of workforce planning are its effective integration into an agency’s strategic planning framework and the alignment of HR strategies to continuously deliver the ‘right people in the right place at the right time’.

Appendix 4: Abbreviations

ANAO

Australian National Audit Office

APS

Australian Public Service

HR

Human resources

HRMIS

Human resource management information system

LMS

Learning management system

OH&S

Occupational health and safety