Performance measure 3
Key activity 3
Build APS capability and leadership for the future
We measure our performance against this key activity using the following performance measures:
3.1 Support APS leadership to develop their leadership capability.
3.2 Contribute to an uplift in APS capability in the domains of APS Craft.
Performance measure 3.1
Support APS leadership to develop their leadership capability
Targets and measurement
2025–26 Target | 2026–27 | 2027–28 | 2028–29 |
---|---|---|---|
3.1.1 80% of learners report a moderate or significant expected impact on their performance against the course learning outcomes (baseline) | Targets to be set based on 2025–26 baseline | As per 2026–27 | As per 2026–27 |
3.1.2 In courses that include practitioners, at least 85% of learners report that practitioner contributions enhanced their learning (baseline) | Targets to be set based on 2025–26 baseline | As per 2026–27 | As per 2026–27 |
3.1.3 ≥ 440 SES participated in APS Academy leadership programs | As per 2025–26 | As per 2025–26 | As per 2025–26 |
Overall performance is determined based on the following assessment scale:
- Achieved – all targets are achieved.
- Substantially achieved – at least two targets are substantially achieved or better.
- Partially achieved – at least one target is partially achieved or better.
- Not achieved – no target is at least partially achieved.
Rationale
Under section 41 of the Public Service Act 1999, the Commissioner is responsible for fostering, and contributing to, leadership, high quality learning and development and career management in the APS. This measure reflects the APSC’s contribution, through the APS Academy, to that role by supporting uplift in the foundational capabilities of the Australian Public Service. The APS Craft domains – including leadership and management – represent a core area of expertise that underpins effective performance in all roles and functions. The sub-measures focus on delivering quality programs, embedding practitioner insights to ensure learning is relevant and reflects the APS context, and tracking the reach of the programs. The focus on leadership is reinforced through the APS Learning and Development Strategy and Action Plan, which states the Commission will support the uplift of public service capability and build leadership for the future by developing the critical capabilities identified in the APS Workforce Strategy, building an APS learning culture that encourages and supports continuous learning.
To deliver on these commitments and build APS capability and leadership for the future, the Commission provides high quality, one-APS senior executive leadership development and talent programs. Programs include SES Welcome; SES Orientation; SES Band 1 leadership; SES Band 2 leadership; and the Senior Executive Stewardship Program. In addition, the APS Secretaries’ Talent Councils, supported by the APSC, manage talent assessment and development programs for SES Band 3s and high-potential SES Band 1s and Band 2s.
Type of measure
Effectiveness
Methodology
3.1.1 – Survey data is used to determine the results of this performance measure. Participants respond to the survey question “To what extent do you expect this course will have a positive impact on your performance in the following areas” using a four-point scale (no impact, slight impact, moderate impact, significant impact). The “following areas” are a list of the course learning outcomes. The result is calculated as the proportion of respondents who selected “moderate impact” or “significant impact” against one or more learning outcome. Survey responses are aggregated across all deliveries of the same course in the reporting period.
The assessment scale for this target is:
- Achieved – 80% or more of survey respondents selected “moderate” or “significant” when asked about the expected impact on their performance.
- Substantially achieved – 75% to 79% of survey respondents selected “moderate” or “significant” when asked about the expected impact on their performance.
- Partially achieved – 60% to 74% of survey respondents selected “moderate” or “significant” when asked about the expected impact on their performance.
- Not achieved – fewer than 60% of survey respondents selected “moderate” or “significant” when asked about the expected impact on their performance.
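As a hypothetical illustration only, the 3.1.1 result and rating could be derived from survey exports along the following lines. The data structure and function names are assumptions for illustration, not the Academy's actual implementation:

```python
# Hypothetical sketch of the 3.1.1 calculation: the proportion of survey
# respondents who selected "moderate impact" or "significant impact" for
# at least one course learning outcome, mapped onto the assessment scale.
# Field names and record shapes are illustrative assumptions only.

POSITIVE = {"moderate impact", "significant impact"}

def impact_result(responses):
    """responses: list of dicts mapping learning-outcome name -> rating."""
    if not responses:
        return 0.0
    hits = sum(
        1 for r in responses
        if any(rating in POSITIVE for rating in r.values())
    )
    return 100 * hits / len(responses)

def impact_rating(pct):
    """Map a result percentage onto the published assessment scale."""
    if pct >= 80:
        return "Achieved"
    if pct >= 75:
        return "Substantially achieved"
    if pct >= 60:
        return "Partially achieved"
    return "Not achieved"

responses = [
    {"Outcome A": "significant impact", "Outcome B": "no impact"},
    {"Outcome A": "slight impact", "Outcome B": "no impact"},
    {"Outcome A": "moderate impact", "Outcome B": "moderate impact"},
    {"Outcome A": "slight impact", "Outcome B": "significant impact"},
]
pct = impact_result(responses)  # 3 of 4 respondents -> 75.0
```

On this example data, a 75.0% result would map to "Substantially achieved" on the scale above.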
3.1.2 – Survey data is used to determine the results of this performance measure. In courses that include practitioners (described to learners in the survey as “guest speakers or presenters”), learners respond to the survey question “How did the contributions of the guest speakers or presenters enhance your learning?” Learners are provided with five positive response options. One mutually exclusive negative response is included (“did not enhance my learning”). The measure reports the percentage of respondents selecting at least one positive option.
The assessment scale for this target is:
- Achieved – 85% or more of respondents in applicable courses selected one or more positive responses indicating practitioner contributions enhanced learning.
- Substantially achieved – 80% to 84% of respondents in applicable courses selected one or more positive responses indicating practitioner contributions enhanced learning.
- Partially achieved – 70% to 79% of respondents in applicable courses selected one or more positive responses indicating practitioner contributions enhanced learning.
- Not achieved – fewer than 70% of respondents in applicable courses selected one or more positive responses indicating practitioner contributions enhanced learning.
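A hypothetical sketch of the 3.1.2 calculation follows. The positive option labels are invented for illustration; only the negative option is quoted from the methodology:

```python
# Hypothetical sketch of the 3.1.2 calculation: the percentage of
# respondents in practitioner-supported courses who selected at least one
# positive response option. Because the negative option is mutually
# exclusive, any non-empty selection that excludes it counts as positive.
NEGATIVE = "did not enhance my learning"

def practitioner_result(selections):
    """selections: list of sets of response options chosen per respondent."""
    if not selections:
        return 0.0
    positive = sum(1 for s in selections if s and NEGATIVE not in s)
    return 100 * positive / len(selections)

selections = [
    {"provided real-world context"},           # illustrative option label
    {"shared practical examples",              # illustrative option label
     "deepened my understanding"},             # illustrative option label
    {NEGATIVE},
    {"provided real-world context"},
]
result = practitioner_result(selections)  # 3 of 4 respondents -> 75.0
```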
3.1.3 – The results are calculated using enrolment data for each program during the reporting period. APSLearn collects enrolment data by program cohort at the commencement of each program. SES Orientation runs 12 cohorts per year; SES Band 1 leadership runs 6 cohorts per year; SES Band 2 leadership runs 1–2 cohorts per year; and Senior Executive Stewardship runs 1 cohort per calendar year. The Deputy Secretaries and Secretaries Talent Councils each run 1 assessment per year.
The assessment scale for this target is:
- Achieved – a 10% or greater increase in SES participating in APS Academy leadership and APSC talent programs from 2023–24 (440), indicating more SES are accessing leadership development tailored to the contemporary APS context.
- Substantially achieved – an 8% to 9.9% increase from the 2023–24 baseline (440).
- Partially achieved – a 2% to 7.9% increase from the 2023–24 baseline (440).
- Not achieved – less than a 2% increase from the 2023–24 baseline (440), indicating similar numbers of SES are accessing leadership development tailored to the contemporary APS context.
Data sources
3.1.1 & 3.1.2 – Survey data is collected at the conclusion of each facilitated program, including the talent programs. The survey is distributed to participants via two channels: a link provided by facilitators during the course and a follow-up email issued after course completion. Participation in the survey is voluntary and responses are anonymous. The survey application for SES Leadership programs, Qualtrics, includes automated controls to detect invalid entries, duplicate submissions and incomplete responses allowing for manual review and inclusion assessment. MS Forms is used for talent programs.
Survey data is collected in Qualtrics and aggregated in GovTEAMS through an automated workflow. Data is collected continuously and updated weekly into Power BI dashboards developed and administered by the APS Academy Capability Strategy & Insights team. Access to the Academy’s Power BI workspace is controlled by the PM&C corporate analytics team.
3.1.3 – Attendance and enrolment data are captured through APSLearn. Talent program enrolment data is manually entered and available on request.
Caveats and disclosures
Nil.
Owner
Leadership and Talent Development Branch.
Changes from previous year
Target 3.1.3 is substantially similar to the 2024–25 Corporate Plan planned performance result 2.1.1 and the corresponding 2025–26 PBS planned performance result. Targets 3.1.1 and 3.1.2 replace the 2024–25 Corporate Plan planned performance result 2.1.2 and the corresponding 2025–26 PBS planned performance result.
We have made changes to our targets to better align with their strategic intent and to ensure results are based on reliable, verifiable data sources. These amendments strengthen the link between the measures and the APS Academy’s purpose, while improving clarity, relevance and integrity of reported results. Key improvements include:
- Strengthened rigour in measuring course performance – shifting from a broad satisfaction focus to a self-assessment of the learner’s anticipated workplace performance against the course’s stated learning outcomes. This ensures the measure reflects both course quality and relevance and the potential for learning transfer.
- Enhanced measurement of practitioner involvement – refining the measure to capture whether practitioner contributions have aided an individual’s learning.
A measure to assess the Academy’s reach has been retained.
Performance measure 3.2
Contribute to an uplift in APS capability in the domains of APS Craft
Targets and measurement
2025–26 Target | 2026–27 | 2027–28 | 2028–29 |
---|---|---|---|
3.2.1 ≥80% of respondents report a moderate or significant expected impact on their performance against the course learning outcomes (baseline) | Targets to be set based on 2025–26 baseline | As per 2026–27 | As per 2026–27 |
3.2.2 In courses that include practitioners, at least 85% of respondents report that practitioner contributions enhanced their learning (baseline) | Targets to be set based on 2025–26 baseline | As per 2026–27 | As per 2026–27 |
3.2.3 ≥ 80% of APS agencies engage with the APS Academy through employee participation in courses or events (baseline) | Targets to be set based on 2025–26 baseline | As per 2026–27 | As per 2026–27 |
3.2.4 ≥85% of actions from APS Learning and Development Strategy Adapt Action Plan are completed within their endorsed timeframes (baseline) | Targets to be set based on 2025–26 baseline | As per 2026–27 | As per 2026–27 |
The overall measure result in the annual performance statements will be determined by the lowest-rated target for the measure.
Rationale
Under section 41 of the Public Service Act 1999, the Commissioner is responsible for fostering, and contributing to, leadership, high quality learning and development and career management in the APS. This measure reflects the Commission’s contribution, through the APS Academy, to that role by supporting uplift in the foundational capabilities of the APS. The APS Craft domains represent core areas of expertise across the service and underpin effective performance in all roles and functions. The sub-measures focus on delivering quality learning courses, embedding practitioner insights to ensure learning is relevant and reflects the APS context, tracking the percentage of APS agencies engaging with APS Academy capability uplift activities, and supporting implementation of strategic actions that enhance a one-APS approach to capability development.
Type of measure
Effectiveness and efficiency
Methodology
3.2.1 – Survey data is used to determine the results of this performance measure. Participants respond to the survey question “To what extent do you expect this course will have a positive impact on your performance in the following areas” using a four-point scale (no impact, slight impact, moderate impact, significant impact). The “following areas” are a list of the course learning outcomes. The result is calculated as the proportion of respondents who selected “moderate impact” or “significant impact” against one or more learning outcome. Survey responses are aggregated across all deliveries of the same course in the reporting period.
The assessment scale for this target is:
- Achieved – ≥80% of survey respondents selected “moderate” or “significant” when asked about the expected impact on their performance.
- Substantially achieved – 75% to 79% of survey respondents selected “moderate” or “significant” when asked about the expected impact on their performance.
- Partially achieved – 60% to 74% of survey respondents selected “moderate” or “significant” when asked about the expected impact on their performance.
- Not achieved – < 60% of survey respondents selected “moderate” or “significant” when asked about the expected impact on their performance.
3.2.2 – Survey data is used to determine the results of this performance measure. Applicable courses are Academy courses in which an APS practitioner contributes to the delivery of the course in addition to a course facilitator. In courses that include practitioners (described to learners in the survey as “guest speakers or presenters”), learners respond to the survey question “How did the contributions of the guest speakers or presenters enhance your learning?” Learners are provided with five positive response options. One mutually exclusive negative response is included (“did not enhance my learning”). The measure reports the percentage of respondents selecting at least one positive option.
The assessment scale for this target is:
- Achieved – ≥85% of respondents in applicable courses selected one or more positive responses indicating practitioner contributions enhanced learning.
- Substantially achieved – 80% to 84% of respondents in applicable courses selected one or more positive responses indicating practitioner contributions enhanced learning.
- Partially achieved – 70% to 79% of respondents in applicable courses selected one or more positive responses indicating practitioner contributions enhanced learning.
- Not achieved – < 70% of respondents in applicable courses selected one or more positive responses indicating practitioner contributions enhanced learning.
3.2.3 – Attendance and enrolment data are used to determine whether an agency is engaging with APS Academy courses or events. An agency is counted as “engaged” if its employees have attended a live learning course or event, or enrolled in an eLearning course, during the reporting period. The total number of engaged agencies is calculated as a percentage of all APS agencies as defined in the most recent APSED data tables published prior to the commencement of the reporting period.
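A hypothetical sketch of the 3.2.3 calculation is below. The record shape and agency names are illustrative assumptions; the APSED-derived agency list is the denominator, as the methodology describes:

```python
# Hypothetical sketch of the 3.2.3 calculation: the percentage of APS
# agencies with at least one employee attending a course or event, or
# enrolled in eLearning, during the reporting period. Record shapes and
# agency names are illustrative assumptions only.

def engagement_rate(activity_records, all_agencies):
    """activity_records: iterable of (agency, activity_type) tuples.
    all_agencies: set of agencies from the latest published APSED tables."""
    engaged = {agency for agency, _ in activity_records if agency in all_agencies}
    return 100 * len(engaged) / len(all_agencies)

agencies = {"Agency A", "Agency B", "Agency C", "Agency D", "Agency E"}
records = [
    ("Agency A", "live course"),
    ("Agency A", "eLearning"),   # repeat engagement counts once
    ("Agency B", "event"),
    ("Agency C", "eLearning"),
    ("Agency F", "event"),       # not in the APSED list; excluded
]
rate = engagement_rate(records, agencies)  # 3 of 5 agencies -> 60.0
```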
The assessment scale for this target is:
- Achieved – ≥ 80% of APS agencies have employees participating in courses or events.
- Substantially achieved – 70% to 79% of APS agencies have employees participating in courses or events.
- Partially achieved – 50% to 69% of APS agencies have employees participating in courses or events.
- Not achieved – < 50% of APS agencies have employees participating in courses or events.
3.2.4 – Data is drawn from the internal implementation register that tracks progress of all endorsed actions. The result is calculated as the percentage of all actions completed within their endorsed timeframes (baseline or revised with governance approval from Academy Management Committee). Completion is verified through governance reporting to Academy Management Committee and APS Learning Board. Delayed or deferred actions must have a documented and approved justification to be excluded from the count. In some documentation, ‘actions’ are described as ‘projects’.
The assessment scale for this target is:
- Achieved – ≥ 85% of actions were completed on time, as recorded in the endorsed tracking framework.
- Substantially achieved – 75% to 84% of actions completed within endorsed timeframe.
- Partially achieved – 50% to 74% of actions completed within endorsed timeframe.
- Not achieved – < 50% of actions completed within endorsed timeframe.
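The 3.2.4 calculation could be sketched as follows. This is an illustrative assumption about how the implementation register might be processed, not the actual tracking framework; field names are invented:

```python
# Hypothetical sketch of the 3.2.4 calculation: the percentage of endorsed
# actions completed within their endorsed (or governance-approved revised)
# timeframe. Actions with a documented, approved deferral are excluded
# from the denominator, per the methodology. Field names are illustrative.
from datetime import date

def on_time_rate(actions):
    """actions: list of dicts with 'due', 'completed', 'deferral_approved'."""
    counted = [a for a in actions if not a.get("deferral_approved")]
    if not counted:
        return 0.0
    on_time = sum(
        1 for a in counted
        if a["completed"] is not None and a["completed"] <= a["due"]
    )
    return 100 * on_time / len(counted)

actions = [
    {"due": date(2026, 2, 1), "completed": date(2026, 1, 20), "deferral_approved": False},
    {"due": date(2026, 5, 1), "completed": date(2026, 6, 1), "deferral_approved": False},
    {"due": date(2026, 5, 1), "completed": None, "deferral_approved": True},  # excluded
    {"due": date(2026, 9, 1), "completed": date(2026, 9, 1), "deferral_approved": False},
]
rate = on_time_rate(actions)  # 2 of 3 counted actions on time
```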
Data sources
3.2.1 & 3.2.2 – Survey data is collected at the conclusion of each facilitated course using a standard post-course evaluation survey administered by the APS Academy. The survey is distributed to participants via two channels: a link provided by facilitators during the course and an automated follow-up email issued after course completion. Participation in the survey is voluntary and responses are anonymous. The survey application, Qualtrics, includes automated controls to detect invalid entries, duplicate submissions and incomplete responses allowing for manual review and inclusion assessment.
Post-course learner experience data is collected in Qualtrics and aggregated in GovTEAMS. Data is collected continuously and updated weekly into Power BI dashboards developed and administered by the APS Academy Capability Strategy & Insights team. Access to the Academy’s Power BI workspace is controlled by the PM&C corporate analytics team.
Courses are only included if there are at least 20 responses in a reporting period to avoid small sample bias. Response volumes will be accessible for audit purposes to ensure transparency.
3.2.3 – Attendance and enrolment data are captured through APSLearn. Data is integrated with evaluation data and visualised through Power BI.
3.2.4 – Data sources include the Action Item Project Overview, Project Progress Reports, the Implementation Tracking Register (ongoing) and the Adapt Action Plan on a Page (July 2025). Progress reports are sourced from project leads once projects have been initiated and are submitted in September, November, February and May.
Caveats and disclosures
3.2.4 – This measure may shift in focus over time. Initially focused on completion, it may evolve to include impact of completed actions as evaluation maturity improves (e.g. evidence of uptake or contribution to system capability uplift). This will reflect the Academy’s increasing role in enabling system outcomes, not just delivering outputs.
Owner
Craft and Learning Branch.
Changes from previous year
Targets 3.2.1 to 3.2.3 replace the 2024–25 Corporate Plan targets 3.2.1 and 3.2.2 and the corresponding 2025–26 PBS performance measures. The new targets better align with the Commission’s strategic intent and ensure that results are based on reliable, verifiable data sources. These amendments strengthen the link between the performance measures and the APS Academy’s purpose, while improving the clarity, relevance and integrity of reported results. Key improvements include:
- Strengthened rigour in measuring course performance – shifting from a broad satisfaction focus to a self-assessment of the learner’s anticipated workplace performance against the course’s stated learning outcomes. This ensures the measure reflects both course quality and relevance and the potential for learning transfer.
- Enhanced measurement of practitioner involvement – refining the measure to capture whether practitioner contributions have aided an individual’s learning.
- Assessment of the Academy’s system reach – introducing a measure that tracks the proportion of APS agencies engaging with the Academy, providing an indicator of equitable access and breadth of system support and influence.
- Amendments to the APS L&D Strategy Action Plan reflect feedback from internal auditors and the new Adapt Horizon Action Plan that was implemented in May 2025.
Target 3.2.4 is substantially the same as the 2024–25 Corporate Plan performance measure 3.2.3 and the corresponding 2025–26 PBS performance measure. The target has been adjusted to reflect the new Adapt Horizon Action Plan and to improve methodologies and data source information.