Agency | Response Rate |
---|---|
ACIAR | 49% |
ACLEI | 54% |
ACSQHC | 100% |
AIC | 87% |
AIFS | 85% |
ALRC | 64% |
ANPHA | 80% |
AODTA | 79% |
AOFM | 51% |
ASADA | 62% |
CAMAC | 100% |
Cancer Australia | 85% |
CGC | 67% |
EOWA | 73% |
FFMA | 72% |
IGIS | 80% |
IGT | 29% |
IHPA | 47% |
NBA | 69% |
NCA | 80% |
NCC | 86% |
NMHC | 50% |
NOPSEMA | 88% |
NWC | 79% |
OAIC | 84% |
OPC | 70% |
PHIO | 46% |
PSR | 62% |
Screen Australia | 30% |
SSAT | 72% |
TEQSA | 61% |
WEA | 30% |
Micro-agencies | 71% |
Other APS agencies | 54% |
Total APS | 55% |

Gender | Micro-agencies | Other APS
---|---|---
Male | 35% | 43% |
Female | 65% | 57% |

Age group | Micro-agencies | Other APS
---|---|---
< 25 years | 3% | 4% |
25–29 years | 13% | 11% |
30–34 years | 17% | 12% |
35–39 years | 13% | 13% |
40–44 years | 14% | 14% |
45–49 years | 15% | 15% |
50–54 years | 11% | 16% |
55–59 years | 9% | 10% |
60–64 years | 3% | 5% |
> 64 years | 1% | 1% |

Location | Micro-agencies | Other APS
---|---|---
Australian Capital Territory | 47% | 40% |
New South Wales | 19% | 18% |
Victoria | 23% | 16% |
Queensland | 1% | 11% |
South Australia | 1% | 6% |
Western Australia | 9% | 5% |
Tasmania | 0% | 2% |
Northern Territory | 0% | 2% |
Outside Australia | 0% | 0% |

Classification | Micro-agencies | Other APS
---|---|---
Trainee/Apprentice | 0% | 0% |
Graduate APS (including Cadets) | 0% | 1% |
APS 1–2 (or equivalent) | 2% | 4% |
APS 3–4 (or equivalent) | 16% | 29% |
APS 5–6 (or equivalent) | 32% | 36% |
Executive Level 1 (or equivalent) | 28% | 20% |
Executive Level 2 (or equivalent) | 17% | 9% |
Senior Executive Service Band 1 (or equivalent) | 3% | 1% |
Senior Executive Service Band 2 or 3 (or equivalent) | 2% | 0% |

Employment type | Micro-agencies | Other APS
---|---|---
Ongoing | 85% | 95% |
Non-ongoing | 15% | 5% |
Not sure | 0% | 0% |

Representation rates are reported to one decimal place in line with APS reporting practices for this issue.

Indigenous status | Micro-agencies | Other APS
---|---|---
Indigenous | 1.1% | 2.5% |
Non-Indigenous | 98.9% | 97.5% |

Representation rates are reported to one decimal place in line with APS reporting practices for this issue.

Disability status | Micro-agencies | Other APS
---|---|---
Person with disability | 3.9% | 6.9% |
Person without disability | 96.1% | 93.1% |

Language background | Micro-agencies | Other APS
---|---|---
Non-English speaking background | 13% | 15% |
English speaking background | 87% | 85% |

Carer responsibilities | Micro-agencies | Other APS
---|---|---
Has carer responsibilities | 27% | 31% |
Does not have carer responsibilities | 73% | 69% |

Job family | Micro-agencies | Other APS
---|---|---
Accounting and finance | 10% | 6% |
Administration | 16% | 12% |
Communications and marketing | 6% | 2% |
Compliance and regulation | 10% | 12% |
Engineering and technical | 2% | 3% |
Information and communications technology | 6% | 9% |
Information and knowledge management | 2% | 2% |
Intelligence | 0% | 3% |
Legal and parliamentary | 8% | 3% |
Monitoring and audit | 2% | 3% |
Organisational leadership | 1% | 2% |
People | 5% | 6% |
Science and health | 5% | 3% |
Service delivery | 2% | 14% |
Strategic policy, research, project and program | 19% | 11% |
Trades and labour | 0% | 0% |
Other | 6% | 8% |

Type of work | Micro-agencies | Other APS
---|---|---
Delivery | 22% | 34% |
Public policy and program design | 2% | 30% |
Regulatory | 16% | 10% |
Professional/specialist | 58% | 23% |
Other | 2% | 3% |
Appendix 2: Survey methodologies
Information in this section is adapted from the SOSR 2011–12. Further detail on these issues can be found in Appendix 3 of the SOSR 2011–12 (pp. 265–275).
Employee census methodology
The census population consisted of all APS employees (ongoing and non-ongoing) recorded in the Australian Public Service Employment Database (APSED) on 4 May 2012[27]. Micro-agencies are defined as those that had fewer than 100 employees at this time. The employee census was open between 8 May 2012 and 6 June 2012. A total of 87,214 employees took part, a response rate of 55%. Nine hundred respondents were from Micro-agencies, a response rate of 71% for these agencies.
To ensure confidentiality, each employee was issued a unique password, which prevented multiple responses from the same individual. Only a small number of staff at the external survey provider, ORC International, had access to both individual names and their unique passwords, and all responses provided to the Commission by ORC International were de-identified. Because of these precautions, APSC staff could not identify individual respondents to the census or determine who had not taken part.
Agency survey methodology
A key source of data for the SOSR is the agency survey, completed by the 101 APS agencies, or semi-autonomous parts of agencies, that employed at least 20 staff under the Public Service Act 1999. This threshold excluded nine of the 32 Micro-agencies, and the remaining Micro-agencies completed only a shortened form of the survey (sections A, B, C, D, G, H and N). Consequently, the agency survey has not been used in this report.
Analysis strategy
The analysis draws on both quantitative and qualitative data.
Quantitative data
A total of 87,214 employees completed the employee census: 900 from the Micro-agencies and the remaining 86,314 from larger agencies. With sample sizes this large, even very small differences are likely to be statistically significant, because as sample size increases, the size of the difference required for a significance test to return a positive result decreases.
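To illustrate this point (with hypothetical figures, not results from the report), a difference of just one percentage point between two groups of roughly 43,000 respondents each is already statistically significant at the conventional 0.05 level, even though the difference itself is negligible in practical terms:

```python
import math

# Illustrative only: a hypothetical 1-percentage-point difference between two
# groups of ~43,000 respondents each (these numbers are not from the report).
p1, n1 = 0.51, 43_000
p2, n2 = 0.50, 43_000

# Two-proportion z-test using the pooled proportion.
pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se

# Two-sided p-value from the standard normal CDF.
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"z = {z:.2f}, p = {p_value:.4f}")  # z = 2.93: significant at 0.05
```

A tiny difference thus clears the significance bar purely because of sample size, which is why an effect-size criterion is applied as well.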
To avoid presenting misleading information by over-emphasising statistically significant but very small differences, an additional criterion has been applied: the size of the difference (the effect size).
Where a difference is statistically significant and at least small in magnitude, it has been flagged in the text. Differences smaller than this are unlikely to be visible in the workplace or to form a sound evidence base for practical recommendations, and are generally not reported unless they reinforce other findings with larger effect sizes. Table 1 shows the statistical criteria used to determine a small effect size. These guidelines were published in Statistical Power Analysis for the Behavioral Sciences[28] and are widely used in the social sciences.

Table 1: Statistical criteria for a small effect size

Comparison | Effect size statistic used | Cut-off for small effect size
---|---|---
Comparison of independent proportions | Cohen’s h | h ≥ 0.20
Comparison of two independent means (t-test) | Cohen’s d | d ≥ 0.20
Comparison of three or more independent means (ANOVA) | Cohen’s f | f ≥ 0.10
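As a worked example, Cohen's h is the difference between the arcsine-transformed proportions. Applying it to the census response rates reported above (71% for Micro-agencies versus 54% for other APS agencies) gives h ≈ 0.35, which exceeds the 0.20 cut-off for a small effect. A minimal sketch:

```python
import math

def cohens_h(p1: float, p2: float) -> float:
    """Cohen's h: difference of arcsine-transformed proportions."""
    return 2 * math.asin(math.sqrt(p1)) - 2 * math.asin(math.sqrt(p2))

# Census response rates from the report: Micro-agencies 71%, other APS 54%.
h = cohens_h(0.71, 0.54)
print(f"h = {h:.2f}")   # h = 0.35
print(h >= 0.20)        # True: clears the small-effect cut-off
```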
Longitudinal analyses
Many of the items in the 2012 employee census were also included in the 2011 Micro-agency Snapshot survey. Ordinarily, the 2012 results could therefore be benchmarked against the 2011 results. However, the Micro-agency Snapshot survey was completed by staff from 13 agencies, while the employee census covered 32, so any change in the aggregated results may simply reflect the inclusion of 19 additional agencies. Longitudinal analyses have therefore not been performed: the two samples are too different, and comparisons would likely be misleading.
Qualitative data
The employee census provided fixed response options for most questions. Several items, however, were open-ended, asking respondents to provide a short written answer to a question or statement. These responses were used to complement the information gained through quantitative methods. Not all respondents answered the open-ended questions, and comments do not necessarily represent the views of all respondents; they nevertheless represent a rich data source.
Analysis of open-ended comments was based on the grounded theory approach, in which key concepts were coded from the collected data either manually or with text-mining software (Leximancer). Comments were reported as themes and concepts rather than individual responses, except where a non-attributable comment served to highlight an especially salient concept or theme.
[27] Over the survey period, employees who were recorded in agency HR systems but not yet in APSED were invited to take part in the census as they were identified.
[28] J. Cohen, Statistical Power Analysis for the Behavioral Sciences, Psychology Press, New York, 1988.