State of the Service Report 2013–14


Appendix 3 - Survey methodologies

This appendix details the survey methodologies used for the State of the Service Agency Survey (agency survey) and Australian Public Service (APS) Employee Census (employee census). It also outlines the approach taken to analysing the qualitative and quantitative data collected.

Agency survey methodology

The scope of the agency survey was the 100 APS agencies, or semi-autonomous parts of agencies, employing at least 20 staff under the Public Service Act 1999.

Agencies were provided with access to the online survey between 10 June and 25 July 2014. As part of the process, agency heads were required to sign off their agency's response. All 100 agencies completed the survey, although 22 agencies with fewer than 100 employees completed a shortened version. The Australian Public Service Commission (the Commission) used this survey as a key source of information for this report.

Data cleaning

Agency survey data was rigorously examined for errors and inconsistencies by ORIMA Research and the Commission before being analysed. Where errors were subsequently discovered, corrections were made and all relevant analyses reproduced to ensure, so far as possible, the accuracy of the results in this report.

APS employee census methodology

The 2014 employee census was administered to all available APS employees. This approach provides a comprehensive view of the APS and ensures no eligible respondents are omitted from the survey sample, removing sampling bias and reducing sample error.

Employee census design

Employee census content was designed to measure key issues such as employee engagement, leadership, health and wellbeing, job satisfaction and general impressions of the APS. Questions from previous years were used as the basis for this year's employee census. Some questions are included every year while others are included on a two- or three-year cycle. Some were included for the first time this year to address topical issues. To ensure the Commission maintains longitudinal data, changes to questions used in previous years are kept to a minimum.

Also included in the employee census were a number of internationally benchmarked items that allowed the APS to be compared to similar organisations; for example, the United Kingdom Civil Service Health and Safety Executive (HSE) First Pass Tool, which examines employee health and wellbeing.1

The draft employee census was pilot tested with APS 1–6 and/or Executive Level employees from 10 agencies. Feedback provided was considered and, where appropriate, addressed before the employee census was finalised.

Employee census delivery

The employee census was delivered using the following methods:

  • online, through a unique link provided to each employee via email by ORC International
  • by telephone, for a number of employees working in remote locations
  • on paper, for employees who did not have access to an individual email account or who had limited or no access to the internet. These employees received a letter from their agency inviting them to participate, along with a paper copy of the survey to complete and return to ORC International.

Test survey

An innovation for 2014 was the introduction of an alternative survey run in parallel with the main employee census. This allowed topic areas, scales and/or indices to be tested without increasing the length of the census. Five per cent of online respondents were randomly selected and presented with the test survey. The survey consisted of a set of core items shared with the main employee census, plus items covering a range of other issues relevant to the APS, including the APS Leadership and Core Skills Strategy. In total, 4,909 respondents completed the test survey.
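As an illustration only, a random allocation of this kind might be implemented along the following lines. This is a minimal sketch, not the actual procedure used by ORC International; the respondent identifiers, fraction and seed are all assumptions.

```python
import random

def assign_survey_version(respondent_ids, test_fraction=0.05, seed=2014):
    """Randomly assign a fraction of online respondents to the test survey.

    respondent_ids: list of unique respondent identifiers (hypothetical).
    Returns two lists: (main_survey_ids, test_survey_ids).
    """
    rng = random.Random(seed)  # fixed seed so the allocation is reproducible
    k = round(len(respondent_ids) * test_fraction)
    test_ids = set(rng.sample(respondent_ids, k))
    main_ids = [r for r in respondent_ids if r not in test_ids]
    return main_ids, sorted(test_ids)

# Example: allocate 100,000 hypothetical respondents
main, test = assign_survey_version(list(range(100_000)))
print(len(main), len(test))  # 95000 5000
```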

Sampling and coverage

The employee census covered all employees (ongoing and non-ongoing) from all APS agencies, regardless of size or location. The initial employee census population consisted of all APS employees recorded in the Australian Public Service Employment Database (APSED) on 11 April 2014. This population was then provided to each individual agency for confirmation.

The employee census invitations were sent to employees from 12 May 2014. The number of invitations was adjusted as new employees were added and incorrect email addresses were corrected. The deadline for survey completion was 13 June 2014.

The final employee census sample was reduced from an initial 151,792 to 146,875. The adjustment excluded employees with invalid email addresses, casual and intermittent employees not in the workplace, and those out of the office for the entire survey period. Overall, 99,392 employees responded to the employee census, a response rate of 68%. This was higher than the 2013 response rate of 66% (102,219 responses from 158,358 employees), partly reflecting the decrease in overall APS numbers.
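The sample adjustment and response rate reported above can be checked with simple arithmetic, as in this brief sketch using the figures from the paragraph.

```python
# Worked check of the 2014 figures reported above.
initial_population = 151_792   # APS employees recorded in APSED
final_sample = 146_875         # after the exclusions described above
responses = 99_392

exclusions = initial_population - final_sample
response_rate = responses / final_sample

print(f"Exclusions: {exclusions:,}")          # 4,917 employees removed
print(f"Response rate: {response_rate:.0%}")  # 68%
```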

Sources of bias

The employee census methodology removed sampling bias and minimised sample error by ensuring that all APS employees were invited to take part. Some employees who had recently entered the APS were not recorded in APSED at the time the invitations were sent out. Omitting these employees, or others who had recently changed agency, may have introduced some sampling error. This risk was managed by giving agencies the opportunity to review or provide their own email lists, and by encouraging all employees to watch out for their invitation and to contact ORC International if they did not receive one. Over the course of the survey, 549 additional employees were added to the population, reducing the possibility of sampling error as far as possible.

Non-sampling bias was controlled in part by independently reviewing and testing all items before the employee census was administered. Online administration of the survey records the respondent's answers directly, minimising data entry errors and addressing another source of potential bias.

A potentially large source of non-sampling bias was that not all invitees took part. Overall, 48,468 or 32% of invitees did not complete the employee census. In addition, 1,465 were unable to complete the survey because they were on leave during the survey period. If key groups systematically opted out of the employee census, this could be a source of non-sampling bias. To test this, the survey sample was compared against the overall APS population on gender, classification, location and employment category (ongoing or non-ongoing). Analyses showed there were only minor differences between the employee census respondents and the APS as a whole.2
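A comparison of this kind could, for example, take the form of a goodness-of-fit test of respondent counts against population shares. The source does not specify which test was used, and the counts and shares below are illustrative only, not actual APS figures.

```python
from scipy.stats import chisquare

# Hypothetical counts: respondents by employment category, compared with the
# counts expected if respondents mirrored the full APS population.
observed = [97_400, 1_992]         # ongoing, non-ongoing respondents (illustrative)
population_share = [0.979, 0.021]  # shares in the APS population (illustrative)

expected = [sum(observed) * share for share in population_share]
stat, p = chisquare(f_obs=observed, f_exp=expected)

# A p value below a chosen threshold would flag a group that is over- or
# under-represented among respondents relative to the population.
print(f"chi-square = {stat:.1f}, p = {p:.3f}")
```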

Privacy, anonymity and confidentiality

Maintaining confidentiality throughout the employee census process was of primary concern to the Commission. To ensure confidentiality, each APS employee was provided with a unique link to the survey via email. Only a small number of staff at ORC International had access to both individual email addresses and their responses. All responses provided to the Commission by ORC International were de-identified. Due to these precautions, Commission staff could not identify individual respondents to the survey or identify those who had not taken part.

Including agencies with fewer than 100 employees creates an additional privacy risk. Breaking down small workforces into even smaller groups risks participants' anonymity by inadvertently 'singling out' easily distinguished employees to their colleagues; for example, female SES employees in a small agency. Even where there are several such employees, it is possible to attribute responses to specific individuals by guessing, whether correctly or incorrectly. Besides breaking anonymity, revealing personal information such as carer responsibilities is a breach of privacy. Furthermore, knowledge of attitudes towards certain issues, such as leaders or colleagues, could be used against the employee.

This risk was managed by not reporting, to agencies or in this report, any segmentation of the workforce that would have resulted in groups of fewer than 10 responses. In addition, agencies were not supplied with any raw comments, except where approved by the respondent and de-identified by Commission employees. Agencies were supplied with text analyses of comments on selected items where there was a sufficient volume of comment to ensure anonymity.
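One way the fewer-than-10-responses rule might be applied in practice is sketched below. The function name, the 'n.p.' marker and the example groups are all hypothetical; the source describes only the rule itself.

```python
def suppress_small_groups(group_counts, min_responses=10):
    """Apply the minimum-cell-size rule described above: any workforce
    segment with fewer than `min_responses` responses is not reported.

    group_counts: dict mapping group label -> number of responses.
    Returns a dict with small groups replaced by a suppression marker.
    """
    return {
        group: (count if count >= min_responses else "n.p.")  # n.p. = not published
        for group, count in group_counts.items()
    }

print(suppress_small_groups({"EL 2, female": 23, "SES, female": 4}))
# {'EL 2, female': 23, 'SES, female': 'n.p.'}
```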

Data cleaning

Employee census data was rigorously examined for errors and inconsistencies by ORC International and the Commission before being analysed. Where errors were subsequently discovered, corrections were made and all relevant analyses reproduced to ensure the accuracy of the results in this report.

Precision of estimates

Even with a 68% response rate, the figures discussed in this report are estimates of true population values. The precision of these estimates is influenced by the amount of data available. A common measure of precision is the margin of error, expressed as a confidence interval around the estimate. This interval gives a range in which the true population value is likely to fall. When 95% confidence is referred to, it is accepted that there is a 5% chance the responding sample will produce an estimate that falls outside the constructed 95% confidence interval.

For example, the 95% margin of error suggests the true proportion of the population who agree that employees in their agency appropriately assess risk is between 58.5% and 59.1% (a sample estimate of 58.8% with a margin of ±0.3 percentage points). When the data is segmented into groups, the width of the margins increases as the sample sizes decrease. For smaller groups, such as Indigenous employees (2,415 respondents), the precision may drop substantially (see Table A3.1 and the sketch that follows it). The Commission, however, is 95% confident that the true proportion of Indigenous employees who have confidence in their agencies' risk assessment practices was between 61.1% and 65.1% (a point estimate of 63.1% with a margin of ±2.0 percentage points).

Table A3.1. Margins of error for employee census item ‘In general, employees in my agency appropriately assess risk’, 2013–14

  Demographic group           95% margin of error (percentage points)   Estimated result (%)
  Women                       ±0.4                                      59.4
  Men                         ±0.5                                      58.6
  People with disability      ±1.2                                      53.1
  People without disability   ±0.3                                      59.3
  Indigenous employees        ±2.0                                      63.1
  Non-Indigenous employees    ±0.3                                      58.8

Source: Employee census
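The margins in Table A3.1 can be approximated with the standard formula for the margin of error of a proportion, as in this sketch. The published margins may incorporate additional corrections (for example, a finite population correction), so this simple formula reproduces them only approximately.

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """95% margin of error for an estimated proportion p_hat from n responses."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# All respondents: 58.8% agreement from 99,392 responses
print(f"{margin_of_error(0.588, 99_392) * 100:.1f} pp")  # ~0.3 pp

# Indigenous employees: 63.1% agreement from 2,415 responses
print(f"{margin_of_error(0.631, 2_415) * 100:.1f} pp")   # ~1.9 pp
```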

Analysis strategy

This State of the Service report draws on both quantitative and qualitative data.

Quantitative data

Interpretation of items and scales

Most items in the employee census asked the respondent to rate the importance of, satisfaction with, or effectiveness of workplace issues on a five-point ordinal scale. The scales were generally balanced, allowing respondents to express either of two extremes of view (for example, satisfaction and dissatisfaction), with a midpoint that allowed respondents to enter a ‘neutral’ response. For this report, the five points have generally been collapsed into three: agree/satisfied, neutral, and disagree/dissatisfied. Figures reported are the proportion of respondents who responded with either strongly agree/very satisfied or agree/satisfied, except where noted.
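The collapsing of five response points into three reporting categories can be expressed as a simple mapping, as sketched below. The exact response labels are assumptions based on conventional agreement scales.

```python
# Map five-point responses onto the three reporting categories described above.
COLLAPSE = {
    "Strongly agree": "agree",
    "Agree": "agree",
    "Neither agree nor disagree": "neutral",  # assumed midpoint label
    "Disagree": "disagree",
    "Strongly disagree": "disagree",
}

responses = ["Agree", "Strongly agree", "Neither agree nor disagree", "Disagree"]
collapsed = [COLLAPSE[r] for r in responses]
proportion_agree = collapsed.count("agree") / len(collapsed)
print(f"{proportion_agree:.0%} agree")  # 50% agree
```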

When interpreting item responses, it is important to recognise that the relationship between points on a scale is ordinal: the intervals between points are not necessarily equal. The strength of opinion needed to shift a respondent from ‘neutral’ to ‘satisfied’ may be much smaller than that required to shift a respondent from ‘satisfied’ to ‘very satisfied’.

The employee census continues to make greater use of scales, such as the APS Employee Engagement Model scores and the APS Performance Culture Model scales, where the five-point item responses are combined and re-scaled to produce a continuous scale score ranging from 1 to 10. Scores from scales with demonstrated validity and reliability are generally more robust than item-based analyses as they triangulate information from a number of items examining a single issue. They also allow the use of more sophisticated statistical analyses.
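A scale score of this kind can be illustrated with a linear rescaling of the mean item response, as below. The exact formula used for the APS models is not published here, so this rescaling is an assumption for illustration.

```python
def scale_score(item_responses):
    """Combine five-point item responses (coded 1-5) into a single score
    rescaled to run from 1 to 10.

    The linear rescaling below is an assumption; the report does not
    publish the exact formula used for the APS models.
    """
    mean = sum(item_responses) / len(item_responses)
    return 1 + (mean - 1) * 9 / 4  # maps the 1-5 range onto 1-10

print(scale_score([4, 4, 5, 3]))  # 7.75
```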

Data analysis

As the agency survey had a 100% response rate, the data is not subject to sampling error and statistical significance testing is unnecessary. Results are reported as either raw numbers or percentages.

While the employee census was offered to all APS employees, a response rate of 68% means that inferential statistics are still required to analyse the data. Analysis of this data has historically used traditional social science techniques, such as χ² tests. Conventional guidelines have been used for determining statistical significance (p < 0.05).

Statistical significance speaks to the probability that two groups have been randomly drawn from the same population. If that probability is sufficiently low, it is concluded that the groups are drawn from different populations, and they are described as significantly different. Statistical significance does not, however, reflect the magnitude of the difference between groups, also called the effect size.

As sample sizes increase, the effect size required to achieve statistical significance decreases. Put another way, even the smallest of differences will be statistically significant if the sample size is large enough. With a sample of 99,392 respondents, effects which are far too small to have any appreciable meaning for the APS will almost certainly be statistically significant. Therefore, the effect size has been used in conjunction with statistical significance to identify differences which are substantial enough to have an impact.
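The interplay between significance and effect size can be illustrated with a χ² test on a large, hypothetical two-group comparison, using Cramér's V as the effect size measure. The counts below are invented: the point is that a one-percentage-point difference is highly significant at this sample size while the effect is negligible.

```python
import math
from scipy.stats import chi2_contingency

# Illustrative 2x2 table: agreement with an item by two large groups.
table = [[29_500, 20_500],   # group A: 59% agree, 41% not (50,000 respondents)
         [29_000, 21_000]]   # group B: 58% agree, 42% not (50,000 respondents)

stat, p, dof, expected = chi2_contingency(table)
n = sum(sum(row) for row in table)
cramers_v = math.sqrt(stat / n)  # effect size for a 2x2 table

print(f"p = {p:.4f}, Cramer's V = {cramers_v:.3f}")
# p is well below 0.05, yet V of roughly 0.01 indicates a negligible effect
```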

Units of comparison

For statistical comparison, agencies were grouped based on either their size or their function.

Agency size

Agencies were allocated into size categories based on their APSED data as at 30 June 2014. These categories, applied as in the sketch following the list, are:

  • Small: fewer than 251 APS employees
  • Medium: 251 to 1,000 APS employees
  • Large: more than 1,000 APS employees
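For illustration, the categorisation might be expressed as follows; the function name is hypothetical.

```python
def size_category(headcount):
    """Allocate an agency to a size category using the thresholds above."""
    if headcount < 251:
        return "Small"
    if headcount <= 1_000:
        return "Medium"
    return "Large"

print([size_category(n) for n in (150, 251, 1_000, 1_001)])
# ['Small', 'Medium', 'Medium', 'Large']
```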

Functional clusters

Agencies were grouped into functional clusters. These are:

  • Policy: organisations involved in the development of public policy
  • Smaller operational: organisations with fewer than 1,000 employees involved in the implementation of public policy
  • Larger operational: organisations with 1,000 employees or more involved in the implementation of public policy
  • Regulatory: organisations involved in regulation and inspection
  • Specialist: organisations providing specialist support to government, businesses and the public.

Appendix 2 includes information on individual agencies' functional clusters.

Employee levels

In this report, responses to the employee census have been divided into the following groups to assist with analysis:

  • APS 1–6: includes APS level 1 to 6 (and equivalent) employees as well as graduate and trainee level employees
  • Executive Level: includes both Executive Level 1 and 2 (and equivalent) employees unless stated otherwise
  • Senior Executive Service: includes employees at the Senior Executive Band 1 to 3 levels.

Qualitative data

The employee census provided specified response options for most questions. Complementing these, several items were completely open-ended, asking the individual to provide a short written response to a question or statement. These responses were used to complement the information gained through quantitative methods. Not all respondents answered the open-ended questions, and comments do not necessarily represent the views of all respondents. Comment data does, however, represent a rich data source.

Data analysis

Open-ended comment analysis was based on the grounded theory approach in which key concepts from the collected data were coded either manually or with text mining software such as Leximancer. Comments were reported using themes and concepts rather than individual responses, except when comments were non-attributable and served to highlight especially salient concepts or themes.
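As a minimal illustration of concept-frequency coding in this spirit, consider the sketch below. The comments and the concept list are invented; Leximancer itself derives concepts from the data rather than from a predefined list.

```python
import re
from collections import Counter

# Invented comments standing in for open-ended census responses.
comments = [
    "More support from senior leaders would help",
    "Workload is high but my team is supportive",
    "Leaders communicate priorities clearly",
]

# Count occurrences of a small, assumed set of concept terms.
concept_terms = {"leaders", "support", "supportive", "workload", "team"}
concepts = Counter()
for comment in comments:
    for word in re.findall(r"[a-z]+", comment.lower()):
        if word in concept_terms:
            concepts[word] += 1

print(concepts.most_common())
# [('leaders', 2), ('support', 1), ('workload', 1), ('team', 1), ('supportive', 1)]
```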



1 Health and Safety Executive, Work-Related Stress: Research and Statistics; R Kerr, M McHugh and M McCrory, ‘HSE management standards and stress-related work outcomes’, Occupational Medicine (2009), vol. 59, no. 8, pp. 574–579.

2 Results may be requested by emailing
