Indigenous APS Employees Census Survey methodology

The 2009 Indigenous Census was designed to gain a better understanding of the views of Indigenous APS employees on a range of issues, including pathways to employment, work-life balance, job satisfaction, people management and learning and development. The results of the survey were the main source of information drawn on in producing this report.

Scope and coverage

The survey was conducted as a census of all Indigenous APS employees. The list of employees was drawn from the Australian Public Service Employment Database (APSED). Provisions were also made for Indigenous APS employees who were not identified on APSED to register with the Commission to participate in the survey.

The Commission distributed employee lists to the various agency contact officers to obtain either the email address or workplace postal address of individual staff members. Across all agencies contacted, information was returned for 3,675 employees.32

Privacy, anonymity and confidentiality

Maintaining confidentiality throughout the entire survey process was of primary concern.

Privacy arrangements for APSED preclude Commission employees, other than those in the APSED team, the Group Manager of the Research and Evaluation Group, and the Commission’s Executive, from accessing APSED data relating to individuals. This meant that the identity of the survey population was not available to the Commission’s Workforce Participation Team or any other non-APSED employees involved in the survey process. A small number of ORIMA Research staff had access to the data.

All responses to the survey were anonymous so that individuals could not be identified. Results have been presented in collated form so that identification of individuals is not possible. While the data has been analysed using some demographic categories, results relating to members of small demographic groups have not been reported where individuals could be identified or their identity reasonably inferred.

Each person invited to participate in the census survey was provided with a unique password. This prevented multiple responses from individual respondents.

Survey design

The census survey was designed based on a range of surveys previously conducted by the Commission, most notably the Indigenous Employees Census Survey of 2005 and the 2009 State of the Service Employee Survey.

Pilot testing took place on 9 October 2009 with two groups of two Indigenous employees each. The pilot testers were asked to complete a hardcopy version of the draft census questionnaire prior to the session and to provide feedback and comments to staff from ORIMA Research and the Commission during the session. The suggestions raised during the pilot testing process were discussed by the ORIMA Research and Commission project teams, and the questionnaire was amended accordingly.

The survey was delivered using two methods. The main delivery was online via a password-protected internet site. The majority of Indigenous APS employees were sent an email from ORIMA Research on behalf of the Commission inviting them to participate in the online survey.

A secondary, paper-based delivery method was developed and implemented for employees working in agencies that do not provide individual email accounts or that have limited or no internet access. These employees received a letter from the Commission inviting them to participate in the survey, as well as a paper copy of the survey to complete and return to ORIMA Research.

Invitation emails and letters were sent out between 22 and 26 October 2009. Respondents were asked to complete the survey and submit or return it to ORIMA Research by 20 November 2009.33

An adjustment was made to the final survey population size to account for those out of scope of the survey, including:

  • employees who could not be contacted (covering repeatedly bounced emails, returned hardcopy surveys where the address was unknown and those ‘out of office’ for the entire survey period)
  • those known to be no longer employed in the APS at the time of the survey
  • those who were incorrectly identified on APSED as being Indigenous.

Adjustments were also made to account for those Indigenous APS employees who were not identified on APSED, but who had registered with the Commission to participate in the survey. The final survey population was reduced by 511 to 3,164.

Final response numbers

Overall, 1,649 responses were received from the final (adjusted) survey population of 3,164. This represents a response rate of 52%.

In total, there were 1,550 online responses and 99 hardcopy responses.

Weighting

The survey responses were re-weighted to correct for different response rates between strata. This was done to ensure that the aggregate results represent the underlying demographic profile of Indigenous APS employees. The re-weighting process was based on the following demographic variables:

  • level (APS/EL/SES)
  • agency (for the 60 agencies where respondents were employed)
  • location (ACT and non-ACT).

In total, 150 different weights were applied (noting that not all agencies had Indigenous employees at every level and in every location). For this survey, the weights were calculated by dividing the overall response rate by the response rate for each stratum; for example, if there were 20 APS 5–6 Indigenous employees in the Commission in the ACT, and the response rate for this group was 80% (with the overall response rate being 52%), the weight assigned to each APS 5–6 respondent working in the Commission in the ACT would be 0.65. If the data were not re-weighted, some strata could be over-represented and others under-represented in the total survey results.
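The rule can be illustrated with a short calculation. The sketch below is illustrative only (it is not the code used by the Commission or ORIMA Research) and simply applies the formula weight = overall response rate divided by stratum response rate to the worked example above; the function name is an assumption.

```python
# Minimal sketch of the stratum re-weighting rule described above:
# weight = overall response rate / stratum response rate.
# Function name and figures are illustrative only.

def stratum_weight(overall_rate: float, stratum_rate: float) -> float:
    """Weight applied to each respondent in a given stratum."""
    return overall_rate / stratum_rate

# Worked example from the text: overall response rate of 52%,
# stratum response rate of 80% (APS 5-6, Commission, ACT).
print(round(stratum_weight(0.52, 0.80), 2))  # 0.65
```

Under this rule, strata with response rates below the overall rate receive weights above 1, so their respondents count for proportionally more in the aggregated results.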

Results have generally been presented rounded to the nearest whole percentage point (e.g. 88% rather than 87.7%). Due to this rounding, the percentage results for some questions may not add up to exactly 100%.
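The effect of rounding can be illustrated with invented figures that sum to exactly 100% before rounding:

```python
# Invented figures illustrating how whole-point rounding can leave
# percentages summing to slightly less (or more) than 100%.
shares = [45.4, 33.3, 21.3]           # unrounded, sum to 100.0
rounded = [round(s) for s in shares]  # [45, 33, 21]
print(sum(rounded))                   # 99
```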

Measures of error and accuracy

Two types of error can occur in surveys: sampling error and non-sampling error. As this was a census survey (i.e. every member of the target population was included), the results are not affected by sampling error. However, census surveys can still be affected by non-sampling error, which introduces bias into statistical results and can occur at any stage of a survey. Non-sampling error is difficult to estimate, so it is important to be aware of this type of error and to minimise or eliminate it wherever possible.

Every effort was made to keep non-sampling error in the census survey to a minimum through careful survey design and efficient operating procedures. The following section provides a brief discussion of the main types of non-sampling error that could have affected the census survey and that should be considered when making inferences about all Indigenous APS employees.

Note: when it is judged that ‘errors were minimised’, this does not imply that the error was zero or close to zero, but rather that it is unlikely that any other reasonable actions could have been taken to address these errors.

Non-sampling error

Coverage error—this error occurs when not all relevant population units are included in the survey frame. It applies to the census survey because APSED does not hold a complete listing of all Indigenous APS employees.34 To minimise this error, the survey was extensively promoted to encourage Indigenous APS employees who were not identified on APSED to register with the Commission to participate.

Non-response error—this error potentially applies to any survey with a less than 100% response rate. However, the reasonable response rate achieved (52%) and the relatively similar response rates across strata indicate that this error was minimised.

Respondent error—this error occurs when a respondent does not answer a question correctly and can apply in any survey. The pilot testing process and the online, hardcopy and telephone support options would have helped to minimise this error.

Coding and processing errors—these errors occur when mistakes are made in the recording and coding of responses and in data processing. The online delivery option, in which respondents entered their responses directly, would have minimised errors in the recording and coding of responses. In addition, identifiable errors made by respondents while completing the survey were removed from the results database; for example, blank responses were generally coded to non-response categories.
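As an illustration of that last step, a minimal sketch (assuming a pandas data frame of responses with a hypothetical column name) of recoding blank answers to an explicit non-response category:

```python
import pandas as pd

# Hypothetical response column; blank strings and missing values are
# recoded to an explicit non-response category, as described above.
df = pd.DataFrame({"q1_satisfaction": ["satisfied", "", None, "neutral"]})
df["q1_satisfaction"] = (
    df["q1_satisfaction"].replace("", pd.NA).fillna("No response")
)
print(df)
```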

Interpretation of scales

Scales were included in any question that asked a respondent to indicate the strength or level of a theoretical construct. In its simplest form in the survey, a scale asked a respondent to rate the level of importance, satisfaction or effectiveness of various workplace variables on a five-point scale.

The scales used in the survey were generally balanced—that is, they allowed the respondent to express either extreme of view (e.g. satisfaction or dissatisfaction). These scales were also designed with a midpoint that allowed respondents to enter a ‘neutral’ response.

When interpreting scales, it is important to realise that the points on a scale are ordered but the distances between them are not necessarily equal. That is, the strength of opinion needed to shift a respondent from ‘neutral’ to ‘satisfied’ may be much smaller than that needed to shift a respondent from ‘satisfied’ to ‘very satisfied’.

Summary indexes

Summary indexes have been used to assist analysis of results on a number of survey questions comprising several parts. The indexes operate to condense a multiple response question into a single index for comparative purposes; for example, in exploring respondents’ overall level of job satisfaction, a question comprising 15 factors was summarised into a single index using a point scoring system. In this way, analysis of the 15 job satisfaction factors can be supplemented by analysis at the summary level.
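The report does not specify the exact point-scoring system, so the following is only a sketch of how such an index might be computed, assuming each five-point response is mapped to a score and the scores are averaged across the 15 factors; the score values and response labels are illustrative assumptions.

```python
# Illustrative scoring of a five-point satisfaction scale; the actual
# point values used in the report are not specified here.
SCORES = {
    "very dissatisfied": 0,
    "dissatisfied": 25,
    "neutral": 50,
    "satisfied": 75,
    "very satisfied": 100,
}

def summary_index(responses: list[str]) -> float:
    """Average the scored responses across the question's factors."""
    scored = [SCORES[r] for r in responses]
    return sum(scored) / len(scored)

# Example: one respondent's ratings across the 15 job satisfaction factors.
example = ["satisfied"] * 12 + ["neutral"] * 2 + ["very satisfied"]
print(round(summary_index(example), 1))  # 73.3
```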

Coding of open-ended responses

The survey questionnaire provided specified response options for most questions. It also included open-ended response options for some questions, which enabled respondents to provide a text response to a question. Open-ended options were commonly provided, for example, as part of a specified response question in the form of ‘other (please specify)’.

Some open-ended responses have been coded to assist analysis. Coding involved, for example, removing irrelevant and incidental comments from statistical outputs as well as counting relevant comments against the appropriate response option.
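Conceptually, this coding step maps relevant free-text comments back onto the specified response options and sets irrelevant comments aside. The sketch below is illustrative only: the lookup entries are hypothetical, and the actual coding was an analyst-driven process rather than an automated one.

```python
# Hypothetical recode table mapping 'other (please specify)' text to
# existing response options; unmatched comments are treated as
# irrelevant and excluded from the statistical outputs.
RECODE = {
    "flexible working hours": "Work-life balance",
    "study leave": "Learning and development",
}

def code_open_response(text: str) -> str | None:
    return RECODE.get(text.strip().lower())

for comment in ["Flexible working hours", "nothing further to add"]:
    print(repr(comment), "->", code_open_response(comment))
```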

Size of agency

In analysing survey results, comparisons have been made on the basis of the size of the agency and the size of the agency’s Indigenous population.

For the purposes of the survey, agencies were divided into three groups: small, medium and large. Small agencies were defined as having up to 250 employees; medium agencies were defined as having between 251 and 1,000 employees; and large agencies were defined as having more than 1,000 employees.

Agencies were also divided into three groups based on the number of Indigenous employees the agency employed (as recorded by APSED): small, medium and large. Small agencies were defined as having fewer than 20 Indigenous employees; medium agencies were defined as having between 20 and 49 Indigenous employees; and large agencies were defined as having 50 or more Indigenous employees.
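A minimal sketch of the two classifications, using the thresholds given above (function names are illustrative only):

```python
def size_by_total_employees(n: int) -> str:
    """Classify an agency by its total number of employees."""
    if n <= 250:
        return "small"
    if n <= 1000:
        return "medium"
    return "large"

def size_by_indigenous_employees(n: int) -> str:
    """Classify an agency by its number of Indigenous employees (per APSED)."""
    if n < 20:
        return "small"
    if n <= 49:
        return "medium"
    return "large"

# An agency can fall into different groups on the two measures.
print(size_by_total_employees(800), size_by_indigenous_employees(60))  # medium large
```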


32 Invitations to participate were issued to all Indigenous APS staff, i.e. ongoing and non-ongoing.

33 The return date was extended for all participants to 27 November 2009.

34 Either through omission in the information provided by agencies, or because individuals were not asked to report, or chose not to report, whether they are Indigenous.