

Thursday 18th July 2013, 16:00 - 17:30, Room: No. 17

Using Paradata to Improve Survey Data Quality

Chair Miss Anna Isenhardt (University of Fribourg)

Paper Details

1. Improving Operational Efficiencies and Survey Management of the American Community Survey in the United States

Ms Deborah Griffin (US Census Bureau)

The American Community Survey (ACS) is a mixed mode household survey conducted continuously in the United States to collect social, demographic, economic, and housing data. Information from the ACS is critical to monitoring communities' well-being and responsibly allocating federal, state, and local resources. The ACS faces challenges in continuing to produce high-quality statistics in an environment of declining respondent cooperation. To continue to provide ACS and other important survey information, the U.S. Census Bureau is pursuing aggressive changes in the infrastructure of survey management. These changes will take advantage of survey paradata and administrative records data, employ mixed mode data collection, and provide new tools such as dashboards that inform decisions to alter data collection approaches. Originally designed with three sequential modes of data collection, the 2013 ACS collects data by mail, Internet, telephone, and personal visit and currently uses adaptive design techniques to assign sample cases to modes, switch cases to new modes, and select mode subsamples. This paper uses the ACS as a case study to describe the process that the U.S. Census Bureau is following to develop new and improved survey management methods and tools. Moreover, it summarizes the critical role that collaborative efforts of systems architects and engineers, program managers, and survey methodologists play in this process. It includes examples of developmental and design work, completed research that supports specific design changes and business rules, research underway to determine the cost/benefit tradeoffs of other potential changes, and future research plans.
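To make the idea of a paradata-driven business rule concrete, the sketch below shows a hypothetical adaptive-design rule of the general kind the abstract describes: a case starts in an inexpensive mode and is escalated to a more expensive one when paradata (here, a count of unsuccessful contact attempts) suggest the current mode is unlikely to succeed. The mode order, threshold, and field names are invented for illustration and are not the Census Bureau's actual rules.

```python
# Hypothetical adaptive-design business rule: escalate a sample case to the
# next data-collection mode after repeated unsuccessful attempts.
# Mode order and threshold are illustrative assumptions, not ACS rules.
from dataclasses import dataclass

# Escalation order: cheapest mode first, most expensive last.
MODES = ["mail", "internet", "telephone", "personal_visit"]

@dataclass
class Case:
    case_id: str
    mode: str = "mail"
    failed_attempts: int = 0

def escalate_mode(case: Case, max_attempts: int = 3) -> Case:
    """Switch the case to the next mode once it has accumulated
    max_attempts unsuccessful attempts in its current mode."""
    if case.failed_attempts >= max_attempts:
        idx = MODES.index(case.mode)
        if idx < len(MODES) - 1:
            case.mode = MODES[idx + 1]
            case.failed_attempts = 0  # restart the counter in the new mode
    return case

case = Case("HU-0001", mode="mail", failed_attempts=3)
case = escalate_mode(case)
print(case.mode)  # → internet
```

In a production system such rules would be driven by cost models and response-propensity estimates rather than a fixed attempt count; the point here is only the shape of a rule that a dashboard could surface to survey managers.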




2. Obtaining information on non-responders: a development of the basic question approach for surveys of individuals

Dr Patten Smith (Ipsos MORI)
Mr Richard Harry (Sport Wales)
Mr Colin Gardiner (Ipsos MORI)

The basic question methodology (Bethlehem, Cobben and Schouten, 2011; Lynn, 2002) obtains information on survey non-responders in order to estimate non-response bias and improve non-response adjustment. Interviewers ask a few key survey questions on the doorstep as soon as it becomes apparent that an interview will not be forthcoming. The approach is problematic in surveys whose questions relate to individuals randomly selected within a household: where a refusal takes place before the respondent is selected, basic information may be asked of a household member who would not have been selected for the main interview, and non-responder questions may be targeted at individuals who would not have entered the main survey sample. This problem is avoided if the basic questions relate to whole households rather than individuals.

In a random probability face-to-face survey concerning sports participation we asked two household level basic questions about sport and physical exercise participation. These questions were not included in the main questionnaire and were asked at all responding and non-responding households. Response rates to the basic questions were around 50 per cent amongst refusers and other contacted non-responders. Substantial differences in sport participation were observed between main survey responders and non-responders, providing prima facie evidence of non-response bias. The paper describes the method used, examines the correlations between basic and main survey questions on sport participation and demonstrates the impact of including these basic variables in non-response adjustment mechanisms.
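A minimal sketch of the kind of non-response adjustment the paper evaluates: responding households are weighted by the inverse response rate within strata defined by a household-level basic question (here, whether anyone in the household participates in sport). The data, category labels, and function name are invented for illustration; the paper's actual adjustment mechanisms may differ.

```python
# Inverse-response-rate weighting within basic-question strata.
# All data below are invented for illustration.
from collections import Counter

# One record per sampled household: (basic_answer, responded_to_main_survey)
sample = [
    ("participates", True), ("participates", True), ("participates", False),
    ("no_sport", True), ("no_sport", False), ("no_sport", False),
]

def adjustment_weights(sample):
    """Return inverse-response-rate weights per basic-question stratum."""
    eligible = Counter(answer for answer, _ in sample)
    responded = Counter(answer for answer, ok in sample if ok)
    return {a: eligible[a] / responded[a] for a in responded}

weights = adjustment_weights(sample)
print(weights)  # → {'participates': 1.5, 'no_sport': 3.0}
```

The weights up-weight responding households in strata with low cooperation (here, the "no sport" stratum), which is precisely how answering the basic question at non-responding households can reduce the bias the abstract documents.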



3. Other Specified Coding and Interviewer Performance

Ms Tricia McCarthy (NORC at the University of Chicago)

Other Specify coding is an important part of survey research: it captures respondent answers that do not correspond to the set list of choices in the questionnaire, and it can be very telling of field interviewer performance. About to start its 16th round of interviewing in 2013, the National Longitudinal Survey of Youth 1997 (NLSY97) is conducted by NORC at the University of Chicago and the Center for Human Resource Research at the Ohio State University, and commissioned by the Bureau of Labor Statistics. It follows a sample of almost 9,000 individuals, currently ages 27-32, and gathers data on economics, education, health, and the transition into the workforce. Many questions offer an Other Specify option at the interviewer's discretion. After all response data are collected, we centrally process all Other Specify responses and attempt to back code them into the original options, code them into our expanded list of choices not made available to respondents and interviewers, or code them as other. We propose to examine the final coding of Other Specify answers alongside interviewer experience, training, and performance on other projects to see how Other Specify use and performance correlate with interviewer background.
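The centralised back-coding workflow described above can be sketched as a simple three-stage lookup: match the verbatim answer against the original response options first, then against the expanded code list, and code anything left over as other. The option keywords and numeric codes below are invented for illustration, not the NLSY97 codebook.

```python
# Sketch of centralised Other Specify back-coding (invented codes/options).
ORIGINAL_OPTIONS = {"full-time job": 1, "part-time job": 2, "student": 3}
EXPANDED_OPTIONS = {"apprenticeship": 101, "military service": 102}
OTHER_CODE = 99

def back_code(verbatim: str) -> int:
    """Map a verbatim Other Specify answer to a numeric code."""
    text = verbatim.strip().lower()
    if text in ORIGINAL_OPTIONS:      # back code into an original option
        return ORIGINAL_OPTIONS[text]
    if text in EXPANDED_OPTIONS:      # code into the expanded list
        return EXPANDED_OPTIONS[text]
    return OTHER_CODE                 # residual "other"

print(back_code("Student"))           # → 3
print(back_code("apprenticeship"))    # → 101
print(back_code("circus performer"))  # → 99
```

Real coding operations rely on human coders with fuzzy matching rather than exact string lookup; the sketch only illustrates the three coding outcomes that the proposed interviewer-performance analysis would tabulate.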


4. Analysing nonresponse in a questionnaire survey of Swiss correctional staff

Miss Anna Isenhardt (University of Fribourg, Switzerland)

Nonresponse in social surveys can seriously affect data quality, which makes understanding the sources of nonresponse very important. This contribution examines possible causes of unit nonresponse in a Swiss full-population questionnaire survey of correctional staff. The prison system is a highly sensitive research field with traditionally rather low response rates. While low response rates can be attributed to characteristics of the field itself, the specific reasons are largely unknown. We therefore use bivariate and multivariate methods to examine the role of typical sociodemographic variables, such as age, and of context variables such as type of prison, prison size, and geographical location. Furthermore, the influence of different contact strategies is considered. This procedure allows us to evaluate the quality of the collected data and helps to identify possible nonresponse bias. In addition, identifying the variables that influence an individual's propensity to respond to surveys within the prison context contributes to developing a strategy to improve the response rate during fieldwork.
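The bivariate part of such a nonresponse analysis amounts to comparing response rates across categories of a context variable to flag strata with unusually low cooperation. The sketch below does this for an invented "prison type" variable; the data and category labels are illustrative only, not the study's.

```python
# Response rates by category of a grouping variable (invented data).
from collections import defaultdict

# One record per sampled staff member: (prison_type, responded)
frame = [
    ("open", True), ("open", True), ("open", False),
    ("closed", True), ("closed", False), ("closed", False), ("closed", False),
]

def response_rates(frame):
    """Response rate per category of a grouping variable."""
    totals, responses = defaultdict(int), defaultdict(int)
    for group, responded in frame:
        totals[group] += 1
        responses[group] += responded
    return {g: responses[g] / totals[g] for g in totals}

print(response_rates(frame))  # → {'open': 0.666..., 'closed': 0.25}
```

The multivariate step the abstract mentions would typically model the response indicator on all such variables jointly (e.g. via logistic regression) to separate their individual effects.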