Conference Programme 2015


Wednesday 15th July, 16:00 - 17:30 Room: HT-102

Modelling unit nonresponse and attrition processes 2

Convenor Ms Carina Cornesse (GIP, Mannheim University)
Coordinator 1 Dr Gabriele Durrant (School of Social Sciences, University of Southampton)
Coordinator 2 Professor Annelies Blom (GIP, Mannheim University)

Session Details

This session focuses on analysing the processes leading to unit nonresponse in cross-sectional surveys and attrition in longitudinal surveys. Unit nonresponse and attrition are major issues affecting survey data quality. Their importance has grown over the past decades as response rates in the US and Europe have declined across survey modes, and because nonresponse rates may be related to nonresponse bias.

When modelling the fieldwork processes leading to nonresponse, research can draw on auxiliary data sources. These may include paradata, such as call record data, interviewer observations, time stamps during the interview, or variables from external data sources, such as administrative, register and census data.

In recent years, the statistical techniques developed to model unit nonresponse and attrition in survey data have become increasingly sophisticated. In addition, both ex-post modelling to learn from previous fieldwork outcomes and real-time modelling to inform adaptive and responsive survey designs have found their way into the survey methodological realm.

For our session we invite submissions from researchers who model unit nonresponse and attrition processes. We specifically encourage submissions of papers that use auxiliary data to model unit nonresponse and attrition processes and papers that use complex statistical models for this purpose.

Paper Details

1. Investigating Nonresponse Bias: Why Do Different Interviewers Cause Different Degrees of Selectivity?
Mr Michael Ruland (WZB Berlin Social Science Center, Berlin, Germany)
Mrs Sara Kretschmer (Leibniz Institute for Educational Trajectories, Bamberg & Institute for Employment Research, Nuremberg, Germany)
Mrs Jennifer Elsner (University of Siegen, Germany)

One of the major research fields in survey methodology is unit nonresponse and nonresponse bias. Research has shown that nonresponse is strongly affected by interviewer behavior, particularly in face-to-face surveys. In our paper we analyze systematic deviations between the assigned sample and the recruited sample for each interviewer. The analysis is based on the German National Educational Panel Study and paradata, including contact record data for each contact attempt and interviewer characteristics. With these data we are able to analyze interviewer success and to show differing degrees of nonresponse bias among interviewers with regard to variables such as respondents' age or educational background.
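The interviewer-level comparison described above can be sketched in a few lines. This is an illustrative sketch under invented data, not the authors' code: for each interviewer, the gap between the mean of a covariate (e.g. age) in the assigned sample and in the recruited sample indicates that interviewer's nonresponse selectivity.

```python
# Hypothetical sketch: per-interviewer selectivity as the gap between the
# assigned sample and the recruited sample on one covariate.
from collections import defaultdict

def interviewer_selectivity(cases):
    """cases: iterable of (interviewer_id, covariate_value, responded).

    Returns {interviewer_id: recruited_mean - assigned_mean}; a large
    absolute gap signals that the interviewer's recruited sample deviates
    systematically from the sample assigned to them.
    """
    assigned = defaultdict(list)
    recruited = defaultdict(list)
    for iid, value, responded in cases:
        assigned[iid].append(value)
        if responded:
            recruited[iid].append(value)
    gaps = {}
    for iid, values in assigned.items():
        rec = recruited.get(iid)
        if rec:  # interviewers with no completed case yield no gap
            gaps[iid] = sum(rec) / len(rec) - sum(values) / len(values)
    return gaps

# Toy example: interviewer "A" recruits only the younger assigned cases.
cases = [("A", 20, True), ("A", 60, False), ("B", 40, True), ("B", 40, True)]
print(interviewer_selectivity(cases))  # → {'A': -20.0, 'B': 0.0}
```

In practice the same gap would be computed for several covariates (age, education), with contact-record paradata used to separate non-contact from refusal.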

2. Does the Switch to a Mixed-Mode Design Increase Panel Attrition? Evidence from the UKHLS Innovation Panel
Miss Alessandra Gaia (University of Milan-Bicocca)

This paper evaluates the effect of a switch to a mixed-mode design in a longitudinal survey on panel attrition. Exploiting experimental data from the Understanding Society Innovation Panel, I use a logistic regression framework to model the effect of the experimental allocation (mixed-mode design versus a unimode face-to-face design) on attrition. The treatment is interacted with sample members’ characteristics and response behaviours from previous waves. I do not find evidence that a mixed-mode design increases attrition. On the contrary, for original sample members who did not respond at previous waves, a mixed-mode design reduces attrition.
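The modelling strategy described above — a logistic regression of attrition on the experimental allocation, interacted with prior response behaviour — can be sketched as follows. This is a hedged illustration on simulated data, not the paper's code or data; the simulated effect sizes are invented for the example.

```python
# Sketch: logistic regression of attrition on treatment allocation with a
# treatment x prior-nonresponse interaction. Data are simulated, not UKHLS.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
mixed_mode = rng.integers(0, 2, n)      # 1 = allocated to mixed-mode design
prior_nonresp = rng.integers(0, 2, n)   # 1 = nonrespondent at a prior wave
# Invented attrition process: mixed mode reduces attrition only among
# prior-wave nonrespondents (negative interaction).
logit = -1.0 + 1.2 * prior_nonresp - 0.8 * mixed_mode * prior_nonresp
attrit = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([mixed_mode, prior_nonresp, mixed_mode * prior_nonresp])
model = LogisticRegression().fit(X, attrit)
# Third coefficient (the interaction) recovers a negative sign: a mixed-mode
# design reduces attrition for prior-wave nonrespondents.
print(model.coef_.ravel())
```

The interaction term is what carries the paper's finding: the main effect of the allocation captures any overall attrition difference, while the interaction captures the differential effect among prior-wave nonrespondents.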

3. Using Machine Learning to Correct for Survey Nonresponse Bias
Dr Antje Kirchner (University of Nebraska-Lincoln)
Dr Curtis Signorino (University of Rochester)

We compare survey nonresponse bias corrections that use recent machine learning techniques to model response propensity. We apply these to the German ‘Labor Market and Social Security’ survey, using administrative data for both respondents and nonrespondents. We compare the nonresponse bias correction obtained when using adaptive LASSO with a polynomial expansion of regressors to existing techniques: logistic regression, neural nets, classification trees, and random forests.
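An adaptive-LASSO propensity model of the kind named above can be sketched in two steps: an initial (L2-penalised) fit supplies coefficient-based weights, and an L1-penalised fit on the reweighted polynomial expansion then performs the adaptive selection. This is a minimal sketch under invented data and a common two-step formulation, not the authors' implementation; the function name and tuning choices are hypothetical.

```python
# Hypothetical sketch of an adaptive LASSO response-propensity model with a
# degree-2 polynomial expansion of the auxiliary (e.g. administrative) data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import PolynomialFeatures

def adaptive_lasso_propensity(X, responded, C=1.0):
    Z = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)
    # Step 1: initial L2-penalised fit supplies the adaptive weights.
    init = LogisticRegression(max_iter=1000).fit(Z, responded)
    w = np.abs(init.coef_.ravel()) + 1e-6
    # Step 2: L1 fit on columns rescaled by those weights = adaptive LASSO;
    # columns with small initial coefficients are penalised more heavily.
    lasso = LogisticRegression(penalty="l1", solver="liblinear", C=C)
    lasso.fit(Z * w, responded)
    return lasso.predict_proba(Z * w)[:, 1]  # estimated response propensities

# Toy use: response depends on the first auxiliary variable only.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))
responded = (rng.random(500) < 1 / (1 + np.exp(-X[:, 0]))).astype(int)
propensities = adaptive_lasso_propensity(X, responded)
```

The estimated propensities would then feed a standard correction, e.g. weighting respondents by the inverse of their estimated propensity.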