Tuesday 18th July, 14:00 - 15:30 Room: F2 104


The agony of attrition - challenges in longitudinal studies 1

Chair Ms Joanne Corey (Australian Bureau of Statistics)
Coordinator 1 Dr Jutta von Maurice (Leibniz Institute for Educational Trajectories)

Session Details

All of us working on longitudinal studies face the challenge of minimising attrition: respondents who move and don’t update their contact details, as well as respondents who refuse for a variety of reasons. Engagement with study participants is vital.

This session invites survey methodologists and practitioners working in this area to share their experiences, successful or not.

For example, we would love to hear about:
• different engagement strategies
• incentives
• targeted approaches and follow up
• forays into the world of social media – what platform was used? How did you measure success? What are the pitfalls?

Other relevant topics include:
• panel attrition in transition periods
• research on panel consent
• data collection methodologies aimed at increasing engagement
• engaging and novel methods of relaying study results back to participants

Paper Details

1. Data-Driven Incentives: Developing a Model-Based Framework for Incentives in the 2014 SIPP
Dr Jason Fields (U.S. Census Bureau)

The Survey of Income and Program Participation (SIPP) was redesigned for the 2014 panel. In addition to content and design revisions, the 2014 SIPP Panel shifted household interviewing from every four months to annual interviews. With this redesign came the opportunity to evaluate the effectiveness of making model-based decisions to offer incentives for the completion of SIPP interviews. Leveraging longitudinal data, we developed models to direct incentive offers for survey completion. We focus on measurable changes in the response rate among sample households and assign incentives where the incentive would have the greatest differential effect in converting non-responders to responders. The evaluation of this test focuses on the cost and effectiveness of the incentive plans. Having identified a +3% response rate impact for a $40 conditional incentive when distributed to all respondents during Waves 1 and 2, in Wave 3 we administered a model-based incentive to 30% of the sample in three of the four treatment groups. Incentives were assigned by ranking households by the predicted incentive effectiveness. The model-based incentive attempted to achieve a response rate improvement as close as possible to the +3% observed for the full-sample incentive distribution with only 30% of the expenditure, by incentivizing the households where the incentive had the greatest effect. This paper outlines the multipurpose incentives experiment, provides preliminary results, and presents directions for future waves and panels.
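
The targeting logic described in this abstract – predict each household's incentive effect, rank households, and incentivize only the top share – can be illustrated with a minimal sketch. This is not the Census Bureau's actual SIPP model; the column names, covariates and logistic specification below are hypothetical.

```python
# Minimal sketch of model-based incentive targeting (hypothetical data and
# columns; not the actual SIPP implementation).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def predicted_incentive_uplift(history: pd.DataFrame) -> pd.Series:
    """Estimate, per household, how much an incentive raises the probability
    of responding, using prior-wave data in which the incentive was randomly
    offered (hypothetical 'incentivized' indicator)."""
    features = ["prior_refusals", "prior_noncontacts", "household_size"]
    X = history[features + ["incentivized"]]
    y = history["responded"]
    model = LogisticRegression(max_iter=1000).fit(X, y)

    with_inc = X.copy()
    with_inc["incentivized"] = 1
    without_inc = X.copy()
    without_inc["incentivized"] = 0
    return pd.Series(
        model.predict_proba(with_inc)[:, 1] - model.predict_proba(without_inc)[:, 1],
        index=history.index,
        name="uplift",
    )

def assign_incentives(history: pd.DataFrame, budget_share: float = 0.30) -> pd.Index:
    """Rank households by predicted uplift and flag the top `budget_share`
    (e.g. 30%) to receive the incentive offer."""
    uplift = predicted_incentive_uplift(history)
    n_target = int(np.ceil(budget_share * len(uplift)))
    return uplift.sort_values(ascending=False).index[:n_target]
```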


2. Addressing differential response through the use of targeted financial incentives: Results from a pilot on the Growing Up in Scotland study
Ms Line Knudsen (ScotCen Social Research (Growing Up in Scotland study))
Mr Paul Bradshaw (ScotCen Social Research)

Since its inception in 2005, the Growing Up in Scotland study (GUS) has tracked the lives of more than 10,000 children and families living in Scotland. On GUS, as on most other cohort studies, differential attrition is a key concern. Families living in disadvantaged circumstances are less likely to take part and, over time, this has meant that important sub-groups such as teenage mothers, lone parents and lower-income families have disproportionately diminished in size.

Financial incentives have long been a feature of household surveys. However, such incentives have never been a routine feature of GUS. Relatively high response rates indicated that the vast majority of participants were willing to take part without a financial incentive, and the potential minor improvements in response rates have never outweighed the costs of offering such an incentive to all participants. However, targeting incentives at participants with specific characteristics (particularly those who are less likely to take part and who are thus under-represented in the sample) offered the potential to reduce sample bias whilst minimising costs. In 2015 a trial of targeted financial incentives was included as part of the third sweep of data collection with the youngest birth cohort. Study participants were considered eligible for an incentive if they met at least one of the following criteria: aged under 20 when the cohort child was born; a single parent; or living in one of the most deprived areas. Of this group of ‘incentive eligibles’, a subgroup of c.700 participants was randomly selected to receive an unconditional financial incentive with their advance letter, leaving a control group of c.1,200.
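
A hedged sketch of the eligibility-and-assignment step described above. The field names and deprivation coding are invented for illustration; the actual GUS sample files and selection routine are not shown here.

```python
# Sketch of targeted-incentive eligibility and random assignment
# (hypothetical column names; not the actual GUS selection routine).
import pandas as pd

def flag_incentive_eligible(sample: pd.DataFrame) -> pd.Series:
    """A participant is eligible if they meet at least one criterion:
    aged under 20 at the cohort child's birth, a single parent,
    or living in one of the most deprived areas."""
    return (
        (sample["mother_age_at_birth"] < 20)
        | sample["single_parent"]
        | (sample["deprivation_quintile"] == 1)
    )

def assign_treatment(sample: pd.DataFrame, n_treated: int = 700, seed: int = 2015) -> pd.DataFrame:
    """Randomly select n_treated eligible participants to receive the
    unconditional incentive with their advance letter; the remaining
    eligibles form the control group."""
    out = sample.copy()
    out["eligible"] = flag_incentive_eligible(out)
    treated = out[out["eligible"]].sample(n=n_treated, random_state=seed).index
    out["incentive_group"] = "not_eligible"
    out.loc[out["eligible"], "incentive_group"] = "control"
    out.loc[treated, "incentive_group"] = "treatment"
    return out
```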

This presentation will outline the results of the pilot and the recommendations and decisions made on the back of these, including the financial incentives strategy now adopted with the older GUS birth cohort at sweep 9.

Growing Up in Scotland is a longitudinal birth cohort study funded by the Scottish Government and undertaken by ScotCen Social Research. Currently, GUS follows two cohorts of children: c.3,500 children born in 2004/05 and c.6,000 children born in 2010/11. Study website: www.growingupinscotland.org.uk


3. Participation patterns in longitudinal surveys: what can we learn from interviewers’ evaluations?
Dr Oliver Lipps (FORS)
Dr Marieke Voorpostel (FORS)

Many attrition studies treat nonresponse in a longitudinal survey as an absorbing state. Yet, in household panels respondents often drop out temporarily and come back to the survey in a later wave, producing a wide variety of participation patterns. These participation patterns over the course of a longitudinal survey have not received much research attention. Interviewers often provide some form of assessment of the interview and this may help to better understand the reasons for such irregular response patterns. In fact, the interviewer’s view of how the respondent perceived the interview predicts future participation (Plewis et al., 2016; Uhrig, 2008) and hence provides important input on which to base measures to prevent nonresponse in future waves.

Our aim is to better understand participation patterns and to assess the extent to which interviewers’ evaluations of the interview are indicative of future participation. Using the Swiss Household Study, we model participation, noncontact and refusal separately, using linear models (with fixed and between-respondent effects) that take into account the interviewer’s evaluation and the response history. Between- and fixed-effects models allow us to examine not only how respondents differ, but also what happens as respondents change over the course of the panel.
Starting from participation, noncontact and refusal models that control for the usual socio-demographic and inclusion variables, our preliminary results show that while the past numbers of participations, refusals and noncontacts largely improve the models, adding the interviewer assessments yields slightly better predictions of participation and refusal, but not of noncontact.

The presentation will also address practical implications of our findings.
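
One common way to combine between-respondent and fixed (within-respondent) effects in a single linear model is the "hybrid" specification: include each respondent's mean of a time-varying predictor (the between part) alongside the wave-specific deviation from that mean (the within part). The sketch below uses hypothetical variable names and a linear probability model; it is an illustration of that general approach, not the authors' actual specification.

```python
# Hybrid (between/within) linear probability model sketch, using hypothetical
# variable names; not the authors' actual specification. Assumes one row per
# respondent-wave with no missing values.
import pandas as pd
import statsmodels.formula.api as smf

def fit_hybrid_participation_model(panel: pd.DataFrame):
    """panel: columns 'respondent_id', 'participated' (0/1),
    'interviewer_rating' (interviewer's evaluation of the previous interview)
    and 'prior_refusals' (response history)."""
    df = panel.copy()
    for var in ["interviewer_rating", "prior_refusals"]:
        # Between part: respondent-level mean; within part: deviation from it.
        df[f"{var}_between"] = df.groupby("respondent_id")[var].transform("mean")
        df[f"{var}_within"] = df[var] - df[f"{var}_between"]

    formula = (
        "participated ~ interviewer_rating_within + interviewer_rating_between"
        " + prior_refusals_within + prior_refusals_between"
    )
    # Cluster standard errors by respondent to reflect repeated observations.
    return smf.ols(formula, data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["respondent_id"]}
    )
```

Within terms capture what happens as a given respondent changes across waves (the fixed-effects part), while between terms capture stable differences across respondents.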


4. Short- and Medium-Term Effects of Incentive Alterations in the NEPS Adult Survey
Mr Michael Ruland (infas Institute for Applied Social Sciences, Germany)
Mrs Angelika Steinwede (infas Institute for Applied Social Sciences, Germany)
Mrs Annette Trahms (Institute for Employment Research (IAB), Germany)

Incentives in panel surveys are implemented to motivate participation and maintain panel stability in order to reduce nonresponse bias. Much research in the field of survey methodology on the impact of incentives identifies positive effects on response rates in general; moreover, monetary incentives are considered more effective than non-monetary incentives. Research on modes of incentive delivery indicates stronger effects of unconditional prepaid incentives than of postpaid incentives, which are conditional on survey participation. This applies in particular to target groups that are more likely to drop out of panels. Singer et al. (1998, 2000) address the issue that incentives in panel surveys may generate expectations among respondents to receive incentives similar in form and amount in future waves as well. Disappointing such expectations may result in panel attrition. The consequences of altering the amount and delivery method of incentives in longitudinal surveys have not yet been researched sufficiently.
This paper is based on the adult starting cohort 6 of the National Educational Panel Study, whose sample was drawn from residents’ registration offices and consists of individuals born between 1944 and 1986. Annual survey waves have been conducted since 2009 in a multi-method design of telephone and face-to-face interviews. Every second wave, the face-to-face interviews include a measurement of competencies. Participants in the first five waves were rewarded with a postpaid incentive, which has amounted to 20 euros from wave 3 onwards. In 2014, an incentive experiment was implemented in the survey, which allows for analyzing both short-term and medium-term effects of these changes. The randomized split-half experiment started in wave 6, delivering 50 percent of the incentive (10 euros) as an unconditional prepaid incentive. Wave 8 saw a comprehensive switch across the sample from prepaid to postpaid and from postpaid to prepaid.
Our paper examines the effects of alterations in the incentives’ delivery mode with regard to panel attrition and nonresponse bias. Wave 6 shows positive effects of prepaid incentives on participation, especially among respondents with low education levels. Wave 7 allows the analysis of medium-term effects of switching from postpaid to prepaid, while wave 8 provides data for determining the impact of a switch from prepaid back to postpaid.
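
For readers who want to reproduce the basic comparison, the short-term effect in a split-half experiment of this kind reduces to comparing response rates between the prepaid and postpaid arms. A minimal sketch with illustrative counts (not NEPS results):

```python
# Two-proportion z-test comparing response rates in the prepaid vs. postpaid
# arms of a split-half incentive experiment (illustrative counts only).
from statsmodels.stats.proportion import proportions_ztest

responded = [820, 760]   # hypothetical respondents in prepaid / postpaid arm
invited = [1000, 1000]   # hypothetical sample sizes per arm

stat, p_value = proportions_ztest(count=responded, nobs=invited)
print(f"prepaid rate: {responded[0] / invited[0]:.3f}, "
      f"postpaid rate: {responded[1] / invited[1]:.3f}, p = {p_value:.4f}")
```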