Program at a glance 2021
Access the Scoocs manuals
We offer full live support in video sessions on the platform during the following hours:
2 July: 11:45-13:45 and 15:00-17:00
9 July: 13:00-15:00
16 July: 13:00-14:00
23 July: 13:00-14:00
For help outside these hours, please see the Scoocs manuals referenced above.
All times are in CEST.
Non-response in longitudinal studies
Session Organisers: Professor Peter Lynn (University of Essex), Professor Annamaria Bianchi (University of Bergamo), Dr Alessandra Gaia (University of Milano-Bicocca)
Time: Friday 16 July, 15:00-16:30
Non-response is a serious threat to data quality. Not only does it decrease efficiency, but it may also lead to biased estimates if sample members who do not participate in surveys differ in relevant respects from those who do. Survey researchers and practitioners invest considerable effort in strategies to increase response rates and to handle non-response after data collection. In the context of longitudinal studies, this session discusses both methods to minimise non-response – such as respondent incentives, variations in the number and mode of reminders, and testing of alternative recruitment strategies – and methods to handle non-response after data collection (e.g. weighting strategies).
Keywords: non-response, response speed, web surveys, tailored survey design, mixed-mode survey
Comparing face-to-face and online recruitment approaches: evidence from two probability-based panels in the UK
Mr Curtis Jessop (NatCen Social Research) - Presenting Author
Dr Joel Williams (Kantar Public)
The use of probability-based panels for survey research can enable the collection of survey data from a high-quality sample more quickly and at lower cost than traditional face-to-face methods. The recruitment stage is a key step in the set-up of a panel study, but it can also represent a substantial cost. A face-to-face recruitment approach in particular can be expensive, but a lower recruitment rate from a push-to-web approach risks introducing bias and putting a limit on what subsequent interventions to minimise non-response can achieve.
This paper presents findings on using face-to-face and push-to-web recruitment approaches from two probability-based panels in the UK: the NatCen Panel and Kantar’s Public Voice Panel. The NatCen Panel is recruited from participants in the British Social Attitudes survey (BSA). While normally conducted face-to-face, the 2020 BSA was conducted using a push-to-web approach in response to the Covid-19 pandemic. The Kantar Public Voice Panel has been built using a mixture of push-to-web and face-to-face fieldwork.
For each panel, we compare the recruitment rates and overall response rates of the face-to-face and push-to-web fieldwork designs. We also compare the demographic profile of panel survey participants recruited using each approach to explore to what extent any differences in recruitment and response rates translate into bias in the sample. Finally, for the Public Voice Panel, we look at the extent to which the face-to-face sample can be used to address biases in the sample recruited from a push-to-web survey.
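The kind of comparison described above can be illustrated in miniature. The data and column names below are invented for illustration; they are not the actual NatCen or Public Voice variables.

```python
import pandas as pd

# Hypothetical recruitment data: one row per sampled person.
# "mode" is the fieldwork design, "recruited" whether they joined the panel.
sample = pd.DataFrame({
    "mode":      ["f2f", "f2f", "f2f", "web", "web", "web", "web", "web"],
    "recruited": [1, 1, 0, 1, 0, 0, 1, 0],
    "age_group": ["16-34", "35-54", "55+", "16-34", "16-34", "55+", "35-54", "55+"],
})

# Recruitment rate by fieldwork design.
rates = sample.groupby("mode")["recruited"].mean()

# Demographic profile of recruits versus the full issued sample:
# a crude check for recruitment bias.
profile_all = sample["age_group"].value_counts(normalize=True)
profile_recruits = (sample.loc[sample["recruited"] == 1, "age_group"]
                    .value_counts(normalize=True)
                    .reindex(profile_all.index, fill_value=0))

print(rates)
print((profile_recruits - profile_all).round(2))
```

In a real analysis the profile comparison would use design weights and more covariates, but the logic – rate by mode, then profile of recruits against the issued sample – is the same.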
Adding Text Messages to the Contact Strategy for a Mixed-Mode Survey: Does it Pay Off?
Mr Pablo Cabrera-Álvarez (Institute for Social and Economic Research (University of Essex)) - Presenting Author
Professor Peter Lynn (Institute for Social and Economic Research (University of Essex))
In longitudinal surveys, wave nonresponse and panel attrition are threats to data quality. A method to boost response rates is to use a combination of modes to reach sample members (Dillman et al., 2014). Using an additional contact mode provides another opportunity to reach a sample member who may ignore other contact attempts. It also reinforces the “take action” message of the survey invitation or reminder and, in some instances, can offer the respondent a different setting in which to complete the questionnaire, such as a paper questionnaire attached to a reminder letter. Text messages, a brief and direct communication mode, can supplement emails or letters to encourage survey response.
This research explores the effect of adding an SMS invitation or reminder to a mixed-mode contact strategy using data from an experiment embedded in wave 11 of Understanding Society, the UK Household Longitudinal Study. More than 4,000 households and 9,000 sample members included in the experiment were asked to participate online as part of a sequential web and phone strategy. For the experiment the sample was split into four groups: group 1 (control) received the usual letter and email invitations and reminders; group 2, in addition to the usual contact attempts, received an SMS invitation; group 3 received two additional SMS reminders; and group 4 received both the SMS invitation and the reminders. The SMS included a short text encouraging participation and a personalised link to the web questionnaire.
The analysis focuses on the effect of adding text messages to the mixed-mode contact strategy on survey response. It also examines whether adding the SMS reduces the time elapsed between receipt of an invitation or reminder and response. The relatively large sample size of the experiment allows for an in-depth analysis identifying the subgroups most likely to be influenced by the addition of the SMS. Moreover, as the SMS included the link to log on to the survey, we assess the effect on device choice and the impact on sample composition.
Testing higher levels of incentive on the UK Household Longitudinal Study (Understanding Society)
Miss Hannah Carpenter (Kantar) - Presenting Author
Dr Pablo Cabrera Alvarez (Institute for Social and Economic Research, University of Essex)
Understanding Society (also known as the UK Household Longitudinal Study) contacts the same households and individuals every year, and aims to conduct a 40-minute interview with everyone aged 16 or over in each household. The 12th wave of the study started in 2020. Throughout the previous 11 waves of the study the standard incentive had been £10 (in most cases unconditional, sent with advance letters). At wave 8, the study moved from being predominantly face-to-face to inviting a substantial proportion of households to complete online before all households were followed up by face-to-face interviewers. At this point, adults were offered an additional £10 ‘bonus’ incentive if they completed the survey online, before the transfer to interviewers. This bonus incentive has continued to be offered for online completion since wave 8.
An experiment was planned on wave 12 for households issued for fieldwork in April to August 2020. The experiment tested the standard incentive approach versus two experimental options: doubling the unconditional incentive to £20; and doubling the bonus for online completion to £20.
During this time, face-to-face interviewing was not possible in the UK due to the Covid-19 pandemic, the resulting national lockdown, and restrictions on personal contact, so the study switched to a sequential mixed-mode web-then-telephone approach. The experiment went ahead, as it was still useful to compare the effect of different incentive levels under a different fieldwork approach.
This paper will examine the overall effect of these higher incentives on response rates, and also whether there is a differential impact on different groups of respondents. In particular, can increased incentives improve response amongst some of the less engaged groups, and thereby improve the overall representativeness of the study? We will also examine at what point in the fieldwork sequence the different incentives have an impact: do they improve response at the web stage, or make it easier for interviewers to persuade sample members to take part during the telephone stage?
Weighting for mortality in a longitudinal study
Dr Olena Kaminska (ISER, University of Essex) - Presenting Author
In a long-term panel, mortality must be taken into account. In Understanding Society (the UK household panel), only around half of deaths are identified by interviewers. Many of the missed deaths occur among previous nonrespondents – those who drop out before dying. Without additional adjustment, panel members who have died are treated as nonrespondents in subsequent years. This skews the distribution of age in particular, and of any other variable related to death.
Two approaches for additional adjustment have been used in Understanding Society. First, we used information from death registers, obtained only in 2018, 27 years after the panel started. Through survival modelling we estimate mortality for everyone (including respondents) and use this to readjust the weights. Even after this step, a sizeable number of deaths remain unobserved, so the final adjustment uses official age-by-gender mortality information.
We explain the adjustment approach and demonstrate the bias in estimates without each of the additional adjustments.
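One simple form the external-rates adjustment could take is sketched below. The mortality rates, variable names, and the scaling rule (treating expected deaths as out of scope rather than as nonrespondents) are illustrative assumptions, not the actual Understanding Society weighting procedure.

```python
import pandas as pd

# Hypothetical annual probabilities of dying, by gender and age group.
# These figures are invented, not real life-table values.
annual_mortality = {
    ("female", "60-69"): 0.010,
    ("male",   "60-69"): 0.015,
    ("female", "70-79"): 0.030,
    ("male",   "70-79"): 0.045,
}

# Panel members whose vital status is unknown, with their current weights.
panel = pd.DataFrame({
    "gender":    ["female", "male", "male", "female"],
    "age_group": ["60-69", "60-69", "70-79", "70-79"],
    "weight":    [1.0, 1.2, 0.9, 1.1],
    "years_since_last_interview": [2, 2, 3, 1],
})

def survival_prob(row):
    """Probability the person is still alive, assuming a constant
    annual mortality rate within the age group (a simplification)."""
    q = annual_mortality[(row["gender"], row["age_group"])]
    return (1 - q) ** row["years_since_last_interview"]

# Scale each weight by the estimated probability of being alive, so that
# expected deaths are removed from the represented (living) population
# instead of being treated as nonresponse.
panel["p_alive"] = panel.apply(survival_prob, axis=1)
panel["adj_weight"] = panel["weight"] * panel["p_alive"]
print(panel[["gender", "age_group", "p_alive", "adj_weight"]])
```

Groups with higher mortality (here, older men) are deflated more, which is what corrects the age skew the abstract describes.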
A Data Driven Approach to Understanding and Handling Non-Response in the Next Steps Cohort
Dr Richard Silverwood (Centre for Longitudinal Studies, UCL Social Research Institute, University College London) - Presenting Author
Dr Lisa Calderwood (Centre for Longitudinal Studies, UCL Social Research Institute, University College London)
Dr Joseph Sakshaug (Institute for Employment Research and University of Mannheim)
Professor George Ploubidis (Centre for Longitudinal Studies, UCL Social Research Institute, University College London)
Non-response is common in longitudinal surveys, reducing efficiency and introducing the potential for bias. Principled methods, such as multiple imputation, are generally required to obtain unbiased estimates in surveys subject to missingness which is not completely at random. We present a systematic data-driven approach used to identify predictors of non-response in Next Steps, an ongoing English national cohort study which follows a sample of young people from age 13-14 years to (currently) age 25 years.
The identified predictors of non-response spanned a number of broad areas, including personal characteristics, schooling and behaviour in school, activities and behaviour outside of school, mental health and wellbeing, socioeconomic status, and practicalities around contact and survey completion. We found that including these predictors of non-response as auxiliary variables in multiple imputation analyses allowed us to largely restore sample representativeness in several different settings: sociodemographic characteristics and household salary (relative to data from earlier waves) and university attendance (relative to an external benchmark).
We propose that such variables are included in future analyses using principled methods to reduce bias due to non-response in Next Steps. Future work will consider whether incorporating information from linked administrative datasets can further aid the handling of non-response in Next Steps.
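The multiple-imputation logic described above can be sketched in miniature. The data are synthetic (not Next Steps), the auxiliary variable and outcome are invented, and for brevity the imputation model redraws only residual noise, not the regression coefficients, so this is an improper simplification of full MI; the pooling step follows Rubin's rules.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic cohort: auxiliary variable x (e.g. an early-wave predictor of
# non-response) fully observed, outcome y missing for ~30% of members.
n = 500
x = rng.normal(size=n)
y = 2.0 + 1.5 * x + rng.normal(scale=1.0, size=n)
missing = rng.random(n) < 0.3
y_obs = np.where(missing, np.nan, y)

# Fit the imputation model y ~ x on the observed cases.
obs = ~missing
X_obs = np.column_stack([np.ones(obs.sum()), x[obs]])
beta, *_ = np.linalg.lstsq(X_obs, y_obs[obs], rcond=None)
resid_sd = np.std(y_obs[obs] - X_obs @ beta)

# Create m imputed datasets, adding residual noise so the imputations
# reflect uncertainty rather than sitting on the regression line.
m = 20
means, variances = [], []
for _ in range(m):
    y_imp = y_obs.copy()
    y_imp[missing] = (beta[0] + beta[1] * x[missing]
                      + rng.normal(scale=resid_sd, size=missing.sum()))
    means.append(y_imp.mean())
    variances.append(y_imp.var(ddof=1) / n)   # variance of the mean

# Rubin's rules: pooled point estimate, then total variance combining
# within-imputation (w) and between-imputation (b) components.
q_bar = np.mean(means)
w = np.mean(variances)
b = np.var(means, ddof=1)
total_var = w + (1 + 1 / m) * b
print(q_bar, total_var)
```

The role of the auxiliary variable is visible here: because x predicts y, the imputed values recover information about the missing outcomes, which is exactly why the abstract recommends carrying the identified non-response predictors into future imputation models.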