
ESRA 2023 Program

All time references are in CEST

Improving the representativeness, response rates and data quality of longitudinal surveys 5

Session Organisers: Dr Jason Fields (US Census Bureau), Dr Nicole Watson (University of Melbourne)
Time: Thursday 20 July, 16:00 - 17:30
Room: U6-07

Longitudinal survey managers are increasingly finding it difficult to achieve their statistical objectives within available resources, especially with the changes to the survey landscape brought by the COVID-19 pandemic. Tasked with interviewing in different populations, measuring diverse substantive issues, and using mixed or multiple modes, survey managers look for ways to improve survey outcomes. We encourage submission of papers on the following topics related to improving the representativeness, response rates and data quality of all, but especially longitudinal, surveys:

1. Adaptive survey designs that leverage the strength of administrative records, big data, census data, or paradata. For instance, what cost-quality tradeoff paradigm can be operationalized to guide development of cost and quality metrics and their use around the survey life cycle? Under what conditions can administrative records or big data be adaptively used to supplement survey data collection and improve data quality?

2. Adaptive survey designs that address the triple drivers of cost, respondent burden, and data quality. For instance, what indicators of data quality can be integrated to monitor the course of the data collection process? What stopping rules of data collection can be used across a multi-mode survey life cycle?

3. Papers involving longitudinal survey designs focused on improving the quality of measures of change over time. How can survey managers best engage with the complexities of such designs? How are overrepresented or low-priority cases handled in a longitudinal context?

4. Survey designs involving targeted procedures for sub-groups of the sample aimed at improving representativeness, such as sending targeted letters, prioritising contact of hard-to-get cases, timing calls to the most productive windows for certain types of cases, or assigning the hardest cases to the most experienced interviewers.

5. Papers involving experimental designs or simulations aimed at improving the representativeness, response rates and data quality of longitudinal surveys.

Papers

Efforts to boost response and maximise representativeness in the Next Steps Age 32 Survey.

Mr Matt Brown (Centre for Longitudinal Studies, UCL) - Presenting Author
Ms Tugba Adali (Centre for Longitudinal Studies, UCL)
Ms Stella Fleetwood (Ipsos)
Ms Kirsty Burston (Ipsos)
Ms Kirsty Macleod (Ipsos)

Maintaining representation and maximising sample size are key objectives of all longitudinal studies.

Next Steps is a longitudinal study following participants born in England in 1989-90. Eight waves of data collection have taken place since 2004. The 9th wave (the Age 32 Survey) is currently ongoing.

The survey employs a mixed-mode web-first design with non-respondents followed up face-to-face. This paper presents findings from two experiments, conducted in the first phase of fieldwork, which sought to improve representation and maximise response.

1) Targeted incentives
Next Steps has long offered conditional incentives to participants. Higher value incentives typically have a greater impact on response, but offering large incentives is not always feasible or cost-effective. As such, targeting larger incentives at participants with lower response propensity has become increasingly common, though it remains rare in the UK. Our first experiment sought to evaluate whether offering prior wave non-respondents a larger incentive than prior wave respondents could reduce non-response bias in Next Steps.

2) Mop-up non-response survey
Survey refusals are often circumstantial and result from survey invitations being made at a non-optimal time. The Next Steps Age 32 Survey is relatively long (60 minutes), which is another cause of refusals. Our second experiment was conducted after interviewers had exhausted efforts to contact web-phase non-respondents. Non-respondents were re-invited to participate online after a three-month gap, with half randomly allocated to a group asked to complete the full 60-minute survey and the remainder allocated to an abbreviated 20-minute version. The findings from the experiment will allow us to assess whether this approach of providing a final opportunity to participate online can boost response, and whether reducing the length at this 'mop-up' stage can increase completion rates.


Implications of parental opt-out or opt-in consent for response rates and representativeness in a national longitudinal survey of youth

Professor Ben Edwards (Australian National University) - Presenting Author
Professor Matthew Gray (Australian National University)
Mr Martin Murphy (Australian Council of Educational Research)
Dr Dan Edwards (Australian Council of Educational Research)
Ms Intifar Chowdhury (Australian National University)
Dr Nikki Honey (Social Research Centre)
Ms Kylie Hillman (Australian Council of Educational Research)
Professor Andrew Norton (Australian National University)

In this paper we discuss the implications of parental opt-out or opt-in consent for the representativeness and response rates of wave 1 of GENERATION, a school-based survey of over 16,000 15-16 year olds from 295 schools in Australia. We capitalise on a natural experiment: the variation in state government education policies requiring parents to actively notify schools if they want their young person to participate (opt-in consent) or to notify schools if they do not want their young person to participate (opt-out consent). Education policies in government school systems in four of eight states required parental opt-in consent, providing exogenous policy variation in survey implementation and an opportunity to test whether the response rates and representativeness of the survey were affected.

Preliminary evidence from 204 schools suggests that response rates were lower under opt-in consent. Based on AAPOR standard definitions, the response rate for completed surveys (RR1) was 0.62 for opt-out schools and 0.35 for opt-in schools. When partial surveys were also counted (RR2), the gap was even larger (0.92 compared to 0.52).
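
To make the comparison concrete, the sketch below shows how AAPOR RR1 and RR2 rates like those quoted above are computed from case dispositions, assuming no cases of unknown eligibility. The disposition counts are hypothetical, chosen only to reproduce the reported rates; they are not the GENERATION study data.

```python
# Illustrative AAPOR response-rate calculation (hypothetical disposition
# counts, not the GENERATION study data).

def aapor_rr1(complete, partial, refusal, non_contact, other):
    """RR1: completed interviews over all eligible cases (no unknown eligibility)."""
    return complete / (complete + partial + refusal + non_contact + other)

def aapor_rr2(complete, partial, refusal, non_contact, other):
    """RR2: completed plus partial interviews over all eligible cases."""
    return (complete + partial) / (complete + partial + refusal + non_contact + other)

# Hypothetical dispositions per 1,000 eligible students.
opt_out = dict(complete=620, partial=300, refusal=50, non_contact=20, other=10)
opt_in = dict(complete=350, partial=170, refusal=300, non_contact=150, other=30)

for label, d in (("opt-out", opt_out), ("opt-in", opt_in)):
    print(label, "RR1 =", round(aapor_rr1(**d), 2), "RR2 =", round(aapor_rr2(**d), 2))
```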

Given previous evidence that parental opt-in consent leads to lower response rates from disadvantaged populations, we will also focus on whether disadvantaged students are less likely to participate. Given the exogenous nature of the policy variation, our findings should provide important evidence for survey methodologists to argue for opt-out approaches in low-risk student surveys. We discuss our findings and prior evidence in the context of the human right of children from disadvantaged backgrounds to be heard and the need for evidence on these populations to support education equity.


Lotteries as a means to improve participation in panel surveys? Short- and long-term effects of participatory and personalized lottery measures on the survey participation of young adults in NEPS

Dr Benjamin Schulz (WZB Berlin Social Science Center) - Presenting Author
Mr Konstantin Wagner (WZB Berlin Social Science Center)

Young adults are hard to reach for survey participation, just as they are hard to keep in panels. In this paper, we examine to what extent, and for which subgroups, a lottery with personalized prizes can help boost survey participation of young adults in the National Educational Panel Study (NEPS). Observing declining response rates in NEPS Starting Cohort Grade 9 as participants completed secondary education or vocational training, we implemented a cross-wave participatory lottery as of the 8th wave, including a randomized lottery in the 10th wave. We study short-term and long-term effects of the lottery by estimating how winning affects participation in the 11th, 12th, and 13th waves. Comparing the effects of high (N=59) versus low (N=470) versus no prizes (N=9,610), we test hypotheses about reciprocity norms. Moreover, we use competence measures to investigate differences with regard to cognitive burden.
We find evidence of short- and long-term effects on survey participation, especially of winning high prizes. Among winners, participation in subsequent waves is about 7 to 23 percentage points higher than among non-winners. In line with our expectations, we find evidence of reciprocity norms as long-term effects are particularly strong among participants who won high prizes. In contrast to our expectations concerning interviewee burden, we find that a lottery win increases further participation particularly strongly among respondents with high math test scores.
We discuss implications of our findings for study designs and survey practice in panel studies. In particular, we detail how the participatory, cross-wave character of our lottery (its announcement in cover letters, questions about the preferred kind of lottery and vouchers, and the presentation of the answers to these questions in the wave before the lottery) might have contributed to its particularly strong effect.
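
As a rough illustration of the kind of comparison described in this abstract, the sketch below estimates participation differences by prize category with a linear probability model. It is not the authors' NEPS code; the data are simulated and all variable names and effect sizes are assumptions.

```python
# Hedged sketch: participation differences by lottery prize category,
# estimated with a linear probability model on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 10_000
prize = rng.choice(["none", "low", "high"], size=n, p=[0.95, 0.045, 0.005])
base = 0.55
effect = {"none": 0.0, "low": 0.07, "high": 0.20}  # assumed effects, illustration only
participated = rng.binomial(1, [base + effect[g] for g in prize])

df = pd.DataFrame({"prize": pd.Categorical(prize, ["none", "low", "high"]),
                   "participated": participated})

# Coefficients give percentage-point differences relative to non-winners.
model = smf.ols("participated ~ C(prize)", data=df).fit(cov_type="HC1")
print(model.params)
```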


Do targeted higher-value conditional incentives improve survey response and representation in longitudinal studies? Evidence from the COVID Social Mobility and Opportunities Study (COSMO) in England

Dr Jake Anders (UCL Centre for Education Policy and Equalising Opportunities)
Professor Lisa Calderwood (UCL Centre for Longitudinal Studies)
Dr Tugba Adali (UCL Centre for Longitudinal Studies)
Mr James Yarde (The Sutton Trust)
Mr Luke Taylor (Kantar Public) - Presenting Author

Existing literature indicates that monetary incentives are effective at increasing survey co-operation rates. Typically, the larger the incentive offered, the higher the response rate achieved. There is more limited evidence on whether a higher incentive improves the representativity of achieved samples.
COSMO is a new cohort survey led by UCL and the Sutton Trust, funded by ESRC and UKRI, with fieldwork conducted by Kantar Public. In this paper, we examine the use of targeted higher-value conditional incentives in the context of recruiting a longitudinal panel using a predominantly push-to-web methodology.
The study is following a sample of young people across England who were in Year 11 (aged 15-16) in 2020/21. The sample was primarily drawn from the Department for Education National Pupil Database (NPD). The wave 1 fieldwork was conducted from September 2021 to April 2022.
Previous research indicated that young people from poorer backgrounds (who are eligible for free school meals) tend to respond to surveys at lower rates. Therefore, we varied the conditional incentive offered, aiming to improve the response and representation of this sub-population. As standard, a £10 conditional incentive was used. Young people attending schools with the highest proportions of pupils eligible for free school meals were offered £20.
As a result, otherwise similar young people received different incentives depending on which side of the cut-off separating these groups of schools their school fell. This allows us to use a regression discontinuity design to estimate the causal impact of the increased incentive on participation.
In addition, we use the detailed NPD sample frame information available to assess whether the larger incentive value helped improve the representativity of the achieved sample.
We thus provide evidence for researchers on the optimal incentive approach for future longitudinal surveys.
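
For readers less familiar with the approach, the sketch below illustrates the regression discontinuity logic described above on simulated data. It is not the COSMO analysis code; the cut-off value, bandwidth, effect size, and variable names are assumptions.

```python
# Hedged sketch of a sharp regression discontinuity estimate of a higher-value
# incentive's effect on participation. Simulated data; cut-off, bandwidth, and
# variable names are hypothetical, not the COSMO design values.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5_000
fsm_share = rng.uniform(0, 0.6, n)        # school-level share of FSM-eligible pupils
cutoff = 0.30                             # hypothetical threshold for the higher incentive
high_incentive = (fsm_share >= cutoff).astype(int)

# Participation declines with FSM share but jumps at the cut-off (true jump 0.06 here).
p = 0.55 - 0.3 * fsm_share + 0.06 * high_incentive
participated = rng.binomial(1, p)

df = pd.DataFrame({"running": fsm_share - cutoff,
                   "treated": high_incentive,
                   "participated": participated})

# Local linear regression within a bandwidth around the cut-off,
# allowing the slope to differ on each side.
bandwidth = 0.10
local = df[df["running"].abs() <= bandwidth]
rd = smf.ols("participated ~ treated + running + treated:running", data=local).fit(cov_type="HC1")
print(rd.params["treated"])  # estimated jump in participation at the cut-off
```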