
ESRA 2023 Program

All time references are in CEST

Improving the representativeness, response rates and data quality of longitudinal surveys 3

Session Organisers: Dr Jason Fields (US Census Bureau), Dr Nicole Watson (University of Melbourne)
Time: Thursday 20 July, 09:00 - 10:30
Room: U6-21

Longitudinal survey managers are increasingly finding it difficult to achieve their statistical objectives within available resources, especially with the changes to the survey landscape brought by the COVID-19 pandemic. Tasked with interviewing in different populations, measuring diverse substantive issues, and using mixed or multiple modes, survey managers look for ways to improve survey outcomes. We encourage submission of papers on the following topics related to improving the representativeness, response rates and data quality of all, but especially longitudinal, surveys:

1. Adaptive survey designs that leverage the strength of administrative records, big data, census data, or paradata. For instance, what cost-quality tradeoff paradigm can be operationalized to guide development of cost and quality metrics and their use around the survey life cycle? Under what conditions can administrative records or big data be adaptively used to supplement survey data collection and improve data quality?

2. Adaptive survey designs that address the triple drivers of cost, respondent burden, and data quality. For instance, what indicators of data quality can be integrated to monitor the course of the data collection process? What stopping rules of data collection can be used across a multi-mode survey life cycle?

3. Papers involving longitudinal survey designs focused on improving the quality of measures of change over time. How can survey managers best engage with the complexities of such designs? How are overrepresented or low-priority cases handled in a longitudinal context?

4. Survey designs involving targeted procedures for sub-groups of the sample aimed at improving representativeness, such as sending targeted letters, prioritising contact of hard-to-get cases, timing calls to the most productive windows for certain types of cases, or assigning the hardest cases to the most experienced interviewers.

5. Papers involving experimental designs or simulations aimed at improving the representativeness, response rates and data quality of longitudinal surveys.

Papers

Using a Stopping Rule to Optimize Cost-Quality Trade-offs in a Longitudinal Study of Doctorate Recipients

Dr James Wagner (University of Michigan) - Presenting Author
Dr Brady West (University of Michigan)
Ms Deji Suolang (University of Michigan)
Dr Brian Kim (University of Maryland)
Mr Curtiss Engstrom (University of Michigan)
Dr Jennifer Sinibaldi (Pennsylvania State University)

Rising nonresponse and the increasing cost of conducting surveys are putting pressure on survey organizations to allocate resources efficiently. The challenge is to produce the highest quality data possible within a fixed budget. In interviewer-administered studies, allocating interviewer effort is an important aspect of the design with a large impact on quality. Some studies prioritize certain cases so that they receive more effort; for example, cases from under-responding groups might be judged important and therefore prioritized. Another approach uses stopping rules to “de-prioritize” some cases so that they receive less effort, effectively reallocating that effort to more important cases.

We propose a stopping rule based upon the product of predicted costs and the predicted mean squared error (MSE) of a survey estimate. The rule is designed to optimize the cost-error trade-off; its inputs are predictions of costs and of survey estimates. We simulate the impact of implementing the stopping rule on the US Survey of Doctorate Recipients (SDR), a longitudinal survey conducted every two years using web, mail, and CATI modes and sponsored by the US National Center for Science and Engineering Statistics. We vary the types of models used to generate the input predictions (parametric regression vs. nonparametric tree models) and the timing of the implementation of the rule. We find that substantial cost savings are possible with relatively small decreases in estimate quality.
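As a minimal sketch of how such a cost-by-MSE product criterion might be operationalized (the function and variable names below are our own illustrative assumptions, not the authors' implementation):

import numpy as np

def flag_cases_to_stop(cost_if_continued, mse_if_continued,
                       cost_if_stopped, mse_if_stopped):
    """Return a boolean mask of active cases to de-prioritize.

    Each argument holds one value per active case: the predicted total
    data collection cost and the predicted MSE of the key survey
    estimate under the two scenarios. A case is stopped when the
    cost x MSE product is smaller if effort on that case ends now.
    """
    return (np.asarray(cost_if_stopped) * np.asarray(mse_if_stopped)
            < np.asarray(cost_if_continued) * np.asarray(mse_if_continued))

# Illustrative numbers for three active cases
stop = flag_cases_to_stop([120.0, 80.0, 95.0], [0.021, 0.020, 0.020],
                          [100.0, 75.0, 60.0], [0.024, 0.022, 0.021])
print(stop)  # [ True False  True]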


Change or Challenge: Is Consent to Data Transfer to New Project Institutions in a Self-administered Panel Survey Selective?

Miss Almut Schumann (Federal Institute for Population Research (BiB)) - Presenting Author
Dr Claudia Schmiedeberg (University of Munich)

High participation in longitudinal surveys is important for securing data quality and representativeness. Changes in study design or processes, which have become increasingly common recently, bear the risk of affecting survey participation and sample composition. An institutional change within a running panel, e.g. introducing a new survey institute or a new project partner, may increase selectivity because respondents’ explicit consent is required for transferring their data to the new institution. But it is unclear which respondent characteristics are associated with the consent decision and whether the timing of such an institutional change plays a role.
Based on data from wave 14 of the German Family Panel pairfam, we investigate which respondents provided consent to being contacted for future waves by the new project team when an institutional change was announced. Besides sociodemographic characteristics, we examine respondents’ family situation, which may affect how interested respondents are in the family-related topics of the panel and in turn influence their consent decision. In particular, as prior panel experience affects commitment to the survey, we investigate whether selection effects are more pronounced among respondents who entered the panel only recently compared to respondents from the original sample.
We find that sociodemographic characteristics affect panel consent: migrants and lower-educated respondents are significantly less likely to consent, whereas relationship situation does not appear to influence the probability of providing consent. Furthermore, our findings reveal that short panel experience leads not only to less panel consent but also to stronger selection effects. As respondents’ consent is mandatory in the case of an institutional change, running panel surveys should be aware that this step may be a cutting point for respondents who are generally less likely to participate, particularly when their panel experience is short.


The Transformed Labour Force Survey (TLFS): Improving the representativeness and quality of data through an Adaptive Survey Design

Ms Maria Tortoriello (UK Office for National Statistics) - Presenting Author
Ms Sabina Kastberg (UK Office for National Statistics)

The UK Office for National Statistics are transforming their social surveys. Transformation of the Labour Force Survey (LFS) began in 2017, focusing on the re-design of the survey for online data collection. At the start of the Covid-19 pandemic, an online-only Transformed Labour Force Survey (TLFS) was launched. An encouraging response rate was achieved; however, as expected for a voluntary online-only survey, differential non-response bias was a problem. As the pandemic lessened, the natural next step for the TLFS was to introduce face-to-face interviewers. A 'knock-to-nudge' approach was implemented to improve the response and representativeness of the data. Rather than allocating the most expensive survey element to all respondents, we targeted the face-to-face resource at historically under-represented and hard-to-get groups through implementation of an Adaptive Survey Design (ASD). Analysis of historical TLFS data showed that online data collection was working effectively for certain groups of respondents but not others. An ASD would ensure data collection resources are used in the most efficient way whilst increasing response from historically under-represented population groups.

Our ASD closely follows methodology developed by Statistics Netherlands and is based on a response propensity model. A logistic regression model was applied to historical TLFS data to identify auxiliary variables strongly associated with response and thereby formulate the ASD strata. Face-to-face interviewers are targeted at under-represented strata to reduce the variation in response propensities for a selected set of auxiliary variables.
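A minimal sketch of this kind of propensity modelling, assuming a historical case-level response file (the file name and column names are hypothetical, not the actual ONS variables):

import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical historical TLFS case file: one row per sampled case,
# with auxiliary frame variables and a 0/1 online-response outcome.
history = pd.read_csv("tlfs_history.csv")
X = pd.get_dummies(history[["region", "age_band", "tenure"]])

model = LogisticRegression(max_iter=1000)
model.fit(X, history["responded_online"])

# Estimated response propensities, grouped into quintile strata; the
# lowest-propensity stratum is flagged for face-to-face
# 'knock-to-nudge' follow-up.
history["propensity"] = model.predict_proba(X)[:, 1]
history["stratum"] = pd.qcut(history["propensity"], 5,
                             labels=False, duplicates="drop")
history["target_f2f"] = history["stratum"] == 0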

The ASD was implemented in November 2022. Moving away from a traditional focus on response rates, we will be monitoring a range of data quality indicators over the next 6 months to inform future development of the ASD.

This presentation will explore results from the first iteration of the ASD.


Transitioning Healthcare Research from Telephone to a Mixed-mode Methodology while Ensuring Sample Representativeness Among Lower SES Populations

Ms Robyn Rapoport (SSRS) - Presenting Author
Mr Rob Manley (SSRS)
Mr Jonathan Best (SSRS)
Ms Lauren Haynes (Commonwealth Fund)
Dr Sara Collins (Commonwealth Fund)

Traditional telephone surveys have been a cornerstone of data collection for the past 50 years. With steadily declining response rates and the accompanying increase in data collection costs, researchers are exploring a range of approaches aimed at maintaining data quality while controlling costs. These include mixed-mode methodologies that combine address-based samples, probability panels, and targeted telephone outreach. A key challenge arises, however, when the study being transitioned is long-standing and measures healthcare access and health outcomes over time.

The Commonwealth Fund Biennial Health Insurance Survey (“Biennial”) is a longitudinal survey that has been gathering data for more than two decades in the United States. Survey foci include healthcare coverage gaps and other cost-related healthcare issues. Each iteration has featured a stratified sample and an adaptive design aimed at ensuring representative samples of lower SES populations. The most recent iteration, which fielded from March 28 through July 4, 2022, used a multi-mode data collection methodology conducted alongside a companion RDD cell sample study. Including the companion sample allowed us to analyze trends from the 2020 Biennial survey and explore mode effects that may have been introduced by the change in sampling and data collection methodologies. This companion sample closely mirrored the sample designs of previous Biennial surveys.

Using these data, we will present a case study on transitioning from an RDD cell-based methodology to a mixed-mode methodology that includes ABS push-to-web (or call-in) combined with interviews from the SSRS Opinion Panel and prepaid (contract) cell sample. We will compare sample demographic distributions along with key healthcare outcomes such as health insurance status, medical bills, health outcomes, and underinsurance. We will identify statistically significant differences and explore the factors that may be leading to them, such as mode of completion.
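As an illustration of the kind of design-comparison test described above, a simple chi-square test of one outcome distribution across the two samples might look like this (unweighted for brevity, where a production analysis would use the survey weights; file and column names are hypothetical):

import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical respondent files for the mixed-mode sample and the
# companion RDD cell sample, each with a categorical outcome column.
mixed = pd.read_csv("biennial_mixed_mode.csv")
rdd = pd.read_csv("biennial_rdd_companion.csv")

# Cross-tabulate sample design by outcome and test whether the two
# distributions differ significantly.
design = pd.Series(["mixed"] * len(mixed) + ["rdd"] * len(rdd))
outcome = pd.concat([mixed["insurance_status"],
                     rdd["insurance_status"]], ignore_index=True)
table = pd.crosstab(design, outcome)
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")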