
ESRA 2023 Glance Program


All time references are in CEST

Survey Nonresponse Trends and Trust in Surveys: A global perspective on the current survey climate 2

Session Organisers: Professor Bettina Langfeldt (University of Kassel)
Mrs Ulrike Schwabe (DZHW)
Dr Henning Silber (GESIS)
Time: Tuesday 18 July, 14:00 - 15:30
Room: U6-20

In our complex and interconnected world, there is a strong need for data-based scientific approaches to solving diverse local as well as global problems. However, to transform scientific recommendations successfully into policy measures, societal trust in scientific methods and results is required. Yet, mistrust toward scientific results appears to have risen in recent years.
In a democracy, surveys can be an important tool for measuring public opinion and informing political decision-makers about the views of their constituents. Yet, decreasing survey participation, attempts to manipulate polls, misleading accusations of “fake polls”, and polls not carried out in accordance with established scientific standards all put the validity of the gathered data in jeopardy. If the survey climate continues to decline, this will have drastic consequences for survey-based research, since both policymakers and the recipients of political interventions have to believe in the accuracy of the data.
Against this background, this session aims to bring together current research on the following topics:
Survey Climate: What is the current state of the survey climate? What caused changes? What can be done to foster a positive survey climate, both in terms of increasing survey participation and increasing the quality of survey data?
Trust in Surveys: What can be done to increase trust in surveys and their results? Which are determinants of participation and giving truthful answers that can be used to strengthen the quality of survey data? How is declining trust in surveys related to the declining trust in science?
Surveys and Society: How, if at all, is political participation linked to trust in science and to generalized attitudes towards surveys? What is the role of surveys in the democratic process? Can process-based data (e.g., digital trace data) help to mitigate the declining survey climate?

Papers

Soft Attrition in Self-Completion Panel Surveys: Evidence on Respondent Changes and Their Influencing Factors from the German GESIS Panel

Dr Sven Stadtmüller (GESIS - Leibniz-Institute for the Social Sciences) - Presenting Author
Dr Henning Silber (GESIS - Leibniz-Institute for the Social Sciences)
Ms Anna Hebel (GESIS - Leibniz-Institute for the Social Sciences)

Self-completion modes for survey data collection have become increasingly popular. Although they have many advantages, one problem is that interviewers are not present to verify the respondents’ identity and eligibility. This is especially disadvantageous if target persons live in multi-person households, as people may simply hand over the questionnaire (or URL) to another family member who is (more) willing to participate. In the context of self-completion panel surveys, we refer to this phenomenon as “soft attrition”, as opposed to “hard attrition”, where target persons no longer participate in the survey at all.

In panel studies, soft attrition can occur between any two survey waves. This may produce variance that researchers can mistakenly interpret as intra-individual change in their variables of interest. However, some panels refrain from asking socio-demographic questions in each wave, since asking for information that is already available and should not change over time is considered a waste of survey time and burdensome for most respondents. Therefore, undetected soft attrition in panel surveys may impair the results of panel data analyses.

Our research investigates how substantial the problem of soft attrition is, and whether we can explain why target persons do not leave the panel entirely but rather hand the survey over to another person. We address these questions by relying on the GESIS Panel, a probability-based mixed-mode (web and mail) panel survey of the German residential population. These data are especially appropriate because, unlike in comparable panel studies, questions on socio-demographics are collected annually. We examine how many panelists experienced a change in these socio-demographics over the course of their panel membership, and which factors (e.g., survey attitudes and evaluations of the GESIS Panel) influence the likelihood of soft attrition, especially compared to hard attrition.
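
As a rough illustration of this detection strategy (a sketch, not the authors' code), the short Python snippet below flags panelists whose supposedly time-invariant socio-demographics change between annual measurements; the column names (panelist_id, wave, sex, birth_year) are hypothetical.

    import pandas as pd

    def flag_soft_attrition(df: pd.DataFrame) -> pd.Series:
        """Return a boolean Series (indexed by panelist_id) that is True when
        a panelist's reported sex or birth year varies across waves."""
        invariant = ["sex", "birth_year"]
        # A truly time-invariant variable should take exactly one distinct
        # non-missing value per panelist; more than one distinct value
        # suggests that a different household member answered at least once.
        n_values = df.groupby("panelist_id")[invariant].nunique(dropna=True)
        return (n_values > 1).any(axis=1)

    # Toy example: panelist 2's reported birth year changes in wave 3.
    waves = pd.DataFrame({
        "panelist_id": [1, 1, 2, 2, 2],
        "wave":        [1, 2, 1, 2, 3],
        "sex":         ["f", "f", "m", "m", "m"],
        "birth_year":  [1980, 1980, 1965, 1965, 1992],
    })
    print(flag_soft_attrition(waves))  # panelist 2 -> True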


The Relation Between Nonresponse Rates and Nonresponse Bias. An Update and Extension of Groves and Peytcheva (2008)

Dr Peter Lugtig (Utrecht University) - Presenting Author
Dr Bella Struminskaya (Utrecht University)
Ms Shannon Dickson (Utrecht University)
Miss Kirsten van Kessel (Utrecht University)
Miss Celine van Henneveldt (Utrecht University)
Miss Annemarie Timmers (CITO)
Professor Robert M. Groves (Georgetown University)
Dr Emilia Peytcheva (RTI)

Groves and Peytcheva (2008) showed that the relation between nonresponse rates and nonresponse bias is only weakly positive. Their meta-analysis of 49 studies showed that higher nonresponse rates do, on average, lead to higher nonresponse bias, but that higher response rates are not always better when it comes to the extent of nonresponse bias. As the response rate is still often used as a heuristic for the risk of nonresponse bias, the findings of Groves and Peytcheva have proven important.
The study of Groves and Peytcheva relies on studies, mostly from the 1980s and 1990s, that were conducted via face-to-face and telephone (RDD) surveys. The purpose of the current study is twofold:

1. To update the meta-analysis with 140 new studies conducted in the period 2000-2020, and to study whether the relation between nonresponse rates and nonresponse bias has become stronger or weaker. Anecdotal evidence suggests that the steady decline in response rates has often not led to much higher nonresponse bias, implying that the relation between the two is perhaps even weaker than before. But is this really true?
2. Because of the large number of added studies, we can include many more covariates that help explain when the relation between nonresponse rates and nonresponse bias is weak or strong. We coded a wide range of survey design characteristics on modes, contact protocols, and key outcomes in every study. Using this covariate information, we can study design characteristics such as the effect of reminders on both nonresponse rates and nonresponse bias, mixed-mode designs, and other measures that are generally taken to increase response rates. During the conference, results will be presented graphically, and results from a meta-analysis focusing on between- and within-study predictors will be shown.
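
As a hedged illustration of the kind of meta-regression described in point 2 (a sketch on invented data, not the authors' analysis), the Python snippet below regresses study-level absolute relative bias on the nonresponse rate and one hypothetical design moderator, weighting each study by its inverse variance.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2023)
    n_studies = 140

    # Hypothetical study-level data.
    nonresponse_rate = rng.uniform(0.2, 0.9, n_studies)  # 1 - response rate
    used_reminders = rng.integers(0, 2, n_studies)       # design moderator
    variance = rng.uniform(0.5, 2.0, n_studies)          # per-study variance

    # Simulate a weak positive rate-bias relation for illustration only.
    abs_rel_bias = (2.0 + 3.0 * nonresponse_rate - 1.0 * used_reminders
                    + rng.normal(0.0, np.sqrt(variance)))

    # Inverse-variance weighted meta-regression (moderator analysis).
    X = sm.add_constant(np.column_stack([nonresponse_rate, used_reminders]))
    fit = sm.WLS(abs_rel_bias, X, weights=1.0 / variance).fit()
    print(fit.summary(xname=["const", "nonresponse_rate", "used_reminders"]))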


Survey Design Features With Potential for High Response Rates: A Meta-analytical Approach

Dr Jonas Klingwort (Statistics Netherlands) - Presenting Author
Mr Jeldrik Bakker (Statistics Netherlands)
Dr Vera Toepoel (Statistics Netherlands)

The number of population surveys conducted is enormous and increasing, but response rates are declining. The best solution to the missing data problem is not to have any missing data in the first place. Approaching this ideal requires choosing appropriate survey design features and administering the survey professionally.

This paper presents results from two meta-analytical approaches to quantifying the effects of survey design features on the response rate. The first application uses German crime surveys. Individual and independent studies dominate criminological survey research in Germany. This circumstance makes it possible to systematically study the effects of different survey design features on the response rate in crime surveys. Therefore, a systematic literature review of German crime surveys conducted between 2001 and 2021 was carried out, and the survey design features study year, target population, coverage area, data collection mode, and responsible institute were collected. The second application uses official statistics surveys (Statistics Netherlands). Here, considerably more, and more detailed, survey design features are available: fieldwork strategy, fieldwork duration, number of reminders, length of the survey, survey topic, recruitment letter, incentives, presence of an interviewer, and interviewer workload.

First, the overall response rate to German crime surveys and official statistics surveys is estimated (meta-analysis of proportions). Second, the influence of the design features on the response rate in both datasets is quantified using a meta-regression (moderator analysis). This makes it possible to identify which factors positively or negatively affect the response rate. Third, we will discuss model selection, since many features are available. Finally, we will report on the set of optimal survey design features to maximize response.
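
A minimal sketch of the first two steps, assuming invented survey counts and a single hypothetical design feature (number of reminders), might look as follows in Python; it is illustrative only and not the authors' code.

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.meta_analysis import combine_effects

    # Hypothetical per-survey data: r respondents out of n sampled persons.
    r = np.array([420, 510, 130, 800, 260])
    n = np.array([1000, 1500, 400, 2500, 900])
    reminders = np.array([0, 1, 0, 2, 1])  # hypothetical design feature

    # Step 1: meta-analysis of proportions on the logit scale.
    p = r / n
    logit_p = np.log(p / (1 - p))
    var_logit = 1 / r + 1 / (n - r)  # sampling variance of a logit proportion
    pooled = combine_effects(logit_p, var_logit)
    print(pooled.summary_frame())  # fixed- and random-effects pooled logits

    # Step 2: moderator analysis via inverse-variance weighted regression.
    X = sm.add_constant(reminders)
    print(sm.WLS(logit_p, X, weights=1 / var_logit).fit().params)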

We consider the demonstrated approach and the results to be of high importance for survey practitioners and survey statisticians. The developed models allow for evidence-based decision-making when (re-)designing a population survey.


Predicting Nonresponse in Surveys in Sweden 2015 to 2021: Cohort Replacement or Cooling Survey Climate?

Dr Sebastian Lundmark (University of Gothenburg) - Presenting Author
Mr Kim Backström (Åbo Akademi University)

Declining response rates have remained a major worry for survey research in the 21st century. In the past decades, the same decline in people’s willingness to participate in surveys (i.e., response propensities) has been seen across virtually all Western nations (Gummer, 2019; de Leeuw & de Heer, 2002), and the trend shows no signs of halting (Brick & Williams, 2013). Even more worrisome, declining response propensities increase the risk of extensive nonresponse bias (Hedlin, 2020; Groves, 2006). With response rates continuing to decline, a better understanding of which factors are associated with survey nonresponse, and of their impact on nonresponse bias, is paramount. Knowing which factors relate to low response propensities enables survey researchers to better model nonresponse weights and to identify the groups to which surveys should be tailored in order to decrease nonresponse bias efficiently. To that end, this study draws on previous theories and research on survey nonresponse and investigates nonresponse bias over time in two time-series cross-sectional studies administered in Sweden (the National SOM Surveys 2015-2021 and the Gothenburg SOM Surveys 2016-2021). Registry data on all sampled persons and their corresponding neighborhood-level contextual data are leveraged to model response propensities and to better understand the factors leading to nonresponse. Furthermore, utilizing the time-series aspect of the data, the manuscript details whether declining response rates are a function of a changing survey climate or of changes within or between cohorts in terms of age and new immigration patterns. The preliminary findings indicated that nonresponse bias increased over time and that this increase was more likely due to a change in the survey climate than to cohort replacement.
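
The general propensity-modelling approach described above can be sketched as follows; the registry covariates, variable names, and data are all hypothetical, and the snippet is not the authors' code.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(42)
    n = 5000
    # Hypothetical registry data, known for respondents and nonrespondents.
    sample = pd.DataFrame({
        "age": rng.integers(18, 85, n),
        "foreign_born": rng.integers(0, 2, n),
        "urban_area": rng.integers(0, 2, n),
    })
    # Simulated response indicator: older, native-born persons respond more.
    lin = -1.0 + 0.03 * sample["age"] - 0.8 * sample["foreign_born"]
    sample["responded"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

    # Response propensity model estimated on the full sample.
    fit = smf.logit("responded ~ age + foreign_born + urban_area",
                    data=sample).fit(disp=False)
    sample["propensity"] = fit.predict(sample)

    # Nonresponse weights for respondents: inverse predicted propensity,
    # normalised to a mean of one.
    resp = sample[sample["responded"] == 1].copy()
    resp["weight"] = 1 / resp["propensity"]
    resp["weight"] /= resp["weight"].mean()
    print(resp["weight"].describe())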


Open-Ended Questions to Close

Mrs Marion Elie (CDSP - Sciences Po) - Presenting Author
Mrs Emmanuelle Duwez (CDSP - Sciences Po)

The ELIPSS Panel is a French probability-based web panel that includes people over 18 years old living in metropolitan France. The management of this panel has two main objectives: to maintain an active panel (a high response rate) and to limit its attrition (the departure of panelists) as much as possible. To achieve this, various strategies are used, such as follow-up contacts (by phone, email, and mail), quantitative monitoring of survey results and response rates, and taking into account other data collected during the surveys.

At the end of each questionnaire, panelists have the opportunity to express themselves through open-ended questions, noting any concerns, difficulties, or other remarks in free-text fields. After the responses are gathered, quantitative and qualitative analyses are carried out on the quality of the questionnaires, the content of the responses left by the panelists, and the time spent answering the survey. This information allows us not only to provide feedback to the survey teams, but also to understand what can be improved in the questionnaires.