
ESRA 2023 Glance Program

All time references are in CEST

Adapting survey mode in a changing survey landscape: Experiences from repeat cross-national, cross-sectional, and general social surveys 3

Session Organisers Ms Siobhan O'Muircheartaigh (European Social Survey ERIC (City, University of London))
Mr Tim Hanson (European Social Survey ERIC (City, University of London))
Dr Oshrat Hochman (GESIS)
Dr Rene Bautista (NORC at the University of Chicago)
Professor Rory Fitzgerald (European Social Survey ERIC (City, University of London))
Time: Tuesday 18 July, 16:00 - 17:00
Room: U6-01b

Studies to measure attitudes, opinions, and behaviours have been and remain critical to understanding societies around the world. In the face of the COVID-19 pandemic and changing trends in the interviewer workforce, many repeat cross-sectional social surveys have been experimenting with self-completion and mixed-mode approaches. The European Social Survey (launched in 2001) and the United States’ General Social Survey (launched in 1972) are key examples of long-standing studies that collect data to inform research on changes over time, and both are now exploring new modes for the future. This session brings together the ESS, GSS, and other cross-sectional social surveys to share experiences of survey mode transition.
The session's aims are threefold: (1) share strategies and lessons learned from recent mode experiments by the ESS, GSS, and other studies, and potential ways to improve methods in future; (2) highlight how different cross-sectional studies have modified survey protocols in recent years to adapt to changing conditions among the public (e.g., public health crises, shifting communication modes, the public’s willingness to respond to surveys, trends in the interviewer workforce); (3) provide a space for data creators, data users, and survey practitioners to discuss methodological and statistical challenges for cross-sectional studies considering such a move.
We invite submissions from those involved in transitioning repeat, cross-sectional, and cross-national social surveys to new data collection approaches. Topics of interest include: results from pilots or feasibility studies based on self-completion or mixed-mode approaches; findings from experimental research testing aspects of self-completion/mixed-mode designs (e.g., incentive and mailing strategies, survey length adaptations, sequential vs. concurrent designs); impacts of mode switches on measurement and survey time series; and discussions of experiences and challenges associated with adapting cross-sectional surveys to new modes across different cultural/national contexts.


Devising an Optimal Mixed-Mode Data Collection Strategy for the US General Social Survey [GSS]

Dr Colm O'Muircheartaigh (The University of Chicago) - Presenting Author
Mr Ned English (NORC)

This paper compares two approaches to optimizing field strategy for surveys using both face-to-face (f2f) and mail push to web (web push) modes of data collection. The results are based on outcomes for a large-scale national cross-sectional attitude survey [the US General Social Survey (GSS)].

Based on the experience of the 2020 GSS in which, due to the Covid pandemic, all data collection was switched to remote modes (primarily web push), a large-scale experiment was devised and implemented in the 2022 GSS. Strategy 1: for a sample of 3,844 addresses, all addresses were approached in person following the standard GSS f2f protocol used prior to the Covid pandemic. Towards the end of the field period, remaining nonresponding addresses were followed up using mail push to web. Strategy 2: In the first phase, for a sample of 11,168 addresses, all were approached using mail push to web. A subsample of 1 in 5 nonrespondents from the first phase was followed up in person, again using the standard GSS f2f protocol.

We address four questions in relation to the comparison of the two strategies: (1) is the composition of the respondent sets the same; (2) how do the weighted response rates compare; (3) what is the effective sample size for each of the strategies; and (4) given different cost assumptions for remote and in-person data collection, which design optimizes precision per unit cost? Our paper will be of interest to designers and users of multi-mode surveys.
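Question (3) turns on how unequal weights reduce precision. One common summary statistic, which the abstract does not specify and which is simpler than the GSS's actual variance estimation, is Kish's effective sample size, n_eff = (Σw)² / Σw². A minimal illustrative sketch in Python:

```python
def effective_sample_size(weights):
    """Kish's effective sample size: (sum of weights)^2 / (sum of squared weights).

    Equals n when all weights are equal; shrinks as weights become more variable,
    reflecting the loss of precision from unequal weighting.
    """
    total = sum(weights)
    total_sq = sum(w * w for w in weights)
    return total * total / total_sq

# Equal weights: no precision loss, n_eff equals n.
print(effective_sample_size([1.0] * 100))  # 100.0

# Unequal weights (e.g., after nonresponse adjustment) reduce n_eff below n.
print(effective_sample_size([1.0] * 50 + [3.0] * 50))  # 80.0
```

Comparing n_eff across the two field strategies, rather than raw respondent counts, gives a weight-adjusted basis for the precision-per-unit-cost comparison the authors describe.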

Consequences of Moving from RDD to Web-based Probability Panels

Dr Ashley Amaya (Pew Research Center) - Presenting Author
Dr Patrick Moynihan (Pew Research Center)
Dr Laura Silver (Pew Research Center)

Data collection that required in-person work from print and mail vendors, phone interviewers, or field interviewers became difficult at the onset of the coronavirus pandemic. This abrupt societal shift, along with slower but persistent shifts in the survey environment (e.g., rising costs and falling response rates), demanded that researchers test, if not immediately adopt, alternative data collection methods.
In many cases, researchers shifted from dual-frame random-digit-dial (DFRDD) telephone surveys to web-based probability panels. While significant research has been conducted on the data quality of web-based probability panels, it has been limited to a few countries (and even fewer within Europe). It has also focused primarily on biases in single point estimates. This is insufficient for researchers interested in understanding how a frame and mode shift may affect comparative trends over time or across countries.
In this presentation, we ask (1) whether web-based probability panels yield attitudinal measures comparable to DFRDD samples, (2) whether differences between the panels and DFRDD vary by country, and (3) what the implications of these differences are for the study of trends over time and comparisons across countries.
We use DFRDD and panel data collected at similar time points using similar questions in three countries (the United Kingdom, Germany, and France). We review differences between the two data sources within each country, compare the level of differences across countries, and use published examples to assess how conclusions would have been changed had panel data (as opposed to DFRDD data) been used in the analysis.

The Transition to Mixed Mode in the Generations and Gender Survey: An Overview and Lessons Learnt

Dr Arieke Rijken (Netherlands Interdisciplinary Demographic Institute)
Dr Siyang Kong (Netherlands Interdisciplinary Demographic Institute)
Dr Wojciech Jablonski (Netherlands Interdisciplinary Demographic Institute) - Presenting Author
Dr Olga Grünwald (Netherlands Interdisciplinary Demographic Institute)

The Generations and Gender Survey (GGS) provides longitudinal and cross-national data on population and family dynamics. In 2020, the GGS began a new round of data collection, GGS-II, with a new sample in each participating country. A main feature of GGS-II is its mixed-mode data collection. Face-to-face interviewing, as used in GGS-I, was and still is the dominant mode for social science surveys. However, face-to-face surveys suffer from increasing costs and declining response rates, while evidence for the feasibility of web surveys is growing. Moreover, with the onset of the COVID-19 pandemic, face-to-face interviews became hardly feasible due to potential health hazards to respondents and interviewers. Mixed-mode designs can provide greater flexibility and speed when timely information is crucial and face-to-face surveys are difficult or even impossible to implement.
So far, a total of 18 countries/territories have participated in or secured funding for GGS-II Wave 1. The majority of GGS-II countries/territories used self-completion web interviewing as the main mode of data collection. In some countries, face-to-face interviewing, telephone interviewing, or paper-and-pencil interviewing was used as a fallback to reduce nonresponse and potential response bias. Two countries randomly assigned respondents to mode (web and another mode), one in a pilot experiment and one in a full wave. A few countries conducted only face-to-face interviews for practical reasons such as internet penetration and speed. In this paper, we discuss in detail the mode, recruitment strategy, and response rate of each GGS-II country. We also share the challenges and lessons learned in using mixed-mode designs to conduct cross-national social surveys.

Evaluating Mode Effects and Response Trends using the 2016-2022 General Social Survey

Ms Jodie Smylie (NORC at the University of Chicago) - Presenting Author
Mr Brian M. Wells (NORC at the University of Chicago)
Mr René Bautista (NORC at the University of Chicago)

The General Social Survey (GSS) is a nationally representative survey, historically conducted face-to-face every two years, that measures the attitudes and opinions of the general public in the United States. Because the COVID-19 pandemic was disruptive for surveys that relied on face-to-face interviewing, the GSS out of necessity adjusted its historical design in favor of new data collection modes to continue collecting data during this challenging and critical time for measuring public opinion. Given that face-to-face interviews could not be conducted safely during the pandemic, the GSS was redesigned as a self-administered web survey (supplemented with phone interviews) for collection in late 2020 and 2021. The shift to primarily web-based collection allowed respondents to skip questions more easily and to answer sensitive questions without interviewer moderation. In 2022, to fully explore mode effects and better understand the 2021 GSS data, the GSS reintegrated parts of its historical face-to-face design in a formalized multi-mode experiment that included web self-administration, face-to-face interviews, and phone interviews. These changes in the 2021 and 2022 GSS cross-sections will allow users to explore the full impact of the transition to web on survey responses and data quality.

In this presentation, we analyze data from GSS 2016 and 2018 (pre-web transition) compared with 2021 and 2022 to explore differences in responses and item missingness across modes. We examine dozens of variables across multiple topic areas including politics, religious self-identification, trust in institutions, social connections, and demographics. We will attempt to identify what types of response trends were meaningfully altered in 2021 and 2022 as compared with previous years. This analysis will provide survey researchers with a thorough examination of potential trend shifts and implications for data quality.