
ESRA 2023 Glance Program


All time references are in CEST

Boost that respondent motivation! 2

Session Organisers: Dr Marieke Haan (University of Groningen)
Dr Yfke Ongena (University of Groningen)
Time: Thursday 20 July, 16:00 - 17:30
Room: U6-08

Conducting surveys is harder than ever before: the overwhelming number of surveys has led to survey fatigue, and people generally feel less of an obligation to participate in surveys. The downward trend in survey response rates is a major threat to conducting high-quality surveys, because it introduces the potential for nonresponse bias and thus distorted conclusions. Moreover, even when respondents decide to participate, they may be reluctant to disclose information for reasons such as dislike of the topic, finding questions too sensitive or too hard, or annoyance at the length of the survey.

Therefore, surveyors need to come up with innovative strategies to motivate (potential) respondents to participate. These strategies may be designed for the general population but can also be targeted at specific hard-to-survey groups. For instance, machine learning methods may improve data collection processes (Buskirk & Kircher, 2021), the survey setting can be made more attractive (e.g., by using interactive features or videos), and reluctance to disclose sensitive information may be reduced by using face-saving question wording (Daoust et al., 2021).

In this session we invite you to submit abstracts on strategies that may help to boost respondent motivation. On the one hand, abstracts can focus on motivating respondents to start a survey; on the other hand, we also welcome abstracts that focus on survey design that prevents respondents from dropping out or giving suboptimal responses. More theoretically based abstracts, such as literature reviews, also fit within this session.

Keywords: nonresponse, innovation, motivation

Papers

What factors influence participation in citizens’ workshops and follow-up surveys?

Ms Elke Himmelsbach (Kantar Public) - Presenting Author
Mrs Ina Metzner (nexus Institut)
Ms Maria Jacob (nexus Institut)
Dr Sophia McDonnell (Kantar Public)
Dr Daniel Guagnin (nexus Institut)
Dr Josef Hartmann (Kantar Public)

Longitudinal surveys with random sampling make it possible to understand who the non-participants are and to adjust for their absence. This research focuses on dropout in citizens' workshops as well as in follow-up surveys; it was funded by the Federal Office for Radiation Protection (BfS) and implemented by Kantar Public and nexus Institut.

The main research purpose of the evaluation was to understand whether and how deliberative dialogues are effective in increasing issue salience and individual competence in risk assessment for complex, scientific issues, such as 5G radiation and health risks. Another goal was to summarise lessons learned about the sample design and to identify issues that could be improved if the workshops were to be repeated.

The gross sample was drawn at random by the communal registration offices of four German cities. The target persons received postal invitations to take part in a citizens' workshop on the issue of 5G and health. Everyone who responded with a valid telephone number was contacted for a telephone interview as a precondition for being eligible to participate in the workshop. We collected 246 interviews of 25 minutes' duration in July and August 2022. Of these respondents, 134 showed up and participated in one of the four citizens' workshops. This provides the basis for comparing participants and non-participants on a range of variables, including socio-demographic as well as content-specific characteristics.

In addition, we are running three follow-up waves of telephone interviews with all 134 participants after the event. This will be another opportunity to investigate the differences and, hopefully, the drivers of participation in follow-up surveys. We will present results on factors influencing participation, measures to increase it, and approaches to adjust for nonresponse.
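As a purely illustrative sketch (not the authors' code), the comparison of participants and non-participants could take the form of a simple participation model. The file name and variables below (participated, age, gender, issue_salience) are assumptions, not the study's actual data.

# Hypothetical sketch: comparing workshop participants with non-participants
# among the telephone-interview respondents. All names below are assumed.
import pandas as pd
import statsmodels.formula.api as smf

# one row per recruitment interview; 'participated' = 1 if the person attended a workshop
df = pd.read_csv("interviews.csv")

# logistic regression of workshop participation on socio-demographic and
# content-specific characteristics collected in the telephone interview
model = smf.logit("participated ~ age + C(gender) + issue_salience", data=df)
result = model.fit()
print(result.summary())  # coefficients indicate which factors predict showing up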


The Role of Respondent Motivation on Item Nonresponse for Split-Ballot Survey Data

Dr Melike Sarac (Hacettepe University Institute of Population Studies) - Presenting Author


Willingness to participate in a survey and the provision of accurate answers are as important as high interviewer performance. The impact of respondents on data quality is usually explored for questions designed to measure attitudes, values, and beliefs rather than demographic characteristics, daily practices, and behaviours. This study aims to investigate the effect of respondent motivation on item nonresponse for a set of questions designed with the split-ballot technique. To this end, five countries (France, United Kingdom, Norway, Netherlands, Portugal) were selected from the European Social Survey (Round 9). Within the survey, different versions of questions designed to reveal gender differences were asked of randomly selected sub-groups. In this study, “don’t know” and “no answer” responses were counted as item nonresponse, while respondent motivation was measured through interviewer evaluations of each respondent. Estimates adjusted for the complex sample design were produced in the multivariate part of the study. Descriptive findings showed a significant, moderate negative relationship between motivation and item nonresponse (Pearson’s coefficient: -0.24, p<0.01). Multivariate analyses controlling for interviewer and respondent characteristics found that a one-point increase in respondent motivation leads to a reduction in item nonresponse (about 3 items in France and almost 2 items in the United Kingdom, Norway, the Netherlands, and Portugal). The findings point to the need to keep respondent motivation high during the interview and to the value of interviewer assessments in explaining item nonresponse. Moreover, questions designed with the split-ballot technique appear prone to item nonresponse, so additional alerts for such questions would be useful. Given the European Social Survey’s recent propensity to switch data collection modes, these alerts should be designed with the data collection method in mind.
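As a purely illustrative sketch (not the author's code), the descriptive part of this analysis could be approximated as follows. The file name, column names, and missing-value codes are assumptions rather than the ESS codebook, and the paper additionally adjusts for the complex sample design, which is omitted here.

# Hypothetical sketch: item nonresponse on split-ballot items vs. interviewer-assessed motivation
import pandas as pd
from scipy.stats import pearsonr
import statsmodels.formula.api as smf

df = pd.read_csv("ess_round9_subset.csv")       # assumed extract, one row per respondent

split_items = ["item_a", "item_b", "item_c"]    # hypothetical split-ballot items
dk_na_codes = [77, 88]                          # assumed "don't know"/"no answer" codes

# item nonresponse = number of DK/NA answers per respondent
df["item_nonresponse"] = df[split_items].isin(dk_na_codes).sum(axis=1)

# descriptive association between motivation and item nonresponse
r, p = pearsonr(df["motivation"], df["item_nonresponse"])
print(f"Pearson r = {r:.2f}, p = {p:.3f}")

# unweighted regression controlling for respondent characteristics (design adjustment omitted)
ols = smf.ols("item_nonresponse ~ motivation + age + C(gender)", data=df).fit()
print(ols.params["motivation"])                 # expected sign: negative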


Experimental evidence on the effect of motivational statements to reduce item non-response in an opt-in panel

Dr Alessandra Gaia (University of Milano-Bicocca) - Presenting Author
Professor Emanuela Sala (University of Milano-Bicocca)
Dr Chiara Respi (University of Milano-Bicocca)
Professor Guido Legnante (University of Pavia)

Motivational statements are often used in web surveys to simulate interviewer presence, with the aim of minimising item non-response and non-response bias. We analyse the effect of motivational statements and assess whether they have any detrimental effect on drop-off rates, non-response to subsequent items, and consent to linkage with Twitter data. To address our aims, we use randomised experimental data from a survey on attitudes towards passive data extraction implemented in an opt-in panel of the Italian population (N=2,249). A random subsample of respondents who did not provide a valid answer to a sensitive question on which political party they voted for at the most recent election received a motivational statement; this was a privacy reassurance (if respondents selected “prefer not to say”) or an invitation to try to recall the relevant information (if they selected “don’t know”). The control group did not receive any motivational statement. We assess: i) whether motivational statements increase the number of valid responses; ii) whether the response distribution differs with and without the motivational statements, and how it compares with the “true” voting turnout and election outcome; iii) whether motivational statements have any impact on drop-offs, non-response to subsequent items, and consent to data linkage; and iv) whether the effect of motivational statements is moderated by socio-demographic characteristics and attitudes towards sharing data online. The study contributes to the literature in a novel way: first, it includes covariates on respondents’ attitudes towards privacy; second, it compares the voting distribution with “true” electoral outcomes; third, it assesses the effectiveness of motivational statements in opt-in panels, where these prompts are not as widely adopted as in large-scale probability-based studies. The implications of the empirical results for survey practice are discussed.
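As a minimal, purely illustrative sketch (not the authors' code), aim i) could be checked by comparing the share of valid answers between the experimental groups; the file and variable names below are assumptions.

# Hypothetical sketch: did the motivational statement increase valid responses?
import pandas as pd
from statsmodels.stats.proportion import proportions_ztest

df = pd.read_csv("panel_experiment.csv")        # assumed export, one row per experimental respondent

treated = df[df["group"] == "statement"]        # received a motivational statement
control = df[df["group"] == "control"]          # no statement

# number of valid vote-choice answers after the prompt, by experimental group
counts = [treated["valid_answer"].sum(), control["valid_answer"].sum()]
nobs = [len(treated), len(control)]

stat, pval = proportions_ztest(counts, nobs)    # two-sample test of proportions
print(f"z = {stat:.2f}, p = {pval:.3f}")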