All time references are in CEST
Boost that respondent motivation!
Session Organisers | Dr Marieke Haan (University of Groningen), Dr Yfke Ongena (University of Groningen)
Time | Thursday 20 July, 09:00 - 10:30 |
Room | U6-01f |
Conducting surveys is harder than ever before: the overwhelming number of surveys has led to survey fatigue, and people generally feel less obliged to participate. The downward trend in survey response rates is a major threat to conducting high-quality surveys, because it introduces the potential for nonresponse bias and thus distorted conclusions. Moreover, even when respondents decide to participate, they may be reluctant to disclose information because they dislike the topic, find questions too sensitive or too hard, or are annoyed by the length of the survey.
Surveyors therefore need to come up with innovative strategies to motivate (potential) respondents to participate. These strategies may be designed for the general population but can also be targeted at specific hard-to-survey groups. For instance, machine learning methods may improve data collection processes (Buskirk & Kirchner, 2021), the survey setting can be made more attractive (e.g., by using interactive features or videos), and reluctance to disclose sensitive information may be reduced by using face-saving question wording (Daoust et al., 2021).
In this session we invite you to submit abstracts on strategies that may help to boost respondent motivation. On the one hand, abstracts can focus on motivating respondents to start a survey; on the other hand, we also welcome abstracts that focus on survey design that prevents respondents from dropping out or giving suboptimal responses. More theoretically oriented abstracts, for example literature reviews, also fit within this session.
Keywords: nonresponse, innovation, motivation
Dr Andre Pirralha (LIfBi) - Presenting Author
Dr Roman Auriga (LIfBi)
Dr Friederike Schlücker (LIfBi)
Dr Götz Lechner (LIfBi)
Ms Anna Passmann (LIfBi)
Retaining longitudinal survey participants and keeping them engaged is important because it directly impacts both the quality and the cost of survey data. Research has shown that unconditional incentives, given before the respondent answers the survey, are effective in promoting higher response rates. While between-wave contacts in longitudinal surveys are originally designed to track panel members, they can also have a differential impact on response rates. Whereas the literature on incentives is large, very few studies explore the effects of between-wave contacts on response rates. Furthermore, to the best of our knowledge, no study has focused on the long-term impact of these contacts on response rates, nor on their effects on the response rates of other members of the household.
In this paper, we evaluate the impact of between-wave keeping-in-touch mailings on response rates, both on their own and combined with tailored gifts and incentives, using data from the German National Educational Panel Study (NEPS). We designed a randomized survey experiment in which schoolchildren were randomly assigned to five different treatment conditions. The effect of the treatment on the response rate is assessed both in the short term, in the subsequent wave (t+1), and in the long run, two waves after the treatment (t+2). In addition, we assess whether between-wave contacts affect the response rate of parents, who are also asked to participate in the NEPS survey. Preliminary results show that between-wave mailings combined with incentives have a significant positive effect on the schoolchildren’s response rate. Finally, we also discuss the steps taken to tailor both the incentives and the communication materials to the specific target groups of schoolchildren and parents, aiming to optimize respondent engagement.
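To illustrate how the response-rate contrasts in such a design can be computed, here is a minimal sketch in Python; the condition labels, group sizes, and response data are invented for illustration and are not NEPS results.

```python
# Hypothetical sketch: comparing wave t+1 response rates across
# between-wave contact conditions (labels and data are invented).
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
conditions = ["control", "mailing", "mailing+gift",
              "mailing+incentive", "mailing+gift+incentive"]
n_per_group = 400  # hypothetical group size

# Simulate random assignment and a binary response indicator at wave t+1.
df = pd.DataFrame({
    "condition": np.repeat(conditions, n_per_group),
    "responded_t1": rng.binomial(1, 0.65, n_per_group * len(conditions)),
})

# Response rate per experimental condition.
print(df.groupby("condition")["responded_t1"].mean())

def two_prop_ztest(success_a, n_a, success_b, n_b):
    """z statistic for the difference between two proportions."""
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = np.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (success_a / n_a - success_b / n_b) / se

# One treatment arm against the control arm.
a = df[df.condition == "mailing+incentive"]["responded_t1"]
b = df[df.condition == "control"]["responded_t1"]
print("z =", two_prop_ztest(a.sum(), len(a), b.sum(), len(b)))
```

The same contrast can be re-run on a wave t+2 response indicator for the long-run effect, or on the parents’ response indicator to check for spillover within the household.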
Miss Emma Zaal (University of Groningen) - Presenting Author
Dr Yfke Ongena (University of Groningen)
Professor John Hoeks (University of Groningen)
Socially Desirable Responding (SDR) arises when people want to portray a better version of themselves than how they actually behave. SDR negatively affects survey data because it biases responses towards what respondents perceive as more socially acceptable. When questions touch upon delicate and sensitive topics, respondents are more likely to provide socially desirable reports. For example, in response to questions on norm-compliant behavior (e.g., compliance with COVID-19 restrictions) respondents are more likely to overreport, while norm-noncompliant behavior (e.g., substance use) is more likely to be underreported. Over the years, scholars from different backgrounds have developed a variety of strategies aimed at reducing socially desirable responses in surveys. To date, it is still unclear which strategies work best to obtain answers that are not, or only minimally, biased towards societal approval. A promising new method to overcome SDR is proposed by Daoust et al. (2021), who show in experiments from 12 countries that participants are more likely to admit noncompliance with COVID-19 restrictions when guilt-free strategies are used: a brief preamble (introduction) is added to the question, stating that other people also engage in non-compliant behavior, and the response options are manipulated by adding a “face-saving” option (e.g., adding an “Only when necessary” option to the Yes/No answer options). We carried out a replication study and included two additional behavioral topics (sustainable behavior and responsible driving behavior) to examine whether face-saving strategies can also be effective in reducing SDR beyond questions on compliance with COVID-19 restrictions. In addition, compared to Daoust’s experimental design, we added a condition that tests whether adding answer options without a preamble is also effective in reducing SDR.
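As an illustration of the two manipulated design factors described above (guilt-free preamble and face-saving response option), here is a minimal sketch of the resulting 2x2 experimental cells; the wording, condition structure, and names are hypothetical stand-ins, not the authors’ actual materials.

```python
# Hypothetical sketch of a 2x2 face-saving question-design experiment:
# factor 1 = guilt-free preamble present/absent,
# factor 2 = face-saving response option present/absent.
import itertools
import random

PREAMBLE = ("Many people find it hard to always follow the rules, "
            "and most admit to bending them occasionally.")
BASE_OPTIONS = ["Yes", "No"]
FACE_SAVING_OPTION = "Only when it was really necessary"

def build_question(with_preamble: bool, with_face_saving: bool) -> dict:
    """Assemble one experimental version of the question."""
    options = BASE_OPTIONS + ([FACE_SAVING_OPTION] if with_face_saving else [])
    return {
        "preamble": PREAMBLE if with_preamble else "",
        "question": "Did you always comply with the COVID-19 restrictions?",
        "options": options,
    }

# The four cells of the design; respondents are randomized across them.
conditions = list(itertools.product([False, True], repeat=2))
assigned = random.choice(conditions)
print(build_question(*assigned))
```

Comparing admission rates across these cells then separates the effect of the preamble from the effect of the extra response option, which is exactly the contrast the added no-preamble condition targets.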
Ms Elke Himmelsbach (Kantar Public) - Presenting Author
Mrs Ina Metzner (nexus Institut)
Ms Maria Jacob (nexus Institut)
Dr Sophia McDonnell (Kantar Public)
Dr Daniel Guagnin (nexus Institut)
Dr Josef Hartmann (Kantar Public)
Longitudinal surveys with random sampling make it possible to understand who the non-participants are and to adjust for nonresponse. This research specifically concerns dropouts in citizen workshops as well as in follow-up surveys, and is based on research funded by the Federal Office for Radiation Protection (BfS) and implemented by Kantar Public and nexus Institut.
The main purpose of the evaluation was to understand whether and how deliberative dialogues are effective in increasing issue salience and individual competence in risk assessment for complex scientific issues, such as 5G radiation and health risks. Another goal was to summarise lessons learned about sample design and to identify issues that could be improved if these workshops were to be repeated.
The gross sample was drawn at random by the municipal registration offices of four German cities. The target persons received postal invitations to take part in a citizen workshop on the issue of 5G and health. Everyone who responded with a valid telephone number was contacted for a telephone interview as a precondition for being eligible to participate in the workshop. We conducted 246 interviews of 25 minutes’ duration in July and August 2022. Of these respondents, 134 showed up and participated in one of the four citizen workshops. This forms the basis for comparing participants and non-participants on a range of variables, including socio-demographic as well as content-specific characteristics.
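A minimal sketch of the kind of participant/non-participant comparison described above; the covariates, scales, and data below are invented for illustration and are not the study’s variables.

```python
# Hypothetical sketch: comparing workshop participants (n=134) with
# interviewed non-participants (n=112) on recruitment-interview variables.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 246  # interviews collected

# One row per telephone interview; covariates are invented.
df = pd.DataFrame({
    "participated": rng.permutation(
        np.r_[np.ones(134), np.zeros(112)]).astype(bool),
    "age": rng.normal(50, 15, n).round(),
    "female": rng.binomial(1, 0.5, n),
    "concern_5g": rng.integers(1, 6, n),  # invented 1-5 concern scale
})

# Group means per variable reveal where the two groups differ.
summary = (df.groupby("participated")
             .agg(n=("participated", "size"),
                  mean_age=("age", "mean"),
                  share_female=("female", "mean"),
                  mean_5g_concern=("concern_5g", "mean")))
print(summary)
```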
In addition, we are running three follow-up waves of telephone interviews with all 134 participants after the event. This provides another opportunity to investigate the differences between respondents and, we hope, the drivers of participation in follow-up surveys. We will present results on factors influencing participation, on measures to increase it, and on adjusting for nonresponse.
Dr Melike Sarac (Hacettepe University Institute of Population Studies) - Presenting Author
Respondents’ willingness to participate in surveys and to state accurate answers is as important as high interviewer performance. The impact of respondents in surveys is usually explored for questions designed to measure attitudes, values, and beliefs rather than demographic characteristics, daily practices, and behaviors. This study aims to investigate the effect of respondent motivation on item nonresponse for a set of questions designed with the split-ballot technique. To this end, five countries (France, the United Kingdom, Norway, the Netherlands, and Portugal) were selected from the European Social Survey (Round 9). Within the survey, different versions of questions designed to reveal gender differences were asked of randomly selected sub-groups. In this study, “don’t know” and “no answer” response options were counted as item nonresponse. Respondent motivation, in turn, is measured through interviewer evaluations of each respondent. Estimates adjusted for the complex sample design were produced in the multivariate part of the study. Descriptive findings showed a significant moderate negative relationship between motivation and item nonresponse (Pearson’s coefficient: -0.24, p < 0.01). Multivariate analyses controlling for interviewer and respondent characteristics found that a one-point increase in respondent motivation is associated with a reduced level of item nonresponse (about 3 items in France, and almost 2 items in the United Kingdom, Norway, the Netherlands, and Portugal). The findings point to the need to keep respondent motivation at a high level during the interview, and to the value of interviewer assessments in explaining item nonresponse. Moreover, questions designed with the split-ballot technique seem prone to item nonresponse; introducing alerts for such questions would therefore be useful. Considering the European Social Survey’s recent propensity to switch modes, these alerts should be designed according to the data collection method.
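As a sketch of the core measure and the bivariate check reported above, the following illustrates counting item nonresponse per respondent and correlating it with interviewer-rated motivation; the missing-value codes, scales, and data are hypothetical, not the ESS codebook values.

```python
# Hypothetical sketch: per-respondent item nonresponse count and its
# correlation with interviewer-rated motivation (invented data/codes).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_resp, n_items = 500, 12
DK, NA = 88, 99  # hypothetical "don't know" / "no answer" codes

# Simulated item matrix: substantive answers 1-5 plus occasional DK/NA.
answers = rng.integers(1, 6, size=(n_resp, n_items))
mask = rng.random((n_resp, n_items)) < 0.05
answers[mask] = rng.choice([DK, NA], size=mask.sum())

# Item nonresponse = number of DK/NA answers per respondent.
item_nonresponse = np.isin(answers, [DK, NA]).sum(axis=1)

# Interviewer-assessed motivation on an invented 1-10 scale, built so
# that higher motivation goes with fewer missing items.
motivation = np.clip(10 - item_nonresponse + rng.normal(0, 2, n_resp), 1, 10)

r, p = pearsonr(motivation, item_nonresponse)
print(f"Pearson's r = {r:.2f}, p = {p:.3f}")
```

In the study itself this bivariate check is followed by multivariate models with interviewer and respondent controls and complex-sample adjustments, which the sketch does not attempt to reproduce.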
Miss Mara Verheijen (Centerdata) - Presenting Author
Mr Joris Mulder (Centerdata)
Mr Joost Leenen (Centerdata)
Online panels are nowadays routinely used around the world as a method for gathering survey data for many purposes, and survey research relies heavily on panel members and their responses. Non-response bias therefore poses a serious threat to drawing reliable conclusions from survey data collected in online panels, as non-respondents can differ from respondents in their characteristics and attitudes.
Even though surveys in the Dutch LISS panel (Longitudinal Internet studies for the Social Sciences) also face the risk of potential survey fatigue and nonresponse, several measures have been successfully deployed to mitigate these risks and keep respondent participation high. The LISS panel, active since 2007, is the only panel in the Netherlands based on a true probability sample of households drawn from the population register by Statistics Netherlands. Households not included in the sample cannot participate, so no self-selection can take place. The panel consists of 5,000 households, comprising approximately 7,500 individuals. The monthly response rate lies between 70% and 85%, depending on the survey topic and the target group. Respondent attrition is about 10 to 12% per year.
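To make the compounding effect of a constant annual attrition rate concrete, a small worked example; the rates below simply bracket the 10-12% range quoted above.

```python
# Worked example: cumulative panel retention implied by a constant
# annual attrition rate (rates chosen to bracket the quoted 10-12%).
for annual_attrition in (0.10, 0.12):
    for years in (1, 3, 5):
        retained = (1 - annual_attrition) ** years
        print(f"attrition {annual_attrition:.0%}, "
              f"after {years} yr(s): {retained:.1%} of members remain")
```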
In this presentation we discuss the initial procedures for setting up such a highly responsive panel (i.e., sampling and recruitment), how to maintain the panel, and how to keep panel members motivated. We show the results of several incentive experiments: how incentives help during the recruitment phase, how they can prevent attrition during panel participation, and their effects on overall response. We will also briefly discuss what causes attrition in the LISS panel, and the effects and results of our ‘sleepers study’, an experiment on stimulating and retaining participation in the LISS panel (via letters, incentives, feedback, and language use). Finally, we will touch on applying machine learning to predict optimal survey length.