All time references are in CEST
Boost that respondent motivation!

Session Organisers: Dr Marieke Haan (University of Groningen), Dr Yfke Ongena (University of Groningen)
Time: Friday 21 July, 09:00 - 10:30
Room: U6-11
Conducting surveys is harder than ever before: the overwhelming number of surveys has led to survey fatigue, and people generally feel less obliged to participate. The downward trend in survey response rates is a major threat to conducting high-quality surveys, because it introduces the potential for nonresponse bias, leading to distorted conclusions. Moreover, even when respondents decide to participate, they may be reluctant to disclose information, for instance because they dislike the topic, find questions too sensitive or too difficult, or are annoyed by the length of the survey.
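To make the threat concrete, the nonresponse bias of a respondent mean is often written in response-propensity terms. The formulation below is a standard textbook approximation, added here for reference rather than taken from the session description:

$$
\operatorname{Bias}(\bar{y}_r) \;\approx\; \frac{\operatorname{Cov}(\rho, y)}{\bar{\rho}},
$$

where $\rho_i$ is person $i$'s propensity to respond, $y_i$ is the survey variable, and $\bar{\rho}$ is the mean propensity. Falling response rates are damaging precisely when the propensity to respond is correlated with the variable of interest; a low response rate with $\operatorname{Cov}(\rho, y) \approx 0$ need not bias the estimate.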
Surveyors therefore need to come up with innovative strategies to motivate (potential) respondents to participate. These strategies may be designed for the general population but can also be targeted at specific hard-to-survey groups. For instance, machine learning methods may improve data collection processes (Buskirk & Kircher, 2021), the survey setting can be made more attractive (e.g., by using interactive features or videos), and reluctance to disclose sensitive information may be reduced by using face-saving question wording (Daoust et al., 2021).
In this session we invite you to submit abstracts on strategies that may help to boost respondent motivation. Abstracts can focus on motivating respondents to start a survey, but we also welcome abstracts on survey design that prevents respondents from dropping out or giving suboptimal responses. More theoretically oriented abstracts, for example literature reviews, also fit within this session.
Keywords: nonresponse, innovation, motivation
Ms Saskia Bartholomäus (GESIS Leibniz Institute for the Social Sciences) - Presenting Author
Mr Tobias Gummer (GESIS Leibniz Institute for the Social Sciences)
Having a positive experience when answering a survey can have a positive impact on participation rates and data quality. While previous research has focused on how question and questionnaire design elements such as overall length, comprehensibility of questions, and visual layout affect the respondents’ experience, much less attention has been paid to a questionnaire’s content itself. Prior studies have shown that interest in a topic matters for participation decisions and for providing high-quality answers. Yet little is known about how strategically provided content can be used to motivate respondents. Although researchers might be limited in changing the content of a survey, including a module of additional selected questions seems feasible. Consequently, with the present study, we aim to answer the research question of whether it is possible to change respondents’ survey experience by intentionally changing a survey’s content. To investigate this question, we conducted a non-probability web survey among 1,097 respondents of an online access panel in Germany. After completing a module with questions on politics, respondents were randomly assigned to either a second module of questions on politics or a module with questions on their subjective well-being. Following the experimental manipulation, we measured the respondents’ survey experience. Preliminary results indicate that the modules differed in how interesting and diverse they were perceived to be. The effects of the content were moderated by the respondents’ self-reported topic interests. For instance, respondents who reported a higher interest in answering political questions seem to have a better survey experience when answering two modules on politics than those with lower topic interest. In summary, we found a first indication that supplementing a questionnaire with content that is of interest to respondents can improve their overall survey experience. As exploiting these individual differences in survey experience could help to systematically improve data quality and increase participation among selected subgroups, we will discuss the implications of our findings for adaptive survey designs.
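A minimal sketch of the kind of design and moderation test described above; all column names, scales, and the simulated data are hypothetical placeholders, not the study’s materials or results:

```python
# Sketch: random assignment to a second module and a moderation test of
# topic interest. Everything here is illustrative, not the actual study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

n = 1097  # sample size reported in the abstract
df = pd.DataFrame({
    # 1 = second politics module, 0 = well-being module (random assignment)
    "politics_module": rng.integers(0, 2, size=n),
    # self-reported interest in political questions (hypothetical 1-7 scale)
    "topic_interest": rng.integers(1, 8, size=n),
})
# Placeholder outcome: perceived survey experience (hypothetical 1-7 rating).
df["experience"] = rng.normal(4, 1, size=n)

# Moderation: does the module's effect on experience depend on interest?
model = smf.ols("experience ~ politics_module * topic_interest", data=df).fit()
print(model.summary())
```

The interaction coefficient is the quantity of interest here: it estimates how the effect of receiving a second politics module changes with self-reported topic interest.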
Ms Marta Mezzanzanica (National Center for Social Research) - Presenting Author
Mr Curtis Jessop (National Center for Social Research)
Emails are a quick and cheap way to reach participants. However, the volume of messages individuals receive each day can lead to a sense of ‘email overload’ (Dabbish & Kraut, 2006), with recipients ignoring messages, not reading them fully, or not acting on them immediately.
Invitation emails typically invite participants to take part in surveys via a link that takes them to a survey landing page. This approach requires a click-through before the participant sees the survey and can start answering it. By embedding survey question(s) in the invitation emails, participants can start the survey directly from the message, reducing barriers to participation.
Various experiments with convenience samples have demonstrated the effectiveness of this strategy in boosting response rates of (mainly) customer satisfaction surveys (Liu & Inchausti, 2017; Cobanoglu et al., 2022). This study explores the effects in a social research context, embedding a question in the invitation email of a wave of the NatCen Panel, a probability-based panel in the UK. Two email templates were used: Email A had a ‘Start Survey’ button that takes participants to the survey landing page (control group); Email B allowed participants to answer the first question in the email itself (experiment group). The templates were tested with two types of panel members: ‘new joiners’, who had never been invited to any panel survey before, and ‘long-term members’, who had previously been invited to participate in surveys.
The study evaluates the experiment by comparing the click-through rates of the two email templates, and the proportion of those clicking through who go on to complete the survey. It also looks at the impact on the data, comparing the answers of people who completed the question embedded in the email with those who completed it within the survey.
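A sketch of the first evaluation step, comparing click-through rates between the two templates with a two-proportion z-test; the counts below are placeholders, not the study’s results:

```python
# Compare click-through rates for Email A (control) vs Email B (embedded
# question) with a two-proportion z-test. Counts are hypothetical.
from statsmodels.stats.proportion import proportions_ztest

clicks = [412, 455]      # hypothetical click-throughs: Email A, Email B
invited = [1500, 1500]   # hypothetical invitations sent per template

z_stat, p_value = proportions_ztest(count=clicks, nobs=invited)
print(f"z = {z_stat:.2f}, p = {p_value:.3f}")
```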
Mrs Julia Bergquist (The SOM Institute) - Presenting Author
Dr Sebastian Lundmark (The SOM Institute)
Declining response rates have been a growing problem in survey research, one that may lead to an increased risk of nonresponse bias and, subsequently, to misrepresentations of reality.
Although studies have investigated the impact of nonresponse (Blumberg & Luke, 2007; Groves, 2006; Groves et al., 2012; Groves & Peytcheva, 2008), fewer have suggested remedies for improving response rates, especially among harder-to-recruit subgroups.
The present study assesses the impact of adding an unconditional symbolic incentive to a self-administered mixed-mode survey (paper-and-pencil mail-back and web questionnaire). The experiment randomly assigned one group of respondents to receive an unconditional symbolic incentive with the first invitation to complete the questionnaire, while the other group did not receive the symbolic incentive. The main sample comprised 44,250 individuals aged 16 to 90 and was divided into four sub-samples based on geographical area. Prior to being invited to complete the questionnaire, each sample person in all four sub-samples was randomly assigned to one of the two groups. Analyses were conducted on the main sample as well as on each of the four sub-samples separately.
The results of the experiment will shed light on whether a symbolic unconditional incentive affects response propensities, nonresponse bias, measurement error, data quality, and the cost of administration.
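The assignment procedure described above amounts to randomization within each geographical sub-sample. A minimal sketch, with hypothetical region labels and frame columns:

```python
# Within each of four geographical sub-samples, randomly split individuals
# into incentive and no-incentive groups. All names are illustrative.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)

frame = pd.DataFrame({
    "person_id": np.arange(44_250),
    "region": rng.choice(["A", "B", "C", "D"], size=44_250),
})

# Randomize within region so both conditions are balanced per sub-sample.
frame["incentive"] = (
    frame.groupby("region")["person_id"]
         .transform(lambda ids: rng.permutation(len(ids)) % 2)
)
print(frame.groupby(["region", "incentive"]).size())
```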
Miss Elisabeth Falk (The SOM Institute) - Presenting Author
Dr Sebastian Lundmark (The SOM Institute)
Mrs Frida Sandelin (The SOM Institute)
For some time now, surveys have struggled to handle declining response rates, which in turn have increased the risk of nonresponse bias (Groves, 2006). Although studies have investigated the impact of nonresponse (e.g., Groves, 2006), fewer have suggested remedies for improving response rates, especially among harder-to-recruit subgroups. In this study, experiments were administered in an attempt to increase response propensities among two hard-to-reach subgroups in Sweden: young people and people born outside the Nordics. The experiments were administered during the fall of 2021. The treatment group among young people was offered a monetary incentive in the form of a digital gift card valid in a wide range of stores (retail value 50 SEK), sent to their email address, whereas the treatment group among people born outside the Nordics was offered an equivalent gift card with a retail value of 99 SEK. Respondents in the control groups were sent a lottery ticket by physical mail. The results indicated that young people who were offered the monetary incentive showed a lower response propensity than the group offered the lottery incentive, but no difference in response propensity was detected in the experiment among people born outside the Nordics. In the fall of 2022, two follow-up experiments were administered. The treatment groups among young people were offered either a cinema gift card (retail value ≈130 SEK) or a gift card for Sweden’s largest grocery store chain (retail value 75 SEK). The treatment group among people born outside the Nordics was offered a gift card for a well-known café chain (retail value 100 SEK). Respondents in the control groups in both experiments were sent a lottery ticket by physical mail.
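The treatment effects reported above could be estimated, for example, with a logistic regression of survey response on treatment assignment. A minimal sketch with simulated placeholder data, not the SOM Institute’s results:

```python
# Estimate the effect of a gift-card incentive (vs. lottery-ticket control)
# on response propensity via logistic regression. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

n = 2000
df = pd.DataFrame({
    # 1 = gift-card incentive, 0 = lottery-ticket control
    "gift_card": rng.integers(0, 2, size=n),
})
# Placeholder response indicator with an assumed small treatment effect.
df["responded"] = rng.binomial(1, 0.25 + 0.05 * df["gift_card"])

# The gift_card coefficient (in log-odds) captures the treatment effect.
fit = smf.logit("responded ~ gift_card", data=df).fit()
print(fit.summary())
```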
Ms Lisanne Slot (University of Groningen) - Presenting Author
Dr Marieke Haan (University of Groningen)
Dr Yfke Ongena (University of Groningen)
Surveys have long been an important research tool in social science and provide evidence about respondents' practices, attitudes, and knowledge that cannot be obtained through data science alone. Moreover, surveys form a basis for government decisions and for society's perceptions. However, the downward trend in response rates poses a major threat to conducting high-quality surveys, as it can introduce the potential for nonresponse bias. Nonresponse bias arises when non-respondents differ systematically from respondents, which can lead to distorted research results. In addition, it is important to avoid low response rates, as they can lead to poor external validity, low statistical power, higher variance, an image problem for research organizations, and higher costs due to the efforts made to maintain acceptable response rates.
Because surveys are important, different theories have been developed over the years to obtain responses. Given the changes in research methods and in the factors that have influenced response rates over the years, the question is whether existing response theories are still useful and, if so, in what context. Existing theories should therefore be reconsidered and updated where necessary, so that they can provide more effective guidance for designing future surveys with acceptable response rates. In this systematic literature review, based on the theories mentioned in previous research by Dillman (2020) and Keusch (2015), articles that cite these response theories have been collected. The review provides insight into how these theories work, for whom they work, whether the theories are only mentioned as a theoretical basis or are actually linked to hypotheses, and whether the theories work for both self-administered and interviewer-administered surveys. The results of the literature review are used to examine the possibility of developing a universal theoretical framework focusing on survey response.