
ESRA 2023 Program

All time references are in CEST

Recruiting Web Surveys via Postal-Mail: Best-Practice, Experiments, and Innovation 2

Session Organisers: Dr Jean Philippe Décieux (Federal Institute for Population Research)
Dr Carina Cornesse (DIW Berlin)
Time: Tuesday 18 July, 14:00 - 15:30
Room: U6-01a

Since e-mail addresses are usually unavailable on standard sampling frames for general population surveys (e.g., population registers), recruiting high-quality web survey samples is challenging. When conducting such large-scale and large-scope web surveys, recruitment and surveying are therefore typically conducted in two separate steps: First, a (probability) sample of the study population is drawn and contacted offline, often during a brief face-to-face or telephone recruitment interview. Second, members of the sample are asked to switch to the online mode for the actual survey.
Compared to interviewer-administered contact and recruitment, postal-mail strategies are becoming increasingly popular, and a large number of cross-sectional as well as longitudinal web survey projects are currently being initiated that combine postal-mail recruitment with online survey methodology. There are several reasons for this. For example, recruiting web surveys via postal mail is usually both more time- and cost-efficient than the available alternatives. In addition, this strategy avoids undesirable interviewer effects and allows respondents to read through study and recruitment material at their own pace and convenience.
Currently, the methodology for successful postal-mail recruitment of web surveys is advancing fast. This session therefore aims to provide a broad exchange forum for researchers and projects working on and with postal-recruited web surveys. In addition to sharing experiences and best practices, we are particularly interested in experimental approaches on topics such as:
• Strategies for enabling the transition from offline contact to web data collection mode
• Comparing the success of postal-mail recruitment to other web survey recruitment strategies
• Optimizing initial response, panel consent, and panel registration for postal-mail recruited longitudinal studies
• Push-to-web and other mixed-mode recruitment approaches
• Cost-benefit analyses of different incentive and reminder strategies
• Design and layout effects

Keywords: Web Survey; Recruitment; Mixed-Mode; Survey Costs; Postal Recruitment; Experimental survey research

Papers

Quid Pro Quota? The Effects of Incentives on Survey Participation

Dr Daniel Weitzel (Colorado State University)
Dr Katharina Pfaff (University of Vienna) - Presenting Author
Professor Sylvia Kritzinger (University of Vienna)

The use of computer-assisted web interviews (CAWI), and thus the reliance on online panels, has increased over the past decades. While a considerable body of research has examined the factors affecting response rates in cross-sectional surveys, our research focuses on factors that enhance the setup of offline-recruited online panels, and thus on mid- to long-term factors. We examine four types of monetary incentivisation strategies and their effect on participation in online surveys and web panel recruitment. More specifically, we compare unconditional monetary incentives, conditional monetary incentives, conditional donations to a charity organization, and giving respondents the choice between a monetary incentive and a donation. By contrasting panel recruitment with panel participation in subsequent waves, we test, among other things, whether these four incentive strategies differ in how successfully they recruit online panelists in the longer run. Drawing a sample stratified by region (NUTS3) from the Austrian Central Population Register (ZMR), a public register in which all persons registered in Austria are recorded, our analysis encompasses over 2,000 respondents across multiple survey waves fielded in 2022.
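
As an illustration of the assignment logic described above, the following is a minimal sketch (not the authors' code; the field names, condition labels, and seed are assumptions for illustration) of how register-drawn sample members could be allocated to the four incentive arms within NUTS3 strata:

    # Minimal sketch of stratified random assignment to four incentive arms.
    # Field names ("nuts3", "incentive") and labels are illustrative assumptions.
    import random
    from collections import defaultdict

    CONDITIONS = ["unconditional", "conditional", "charity_donation", "choice"]

    def assign_incentives(sample, seed=42):
        """Balance the four incentive arms within each NUTS3 region."""
        rng = random.Random(seed)
        by_region = defaultdict(list)
        for person in sample:
            by_region[person["nuts3"]].append(person)
        for members in by_region.values():
            rng.shuffle(members)  # random order within the stratum
            for i, person in enumerate(members):
                person["incentive"] = CONDITIONS[i % len(CONDITIONS)]
        return sample

    # Example with two hypothetical sampled persons
    print(assign_incentives([{"id": 1, "nuts3": "AT130"}, {"id": 2, "nuts3": "AT130"}]))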


From Concurrent Mixed-Mode to Push-to-Web: Experimental Design Change in a Panel Survey Study with Postal Mail Recruitment

Dr Carina Cornesse (German Institute for Economic Research) - Presenting Author
Dr Jean-Yves Gerlitz (University of Bremen)
Professor Olaf Groh-Samberg (University of Bremen)

Self-administered panel survey studies where sample members are contacted via postal mail often apply mixed-mode designs with online and paper survey mode options. Via the online mode, data can be collected fast, at low cost, and with high data quality. Via the paper mode, biases due to population members who are unable or unwilling to provide survey data online can be prevented. Researchers usually hope that the vast majority of respondents will choose the online survey mode. The challenge is to encourage respondents to do so without alienating non-internet users and people reluctant to provide data online.

Most experimental research on how to mix modes in surveys with postal mail contact focuses on cross-sectional surveys or on early panel recruitment stages and suggests that response rates are higher and biases lower if the online and paper modes are offered concurrently in the postal mail invitation. However, once trust between respondents and researchers has been established in a panel study, it may be beneficial to offer only the online mode option in the survey invitation letter and to reserve the paper mode option for the reminder letters.

We examine the impact of such a design change in the newly established German Social Cohesion Panel. In our experiment, a random subgroup of the panel sample was switched from concurrent mixed-mode to a sequential “push-to-web” design in the first regular survey wave after panel recruitment, while the rest of the panel sample remained on the concurrent design for one more panel survey wave. Preliminary results show that the share of online respondents is much higher in the sequential mode design, while survey response rate differences between the experimental groups are marginal, thus indicating that changing the mode sequence offered in the postal mail letters of an established panel study can substantially increase online participation without harming overall response rates.


Examining Data Quality and Respondent Burden When QR Codes Push Respondents to Complete Via Mobile Devices: An Experiment on a U.S. Government Web-Only Survey

Dr Scott Leary (Internal Revenue Service) - Presenting Author
Mr Kris Pate (Internal Revenue Service)
Mrs Brenda Schafer (Internal Revenue Service)
Mr Rizwan Javaid (Internal Revenue Service)
Mr Pat Langetieg (Internal Revenue Service)

In 2022, due to factors related to the COVID-19 pandemic, all Internal Revenue Service Taxpayer Burden Surveys were converted to web-only. To increase the accessibility of these surveys, we tested the inclusion of a QR code in one of the converted surveys.
The Gift Tax Return Burden Survey (GTT) measures the time and money U.S. taxpayers spend complying with federal gift tax reporting requirements. Invitation letters with the URL and a PIN are sent via mail to 25,000 participants. For Tax Year 2021, we conducted a QR code experiment: Half of the sample was randomly assigned to receive the standard letter invitation. The remaining half received a letter with a QR code to access the survey. All respondents received up to four contact letters.
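
To make the letter treatment concrete, here is a minimal sketch, assuming the open-source Python "qrcode" package (with Pillow installed), of how a personalized survey link could be rendered as a printable QR image; the URL, PIN, and parameter name are hypothetical placeholders, not the IRS system:

    # Minimal sketch: encode a personalized survey link as a QR image.
    # The base URL and "pin" parameter are hypothetical placeholders.
    import qrcode

    def make_invitation_qr(base_url, pin, outfile):
        link = f"{base_url}?pin={pin}"
        img = qrcode.make(link)  # returns a PIL image by default
        img.save(outfile)
        return link

    # Example for one hypothetical invitee
    make_invitation_qr("https://survey.example.gov/gtt", "123456", "invitee_123456.png")
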
Because the presence of a QR code increases the use of mobile devices (Lugtig & Lutien, 2021), we must consider whether such a push impacts data quality and respondent burden. Prior research shows that respondents who use smartphones are likely to have higher rates of item nonresponse (Keusch & Yan, 2016), provide shorter open-ended responses (de Bruijne & Wijnant, 2013), and take longer to complete a survey (Andreadis, 2015) compared to those who complete on a PC. However, as mobile optimization advances (Pew, 2021), these questions warrant revisiting.
We will examine the impact of the inclusion of a QR code on data quality and respondent burden in our GTT survey. We will analyze answer variability, item nonresponse, responses to narrative open-ended questions, and time to complete the survey, based on the device used to complete the survey.


The Effect of Targeting and Framing on Panel Recruitment: A Comparison Across Four Panels

Mr Kim Backström (Åbo Akademi University) - Presenting Author
Dr Sebastian Lundmark (University of Gothenburg)
Mr Felix Cassel (University of Gothenburg)
Dr Isak Vento (Åbo Akademi University)
Mr Rasmus Sirén (Åbo Akademi University)
Miss Jenny Backström (Åbo Akademi University)
Professor Kim Strandberg (Åbo Akademi University)
Professor Jordi Muñoz (Universitat de Barcelona)
Dr Raül Tormos (Centre d'Estudis d'Opinió)

Several institutes have created online probability panels where panelists are recruited offline using probability sampling. Online probability panels are thought to be less influenced by sampling errors and biases due to self-selection while remaining a reasonable and cost-effective approach to acquiring survey data. However, online probability panels have tended to suffer from lower response rates (recruitment rates) than other survey data collection methods. Given the likely relationship between response rates and non-response bias, it is paramount for online probability panels to find interventions that increase response rates and that decrease non-response bias.

This study draws on three theories to assess the effects that targeting and framing interventions in the recruitment phase may have on recruitment rates and non-response bias. First, the impact of targeting and framing was assessed by administering a highly similar preregistered experiment to a stratified random sample of citizens in Sweden (https://doi.org/10.17605/OSF.IO/DCYH2) and, in Finland, to Swedish-speaking Finns (https://doi.org/10.17605/OSF.IO/PS7TY). In both administrations, the sampled individuals received an invitation via postal mail to join an online panel. The invitation was randomly assigned to include either a targeted message or not, and to be framed in terms of either benefits or losses. Furthermore, the invitation in Finland included an early-bird incentive. Based on the findings of the initial administrations, the experiment will be replicated and extended in two further administrations to a Finnish and a Catalan sample.
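
For readers who want the 2x2 design spelled out, the following is a minimal sketch (illustrative only; the field names and seed are assumptions, not the study's code) of the factorial randomization described above, crossing targeting with framing:

    # Minimal sketch of balanced 2x2 factorial assignment (targeting x framing).
    # Field names ("targeting", "framing") are illustrative assumptions.
    import itertools
    import random

    def assign_factorial(invitees, seed=7):
        """Assign the four targeting-by-framing cells round-robin
        to a randomly shuffled invitee list (balanced assignment)."""
        cells = list(itertools.product(["targeted", "generic"], ["benefit", "loss"]))
        rng = random.Random(seed)
        rng.shuffle(invitees)
        for i, person in enumerate(invitees):
            person["targeting"], person["framing"] = cells[i % 4]
        return invitees

    print(assign_factorial([{"id": n} for n in range(4)]))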

The results of the four data collections will shed light on whether targeted messages, framing of the invitation, and the interaction between them affect recruitment rates, non-response bias, data quality, and recruitment costs in online probability panels recruited via postal mail.