ESRA 2019 Draft Programme at a Glance
Multiple Mode Data Collection: Innovative Approaches 3
Session Organisers: Dr Scott Leary (Internal Revenue Service)
Dr Jocelyn Newsome (Westat)
Ms Brenda Schafer (Internal Revenue Service)
Time: Wednesday 17th July, 09:00 - 10:30
Single-mode surveys -- whether telephone, web, or mail -- all have limitations that can result in coverage error. For example: telephone surveys may not reach households that have dropped landlines in favor of cell phones; web surveys are not an option for those without internet access (or those reluctant to click on survey links for fear of phishing scams); and mail surveys do not always make it to the intended recipient (and aren’t always opened or returned even when received).
In the face of these challenges, researchers are increasingly turning to mixed-mode surveys, which combine multiple survey modes. Using multiple modes offers a variety of benefits. It can help improve response rates and reduce nonresponse error by improving coverage (Dillman, 2014). It can also speed up data collection and lower costs by allowing researchers to begin with a less expensive mode and move to a more expensive mode only for non-respondents (de Leeuw, 2005; Pierzchala, 2006; Tourangeau, Conrad, & Couper, 2013). Some researchers suggest that a sequential approach may be most effective, with a mail questionnaire offered first and a web questionnaire offered later in the data collection process (Dillman, 2014); others advocate offering the web first, followed by other modes, since the web is a very cost-efficient method (Millar & Dillman, 2011).
However, as researchers embrace mixed-mode designs, it is not always clear how best to offer the various modes. Is it best to offer different modes concurrently or sequentially, and if sequentially, which mode works best if offered first? This session will explore innovative approaches used in multi-mode surveys. Researchers are invited to submit papers discussing experiments or pilot studies on any of the following topics:
• Multi-mode survey design experiments, including the use of smartphones and tablets;
• “Web push” experiments, including the impact on survey response and nonresponse;
• Discussion of differences in mode impact at different points in data collection; and
• Impact of mode sequencing on hard-to-reach populations.
Keywords: multimode, mixed mode, web push
Does pushing to web work for longitudinal household panels? First results of a two-wave pilot of the Swiss Household Panel
Dr Erika Antal (FORS)
Dr Robin Tillmann (FORS)
Dr Marieke Voorpostel (FORS) - Presenting Author
Dr Gian-Andrea Monsch (FORS)
The Swiss Household Panel (SHP) has followed Swiss households over time since 1999, with telephone interviewing as its main mode. In preparation for its third refreshment sample, the SHP is conducting a pilot to investigate the costs and benefits of a mixed-mode design involving CAWI. We first present the design of the two-wave pilot study, which compared three treatment groups: a control group relying mainly on CATI interviews, a group using predominantly CATI at the household level and CAWI at the individual level, and a group using mostly CAWI for both the household and the individual questionnaires.
We present findings on how the three protocol groups compared with respect to initial response rates and sample composition in the first wave. The use of a register-based sample offers the advantage of the availability of auxiliary data for assessing the extent of selection error. We also show retention rates and changes in the sample composition in the second wave and discuss implications of these findings for the design of the next SHP refreshment sample and more generally for using multiple modes in longitudinal household panels.
Comparing Costs, Response Rates, Demographics and Responses in Non-probability and Probability Sample Surveys Conducted in Oregon
Professor Virginia Lesser (Oregon State University) - Presenting Author
Ms Lydia Newton (Oregon State University Survey Research Center)
Ms Jeannie Sifneos (Oregon State University Survey Research Center)
Mail, telephone, and the Web are commonly used modes for conducting probability-based surveys. Increasingly, however, surveys use online panels created by a number of research organizations. These panels consist of individuals recruited to participate in surveys as needed, with their demographics collected at recruitment. The demographics are used to create a specific panel of individuals for a client, typically a subset of the larger panel, designed to match the profile of the population characteristics the client requests. Most online panels are not created using probability-based methods, but they are commonly used because of their convenience and the low cost of obtaining survey data. When opinion surveys are repeated over time, the impact of changing from a probability-based sampling mode to a non-probability panel must be examined. A study was conducted in a population of Oregon residents in 2016 and repeated in 2018. In both years, a probability-based sample of participants was selected alongside a sample recruited from a non-probability online panel. For the probability-based survey, participants could complete the questionnaire by mail or on the Web; the design compared offering a mail mode alone with offering the Web first, followed by a mail mode. For the Web approach, an additional study compared the use of a postcard versus an additional mailing of the questionnaire. Results of the probability-based survey were then compared with the online panel results. Two different online panel vendors were used in each of the years to compare approaches. Response rates, demographics of the participants who completed questionnaires, the cost of conducting the studies, and responses are compared across these approaches.
Introducing Web to a Panel Survey: The Health and Retirement Study
Dr Mary Beth Ofstedal (University of Michigan) - Presenting Author
Dr Mick P. Couper (University of Michigan)
Mr Andrew L. Hupp (University of Michigan)
Ms Rebecca Gatward (University of Michigan)
Dr David Weir (University of Michigan)
The Health and Retirement Study (HRS) is a panel study of people over age 50 in the United States that began in 1992 and currently uses face-to-face and telephone interviewing on a rotating basis, with half the sample assigned to each mode every wave. As one of several cost-saving measures, HRS incorporated web as an alternate mode to telephone starting in the 2018 wave. The core interview is long, averaging 90 minutes by telephone and 150 minutes face-to-face. In 2018, a random subsample of about 2,250 participants with web access was invited to complete their interview online, rather than by telephone. The web sample was randomized to receive reminders at either 6- or 12-day intervals. Web non-respondents (and partial respondents) were contacted to schedule a telephone interview after 6-12 weeks of no web activity. A control sample of 1,500 participants with web access received the usual telephone interview, with no option to complete by web. HRS is roughly two-thirds of the way through 2018 data collection and the web option is shaping up to be a success. In this presentation we will summarize the 2018 web experience, focusing on three sets of analyses: 1) a comparison of the sequential web-telephone and control samples with regard to response rates and predictors of response, data quality (item missingness, straightlining, response heaping, break-offs, etc.), and interview length; 2) a comparison of the short versus long reminder interval groups with regard to web and final response rates and the timing of survey completion; and 3) characteristics of the web surveys with respect to devices and browsers used, number of sessions to complete, location of suspends, etc. We will end the presentation by reflecting on lessons learned as part of HRS’s transition to incorporate web in a sequential mixed-mode design.
Mixed-mode experiment in the IAB Establishment panel
Ms Marieke Volkert (Institute for Employment Research) - Presenting Author
Professor Joe Sakshaug (Institute for Employment Research, University of Mannheim)
Mr Peter Ellguth (Institute for Employment Research)
Dr Susanne Kohaut (Institute for Employment Research)
Dr Iris Möller (Institute for Employment Research)
The IAB Establishment Panel has collected representative data on German establishments since 1993, providing deep insights into economic developments in Germany. Interviewing establishments requires special procedures, for example, a non-linear question-answer process or managing multiple respondents who have access to different company-related information. Thus, until 2018 interviews were conducted via interviewer-administered paper-and-pencil interviewing (PAPI) to meet the special needs of establishment respondents.
Declining response rates and the need for better control over the data collection process led to the development of a new survey tool, which integrates computer-assisted personal interviewing (CAPI) and self-administered Web interviews on a single exchange platform. To evaluate the effects of the new software on response rates and data quality, we conducted a sequential mixed-mode experiment in 2018 on a refreshment sample of establishments.
We randomly assigned the establishments to a CAPI-Web group, a Web-CAPI group, and a PAPI (control) group. In the first group, interviewers conducted CAPI interviews and establishments were allowed to finish the interview online. In the second group, establishments were invited by letter to take part in an online interview. If no interview resulted, interviewers visited the companies in both groups to conduct CAPI interviews. In the control group, interviewers visited the companies and conducted PAPI interviews; companies could also finish the questionnaire via self-administered PAPI.
In this paper, we present the results of this experiment and measure the effects of using different modes on response rates, nonresponse bias, and other data quality indicators. We also report the logistical challenges and “lessons learned” of implementing mixed-mode experiments in a survey of establishments.
Telephone or online first? Effects of a mode experiment on data quality in a panel survey
Mr Sebastian Hülle (Institute for Employment Research (IAB)) - Presenting Author
Professor Joseph Sakshaug (Institute for Employment Research (IAB))
Increasing cost pressure has led to fewer population surveys being conducted via interviewer-administered modes in favor of cheaper self-administered modes. In addition, declining response rates or changes in mode-specific coverage can motivate a mode change or a mixed-mode design. Nevertheless, changing the mode is potentially problematic, especially for existing panel surveys, since mode-specific effects on response behavior and data quality are likely. In order to identify and quantify the effects of a mode change, an experimental design is required that allows different error sources (nonresponse, measurement) to be analyzed appropriately. However, little is known about the consequences of a mode change within panel surveys in terms of data quality.
Using data from the fourth wave of the Linked Personnel Panel (LPP), the paper investigates the data-quality consequences of experimentally introducing web as an additional mode in a German panel survey that had previously been conducted exclusively by telephone. The experimental design varies the order of the first mode offered (telephone vs. web) and incorporates not only the panel sample but also a refreshment sample. Thus, comparisons are possible between panelists (longitudinal dimension) and new respondents entering the study (cross-sectional dimension). We analyze the effects of the mode experiment on key indicators of data quality (e.g., response rates, nonresponse error, measurement error). The results are of potential interest to researchers considering mixed-mode designs and mode changes within panel surveys.