Tuesday 16th July 2013, 14:00 - 15:30, Room: No. 1

Mode Effects in Mixed-Mode Surveys: Prevention, Diagnostics, and Adjustment 2

Convenor: Professor Edith De Leeuw (Utrecht University)
Coordinator 1: Professor Don Dillman (Washington State University)
Coordinator 2: Dr Barry Schouten (Statistics Netherlands)

Session Details

Mixed-mode surveys have become a necessity in many fields. Growing nonresponse in all survey modes is forcing researchers to use a combination of methods to reach an acceptable response rate. Coverage issues in both Internet and telephone surveys make it necessary to adopt a mixed-mode approach. Furthermore, in international and cross-cultural surveys, differential coverage patterns and survey traditions across countries make a mixed-mode design inevitable.

From a total survey error perspective, a mixed-mode design is attractive, as it offers reduced coverage error and nonresponse error at affordable cost. However, measurement error may increase when more than one mode is used. This could be caused by mode-inherent effects (e.g., the absence or presence of interviewers) or by question format effects, as different questionnaires are often used for different modes.

In the literature, two kinds of approaches can be distinguished: reducing mode effects in the design of the study, and adjusting for mode effects in the analysis phase. Both approaches are important and should complement each other. The aim of this session is to bring researchers from both approaches together to exchange ideas and results.

This session invites presentations that investigate how different sources of survey error interact and combine in mixed-mode surveys. We particularly invite presentations that discuss how different survey errors can be reduced (prevented) or adjusted for (corrected). We encourage empirical studies based on mixed-mode experiments or pilots, and we especially encourage papers that attempt to generalize results into overall recommendations and methods for mixed-mode surveys.



Note: Depending on the number of high-quality paper proposals, one or more sessions may be organized.
Note 2: The session has a fourth organizer: Professor Joop Hox (Utrecht University), j.hox@uu.nl.


Paper Details

1. Effects of Sponsorship on Response: Mixed-Mode Web and Mail Surveying

Mrs Michelle Edwards (Washington State University)
Dr Don Dillman (Washington State University)
Dr Jolene Smyth (University of Nebraska-Lincoln)

Past research has found that mail surveys with government and university sponsors obtain higher response rates than surveys sponsored by other organizations, possibly because of the increased perception of legitimacy and authority associated with these sponsors. In mixed-mode web and mail surveys of the general public, conveying the legitimacy of a survey may be even more difficult. While a few studies have considered sponsorship effects on response to web surveys, to our knowledge no study has explored the effects of sponsorship on response in mixed-mode surveying. We tested the effects of sponsorship by a local (in-state) university versus a distant (out-of-state) university on response rates and respondent samples in an experiment conducted in spring 2012 with an address-based sample of residents of two U.S. states. This study produced a number of findings, including: (1) local-sponsored surveys obtained higher response rates than distant-sponsored surveys, regardless of whether a mail-only or mixed-mode (initial web request followed by a mail questionnaire offered in the fourth contact) design was used; (2) mixed-mode (web+mail) designs obtained lower response rates than mail-only designs, regardless of sponsor; and (3) for web+mail sample members, receiving a local-sponsored survey significantly increased the estimated odds of responding by web (relative to not responding), but did not significantly increase the odds of responding by mail (relative to not responding). This research provides preliminary support for our concerns about conveying survey legitimacy in mixed-mode designs.


2. Survey Mode Effects on Income Inequality Measurement

Mr Pirmin Fessler (Oesterreichische Nationalbank)
Mr Maximilian Kasy (Harvard University)
Mr Peter Lindner (Oesterreichische Nationalbank)

Exploiting a quasi-experiment, we use non-parametric re-weighting and regression approaches to estimate the causal effect of the interview method on item nonresponse and the unconditional observed income distribution. The seemingly minor change of interviewing households by telephone instead of in person leads to major changes in item nonresponse, which increases by roughly 20-30%.
The observed distribution of income is compressed, translating into a decrease in the Gini coefficient of about 10%. Commonly used rankings of countries by the Gini coefficient of the income distribution might therefore largely be an artifact of different survey techniques, such as the interview mode, rather than of true differences in income inequality.
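As a purely illustrative sketch (not the authors' estimator), the snippet below shows the two mechanics the abstract relies on: how compressing reported incomes toward the mean lowers the Gini coefficient, and a simple cell-based re-weighting of one mode's sample to another mode's covariate distribution. All data are simulated; the compression factor, the covariate cells, and all names are assumptions made for this example.

# Illustrative sketch only: simulated incomes, hypothetical covariate cells.
import numpy as np

rng = np.random.default_rng(0)

def gini(income, weights=None):
    """Weighted Gini coefficient via the area under the Lorenz curve."""
    x = np.asarray(income, dtype=float)
    w = np.ones_like(x) if weights is None else np.asarray(weights, dtype=float)
    order = np.argsort(x)
    x, w = x[order], w[order]
    lw = np.concatenate(([0.0], np.cumsum(w) / w.sum()))            # cumulative population share
    lx = np.concatenate(([0.0], np.cumsum(x * w) / np.sum(x * w)))  # cumulative income share
    area = np.sum((lw[1:] - lw[:-1]) * (lx[1:] + lx[:-1]) / 2.0)    # area under the Lorenz curve
    return 1.0 - 2.0 * area

# Simulated "personal interview" incomes and a "telephone" sample whose reported
# incomes are compressed toward the mean (arbitrary factor, hypothetical numbers).
capi = rng.lognormal(mean=10.0, sigma=0.8, size=20_000)
cati = capi.mean() + 0.7 * (rng.lognormal(mean=10.0, sigma=0.8, size=20_000) - capi.mean())

print(f"Gini, personal interview sample:   {gini(capi):.3f}")
print(f"Gini, compressed telephone sample: {gini(cati):.3f}")   # compression pushes the Gini down

# Cell-based re-weighting: weight each telephone respondent by target share / sample share
# of a covariate whose distribution differs across modes (income tertiles stand in here).
cells_capi = np.digitize(capi, np.quantile(capi, [0.25, 0.60]))  # groups 0/1/2, shares 25/35/40%
cells_cati = np.digitize(cati, np.quantile(cati, [1/3, 2/3]))    # groups 0/1/2, equal shares
target = np.bincount(cells_capi, minlength=3) / cells_capi.size
sample = np.bincount(cells_cati, minlength=3) / cells_cati.size
w = (target / sample)[cells_cati]
print(f"Re-weighted Gini, telephone sample: {gini(cati, w):.3f}")

In the paper itself the re-weighting is non-parametric over a richer set of observed covariates; the simple cell shares above only stand in for that idea.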



3. Mode effects in Labour Force Surveys - do they really matter?

Mr Thomas Koerner (Federal Statistical Office Germany (Destatis))

In the Labour Force Survey (LFS) of the European Union, as in many other household surveys, there is a growing trend towards the use of multiple data collection modes. While computer-assisted personal interviewing (CAPI) and computer-assisted telephone interviewing (CATI) have already been used side by side in the past, computer-assisted web interviewing (CAWI) is currently being prepared as an additional mode in many countries. One of the main concerns associated with the use of multiple data collection modes is that mode effects could lead to measurement bias. Despite the vast literature on mode effects in survey research, there are relatively few studies on the importance of mode effects in the specific case of Labour Force Surveys. Labour Force Surveys differ from other population surveys in several respects: for example, they have a particularly long questionnaire, they are sometimes carried out with mandatory response, they predominantly contain factual, non-sensitive questions, and many of the questions include long lists of response items.
Based on research from the European project "Data collection for social surveys using multiple modes", carried out by five National Statistical Offices (Germany, the Netherlands, Finland, the United Kingdom, and Norway), the contribution identifies likely sources of mode effects in the specific setting of the LFS. Furthermore, analyses from a randomized experiment in the German LFS will be presented, comparing CAPI, CATI, CAWI, and paper-and-pencil self-completion. Based on these preliminary findings, we discuss which aspects of mode effects really matter in the context of the LFS.