Conference Programme 2015
Thursday 16th July, 16:00 - 17:30 Room: HT-105
Mixing modes and mode effects
|Convenor||Mrs Caroline Bayart (University Lyon 1)|
|Coordinator 1||Mr Patrick Bonnel (University Lyon 2 - ENTPE)|
Session Details
Survey response rates are declining across the world. Even if weighting procedures make it possible to reduce the incidence of non-response, it is still necessary to assume that people with certain socio-demographic characteristics who do not respond to a survey behave in the same way as people with the same socio-demographic characteristics who do respond. But evidence suggests that this is not always the case, and survey non-response may produce bias. Efforts are made to increase response rates for traditional surveys by improving the questionnaire, reducing respondent burden, increasing reminders… Even if the results are generally positive, in most cases they are not sufficient.
One way to offset the impact of non-response and produce more reliable results is to offer different media and let people choose the mode and moment that suit them to answer. New and interactive media appear to have high potential for data collection. But these solutions generate some biases. First, the design and administration of the questionnaire may vary according to the mode. Second, generalizing the results to the whole population is sometimes an issue. Lastly, the question of data comparability remains. When mixed survey modes are used, individuals choose to belong to one group or another, or respond only if the proposed medium suits them. The responses are therefore not fully comparable, because the sample is no longer random and the presence of respondents is determined by external factors, which may also affect the variable of interest in the studied model. The danger when databases are merged is that a sample selection bias will arise and compromise the accuracy of explanatory models.
The aim of the session is to characterize the biases generated by mixed-mode surveys and to offer some perspectives for reducing them.
Paper Details
1. How to combine survey media (web, telephone, face-to-face): application to the Lyon household travel survey
Mrs Caroline Bayart (University Lyon 1)
Mr Patrick Bonnel (University Lyon 2 - ENTPE)
Response rates with all traditional modes are declining. It is becoming difficult to carry out efficient household travel surveys because non-respondents probably behave differently from those who agree to be interviewed. To reduce this non-response bias, we initiated a web survey in parallel with the household travel survey conducted by phone in Lyon in 2012-2014. After a description of the web respondents, we characterize their travel patterns and estimate a selection bias. These results will be compared with the previous travel survey conducted by web and face-to-face in Lyon in 2006.
2. Why do Internet users choose the offline mode? – Evidence from the recruitment of a mixed mode panel
Dr Tanja Dannwolf (GESIS-Leibniz Institute for the Social Sciences)
Dr Klaus Pforr (GESIS-Leibniz Institute for the Social Sciences)
Web surveys have become increasingly popular due to their lower costs. Some surveys counter the problem of coverage error by offering respondents an alternative mode. Whereas this increases response rates and improves coverage, respondents' reasons for choosing a specific mode are largely unknown. We examine this mode choice (online vs. mail) in the GESIS Panel face-to-face recruitment interview: web literacy and affinity for technology decrease the propensity for the offline mode, controlling for age, education, and other factors. This shows that the differences between online and offline participants cannot be accounted for by simple post-stratification weighting.
3. Consequences of mid-stream mode switching in a panel survey
Professor Nick Allum (University of Essex)
Professor Fred Conrad (University of Michigan)
We report results of an experiment in a panel survey (the UK Household Panel Survey Innovation Panel) that compared the accuracy of past event recall, validated against responses at the previous interview, between respondents who were switched to web mode and those who continued to be interviewed F2F. We also assess how asking for pre-commitment to careful answering might mitigate any potential loss of data quality resulting from switching to web. More generally, our results allow us to understand a little better how mode-switching interacts with the cognitive processes underlying survey response to produce data of varying quality.
4. Selection Bias and Cross-Group Differences depending on the Level of Effort in Mixed-Mode-Surveys
Mr Hagen Von Hermanni (TU Dresden)
Dr Robert Neumann (TU Dresden)
Using data from two mixed-mode telephone surveys (n1=1,000/1,100; n2=1,500/1,500), we use level-of-effort paradata to determine cross-group differences in approval rates for an indicator of subjective wellbeing. Using different approaches, we determine the impact of the selection bias resulting from the number of attempts needed to reach a respondent and compare these across modes. Drawing on previous research on mixed-mode studies, we can make assumptions about the differences and similarities between landline-, dual- and mobile-only users. We will conclude with some remarks on the possible
5. Mode effects when collecting data on sensitive behaviours: an experiment using a repeated measures design on the British National Survey of Sexual Attitudes & Lifestyles
Mr Bob Erens (London School of Hygiene & Tropical Medicine)
Ms Sarah Burkill (Karolinska Institutet)
Dr Andrew Copas (University College London)
Using a repeated measures design, the same respondents were asked questions about their sexual behaviours in a CAPI/CASI survey (the British National Survey of Sexual Attitudes & Lifestyles) and in a web survey. Although 90% of responses did not change across modes, about one-third of the variables showed significantly higher reporting of sensitive behaviours on the web. While the web appears a promising option for surveys of sensitive behaviours, mixing modes may increase measurement errors, and mode effects are likely to vary by question type and content, as well as with the particular mix of modes used.