Conference Programme 2015
Tuesday 14th July, 16:00 - 17:30 Room: L-102
Experimental designs in online survey research 3
Convenor: Mr Henning Silber (Göttingen University)
Coordinator 1: Mr Jan Karem Hoehne (Göttingen University)
Coordinator 2: Professor Dagmar Krebs (Giessen University)
Session Details
Experimental studies have become increasingly popular in survey research and are carried out in various disciplines such as sociology, political science, linguistics, economics and psychology. In survey research, experimental designs are useful tools for gaining a better understanding of cognitive processes, which in turn yields best-practice advice for improving study and questionnaire design. Technological advances in particular have made it significantly easier to use experimental designs in online field experiments as well as in computerized laboratory experiments.
This session invites presentations on empirical studies and theoretical discussions of experimental designs in online survey research.
- Empirical online research can include studies on response behavior and social desirability bias, as well as experiments on response rates and question design effects. Furthermore, we especially encourage presentations with replicated experimental results and welcome replications in different social contexts such as different cultural, educational and ethnic groups.
- Additionally, we invite presentations that discuss the value of experiments from a theoretical perspective. Theoretical presentations could contrast the merits and the limits of different forms of experimental study designs or provide a future outlook on the prospects of online experiments in survey research.
Presentations could cover the following research areas:
- Theory of experimental study designs
- Replication of experimental results
- Comparisons between different experimental designs (e.g., laboratory and field experiments)
- Split-ballot experiments (e.g., context effects, question order, response order, acquiescence, visual design effects, verbal effects)
- Choice experiments
- Laboratory experiments on response behavior (e.g., using eye tracking)
- Experiments with incentives
- Vignette studies
- Future prospects of experimental designs
Paper Details
1. The effectiveness of incentives on recruitment and retention rates: an experiment in a web survey
Mr Joris Mulder (CentERdata, Tilburg University)
Dr Salima Douhou (CentERdata, Tilburg University)
The purpose of this paper is to examine which incentive level is related to long-term participation of respondents, three years after the start of an experiment in the LISS panel. During the recruitment of a refreshment sample in 2011, a random half of the sample (~1000 households) was promised a higher monetary payment of 25 euros per hour for completing monthly questionnaires; the other half was promised the standard 15 euros per hour. We study the relation between incentive level and response during the recruitment process, as well as the relation between hourly incentive level and long-term participation and attrition.
2. Placement of the Linkage Consent Question in a Web Survey of Establishments
Professor Joseph Sakshaug (University of Mannheim)
Ms Basha Vicari (Institute for Employment Research)
Sample surveys routinely ask for respondent consent to link survey information with administrative databases, but not all survey units agree to the linkage. Efforts have been undertaken to study whether the placement of the linkage consent question influences consent. However, these studies have only been performed in interviewer-administered household surveys. Whether placement matters in self-administered modes (e.g., web) and for other types of target populations remains an open question. We present results of a placement experiment in a web survey of establishments in Germany and offer insights that contribute to "best practice" guidelines for maximizing linkage consent.
3. Finding Item Nonresponse Patterns: Three Internet Survey Experiments Into the Effects of Nonresponse Options on Item Nonresponse and Distribution of Opinions
Mrs Jannine Van De Maat (Leiden University)
How do nonresponse options, such as a DK (don't know) option or a filter question, affect item nonresponse and the distribution of opinions? This question will be answered in a comparative setting by conducting and comparing three internet survey experiments that examine the effects of question design, and in particular of nonresponse options such as the DK option and the filter question. The experiments are carried out with three different internet panels. To differentiate between the effects of nonresponse options, five issue areas are included, each consisting of replicated questions.
4. Offline recruiting of young people for an online survey: what affects response rates?
Dr Eva Zeglovits (University of Vienna and IFES (Institut für Empirische Sozialforschung))
When conducting a survey among young adults, online surveys are the mode of choice. But what can you do if you only have postal addresses? In an experiment, I test response rates while varying the use of incentives, unconditional versus conditional incentives, and the amount of the incentive. Using a stratified random sample from a population register (aged 15-24) in Austria, a purely offline recruiting process is applied (an invitation letter to participate in an online survey). I analyse response rates and conclude with recommendations for online surveys among young people based on a register sample.