Conference Programme 2015
Tuesday 14th July Wednesday 15th July Thursday 16th July Friday 17th July
Tuesday 14th July, 11:00 - 12:30 Room: L-102
Experimental designs in online survey research 1
Convenor: Mr Henning Silber (Göttingen University)
Coordinator 1: Mr Jan Karem Hoehne (Göttingen University)
Coordinator 2: Professor Dagmar Krebs (Giessen University)
Session Details

Experimental studies have become increasingly popular in survey research and are carried out in various disciplines such as sociology, political science, linguistics, economics and psychology. In survey research, experimental designs are useful tools for better understanding cognitive processes and thereby providing best-practice advice for improving study and questionnaire design. In particular, technological advances have made it significantly easier to use experimental designs in online field experiments as well as in computerized laboratory experiments.
This session invites presentations on empirical studies and theoretical discussions of experimental designs in online survey research.
- Empirical online research can include studies on response behavior and social desirability bias, as well as experiments on response rates and question design effects. Furthermore, we especially encourage presentations with replicated experimental results and welcome replications in different social contexts such as different cultural, educational and ethnic groups.
- Additionally, we invite presentations that discuss the value of experiments from a theoretical perspective. Theoretical presentations could contrast the merits and the limits of different forms of experimental study designs or provide a future outlook on the prospects of online experiments in survey research.
Presentations could cover the following research areas:
- Theory of experimental study designs
- Replication of experimental results
- Comparisons between different experimental designs (e.g., laboratory and field experiments)
- Split-ballot experiments (e.g., context effects, question order, response order, acquiescence, visual design effects, verbal effects)
- Choice experiments
- Laboratory experiments on response behavior (e.g., using eye tracking)
- Experiments with incentives
- Vignette studies
- Future prospects of experimental designs
Paper Details

1. Investigation of Response-Order Effects Using Eye Tracking
Mr Jan Karem Höhne (Göttingen University)
Survey researchers are aware that the design of questions can affect respondents' response behavior. One phenomenon that can occur when answering questions with multiple response categories is the response-order effect. The bulk of the empirical findings regarding these effects is based on relatively "indirect data": respondents' behavior is not observed directly but reconstructed by the researcher. This paper therefore turns to eye tracking, since measuring eye movements allows cognitive information processing to be examined. The analysis of the eye-tracking data shows that respondents do not pay equal attention to all response categories.
2. Using eye tracking to examine respondents' processing of forced-choice vs. check-all-that-apply question formats
Ms Cornelia Neuert (Gesis - Leibniz Institute for the Social Sciences)
Recent experimental research has shown that the check-all and forced-choice question formats do not produce comparable results (Smyth et al., 2006).
In this study, half of the respondents (n=42) were assigned to a version of a survey in which two questions were formatted as check-all-that-apply questions and the other half (n=42) were assigned to a version in which the same two questions were formatted as forced-choice questions.
By analyzing the respondents' eye movements, the two question formats are compared with respect to the amount of attention and cognitive effort respondents expend while answering the questions.
3. Direction of Response Format in Web and Paper & Pencil Surveys
Professor Dagmar Krebs (University of Giessen)
Characteristics of response scales are important factors guiding the cognitive processes that underlie the choice of a response category on an attitude item. In addition, the mode of data collection may influence response behavior. This paper examines the effect of scale direction in two different modes, a web survey and a paper-and-pencil survey. Identical items are presented with response scales running in opposite directions, beginning on the left-hand side with either the positive or the negative response option. Depending on scale direction, the labels run agree-disagree or disagree-agree.
4. Comparing response order experiments with probability and non-probability samples
Mr Henning Silber (Göttingen University)
Professor Jon Krosnick (Stanford University)
Professor David Yeager (University of Texas at Austin)
Online survey experiments with different question forms are frequently employed to evaluate question wordings and response scales. Building on the results of these experiments, more accurate wordings and response scales are selected. Yeager et al. (2011) challenged the uncritical use of non-probability online samples by showing that different non-probability online samples do not lead to the same conclusions about the distributions of substantive variables. Based on their findings, this paper investigates whether identical split-ballot experiments employed in seven non-probability online surveys yield equal results.