Conference Programme 2015
Thursday 16th July, 14:00 - 15:30 Room: HT-104
Going beyond the basics of questionnaire design: new and innovative approaches to instrument design in web surveys 1
Convenor: Dr Femke De Keulenaer (Ipsos)
Coordinator 1: Professor Edith De Leeuw (Utrecht University)
Coordinator 2: Mr Arnaud Wijnant (CentERdata / Tilburg University)
Session Details

Web questionnaires can differ substantially from questionnaires in traditional modes; nonetheless, most web-based instruments are based on classic text-based questionnaire principles. Rather than simply applying the design principles for paper questionnaires, researchers could capitalize on the unique properties of the web interaction. In web surveys, the medium can be fully exploited to produce better ways of asking and answering questions, and to introduce new approaches to surveying (i.e. going beyond “asking questions”).
In this session, we would like to focus on the potential for advanced questionnaire design in web surveys and innovative approaches to surveying respondents. Topics could include, but are not limited to, the following:
- The unique properties of web interaction can be used to design web questionnaire interfaces that adapt or tailor themselves to respondents’ behaviour, diagnose respondents’ need for clarification, detect respondents’ lack of effort, etc.
- A difference between web surveys and traditional surveys can be the focus on (audio-)visual communication. The dynamic and graphical nature of the web has enabled a wide range of measurement tools that were not possible on paper; examples include card sort tasks, interactive maps and verbal information.
- Web surveys also offer possibilities for innovative approaches to surveying; for example, behavioural experiments have made the switch from asking respondents to report on their behaviour (via survey questions) to actually observing respondent behaviour (e.g. using game-enhanced instruments or facial expression devices).
- Such innovative approaches to web instrument design, however, can also lead to a variety of unanticipated effects that reduce the quality of web-based surveys. Researchers are therefore also invited to present empirical evaluations and split-ballot experiments. Only by fully understanding both the benefits and the drawbacks of innovations can we fully exploit the potential of web surveys.
Paper Details

1. Developing a web-administered Event History Calendar: Lessons learned from user-testing
Mr Matt Brown (Centre for Longitudinal Studies - UCL Institute of Education)
Ms Joanna Dardenne (NatCen Social Research)
Previous evidence suggests that the use of Event History Calendars (EHCs) in surveys can increase the quality of reporting of events. However, little information has been published to date regarding the effectiveness of EHCs in a web-based context. This paper will describe the development and testing of an EHC designed for the 8th wave of Next Steps, a longitudinal cohort study whose members are now 25 years of age. Eye-tracking and cognitive interviewing methods were used to explore how respondents interacted with the EHC. The findings will be discussed alongside design implications for future web-based EHCs.
2. New generation of online questionnaires?
Dr Melanie Revilla (RECSM-UPF)
Mr Carlos Ochoa (Netquest)
Mr Albert Turbina (Netquest)
Kapelner and Chandler (2010) proposed a new way of presenting online questionnaires (called “Kapcha”) to “draw additional attention to the instructions and answer choices by ‘fading-in’ the survey’s words”. Going further in this direction, we developed two new forms of questionnaires: one similar to the Kapcha, and a second in which the text appears on the webpage block by block. In both, a set of reading tests is shown at the beginning to find the speed best suited to each individual. Answering becomes more natural for respondents, keeping their focus just on the next task to be done.
3. New Rating Scale Designs Using Dynamic Drag-and-Drop
Ms Tanja Kunz (Darmstadt University of Technology)
In web surveys, new rating scale designs are available beyond grid questions using conventional radio buttons. In this study, two drag-and-drop rating scales are tested that aim to encourage respondents’ thorough processing, in terms of careful consideration of the content of the respective items and repeated verification of the meaning of the response options. Findings revealed respondents’ reduced susceptibility to certain satisficing behaviors frequently associated with rating scales. However, both scales entail higher respondent burden, as indicated by more missing data and longer response times. The use of drag-and-drop scales therefore needs to be carefully assessed.