ESRA 2019 Draft Programme at a Glance
Mobile-first survey design: going beyond mobile optimization in web surveys 2
|Session Organiser|Ms Laura Wronski (SurveyMonkey)|
|Time|Wednesday 17th July, 16:30 - 17:30|
People use their smartphones for everything: email, banking, dating, ordering a coffee, hailing a cab… and taking surveys. While the prevalence of smartphones presents many opportunities for web survey creators—who can now reach respondents on their phones 24 hours a day, seven days a week, through several different means—it also raises the bar for respondents’ expectations when taking surveys on their phones. Accustomed to being delighted by the apps and mobile sites with which they regularly interact, survey takers now expect to participate in mobile web surveys with the same ease—and even enjoyment—with which they do everything else on their smartphones.
High-quality web surveys, therefore, must be designed with an eye toward not just mobile optimization but a mobile-first mindset. This session will gather research on innovative techniques and tools for web survey creators, particularly as they relate to survey design.
Topics may include:
- New recruitment methods, e.g. QR codes, geolocation-based surveys
- New question types, e.g. left/right swiping, sliding scales, voice response, photo upload
- New techniques to improve response rates, e.g. gamification, alerts/notifications
Keywords: mobile optimization, web surveys, innovation, survey design
Too Old for Games? Investigating the appeal of gamification by age
Mr William Giraldo (Nielsen Company) - Presenting Author
Mrs Malgorzata Siarkiewicz (Nielsen Company)
Mr Jeff Scagnelli (Nielsen Company)
The global growth of smartphone ownership, along with the usage of mobile apps, has provided an attractive platform for gathering information from respondents in an efficient way. To best leverage this approach, it is critical to support longitudinal engagement so that greater insights about individuals and households can be developed. With all of the potential distractions present during a mobile usage session, there is a need to develop engagement models with additional appeal beyond pure research purposes. Game mechanics provide a framework for adding motivational concepts to a data collection process. The effectiveness of this has been explored in different studies; however, the relationship between respondent age and the impact of gamification is less clear. It is intuitive that gamification should be effective in engaging younger people; however, it is important to understand its impact on older respondents, who traditionally participate in research. This study evaluates the effectiveness of gamification models across different age and demographic groups. The study is based on data from a crowdsourcing app where users report their purchases by scanning receipts in exchange for virtual currency. The comparisons are done using survival analysis and frequentist statistical methods to understand differences in user retention and engagement.
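As a rough illustration of the kind of retention comparison the abstract describes — not the authors' actual analysis, and using entirely hypothetical data — a minimal Kaplan-Meier survival estimate by age group might be sketched as follows:

```python
# Minimal sketch (not the study's code): comparing user retention across
# age groups with a Kaplan-Meier survival estimate. Each user is a pair
# (days_active, churned), where churned is 1 if the user dropped out and
# 0 if they were still active when observation ended (right-censored).

def kaplan_meier(durations, churned):
    """Return (time, survival probability) pairs for one group."""
    events = sorted(zip(durations, churned))
    at_risk = len(events)
    surv = 1.0
    curve = []
    i = 0
    while i < len(events):
        t = events[i][0]
        # all users tied at time t: how many churned, how many leave the risk set
        deaths = sum(1 for d, c in events if d == t and c == 1)
        removed = sum(1 for d, c in events if d == t)
        if deaths:
            surv *= (1 - deaths / at_risk)
            curve.append((t, surv))
        at_risk -= removed
        i += removed  # skip past the tied users
    return curve

# Hypothetical panel data, split by an assumed age cutoff
younger = [(5, 1), (12, 1), (20, 0), (30, 1), (45, 0)]
older = [(10, 1), (25, 0), (40, 1), (60, 0), (90, 0)]

for name, group in [("<35", younger), ("35+", older)]:
    durations = [d for d, _ in group]
    flags = [c for _, c in group]
    print(name, kaplan_meier(durations, flags))
```

In practice a library such as lifelines would be used, with a log-rank test to compare the group curves; the point here is only the shape of the comparison.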
The Power of Referrals: Investigating the benefits of mobile referral recruitment
Ms Malgorzata Siarkiewicz (Nielsen) - Presenting Author
Mr William Giraldo (Nielsen)
Mr Jeff Scagnelli (Nielsen)
While the definition of relationships seems to evolve over time, one aspect holds: acquaintances are important when it comes to information sharing. Nowadays, better access to information results in an overwhelming amount of data, making recommendations even more important. Referral marketing is the method of promoting products or services to new customers through referrals, usually word of mouth. Such referrals often happen spontaneously, but businesses can influence them through appropriate strategies. Numerous studies have shown that customers acquired through a referral program exhibit higher margins and lower churn than customers acquired through other means.
This study explores recruitment for a crowdsourcing app where users report their purchases by scanning receipts in exchange for virtual currency. We will present a comparison of referral segments distinguished by user recruitment source (referral, non-referral) and user behavior (referring, non-referring). Another area of investigation is the effectiveness of second-hand referrals, to understand how referring behavior propagates to subsequent users.
Using a Respondent-Centred design model to aid cognition in surveys
Miss Victoria Cummings (Office for National Statistics) - Presenting Author
The Tourangeau, Rips and Rasinski (2000) model applies psychological learning to improve the questionnaire evaluation process by mapping the cognitive steps respondents take. In the same way, the introduction of online self-complete modes means that learning from the sphere of UX testing has also become relevant, as we know that interaction with the tool will affect these cognitive steps.
The Office for National Statistics (ONS) has accepted that a shift towards greater use of administrative sources and an online-first mixed-mode collection will lead to breaks in time series data. ONS has recognised that this represents a unique opportunity to transform our surveys with a blank-page approach: we retain only the concept that data users are trying to measure. We then radically redesign the questionnaires with a respondent-centric, online smartphone-first approach.
We conduct research to understand respondent mental models (or schemas) around the wider topic, survey completion, and online reading behaviours in general, and design our surveys to work with the respondent rather than against them. For instance, the current UK Labour Force Survey process asks all main job questions before moving to a repetitive set for second jobs. However, changes to the flow that allow ‘leap-frogging’ of main and second job questions collect better quality data by facilitating comprehension, retrieval, judgement and response.
Our design approach moves away from unimode design. We optimise our design for each mode: we believe that if the concepts are understood and answered in the same way, then question wording can differ across modes, and this will improve data quality rather than reduce it.
In this talk we will share our practical examples and recommendations for designing in this way. Another ONS presentation will cover the data from a large-scale random probability sample survey that validates our approach.