ESRA 2017 Programme

Tuesday 18th July      Wednesday 19th July      Thursday 20th July      Friday 21st July


Thursday 20th July, 11:00 - 12:30 Room: Q2 AUD2

Adapting online surveys for mobile devices 1

Chair Dr Olga Maslovskaya (University of Southampton)
Coordinator 1 Professor Gabriele Durrant (University of Southampton)
Coordinator 2 Mr Tim Hanson (Kantar Public)

Session Details

The substantial recent increase in levels of ownership and use of mobile devices (particularly smartphones) has been reflected in a rise in the proportion of respondents completing surveys using these devices. For some large social surveys in the UK, for example, between 10% and 20% of respondents now use a smartphone to complete the questionnaire.

This recent shift poses challenges for survey designers, as they seek to enable respondents to complete on their device of choice without any loss of data quality. Solutions to this challenge are varied, and range from minimal adaptation to major overhaul. The latter may include steps to fully optimise the survey layout and presentation for mobile devices, revisions to questionnaire content (e.g. reduced questionnaire length, shorter questions) or alternative completion formats (e.g. splitting surveys into ‘chunks’ that can be completed over a period of time).

For this session we welcome papers on a range of topics relating to adapting surveys for mobile devices, including the following:

• Attempts to produce ‘mobile optimised’ versions of questionnaires
• New question formats that may be better suited to mobile devices (e.g. more interactive)
• Issues with question formats that are known to be problematic on mobile devices (e.g. grids)
• Experimentation to assess the impact of different survey or question formats
• Analysis of data quality indicators that highlights particular issues relating to mobile devices
• Usability testing conducted on mobile devices to identify common issues

We are interested in examples from a range of different types of online survey, including ad hoc studies, tracking projects, longitudinal studies, online panels and mixed mode surveys that include online components. We encourage papers from researchers with a variety of backgrounds and across different sectors, including academia, national statistics and research agencies.

This session aims to foster discussion, knowledge exchange and shared learning among researchers and methodologists around issues related to increased use of mobile devices for survey completion. The format of the session will be designed to encourage interaction and discussion between the presenters and audience.

Paper Details

1. The effect on smartphone participation of switching to a mobile-first design
Dr Christian Bruch (University of Mannheim)
Professor Annelies Blom (University of Mannheim)
Dr Barbara Felderer (University of Mannheim)
Ms Jessica Herzing (University of Mannheim)

Changes in the way people use technology are increasingly affecting the measurement quality of online surveys. In particular, a growing number of respondents answer surveys via smartphones. Because of smartphones' small screen size and touchscreen features, these devices pose a challenge to online questionnaire design. In order to meet these challenges, the German Internet Panel, a probability-based online panel, decided to move from a traditional desktop-optimized design to a mobile-first design, which was first implemented in May 2016.

The aim of this paper is to present the effects of this switch to a mobile-first design, especially with respect to the participation rate via smartphone. For example, does the switch to a mobile-first design lead to a significant increase in smartphone participation? To investigate the impact of this change we use segmented (breakpoint) regressions.
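The idea behind a segmented (breakpoint) regression of this kind can be sketched as follows. This is an illustrative model with invented data, not the authors' actual specification or results: the smartphone participation share is regressed on the survey wave, with a level shift and a slope change allowed at the wave in which the design switch occurred.

```python
# Hypothetical sketch of a segmented (breakpoint) regression with a known
# break at the design switch. All names and data are illustrative.
import numpy as np

def segmented_fit(wave, share, breakpoint):
    """Fit: share ~ intercept + slope*wave + jump*post + slope_change*(wave - bp)*post."""
    wave = np.asarray(wave, dtype=float)
    share = np.asarray(share, dtype=float)
    post = (wave >= breakpoint).astype(float)  # indicator for post-switch waves
    X = np.column_stack([
        np.ones_like(wave),            # intercept
        wave,                          # pre-switch trend
        post,                          # level shift at the breakpoint
        (wave - breakpoint) * post,    # change in slope after the switch
    ])
    coef, *_ = np.linalg.lstsq(X, share, rcond=None)
    return coef  # [intercept, slope, jump, slope_change]

# Toy data: participation share rises faster after a switch at wave 12.
waves = np.arange(1, 25)
shares = 0.05 + 0.002 * waves + np.where(waves >= 12, 0.03 + 0.004 * (waves - 12), 0.0)
coef = segmented_fit(waves, shares, breakpoint=12)
```

A significant `jump` or `slope_change` coefficient would indicate a discontinuity in smartphone participation at the switch, which is what a breakpoint analysis of this design change would test.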

Furthermore, an experiment was conducted in the July 2016 wave of the German Internet Panel in which respondents were randomly divided into three groups and received different invitation e-mails that either did or did not mention the new mobile-first questionnaire design. In this context, the research question of interest is whether explicitly mentioning the mobile-first design is accompanied by a further increase in smartphone participation.
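An analysis of such a three-group experiment could, for instance, compare smartphone-completion rates across the invitation conditions with a chi-square test of independence. The sketch below uses invented counts purely for illustration; the panel's real group sizes, results, and chosen test may differ.

```python
# Hypothetical sketch: chi-square test of independence for smartphone
# participation across three invitation-e-mail groups. Counts are invented.
def chi2_statistic(table):
    """Pearson chi-square statistic for a contingency table (list of rows)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: invitation groups; columns: [smartphone completes, other completes].
table = [
    [120, 380],  # group 1: no mention of the mobile-first design (hypothetical)
    [150, 350],  # group 2: mention of the mobile-first design (hypothetical)
    [160, 340],  # group 3: alternative mention wording (hypothetical)
]
stat = chi2_statistic(table)
# df = (3 - 1) * (2 - 1) = 2; the 5% critical value is 5.991, so a statistic
# above that would indicate group differences in smartphone participation.
```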

2. The journey from Mobile Agnostic to Mobile First
Ms Sue York (The University of Queensland)

The accepted practices for incorporating mobile devices as data collection tools in survey research have been constantly evolving. The initial efforts of survey researchers focused on the best ways to handle what at the time were referred to as “accidental mobile” respondents – respondents who wanted to or tried to complete online (or web) surveys designed for computers using mobile devices. The next step in the evolution was the call for device agnostic survey design – where survey designs functioned across a range of devices.
Currently, the dialogue in the research community calls for “Mobile First” questionnaire design. “Mobile First” questionnaire design advocates building mobile devices into the design from the beginning of the design process, rather than adjusting a design to work on mobile devices in the later stages of the process.
There are a number of challenges for researchers seeking to design “Mobile First” questionnaires. Firstly, there is no generally agreed theoretical or operational definition of what a “Mobile First” questionnaire design is. The second, and perhaps most significant, challenge is changing survey researchers’ mindsets as questionnaire designs are re-envisioned to harness the capabilities of mobile devices and draw on the way mobile devices are used by potential survey participants in their daily lives.
This presentation will bring together the views of some of the key stakeholders in the mobile space, including academic and commercial survey researchers, survey software developers and providers, and online panel providers, and explore what it means to design a “Mobile First” questionnaire. The presentation will also review published literature in the area and suggest an agenda for future research.

3. Adapting a Multi-mode Household Survey for Mobile Devices
Ms Brenda Schafer (IRS)
Mr Pat Langetieg (IRS)
Dr Saurabh Datta (IRS)
Ms Jennifer McNulty (Westat)
Dr Jocelyn Newsome (Westat)
Ms Hanyu Sun (Westat)
Dr Kerry Levin (Westat)

Smartphone and tablet ownership has skyrocketed, with 68% of individuals owning a smartphone and 45% owning a tablet in 2015 (Pew Research Center, 2015). Increasingly, individuals opt to take surveys on mobile devices, with as many as 30% of respondents completing web surveys on a mobile device, although exact rates vary by population and topic (Peterson, 2012; De Bruijne, 2014). When a web survey is not designed to be mobile-friendly, respondents often have difficulty completing the web survey on a mobile device. Varying screen sizes may require horizontal scrolling; matrices may be difficult to navigate; drop-downs may not display properly; and small response boxes may be difficult for fingers to select on a touch screen. These design challenges can frustrate respondents, leading to nonresponse and break-offs (Couper & Peterson, 2016).
Given the rise in mobile device ownership, we incorporated a mobile version into the latest administration of our IRS Individual Taxpayer Burden (ITB) survey. The IRS ITB survey is an annual survey sent to 20,000 individuals that measures the time and money taxpayers spend complying with tax law. The standard version of the survey comprises two critical items that ask about time and money, along with 24 other items that provide context by asking about filing activities. The paper version of the survey is 13 pages long; the web version spans 26 screens. In order to make the survey mobile-friendly, it was shortened to five items: the two items of highest research interest to the IRS—the time and money spent on the tax return—and three additional items that measured stress associated with complying with tax filing regulations. In addition, the time and money items were shortened substantially. In the standard survey, respondents are given detailed instructions about what activities and costs to include or exclude, with lengthy bulleted inclusion and exclusion lists. These instructions were dropped from the mobile version to optimize for smaller screens. In addition, the stress items are typically presented in a matrix format; for the mobile version, each sub-item was asked as a separate question.
This mobile version of the survey was sent to a sample of 2,000 respondents and offered in English and Spanish. Respondents were able to access the survey by scanning a QR code or by typing in the URL provided in the instruction card they received in the mail. Because we only had mailing addresses for the selected respondents, all contact materials were sent by mail. In this paper, we examine the impact of a shortened survey on data quality. We will also examine the mobile survey’s response rates, particularly from populations that have been historically underrepresented in the ITB survey (e.g., younger adults and low income respondents), as well as any geographical differences in response. We will also look at which devices (i.e., laptop, tablet, or smartphone) respondents utilized when completing the survey. Finally, we will discuss respondents’ preferred method for entering the survey (QR code versus URL).

4. National Travel Survey digital diary development
Ms Katriina Lepanjuuri (NatCen Social Research)

Today many household surveys are exploring the possibility of going online. While this is a very exciting development and could contribute to higher response rates, there are many aspects that need to be taken into consideration, and proper testing and development need to be an important stage of this process.

The National Travel Survey (NTS) is a household survey which collects detailed personal travel behaviour data, including trip mode and purpose, using paper diaries covering 7 consecutive days, starting on a specified day. Approximately 16,000 individuals, in 7,000 households in England, participate in the survey every year. It covers travel by people in all age groups, including children. It has been running for over 50 years and is a major source of information for the Department for Transport, academics and many other public bodies.

The advent of online data collection, both in survey research and for government services, has led the NTS team to develop a web prototype of an online diary. The long-term aim for the study is to ensure that the data collected will meet the needs of a wide range of users in an efficient and cost-effective way, whilst maintaining accuracy and, as far as possible, comparability with the existing approach (so that time series of results are consistent).

The website development took place in the summer and autumn of 2016. It has been an iterative process, driven by the aim of developing a tool specifically optimised for mobile devices, and has included rounds of user-testing and cognitive interviewing. The paper will present the key considerations in relation to the website design. It will also discuss the challenges surrounding the project and findings from the first stage of the development process, and reflect on the next stages of work and the challenges ahead in terms of survey processes, data comparability and mode effects.