ESRA 2017 Programme

Tuesday 18th July      Wednesday 19th July      Thursday 20th July      Friday 21st July



Friday 21st July, 09:00 - 10:30 Room: Q2 AUD2

Mixed-Device Surveys and Device Preference

Chair: Dr Marieke Haan (Utrecht University)
Coordinator 1: Dr Peter Lugtig (Utrecht University)
Coordinator 2: Dr Vera Toepoel (Utrecht University)
Coordinator 3: Dr Olga Maslovskaya (University of Southampton)
Coordinator 4: Professor Gabriele Durrant (University of Southampton)
Coordinator 5: Mr Tim Hanson (TNS BMRB)

Session Details

We live in a digital age with widespread use of technologies in everyday life. Technologies change very rapidly and affect all aspects of life, including surveys and their designs. Online data collection is now common in many countries. Online surveys are not only completed on PCs/Laptops but also on other devices such as tablets and smartphones (i.e., mobile devices). In this session we would like to address questions related to online device use and device preference for completing surveys.

Surveys can be completed on one or even multiple devices. For example, in a panel survey respondents may combine devices between waves but also within waves. Studies of traditional mixed-mode surveys have shown that mode preferences exist and that age, frequency of internet use, prior survey participation, and education are consistent predictors of mode preference. The possibility of using multiple devices for an online questionnaire raises the question of whether there is a device preference in the mixed-device online survey context too. Recent studies have shown that specific respondents are more inclined to use certain devices. Knowing more about a respondent’s actual device preference may be helpful with regard to that respondent’s future survey participation. Survey practice can perhaps cater to the device preference of survey respondents using adaptive or responsive designs. This session welcomes submissions of papers on different aspects of mixed-device online surveys in both cross-sectional and longitudinal contexts. Topics may include, but are not restricted to, the following areas:

• Coverage issues in mixed-device online surveys
• Data quality issues in mixed-device online surveys, including item and unit nonresponse, breakoffs and completion times
• Optimisation of surveys and adaptation of question design for online surveys
• Impact of different question designs or presentations on responses across devices
• Comparison of different types of mobile devices, different operating systems and screen sizes
• Use of different devices over time in panel studies
• Device preference

We encourage papers from researchers with a variety of backgrounds and across different sectors, including academia, national statistics and research agencies. We particularly welcome contributions that use experimental designs, and/or other designs that can inform future strategies towards mixed-device surveys.

Paper Details

1. The mobile Web only population – socio-demographic characteristics and potential bias
Mrs Anke Metzler (Darmstadt University of Technology)
Mr Marek Fuchs (Darmstadt University of Technology)

In recent years the use of mobile phones and tablet PCs to complete Web surveys has grown steadily, so survey researchers face new challenges when designing Web survey questionnaires. Among other things, mobile devices differ from desktop PCs and notebooks in their data input method and screen size. Thus, Web surveys are more burdensome for respondents using mobile devices than for respondents using a desktop or notebook. Respondents using a mobile phone to complete a Web survey (as compared to a tablet) experience particularly pronounced problems when questionnaires are not optimized for mobile devices.

Since most Web survey respondents still have a desktop PC or notebook available to access the internet, optimizing questionnaires for mobile devices may not be seen as a necessity, also because multiple questionnaire versions for different devices may induce differential measurement error. Even though the range of devices available to access the Internet has increased, there is also a growing number of people who cannot rely on any device other than their mobile phone for internet access.

In this paper we assess the so-called mobile Web only population that no longer has access to the internet via a desktop PC or notebook. It is assumed that this group is most likely underrepresented in Web surveys not optimized for mobile devices due to the additional response burden, resulting in a potential bias. Using Eurobarometer data from 2012 to 2014 across 27 European countries, we estimated the size and the socio-demographic properties of the mobile Web only population.

Results indicate that the percentage of mobile Web onlys among the Web population increased from 1 percent in 2012 to 4 percent in 2014 across Europe. A more detailed analysis indicated that, compared to the Web population, the mobile Web only population is more likely to be female, younger, more often single, less educated, and less likely to live in rural communities and in multi-person households. Overall, the results suggest that Web survey questionnaires should employ an adaptive design approach that accommodates all devices, since exclusion or underrepresentation of the mobile Web onlys causes bias.


2. Device use and effects of screen size on data quality in a cross-sectional probability-based online survey in Spain
Ms Sara Pasadas del Amo (Institute for Advanced Social Studies, Spanish National Research Council (IESA/CSIC))
Mr Juan Antonio Domínguez Álvarez (Institute for Advanced Social Studies, Spanish National Research Council (IESA/CSIC))
Ms Mónica Mendez Lago (Center for Sociological Research (CIS))
Mr Manuel Trujillo Carmona (Institute for Advanced Social Studies, Spanish National Research Council (IESA/CSIC))

One of the effects of the rapid technological change we have witnessed over the last few decades is that we no longer have complete control over the context, the display and other features of the survey situation. The use of smartphones and other mobile devices is spreading in most countries, and so is the application of these technologies to a wider range of tasks, including how respondents complete online surveys.

The multiplication in the number of devices that can be used to respond to an online survey, and their different features (screen size, touchscreen versus keyboard, survey setting, etc.), has raised concern about the quality of the data obtained. There is a growing body of research dealing with the effects of mixed-device use on nonresponse and measurement error in online surveys (Bruijne and Wijnant 2014; Callegaro 2013; Couper and Peterson 2016; Lugtig and Toepoel 2016; Mavletova 2013; Poggio, Bosnjak, and Weyandt 2015; Struminskaya, Weyandt, and Bosnjak 2015; Toninelli and Revilla 2016; Wells, Bailey, and Link 2014).

Following the approach used by Lugtig and Toepoel (2015) and Struminskaya (2015), our paper has two objectives. First, we describe device use in Spain (type of device and, for mobile devices, screen size) and the individual and socioeconomic factors that explain differences in these variables. Second, we analyse measurement errors between different devices.

Our data stem from a cross-sectional online survey addressed to a full probability sample of the resident population in Spain, achieved through an offline push2web procedure. Invitation letters with a URL and individual passwords were sent to a sample of 4,500 names drawn from the Population Register. To reduce nonresponse, a postal reminder was sent to reluctant respondents three weeks after the first mailing, and respondents were paid 5€ per completion (they could opt to receive it or donate it to a charity). The resulting response rate was 39% (RR1). Of the respondents, 73.4% used a PC or laptop computer, 11.8% tablets, and 14.7% smartphones (2.7% of these were phablets) to complete the interview. For mobile devices, screen size ranged from 3.14 to 12.9 inches (mean 6.89, SD 2.43).


3. What do we know about mixed-device online surveys and mobile device use in the UK?
Dr Olga Maslovskaya (University of Southampton)
Professor Gabriele Durrant (University of Southampton)
Professor Peter Smith (University of Southampton)

We live in a digital age with a high level of technology use. Surveys have also started adopting technologies, including mobile devices, for data collection. There is a move towards online data collection in the UK, including the plan to collect 75% of household responses through the online mode in the UK 2021 Census. However, evidence is needed to demonstrate that the online data collection strategy will work in the UK and to understand how to make it work effectively. No research has been conducted so far in the UK to address respondents’ online choice of device or behaviour in mixed-device online surveys. This paper is timely and will fill this gap in knowledge. It aims to study survey participants’ online choices of device and online survey behaviour. The project also aims to explore differences across the range of devices used by respondents in online surveys in the UK (desktops (PCs), laptops, tablets and mobile phones (smartphones)), with a special focus on mobile devices.
This analysis uses all UK social surveys with an online component that are publicly available so far (Understanding Society Innovation Panel, Community Life Survey, European Social Survey, 1958 National Child Development Study, Second Longitudinal Study of Young People in England). Descriptive analysis and multinomial logistic regressions (where possible) are used to study significant correlates of different device use in mixed-device online surveys. Distributions of device use by demographic and socio-economic characteristics, as well as significant correlates of device use, will be presented. Comparisons to other countries (the Netherlands, Germany, Spain and the US) will be drawn.
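The multinomial logistic regression approach mentioned above can be sketched as follows. This is a minimal, hypothetical illustration: the data are simulated, and the predictors (age, sex, education) are only assumed examples of the demographic correlates such an analysis might use, not variables or results from the surveys listed.

```python
# Hypothetical sketch: multinomial logistic regression of device choice
# on demographic correlates. All data are simulated; the predictors are
# assumed examples, not variables from the actual surveys.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Simulated respondent characteristics (illustrative predictors)
age = rng.integers(16, 80, n)
female = rng.integers(0, 2, n)
degree = rng.integers(0, 2, n)
X = np.column_stack([age, female, degree])

# Simulated device choice: 0 = PC/laptop, 1 = tablet, 2 = smartphone,
# with younger respondents made more likely to pick a smartphone
logits = np.column_stack([0.03 * age, np.zeros(n), 2.0 - 0.04 * age])
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
device = np.array([rng.choice(3, p=p) for p in probs])

# With more than two classes, the default lbfgs solver fits a
# multinomial model: one coefficient row per device category
model = LogisticRegression(max_iter=1000)
model.fit(X, device)
print(model.coef_.shape)
```

In an analysis of this kind, each coefficient row would be interpreted as the change in the log-odds of choosing that device, per unit change in the predictor, relative to a baseline category such as PC/laptop.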

The originality of the analysis lies in addressing the under-researched area of device use, including mobile device use, in mixed-device online surveys in the UK. The findings will be instrumental in better understanding trends in device use and response behaviour in mixed-device online surveys in the UK generally and, specifically, in informing best practice for the UK Census 2021. Knowledge about the characteristics of respondents who choose different devices in online surveys in the UK can help target certain groups more effectively. It can also help improve survey design and response rates, as well as reduce survey costs and effort. This analysis lays the foundations for future analysis of data quality issues in mixed-device online surveys in the UK.
The paper is proposed as part of a UK-based research project, the National Centre for Research Methods Research Workpackage 1 ‘Data Collection for Data Quality’, which is funded by the UK Economic and Social Research Council (ESRC) and led by a team from the University of Southampton. The project investigates, amongst other topics, mobile device use in online surveys.


4. A Model for Device Use in Online Panel Surveys
Dr Marieke Haan (Utrecht University)
Dr Peter Lugtig (Utrecht University)
Dr Vera Toepoel (Utrecht University)

Nowadays online surveys can be completed on several devices: desktop PC, laptop, tablet, and smartphone. We investigate why people choose a specific device for survey participation. For this study we used the GESIS online panel survey. GESIS is a probability-based mixed-mode access panel with more than 4,800 panelists from the German-speaking population. Several questions on device use were implemented in a wave of the GESIS panel in December 2014. These items covered topics such as device ownership, device preference for survey participation, device access, and possibilities to use the device in combination with the Internet. The survey also included items on, for example, privacy, as well as various socio-demographic questions. Panel members could state their device preference (PC/laptop, tablet, or smartphone). Additionally, they could select a paper-and-pencil questionnaire if that was their preferred participation mode.
Our first analyses show that 94% of the panel members own at least one device: 23% own only a PC/laptop, 33% own a PC/laptop and a smartphone, and 30% own all devices. Six percent of the respondents did not own any of the devices. We also found that over 85% of the respondents participated using their preferred device. Based on these results, we are interested in the underlying reasons for device use in online surveys. First, a theoretical model is developed in which we identify possible motives for device use. Second, we use multiple items from the GESIS questionnaire to test our model. These items relate to: opportunities for respondents to use a device, their norms (e.g. opinions on privacy), device preferences, and socio-demographic information. The findings of this study have important implications for survey research. By developing a model, it is possible to identify predictors of device use. Furthermore, information about device use can be very useful for targeting panel members in future waves.


5. Does mobile web survey completion affect response quality amongst young people? Evidence from the second Longitudinal Study of Young People in England (LSYPE2)
Mrs Emily Bell (Kantar Public)
Mr Peter Matthews (Kantar Public)
Mr Alexander Wenz (University of Essex)
Mr Tim Hanson (Kantar Public)

The rising use of mobile devices to access the internet, and in particular the small but increasing group of people who rely solely on mobile devices for internet access, brings with it implications for social research. In the UK, up to a quarter of respondents to some large-scale social surveys now choose to respond using a smartphone, and this is no longer something researchers can ignore.

This is a particular issue for research that involves young people. Recent data from Ofcom suggests that use of smartphones is highest amongst young people, with those aged 16-24 spending an average of 3.6 hours a day using their smartphones.

The evidence in relation to the impact of using mobile devices for survey completion on response quality is mixed. While most studies suggest that mobile device completion increases break-off rates and survey completion time, there is less conclusive evidence on the existence of a consistent effect on response quality and measurement error.

This paper will present evidence of the effect of mobile web survey completion on response quality from the second Longitudinal Study of Young People in England (LSYPE2), Wave 4. LSYPE2 is a longitudinal survey following a sample of young people from the age of 13/14 to 19/20. The first three waves used face-to-face data collection. At wave 4, when the respondents were aged 16/17, a sequential mixed-mode design (web, then telephone, then face-to-face) was introduced. 789 participants used a smartphone to complete the survey at wave 4 (27% of those who chose to complete the survey online).

Qualitative evidence from usability testing prior to fieldwork suggested that young people could easily complete the survey on smartphones without experiencing problems; however, quantitative evidence is needed to determine whether mobile device completion has an impact on response quality.

This paper will add to the evidence base by examining the effects of device on response quality amongst a cohort of young people who have grown up in the age of smartphone use. It will examine the following questions:

- What is the socio-demographic profile of those using mobile devices to complete the survey and what predicts device choice?
- What is the impact of device type on the survey break-off rate?
- What impact does device have on completion time?
- Does device type impact on the way cohort members respond to particular types of questions?
- What is the impact of device type on data quality indicators, including item non-response, primacy effects, and straight-lining?
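Two of the data quality indicators listed above, straight-lining and item non-response, can be computed as simple per-respondent summaries over a grid of items. The sketch below uses invented data for a hypothetical five-item attitude grid; the column names and values are illustrative only, not from LSYPE2.

```python
# Hypothetical sketch of two data-quality indicators: straight-lining
# (identical answers across every item in a grid) and item non-response
# (share of items skipped). Data and column names are invented.
import numpy as np
import pandas as pd

# Simulated answers to a 5-item attitude grid (1-5 scale; NaN = skipped)
grid = pd.DataFrame({
    "q1": [3, 5, np.nan, 2, 4],
    "q2": [3, 5, 1, 2, 4],
    "q3": [3, 5, 1, np.nan, 4],
    "q4": [3, 4, 1, 2, 4],
    "q5": [3, 5, 1, 2, 4],
})

# Straight-lining: the respondent gave the same answer to every item
# (only counted when all items were actually answered)
straightlined = grid.nunique(axis=1).eq(1) & grid.notna().all(axis=1)

# Item non-response: share of grid items left unanswered per respondent
item_nonresponse = grid.isna().mean(axis=1)

print(straightlined.tolist())       # [True, False, False, False, True]
print(item_nonresponse.tolist())    # [0.0, 0.0, 0.2, 0.2, 0.0]
```

In a device comparison of the kind proposed, these per-respondent indicators would then be summarised or modelled by device type (smartphone, tablet, PC/laptop).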

Findings from this research will contribute to the question of how survey methodologists can cater to device preference at the same time as ensuring high quality data.