Friday 17th July, 09:00 - 10:30 Room: O-201


New sources of data for survey research: challenges and opportunities 1

Convenor Mr Arnaud Wijnant (CentERdata – Tilburg University)

Session Details

We live in a rapidly changing world in which people use smartphones all the time, are on Facebook, and tweet about what they like – a world of the ‘internet of things’ in which more and more data are available. All these data could potentially be a new source of information for survey researchers. However, this implies another way of collecting and analysing data. Moreover, how can these new sources of data help survey researchers?

Smartphones and tablets, for example, offer new opportunities for collecting ‘passive’ data which can provide insight into how individuals use smartphones; GPS information tells us more about the exact location of the respondent and can track their patterns of mobility. Pop-up questions allow us to ask people at random intervals how they feel at that moment (experience sampling). Other options are to ask respondents to take photos or videos and to scan the barcodes of what they bought. Furthermore, all these kinds of data can be combined (with traditional surveys) to give a full overview of the respondent’s behaviour and well-being.
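The experience-sampling idea described above – prompting respondents at random moments during the day – can be sketched in a few lines. This is an illustrative scheduler, not any specific study's protocol; the waking window and minimum gap are invented parameters.

```python
import random

def draw_prompt_times(n_prompts, start_hour=9, end_hour=21,
                      min_gap_minutes=30, seed=None):
    """Draw n_prompts random minutes-of-day within a waking window,
    rejecting draws whose prompts fall closer together than min_gap_minutes."""
    rng = random.Random(seed)
    window = list(range(start_hour * 60, end_hour * 60))
    while True:
        times = sorted(rng.sample(window, n_prompts))
        gaps = [b - a for a, b in zip(times, times[1:])]
        if all(g >= min_gap_minutes for g in gaps):
            # Format minutes-of-day as HH:MM strings for the prompt schedule.
            return [f"{t // 60:02d}:{t % 60:02d}" for t in times]

# One day's schedule of five pop-up prompts between 09:00 and 21:00.
schedule = draw_prompt_times(5, seed=42)
```

A fresh schedule would be drawn per respondent per day, so prompts stay unpredictable while never clustering too tightly.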

Social media offer new ways of collecting information on people without asking questions. How can Twitter and other social media help to improve surveys or give a deeper understanding of people’s opinions or behaviour?

In this session we want to examine best practices and new ways of collecting and analysing data that can complement survey research. We welcome contributions which report on the use of smartphones, social media and other new sources of data in their research design, but are not limited to these.

Paper Details

1. Mobile devices in web surveys: How much difference do they make?
Mrs Inna Becher (LINK Institute for Market and Social Research, Zurich)

Responding to open-ended questions is more time-consuming than responding to closed questions. On mobile devices, it is plausible to assume that open-ended questions are skipped more often, or that answers are shorter than if the survey were completed on a computer.
Using survey data, we demonstrate that there are fewer differences linked to the participation channel than expected. We find no evidence that open-ended questions are skipped more often. However, the answers provided on a mobile device are significantly shorter. Furthermore, there are no differences concerning the use of “don’t know” options.


2. Using Wikipedia Page View Statistics to Measure Issue Salience
Mr Simon Munzert (University of Konstanz)

The contemporaneous salience of issues in the public is an important concept in the study of politics and public opinion. Standard survey-based measures have well-known drawbacks. Most importantly, conducting surveys is costly, which makes tracking of salience over time difficult and limited to few topics. In this paper, I propose an alternative measure that builds upon Wikipedia page view statistics. I present validation efforts in which I compare time series based on Wikipedia page view statistics with traditional salience measures from political polls as well as figures from Google Trends.


3. How Would We Measure Public Opinion If We Didn't Have Public Opinion Polls? (And Would We Be Better or Worse Off?)
Dr Tom W. Smith (NORC)

What if one wanted to measure public opinion, but public opinion polls did not exist? How would public opinion be assessed? First, how was public opinion measured before the advent of public opinion polls in the 1930s? Second, even in the public opinion polling era, alternatives have been advanced as either substitutes for or supplements to public opinion polls. A case study of Hong Kong in 2014 illustrates the possibilities. Finally, in the alleged post-public opinion polling period, Internet/social media/Big Data alternatives to public opinion polls are being advocated. Advantages and disadvantages of public opinion polls and their alternatives are considered.



4. The feasibility of collecting objective physical activity data from 14-year olds on the sixth sweep of the Millennium Cohort Study
Ms Emily Gilbert (Centre for Longitudinal Studies, UCL Institute of Education)
Ms Anne Conolly (Ipsos MORI)
Ms Lisa Calderwood (Centre for Longitudinal Studies, UCL Institute of Education)

Measuring physical activity presents challenges. Most large-scale studies use self-reports to measure physical activity, which are subject to bias. Devices that measure activity directly offer a solution.

The sixth sweep of the Millennium Cohort Study collects data from study members when they are 14 years old and includes data collection using activity monitors.

This paper details two pilot studies conducted before the survey, which assessed the feasibility of collecting activity monitor data from 14-year olds and compared two different devices. The findings have informed the data collection approach taken, and provide insight into the implementation of activity monitors in large-scale studies.