All time references are in CEST
Sources of resistance to digital data collection: Privacy concerns, digital trust and data literacy
|Session Organiser||Dr Caroline Roberts (University of Lausanne)|
|Time||Friday 21 July, 09:00 - 10:30|
Access to high-quality data on which to base policy decisions and track their impact is an essential component of democratic society. For decades, survey research has been the principal method by which objective and accurate statistics have been produced to inform decision-making, but there is growing demand to turn to alternative data sources (in particular, digital behavioural data). The Covid-19 pandemic had a fundamental impact on the social data collection landscape, revealing weaknesses in existing methods and the data produced, and accelerating innovation in research designs and data collection practice. Among the innovations offering significant potential for improving the future responsiveness of survey research is the use of digital data collection methods - in particular, smartphone apps that provide a platform for multimodal, in-the-moment data collection and for linking data from multiple sources. Furthermore, research designs that combine data from probability-based sample surveys with mass collaboration (e.g. citizen science projects relying on digital data collection methods) and other data sources represent one of the most powerful potential applications of scientific surveys for future social research. Many barriers to the successful implementation of such innovations remain, however. Among these, public resistance to participating in research using digital data collection methods represents a major challenge because of its implications for the accuracy and representativeness of the data collected. Key sources of resistance to digital data collection methods include concerns about data privacy, and a lack of digital trust and of digital, data and privacy literacy. Methodological research is needed to improve our understanding of when and how these different sources of resistance affect actual participation decisions, and to test alternative ways to address them.
This session welcomes submissions focused on these concerns.
Keywords: app surveys, smart surveys, privacy, trust, literacy
Dr Caroline Roberts (University of Lausanne) - Presenting Author
Mr Marc Asensio Manjon (University of Lausanne)
Mr Nicolas Pekari (FORS)
Public willingness to participate in research on smartphones remains one of the most important barriers to incorporating digital data collection methods in probability-based sample surveys of the general population. While the proportion of web survey respondents completing questionnaires via a mobile browser continues to grow, when asked about hypothetical willingness to agree to smartphone sensor data collection, resistance remains high – particularly when it comes to so-called ‘passive’ data capture. Two related factors that have emerged in numerous studies as key correlates of stated unwillingness to agree to smartphone sensor data capture are concerns about data privacy and digital trust. Less is known, however, about how closely hypothetical willingness maps onto actual participation, and hence how important privacy concerns and digital distrust really are in specific participation decisions. In this paper, we present results from a probability-based online panel survey designed to assess the impact of data privacy concerns and digital trust on both hypothetical and actual willingness to complete different types of digital data collection. Through embedded methodological experiments, we also tested alternative methods aimed at reducing or offsetting the potential negative impact of privacy concerns and distrust on survey participation decisions, including providing information designed to reassure sample members about data confidentiality and security, and offering different amounts of monetary incentive for completing data tasks. Drawing on the Leverage-Salience Theory of survey nonresponse, we assume that different methods will affect participants differently, depending on how important their prior levels of concern about data privacy and digital distrust are for their decision to complete digital data collection tasks on their smartphone.
We examine these interacting influences on participation decisions using the example of a data donation task in which panellists were asked to share data from the digital wellbeing/ screentime functions of their smartphones.
Dr Hayk Gyuzalyan (Highgate Consultancy) - Presenting Author
The Chief Digital Office of UNDP has been conducting Digital Readiness Assessments (DRAs) in individual countries since 2020. The assessment's objective is to measure a country's progress on the path of digital transformation across 20+ key aspects, such as infrastructure and cultural norms. The Assessments employ a range of methods: a survey among key informants, a data dashboard of publicly available data related to different aspects of digital transformation, a citizen survey, and a document review. Each component has its own methodological challenges, including maximising the validity of the collection instruments, ensuring cross-country equivalence of the measurement tools, treating missing values in the data dashboard, interpreting country scores and rankings, and converting numerical data into unique DRA scores. The paper will look into the individual components of the DRA, their particular challenges, and how the team resolves or plans to resolve them.
Ms Caroline Winkler (Institute for Transport Planning and Systems, ETH Zurich) - Presenting Author
Mr Adrian Meister (Institute for Transport Planning and Systems, ETH Zurich)
Dr Basil Schmid (Institute for Transport Planning and Systems, ETH Zurich)
Professor Kay W. Axhausen (Institute for Transport Planning and Systems, ETH Zurich)
TimeUse+ is a study run in Switzerland between mid-2022 and early 2023 that collected passive GPS tracks from participants over the course of four weeks. Participants annotated these objective data with subjective accounts of what occurred at each location or during each trip: activities and their duration, whether a friend or household member was present, and whether they spent any money. Over 63,000 individuals were invited to participate in the study, 10.6% of whom were curious enough to begin the initial online questionnaire. The net response rate for the entire study of two questionnaires and a four-week tracking period lies just over 2.2%, a rate that seems drastically low but is equivalent to that of much shorter, simpler, and in turn less burdensome app-based travel diaries. This work presents response and attrition rates at different points during the TimeUse+ study. We illustrate the reasons for disinterest reported in the initial questionnaire, that is, concerns that lead to an unwillingness to participate. In addition, we uncover reasons for drop-out among participants who were interested and started tracking but did not successfully complete the study duration and validation requirements, as revealed in emails sent to the research team.
Dr Shujun Liu (Cardiff University) - Presenting Author
Dr Luke Sloan (Cardiff University)
Dr Tarek Al Baghal (University of Essex)
Mr Curtis Jessop (National Center for Social Research (NatCen))
Dr Paulo Serôdio (University of Essex)
Dr Matthew Williams (Cardiff University)
Linking survey data with social media data has become popular over the past few years. However, survey respondents’ willingness to provide social media data is limited. Previous research discovered that respondents’ willingness varies depending on the survey mode and certain sociodemographic characteristics of respondents. Less explored, but potentially more important, are respondents’ privacy concerns. We leverage data from the 10th wave of the Understanding Society Innovation Panel (n = 486), a nationally representative survey of the United Kingdom, which measures privacy concerns using multiple variables. We extract a common factor of privacy concerns and propose that respondents’ a) activity variety, b) self-reported efficacy and c) frequency of use with mobile technology could decrease their privacy concerns, which in turn influence their consent decision. Potential moderating effects of gender, education and age on the relationships between mobile technology use, privacy concerns and the consent decision are also explored. The results of Structural Equation Modelling (SEM) suggest that respondents’ activity variety, self-reported efficacy and frequency of use with mobile technology are significantly associated with respondents’ privacy concerns, which subsequently predict their consent decision. In addition, gender is found to moderate the relationship between activity variety with mobile technology and the consent decision: women are more likely to provide their Twitter account if their activity variety with mobile technology is greater. The findings aid us in better understanding the underlying factors affecting respondents’ decision to consent to data linkage, and in developing further techniques to address respondents’ potential privacy concerns.
Dr Michèle Ernst Stähli (FORS (Swiss Centre of Expertise in the Social Sciences)) - Presenting Author
Mr Alexandre Pollien (FORS (Swiss Centre of Expertise in the Social Sciences))
Dr Michael Ochsner (FORS (Swiss Centre of Expertise in the Social Sciences))
Unconditional cash incentives are known in the literature and in survey practice as the most efficient way to incentivize survey participation, whatever the mode of interview. In some conditions, checks can come close to cash in terms of their effect on response rates while avoiding some of cash's drawbacks. However, checks are dead or dying in most countries, and cash is not a viable option for many countries, institutions or modes of invitation. With the rise of electronic payment solutions, the idea of using digital cash as an incentive is straightforward. Moreover, it could also be applied easily in the case of electronic invitations. In Switzerland, a large share of the resident population is equipped with a national electronic payment app called TWINT that also allows users to receive money. We conducted two incentive experiments including TWINT as an unconditional incentive. The first experiment took place in 2020, just as the COVID pandemic started, and the second in 2022, after the digitalization boost induced by the pandemic. The survey, called MOSAiCH, is cross-sectional with a probability-based sample of the general population, administered by web followed by a paper-and-pencil questionnaire for reluctant sample units (push-to-web). The data allow us to investigate reasons for acceptance and resistance related to behaviours and attitudes in the fields of payments, internet use, trust and privacy concerns, among others, and to check for sociodemographic bias in the whole sample. We will show whether electronic cash incentives are effective for push-to-web surveys and which kinds of populations are sensitive, or not, to this kind of motivation.