ESRA 2019 Draft Programme at a Glance


Cognition in Surveys 1

Session Organisers: Dr Naomi Kamoen (Tilburg University)
Dr Bregje Holleman (Utrecht University)
Time: Tuesday 16th July, 11:00 - 12:30
Room D25

In recent years, various models have been proposed to describe the cognitive processes underlying question answering in standardized surveys, such as the model by Tourangeau, Rips and Rasinski (2000). This model distinguishes four stages in question answering: (1) comprehension of the question, (2) retrieval of information, (3) deriving a judgement, and (4) formulating a response. In addition, there are dual-process models, such as the satisficing model proposed by Krosnick (1991). This model distinguishes two groups of respondents: those who satisfice, doing just enough to give a plausible answer, and those who optimize, doing their best to give a good answer.


Cognitive models such as the two described above have many applications. For example, they help in understanding what is measured when administering surveys, and they provide a point of departure for explaining the wide range of method effects survey researchers observe. Cognitive theory in surveys is also used by psychologists, linguists and other scholars to obtain a deeper understanding of, for example, language processing, the nature of attitudes, and memory.
In this first session of Cognition in Surveys, cognition is understood to be 'cold cognition' (i.e., beliefs and reasoning) as well as 'hot cognition' (i.e., emotions), which are also related to personality traits.

Keywords: cognition; question-answering processes; satisficing; emotion

What’s Happening in our Brains? Cognitive Processes and Neuronal Correlates While Answering Questions.

Professor Martin Weichbold (University of Salzburg (Sociology)) - Presenting Author
Professor Dietmar Roehm (University of Salzburg (Neurolinguistics))
Professor Reinhard Bachleitner (University of Salzburg (Sociology))
Professor Wolfgang Aschauer (University of Salzburg (Sociology))

Cognitive models have become a constitutional part of survey theory and practice. Although they offer a useful understanding of cognitive processes during an interview, we have to ask whether these models remain valid in light of recent findings in neuroscience, where advanced imaging methods now allow for more objective insights into real-time cognitive processing.
In an exploratory study, 48 students answered an online survey consisting of 120 items. Half of the items were factual questions, the other half attitudinal, referring to topics of everyday life (ecology, consumption, study, travel, politics, and art). A pretest evaluation was used to distinguish easy from cognitively challenging questions, based on subjective evaluation as well as response latency.
While behavioral studies can track only the outcome of knowledge access and retrieval, we used online measures to track the processing and proceduralization of this knowledge. To capture the fine-grained temporal dynamics of question processing, we used eye-tracking and EEG. To gain a better understanding of the neural substrates (i.e., metabolic activity) of question processing, we recorded functional near-infrared spectroscopy (fNIRS, a non-invasive optical imaging technique for measuring cortical hemodynamic activity) data concurrently with EEG.
First results show differences between the distinguished types of questions, for instance an increase in oxygenated and a decrease in deoxygenated haemoglobin in several brain areas for difficult attitudinal questions compared to easy attitudinal and factual questions. Both effects indicate specific neuronal activity. This is in line with the EEG results, where spectral power analysis in the frequency domain showed a graded theta-band event-related synchronization effect (strongest for difficult attitudinal questions). In our presentation, we show selected results of our study and discuss the consequences of our findings.


Emotion Regulation and Survey Response Quality

Professor Lonna Atkeson (University of New Mexico) - Presenting Author
Professor Cherie Maestas (University of North Carolina Charlotte)
Professor Sara Levens (University of North Carolina Charlotte)

How do emotional motivations influence survey response patterns? Researchers have considered how attentiveness influences the quality of responses, but have not examined underlying factors that might contribute to attentiveness. Specifically, we consider how emotion and emotion regulation influence survey responses. Surveys are often tasked with obtaining opinions about emotion-provoking individuals and events, as well as eliciting information about respondents’ emotions about such targets. We draw upon emotion regulation theory to consider how the different strategies individuals use to cope with upsetting events influence survey responses. For example, the emotion regulation strategy of avoidance leads individuals to avoid or ignore upsetting stimuli, which could result in survey inattentiveness. Conversely, the strategy of cognitive reappraisal may encourage careful thought and greater attentiveness to questions and answer scales; as a result, reappraisers may appear to be more engaged and thoughtful citizens. We examine these processes and their influence on survey response in a national three-wave panel survey that collected data after three horrific shootings in the United States in 2017 and 2018: a partisan-motivated shooting of a member of Congress and two mass shootings (Las Vegas, NV and Parkland, FL). We collected survey measures of general emotion regulation habits using a well-validated psychological scale, and developed novel event-specific emotion regulation scales. We use both to examine their relationship to survey attentiveness, including unit non-response, straight-lining, moderating opinions, and open-ended responses. Because of the panel nature of our design, we can examine this question both cross-sectionally and over time, as respondents both maintain and alter their moods.


Am I Being Neurotic? Personality as a Predictor of Survey Response Styles

Professor Patrick Sturgis (University of Southampton) - Presenting Author
Professor Michael F. Schober (The New School for Social Research)

Survey researchers have long known that some respondents provide poorer quality responses to questionnaires than others, as evidenced through empirical indicators like rates of item nonresponse, Don’t Knows, mid-points, and a lack of differentiation across adjacent questions. The theory of survey satisficing (Krosnick, 1991) has been advanced to explain individual differences in the propensity to manifest these kinds of sub-optimal responses: respondents pursue strategies to minimise the cognitive costs of answering survey questions and, in doing so, provide acceptable rather than optimal answers. Furthermore, the propensity to satisfice is held to be a function of three key features of the response context: the difficulty of the task, and the ability and motivation of the respondent. In short, satisficing is more likely when a question is difficult to answer and the respondent is of lower cognitive ability and weakly motivated (e.g., Roberts and Allum, 2018). In this paper, we explore a potentially important gap in our understanding of the causes of satisficing response styles: personality. We use data from wave 3 of the UK Household Longitudinal Survey (UKHLS), collected 2011-2013, to assess the extent to which individual differences on the Big Five personality inventory (measured using the 15-item BFI-S scale) predict satisficing response styles. The UKHLS also includes measures of motivation (political engagement) and a battery of cognitive ability measures, allowing comparison of the relative effects of motivation, cognitive ability, and personality. We find large and significant effects of personality on the propensity to exhibit satisficing response styles. Mostly, these effects accord with theoretical expectations; for example, satisficing response styles are less prevalent amongst respondents scoring high on the Conscientiousness dimension.
For the majority of response-style indicators considered, the personality variables have greater explanatory power than the indicators of motivation and cognitive ability.


Some Evidence on the Influence of Context and Cultural Factors to Extend the Question-and-Answer Model for Web Surveys

Professor José Luis Padilla García (Universidad de Granada) - Presenting Author
Miss Dörte Naber (Universidad de Granada, Universität Osnabrück)
Dr Isabel Benítez (Universidad Loyola Andalucía (Sevilla))

Since the 1980s, there has been a huge amount of research on the cognitive processes underlying question answering in surveys. To date, the so-called “Question-and-Answer” model, originally introduced by Tourangeau (1984, 2018), has been the most prominent in the literature. However, extensions of the original model as well as new models have been proposed, such as the extended model of Schwarz and Oyserman (2001) and the ImpExp model of Shulruf, Hattie and Dixon (2008). In our presentation, we will first briefly review existing models of the question-and-answer process and then analyze and integrate current theories and findings of survey research with the aim of proposing a more comprehensive model. In particular, we will use results of an ongoing research project to extend knowledge about the question-and-answer process in the context of web surveys. We draw on Web Probing (WP) qualitative evidence together with responses collected from 1,000 participants (500 in Germany and 500 in Spain) to single and multi-item measures from the 8th round of the European Social Survey, within a mixed-methods design. In this session, our main focus will be on the social influence effect (the “ecology” of the web survey), which is often missed by the dominant versions of the question-and-answer cognitive models. We will therefore present and discuss indicators of social influence, as well as selected context and cross-cultural factors that could play a crucial role in the question-and-answer process in web surveys. In doing so, we aim to contribute to a more profound understanding of the answering process, especially in web surveys, and to understand and minimize the effects of related response biases and cross-cultural factors that can undermine data quality.