

Online Probing in Comparison to Face-to-Face Cognitive Interviewing

Session Organisers Miss Catherine Fenton (National Centre for Social Research)
Dr Ruxandra Comanaru (National Centre for Social Research)
Time Thursday 18th July, 16:00 - 17:30
Room D09

Proposal for a session at the 8th Conference of the European Survey Research Association (ESRA)
University of Zagreb, Croatia, 15th to 19th July 2019

Session organisers: Ruxandra Comanaru and Catherine Fenton (NatCen, UK)
Email: Ruxandra.Comanaru@natcen.ac.uk & Catherine.Fenton@natcen.ac.uk


Online probing in comparison to face-to-face cognitive interviewing

Online probing has become a popular method for evaluating survey questions due to the advantages of recruiting respondents quickly and more cost-effectively. Cognitive testing methods allow us to explore the processes by which respondents answer survey questions: whether respondents understand the concepts used, are able to provide an answer in a consistent way, and answer in the way the researcher intended (Collins, 2014). When adopting an online probing method, the cognitive interviewer is absent and the respondent answers the probing questions in a self-administered form.

Online probing has been regarded as an effective pretesting method due to its advantages (Edgar, 2012). Researchers highlight the benefits of using online probing methods, including being able to quantify the findings (Behr et al., 2012) and the fact that recruitment of respondents can be quicker, more cost-effective and more accessible. Additionally, as the online survey is completed without a cognitive interviewer, interviewer effects are eliminated, which may increase the reliability of the findings (Conrad and Blair, 2009). Nevertheless, there are limitations to online probing in comparison to face-to-face cognitive testing. During face-to-face cognitive testing, interviewers can probe for more information where more detail is required to explore the respondent's understanding of the question, and this method may therefore provide more in-depth information (Meitinger and Behr, 2016).

A growing research literature has explored online probing methods. Despite this, gaps remain in our understanding of the primary differences between online probing and more traditional cognitive testing methods, and of when to use which method. Furthermore, there is much debate about which kinds of items can be evaluated by web probing and about the sample size most appropriate for this method.

For this session, we invite papers comparing findings from web probing with cognitive interviewing methods, discussing the factors that might guide decisions about when to use web probing, and addressing any practical considerations in using web probing methods either on their own or in conjunction with other testing methods.

Keywords: Online probing, cognitive interviewing, testing survey questions

Better Together: How Web Probing and Cognitive Interviewing Complement Each Other in Question Evaluation Studies

Dr Paul Scanlon (National Center for Health Statistics) - Presenting Author


While early research on web probing (Murphy et al. 2014) suggested that it could supplant traditional evaluation methods in some cases, more recent commentary advises that the methods should supplement one another (Scanlon and Edgar 2017). The National Center for Health Statistics has incorporated web probing into its ongoing question and questionnaire evaluation efforts with the understanding that the two methods produce different types of data and have complementary strengths and weaknesses. Specifically, NCHS primarily administers closed-ended probes (developed from previous face-to-face interviews) to respondents on statistically sampled, recruited web panels in order to extrapolate the findings from cognitive interviews to the wider national population and to perform sub-group analyses.

This presentation will provide practical considerations for implementing question evaluation studies that combine both traditional face-to-face cognitive methods and web probing, using two case studies. First, in response to the ongoing opioid use epidemic in the United States, NCHS set out to design and evaluate a new set of questions on opioid use and misuse for public health surveillance. This project combined cognitive interviewing and card sorting with web probing and a split-ballot experiment embedded in a web survey to evaluate the best way to ask about opioid pain killer use in the past year. Second, an ongoing redesign of the United States' primary household health survey, the National Health Interview Survey, has led to proposed changes for the collection of injury statistics. In order to evaluate these proposed changes, NCHS used both traditional cognitive interviewing and web probing to examine not only the validity of the questions, but also the feasibility of including them on a national survey. This presentation will explain how NCHS designed these projects and incorporated the qualitative and quantitative findings into final analyses that provided pragmatic guidance to survey managers and policymakers.


Online Probing in Comparison to Face-to-Face Cognitive Interviewing

Miss Catherine Fenton (National Centre for Social Research) - Presenting Author

Traditional cognitive interviewing methods typically involve a face-to-face interview. In recent years, web probing methods have been developed as an alternative way to assess respondents' understanding of survey questions.

In this paper I will discuss the results from a project where both cognitive interviews and online probing were used to test survey questions measuring the prevalence of gifting. The project was commissioned by Her Majesty's Revenue and Customs (HMRC), a UK government department. The testing included an online pilot of 510 respondents and 16 cognitive interviews.

The web probing included a number of follow-up probes at the end of the questionnaire and aimed to identify how easy or difficult respondents found specific questions; any suggestions for improvements; and any general feedback on the questionnaire as a whole. The cognitive interviews aimed to explore comprehension of the key terms; whether participants were able to answer the questions with ease or difficulty; recall of information; and the sensitivity of the questions.

In this paper we compare findings from the two methods, assessing the extent to which they produced comparable results. We will consider the advantages and limitations of both pre-test methods and when it is appropriate to use each.


Evaluating the Midpoint of the Left-Right Scale: Open vs. Closed Category Probing Conducted in Two Online Access Panels

Mr Volker Hüfken (University of Düsseldorf) - Presenting Author

In this paper, we used a category follow-up probe administered to respondents who initially selected the midpoint of the 9-point left-right scale, to determine whether they chose this alternative to express a distinct opinion or to indicate that they do not have an opinion on the issue. We used two different variants of probing and investigate which one provides the better results. It is expected that the open-answer format will differ from the closed-answer format. With regard to the assumptions of consistency theory, the empirical findings show clear differences in the expected response patterns. Based on two cross-sectional surveys (n=1,279), we find that in the open-answer format the vast majority of responses turn out to be 'don't knows', and that reallocating these responses from the midpoint to the don't know category significantly alters descriptive inferences. In the alternative variant (closed-category probing), the results from the preliminary question were reproduced; as expected, the results show clear and significant response consistency. Our findings have important implications for the design and analysis of bipolar rating scales, especially the left-right political orientation scale.


Face-to-Face Cognitive Interviewing and Online Probing for Sensitive Survey Questions on Drug Use

Mrs Darja Lavtar (National Institute of Public Health) - Presenting Author
Dr Gaja Zager Kocjan (University of Ljubljana, Faculty of Arts)


Qualitative methods such as cognitive interviewing allow for the examination of the validity of survey questions, taking into account respondents' language and cultural context. At the Slovenian National Institute of Public Health (authorised producer of national health and health care statistics), cognitive testing was conducted to thoroughly examine Slovene versions of selected questions from the Survey on Tobacco, Alcohol and Other Drugs, followed by quantitative testing of the original and revised questions in a pilot study. Cognitive testing was implemented in two steps, starting with face-to-face cognitive interviews (CIs), which were followed by online probing. Online probing was conducted using two versions of a web questionnaire: while the first version included the same survey questions that were tested using CIs, the second version focused only on the questions about the use of drugs and the misuse of medicines, and users of an NGO working in the field of reducing the harmful effects of drugs (DrogArt) were invited to participate. The two modes of cognitive testing were in line with the design of the national survey (face-to-face and online) and were suitable given the sensitive survey topic, with high expected social desirability in answering survey questions. Based on the findings of the qualitative testing, the wording of some questions and answer categories was modified. Qualitative testing was followed by a pilot study using computer-assisted web interviewing (CAWI) with a simple random sample (SRS) of 600 inhabitants that was split into two halves, allowing us to test and compare the responses to the original and the revised versions of the questions. Findings from both the qualitative pre-testing and the pilot survey proved useful in identifying and improving problematic survey questions, specifically those concerning sensitive topics, prior to collecting data in the field.


Mixed Methodology in Questionnaire Testing for an Internet Panel Survey

Miss Yulia Epikhina (Federal Center of Theoretical and Applied Sociology of the Russian Academy of Sciences) - Presenting Author

Internet panel surveys face a double challenge of accuracy. Firstly, researchers must test the survey questions. Secondly, they have to test and adjust the electronic form of the questionnaire. A mixed methodology is instrumental in tackling these two challenges. Cognitive interviews allow researchers to discover possible biases of perception that are inherent in the questions posed to the respondent. Online probing helps detect errors that emerge in the electronic form of the same questionnaire. The paper summarizes the outcomes of testing questionnaires for an Internet panel survey. The testing included three procedures: cognitive interviewing, online probing, and collecting and analyzing respondents' comments after online probing. Each procedure brought a particular type of error to the surface. Cognitive interviews allowed us to detect possible misunderstandings of questions by the respondents and to correct the formulations to avoid erroneous interpretations. Online probing made it possible to discover errors specific to the electronic form and navigation. Analysis of comments made by the respondents after completing the questionnaire helped to evaluate the adequacy of the instrument as a whole and to assess the time limits within which the responses could be described as accurate.