
ESRA 2023 Preliminary Glance Program

All time references are in CEST

Interviewers and measurement quality 1

Session Organiser Dr Michael Link (IPSOS, Atlanta, GA USA)
Time Thursday 20 July, 14:00 - 15:30
Room U6-01f

Many factors – personal, design, environmental – shape interviewer behavior and can impact measurement quality. These studies examine this complexity through various lenses. The first considers the shortage of qualified field staff for in-person survey data collection in the US, a result of technological advances and pandemic-driven changes in work-life balance. Comparisons are made between US and European experiences, and recommendations are provided for adapting recruitment practices and alleviating workforce demands. The second study considers the effect of mixed-mode designs on interviewer variance in data collection. The results suggest that interviewer variance may be larger in face-to-face mode for sensitive items, but larger in telephone mode for non-sensitive items. Third, the European Social Survey (ESS) seeks to minimize undesirable interviewer behavior (UIB) and monitors interviewer behavior to ensure data quality; a holistic approach is used to prevent, detect, and assess interviewer-related issues and minimize UIB. The fourth investigates the effect of interviewers' age on respondents' reported values in the ESS across 30 countries. Results show that older interviewers primed conservative values and reduced openness to change. Fifth, gamification has been shown to improve field interviewer experience, motivation, and loyalty in survey research; the paper discusses the effectiveness of multiple production-driven gamification programs in a nationwide survey and their impact on field interviewer behavior and the resulting respondent engagement. The final study investigates response styles (straight-lining, middle- and extreme-responding, and acquiescence) in face-to-face surveys using data from the Portrait Values Questionnaire in the ESS in 18 countries. Results showed significant effects of respondent-interviewer gender and age matching, while interview length was also related to the presence of response styles. The study recommends using interviewer-collected paradata and calls for further studies on response styles in face-to-face surveys.

Keywords: interviewer behavior, measurement quality

Effective ways to detect and minimize undesirable interviewer behaviour in a decentralized cross-national comparative survey: findings from the European Social Survey R9 and R10

Dr Paulette Flore (SCP/ESS) - Presenting Author
Mr Roberto Briceno-Rosas (Gesis/ESS)
Dr Joost Kappelhof (SCP/ESS)

Interviewers can affect both the measurement and the representation dimensions of the Total Survey Error framework (TSE; Groves et al., 2009). Undesirable interviewer behavior (UIB) can affect not only the accuracy of estimates but also their comparability in multinational, multiregional, or multicultural (3MC) surveys. Detecting and reducing UIB is therefore even more urgent for interviewer-assisted surveys in a 3MC context, especially for large-scale face-to-face surveys employing many interviewers at the same time.

The European Social Survey (ESS) aims to measure attitudes, beliefs, and behavior patterns in a changing world and has been conducted as a biennial cross-national face-to-face survey since 2001. To discourage UIB and keep its adverse effects on data quality to a minimum, the ESS developed a framework that takes a holistic approach to interviewer behaviour and interviewers' involvement across the survey life cycle. This approach allows the ESS to prevent, detect, and assess interviewer-related issues affecting ESS data quality, both in real time and after data collection.

We will discuss the ESS approach to minimizing UIB and monitoring interviewer behaviour in a timely and comparable way. We will present results of this approach using the post hoc assessment of the R9 data release as well as results from the interim data set analysis of R10.

What can interviewer-collected paradata tell about measurement quality in face-to-face surveys? Analyzing response styles in the 21-item version of Schwartz’s Portrait Values Questionnaire based on the European Social Survey, 2008–2018

Dr Marek Muszyński (Institute of Philosophy and Sociology, Polish Academy of Sciences) - Presenting Author
Dr Piotr Jabkowski (Faculty of Sociology; Adam Mickiewicz University, Poznan)

Response styles, defined as the tendency to choose a response option on a basis other than the content of a question, are a response bias that can seriously threaten the quality of surveys. Although response styles are present in both face-to-face and self-completion surveys, they have been studied far more in the latter mode of data collection. Our study aimed to fill this gap by investigating four response styles (straight-lining, middle- and extreme-responding, and acquiescence) in face-to-face surveys. Specifically, we focus on the presence of response styles in the 21-item version of Schwartz’s Portrait Values Questionnaire, drawing on data from six waves of the European Social Survey (2008–2018) covering the 18 countries that participated in all consecutive rounds of the project. We combined three complementary types of ESS datasets: (1) standard cumulative survey results; (2) data from “interviewers’ questionnaires,” which provide interviewers’ observations on the context of the interview; and (3) data from “contact forms,” which record the timing, mode, and outcome of each contact attempt. We identified the main covariates of response styles, concentrating on interviewer-collected survey paradata describing respondent characteristics, interview contexts, survey length, and the interviewer-respondent sociodemographic match.
The results of multi-level regressions (with respondents nested within interviewers, countries, and ESS rounds) pointed to a non-negligible role of respondent-interviewer gender and age matching in the presence of response styles. Much weaker effects were found for respondents’ levels of cooperation before and during the interview. At the same time, interview length was significantly related to the presence of response styles, with faster interviews associated with lower data quality. We conclude with a recommendation to use interviewer-collected paradata and a call for further studies on response styles in face-to-face surveys.

Adapting Field Staff Recruiting Efforts in a Post-Pandemic World

Mr Rick Dulaney (Westat) - Presenting Author
Dr Jennifer Kelley (Westat)
Dr Jill Carle (Westat)
Ms Tammy Cook (Westat)
Mr Brad Edwards (Westat)

Quality interviewers are critical to successful in-person survey data collection. However, labor market shifts, exacerbated by the pandemic, have made field staff recruitment and retention increasingly challenging. In the U.S. context, technological advances and a pandemic-invigorated emphasis on work-life balance have catalyzed a seismic shift in where and how work is done, making it more difficult to find applicants with the right balance of interpersonal and technical skills. These broader labor market changes impact recruitment and retention, further straining data collection and project budgets, and potentially decreasing data quality. Some suggest the field survey labor force shortage is an existential threat to the in-person data collection mode, or even to all interviewer-mediated surveys.
This presentation will compare the recent U.S. field labor experience with conditions in Europe, drawing on interviews with leading European survey organizations to understand the scope of current challenges in recruiting and retaining qualified data collectors. Our assessment of U.S. conditions is informed by reports from a series of panel sessions with representation from most of the largest survey data collection organizations in the U.S., and by a deep dive into Westat's experience on eight major projects over the past decade. We will highlight recommendations for adapting recruitment practices in a post-pandemic labor market across survey contexts. We will also review ways to alleviate CAPI workforce demands (e.g., multimode alternatives and updated value propositions for respondents).