Tuesday 18th July, 09:00 - 10:30 Room: N AUD5


It’s the Interviewers! New developments in interviewer effects research 1

Chair: Dr Salima Douhou (City University of London, CCSS)
Coordinator 1: Professor Gabriele Durrant (University of Southampton)
Coordinator 2: Dr Olga Maslovskaya (University of Southampton)
Coordinator 3: Dr Kathrin Thomas (City University of London, CCSS)
Coordinator 4: Mr Joel Williams (TNS BMRB)

Session Details

To what extent do interviewers affect data collection and how can we better monitor and limit their impact?

Any deviation from the standardised protocol of the data collection process has the potential to bias the data. Interviewer effects, defined as distortions of survey responses in surveys conducted in the presence of an interviewer, may have a severe impact on data quality. These effects result from respondents' reactions to the social style and personality of interviewers, but also to the way interviewers present the questions.

Analyses based on data biased by interviewer intervention, and the conclusions drawn from them, are likely to be incorrect. Survey methodologists have therefore improved the way in which interviewers are trained and briefed in order to limit interviewers' influence. Yet it remains an open question why interviewer effects occur even in surveys that make exceptional efforts to train and monitor interviewers.

Interviewers make (initial) contact with prospective respondents and attempt to convince them to participate in the survey. The doorstep interaction between prospective respondents and interviewers is rarely documented, but a growing number of studies indicates that some interviewers are more successful than others at convincing prospective respondents to participate and thus at avoiding non-response.

Once the doorstep interaction has been successful, interviewers may further affect the way in which respondents answer the survey questions. Variation in survey responses may be due to the attitudes, interpersonal skills and personality of interviewers, but may also relate to how interviewers present particular questions and how strictly they follow the instructions. Any deviation from the standardised protocol provided by the survey project's core research team decreases the comparability of the survey responses.

This session welcomes papers on new developments in the area of interviewer effects. Topics may include but are not restricted to:
• methodological developments in measuring and modelling interviewer effects (see the sketch after this list),
• interviewer effects on measurement error,
• interviewer effects on nonresponse rates and nonresponse bias,
• interviewer influences on response latencies (timings),
• influence of personality traits, behaviour, attitudes, experience, and other characteristics of interviewers on survey estimates,
• implications for interviewer recruitment and training strategies,
• monitoring and evaluation of fieldwork efforts by interviewers,
• collection of GPS data or audio-visual material of door-step interactions.
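
As a concrete illustration of the first topic above, interviewer effects on a continuous survey outcome are commonly quantified with a multilevel (random-intercept) model in which respondents are nested within interviewers, summarised by the interviewer-level intraclass correlation (ICC). The following minimal sketch uses Python and statsmodels; the input file and all column names are hypothetical placeholders, not taken from any particular survey.

    # Minimal sketch: quantifying interviewer effects with a
    # random-intercept multilevel model. File and column names
    # ("survey.csv", "response", "resp_age", "resp_female",
    # "interviewer") are hypothetical placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("survey.csv")  # one row per respondent

    # Respondent-level covariates as fixed effects; a random
    # intercept for each interviewer.
    model = smf.mixedlm("response ~ resp_age + resp_female",
                        data=df, groups=df["interviewer"])
    result = model.fit()

    # Interviewer-level intraclass correlation:
    # ICC = var(interviewer) / (var(interviewer) + var(residual))
    var_interviewer = result.cov_re.iloc[0, 0]
    var_residual = result.scale
    print("Interviewer ICC:", var_interviewer / (var_interviewer + var_residual))

Note that in most field designs interviewers are not interpenetrated across sample areas, so the interviewer variance estimated this way is partly confounded with area variance; the sketch ignores this complication.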

Papers that discuss these issues from a comparative perspective are also welcome. We invite academic and non-academic researchers and survey practitioners to contribute to our session.

Paper Details

1. Interviewer Gender and Survey Responses: The Effects of Humanizing Cues Variations
Dr Wojciech Jablonski (Utrecht University & University of Lodz)
Dr Aneta Krzewinska (University of Lodz)
Dr Katarzyna Grzeszkiewicz-Radulska (University of Lodz)

According to social interface theory (Nass et al. 1996; Nass et al. 1997; Fogg & Nass 1996), humanizing information sent by a computer triggers reactions typical of human-to-human interaction. Building on these findings, several methodological studies have investigated whether the regularities described by psychologists can also be observed in internet surveys, among other things in the form of interviewer effects (e.g., Tourangeau et al. 2003; Couper et al. 2003; Fuchs 2009). The results of this research, however, appear to be inconsistent.
In this presentation, we report selected results from an experiment conducted in November and December 2016 among university students (N=900) as part of a research project funded by the Polish National Science Center. The project aims to estimate the influence of humanizing cues on the quality of data obtained in internet surveys; the presentation focuses on the impact of the interviewer's gender on the data. The experiment used a multifactorial design with equal, completely randomized groups. The form of imitation/presence of the interviewer was the main independent variable (factor A), with four scenarios: (1) CAWI/text (all stimuli presented as text); (2) CAWI/photo (stimuli presented as text plus an interviewer photo); (3) CAWI/movie (all stimuli presented as videos of real interviewers, with the answer options additionally presented as text); and (4) CAPI. The other independent variables were: interviewer gender (factor B, nested within factor A), with two values (male/female) and an extra version giving no information about interviewer gender ("we"); interviewer (factor C, a random factor nested within factors A and B), with five male and five female interviewers engaged; and interviewer gender (factor D, a constant factor) with two values (male/female).
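
The assignment to the four factor-A scenarios can be illustrated with a small randomisation sketch. This is a reconstruction under assumptions, not the authors' code: it covers only factor A, because the abstract does not fully specify how the "we" variant and factors B, C and D map onto the conditions.

    # Illustrative sketch: N = 900 participants assigned to the four
    # factor-A scenarios in equal, completely randomized groups.
    # Factors B, C and D are deliberately omitted here.
    import random

    random.seed(1)  # only for reproducibility of the illustration

    FORMS = ["CAWI/text", "CAWI/photo", "CAWI/movie", "CAPI"]  # factor A
    N = 900

    # Equal groups: repeat each form N/4 = 225 times, then shuffle.
    assignment = FORMS * (N // len(FORMS))
    random.shuffle(assignment)

    print({form: assignment.count(form) for form in FORMS})
    # {'CAWI/text': 225, 'CAWI/photo': 225, 'CAWI/movie': 225, 'CAPI': 225}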


2. Influencing households’ cooperation: Do an interviewer’s personality and attitude matter?
Miss Sabine Friedel (Munich Center for the Economics of Aging (MEA), Max-Planck-Institute for Social Law and Social Policy)

This paper examines interviewer effects on household cooperation in the Survey of Health, Ageing and Retirement in Europe (SHARE). SHARE, a face-to-face study, collects micro data on health, socio-economic status, and social and family networks. Up to wave six, it covers 20 countries with approximately 123,000 interviews. Besides its other advantages, e.g. being a harmonized, multidisciplinary, cross-national panel database, SHARE provides extensive information on interviewers and therefore enables research on interviewer effects. Interviewers play a crucial role during the entire data collection process in all interviewer-mediated surveys. Since Wave 5, the SHARE Interviewer Survey has been implemented as an online survey and collects cross-national data on SHARE interviewers. Linking the interviewer data to the SHARE survey data makes it possible to analyze how different interviewer characteristics influence survey outcomes.
Previous research has established the existence of interviewer effects and addressed the interviewer's influence on survey outcomes; that interviewers affect survey participation in interviewer-mediated studies is undisputed (Groves and Couper, 1998; Pickery and Loosveldt, 2002; Blom et al., 2011). However, only a few researchers have investigated these interviewer effects in more detail. Often, little is known about interviewers beyond their gender, age and experience. A few studies interview their interviewers and therefore have detailed information about their attitudes, behavior or socio-economic status (Hox and De Leeuw, 2002; Blohm et al., 2006; Durrant et al., 2010; Jäckle et al., 2013). Research on interviewer effects and survey participation shows ambiguous results regarding interviewer characteristics, which leaves room for additional research in this area.
This paper examines the relationship between interviewer personality, interviewer attitudes and household cooperation in SHARE. I focus on cooperation instead of the joint process of contact and cooperation because interviewer effects may differ between the two. Moreover, I concentrate on interviewer effects arising from interviewers' personality traits and attitudes in order to strengthen or qualify the existing evidence.
The data for the analysis come from SHARE's sixth wave, in which 12 of the 18 participating countries conducted the Interviewer Survey. Only data from countries with high participation rates in the Interviewer Survey, and thus a good linkage rate between the interviewer data and the 'regular' SHARE survey data, are used. This yields information on approximately 400 interviewers working in five countries (AT, DE, IT, SE and SI). The unit of analysis is the interviewer, so the dependent variable is the interviewer-specific household cooperation rate. Analyses are based on a linear regression model including country fixed effects.
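
The model described in this paragraph, an interviewer-level linear regression with country fixed effects, can be sketched as follows. The input file and the trait and attitude variable names are hypothetical placeholders chosen to mirror the characteristics mentioned in the abstract, not the actual SHARE variable names.

    # Sketch: interviewer-specific cooperation rate regressed on
    # interviewer traits and attitudes, with country fixed effects.
    # File and column names are hypothetical placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf

    iv = pd.read_csv("interviewers.csv")  # one row per interviewer

    # C(country) expands to one dummy per country (fixed effects).
    model = smf.ols(
        "coop_rate ~ openness + att_science + att_reluctant + C(country)",
        data=iv)
    print(model.fit().summary())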
Preliminary results show that personality traits, e.g. openness, and attitudes towards the importance of involvement in scientific research that serves society matter: they affect the interviewer-specific cooperation rate, whereas other characteristics, such as attitudes towards reluctant respondents, do not.
These findings could be used to develop new fieldwork strategies (e.g. matching interviewer attributes to different subgroups). However, additional research is needed first to establish whether different attributes are important for different subgroups.


3. Do interviewer characteristics affect the levels of political knowledge? Evidence from Austria and Germany.
Dr Sabrina Jasmin Mayer (University of Duisburg-Essen)
Dr David Johann (German Centre for Higher Education Research and Science Studies & University of Vienna)

Political knowledge is a key concept for explaining several political phenomena, such as individuals' voting behavior, as it facilitates attitude formation and mediates the interplay of (mass media) information and personal predispositions (e.g., Delli Carpini and Keeter, 1996; Zaller, 1992). In addition, it is crucial for citizens' effective democratic participation (e.g., Delli Carpini and Keeter, 1996; Lau and Redlawsk, 2006).
In most election studies, political knowledge is measured by a battery of questions about citizens' factual political knowledge. These batteries often cover various subdomains, such as citizens' knowledge of institutions and political processes, public figures, or political parties. While the dimensionality of political knowledge (i.e., whether or not it is a multidimensional concept), suitable question formats (e.g., closed- or open-ended), and the inclusion of "don't know" answers are often discussed by scholars (e.g., Price, 1999), interviewer effects on political knowledge are hardly addressed. This is surprising because research on interviewer effects in other contexts indicates that interviewers can substantially affect respondents' response behavior (e.g., Beullens and Loosveldt, 2016). For example, interviewer characteristics such as gender or race have been shown to influence how respondents answer questions on gender-related issues (Flores-Macias and Lawson, 2008). Such effects have been found across various modes of data collection but are usually stronger in personal interviews (CAPI). A first study of interviewer effects on responses to knowledge questions (Davis and Silver, 2003) identified a race-of-interviewer effect: African American respondents tend to give the correct answer more frequently when interviewed by an interviewer of the same race, even when controlling for respondents' level of education and gender.
Our paper aims to close this research gap concerning interviewer effects on responses to knowledge questions. We analyze the effects of interviewer characteristics, especially age, education, and gender, on respondents' specific political knowledge. We differentiate between three subdomains of political knowledge: knowledge of (1) political actors, (2) the political system, and (3) party positions. We employ two kinds of knowledge measures: (a) the number of correct answers (correct answers are scored 1 and all other answers 0) and (b) the expression of valid answers (substantive responses, whether correct or not, are coded 1 and "don't know" responses 0). This procedure allows us not only to investigate whether interviewer characteristics affect the number of correct answers, but also to determine whether they affect the tendency to provide a substantive answer to knowledge questions. We rely on CAPI data collected by the Austrian National Election Study (AUTNES) and the German Longitudinal Election Study (GLES). We show that interviewers' level of education and gender affect respondents' specific political knowledge in both countries.
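
The two measures in (a) and (b) amount to two simple scoring rules, sketched below. The answer encoding (strings, with "dk" marking a "don't know" response) is a hypothetical placeholder, not the AUTNES or GLES coding scheme.

    # Sketch of the two knowledge measures described above.
    def correct_answers(answers, key):
        """(a) Number of correct answers: correct -> 1, else 0."""
        return sum(1 for given, true in zip(answers, key) if given == true)

    def valid_answers(answers):
        """(b) Number of substantive answers: any answer -> 1,
        "don't know" ("dk") -> 0."""
        return sum(1 for given in answers if given != "dk")

    # Hypothetical three-item battery:
    key = ["a", "c", "b"]
    respondent = ["a", "dk", "c"]            # correct, don't know, wrong
    print(correct_answers(respondent, key))  # 1
    print(valid_answers(respondent))         # 2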


4. Interviewer effects on onliner and offliner participation in the German Internet Panel
Ms Jessica Herzing (University of Mannheim, Germany)
Professor Annelies Blom (University of Mannheim, Germany)
Professor Bart Meulemann (University of Leuven, Belgium)

Research has shown that interviewers play a crucial role in obtaining cooperation from sample units. While previous studies have investigated the influence of interviewers on unit nonresponse, they typically focus on face-to-face or telephone interviews. Recently, however, we have seen a rise in probability-based online panels, where interviewers recruit panelists for the online panel during face-to-face or telephone interviews. Furthermore, we know from previous research that recruiting previously offline sample units into probability-based online panels is difficult and that high nonresponse rates among such offliners threaten the representativeness of online panels. Our paper therefore considers the role that interviewers play in recruiting offliners into a probability-based online panel.
We use data from the recruitment interview of the German Internet Panel (GIP). The GIP is a probability-based, face-to-face recruited online panel, which includes persons without computers and/or internet access by equipping them with the necessary devices. In addition, we use data from a survey conducted among the interviewers involved in the face-to-face recruitment of the GIP.
We investigate whether there is an interviewer effect on people's likelihood of participating in the GIP. We analyze which interviewer characteristics predict participation in the GIP and investigate whether the interviewer effects and the explanatory interviewer characteristics differ when interviewers try to recruit previously offline as compared to previously online persons.
We find significant interviewer effects on participation in the GIP. We further find that interviewers do not differentially affect the participation of onliners and offliners. However, the interviewer characteristics associated with the successful recruitment of onliners differ from those associated with recruiting offliners. For example, older interviewers are better at recruiting offliners than younger interviewers, but interviewer age has no effect on the recruitment of onliners. In addition, interviewers who expect to achieve higher recruitment rates are better at recruiting onliners than interviewers expecting low recruitment rates, but interviewers' expectations have no effect on offliner recruitment.
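
The comparison reported here, whether an interviewer characteristic such as age predicts recruitment differently for offliners than for onliners, corresponds to an interaction term in a participation model. The sketch below uses a linear-probability approximation with a random interviewer intercept; the input file and all column names are hypothetical placeholders, not the actual GIP data structure.

    # Sketch: GIP participation (0/1) as a function of interviewer
    # age, offliner status, and their interaction, with a random
    # intercept per interviewer. Names are hypothetical placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("gip_recruitment.csv")  # one row per sample unit

    # iv_age * offliner expands to iv_age + offliner + iv_age:offliner;
    # the interaction tests whether interviewer age matters differently
    # for offliner than for onliner recruitment.
    model = smf.mixedlm("participated ~ iv_age * offliner",
                        data=df, groups=df["interviewer"])
    print(model.fit().summary())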