



Tuesday 14th July, 16:00 - 17:30 Room: O-202


Interviewers’ Deviations in Surveys 3

Convenor Dr Natalja Menold (GESIS)
Coordinator Professor Peter Winker (University of Giessen)

Session Details

Interviewers’ behavior can have an impact on survey data. The session deals with deviations from instructions when contacting and interviewing respondents. These deviations may be caused by factors such as task difficulty and interviewers’ ability, experience and motivation, but also by the quality of questionnaires and instructions. Deviations may result in bias, but they can also foster data quality, e.g. when interviewers try to help respondents provide a (correct) answer. The session aims to discuss both deliberate deviations (e.g. falsification, providing explanations) and non-deliberate ones (e.g. interviewers’ mistakes).
Researchers are invited to submit papers dealing with all kinds of interviewer deviations in the survey process that may result in non-observation or measurement errors but may also positively influence survey outcomes. Of interest are theoretical approaches and empirical studies on the detection and prevention of such deviations, as well as on their explanatory factors and consequences. Interviewers’ motivation to deviate from prescribed standards or to produce high-quality survey data, as well as interviewers’ cognitive skills and competencies, may thus be of interest.

Paper Details

1. Interviewer Monitoring and Performance in the Survey of Income and Program Participation
Dr Jason Fields (US Census Bureau)
Dr Matthew Marlay (US Census Bureau)
Dr Holly Fee (US Census Bureau)

Using Wave 1 data from the 2014 panel of the Survey of Income and Program Participation (SIPP), this study examines indicators of Field Representative (FR) performance. We focus on the correlations across indicators and the predictive value of early indicators such as interviewer training certification test scores, and we employ multilevel modeling to understand how much of the variance in key measures (such as interview length, consent to record interviews, don’t-know and item refusal rates, incomplete data, and contact strategies) is attributable to respondents, FRs, and regional offices.


2. Is It a Must to Read Questions Word by Word in Survey Interviews? Findings from the Chinese Family Panel Studies
Dr Liying Ren (Peking University)
Dr Jie Yan (Peking University)

Researchers make great efforts to refine question wording in the design of survey questionnaires, but interviewers may not read questions word by word in field interviews. How can we detect such deviant interviewer behavior? And what are its consequences? This paper describes detection strategies and reports findings from practice, based on the experience and paradata of the Chinese Family Panel Studies (CFPS). Focusing on the three most prominent problems identified, it further investigates the factors that foster the deviant behavior and assesses the deviations’ influence on survey data quality.


3. Interviewer-Respondent Interactions in Conversational and Standardized Interviewing: Results from a National Face-to-Face Survey in Germany
Mrs Felicitas Mittereder (Michigan Program in Survey Methodology, University of Michigan, Ann Arbor)
Mrs Jennifer Durow (Michigan Program in Survey Methodology, University of Michigan, Ann Arbor)
Mr Brady West (Institute for Social Research, University of Michigan, Ann Arbor)

In recent years, researchers have given greater attention to the standardized and conversational interviewing techniques used in survey data collection. In this study, we explore respondents’ reactions to both interviewing techniques in a national face-to-face survey conducted in Germany. The recorded interviews were coded and tabulated into respective groups of variables. We find that respondents show more evidence of confusion when they recognize the interviewer’s ability to respond. We explore the situation in which a difficult question is met with ease in the standardized group, but with confusion in the conversational group. We address the concern that conversational interviewers


4. The interviewer in the respondent’s shoes: Explaining respondent behaviour by the behaviour of interviewers when answering survey questions
Ms Celine Wuyts (Centre for Sociological Research, KU Leuven)
Professor Geert Loosveldt (Centre for Sociological Research, KU Leuven)
Dr Anina Vercruyssen (Centre for Sociological Research, KU Leuven)

Although interviewers are expected to act in a neutral and standardized manner during survey interviews, to some extent they intentionally or unintentionally deviate from their instructions. We hypothesize that interviewers’ behaviour in responding to a questionnaire is related to respondents’ behaviour in the same questionnaire, through the ways in which interviewers deviate from standardization when interacting with respondents. On the basis of data from the sixth round of the European Social Survey in Belgium and interviewer data from the same questionnaire, we investigate the relation between how interviewers and respondents answer survey questions.