Longitudinal surveys - challenges in running panel studies 3
|Convenor||Dr Jutta Von Maurice (Leibniz Institute for Educational Trajectories)|
|Coordinator 1||Ms Joanne Corey (Australian Bureau of Statistics)|
This study provides new insights into the effect that the wording of proactive dependent interviewing (PDI) questions has on the changes measured in panel surveys. We report on an experiment in the UK Understanding Society Innovation Panel in which the wording of PDI questions was experimentally varied. For example, respondents were reminded of their previous health status and asked “Is this still the same?” versus “Has this changed?”. We report on behaviour-coded recordings of the interviews to examine the role of interviewer and respondent behaviours in measuring change. Initial results suggest that the “Has this changed?” version is problematic and best avoided.
Three research questions are addressed: Which panelists make use of the opportunity to get in touch with panel survey management? What concerns or requests do they have? Which communication channels are used? Evidence is presented from the first year of the GESIS Panel (www.gesis-panel.org). The contents, the contact channels used, and the distribution over time of the panelists’ requests across both modes (online, offline) are described. Multivariate analyses encompassing panelist- and survey-level characteristics are employed to explain various contact incidence metrics. The findings set the stage for developing evidence-based measures and procedures to prevent panel attrition.
In panel studies in particular, the arguments for interviewer continuity are manifold. Interviewer continuity can help avoid panel attrition in subsequent waves. In addition, rules for the interviewers’ contact behaviour are set, which allow selectivity effects to be avoided.
But is it really interviewer continuity that ensures a higher probability of realizing panel cases? What role do contact strategies or contact history play?
Using data from the PASS panel study, we seek answers to these questions, drawing on descriptive analyses and regression approaches to derive hints for further field strategies.
For panel surveys it is usually recommended to allocate the same interviewer to the same respondent over time. Surprisingly, there is little empirical evidence on how interviewer (dis)continuity affects item nonresponse. Using the first five waves of the German Family Panel (pairfam), we show that interviewer change, the interviewer’s experience, and the age difference between interviewer and respondent have no negative effect on the level of item nonresponse. However, a change from a male to a female interviewer results in a higher likelihood of item nonresponse.
Straightlining, an indicator of satisficing, refers to giving the same answer to a series of questions arranged in a grid. We investigated whether straightlining changes with respondents’ panel experience in two Internet panels in the Netherlands. Specifically, we considered straightlining on 10 grid questions in LISS core modules (7 waves) and on a grid of evaluation questions presented after every major survey. For both the core modules and the evaluation questions, we found that straightlining increases with respondents’ panel experience for at least three years. We discuss options for dealing with straightliners in panel surveys.