Conference Programme 2015
Thursday 16th July, 14:00 - 15:30 Room: O-202
Longitudinal surveys - challenges in running panel studies 3
|Convenor||Dr Jutta Von Maurice (Leibniz Institute for Educational Trajectories)|
|Coordinator 1||Ms Joanne Corey (Australian Bureau of Statistics)|
Session Details
Longitudinal surveys - challenges in running panel studies.
This session will focus on the organisation of panel studies, including panel maintenance, panel engagement, sample review processes, choice of data items and methodologies, and interviewer training.
The focus is on the particular challenges faced by those running panel studies such as:
. maintaining up-to-date contact information and tracking of respondents, including privacy concerns;
. engaging respondents over the life of the survey, particularly across different age groups, for example how to keep young people interested as they move from childhood to young adulthood and become the primary consenters;
. how successful different modes are for making contact, e.g. mail, phone, text;
. whether targeted approach strategies work, e.g. different approach letters depending on past wave response;
. decision-making guidelines for when a respondent should be removed from the sample;
. the trade-off between longitudinal consistency and adopting a better/updated measure;
. how to conduct training for a mix of experienced and new interviewers, balanced against the amount of new content and methodology; and
. testing techniques for longitudinal surveys.
Paper Details
1. Interviewer and respondent behaviours when measuring change with dependent interviewing
Dr Annette Jäckle (University of Essex)
Dr Tarek Al Baghal (University of Essex)
This study provides new insights into the effect that the wording of proactive dependent interviewing (PDI) questions has on the changes measured in panel surveys. We report on an experiment in the UK Understanding Society Innovation Panel in which the wording of PDI questions was experimentally varied. For example, respondents were reminded of their previous health status and asked “Is this still the same?” versus “Has this changed?”. We report on behaviour-coded recordings of the interviews, examining the role of interviewer and respondent behaviours in measuring change. Initial results suggest that the “Has this changed?” version is problematic and best avoided.
2. Who, why, and how do panelists get back to survey panel managers? Evidence from the GESIS Panel
Dr Tanja Dannwolf (GESIS - Leibniz Institute for the Social Sciences)
Ms Gabriele Wahlig (GESIS - Leibniz Institute for the Social Sciences)
Mr Kai Böge (GESIS - Leibniz Institute for the Social Sciences)
Professor Michael Bosnjak (GESIS - Leibniz Institute for the Social Sciences)
Three research questions are addressed: Which panelists make use of the opportunity to get in touch with panel survey management? What concerns or requests do they have? Which communication channels do they use? Evidence is presented from the first year of the GESIS Panel (www.gesis-panel.org). We describe the content, the contact channels used, and the distribution over time of panelists’ requests across both modes (online, offline). Multivariate analyses encompassing panelist- and survey-level characteristics are employed to explain various contact incidence metrics. The findings set the stage for developing evidence-based measures and procedures to prevent panel attrition.
3. What strategies should be followed with interviewers and field processing in order to avoid panel attrition?
Mrs Birgit Jesske (infas Institute for Applied Social Sciences)
Mr Martin Kleudgen (infas Institute for Applied Social Sciences)
In panel studies in particular, the arguments for interviewer continuity are manifold: continuity is expected to reduce panel attrition in subsequent waves. In addition, rules for interviewers’ contact behaviour are set, which help avoid selectivity effects.
But is it really interviewer continuity that ensures a higher probability of realising panel cases? What role do contact strategies or contact history play?
Using data from the PASS panel study, we seek answers to these questions, and hints for further field strategies, through descriptive analyses and regression approaches.
4. The influence of interviewer change and interviewer’s characteristics on item nonresponse
Ms Kristin Hajek (Ludwig-Maximilians-University Munich)
Ms Nina Schumann (Ludwig-Maximilians-University Munich)
For panel surveys it is usually recommended to allocate the same interviewer to the same respondent over time. Surprisingly, there is little empirical evidence on how interviewer (dis-)continuity affects item nonresponse. Using the first five waves of the German Family Panel (pairfam), we show that interviewer change, the interviewer’s experience, and the age difference between interviewer and respondent have no negative effect on the level of item nonresponse. However, a change from a male to a female interviewer results in a higher likelihood of item nonresponse.
5. Straightlining in Web survey panels over time
Dr Vera Toepoel (Utrecht University)
Professor Matthias Schonlau (University of Waterloo)
Straightlining, an indicator of satisficing, refers to giving the same answer in a series of questions arranged on a grid. We investigated whether straightlining changes with respondents’ panel experience in two Internet panels in the Netherlands. Specifically, we considered straightlining on 10 grid questions in LISS core modules (7 waves) and on a grid of evaluation questions presented after every major survey. For both core modules and evaluation questions we found that straightlining increases with respondents’ panel experience for at least three years. We discuss options on what to do with straightliners in panel surveys.