ESRA 2017 Programme



Tuesday 18th July, 11:00 - 12:30 Room: F2 104

Get in touch - Stay in touch: Methodological and procedural efforts and advancements regarding field access and panel maintenance 2

Chair: Dr Roman Auriga (Leibniz Institute for Educational Trajectories)
Coordinator 1: André Müller-Kuller (Leibniz Institute for Educational Trajectories)
Coordinator 2: Dr Götz Lechner (Leibniz Institute for Educational Trajectories)

Session Details

This session focuses on central methodological and procedural efforts and advancements in dealing with panel access and maintenance. Given the characteristics of the target population, the study design, and environmental/systemic conditions, elaborate and customized communication and operationalization strategies matter, as access to panel samples is administered in manifold ways. In addition, operationalizing complex designs requires specific panel strategies to cope with the inherent complexity of such studies.

Besides the special requirements of the target population and the study design, major challenges for the administration of panel studies arise from the sovereignty and responsibility of gatekeepers and hence the need for authorization (e.g., in institutional surveys). During negotiation and administration processes within a multidimensional system with multiple players (stakeholders, normative-institutional social actors, targets, etc.), various, sometimes competing and changing, interests need to be reconciled to gain and maintain access. Considering the renewal of EU data protection law through Regulation (EU) 2016/679, fieldwork management has to be revisited.

Response errors (i.e., unit and item nonresponse), panel attrition and withdrawals are painful problems culminating in bias or panel error. Understanding the causes and consequences of nonresponse, refusal and withdrawal behavior is the fundamental basis for staying in touch; experiences and strategies from multi-informant, multi-cohort and mixed-mode panels (e.g. NEPS, BCS, SOEP, SHARE) should help to deepen this understanding.

As administering panel studies is an important part of conducting panels successfully, we are going to concentrate on the art of panel-maintaining strategies in the following fields:
a) An increased understanding of nonresponse, refusal and withdrawal behavior. What are the most appropriate ways of handling the first and each successive contact and communication with target persons with regard to panel stability?
b) Multi-level negotiation strategies and tools for dealing with systemic obstacles and the renewal of data protection regulations.
c) The reduction of response error and panel attrition, considering particular framings of sense and meaning in target populations. Which communication strategies should address target persons’ lifeworld (Lebenswelt)? How can trustful relations with target persons be fostered through customized communication offerings?
d) Advanced tracking and maintenance strategies for common and less common panel populations.

Paper Details

1. ELIPSS: how to improve and create panel management activities
Mr Alexandre Chevallier (SciencesPo)
Mrs Elodie Petorin (SciencesPo)

ELIPSS (Étude Longitudinale par Internet Pour les Sciences Sociales) is a probability-based Internet panel dedicated to the social sciences, inspired by the Dutch LISS Panel.
The pilot study has been running since 2012 with 1,000 panel members; in 2016, 2,500 new panel members were recruited, for a total of 3,500 panelists.


The target population comprises all individuals living in private households in metropolitan France, aged 18-79 at entry into the panel, with sufficient command of the French language to answer self-administered questionnaires. Each panelist is equipped free of charge with a tablet connected to the Internet.
ELIPSS surveys are fielded each month and should not exceed 30 minutes. They deal with various topics such as health, environment, politics, etc., and are designed by researchers after selection by a dedicated scientific committee.
One of our key goals is to maintain regular participation in the surveys. Over the years, the ELIPSS team has set up an array of panel management tools that have made it possible to handle more than three thousand panelists and their tablets with only three panel managers.


Panel maintenance relies on an intranet inspired by the LISS Panel, the Panel Management System (PMS), tailor-made by the IT team in close collaboration with our panel management team. How does this in-house tool allow personalized follow-up of more than three thousand panelists?


Thanks to the pilot study, we were able to formalize our specific management processes and communication strategies and to develop specific tools. One of these tools is the ‘identity card’, which summarizes all the information necessary to follow up a panelist. Four types of information are gathered to allow a quick profile analysis:
- contact data (name, first name and contact details)
- response profile (response rate and follow-up letters sent)
- incident data (closed and open tickets)
- tablet and phone line data (inventory reference and status).
Detailed information is available for each data category and allows precise management when needed.
A daily script checks panelists’ survey responses and assigns them to different “groups”. A specific communication process is applied to each group: for example, one group only receives phone calls to motivate them to respond, whereas another receives letters.
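The daily grouping step described above can be sketched as follows. This is only an illustration: the field names, the response-rate threshold and the group labels are assumptions, not the actual ELIPSS PMS implementation.

```python
# Hypothetical sketch of a daily grouping script. Field names, the 0.5
# response-rate threshold and the group labels are illustrative assumptions.
def assign_group(panelist):
    """Assign a panelist to a follow-up group based on survey status."""
    if panelist["answered_current_survey"]:
        return "no_action"
    if panelist["response_rate"] < 0.5:
        return "phone_call"       # low responders get a motivating call
    return "reminder_letter"      # otherwise a written reminder

panel = [
    {"id": 1, "answered_current_survey": True,  "response_rate": 0.9},
    {"id": 2, "answered_current_survey": False, "response_rate": 0.3},
    {"id": 3, "answered_current_survey": False, "response_rate": 0.8},
]

# Collect panelist ids per group, ready for the group-specific follow-up.
groups = {}
for p in panel:
    groups.setdefault(assign_group(p), []).append(p["id"])
```

In practice, a script of this kind would read response status from the PMS database rather than from an in-memory list.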


These panel maintenance efforts have paid off: thanks to them, response rates are around 80% each month and the attrition rate is moderate (less than 25% after 40 months).
We are constantly developing new features to make our survey management more efficient and easier to use. For instance, in the near future we will deploy Android notifications for the tablets as well as an automatic survey reminder system, in order to follow panelists more closely.


This presentation will give an overview of the different monitoring strategies and tools we have developed to support panel management activities. We will also discuss new features that could be set up to improve our processes.


2. Which factors affect participation in an additional online survey after participating in a one-hour-interview?
Miss Anne Kersting (infas Institut für angewandte Sozialwissenschaft GmbH (Institute for Applied Social Sciences), Bonn, Germany)
Mr Michael Ruland (infas Institut für angewandte Sozialwissenschaft GmbH (Institute for Applied Social Sciences), Bonn, Germany)
Dr Reiner Gilberg (infas Institut für angewandte Sozialwissenschaft GmbH (Institute for Applied Social Sciences), Bonn, Germany)

It is essential for panel studies to keep panel attrition and nonresponse bias to a minimum. Studies show that several factors such as study design, survey mode, individual characteristics, or interviewers have an impact on panel attrition and sometimes on nonresponse bias. They also show that surveys with specific topics bear the risk of selective participation. In addition, different studies indicate that the respondents’ commitment to a study is one of the key strategies for ensuring stable participation and thus avoiding panel attrition. Incentives are considered one key element among others to improve respondents’ commitment to a study.
This paper focuses on how individual characteristics, experiences in the panel study as a whole and in the personal interview just carried out, or learning effects regarding incentives affect participation rates in an additional online survey.
The analysis is based on the school-leaver sample of the German National Educational Panel Study (NEPS), an educational panel study based on six age-specific samples (cohorts). The school-leaver sample is a representative sample of students recruited in 2010 in grade 9 (aged 15-17) in different types of schools. Students started with paper-based surveys in the classroom; after leaving school, they were followed up and interviewed either individually at home or via telephone. Since 2013, certain topics of the questionnaire have been outsourced to additional online surveys in order to reduce interview length and respondent burden.
In 2013 and 2014, only certain groups (e.g. students) identified in the interviews were requested to participate in the additional online survey. In 2015, all respondents (school leavers aged 20-23) were informed that the survey consists of two parts: a personal interview (CATI or CAPI) followed by an additional online survey. To encourage online participation, the persons in question were told that they would receive the monetary incentive only after having completed both parts of the survey. Hesitant interviewees received up to three reminders to complete the online questionnaire. However, all respondents, including those who participated in the interview but not in the online survey, eventually received the full monetary incentive in order to maintain commitment to the study.


3. “Don't Ever Call Me Again” – Methodological Efforts of Longitudinal Studies
Mr André Müller-Kuller (Leibniz Institute for Educational Trajectories)
Dr Nicole Luplow (Leibniz Institute for Educational Trajectories)
Mr Florian Bains (Leibniz Institute for Educational Trajectories)

When it comes to causal inference, longitudinal data are considered the state of the art and praised as the salutary measure to uncover causal relationships. But with the emergence of complex longitudinal designs, a variety of problems arise which require attention and handling. While almost all surveys are affected by non-response (refusals, non-contact, unknown eligibility), withdrawal of panel consent, i.e. of the permission to contact targets again, is a problem unique to longitudinal survey designs.

As longitudinal surveys depend on the repeated participation of sampled targets, the availability of individual contact information is an essential requirement for staying in contact with them. When participants and non-participants differ in specific characteristics, withdrawal of panel consent is, similar to non-response, problematic with regard to data quality (Engel and Schnabel 2014). Furthermore, withdrawals, and thus their impact on analyses, cumulate over time.

Common ways to handle survey errors are ex-post adjustments such as weighting or imputation, as well as sample refreshment: strategies which only mask the problem (Schnell, 2012). A crucial point for not only reducing such errors but also preventing them is better knowledge of the causes and risks of non-response and withdrawal. As the American Association for Public Opinion Research states, survey researchers need more comprehensive and reliable diagnostic tools to understand the components of total survey error (AAPOR 2016).

On the one hand, studies on non-response find that response rates differ across demographic variables such as age and gender (Smith 1983; van Loon, 2003; Porter, 2005; Billiet, 2007). Additional characteristics, such as urbanicity (de Leeuw and de Heer, 2002), health status (van Loon, 2003), social involvement (Porter, 2005) and number of household members (Groves and Couper, 1998), also influence a person’s probability of participating. In contrast to van Loon (2003), Porter (2005) furthermore finds differences in survey participation between levels of education.
Explanations of withdrawal, on the other hand, are sparse and insufficient. Previous studies on re-approaching targets show no significant influence of socio-demographic characteristics on withdrawal beyond the initial approach, because highly educated persons are overrepresented (Engel et al. 2012, pp. 279ff).

Using data and experiences from the National Educational Panel Study (NEPS), it is possible to examine withdrawals thoroughly. One big advantage of longitudinal data is the possibility to draw upon information from previous waves and additional process data, enabling comprehensive analyses to explain refusal and withdrawal behavior. The study’s special design allows profound analyses of six different cohorts within diverse settings and with varying modes. These analyses will provide information on behavioral patterns and help develop guidelines for handling withdrawals.


4. Non-response bias reduction through weighting with paradata
Mr Florian Bains (Leibniz-Institute for Educational Trajectories)
Mr André Müller-Kuller (Leibniz-Institute for Educational Trajectories)

Non-response and withdrawals in surveys have become some of the biggest threats to data quality. Response rates continue to decrease despite extensive measures introduced to stabilize them (De Leeuw & De Heer, 2002). When people systematically refuse to respond to surveys, the risk of biased analyses of target variables increases. If covariates correlate both with survey participation and with these target variables, the respondents no longer represent the target population. Therefore, a number of approaches have been developed to reduce the covariance between these factors. The most common approach to this problem is probably response propensity weighting (Groves, 2006).

The idea behind response propensities is that each sampled person has a different probability of taking part in the survey, depending on their characteristics. These characteristics are entered into a logistic regression with the survey outcome (participation vs. non-participation) as the dependent variable. The weights are then calculated by inverting the predicted response probabilities and can subsequently be used for various analyses (Groves, 2006).
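A minimal sketch of this procedure, using simulated data and a hand-rolled gradient-descent fit purely for illustration (in practice one would use an established statistics package):

```python
import math
import random

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit a simple logistic regression by gradient descent.
    Returns [intercept, coef_1, ..., coef_k]."""
    n, k = len(X), len(X[0])
    w = [0.0] * (k + 1)
    for _ in range(epochs):
        grad = [0.0] * (k + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

def response_propensities(w, X):
    """Predicted probability of participation for each case."""
    return [1.0 / (1.0 + math.exp(-(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))))
            for xi in X]

# Simulated sample: one standardized covariate; persons with higher values
# are more likely to respond (true intercept 0.5, true coefficient 1.2).
random.seed(1)
X = [[random.gauss(0, 1)] for _ in range(500)]
y = [1 if random.random() < 1.0 / (1.0 + math.exp(-(0.5 + 1.2 * x[0]))) else 0
     for x in X]

w = fit_logistic(X, y)
props = response_propensities(w, X)
# Non-response weight for each respondent: the inverse of the estimated
# response propensity, so under-represented groups are weighted up.
weights = [1.0 / p for p, yi in zip(props, y) if yi == 1]
```

The inverse-propensity weights are then applied in substantive analyses of the respondents, so that cases resembling likely non-respondents count more.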

This weighting adjustment, however, is not always feasible. To calculate response propensities, one needs information on participants as well as non-respondents. When using panel data, it is possible to draw upon previously collected variables, but in a cross-sectional survey there is rarely any information on refusers. Kreuter and Olson (2013) therefore proposed incorporating paradata, e.g. data collected by the interviewer, into the response propensity model as proxy variables for the refusers’ characteristics.

On the one hand, paradata may be collected automatically throughout the survey process, e.g. the number of calls necessary to reach the respondent and the specific outcome of each attempt. On the other hand, it may be data purposely recorded by the interviewer, e.g. the state of maintenance of the housing the respondent lives in. Such data often indicate a person’s willingness to participate but rarely correlate with target variables (Kreuter, 2013). Nonetheless, they may still reduce the covariance between response propensity and target variables and thus decrease the risk of non-response bias.
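Because paradata are recorded for respondents and non-respondents alike, they can serve directly as model covariates. A small sketch of turning such records into a feature matrix for a propensity model (the field names and coding are hypothetical):

```python
# Illustrative sketch: mapping interviewer paradata to numeric covariates
# for a response-propensity model. Field names and codings are assumptions.
def paradata_features(record):
    """Return covariates that exist for both respondents and refusers."""
    return [
        record["call_attempts"],                           # contact effort
        1.0 if record["reached_on_first_call"] else 0.0,   # easy to reach?
        {"good": 2, "fair": 1, "poor": 0}[record["housing_condition"]],
    ]

cases = [
    {"call_attempts": 1, "reached_on_first_call": True,
     "housing_condition": "good", "responded": 1},
    {"call_attempts": 5, "reached_on_first_call": False,
     "housing_condition": "poor", "responded": 0},
]

# Design matrix and outcome vector for the logistic propensity model.
X = [paradata_features(c) for c in cases]
y = [c["responded"] for c in cases]
```

The resulting `X` and `y` would then be fed into a logistic regression exactly as in standard response propensity weighting.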

In this study we examine how paradata can be effectively incorporated into response propensity models with regard to people refusing participation or revoking their consent to the storage of their data. The National Educational Panel Study provides extensive documentation of the survey process and further information on participants from six cohorts in varying settings and modes, making it an excellent data source for testing whether paradata, in addition to survey data, can help reduce non-response bias.