ESRA 2019 Draft Programme at a Glance


Recruiting and maintaining probability-based online panel surveys 1

Session Organisers: Mr Ulrich Krieger (SFB 884, University of Mannheim)
Ms Anne Cornilleau (Centre de Données Socio-Politiques, Sciences Po)
Ms Ines Schaurer (GESIS - Leibniz Institute for the Social Sciences)
Time: Thursday 18th July, 09:00 - 10:30
Room: D21

In recent years, the number of online studies based on probability samples has grown steadily. In addition, traditional large-scale panel surveys have increasingly gone online (e.g. UKHLS, ESS).
Recruiting and maintaining such online panel surveys poses its own unique challenges.

Probability-based online panels aim to combine the best of the offline and online survey worlds. Because no sampling frame of the online population exists, panels must be recruited offline, and each offline recruitment method brings its own challenges. Furthermore, the online survey instruments need to be continually adjusted to the rapidly changing web and device infrastructure. And because most online panel surveys have short intervals between waves, contact procedures for respondents need to be adjusted accordingly.

In this session we invite papers presenting solutions and discussing challenges when recruiting, refreshing or maintaining probability-based online panel surveys. This includes topics such as:
Sampling and contacting respondents for online participation.
Pushing respondents to web participation in online panel studies.
Improving sign-up rates to online panel studies.
Strategies for refreshing the samples of running online panel studies.
Techniques and interventions to prevent panel attrition.
Methods to improve the usability of the online surveys, especially for mobile devices.
In addition, all other topics relevant to the session theme are welcome.

Keywords: probability-based online survey, panel maintenance, panel recruitment

What Are the Most Effective Strategies of Web-Push in a Probability-Based Panel?

Mr David Bretschi (GESIS – Leibniz-Institute for the Social Sciences) - Presenting Author
Dr Ines Schaurer (GESIS – Leibniz-Institute for the Social Sciences)
Dr Don Dillman (Washington State University)

In recent years, web-push strategies have been developed in several cross-sectional mixed-mode surveys in order to improve response rates and reduce the costs of data collection. However, pushing respondents into the more cost-effective web option has rarely been examined in the context of panel surveys. This study evaluates how different web-push strategies affect the willingness of mail-mode respondents in a mixed-mode panel to switch to the web.

We conducted a randomized web-push experiment in the October/November 2018 wave of the GESIS Panel, a probability-based mixed-mode panel in Germany (n = 5,736). We used an incompletely crossed experimental design with two factors: A) timing of presenting the web option and B) prepaid vs. promised incentives, randomly assigning 1,986 mail-mode panelists to one of three conditions:
1) the web option was offered concurrently with the paper questionnaire including a promised 10 € incentive for completing the survey on the web,
2) the web option was presented sequentially two weeks before sending the paper questionnaire and respondents were also promised an incentive of 10 €,
3) same sequential approach as group 2, but with a prepaid 10 € incentive instead of a promised incentive.
We examine how the conditions differ in the web response rate of mail-mode respondents and in the proportion of respondents who agreed to switch to the web mode for future waves.

Contrary to our expectations, the results show that prepaid incentives do not improve the web response rate compared to promised incentives. In contrast, we found that presenting the web option sequentially, as opposed to offering it concurrently, significantly increases the web response rate for the single wave. However, this difference between the experimental groups decreases for the proportion of respondents who agreed to switch to the web mode for future surveys.


Push-to-web recruitment of a probability-based online panel: Experimental evidence

Dr Carina Cornesse (SFB 884, University of Mannheim)
Professor Annelies Blom (Department of Political Science and SFB 884, University of Mannheim)
Dr Barbara Felderer (SFB 884, University of Mannheim)
Mrs Marina Fikel (SFB 884, University of Mannheim)
Dr Ulrich Krieger (SFB 884, University of Mannheim) - Presenting Author

Past research has shown that pushing respondents to the web is a successful way to increase response rates, reduce data collection costs, and produce representative outcomes. However, studies in that literature are usually limited to cross-sectional surveys of small and homogeneous target populations. Our study goes beyond this limited scope to a broad and, so far, unique application: we investigate the relative success of pushing respondents to the web compared to alternative survey design strategies across the recruitment stages of a probability-based online panel. In order to do this, we implemented a large-scale experiment in the 2018 recruitment of the German Internet Panel (GIP).

In this experiment, we sampled 12,000 individuals and randomly assigned each individual to an experimental group: online-only, online-first, offline-first, or concurrent-first. Individuals in the online-only group received a mail invitation to participate in the web version of the GIP recruitment survey, and nonrespondents were followed up with repeated invitations to the web version. Individuals assigned to the online-first group received the same invitation letter as the online-only group, asking them to participate in the web version of the GIP recruitment survey. However, nonrespondents were followed up with a reminder letter containing a paper-and-pencil version of the GIP recruitment survey. Individuals in the offline-first group received the paper-and-pencil questionnaire with the initial invitation letter and were followed up with invitations to the web version. Individuals in the concurrent-first group were given the choice between participating in the web version or the paper-and-pencil version; nonrespondents were then followed up with invitations to the web version of the GIP recruitment survey. In our presentation, we will show the results of this experiment and discuss our findings.


Equipping the whole panel with a mobile device. Challenges in recruiting and maintaining the ELIPSS panel

Miss Emmanuelle Duwez (ELIPSS - CDSP/Sciences Po (Paris)) - Presenting Author
Miss Anne Cornilleau (ELIPSS - CDSP/Sciences Po (Paris))

During the last decade, the Internet has moved from an attractive data collection mode to a common way to administer surveys. Nowadays, the main issue is the representativeness of these web surveys, especially in the academic world. Since the LISS Panel and the Knowledge Panel, there has been growing interest in setting up probability-based web panels. The ELIPSS Panel (Étude longitudinale par internet pour les sciences sociales) is one such initiative, launched in France in 2012, which aims to offer social scientists a service for producing nationally representative surveys. It differs from similar survey systems in other countries by equipping all panel members and by using mobile Internet as the main mode of data collection. A touchscreen tablet and a 4G Internet subscription are provided to all panel members in exchange for their participation. Thus, they can answer the research questionnaires even if they do not have an Internet connection at home.
As such a service for collecting social science data did not exist in France, a pilot study had to be set up to define the recruitment process, establish procedures for managing the panel and producing surveys, and develop software tools. After this pilot phase, for which 1,000 individuals were recruited, the panel was expanded to 3,300 members in 2016.
The coming end of ELIPSS in December 2019 gives us the opportunity to assess its implementation. This presentation will focus on the lessons learned in the ELIPSS project, with particular attention to the challenges of recruiting and maintaining a panel when equipping all its members with a mobile device.


Using Address-Based Sampling to Recruit to Pew Research Center’s American Trends Panel

Mr Nick Bertoni (Pew Research Center) - Presenting Author

The American Trends Panel (ATP) is Pew Research Center’s online, probability-based panel. Between 2014 and 2017, the Center conducted three separate recruitments, all of which took place at the end of a dual-frame RDD phone survey. In 2018, the Center conducted its first-ever recruitment to the ATP using Address-Based Sampling (ABS). The objectives of this recruitment were to expand the size of the panel and enhance the quality of its overall sample composition. A push-to-web approach was used to take advantage of the coverage of an ABS survey and combine that with the cost-effectiveness of a web survey. Lessons learned from this ABS recruitment will be shared on important factors such as visual design, within-household selection, and recruitment of hard-to-reach populations, including non-internet households. Results of the recruitment will be provided and compared to the current panel composition and past ATP recruitments. Shortcomings of the recruitment will also be discussed in order to identify areas of opportunity for the Center and for the field to build upon this mode of recruitment. This presentation should be informative for researchers who are using offline modes, including ABS or push-to-web designs, to recruit to online probability-based panels.