Conference Programme 2015



Thursday 16th July, 16:00 - 17:30 Room: HT-102

Response rates and nonresponse bias in comparative surveys 2

Convenor: Dr Koen Beullens (KU Leuven)
Coordinator: Dr Ineke Stoop (SCP)

Session Details

Comparing response rates and possibly associated nonresponse bias can be hard in the context of cross-national surveys. First, response rate objectives may be set differently. The European Social Survey, for instance, sets a 70% response rate objective without really penalizing countries that do not achieve it. PIAAC, on the other hand, also sets a target response rate of 70%, but accepts response rates between 50% and 70%, while requiring a nonresponse analysis when the response rate falls below 50%. Second, response rates have to be calculated in comparable ways. Response rates from EU-SILC and the LFS, for instance, are sometimes hard to compare because the numerator and/or denominator may be calculated differently. Third, and even more complicated, are the national differences in survey design features (e.g. sampling design) that have diverse implications for response and nonresponse bias.

Not only is it hard to determine nonresponse bias for a single survey; the cross-national context adds further complexity, potentially jeopardizing comparability between countries or surveys. The survey climate in different countries, the related nonresponse mechanisms, strategies to minimize nonresponse (bias), and adjustment methods are unlikely to be uniform across countries or surveys.

Therefore, this session welcomes papers on (1) enhancing response, (2) fieldwork strategies minimising nonresponse bias and (3) nonresponse adjustment methods, all providing better comparability for cross-national surveys.

Paper Details

1. Is it Really Worth the Effort? – A Meta-Regression Approach on the (Cost-) Effectiveness of Incentives in Self-Administered Surveys
Mr Andreas Schneck (Goethe University Frankfurt am Main)
Professor Katrin Auspurg (Goethe University Frankfurt am Main)

Nonresponse (bias) is a major threat to the validity of survey results. Besides reminders, incentives offer a promising way to increase response rates, particularly if no information on nonrespondents exists. For the first time, a meta-regression approach is used that allows multivariate hypothesis tests while addressing heterogeneity across study settings that could otherwise bias results: Which incentive settings work best given the basic conditions of the survey? How cost-effective are incentives? Summarizing 363 trials, we find that particularly low-valued unconditional monetary incentives offer effective ways to improve response rates while keeping survey costs to a minimum.


2. A responsive fieldwork design to increase retention rates in the Survey of Health, Ageing and Retirement in Europe (SHARE)
Dr Annette Scherpenzeel (Munich Center for the Economics of Aging (MEA))

Panel retention rates in most countries participating in the Survey of Health, Ageing and Retirement in Europe (SHARE) have stabilised over the last four waves. In Germany, however, the retention rate is lower than in other countries. For the sixth wave of data collection, a responsive fieldwork design was implemented in the German substudy in an attempt to increase the retention rate. We present the responsive design used, including the respondent characteristics included in the monitoring, the fieldwork interventions that resulted from the monitoring, and the results for the German retention rates.


3. Money Makes the World Go ‘Round: A Survey Experiment on Income Non-Response in Multi-National Surveys
Dr Jill Carle (Pew Research Center)
Dr James Bell (Pew Research Center)
Ms Fatima Ghani (Pew Research Center)

Creating comparable, cross-national measures of household income is inherently challenging, given differences in language and context, and often prohibitively high non-response rates. Given these limitations, what is the most useful survey method for reducing non-response rates while providing data that are comparable across a wide range of countries? Using Pew Research Center data from a survey experiment and several cross-national surveys, we illustrate one mechanism for reducing refusal rates on income questions. By combining a closed-ended income question with more general, median-based follow-up options, we limit item non-response on income.