ESRA 2017 Programme
Thursday 20th July, 11:00 - 12:30 Room: N 101
Challenges of long-term repeated cross-sectional attitude surveys
Chair: Dr Oshrat Hochman (GESIS - Leibniz Institute for the Social Sciences)
Coordinator 1: Mr Michael Blohm (GESIS - Leibniz Institute for the Social Sciences)
Coordinator 2: Ms Martina Wasmer (GESIS - Leibniz Institute for the Social Sciences)
Session Details
Long-term repeated cross-sectional attitude surveys are crucial for monitoring society and social change over time. At the same time, these surveys are themselves influenced by social change and new realities. One consequence of this relationship is the tension between the need to replicate questions across waves and the need to innovate: on the one hand, repeated measures are meant to allow the monitoring of social change over time, requiring the regular repetition of items. On the other hand, in light of such social change, and also considering advances in survey research, cross-sectional surveys must stay up to date and allow the scientific community to investigate newly emerging issues and to use the best available measures for existing trends.
The production of high-quality data in long-term cross-sectional attitude surveys also depends on their reliability and representativeness. Such surveys must therefore deal with declining response rates to scientific surveys, a process characteristic of other survey types as well, especially in the Western world. Whether in telephone or face-to-face interviews, postal questionnaires or online surveys, individuals are more reluctant than ever to respond, and target persons are harder than ever to contact. Declining response rates require survey producers to invest more thought, resources and effort in the implementation of social science surveys. Among other things, thought must be given to incentives, interviewer training, fieldwork monitoring, mixed-mode designs and adaptive designs that secure high response rates and high data quality.
The proposed session will foster the exchange of ideas on current and prospective challenges in long-term repeated cross-sectional attitude surveys. We especially invite papers on the topics of a) reconstruction and modification of survey items and b) measures securing high response rates in long-term repeated cross-sectional surveys.
Paper Details
1. Decreasing Response Rates in the German General Social Survey: A Threat for Data Quality?
Mrs Jessica Walter (GESIS - Leibniz Institute for the Social Sciences)
Mr Michael Blohm (GESIS - Leibniz Institute for the Social Sciences)
Mrs Martina Wasmer (GESIS - Leibniz Institute for the Social Sciences)
The German General Social Survey (ALLBUS) is a repeated cross-sectional, multi-thematic survey conducted since 1980. Like other surveys, the ALLBUS also faces the problem of declining response rates. Since 1994, response rates have decreased steadily from 53.7% to 35%, and a further decline has to be expected despite greater fieldwork efforts. If fewer and fewer selected persons are willing to participate in surveys, the question arises how this affects survey data quality. Selective participation behavior might threaten the representativeness of a survey. In this presentation we deal with these data quality issues on the basis of the ALLBUS: Is there a (negative) relationship between the response rates of the ALLBUS surveys and nonresponse bias over time? In our analysis we look at this relationship not only across survey years but also over the course of the data collection period.
The ALLBUS provides a good basis for analyzing the relationship between response rate and data quality, since it has used an almost identical study design with respect to the key features of survey implementation (since 1994: target population of adults living in private households; samples of named individuals; 3,500 completed interviews; interview duration of around 70 minutes; consistent calculation of response rates). To analyze nonresponse bias, we validate the ALLBUS surveys against an external criterion, namely the German Microcensus. For example, selective bias regarding gender, age and education is examined. Since such a benchmark criterion is applicable to sociodemographic variables only, we also analyze substantive variables by comparing results over the course of the data collection period.
First results indicate that decreasing response rates are not associated with declining data quality or changes in sample composition. We argue, however, that these findings hold only for samples with high quality standards regarding sampling and fieldwork control.
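The benchmark comparison described above can be made concrete with a small sketch. This is not the authors' actual analysis; the numbers below are hypothetical, and nonresponse bias is measured here simply as the difference, in percentage points, between the achieved sample's category shares and an external benchmark such as the Microcensus.

```python
def category_shares(counts):
    """Convert absolute counts per category into proportions."""
    total = sum(counts.values())
    return {cat: n / total for cat, n in counts.items()}

def nonresponse_bias(sample_counts, benchmark_shares):
    """Per-category bias: sample share minus benchmark share, in percentage points."""
    sample_shares = category_shares(sample_counts)
    return {cat: round(100 * (sample_shares[cat] - benchmark_shares[cat]), 1)
            for cat in benchmark_shares}

# Hypothetical education distribution of 3,500 completed interviews
# vs. illustrative benchmark shares (NOT actual Microcensus figures).
sample = {"low": 900, "medium": 1500, "high": 1100}
benchmark = {"low": 0.32, "medium": 0.43, "high": 0.25}

print(nonresponse_bias(sample, benchmark))
# → {'low': -6.3, 'medium': -0.1, 'high': 6.4}
```

In this stylized example, lower-educated persons would be underrepresented by about 6 percentage points and higher-educated persons overrepresented by a similar amount, the kind of selective bias the abstract refers to.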
2. What if the interviewer decides: Effects of interviewer-based additional incentives for response enhancement in the German sub-study of the European Social Survey (ESS)
Ms Julia Harand (infas Institut für angewandte Sozialwissenschaft GmbH (Institute for Applied Social Sciences), Bonn, Germany)
High response rates are still considered a key indicator of good survey quality. For the European Social Survey (ESS), an internationally comparative cross-sectional study, it is essential to enhance response rates and keep nonresponse bias to a minimum. To increase the response rate, a set of fieldwork measures is employed (e.g. internationally standardised interviewer training, advance letters and incentives, multiple visits at different times and on different days of the week, close fieldwork monitoring). We will focus on the ESS Round 8 incentive as a direct measure to encourage participation in the study.
Enhancing response rates is a well-known strategy for minimising the risk of nonresponse bias, even though nonresponse bias depends not only on the response rate itself but also on the differences between respondents and non-respondents (selectivity). Some groups (e.g. people with higher education) usually participate in surveys with a higher probability than other groups (e.g. people with lower education), which ultimately leads to nonresponse bias. Incentives are therefore a well-known instrument for increasing participation among such underrepresented subgroups.
With our incentive strategy we aim not only at a higher response rate but also at a more balanced one. Besides the general monetary incentive announced in the advance letter, interviewers had the additional option of offering reluctant persons an even higher incentive. This incentive could be used temporarily at the beginning of the field period to persuade reluctant persons right from the start. The interviewers were free to decide whom to offer the higher incentive. We will focus on the incentive's effect on the response rate, which reflects both the individual (rational) decision of the interviewer to offer a higher incentive (when persuading a sample person seems likely to succeed) and the decision of the sample person to participate for more money. Would this kind of incentive be a good instrument for increasing participation in the survey? Do the interviewers' decisions and behaviour have an effect on participation? Do we reach reluctant groups who would otherwise tend to refuse participation?
The analysis is based on the German sub-study of the European Social Survey (ESS), an internationally comparative study conducted every two years across Europe since 2002. The German sample is a representative sample of individuals aged 15 or older living in private households in Germany, drawn from population registers.
We estimate the effect of the incentive on survey commitment and response rate by means of a multivariate model. The model also takes key demographic factors as well as possible interviewer effects into account.
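The general shape of such a model can be sketched as a logistic regression of participation on an incentive indicator. This is a minimal illustration, not the authors' actual specification: the data are simulated, the coefficients are invented, and the demographic covariates and interviewer effects mentioned above are omitted for brevity.

```python
# Minimal sketch: estimate an incentive effect on participation (log-odds scale)
# with a hand-rolled logistic regression on simulated data. Illustrative only.
import math
import random

random.seed(1)

def simulate(n=5000, beta0=-0.6, beta_incentive=0.5):
    """Simulate sample persons whose participation probability rises with the incentive."""
    data = []
    for _ in range(n):
        incentive = 1.0 if random.random() < 0.5 else 0.0  # offered higher incentive?
        p = 1 / (1 + math.exp(-(beta0 + beta_incentive * incentive)))
        participated = 1 if random.random() < p else 0
        data.append((incentive, participated))
    return data

def fit_logit(data, lr=1.0, epochs=300):
    """Estimate intercept and incentive coefficient by gradient ascent on the log-likelihood."""
    b0, b1 = 0.0, 0.0
    n = len(data)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in data:
            p = 1 / (1 + math.exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

b0, b1 = fit_logit(simulate())
print(f"estimated incentive effect (log-odds): {b1:.2f}")
```

A real analysis would add the demographic covariates and model interviewer effects (e.g. as a multilevel random intercept), but the core quantity of interest is the same: the coefficient on the incentive indicator.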
3. Monetary incentives in face-to-face surveys of the general population - what works best?
Mr Michael Blohm (GESIS - Leibniz Institute for the Social Sciences)
Mr Achim Koch (GESIS - Leibniz Institute for the Social Sciences)
Declining response rates are a continuing problem for household surveys in many Western countries. Survey organizations have made various attempts at increasing response rates or at least halting downward trends in response rates. These include, for instance, an increased number of call attempts, interviewer training in refusal avoidance and refusal conversion, the use of advance letters or the provision of incentives to sample persons to encourage survey participation. The use of respondent incentives in order to increase response has a long tradition in mail surveys. More recently, however, the use of incentives has also become more common in face-to-face surveys.
The German General Social Survey (ALLBUS), too, has been facing an increase in nonresponse in recent years. The ALLBUS is a biennial face-to-face survey of the adult population, covering a wide range of topics and aiming to chart long-term trends in attitudes and behavior in Germany (http://www.gesis.org/en/allbus). Between 1994 and 2014 the response rate of the ALLBUS decreased from 54% to 35%. The main reason for this decline was a rise in the number of refusals. In order to find out whether the use of respondent incentives might help to stop this trend, a series of incentive experiments has been implemented in the ALLBUS in recent years.
According to the literature, monetary incentives are more effective than in-kind incentives. Therefore, in each experiment monetary incentives were provided to respondents. Both the type of incentive (promised vs. prepaid) and its value (promised: 10 vs. 20 euros; prepaid: 5 vs. 10 euros) were varied. Since some of the experimental treatments were implemented twice, we can also analyze how stable incentive effects are under broadly similar conditions. In our presentation we will mainly deal with the effect of the different types of incentives on cooperation and response rates and on survey costs. Our results indicate that prepaid incentives are superior to promised incentives. These insights are relevant for surveys with a design similar to that of the ALLBUS, i.e. large-scale face-to-face surveys of the general population.
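The cost dimension of the prepaid-vs-promised comparison comes down to simple arithmetic: prepaid incentives are sent to every sampled person, while promised incentives are paid only to those who actually complete an interview. The sketch below uses hypothetical figures (not results from the ALLBUS experiments) to show how the cost per completed interview is computed in each scheme.

```python
def cost_per_complete(n_sample, response_rate, incentive, prepaid):
    """Incentive cost per completed interview.

    Prepaid: every sampled person receives the incentive.
    Promised: only completing respondents receive it.
    All figures below are illustrative, not study results.
    """
    completes = n_sample * response_rate
    paid_to = n_sample if prepaid else completes
    return paid_to * incentive / completes

# Prepaid 10 EUR to all 10,000 sampled persons, assuming 40% response:
print(cost_per_complete(10_000, 0.40, 10, prepaid=True))   # → 25.0 EUR per complete
# Promised 20 EUR to respondents only, assuming 37% response:
print(cost_per_complete(10_000, 0.37, 20, prepaid=False))  # → 20.0 EUR per complete
```

Under these made-up assumptions, a prepaid incentive of half the face value still costs more per completed interview, which is why the presentation weighs incentive effects on response rates against survey costs rather than looking at response rates alone.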