Tuesday 18th July, 14:00 - 15:30 Room: N AUD4


Improving Response in Government Surveys 1

Chair: Dr Kerry Levin (Westat)
Coordinator 1: Dr Jocelyn Newsome (Westat)
Coordinator 2: Ms Brenda Schafer (Internal Revenue Service)

Session Details

Government-sponsored household surveys continue to face historically low response rates (Groves, 2011; de Heer, 1999). Low response rates can introduce nonresponse bias, since nonresponse is rarely uniform across groups or subgroups of interest. As a result, many government surveys focus heavily on maximizing response, often at a high cost (Tourangeau & Plewes, 2013; Stoop et al., 2010). For example, the cost of the U.S. census has risen roughly sixfold over the past several decades, from $16 per household in 1970 to $98 per household in 2010 (GAO, 2011).
Survey methodologists have sought to address the problem of nonresponse by implementing a number of interventions to increase respondent engagement, minimize burden, and reduce nonresponse bias. Suggested intervention strategies include: offering surveys in multiple modes; limiting survey length; emphasizing official sponsorship; offering incentives; making multiple, distinct communication attempts; using respondent-centric design; and targeting messaging to the intended audience (Dillman et al., 2014). However, the effectiveness of these strategies varies widely, and researchers continue to explore how best to implement them in a cost-effective manner.
This session will explore innovative approaches for increasing response in government-sponsored household surveys. Researchers are invited to submit papers discussing experiments or pilot studies on any of the following topics:
• Multimode survey design, including optimizing for mobile devices, to encourage response;
• Improving response through the use of incentives, targeted messaging, or multiple distinct forms of follow-up;
• Interviewer techniques that encourage response;
• Use of paradata to improve response;
• Impact of survey length on response; and
• Attempts to increase response from hard-to-reach populations of interest.

References
U.S. Government Accountability Office (2011). 2010 Census: Preliminary lessons learned highlight the need for fundamental reforms (GAO-11-496T). Washington, D.C.: April 6, 2011.
De Heer, W. (1999). International response trends: Results of an international survey. Journal of Official Statistics, 15(2), 129.
Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, phone, mail, and mixed-mode surveys: The tailored design method. John Wiley & Sons.
Groves, R. M. (2011). Three eras of survey research. Public Opinion Quarterly, 75(5), 861-871.
Stoop, I., Billiet, J., Koch, A., & Fitzgerald, R. (2010). Improving survey response: Lessons learned from the European Social Survey. John Wiley & Sons.
Tourangeau, R., & Plewes, T. J. (Eds.). (2013). Nonresponse in social science surveys: A research agenda. National Research Council.

Paper Details

1. International Trends in Household Survey Nonresponse: The Labour Force Survey
Professor Edith de Leeuw (Utrecht University)
Professor Joop Hox (Utrecht University)
Dr Annemieke Luiten (Statistics Netherlands)
Dr Barry Schouten (Statistics Netherlands)

Household surveys by National Statistical Institutes and government organizations show downward trends in response rates, and survey researchers fear that these rates have been falling for decades. In 1990, at the first international workshop on household survey nonresponse, an initiative was started to compare nonresponse data from official agencies. This resulted in one of the first international trend analyses, by De Leeuw & De Heer (2002), who analyzed longitudinal data from National Statistical Institutes covering the period 1972 to 1997 and showed that response rates declined over the years, with contact rates declining on average 0.2% a year and refusal rates increasing on average 0.3% per year. Since then, communication with respondents has shifted to a variety, and combination, of survey modes. Do these changing data collection methods result in a different picture?
In 2015, an initiative was started to study response rate trends again. The new questionnaire, based on the original 1990 questionnaire, added questions on mixed-mode data collection, fieldwork effort, and fieldwork costs. It was sent to all European Labour Force Survey (LFS) contacts, as well as to several non-European countries, such as Australia, New Zealand, Canada, and the USA. A retrospective inventory was used: respondents were asked to report response, refusal, and contact rates from 1998 to 2015. To limit respondent burden, we asked for a description of the 2015 design only and asked respondents to indicate in which years design changes were implemented.
In the preliminary analysis of the new data (1998-2015), we differentiated between voluntary and mandatory surveys, as levels and trends proved to be different in the original analyses by De Heer (1999) and De Leeuw & De Heer (2002). The new data show this difference as well: the mean response rate was 72.5% for voluntary LFS and 84.7% for mandatory surveys. Refusal rates were 14.1% and 3.4%, respectively, and noncontact rates were 11.6% versus 9.3%. A trend analysis of response, refusal, and noncontact rates showed that from 1998 to 2015 response declined, while refusal and noncontact rates increased.
We combined the original data with the new data in a single data file. Countries present in only one of the two data collection periods were excluded; countries that did not report all years were retained. A multilevel regression over the entire observation period showed an almost flat trend in the period 1984-1997, but a shift in slope in the period 1998-2015: response rates decrease faster and refusal rates increase faster after 1998. Noncontact rates show a different pattern: they increase throughout the entire range of years, with no change in trend after 1998. Overall, the main conclusion is that the trends visible in De Leeuw and De Heer (2002) mostly continue, possibly with a small acceleration.
In the coming months, we will continue these analyses, replicating and extending the original analyses of the correlates of response, refusal, and (non)contact.
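
For readers unfamiliar with this kind of trend model, the sketch below shows one way a multilevel regression with a post-1998 slope shift could be specified. It is a minimal illustration, not the authors' code; the file and column names (lfs_response_rates.csv, country, year, response_rate) are assumptions.

    # Minimal sketch (Python/statsmodels) of a multilevel trend model with a
    # slope shift after 1998. File and column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("lfs_response_rates.csv")  # one row per country-year (assumed)

    # Overall time trend, plus an extra slope that "switches on" after 1998
    df["t"] = df["year"] - df["year"].min()
    df["t_post1998"] = (df["year"] - 1998).clip(lower=0)

    # Random intercepts per country; the fixed effect for t_post1998 captures
    # the change in slope after 1998 (negative = faster decline)
    model = smf.mixedlm("response_rate ~ t + t_post1998", df, groups=df["country"])
    print(model.fit().summary())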


2. Experiments to Improve Response in a Government Survey
Dr Jocelyn Newsome (Westat)
Dr Kerry Levin (Westat)
Ms Hanyu Sun (Westat)
Ms Jennifer McNulty (Westat)
Ms Brenda Schafer (IRS)
Mr Pat Langetieg (IRS)
Dr Saurabh Datta (IRS)

The survey industry has experienced a steady decline in response rates (Groves, 2011). We will examine findings from a series of experiments designed to increase response to a large household survey. The IRS Individual Taxpayer Burden (ITB) Survey is an annual multi-mode survey sent to 20,000 individuals in the United States; it measures the time and money taxpayers spend complying with tax law. The survey is currently being fielded for the sixth consecutive year, and over the years multiple experiments have been conducted to explore methods of increasing response (Newsome et al., 2012; Levin et al., 2013; Levin et al., 2015). We will examine the impact of each experiment on response rates, as well as any differences in response among subgroups that have historically been underrepresented in the ITB Survey (e.g., younger adults, low-income respondents, and parents with young children).
We will discuss the following experiments:
• Mode order. Although survey researchers often use mixed-mode surveys to help reduce particular forms of survey error, speed up data collection, or lower costs (de Leeuw, 2005; Pierzchala, 2006), current research is unclear about which sequence of modes is most effective. To address this question, an experiment comparing mode sequences was embedded in the administration of the 2010 IRS ITB Survey.
• Number of contacts. Past studies have shown that additional contacts increase overall response, although response rates move incrementally towards an asymptote or plateau (Hassol et al., 2003; Rookey et al., 2012). However, there is minimal evidence in the literature on the “optimum” number of contacts. In the 2012 ITB Survey, we tested the effectiveness of a seventh contact.
• Incentives. Decades of research have demonstrated that, all else being equal, incentives increase participation rates and reduce refusal rates (Singer & Bossarte, 2006). However, although incentives are effective, they add substantially to the cost of a large-scale household survey. In the face of tightening government budgets, the IRS has experimented with the impact of incentives in both the 2010 and 2015 administrations of the ITB Survey.
• Messaging. When formulating an overall communication strategy, Dillman et al. (2014) advocate a “respondent-centric” design that appeals to a wide variety of people. In the administration of the 2013 ITB Survey, an experiment examined the impact of different messaging styles on response rates. We compared a more formal style with one designed to appear more open and engaging. The friendlier messaging also included an appeal to civic responsibility and to improving the status quo, which has been shown to be particularly effective for Millennials (Molyneux & Teixeira, 2010).
• Non-response follow-up. Although reminder postcards and phone messages (both automated and with a live interviewer) are established strategies for non-response prompting, the literature is unclear as to which is most effective (Dillman et al., 2014; Census, 2004; McCarthy, 2008). Two experiments in the administration of the 2010 ITB Survey compared the effectiveness of a postcard, an automated call, and a live interviewer call in increasing response.
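
As a minimal illustration of how arm-level response rate differences from experiments like these are commonly tested (a generic sketch with made-up counts, not the authors' analysis):

    # Minimal sketch (Python/statsmodels): two-proportion z-test comparing
    # response rates between two experimental arms. All counts are made up.
    from statsmodels.stats.proportion import proportions_ztest

    responded = [412, 367]  # completed surveys in arm A and arm B (hypothetical)
    mailed = [2000, 2000]   # cases fielded per arm (hypothetical)

    stat, pvalue = proportions_ztest(responded, mailed)
    print(f"{responded[0]/mailed[0]:.1%} vs {responded[1]/mailed[1]:.1%}: "
          f"z = {stat:.2f}, p = {pvalue:.3f}")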


3. Turning things around: Ways to improve response in pension-related surveys
Mr Jochen Heckmann (Kantar Public Germany)
Dr Thorsten Heien (Kantar Public Germany)

Like many other surveys in Germany and elsewhere, the study on “Old-Age Incomes in Germany” (Alterssicherung in Deutschland; ASID) has recently suffered from a decline in survey participation. The ASID is a register-based multi-mode survey, run by Kantar Public (formerly TNS Infratest Sozialforschung) since 1986, which in 2015 comprised more than 16,800 postal, face-to-face, and telephone interviews with people aged 55+ and their partners. Within only 20 years, the overall response rate dropped by 17 percentage points, from 56% in 1992 to 39% in 2011. Because of the survey's enormous importance for the contracting authority, the German Federal Ministry of Labour and Social Affairs, and for pension legislation and research in Germany in general, an intense discussion began on how to improve survey response. As a consequence, several measures were implemented or experimentally tested before and during fieldwork of the ASID 2015. They include (a) the design and timing of mailings, (b) the training and payment of interviewers (especially to increase response from hard-to-reach populations of interest, such as immigrants), and (c) incentives for respondents. Indeed, and in contrast to similar surveys in Germany, the overall response rate increased from 39% in 2011 to 42% in 2015.

The paper discusses these results in detail, including an analysis of survey response for relevant subgroups by means of multivariate modeling. To control for (un)wanted side effects, and with respect to a broader “total survey error” perspective, the analysis also takes measurement errors into account; this is made possible by the extensive post-fieldwork data editing procedures of the ASID. Furthermore, to address survey cost efficiency, the paper illustrates to what degree, and at what cost, nonresponse (and measurement) errors could be reduced. It concludes by shedding some light on future ASID survey design perspectives.
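
By way of illustration only: multivariate models of subgroup response of the kind mentioned above are often specified as a logistic regression on the gross sample, as in the sketch below. The file and variable names (asid_sample_frame.csv, responded, age_group, migrant, incentive) are assumptions, not the actual ASID variables.

    # Minimal sketch (Python/statsmodels): logistic regression of response
    # propensity on subgroup indicators. File and variable names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    frame = pd.read_csv("asid_sample_frame.csv")  # gross sample, one row per case

    # responded = 1 if a usable interview was obtained, 0 otherwise
    result = smf.logit("responded ~ C(age_group) + migrant + incentive",
                       data=frame).fit()
    print(result.summary())
    print(result.get_margeff().summary())  # average marginal effects per subgroup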


4. Improving Response in Multimode and Single Mode Probability Based Surveys Compared to a Non-probability Survey
Dr Virginia Lesser (Oregon State University)
Ms Kerri Nawrocki (Oregon State University)
Ms Lydia Newton (Oregon State University)

Participation in surveys, regardless of the mode of delivery, is decreasing, and survey methodologists continue to seek methods that can obtain high response rates. For example, providing the option of completing a survey by either mail or web could appeal to different demographic groups, and offering an incentive may attract others to complete the survey.

A 2015 household study of the general population, using probability sampling, was conducted for the Oregon Department of Transportation. The purpose of the study was to assess satisfaction with the services provided by the State of Oregon's Department of Transportation. Nested within the main purpose of the study, experiments comparing different methods to improve response rates were embedded. The first approach examined the impact of offering multiple modes of response. The sample was divided into two groups: one group was presented with a mixed-mode approach, which initially asked participants to complete the survey by web, with nonrespondents then sent printed surveys and asked to return the questionnaire by mail; the second group was given only the printed version of the questionnaire and asked to return it by mail. Within each of these groups, we also investigated whether an incentive, a special decal of the State of Oregon, would affect response rates. We will examine response rates and coverage error across the mixed-mode and single mail-mode groups.

At the same time, a less expensive approach to obtaining opinions from Oregon households was examined: a non-probability panel of Oregon residents was asked to complete the same questionnaire. We will compare key questions from the mixed-mode and single-mode groups in the probability sample with the results from the non-probability sample, and will summarize the cost differences among these approaches.
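
To make the planned probability versus non-probability comparison concrete, the sketch below shows one generic way to contrast a key estimate from the two samples. The files, the satisfaction item, and the design weight are hypothetical stand-ins, not the study's actual data.

    # Minimal sketch (Python/statsmodels): comparing a key estimate from the
    # probability sample (design-weighted) with the non-probability panel.
    # File and column names are hypothetical.
    import pandas as pd
    from statsmodels.stats.weightstats import DescrStatsW

    prob = pd.read_csv("odot_probability_sample.csv")
    panel = pd.read_csv("odot_nonprobability_panel.csv")

    d_prob = DescrStatsW(prob["satisfaction"], weights=prob["weight"])
    d_panel = DescrStatsW(panel["satisfaction"])  # panel left unweighted here

    for name, d in [("probability", d_prob), ("non-probability", d_panel)]:
        lo, hi = d.tconfint_mean()
        print(f"{name}: mean = {d.mean:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")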