ESRA 2019 Draft Programme at a Glance
Understanding Nonrespondents to Inform Survey Protocol Design 1
Session Organisers:
Ms Brenda Schafer (Internal Revenue Service)
Dr Scott Leary (Internal Revenue Service)
Mr Rizwan Javaid (Internal Revenue Service)
Mr Pat Langetieg (Internal Revenue Service)
Dr Jocelyn Newsome (Westat)
Time: Wednesday 17th July, 09:00 - 10:30
Government-sponsored household surveys continue to face historically low response rates (e.g., Groves, 2011; de Leeuw & de Heer, 2002). Although an increase in nonresponse does not necessarily result in an increase in nonresponse bias, higher response rates can help reduce average nonresponse bias (Brick & Tourangeau, 2017).
One method of addressing nonresponse is to maximize response, but this approach often comes at high cost with mixed success (Tourangeau & Plewes, 2013; Stoop et al., 2010). A variety of intervention strategies have been used, including: offering surveys in multiple modes; limiting survey length; offering incentives; making multiple, distinct communication attempts; and targeting messaging to the intended audience (e.g., Dillman et al., 2014; Tourangeau, Brick, Lohr, & Li, 2017). A second method of addressing nonresponse involves imputations and adjustments after data collection is complete (Kalton & Flores-Cervantes, 2003). However, the effectiveness of this approach largely depends on which auxiliary variables are used in the nonresponse adjustment models.
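To make the second method concrete: a classic weighting-class adjustment inflates respondent base weights within cells defined by auxiliary variables so that each cell's weighted total is preserved. The sketch below is a generic, minimal illustration of that idea (the function name and inputs are illustrative, not taken from any study cited above):

```python
import numpy as np

def weighting_class_adjustment(base_weights, cells, responded):
    """Weighting-class nonresponse adjustment.

    Within each cell defined by auxiliary variables, respondent weights
    are multiplied by (sum of all weights in cell) / (sum of respondent
    weights in cell), so each cell's weighted total is preserved.
    Assumes every cell contains at least one respondent.
    """
    base_weights = np.asarray(base_weights, dtype=float)
    cells = np.asarray(cells)
    responded = np.asarray(responded, dtype=bool)

    adjusted = np.zeros_like(base_weights)  # nonrespondents keep weight 0
    for cell in np.unique(cells):
        in_cell = cells == cell
        resp_in_cell = in_cell & responded
        total = base_weights[in_cell].sum()
        resp_total = base_weights[resp_in_cell].sum()
        adjusted[resp_in_cell] = base_weights[resp_in_cell] * total / resp_total
    return adjusted

# Two cells, equal base weights; cell "A" has a 50% response rate,
# so its respondent's weight doubles, while cell "B" is unchanged.
w = weighting_class_adjustment([1, 1, 1, 1], ["A", "A", "B", "B"],
                               [True, False, True, True])
```

The adjustment is only as good as the cells: if the auxiliary variables defining them are unrelated to the survey outcomes, the adjustment preserves totals but removes little bias, which is the dependence the paragraph above points out.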
Although research has been done to understand nonresponse in surveys, there are still many unanswered questions, such as: What demographic characteristics distinguish nonrespondents from respondents? What socio-economic or other barriers may be contributing to a low response rate? Answering these and similar questions may allow us to tailor survey design and administration protocols to overcome specific barriers that lead to nonresponse. Reducing nonresponse may mean fewer adjustments and imputations after data collection.
This session will focus on understanding characteristics of nonrespondents, barriers to survey response, and how knowledge about nonrespondents can guide survey design protocol. Researchers are invited to submit papers, experiments, pilots, and other approaches on any of the following topics:
• Better understanding how nonrespondents differ from respondents.
• Understanding barriers to response for different subgroups.
• Understanding how nonresponse for different subgroups may have changed over time.
• Using knowledge about nonrespondents to design focused intervention strategies. These could include, but are not limited to, tailored messaging, tailored modes of administration, distinct forms of follow-up, and shortened surveys.
• Designing survey protocols to increase response from hard-to-reach populations of interest.
Keywords: Nonresponse, Survey Protocol, Survey Design, Behavioural Insights
The Impact of Declining Response Rates on Nonresponse Bias in a National Health Survey
Ms Shelley Roth (Westat) - Presenting Author
Ms Wendy Van de Kerckhove (Westat)
Dr Minsun Riddles (Westat)
Ms Yiting Long (Westat)
Mr Jay Clark (Westat)
Dr Leyla Mohadjer (Westat)
Response to household surveys continues to drop regardless of survey topic, population of interest, or data collection mode. Nonresponse bias occurs when unit nonrespondents differ from respondents, resulting in inaccurate survey estimates for the inference population. We present some results of a nonresponse bias analysis using data from the National Health and Nutrition Examination Survey (NHANES) conducted in the United States. NHANES collects information used to assess the health and nutritional status of adults and children in the United States, to estimate the prevalence of various diseases and health conditions, and to inform health policy planning. Recent drops in response rates prompted a full investigation into possible nonresponse bias for the 2013-2014 and 2015-2016 survey years. NHANES is unique in that it collects information via both an interview and a physical exam. The physical exam data allow for evaluation of potential bias in important survey outcomes, such as diagnosed diabetes, hypertension, and high cholesterol. We examined the relationship of a set of characteristics, available for both respondents and nonrespondents, to response propensity and to final survey outcome statistics. We evaluated nonresponse bias prior to survey weighting adjustments using bivariate and multivariate analyses of the relationship between response status and the auxiliary variables, and by examining the R-indicator. We evaluated the effects of survey weighting adjustments on nonresponse bias by comparing differences in estimates between stages of weighting, and by examining correlations of auxiliary variables with outcomes.
Finally, we evaluated nonresponse bias on final outcome statistics after weighting adjustments were implemented by comparing NHANES estimates to external data sources and prior years, by calculating the potential range of bias through a sensitivity analysis, and by evaluating differences in outcome estimates for respondents when considering level-of-effort to obtain response.
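For readers unfamiliar with the R-indicator used in this analysis: it summarizes how variable the estimated response propensities are, with R = 1 indicating fully representative response. A minimal sketch, assuming propensities have already been estimated (this is an illustration of the general formula R(ρ) = 1 − 2·S(ρ), not code from the NHANES analysis):

```python
import numpy as np

def r_indicator(propensities):
    """Sample-based R-indicator: R(rho) = 1 - 2 * S(rho), where S(rho)
    is the standard deviation of the estimated response propensities.
    R = 1 means every unit is equally likely to respond (representative);
    lower values indicate more variable, less representative response.
    """
    rho = np.asarray(propensities, dtype=float)
    return 1.0 - 2.0 * rho.std(ddof=1)

# Uniform propensities -> perfectly representative response
r_uniform = r_indicator([0.5] * 10)

# Propensities that vary strongly across units -> lower R-indicator
r_varied = r_indicator([0.2, 0.8, 0.2, 0.8])
```

In practice the propensities would come from a response model (e.g., logistic regression on the auxiliary variables), and weighted versions of S(ρ) are typically used for complex samples.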
Understanding Nonresponse in a German PhD Panel Study – Results from a Mixed-Mode Nonrespondent Survey
Ms Susanne de Vogel (German Centre for Higher Education and Science Studies (DZHW)) - Presenting Author
Dr Gesche Brandt (German Centre for Higher Education and Science Studies (DZHW))
The German Centre for Higher Education Research and Science Studies (DZHW) set up a government-sponsored panel study to examine the learning conditions, career entry, and career development of doctorate holders in Germany. To this end, we conducted a full-sample survey of all those who successfully completed their PhD at a German higher education institution (HEI) in 2014. Altogether, about one fifth of the population took part in the initial wave (N=5,411). The panel has continued with three subsequent observations and will continue with further waves (online surveys) taking place every year. So far, the response rate was 27% in the second wave and between 61% and 66% in the following surveys.
To investigate the mechanisms that lead to nonresponse in our study and to generate ideas about how to increase response rates in future waves, we conducted a nonrespondent survey aimed at all 1,705 PhD holders who did not take part in the fourth wave. To investigate mode effects, we chose an experimental survey design. The nonrespondent sample was split into three groups, which were asked about their reasons for nonresponse via 1) an online form with a closed question, 2) an online form with an open question, or 3) a personal email.
Overall, the response rate was again low (11%) but differed between modes: it was highest in the second group (14%) and lowest in the third group (8%). Regarding the reasons for nonresponse, however, the three subsamples yielded similar findings. Among the most common reasons were lack of time, forgetting to respond, and the questionnaire's length.
These results offer valuable guidance for improving the design of our future waves and also show that technical problems and data protection concerns are not relevant causes of nonresponse.
Behavioral Insights: Using Data-Driven Analysis to Better Understand Barriers to Survey Response
Ms Brenda Schafer (Internal Revenue Service) - Presenting Author
Mr Rizwan Javaid (Internal Revenue Service)
Mr Patrick Langetieg (Internal Revenue Service)
Dr Scott Leary (Internal Revenue Service)
Dr Kerry Levin (Westat)
Dr Jocelyn Newsome (Westat)
Ms Martha Stapleton (Westat)
As with all surveys involving statistical samples, researchers must consider the impact of nonresponse bias on the generalizability of the findings. One method of addressing nonresponse bias is to maximize response, but this approach can be quite costly and often has mixed results (Tourangeau & Plewes, 2013; Stoop et al., 2010). Other approaches look more closely at who is not responding. For example, the U.S. Bureau of Labor Statistics uses a data-driven approach to inform efforts to reduce nonresponse by studying nonrespondent characteristics at each of the three data collection stages (address refinement, recruitment, and data collection) of its Job Openings and Labor Turnover Survey (Earp, 2013). This paper examines how analysis of the demographic characteristics of nonrespondents can help survey researchers better understand barriers to survey response in order to inform survey sampling methodology, nonresponse follow-up protocols, and nonresponse bias analysis.
The IRS Individual Taxpayer Burden (ITB) survey is a multi-mode survey that has been conducted annually since 2010. The survey is sent to a statistically representative sample of 20,000 United States taxpayers. It measures the time and money taxpayers spend to comply with their tax reporting responsibilities. Historically, taxpayers with the least complex tax returns are the least likely to respond to the ITB survey.
For this research, we analyze taxpayer characteristics of nonrespondents for seven ITB surveys (Tax Years 2010-2016). We segment nonrespondents by tax return characteristics (e.g., gender, income level, filing history, filing method) and go beyond demographic analysis to show how reflecting on behavioral insights can lead to a more informed approach to reducing survey nonresponse.
Geospatial Analysis on Nonresponse Rates for a US Tax Administration Household Survey
Mr Rizwan Javaid (US Internal Revenue Service) - Presenting Author
Ms Brenda Schafer (US Internal Revenue Service)
Mr Patrick Langetieg (US Internal Revenue Service)
Mr Scott Leary (US Internal Revenue Service)
Ms Hanyu Sun (Westat, Inc.)
Mr Michael Giangrande (Westat, Inc.)
Ms Jocelyn Newsome (Westat, Inc.)
Ms Kerry Levin (Westat, Inc.)
Studies have shown that geography can play an important role in survey nonresponse (Harzing, 2000; Harzing, 2006). For example, one study paired socioeconomic factors with geography to evaluate nonresponse (Bates & Mulry, 2011). Findings indicated that married homeowners in the Midwest and Northern Atlantic areas of the US typically had the lowest nonresponse for Census surveys. Another study evaluated nonresponse by location when an incentive was provided (Sun et al., 2018). The results, based on a US tax administration survey, showed that the most populous states (e.g., Texas, California, and New York) had the highest nonresponse.
The Individual Taxpayer Burden (ITB) survey is an annual multi-mode tax administration survey sent to 20,000 individuals in the US. The survey measures the time and money taxpayers spend complying with tax law regulations. The ITB survey is currently being fielded for the eighth consecutive year. While the national nonresponse rate has remained steady, between approximately 60% and 70% each year, it varies considerably by state. Notably, certain regions of the country, such as the Midwest (e.g., North Dakota, South Dakota, and Minnesota), consistently have the lowest nonresponse rates, whereas the South (e.g., Texas, Louisiana, Mississippi, and Georgia) has the highest.
We will use geospatial analysis to examine how survey nonresponse among different taxpayer segments varies by location, using survey responses and tax administration data from Tax Years 2010-2016. We will analyze year-to-year changes in overall nonresponse at the state level. We will then examine how tax return data, including filing status, filing method, age, income, and gender, interact with geography to affect survey nonresponse by state.