
ESRA 2019 full program




Exploring New Insights into the Measurement and Reduction of Respondent Burden 1

Session Organisers Dr Robin Kaplan (Bureau of Labor Statistics)
Dr Morgan Earp (Bureau of Labor Statistics)
Time Tuesday 16th July, 11:00 - 12:30
Room D04

In government surveys, respondent burden is often thought of in terms of objective measures, such as the length of time it takes to complete a survey and the number of questions. Bradburn (1978) posited that, in addition to these objective measures, burden can be thought of as a multidimensional concept that includes respondents’ subjective perceptions of survey length, how effortful the survey is, and how sensitive or invasive the questions are. The level of burden can also vary depending on the mode of data collection, survey topic, demographic group, and the frequency with which individuals or businesses are sampled. Ultimately, respondent burden is concerning because of its potential impact on measurement error, attrition in panel surveys, survey nonresponse, nonresponse bias, and data quality. Both objective and subjective measures of burden may thus affect survey outcomes, but few studies have explored both types of burden in a single study to better understand the unique contribution each may make. This panel aims to explore new and innovative methods of measuring and mitigating both objective and subjective perceptions of respondent burden, while also assessing the impact of respondent burden on survey response and nonresponse bias. We invite submissions that explore all aspects of respondent burden, including:
(1) The relationship between objective and subjective measures of respondent burden
(2) Qualitative research on respondents’ subjective perception of survey burden
(3) The relationship between respondent burden, response propensity, nonresponse bias, response rates, item non-response, and other data quality outcomes
(4) Sampling techniques, survey design, use of survey paradata, and other methodologies to help measure and reduce respondent burden
(5) Differences in respondent burden across different survey modes
(6) Measurement of multiple components of respondent burden, including effort, sensitivity, how easy or difficult the questions are to answer, interest, or motivation
(7) The use of alternative data sources to reduce burden
(8) Burden involved in data collection efforts, including survey organization contact attempts, reporting burden for establishment surveys, or proxy reporting in household surveys
(9) Measurement of respondent burden in populations that are more frequently surveyed than others

Keywords: Respondent burden, subjective burden, data quality, burden reduction

Data Quality of Proxy Reports: Inconsistent Educational Information in the German Microcensus Panel

Mr Simon Börlin (GESIS - Leibniz Institute for the Social Sciences) - Presenting Author


Proxy interviews are frequently used in survey data collection. On the one hand, using proxy interviews in household surveys reduces data collection costs and raises response rates; it also reduces respondent burden, since fewer people per household are interviewed and the interview duration per household is shorter. On the other hand, the data quality of proxy reports is assumed to be lower than that of self-reports. Nevertheless, the data quality of proxy reports has not yet been analyzed sufficiently. This paper examines the data quality of proxy reports on the basis of inconsistent educational information (ISCED, educational degrees, year of attainment, etc.). Because persons who are represented by a proxy interview differ systematically from those who participate in the survey themselves, this selection effect must be distinguished from the measurement effect. This is ensured by analyzing time-constant information with the test-retest method, using data from the German Microcensus Panel of 2012 and 2013. Preliminary multivariate results indicate higher item nonresponse for educational information in proxy reports. Furthermore, first analyses of inconsistencies in ISCED levels and educational degrees show that proxy reports yield inconsistent responses more often than self-reports. In particular, changes in respondent type between the two years (self to proxy or vice versa) increase the number of inconsistent educational degrees. Compared to other relevant characteristics, however, proxy reporting seems to play a minor role in inconsistent educational information. Further analyses show that the relationship between the respondent and the target person (e.g. a spousal or child-parent relationship) is relevant for the extent of inconsistent educational information: spouses provide less inconsistent information than children or other persons in the household, and wives provide more consistent educational information than husbands.
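As a rough illustration of the test-retest logic described above, the following Python sketch computes the rate of inconsistent answers on a time-constant education variable, broken down by the respondent-type pattern across the two waves. All file and column names ('microcensus_panel.csv', 'isced_2012', etc.) are hypothetical stand-ins, not the actual Microcensus variables, and genuine educational upgrades between waves are ignored for simplicity.

import pandas as pd

# Hypothetical panel extract: one row per person, with education codes
# and respondent type ('self' or 'proxy') observed in both waves.
df = pd.read_csv("microcensus_panel.csv")  # hypothetical file

# Education is treated as time-constant here, so any change between
# the 2012 and 2013 reports counts as an inconsistency.
df["inconsistent"] = df["isced_2012"] != df["isced_2013"]

# Classify the respondent-type pattern across the two waves:
# self-self, self-proxy, proxy-self, or proxy-proxy.
df["pattern"] = df["resp_type_2012"] + "-" + df["resp_type_2013"]

# Inconsistency rate by pattern; switches between self and proxy
# reporting are where the abstract finds the most inconsistency.
print(df.groupby("pattern")["inconsistent"].mean())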


Using a Tip Sheet to Improve Nonresponse in an IRS Business Survey

Dr Scott Leary (IRS) - Presenting Author
Dr Kerry Levin (Westat)
Mrs Sarah Bennett-Harper (Westat)
Dr Karen Stein (Westat)
Ms Martha Stapleton (Westat)
Ms Brenda Schafer (IRS)
Mr Patrick Langetieg (IRS)
Ms Lisa Rupert (IRS)

Many factors impact the survey response process, especially when collecting data from businesses. Examples of the burden placed on respondents include the time needed for data retrieval, talking to colleagues, and obtaining permission to release data (Snijkers et al., 2013), as well as navigating complicated information systems or records rather than relying on memory (Earp & McCarthy, 2012). Because these difficulties can lead to low response, we devised a tip sheet to reduce burden by helping respondents answer survey items more efficiently.

The IRS’s Business Taxpayer Burden (BTB) Survey measures and models the compliance burden imposed by the Federal tax system, including the time and money spent filing federal income tax returns. BTB is a mixed-mode survey, sent to about 900 large businesses in the United States and its territories. There are five contact attempts, with contacts 1, 3 and 5 including a paper survey in the packet.

In our experiment, the entire sample received our standard survey packet, and half of the sample also received the tip sheet. The tip sheet encouraged recipients to reach out to other departments in their business as resources for survey responses, and to share the paper survey with co-workers or provide them with the username and password for the web survey. It also reminded recipients that best estimates for time and money questions were acceptable. We examine the impact of the tip sheet on survey response rates throughout data collection, as well as on data quality indicators such as missing data and changes to responses identified in a data verification effort.
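For illustration, a minimal Python sketch of the arm comparison, using a two-proportion z-test on response counts. The counts below are invented placeholders, not the study's results, and the test is a generic stand-in for whatever analysis the authors actually used.

from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts: completed surveys and mailed cases per arm
# (the ~900 sampled businesses split evenly between the two arms).
responded = [203, 171]   # [tip-sheet arm, control arm] - illustrative only
mailed = [450, 450]      # cases fielded per arm

# Two-proportion z-test of the response rates between arms.
stat, pvalue = proportions_ztest(responded, mailed)
print(f"z = {stat:.2f}, p = {pvalue:.3f}")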


Respondent Burden in a Mobile App: Evidence from a Shopping Receipt Scanning Study

Mr Brendan Read (University of Essex) - Presenting Author

This study considers the burden placed on participants, subjectively and objectively, when asked to use a mobile app to scan shopping receipts. The existing literature on respondent burden is reviewed to present a framework of seven factors that affect burden, and this research demonstrates how these may be used to identify potential predictors of burden. Such an approach, together with the findings of this paper, may have implications for a number of emerging research contexts involving in-the-moment and repeated data collection.

The data used come from the Understanding Society Spending Study, a shopping receipt scanning study that asked respondents to use their mobile devices to take photos of their receipts or otherwise to enter information about their purchases manually. The study was embedded within the ongoing probability-based Understanding Society Innovation Panel, a household panel study representative of Great Britain; data from the ninth wave of the Innovation Panel were also used in these analyses.

Evidence was found to suggest that subjective perceptions of burden may not be strongly correlated with the actual objective burden faced. There were no systematic trends in subjective burden over the course of the study, though the objective burden per task did decrease as respondents completed more of the repeated tasks. In terms of predictors of burden, hypothetical willingness to complete the task was predictive of lower subjective burden. Older and female respondents were also slower to complete individual tasks in the study.
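As a sketch of how such predictor analyses might look in Python with statsmodels, assuming a task-level file with hypothetical column names ('subj_burden', 'willing', 'task_seconds', 'age', 'female'); this is not the author's actual specification.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical task-level extract: one row per completed scanning task.
tasks = pd.read_csv("spending_study_tasks.csv")  # hypothetical file

# Does stated willingness to do the task predict lower subjective burden?
print(smf.ols("subj_burden ~ willing + age + female", data=tasks).fit().summary())

# Are older and female respondents slower to complete individual tasks?
print(smf.ols("task_seconds ~ age + female", data=tasks).fit().summary())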


Exploring the Relationship between Burden Factors and Survey Response

Dr Morgan Earp (US Bureau of Labor Statistics) - Presenting Author
Dr Brandon Kopp (US Bureau of Labor Statistics)
Dr John Dixon (US Bureau of Labor Statistics)

Given decreasing response rates, the survey research field is looking to reduce nonresponse by reducing respondent burden. In panel surveys specifically, respondent burden in a previous wave could lead to future nonresponse. To understand burden, we look to Bradburn (1978) who describes four main factors of respondent burden: length, effort, stress, and frequency. Bradburn also suggests that interest helps to reduce burden perceptions, but that the impact may waver with repeated sampling.

This paper explores the potential for using proxy measures of Bradburn’s four burden factors and participant interest, and examines their relationship with response across several surveys varying in topic, length, and sensitivity. The Current Population Survey (CPS) administers a variety of supplemental surveys to a select subpopulation of CPS respondents almost every month. Using CPS respondent data and paradata, we identified the following proxy measures of burden: 1) length (interview type and survey duration); 2) effort (number of adults, self/proxy reporting); 3) stress (income, ethnicity, marital status, and education refusal indicators); 4) frequency (the number of supplemental surveys the participant was previously sampled for); and 5) interest (employment status, disability status, home ownership, presence of children, and education level). While we know survey frequency is related to person-level characteristics, we also expect that those same characteristics are related to survey interest, and thus propose an interaction effect between interest and frequency, as Bradburn suggested.
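A minimal Python sketch of how such proxy indicators could be assembled from a CPS extract; every file and column name here is a hypothetical placeholder for the actual CPS variables and paradata fields.

import pandas as pd

# Hypothetical respondent/paradata extract: one row per CPS respondent.
cps = pd.read_csv("cps_paradata.csv")  # hypothetical file

indicators = pd.DataFrame({
    # 1) Length: survey duration (interview type could be added alongside).
    "length": cps["interview_minutes"],
    # 2) Effort: number of adults in the household and a 0/1 proxy flag.
    "n_adults": cps["n_adults"],
    "proxy_report": cps["proxy_report"],
    # 3) Stress: count of refusals on the sensitive items.
    "stress": cps[["income_refused", "ethnicity_refused",
                   "marital_refused", "education_refused"]].sum(axis=1),
    # 4) Frequency: number of supplements previously sampled for.
    "frequency": cps["n_prior_supplements"],
})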

Our paper examines the relationship between five burden factors and survey response using a structural equation model and tests for model invariance across the various supplements to determine if the relationship between burden factors and response varies (in strength or direction) by survey topic. This paper will not only provide insight into the relationship between burden and response, but will also carefully examine the relationship between person-level characteristics, interest, survey frequency, and survey topic.
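For illustration, a hedged sketch of such a model in Python using the semopy package (lavaan-style syntax). Variable names are hypothetical, the 0/1 response outcome is treated as continuous for simplicity, and the proposed interest-by-frequency interaction is omitted here.

import pandas as pd
import semopy

# Lavaan-style description: a latent burden factor measured by the four
# factor proxies, a latent interest factor measured by its proxies, and
# structural paths from both latents to the response outcome.
desc = """
burden =~ length + effort + stress + frequency
interest =~ employed + disabled + homeowner + has_children + educ_level
responded ~ burden + interest
"""

# Hypothetical file: one row per respondent, with a supplement identifier.
data = pd.read_csv("cps_supplement.csv")

# Crude stand-in for an invariance check: fit the same model within each
# supplement and compare the estimated paths across survey topics.
for topic, group in data.groupby("supplement"):
    model = semopy.Model(desc)
    model.fit(group)
    print(topic)
    print(model.inspect())

A formal invariance test would instead constrain loadings and paths to equality across supplements and compare model fit, rather than simply eyeballing separate fits.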