
ESRA 2019 Full Program




Can't You Hear Me Knockin': Methodological and Practical Considerations in Designing, Implementing and Assessing Contact Strategies 3

Session Organisers: Dr Patrick Moynihan (Pew Research Center)
Ms Martha McRoy (Pew Research Center)
Dr Laura Silver (Pew Research Center)
Mr Jamie Burnett (Kantar Public)
Time: Tuesday 16th July, 16:00 - 17:00
Room: D24

While low response rates are not necessarily indicative of substandard sample data, they can be associated with differential nonresponse and, possibly, biased estimates. Minimizing nonresponse, however, can be challenging as optimal fieldwork strategies need to balance the value of a more representative sample at the close of fieldwork with the expenditure of limited resources (time, budget, staff/vendor capacity) prior to and during fieldwork.
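That opening claim can be made precise with the standard deterministic decomposition of nonresponse bias (the notation below is ours, not the session's): for a population of N units split into R respondents and M nonrespondents, the bias of the unadjusted respondent mean is

```latex
% Standard textbook nonresponse-bias decomposition (notation illustrative):
% \bar{Y}_r and \bar{Y}_m are the means among the R respondents and
% M nonrespondents, with N = R + M.
\mathrm{Bias}(\bar{y}_r)
  = \bar{Y}_r - \bar{Y}
  = \frac{M}{N}\,\left(\bar{Y}_r - \bar{Y}_m\right)
```

Bias is the product of the nonresponse rate M/N and the respondent-nonrespondent difference, which is why a low response rate alone need not produce bias (the difference term may be near zero) yet differential nonresponse can.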

What’s often underappreciated is that many researchers are far removed from data-collection sites and data-processing responsibilities, raising the question of whether an informed assessment of fieldwork strategies is possible. A premium may be placed on the accuracy and completeness of non-substantive data (paradata and auxiliary information not collected from the survey instrument itself) to allow for a transparent understanding of design implementation. But this solution is far from perfect, offering only a partial view of what transpired in the field.

The goal of this session is to bring together researchers bridging the methodological concerns of unit nonresponse with the practical difficulties of monitoring, assessing and refining fieldwork protocols. Given the robust literature on nonresponse bias, the problem of unit (rather than item) nonresponse will be the subject of this conference session with a focus on contact (rather than cooperation) strategies.

We welcome submissions linking the methodological and practical sides undergirding contact strategies; topics may include but are not limited to:

- The implications of additional contact attempts for sample representativeness, data quality and fieldwork efficiency (a minimal level-of-effort tabulation along these lines is sketched after this list);

- Research designed to identify minimum standards for recording contact attempts, paradata or auxiliary data to improve the efficiency of data management and data-quality assessments;

- Strategies used to verify correct implementation of contact protocols;

- Approaches to improve interviewer compliance with contact protocols, such as cutting-edge training techniques, the use of technology or incentivized compensation structures;

- Experimentation to improve contact, such as activity indicators within frames or using distinct callbacks;

- Using paradata – ranging from GPS identifiers to observable characteristics of housing units/neighborhoods – to improve contact and support the verification of contact protocols;

- Methods to improve contact efficiency with hard-to-reach/low-incidence populations.
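As a minimal sketch of the level-of-effort analysis suggested by the first topic above, the snippet below tabulates cumulative contact and completion rates by attempt number from a call-record (paradata) file; the file name, column names and outcome codes are hypothetical, not a proposed standard.

```python
import pandas as pd

# Hypothetical call-record (paradata) file: one row per contact attempt.
# Column names and outcome codes are illustrative, not a proposed standard.
#   case_id    - sampled-unit identifier
#   attempt_no - 1, 2, 3, ... within each case
#   outcome    - "noncontact", "contact", "refusal", "complete"
calls = pd.read_csv("call_records.csv")

# First attempt on which each case was ever contacted / completed.
reached = calls[calls["outcome"].isin(["contact", "refusal", "complete"])]
first_contact = reached.groupby("case_id")["attempt_no"].min()
first_complete = (calls[calls["outcome"] == "complete"]
                  .groupby("case_id")["attempt_no"].min())

n_cases = calls["case_id"].nunique()

# Cumulative contact and completion rates by level of effort:
# what does each additional attempt buy?
for k in range(1, int(calls["attempt_no"].max()) + 1):
    contact_rate = (first_contact <= k).sum() / n_cases
    complete_rate = (first_complete <= k).sum() / n_cases
    print(f"<= {k} attempts: contacted {contact_rate:.1%}, "
          f"completed {complete_rate:.1%}")
```

A flattening completion curve across attempts is the usual signal that further attempts mainly add cost; whether the marginal completes differ demographically is exactly the kind of question the session invites.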

While our session's title is suggestive of in-person interviewing, we welcome papers from the perspective of telephone interviewing too. We’d emphasize our preference for research on how contact strategies can be implemented successfully and transparently. We invite academic and non-academic researchers as well as survey practitioners to contribute.

Keywords: unit nonresponse, nonresponse bias, contact data, data quality, paradata

Improvements to Contacting Materials and Strategies for the General Social Survey (GSS) 2018

Ms Jodie Smylie (NORC at the University of Chicago) - Presenting Author
Dr René Bautista (NORC at the University of Chicago)

The GSS, with its generally consistent characteristics and field protocols from one round to another, provides a suitable environment in which to initiate exploratory analysis of the efficacy of new contacting strategies. In recent rounds of the GSS, interviewers have experienced increased challenges both in getting in touch with respondents and in gaining cooperation. To mitigate these challenges, a number of improvements were made to contacting strategies for the 2018 round of the GSS. These improvements included the development of more appealing, concise and directive advance materials, informed by best practices from other studies; the implementation of a prepaid incentive and an initial postpaid incentive; new in-person materials to aid in contacting residents of gated communities and locked buildings; standardized in-kind gifts, both presented by interviewers at the door and delivered by an external vendor; and the use of a “last chance” postcard instead of the “last chance” letter used in previous rounds. These improvements to contacting materials and strategies, along with lessons learned, will be discussed, and qualitative information on the efficacy of these materials will be shared. Initial analyses show heightened production in GSS 2018 in comparison to the previous round. Further analyses will be conducted to get a sense of how these new materials and strategies may have decreased the number of contact attempts needed to reach selected households and boosted the number of completed interviews. Results will provide insight into new and improved materials and strategies for contacting sampled households and into how their implementation helped the GSS field cases in an increasingly difficult social environment for case outreach.
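One way the between-round comparison described above could be operationalized is sketched below; this is our illustration under an assumed per-case file layout (round, attempts_to_contact, completed), not NORC's actual analysis.

```python
import pandas as pd
from scipy import stats

# Hypothetical per-case summary file for two GSS rounds; the layout
# (round, attempts_to_contact, completed) is assumed for illustration.
cases = pd.read_csv("gss_case_summaries.csv")
prev = cases[cases["round"] == 2016]
curr = cases[cases["round"] == 2018]

# Did the revised materials reduce the attempts needed to reach a household?
t, p = stats.ttest_ind(prev["attempts_to_contact"].dropna(),
                       curr["attempts_to_contact"].dropna(),
                       equal_var=False)  # Welch's t-test
print(f"mean attempts to contact: {prev['attempts_to_contact'].mean():.2f} (2016) "
      f"vs. {curr['attempts_to_contact'].mean():.2f} (2018), p = {p:.3f}")

# Did the completion rate rise between rounds?
print(f"completion rate: {prev['completed'].mean():.1%} (2016) "
      f"vs. {curr['completed'].mean():.1%} (2018)")
```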


Employing Behavioral Insights to Entice Response to the American Community Survey

Dr Victoria Velkoff (U.S. Census Bureau) - Presenting Author
Dr Jennifer Ortman (U.S. Census Bureau)


The U.S. Census Bureau asks 3.5 million households to respond to the American Community Survey (ACS) each year. When the ACS is mailed out, the package includes information about the importance of the survey. These materials are critical in encouraging response: the design of the mail materials and the messages they contain not only convey key information about the recipients’ participation, but also set the tone for their interaction with the Census Bureau. Enticing response to the ACS is the central focus of much of our research. The Census Bureau is turning to insights from the behavioral sciences to improve our communications with respondents, with the goal of bolstering response rates and easing respondent burden. This presentation highlights work to integrate lessons learned from behavioral science research into how we communicate with respondents and motivate their response to the ACS. The research presented includes efforts to boost participation through various modifications to existing ACS mail materials, including a data slide in mail invitations to communicate the legitimacy and purpose of the ACS to respondents, as well as work to design new mail materials based on best practices identified in behavioral insights and other research. The presentation shows the utility of drawing on multidisciplinary research to persuade households to respond and to make responding less burdensome.
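Modifications like the data slide are typically evaluated as randomized mail-package arms; as an illustrative sketch only (the counts and the two-arm framing are assumptions, not the Census Bureau's analysis), a two-proportion z-test comparing self-response rates might look like this:

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts for a two-arm mail experiment: a control package
# vs. a behaviorally informed revision. All numbers are made up.
responses = [21_450, 22_310]   # completed self-responses per arm
mailed = [50_000, 50_000]      # households mailed per arm

stat, pvalue = proportions_ztest(count=responses, nobs=mailed)
control, revised = (r / n for r, n in zip(responses, mailed))
print(f"self-response: control {control:.1%} vs. revised {revised:.1%}, "
      f"p = {pvalue:.4f}")
```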


What Does Extra Effort Yield in the Current Telephone Survey Climate?

Ms Sarah Dipko (Westat) - Presenting Author

In the current telephone survey climate, making contact with and interviewing sampled cases presents numerous challenges. The primary challenge is to get sample members to answer the telephone, as high non-contact rates plague both random-digit-dial and list sample designs. Once that barrier is crossed, the next challenge is to gain cooperation from sample members. In addition, when calling a list sample one faces the prospect of tracing, which may or may not be warranted depending on the mobility of sample members.
For telephone centers that use calling algorithms, one approach to constraining effort is to set a maximum number of calls before finalizing cases as nonresponse. Some systems permit these maximums to be exceeded in order to increase the contact rate, depending on sample performance and available study resources. In addition, it is common to use refusal conversion to attempt to gain cooperation from those who initially refuse to participate (AAPOR, 2014). This paper addresses the effects of using these techniques, as well as tracing, in a recent multi-mode survey of participants in the U.S. Department of Agriculture's Supplemental Nutrition Assistance Program (SNAP). The research questions addressed include the following.
• What does extra effort yield in terms of shifting the demographics of survey participants – does it yield more of the same types of respondents interviewed without extra effort, or shift the distribution?
• Does employing extra effort increase the level of hostility expressed by those who refuse the study?
• What percentage of overall study completes is yielded by employing extra effort?
• Finally, are the same effects observed for both English-speaking and Spanish-speaking sample cases, or are the dynamics different?
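A common way to approach the first question above is to compare the demographic mix of completes obtained within standard effort to those requiring extra effort (exceeded call caps, refusal conversion or tracing). The sketch below, with hypothetical file and variable names, runs that comparison as a chi-square test and also tabulates the extra-effort share of completes (the third question):

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical completed-interview file; "extra_effort" flags cases
# finished only after exceeding the standard call cap, refusal
# conversion, or tracing. Column names are illustrative.
completes = pd.read_csv("snap_completes.csv")

# Does the age mix differ between standard- and extra-effort completes?
table = pd.crosstab(completes["age_group"], completes["extra_effort"])
chi2, p, dof, _ = chi2_contingency(table)
print(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")

# Share of all completes attributable to extra effort.
print(f"extra-effort share of completes: {completes['extra_effort'].mean():.1%}")
```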