
ESRA 2019 full program




Can't You Hear Me Knockin': Methodological and Practical Considerations in Designing, Implementing and Assessing Contact Strategies 2

Session Organisers Dr Patrick Moynihan (Pew Research Center)
Ms Martha McRoy (Pew Research Center)
Dr Laura Silver (Pew Research Center)
Mr Jamie Burnett (Kantar Public)
Time Tuesday 16th July, 14:00 - 15:30
Room D30

While low response rates are not necessarily indicative of substandard sample data, they can be associated with differential nonresponse and, possibly, biased estimates. Minimizing nonresponse, however, can be challenging as optimal fieldwork strategies need to balance the value of a more representative sample at the close of fieldwork with the expenditure of limited resources (time, budget, staff/vendor capacity) prior to and during fieldwork.

What’s often underappreciated is that many researchers are far removed from data-collection sites and data-processing responsibilities, raising the question of whether an informed assessment of fieldwork strategies is possible. A premium may be placed on the accuracy and completeness of non-substantive data (paradata and auxiliary information not collected from the survey instrument itself) to allow for a transparent understanding of design implementation. But this solution is far from perfect, offering only a partial view of what transpired in the field.

The goal of this session is to bring together researchers bridging the methodological concerns of unit nonresponse with the practical difficulties of monitoring, assessing and refining fieldwork protocols. Given the robust literature on nonresponse bias, the problem of unit (rather than item) nonresponse will be the subject of this conference session with a focus on contact (rather than cooperation) strategies.

We welcome submissions linking the methodological and practical sides undergirding contact strategies; topics may include but are not limited to:

- The implications of additional contact attempts on sample representativeness, data quality and fieldwork efficiency;

- Research designed to identify minimum standards for recording contact attempts, paradata or auxiliary data to improve the efficiency of data management and data-quality assessments;

- Strategies used to verify correct implementation of contact protocols;

- Approaches to improve interviewer compliance with contact protocols, such as cutting-edge training techniques, the use of technology or incentivized compensation structures;

- Experimentation to improve contact, such as activity indicators within frames or using distinct callbacks;

- Using paradata – ranging from GPS identifiers to observable characteristics of housing units/neighborhoods – to improve contact and support the verification of contact protocols;

- Methods to improve contact efficiency with hard-to-reach/low-incidence populations.

While our session's title suggests in-person interviewing, we welcome papers from the perspective of telephone interviewing as well. We would emphasize our preference for research on how contact strategies can be implemented successfully and transparently. We invite academic and non-academic researchers, as well as survey practitioners, to contribute.

Keywords: unit nonresponse, nonresponse bias, contact data, data quality, paradata

Cutting Corners: Detecting Gaps between Household Contact Protocol vs. Ingrained Habits in the Field

Dr LinChiat Chang (Independent Consultant) - Presenting Author
Miss Mpumi Mbethe (Independent Consultant)
Miss Shameen Yacoob (MPLOY Market Research)


Design implementation in the field requires collaboration and transparency between researchers and field teams. We focus on household contact strategies in a survey of 1,600 young women in South Africa. A stratified random cluster sample with probability proportional to size was drawn from target geographies, and four random start points were identified per cluster.

Our back checks in the field, as well as analysis of fieldwork data, revealed that interviewer compliance with the contact protocol was inconsistent, in large part due to ingrained habits and behavior norms. Paradata was used to support the verification of contact protocols: GPS coordinates allowed us to revisit the start points and replicate the course of the interviewer every step of the way. Gaps were observed in the distance between the actual and presumed start points; in the number of household contacts that should have been recorded before an eligible respondent was found versus the actual counts; in whether movement followed the left-hand rule; and more. Because not all field team members have the same work ethic, a fair and consistent reward system was needed to counteract an industry norm of cutting corners. To this end, we computed performance scores that could be used to reward honest work and delegate more work to better field workers.
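To make the start-point check concrete, here is a minimal sketch (not the authors' code) of how recorded GPS coordinates can be compared against the sampled start points; the record field names and the 100 m tolerance are illustrative assumptions:

    from math import radians, sin, cos, asin, sqrt

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two GPS points."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 \
            + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6_371_000 * asin(sqrt(a))

    MAX_GAP_M = 100  # illustrative tolerance for start-point drift

    def flag_start_point_gaps(contact_records):
        """Yield (interviewer, cluster, gap) where the recorded start point
        drifts beyond the tolerance from the sampled start point."""
        for rec in contact_records:
            gap = haversine_m(rec["presumed_lat"], rec["presumed_lon"],
                              rec["actual_lat"], rec["actual_lon"])
            if gap > MAX_GAP_M:
                yield rec["interviewer_id"], rec["cluster_id"], round(gap)

Flagged records of this kind could then feed into performance scores like those described above.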

One output from this survey was an estimate of the relative appeal of specific products in this target population. Simulations will be presented to show the extent to which volumetric population projections could be biased if household contacts are not fully documented and taken into account. We put forth a strong recommendation that researchers spend substantial time in the field during data collection to understand the errors and gaps that arise in household contact strategies, so as to inform adjustments to data modeling.
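As a rough illustration of the bias mechanism such simulations address (this sketch is our assumption of the setup, not the authors' simulation; all parameter values are invented), under-recording ineligible household contacts inflates the observed incidence and, with it, any volumetric projection built on it:

    import random

    def observed_incidence(true_incidence, record_rate, n_contacts=100_000, seed=42):
        """Observed eligibility rate when only `record_rate` of ineligible
        household contacts make it into the contact sheets."""
        rng = random.Random(seed)
        eligible = recorded = 0
        for _ in range(n_contacts):
            if rng.random() < true_incidence:   # household has an eligible respondent
                eligible += 1
                recorded += 1                   # eligibles are always logged
            elif rng.random() < record_rate:    # ineligibles logged only sometimes
                recorded += 1
        return eligible / recorded

    POPULATION = 5_000_000   # illustrative size of the target geography
    APPEAL = 0.30            # illustrative share of eligibles who like the product
    for rate in (1.0, 0.7, 0.4):
        inc = observed_incidence(true_incidence=0.15, record_rate=rate)
        print(f"record_rate={rate:.1f}  incidence={inc:.3f}  "
              f"projection={POPULATION * inc * APPEAL:,.0f}")

In this toy setup, logging only 40% of ineligible contacts roughly doubles the observed incidence, overstating the projection accordingly.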


SMART Text Messages to Improve Compliance with Adaptive Design Data Collection Protocols

Ms Amanda Nagle (United States Census Bureau) - Presenting Author
Dr Kevin Tolliver (United States Census Bureau)

Face-to-face surveys sometimes implement adaptive designs through case prioritization to improve data quality, control costs, or reach other survey-specific goals. In these designs, interviewers are asked to apply additional effort to “high priority” cases, decrease the effort expended on “low priority” cases, and treat “medium priority” cases with a typical amount of effort. The success of these interventions depends on interviewers treating their cases according to the stated priority. If interviewers do not follow the prioritization instructions, the cases never receive differential treatment, and the survey cannot see the effects of the adaptive design.
In the Survey of Income and Program Participation 2019 data collection, text messages will be sent to interviewers to test whether text-message communications can nudge interviewers to comply with prioritization instructions. We will test two message styles and three message frequencies using a Sequential Multiple Assignment Randomized Trial (SMART) design. Unlike fully factorial experimental designs, SMART designs randomize an individual to a treatment multiple times during the experiment. This sequential design allows for analysis of both the marginal effects of individual treatments and the effects of their ordering and schedule. It also allows us to use paradata to change a treatment that is not working for some interviewers, improving their compliance mid-collection. We will be able to understand the type and intensity of message that is effective for different types of interviewers. This research presents preliminary results from the March 2019 through June 2019 data collection.
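The sequential re-randomization at the heart of a SMART design can be summarized in a few lines. The sketch below is our illustration, not the Census Bureau's implementation; the message styles, frequencies, and compliance threshold are hypothetical placeholders:

    import random

    STYLES = ("supportive", "directive")               # hypothetical message styles
    FREQUENCIES = ("weekly", "twice_weekly", "daily")  # hypothetical frequencies

    def initial_assignment(interviewers, rng):
        """Stage 1: each interviewer draws a style/frequency pair at random."""
        return {i: (rng.choice(STYLES), rng.choice(FREQUENCIES)) for i in interviewers}

    def rerandomize(assignments, compliance, threshold, rng):
        """Stage 2: interviewers below the compliance threshold draw a new
        treatment; compliant interviewers keep their current one."""
        out = {}
        for i, current in assignments.items():
            if compliance[i] < threshold:
                alternatives = [(s, f) for s in STYLES for f in FREQUENCIES
                                if (s, f) != current]
                out[i] = rng.choice(alternatives)
            else:
                out[i] = current
        return out

    rng = random.Random(2019)
    stage1 = initial_assignment(["fi01", "fi02", "fi03"], rng)
    compliance = {"fi01": 0.92, "fi02": 0.55, "fi03": 0.80}  # share of priority-consistent work
    stage2 = rerandomize(stage1, compliance, threshold=0.75, rng=rng)
    print(stage1, stage2, sep="\n")

Because each interviewer's treatment history is retained, both the marginal effect of each message type and the effect of treatment sequences can be analyzed afterwards.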


Improving Compliance with Face to Face Contact Protocols

Mr Jamie Burnett (Kantar Public) - Presenting Author


Maximizing contact is an integral part of high-quality face-to-face surveys. However, in many countries and companies less familiar with these requirements, changing interviewer behaviours is a difficult task. In this paper we review the effectiveness of strategies employed to improve compliance with face-to-face contact protocols in a multi-national survey, encompassing both those that attempt to mitigate non-compliance prior to fieldwork and those that help us identify and address it during fieldwork. We show that these strategies, in combination, do improve adherence. Given that some of these strategies affect costs and delivery timescales, we go on to review what benefit, if any, they have on quality.


Optimizing Field Interviewer Level of Effort to Identify and Survey Adult Smokers with an Address-Based Sample

Dr Kristine Wiant (RTI International) - Presenting Author
Mr Joseph McMichael (RTI International)

Effective allocation of interviewer resources is a key challenge in designing a probability-based study of low-incidence populations. In this presentation, we provide a case study of interviewer resource allocation in identifying a population that comprises less than 11% of U.S. adults: smokers aged 25-54. Data come from the first wave of a longitudinal study that evaluates the effectiveness of a public education media campaign designed to motivate current cigarette smokers to quit smoking. Wave-1 survey data were collected through in-person interviews in 15 U.S. counties in which the campaign is active and in 15 U.S. control counties in which the campaign is not present. We selected an address-based sample for Wave 1, and the initial contact with respondents was a mail screener intended to aid in identifying households with at least one potentially eligible resident. Interviewers followed up in person with households that returned an eligible mail screener, and with a random sample of households that did not return one, to confirm or determine the household’s eligibility, select an eligible respondent, and complete a self-administered computerized interview. Our presentation focuses on considerations for interviewer level of effort for households with different predicted eligibility rates, based on available sampling-frame variables and answers to the mail screener (when available). We will demonstrate the methods we used to communicate case-level contact limits, and progress towards those limits, to the interviewers, as well as the tools available to supervisors and managers for monitoring compliance with these limits. We will analyze interviewer compliance with these contact limits and present lessons learned about obstacles to compliance, based on focus groups conducted with interviewers. Finally, we evaluate the implications of the number of contact attempts for sample representativeness and fieldwork efficiency, and describe additional measures taken to optimize resource allocation.
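As one way to picture tiered, eligibility-keyed contact limits of this kind, the following sketch (ours, not RTI's system; the tier cut-points and limits are invented) budgets attempts by predicted eligibility and screener status and flags cases that exceed their limit:

    def contact_limit(predicted_eligibility, screener_returned):
        """More attempts are budgeted where an eligible smoker is more likely."""
        if screener_returned and predicted_eligibility >= 0.5:
            return 8
        if predicted_eligibility >= 0.2:
            return 6
        return 4

    def over_limit_cases(attempt_log):
        """attempt_log maps case_id -> (attempts_made, predicted_eligibility,
        screener_returned); returns cases that exceeded their limit."""
        return [case_id for case_id, (n, p, s) in attempt_log.items()
                if n > contact_limit(p, s)]

    # Example: case "A12" exceeds its limit of 4 attempts.
    log = {"A12": (5, 0.10, False), "B07": (6, 0.60, True)}
    print(over_limit_cases(log))   # -> ['A12']

A rule like this makes compliance auditable: supervisors can monitor the over-limit list during fieldwork rather than reconstructing effort after the fact.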