


Can't You Hear Me Knockin': Methodological and Practical Considerations in Designing, Implementing and Assessing Contact Strategies 1

Session Organisers: Dr Patrick Moynihan (Pew Research Center)
Ms Martha McRoy (Pew Research Center)
Dr Laura Silver (Pew Research Center)
Mr Jamie Burnett (Kantar Public)
Time: Tuesday 16th July, 11:00 - 12:30
Room: D23

While low response rates are not necessarily indicative of substandard sample data, they can be associated with differential nonresponse and, possibly, biased estimates. Minimizing nonresponse, however, can be challenging as optimal fieldwork strategies need to balance the value of a more representative sample at the close of fieldwork with the expenditure of limited resources (time, budget, staff/vendor capacity) prior to and during fieldwork.

What’s often underappreciated is that many researchers are far removed from data-collection sites and data-processing responsibilities, raising the question of whether an informed assessment of fieldwork strategies is possible. A premium may be placed on the accuracy and completeness of non-substantive data (paradata and auxiliary information not collected from the survey instrument itself) to allow for a transparent understanding of design implementation. But this solution is far from perfect, offering only a partial view of what transpired in the field.

The goal of this session is to bring together researchers bridging the methodological concerns of unit nonresponse with the practical difficulties of monitoring, assessing and refining fieldwork protocols. Given the robust literature on nonresponse bias, the problem of unit (rather than item) nonresponse will be the subject of this conference session with a focus on contact (rather than cooperation) strategies.

We welcome submissions linking the methodological and practical sides undergirding contact strategies; topics may include but are not limited to:

- The implications of additional contact attempts on sample representativeness, data quality and fieldwork efficiency;

- Research designed to identify minimum standards for recording contact attempts, paradata or auxiliary data to improve the efficiency of data management and data-quality assessments;

- Strategies used to verify correct implementation of contact protocols;

- Approaches to improve interviewer compliance with contact protocols, such as cutting-edge training techniques, the use of technology or incentivized compensation structures;

- Experimentation to improve contact, such as activity indicators within frames or using distinct callbacks;

- Using paradata – ranging from GPS identifiers to observable characteristics of housing units/neighborhoods – to improve contact and support the verification of contact protocols;

- Methods to improve contact efficiency with hard-to-reach/low-incidence populations.

While our session's title is suggestive of in-person interviewing, we welcome papers from the perspective of telephone interviewing too. We’d emphasize our preference for research on how contact strategies can be implemented successfully and transparently. We invite academic and non-academic researchers as well as survey practitioners to contribute.

Keywords: unit nonresponse, nonresponse bias, contact data, data quality, paradata

Participation Rates in Psychological Studies over Time - A Meta-Analysis

Mrs Tanja Burgard (ZPID - Leibniz Institute for Psychology Information) - Presenting Author
Dr Michael Bosnjak (ZPID - Leibniz Institute for Psychology Information)
Dr Nadine Kasten (University of Trier)


Relevance & Research Question:
The main question of the meta-analysis is whether the initial participation rate in psychological studies has decreased over time. In addition, possible moderators of this time effect will be addressed: the design of the invitation letter, the contact protocol, the data collection mode, the burden of participating in the study, and the incentives given to participants.

Methods & Data:
To be eligible for the meta-analysis, studies must report (quasi-)experiments on initial response rates from empirical research in the field of psychology, in which the experimental manipulation is a variation of survey design characteristics. Student samples will be excluded because students are often obliged to participate as part of their coursework, and their motivation therefore differs from that of other populations.
The outcome of interest will be the initial response rate. As there may be several experimental comparisons per study report, the data are hierarchical. Using the metafor package in R, three-level mixed-effects models will be fitted to account for these dependencies and to enable testing moderator variables at the level of the report (e.g. type of report, publication year) and at the level of the experiment (e.g. year of data collection, incentives). The relevant independent variable for all tests is the time of sampling; the moderating effects of the survey design will be tested using the characteristics of study administration as moderator variables.
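By way of illustration, a minimal sketch of such a three-level model in metafor could look as follows. The data frame and all variable names (responders, invited, report_id, experiment_id, sampling_year, incentive) are hypothetical, and the logit transformation of response rates is one common choice for meta-analyzing proportions, not a detail taken from the abstract:

    # Minimal illustrative sketch, not the authors' actual analysis code.
    # 'dat' is a hypothetical data frame with one row per experimental
    # comparison and the hypothetical columns named below.
    library(metafor)

    # Logit-transform the initial response rates: responders (xi) out of
    # invitees (ni); escalc() appends effect sizes (yi) and variances (vi).
    dat <- escalc(measure = "PLO", xi = responders, ni = invited, data = dat)

    # Three-level mixed-effects model: experiments nested within reports,
    # with sampling year (and, e.g., incentives) as moderators.
    res <- rma.mv(yi, vi,
                  mods   = ~ sampling_year + incentive,
                  random = ~ 1 | report_id / experiment_id,
                  data   = dat)
    summary(res)

The nested random-effects term assigns one variance component to study reports and another to experiments within reports, mirroring the two levels of moderators described above.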

Results:
Results are not available yet.

Added Value:
The decline in response rates over recent decades can aggravate potential nonresponse bias. It is therefore of interest which factors moderate this trend, so that survey operations can be guided by empirical evidence to optimize survey response. Given the ongoing change in willingness to participate in scientific studies, continuously updating this evidence is important.


Paper, Email, or Both? Effects of Contact Mode on Participation in a Web Survey of Establishments

Dr Joseph Sakshaug (Institute for Employment Research)
Dr Basha Vicari (Institute for Employment Research) - Presenting Author
Professor Mick Couper (Institute for Social Research, University of Michigan)

Identifying strategies that maximize participation rates in population-based Web surveys is of critical interest to survey researchers. While much of this interest has focused on surveys of persons and households, there is growing interest in surveys of establishments. However, experimental evidence on strategies for optimizing participation rates in Web surveys of establishments is lacking. To address this research gap, we conducted a contact mode experiment in which establishments selected to participate in a Web survey were randomized to receive the survey invitation with login details and a subsequent reminder using a fully crossed sequence of paper and email contacts. We find that a paper invitation followed by a paper reminder achieves the highest response rate and smallest aggregate nonresponse bias across all possible paper/email contact sequences, but a close runner-up was the email invitation followed by a paper reminder, which achieved a similarly high response rate and low aggregate nonresponse bias at about half the per-respondent cost. Following up undeliverable email invitations with supplementary paper contacts yielded further reductions in nonresponse bias and costs. Finally, for establishments without an available email address, we show that enclosing an email address request form with a prenotification letter was not effective from a response rate, nonresponse bias, or cost perspective.


Participation Rates for Younger Age Groups Benefit Most from SMS Reminders – FinHealth 2017 Survey

Dr Hanna Tolonen (National Institute for Health and Welfare) - Presenting Author
Dr Katri Sääksjärvi (National Institute for Health and Welfare)
Dr Päivikki Koponen (National Institute for Health and Welfare)
Professor Seppo Koskinen (National Institute for Health and Welfare)
Dr Annamari Lundqvist (National Institute for Health and Welfare)
Dr Katja Borodulin (National Institute for Health and Welfare)


Background. Participation rates in health examination surveys (HES) have been declining, jeopardizing the representativeness of survey results. Information on the effectiveness of recruitment methods is needed to identify cost-effective ways to increase participation in future studies. We examined which population sub-groups benefit most from survey reminders, especially SMS reminders, in the FinHealth 2017 Survey.

Methods. The FinHealth 2017 Survey is a nationally representative HES covering the general population aged 18 years or over. The eligible sample size was 11,965. Persons selected into the sample received an invitation letter with a request to confirm or change a pre-assigned examination time, either through an electronic system or by contacting the survey office. Those who had not confirmed their appointment two weeks prior to the given time received a phone call, SMS, or postcard reminder.

Results. Mobile phone numbers were available for about 60% of invitees. A total of 58% of invitees participated in the health examination, 8% returned only the survey questionnaire, and 34% did not participate. In total, 29% of invitees (43% of participants) confirmed or changed their given appointment time without a reminder.

Reminders were the most effective tool for increasing participation rates among younger age groups. For example, among 18-24-year-olds the total participation rate was 44%, and of those participants only 21% had responded to the initial invitation; 63% of 18-24-year-old participants required a reminder before coming to the health examination. A similar pattern was observed among men and among persons speaking a language other than Finland's official languages (Finnish and Swedish).

Conclusions. Additional contacts to remind and motivate participation seem to be beneficial especially for younger age groups, men, and persons speaking languages other than the official languages of the country. In this study, as in our previous studies, younger age groups benefited most from re-contacting.


Schedule Your Own Telephone Interview Time Online? Experimental Evidence of the Effects of an Online Interview Scheduler on Fieldwork Outcomes

Dr Katherine McGonagle (Institute for Social Research, University of Michigan) - Presenting Author
Dr Narayan Sastry (Institute for Social Research, University of Michigan)

A tremendous amount of fieldwork effort in panel surveys using CATI is devoted to scheduling the interview. Multiple contact attempts are usually necessary, and respondents often break their appointments. We experimented with a new approach that allowed respondents to choose their own telephone interview time using an online scheduler. We describe the experiment and its effects on fieldwork outcomes. The panel survey was the 2017 wave of the Transition into Adulthood Supplement (TAS-2017). TAS is a nationally representative study of U.S. young adults aged 18-28 years embedded within the world's longest-running panel study, the Panel Study of Income Dynamics (PSID). The primary mode of data collection is CATI. During recent waves of TAS, the number of interviewer contact attempts needed to schedule telephone interviews and the number of broken appointments more than doubled. TAS-2017 offered a random subsample (n=1563) of all respondents (n=2084) the option of scheduling their own appointments online. The treatment group received information about the online scheduler in their invitation letter. Subsequent messages were sent to all respondents via email and text message to encourage participation in the study; messages sent to the treatment group also included the web address of the online interview appointment scheduler and a note encouraging its use. This paper describes the online interview appointment scheduler experiment and its results, focusing on fieldwork outcomes such as the number of days and interviewer attempts needed to complete the interview, the rate of broken appointments, and response rates. Differential effects of the scheduler on these outcomes by key respondent characteristics are also reported.


How Can We Get in Touch? Asking Respondents about their Preferred Survey Mode

Dr Maria Clelia Romano (Istat)
Dr Barbara Lorè (Istat) - Presenting Author
Dr Gabriella Fazzi (Istat)
Dr Daniela Pagliuca (Istat)

Response rates are steadily decreasing, even in official statistical surveys. In order to keep them high and to reach all target populations, Istat is strengthening mixed-mode designs in its social surveys, following both sequential and concurrent strategies.
To better inform these choices, we added questions at the end of questionnaires in ongoing surveys, asking respondents about their preferences and their propensity to use alternative modes. For example, in the Population Census, people who completed the questionnaire in CAWI mode were asked about their willingness to respond by telephone, while people who had not replied via the web were asked about the reasons for this choice. In other household surveys, similar questions are put to people who answered through different data collection modes.
Our analysis focuses on data from the Population Census and two other social surveys (the Multipurpose Survey on Households: Aspects of Daily Life, and Income and Living Conditions - EU-SILC). In this paper, we try to establish different profiles of respondents, classifying them on the basis of their social and demographic characteristics and their preferences for, or difficulties with, different modes of data collection.
We believe that taking the attitudes of different population groups into account can be very useful for shaping our strategies in implementing mixed-mode surveys, reducing response burden, and increasing response rates.