
ESRA 2019 full programme




Multiple Mode Data Collection: Innovative Approaches 1

Session Organisers Dr Scott Leary (Internal Revenue Service)
Dr Jocelyn Newsome (Westat)
Ms Brenda Schafer (Internal Revenue Service)
Time: Tuesday 16th July, 11:00 - 12:30
Room D16

Single-mode surveys -- whether telephone, web, or mail -- all have limitations that can result in coverage error. For example: telephone surveys may not reach households that have dropped landlines in favor of cell phones; web surveys are not an option for those without internet access (or those reluctant to click on survey links for fear of phishing scams); and mail surveys do not always make it to the intended recipient (and aren’t always opened or returned even when received).

In the face of these challenges, researchers are increasingly turning to mixed-mode surveys that afford the opportunity of combining multiple survey modes. Using multiple modes offers a variety of benefits to researchers. They can help improve response rates and reduce nonresponse error by improving coverage (Dillman, 2014). They can also speed up data collection and lower costs by allowing researchers to begin with a less expensive method of data collection and then move to a more expensive mode only for non-respondents (de Leeuw, 2005; Pierzchala, 2006; Tourangeau, Conrad, & Couper, 2013). Some researchers suggest that a sequential approach may be most effective, where a mail questionnaire is offered first and a web questionnaire is offered later in the data collection process (Dillman, 2014), while others advocate offering the web first, followed by other modes (Millar & Dillman, 2011), since the web is a very cost-efficient method.

However, as researchers embrace mixed-mode designs, it is not always clear how best to offer the various modes. Is it best to offer different modes concurrently or sequentially, and if sequentially, which mode works best if offered first? This session will explore innovative approaches used in multi-mode surveys. Researchers are invited to submit papers discussing experiments or pilot studies on any of the following topics:
• Multi-mode survey design experiments, including the use of smartphones and tablets;
• “Web push” experiments, including the impact on survey response and nonresponse;
• Discussion of differences in mode impact at different points in data collection; and
• Impact of mode sequencing on hard-to-reach populations.

Keywords: multimode, mixed mode, web push

The Efficacy of a Respondent Centric Approach - Evidence from a Large Experimental Mixed Mode Random Probability Survey in the UK.

Mr Andrew Phelps (Office for National Statistics)
Mr Joseph Herson (Office for National Statistics) - Presenting Author

The UK government has a Digital by Default strategy, which means that by 2020 digital self-service will be the default option for people who can use it, though not the only option. This, coupled with public expectations of being able to complete surveys online and the increasing use of smartphones for online tasks, means the Office for National Statistics (ONS) has invested in a transformation programme that focuses on research to deliver and integrate respondent-centred online data collection. We have taken a ‘blank page’ approach to the redesign of our respondent journey.

ONS practices respondent centrism in its approach to design; we place the respondent at the forefront of our design process by aligning the questions and flows with their mental models. By doing so we create a questionnaire that collects accurate data which meets analytical requirements whilst also delivering a positive respondent experience.

This talk will share findings from a large experimental mixed-mode random probability survey – the Labour Market Survey (LMS), which used a prototype labour market questionnaire and sampled 14,000 UK households. Households were given an initial 10-day period to complete online, after which nonresponse was followed up by a field interviewer. Fieldwork was between October 2018 and April 2019.

The talk will reflect on the following:
- Mixed mode response rates, with comparisons to the current (unimode) UK Labour Force Survey (LFS)
- Impact on online take-up from having concurrent modes during the field stage
- Variability of online take-up rates across sampled areas
- Did the respondent centric approach appear to help reduce mode effects?
- Sample composition, compared with the current LFS
- Key labour market estimates, compared with the current LFS

Another ONS presentation at ESRA 2019 covers practical examples and recommendations on the techniques, questions and approaches used to develop a respondent-centric LMS questionnaire that was designed to be ‘smartphone first’.


Respondent Centred Design: An Overview of our Innovative Approach and Evidence to Support its Use in Combating Barriers to Successful Multi-Mode Data Collection

Dr Natalia Stutter (Office for National Statistics) - Presenting Author
Ms Laura Wilson (Office for National Statistics)

A research programme exploring the introduction of online data collection is underway at the Office for National Statistics (ONS) in the UK. It is part of a wider transformation programme which includes the Social Survey portfolio. The UK government has committed to a ‘digital by default’ strategy across all government services by 2020. To successfully achieve this goal ONS is pursuing an innovative research approach, exploring the end-to-end journey of a potential survey respondent, including respondent engagement strategies, which is the focus of this talk.

Respondent communications play a pivotal role in multimode data collection, particularly where the approach is online first. Over the last two years ONS has invested heavily in qualitative and quantitative research to explore the impact of respondent materials and mailing strategies on take-up and response rates. Our respondent centred approach to this work has demonstrated that it is possible for us to achieve a 28% take-up rate for a lightly incentivised online only voluntary social survey.

To develop an effective respondent engagement strategy, we took a blank page approach to our materials. We blended traditional social research methods with innovative methods to create a suite of communications based on respondent insights. Through our research we have learnt about the best mailing strategy for increasing the timeliness of response. We explored whether regionally branded envelopes could break down barriers to opening mail. We also examined whether a push-to-web approach increases coverage and, lastly, how cost-effective it is to begin data collection with a cheaper mode.

We will also reflect on the effectiveness of these materials in a multimode context by discussing the more recent results of the 2018 Labour Market Survey, and share how we are looking to engage with groups of respondents differently in a longitudinal context.


What Works, What Doesn’t? Three Studies Designed to Improve Survey Response

Mr Patrick Langetieg (Internal Revenue Service) - Presenting Author
Ms Brenda Schafer (Internal Revenue Service)
Mr John Guyton (Internal Revenue Service)
Ms Hanyu Sun (Westat)
Dr Jocelyn Newsome (Westat)
Ms Jennifer McNulty (Westat)
Dr Kerry Levin (Westat)

Over the past few decades, the survey industry has experienced a steady decline in response rates (e.g., Groves, 2011; Brick & Williams, 2013). This decline has been attributed to multiple factors, such as the growth in the number of surveys, a decrease in free time (longer commutes and more two-earner households), and an increasing distrust of survey research, even when government-sponsored (Czajka & Beyler, 2016).

In this paper, we present results from three studies conducted in an attempt to increase response rates and reduce nonresponse bias for a national household survey. Study 1 investigated the optimal sequence of contacts to get sample members to respond. We also examined the impact of incentives and the type of nonresponse follow-up. Study 2 examined the effects of an additional (seventh) contact on response rates. Study 3 explored how using friendly or formal messaging in the communication materials affected response, particularly among Millennials (ages 20-35).

In Study 1, we found that a combination of a mail-only approach and the use of $2 prepaid incentives increased response rates. However, in terms of potential nonresponse bias as indicated by estimated response propensities for subgroups, using incentives did not bring into the respondent pool sample members who were less likely to respond. In Study 2, we found that the use of a seventh follow-up contact only slightly increased response rates. Finally, in Study 3, we found that the use of a friendly message actually suppressed response rates. Furthermore, using a friendly message did not affect the response propensities of sample members aged 20-35. In fact, the friendly messaging increased potential nonresponse bias for sample members aged 35-48. We conclude the paper with a discussion of how these findings align with previous research.


We Can Do Better: The Application of Theory for Improving Response Rates to Mixed-Mode Surveys

Professor Don Dillman (Washington State University) - Presenting Author

Declines in survey response rates being observed throughout the world are due in part to the use of data collection designs that ignore how the many elements of a particular design, including decisions on the joint use of survey modes, may interact to affect rates of response. Although several theories of response behavior have been proposed as means of improving response rates, they tend to emphasize particular techniques for improving response while ignoring others, and are often limited to single-mode applications. In addition, these theories provide little more than abstract advice, while ignoring how theory should specifically guide the design of each survey contact and the materials associated with those requests for response. In this paper I propose the need to develop comprehensive data collection designs (from individual communications to the questionnaires and any supporting materials) guided by theory that has been shown effective in explaining human behavior. To do this I critique current theoretical approaches, as well as data collection designs that provide little or no guidance on how each aspect of data collection is likely to connect to other parts in supportive ways. In particular, I discuss how the use of multiple modes of contact and response provides ways of improving response that need to be considered in the development of better theory. I also propose moving away from the individual tests of response-inducing techniques that have tended to dominate response rate research, toward the creation and testing of comprehensive designs that explicitly use theory to guide the development of design details.