Wednesday 19th July, 09:00 - 10:30 Room: N AUD4


Adaptive and Responsive Designs in Complex Surveys: Recent Advancements and Challenges 1

Chair: Mr Jason Fields (U.S. Census Bureau)
Coordinator 1: Dr Asaph Young Chun (U.S. Census Bureau)
Coordinator 2: Professor James Wagner (University of Michigan)
Coordinator 3: Dr Barry Schouten (Statistics Netherlands)
Coordinator 4: Ms Nicole Watson (University of Melbourne)

Session Details

Adaptive and responsive survey designs (Groves and Heeringa, 2006; Wagner, 2008) have attempted to respond to a changing survey environment that has become increasingly multimode, multilingual, and driven by multiple data sources. The Journal of Official Statistics will publish a Special Issue on adaptive design in complex surveys and censuses in 2017 (edited by Chun, Schouten, and Wagner, forthcoming). To address the multiple challenges affecting the survey community, and its fundamental interest in producing quality data, we propose a session of papers that discuss the latest methodological solutions and challenges in adaptive and responsive designs for complex surveys. We encourage submission of papers on the following topics of adaptive or responsive design:

1. Applied and theoretical contributions, and comparisons of variants of adaptive design that leverage the strengths of administrative records, big data, census data, and paradata. For instance, what cost-quality tradeoff paradigm can be operationalized to guide the development of cost and quality metrics and their use across the survey life cycle? Under what conditions can administrative records or big data be used adaptively to supplement survey data collection and improve data quality?

2. Papers addressing the three drivers of adaptive/responsive design: cost, respondent burden, and data quality. For instance, what indicators of data quality can be integrated to monitor the course of the data collection process? What stopping rules for data collection can be used across the phases of a multi-mode survey?

3. Papers involving longitudinal survey designs, where data collection systems need to fulfill their panel focus and provide data for the same units over time, while leveraging adaptive processes to reduce cost, reduce burden, and/or increase quality. For instance, how can survey managers best engage the complexity of issues around implementing adaptive and responsive designs, especially for panel surveys that are in principle focused on measuring change over time? How are overrepresented or low-priority cases handled in a longitudinal context?

4. Papers involving experimental designs or simulations of adaptive survey design. For instance, experimental implementations of an adaptive design, especially those involving multiple data sources, mixed-mode data collection, or a cross-national design.

5. Papers that apply Bayesian methods to build adaptive designs: for example, designs whose parameters are given priors that are then updated as additional data are collected (see the sketch after this list).
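As a hedged illustration of topic 5 (not drawn from any of the papers below), the sketch assumes a Beta-Binomial model: each sample stratum's response propensity gets a Beta prior, the prior is updated with the contacts and completes observed in a phase, and the posterior guides where the next phase's effort goes. The strata, prior parameters, and outcome counts are all hypothetical.

    # Minimal sketch of one Bayesian adaptive-design step; strata, priors,
    # and outcome counts are hypothetical.
    from scipy.stats import beta

    # Beta(a, b) prior per stratum, e.g. centred on last wave's response rates.
    priors = {"urban_young": (3, 7), "urban_older": (5, 5), "rural": (4, 6)}

    def update(prior, completes, contacts):
        # Posterior after observing `completes` successes in `contacts` trials.
        a, b = prior
        return a + completes, b + (contacts - completes)

    # Phase 1 outcomes per stratum: (completes, contacts), hypothetical counts.
    phase1 = {"urban_young": (12, 80), "urban_older": (30, 75), "rural": (20, 70)}
    posteriors = {s: update(priors[s], *phase1[s]) for s in priors}

    # Direct phase 2 effort at the strata with the lowest expected propensity.
    for s, (a, b) in sorted(posteriors.items(), key=lambda kv: kv[1][0] / sum(kv[1])):
        lo, hi = beta.ppf([0.05, 0.95], a, b)
        print(f"{s}: posterior mean {a / (a + b):.2f}, 90% interval ({lo:.2f}, {hi:.2f})")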



Paper Details

1. Aggressive, Relaxed, or Simply the Default? Adaptive Survey Design Strategies to Reduce Nonresponse Error
Dr Antje Kirchner (RTI International)
Dr Nicole Tate (RTI International)
Dr Emilia Peytcheva (RTI International)
Mrs Jennifer Cooney (RTI International)
Dr Natasha Janson (RTI International)

Adaptive designs have been used for almost a decade in survey research, providing researchers with methods to balance survey quality and survey cost during data collection (Groves and Heeringa, 2006). This article evaluates the effectiveness of such a design, aimed at increasing response rates and reducing the potential for nonresponse bias, in the 2016/17 Baccalaureate and Beyond Longitudinal Study (B&B) field test of college graduates. We use prior wave response behavior to assign cases to different data collection protocols, varying contact frequency, mode, incentive amount, and interview length.

More specifically, prior wave nonrespondents were randomly assigned to either the default protocol (used in the prior data collection) or an aggressive data collection protocol (aimed at converting nonrespondents through higher incentives, more frequent contacts, and an abbreviated interview). Early respondents from the prior study were assigned to a relaxed protocol, and all other respondents to the study default protocol. Data collection consisted of three phases, in which different design components (incentive, survey length, or mode) were changed. We investigate response rates, sample representativeness, and nonresponse bias using frame data for each group and each phase. The availability of a rich set of auxiliary frame data for all sample members allows us to compare the effectiveness of the different data collection protocols.

Preliminary results indicate that the response rate is significantly higher in the aggressive protocol (RR2 = 34.3%) than in the default protocol (RR2 = 19.8%, p < 0.001) for base-year nonrespondents, and significantly higher in the relaxed protocol (RR2 = 71.0%) than in the default protocol (RR2 = 63.5%, p < 0.01) for base-year respondents. Furthermore, while response rates increased significantly with each data collection phase, nonresponse bias, assessed using Mahalanobis distance measures, decreased significantly in the first data collection phase but not in later phases. We discuss the implications of our findings for data quality and efficiency gains in data collection.
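The abstract does not give the exact computation behind its Mahalanobis distance measure, so the following is only a hedged sketch of the usual idea, with simulated frame covariates: compare the respondents' mean vector on frame variables with the full sample's mean vector, scaled by the full-sample covariance.

    # Hedged sketch of a Mahalanobis-distance check for nonresponse bias;
    # the frame covariates and response indicator are simulated, not B&B data.
    import numpy as np

    rng = np.random.default_rng(0)
    frame = rng.normal(size=(5000, 4))      # frame covariates, all sample members
    responded = rng.random(5000) < 0.3      # response status after a phase

    diff = frame[responded].mean(axis=0) - frame.mean(axis=0)
    cov = np.cov(frame, rowvar=False)       # full-sample covariance
    d = float(np.sqrt(diff @ np.linalg.solve(cov, diff)))
    print(f"Mahalanobis distance, respondents vs full sample: {d:.3f}")

A distance that shrinks from phase to phase indicates that the respondent pool is moving toward the full-sample covariate profile, which is one way to read the abstract's phase-by-phase comparison.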


2. An adaptive design approach to web fieldwork on the UK Household Longitudinal Study
Miss Hannah Carpenter (Kantar Public)
Dr Jon Burton (Institute for Social and Economic Research, University of Essex)

The UK Household Longitudinal Study, also known as Understanding Society, started in 2009, and for the first seven waves almost all interviewing was completed face to face. At wave 8 in 2016 the study became mixed mode, with 40% of households initially invited to take part online (with non-responders followed up face to face). To make the study as cost-effective as possible, it was important to maximise the number of people, and in particular whole households, that completed the survey online before the start of face-to-face interviewing. Sample was issued in monthly batches, so, to test the most effective design for web fieldwork, an adaptive design approach was used to vary the length of web fieldwork, the incentives used, and the reminder strategy for different batches. Most of these variations were implemented experimentally. This paper will cover the practicalities of using an adaptive design approach, as well as the details of the different web fieldwork designs used, how they affected response rates, and whether or not they were cost-effective.


3. A Preliminary Development of a Methodological Approach to Case Prioritization using Composite R-Indicators for the Survey of Income and Program Participation (SIPP)
Dr Kevin Tolliver (U.S. Census Bureau)
Dr Jason Fields (U.S. Census Bureau)

The Survey of Income and Program Participation (SIPP) is a longitudinal household survey conducted by the U.S. Census Bureau. Government policymakers depend heavily upon SIPP for information on employment dynamics, the utilization of government assistance programs, the family context, and their interactions. The SIPP program has undertaken a test of adaptive design procedures to increase the representation of movers, as well as an effort to increase the sample balance of program participants relative to their distribution in the initial sample. Movers are prioritized before and during data collection using administrative data. The program participation prioritization is based on work the SIPP research team has undertaken to use R-indicators as a tool to target under-represented households and balance on variables related to program participation. During the first test, the research team found that balancing on some of the 17 variables could drive imbalance in other variables. This research describes the overall planning for the development of adaptive procedures in the SIPP and the research to develop a methodological approach for case prioritization using a composite R-indicator that combines the variable-level and category-level R-indicators with cases' propensity to respond.
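The abstract does not spell out the composite indicator, so the following is only a hedged sketch of its standard building blocks: the overall R-indicator, computed from estimated response propensities, and a category-level unconditional partial indicator for one frame category. Data, variables, and propensities are simulated.

    # Hedged sketch of R-indicator building blocks; simulated data, not SIPP's
    # actual composite or its 17 balancing variables.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000
    participant = rng.random(n) < 0.25                 # frame variable Z
    rho = np.where(participant, 0.45, 0.65)            # estimated propensities
    rho = np.clip(rho + rng.normal(0, 0.05, n), 0.01, 0.99)

    # Overall R-indicator: 1 means perfectly balanced response propensities.
    R = 1 - 2 * rho.std()

    # Category-level unconditional partial indicator for Z = participant:
    # sqrt(P(Z=z)) times the category's deviation from the mean propensity.
    p = participant.mean()
    partial = np.sqrt(p) * (rho[participant].mean() - rho.mean())

    print(f"R-indicator: {R:.3f}; partial (participants): {partial:+.3f}")

A large negative partial flags program participants as under-represented, so such cases would be prioritized in the next phase; combining partials across variables and categories is one route to a composite, though the SIPP team's actual construction may differ.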


4. Managing trade-offs between fieldwork cost, response bias and response rates in a large-scale longitudinal survey
Dr Lisa Calderwood (University College London)
Mr Martin Wood (NatCen Social Research)
Dr Pablo Cevera Alvarez (University of Salamanca)

Responsive or adaptive designs involve making adjustments to survey design protocols during data collection in order to actively manage, in real time, the trade-off between fieldwork costs, response bias and response rates (Groves and Heeringa, 2006; Couper and Wagner, 2011). These approaches are facilitated by technological developments such as electronic sample management systems allowing the real-time collection and analysis of paradata. Schouten et al. (2011) recommend the use of the 'R' indicator to evaluate representativeness in real time. Longitudinal studies are generally better able than cross-sectional surveys to evaluate response bias, as they can use prior wave data.
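For reference, the R-indicator from the cited literature (not restated in this abstract) is defined from the standard deviation of the estimated response propensities \hat\rho_i:

    R(\hat\rho) = 1 - 2\,S(\hat\rho), \qquad
    S(\hat\rho) = \sqrt{\frac{1}{N-1}\sum_{i=1}^{N}\bigl(\hat\rho_i - \bar{\hat\rho}\bigr)^2}

so that R = 1 corresponds to equal propensities (a maximally representative response) and lower values indicate greater imbalance.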

Although there has been a growing interest in this area, relatively few large-scale surveys have implemented these approaches. This paper will report experiences from the eighth wave of Next Steps (previously known as the Longitudinal Study of Young People in England) in managing the trade-offs between fieldwork cost, response bias and response rate in a live survey setting.

Next Steps is a cohort study following around 16,000 young people in England born in 1989-1990. The sample was recruited in 2004, when the study members were aged 13/14. By wave 7 in 2010, the sample size had fallen to around 8,600, and this attrition was strongly related to baseline socio-demographic characteristics. The study was re-started in 2013, and another wave of the survey was conducted in 2015/2016, for which all participants who had ever taken part were approached again.

The survey design for Wave 8 consists of a sequential mixed-mode approach involving web, telephone and face-to-face interviewing. Due to the length of time, and the variability in time, since the study members were last interviewed, there was considerable uncertainty about what the response rates would be, both overall and for each of the survey modes, and therefore what the cost of the fieldwork would be.

In order to help manage this uncertainty, we conducted a 'soft launch' of the survey involving a random sub-sample of 2,000 cases, which provided more reliable estimates of response rates and fieldwork costs. The intention was that this would also allow us to evaluate the impact of additional fieldwork effort, particularly re-issuing and 'enhanced tracing', on response rates and response bias. Overall, our aim was to maximise the overall response rate, and hence the achieved sample size, and to minimise non-response bias, within the context of a fixed fieldwork budget.

We set out to use response models to compute a response probability for each participant, and to use R-indicators to monitor the representativeness of the achieved sample at key points during fieldwork, to evaluate the impact of fieldwork effort on sample representativeness, and, in the event that budget constraints made it impossible to issue all cases to face-to-face fieldwork, to help prioritise the cases to be issued.
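A hedged sketch of the kind of pipeline this paragraph describes: fit a response model on prior-wave covariates, score every participant, and rank cases so that face-to-face follow-up can be targeted within the budget. The model, covariates, and budget cap are hypothetical, not the Next Steps production system.

    # Hedged sketch of propensity-based case prioritisation under a budget cap;
    # covariates, outcomes, and the cap are simulated.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    n = 8_600
    X = rng.normal(size=(n, 5))                # prior-wave covariates
    y = (rng.random(n) < 0.6).astype(int)      # response observed at soft launch

    model = LogisticRegression().fit(X, y)
    propensity = model.predict_proba(X)[:, 1]  # response probability per case

    # Suppose the budget covers only 3,000 face-to-face issues: prioritise the
    # lowest-propensity cases, whose conversion most improves representativeness.
    budget = 3_000
    issue_f2f = np.argsort(propensity)[:budget]
    print(f"issuing {budget} cases; mean propensity "
          f"{propensity[issue_f2f].mean():.2f} vs overall {propensity.mean():.2f}")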