ESRA 2017 Programme

Tuesday 18th July      Wednesday 19th July      Thursday 20th July      Friday 21st July


Wednesday 19th July, 09:00 - 10:30 Room: F2 102

Administrative Records for Survey Methodology 1

Chair: Dr Asaph Young Chun (US Census Bureau)
Coordinator 1: Professor Mike Larsen (George Washington University)
Coordinator 2: Dr Ingegerd Jansson (Statistics Sweden)
Coordinator 3: Dr Manfred Antoni (Institute for Employment Research)
Coordinator 4: Dr Daniel Fuss (Leibniz Institute for Educational Trajectories)
Coordinator 5: Dr Corinna Kleinert (Leibniz Institute for Educational Trajectories)

Session Details

The incorporation of administrative records has long been regarded as a way of improving the quality and interpretability of surveys and censuses and of controlling the rising cost of surveys (Chun and Scheuren, 2011). The increasing number of linked datasets, such as the Health and Retirement Study in the U.S., the National Educational Panel Study in Germany, and Understanding Society in the UK, is accompanied by growing empirical evidence about the selectivity of linked observations. The extent and pace of administrative data use vary from continent to continent and from country to country. This is partly due to differential concerns about privacy, confidentiality, and legal constraints, as well as variability in the acceptance and implementation of advances in statistical techniques to address such concerns.

The primary goal is to control data quality and reduce total survey error. This session will feature papers that implement "total administrative records error" and "total linked data error" methods and provide case studies and best practices of using administrative data throughout the survey life cycle (Chun and Larsen, a forthcoming Wiley book). The session invites papers that discuss fundamental challenges and recent advances in the collection and analysis of administrative records and their integration with surveys, censuses, and auxiliary data. We also encourage submissions discussing institutional collaboration on linked data, sustainable data access, and the provision of auxiliary tools and user support. Papers in this session include, but are not limited to, the following topics:

1. Innovative use of administrative data in household surveys and censuses to improve the survey frame, reduce nonresponse follow-up, and assess coverage error.

2. Quality evaluations of administrative data and quality metrics for linked data.

3. Recent advancements in processing and linking administrative data with survey data (one-to-one) and with multiple sources of data (one-to-many).

4. Recent methods of disclosure limitation and confidentiality protection in linked data, including linkages with geographical information.

5. Bayesian approaches to using administrative data in surveys, censuses, small area estimation, and nonresponse control.

6. Implementation of new tools that facilitate the use of linked data by simplifying complex data structures or handling inconsistent information in life-course data.

7. Strategies for developing and maintaining a user-friendly infrastructure for the analysis and dissemination of linked data, and solutions for collaboration.

8. Applications that transform administrative data into information that is useful and relevant to policymaking in public health, economics, science, and education.

Paper Details

1. Evaluation of the Quality of Administrative Data Used in the Dutch Virtual Census
Mr Eric Schulte Nordholt (Project leader of the Census)
Dr Piet Daas (Senior methodologist)
Dr Martijn Tennekes (Methodologist)
Dr Saskia Ossen (Project leader at the division Data collection)

The last census based on a complete enumeration was held in 1971; since then, the willingness of the population in the Netherlands to participate has decreased tremendously. Statistics Netherlands found an alternative in a Virtual Census, which uses available registers and surveys as alternative data sources. The advantages of a Virtual Census are that it is cheaper and more socially acceptable. The combined use of registers and surveys for composing a census, however, also leads to several methodological challenges. One of them is determining the effect of the quality of the sources. For registers, for instance, collection and maintenance are beyond the control of the statistical institute. It is therefore important that these institutes are able to determine the quality of the sources they use. Insight into the quality of the sources used enables a well-thought-out comparison of comparable information across various sources.

2. Evaluating the Accuracy of Administrative Data to Augment Survey Responses
Dr Marcus Berzofsky (RTI International)
Dr Stephanie Zimmer (RTI International)
Mr Timothy Smith (RTI International)

Surveys in both Europe and the United States are facing pressure to reduce the burden imposed on respondents. One way to achieve this is to shorten the length of the survey instrument, sometimes by removing items that may be critical to the analysis. However, if the information addressed by these items can be obtained through administrative sources with minimal measurement differences (i.e., the values provided from the two sources are similar) then survey methodologists can reduce burden and minimize information loss.

Substituting administrative data for survey data is an ideal solution, so long as two criteria are met. First, the administrative data need to be consistently defined and accurately maintained across all data sources. Second, the manner in which the administrative data attributes are defined needs to be consistent with how the data would be defined through self-reported responses (i.e., with minimal measurement error).

For prison inmates, the agency with custodial jurisdiction, whether a state or the federal government, maintains a large amount of administrative data on each inmate, including demographics, criminal history, and sentencing. These types of data are often collected prior to the final stage of sampling and used for stratification and nonresponse models. However, if an element meets the two criteria, then it could also be used in lieu of a survey item.

The 2016 Survey of Prison Inmates (SPI) obtained a wide range of administrative information on the inmates housed in the sample of prisons. SPI, a periodic nationally representative survey of U.S. prison inmates sponsored by the Bureau of Justice Statistics (BJS) and conducted by RTI International in 2016, collected information from inmates on a variety of topics, including basic demographics, sentence length, and date of admission for their current offense. The administrative data were collected during the rostering process prior to selecting a sample of inmates.
BJS is interested in expanding the use of administrative data in conjunction with self-reported information in the analysis of the inmate population. As such, the administrative data initially collected to assess nonresponse bias can also be used to evaluate how well administrative data may perform as a substitute for self-reported data. While some characteristics, such as age, have little likelihood of measurement error, other characteristics, such as race/ethnicity and sentence length, may not be consistently defined across states or by respondents.

In this presentation, we assess (1) both of the stated criteria for using administrative data in place of self-reports and (2) the consistency and quality of administrative data as they relate to inmates in prison. If these basic inmate characteristics are consistently defined and agree well with the self-reported data, then administrative data on other characteristics, such as economic or criminal history data, may also be viable substitutes for self-reported data.

3. Assessing Administrative Data Quality: The Truth is Out There
Dr Asaph Young Chun (US Census Bureau)
Dr Sonya Porter (US Census Bureau)

Administrative records (AR) are data collected primarily to fulfill administrative purposes by non-statistical agencies such as the Internal Revenue Service, the Social Security Administration, and the Selective Service System. The utility of AR in supplementing complex surveys and censuses has been widely discussed for decades on both sides of the Atlantic (Chun and Scheuren, 2011). The quality of administrative records is central to making decisions about their use over the survey life cycle, from frame construction to adaptive data collection to estimation, including imputation. This paper presents a framework of Total Administrative Data Errors, discusses approaches to assessing administrative data quality and linked data quality, respectively, and demonstrates with case studies the utility of the Total Linked Data Errors approach, including its pros and cons. We discuss practical implications of using administrative records for survey methodology.