
ESRA 2023 Program

              



All time references are in CEST

Digital technologies for the data lifecycle of large-scale surveys 2

Session Organisers: Dr Diana Zavala-Rojas (Universitat Pompeu Fabra, European Social Survey ERIC, RECSM)
Mr Niccolò Ghirelli (European Social Survey ERIC HQ, City, University of London)
Time: Friday 21 July, 09:00 - 10:30
Room: U6-28

At the different stages of the survey data lifecycle, process-generated or user-generated content, meta- and para-data play an increasingly important role. As large-scale surveys rely more on highly standardised processes, project management and data lifecycles utilise more digital tools to ensure both data quality and maintainability. Research infrastructures and survey data providers are developing new digital technologies to ensure greater consistency and to improve communication between the different stakeholders involved.

This session aims to bring together those working with data dashboards and survey tools to share their experiences. We welcome presentations on tools used during the survey data lifecycle (e.g., questionnaire design, sampling, translation, data collection and data processing) from different stakeholders (fieldwork agencies, research infrastructures, academic institutes, etc.) and/or studies (national and cross-national, cross-sectional, and longitudinal, conducted in any mode).

Presentations could outline how methodological and practical issues have been addressed and show how digital technologies have an impact on survey implementation and appraisal of data quality. Presentations may also provide systematic reviews of performance indicators, monitoring dashboards, and studies testing tools.

Keywords: survey tools, survey project management, metadata, total survey error, digital technologies

Papers

myESS: virtual collaborative environment

Dr Diana Zavala (European Social Survey ERIC, Universitat Pompeu Fabra) - Presenting Author
Ms Danielly Sorato (RECSM - Universitat Pompeu Fabra)

This presentation introduces the myESS survey project. myESS is an application project based on: 1) a virtual collaborative work environment; 2) a concept designed to document the roles, interactions and activities of the different stakeholders of a survey project; and 3) a concept for applying it to the ESS ERIC lifecycle. We discuss the advantages and challenges of developing, implementing and adopting in-house management software in distributed survey projects.


LIfBi Study Manager: Current Status and Future Vision of Digitized Study Management in the NEPS

Mr Nils Lerch (Leibniz Institute for Educational Trajectories)
Ms Lea Rauh (Leibniz Institute for Educational Trajectories) - Presenting Author

Many studies, often with complex designs, are administered simultaneously in the National Educational Panel Study (NEPS) at the Leibniz Institute for Educational Trajectories (LIfBi). The division of labor is high in these studies, that is, there are many different multilocal work units with specialized competencies. It is therefore challenging to keep an overview of processes, responsibilities and routines. From a management perspective, organizational integration by a central work unit is required. Furthermore, digital tools can support the central and coordinating role of this work unit by providing the needed overview for all stakeholders. The LIfBi Study Manager is such a digital tool. It was created to improve survey management in the areas of Study Overview, Scheduling and Workflows. The Study Overview continuously keeps all stakeholders informed about the current parameters of the study design; it also documents and archives the final status of studies. Schedules consist of all relevant timelines and milestones of a study. Finally, Workflows have a structuring function: they define working steps in a certain order and determine which actors are involved. These tasks and processes are vital for the success of studies conducted at the LIfBi and for the collaboration between different work units. The aim of the LIfBi Study Manager is to introduce a so-called learning system, that is, a system that collects data that can be used for future surveys. The current status and future developments of the LIfBi Study Manager are presented.
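To make the Workflows idea concrete, here is a hypothetical sketch, not the LIfBi Study Manager's actual data model, of how ordered working steps and the work units involved in them could be represented; all class and attribute names below are illustrative assumptions.

```python
# Hypothetical sketch only -- not the LIfBi Study Manager's actual data model.
# It encodes a workflow as ordered working steps, each naming the work units
# (actors) involved, as described in the abstract.
from dataclasses import dataclass, field


@dataclass
class WorkflowStep:
    order: int                    # position of the step in the workflow
    description: str              # what has to be done
    responsible_units: list[str]  # work units / actors involved


@dataclass
class Workflow:
    study: str
    steps: list[WorkflowStep] = field(default_factory=list)

    def next_open_step(self, completed: set[int]) -> WorkflowStep | None:
        """Return the first step (by order) that has not been completed yet."""
        for step in sorted(self.steps, key=lambda s: s.order):
            if step.order not in completed:
                return step
        return None
```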


Developing an indicator system to compare survey designs: quantifying survey infrastructure work

Dr Roman Auriga (LIfBi - Leibniz Institute for Educational Trajectories) - Presenting Author

Scientific infrastructure is increasingly becoming a recognized field of scientific work. One key issue connecting management tasks within scientific infrastructure and survey studies is the systematisation of survey designs in terms of their complexity, the procedures needed and the efforts behind them. Given all the factors involved, the chance of inappropriate planning, staffing and miscalculation of resources is high. Moreover, as survey study designs can vary to a great extent with regard to data collection modes, the number of target groups or the number of instruments, the entity "survey study" seems not to be the best level for comparing efforts or indicators of scientific infrastructure within or between studies or organisations. Finally, the variety of reported survey implementation indicators or processes makes learning from each other hard. In response to these challenges, I will present a system of quantifiable and comparable survey infrastructure indicators.

We developed the indicator "infrastructure field", which translates the preparation needed to administer a survey into comparable packages and formalises different survey designs and workflows into quantifiable and measurable units. These separable and distinguishable units are defined through temporality (of the survey situation), the coherency of the protocols needed in the field, the surveyed populations, the survey situation and the access mode. Using the indicator system we designed, we are first able to visibly represent the complexity of survey designs. Secondly, prospective planning of staffing and/or workload becomes easier and more accurate. We are also able to provide time-stable and comparable indicators needed for different kinds of official reports. Last but not least, the system helps us to meet the staff's need for their work to be visible. As studies can contain from one up to 10 or more "infrastructure fields", meeting these needs is crucial for successful scientific infrastructure management.
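As an illustration only (the abstract does not give a formal schema), an "infrastructure field" could be encoded along the defining dimensions named above; all attribute names and the simple complexity count below are assumptions, not the author's indicator system.

```python
# Illustrative sketch only: one possible encoding of an "infrastructure field"
# along the dimensions named in the abstract. Attribute names and the
# complexity count are assumptions.
from dataclasses import dataclass


@dataclass(frozen=True)
class InfrastructureField:
    temporality: str       # temporality of the survey situation
    protocol: str          # coherent field protocol the unit is run under
    population: str        # surveyed target population
    survey_situation: str  # e.g. individual vs. institutional setting
    access_mode: str       # e.g. "CAWI", "CATI", "PAPI"


def study_complexity(fields: list[InfrastructureField]) -> int:
    """Number of distinct infrastructure fields a study consists of."""
    return len(set(fields))
```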


Designing a sample and survey management service to support quality methods for the ESS CRONOS 2 online panel

Mr Baptiste Rouxel (SciencesPo)
Mr Malaury Lemaître-Salmon (SciencesPo)
Mrs Geneviève Michaud (SciencesPo, CNRS) - Presenting Author
Mr Tom Villette (SciencesPo)

From 2019 onwards, the European Social Survey ERIC has been collaborating with a small team of software research engineers at the Center for Socio-Political Data (SciencesPo) to support its innovative, high-quality cross-national online survey, in the framework of two European Commission-funded projects: "Social Sciences and Humanities Open Cloud" (SSHOC) and "Next Steps in Securing the Sustainability of the European Social Survey, European Research Infrastructure Consortium" (ESS-SUSTAIN-2).
Together, the teams have co-designed the CRONOS 2 software infrastructure, taking advantage of previous experience gained during the "CROss-National Online Survey" (CRONOS 1, an experiment for the ESS) and the "Longitudinal Internet Studies for Social Sciences" (ELIPSS, a panel for SciencesPo), and inspired by other online panel studies taking place in European countries and beyond.
To provide a feature-rich survey platform and a controlled panel management tool, WPSS (Web Panel Sample Service) has been designed as a two-fold infrastructure: a commercial survey platform, Qualtrics, connected via its application programming interface (API) to a new application written in Python with Django.
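As a minimal sketch of the kind of coupling described (and not the WPSS codebase itself), a Django-side helper might query the Qualtrics REST API along these lines; the data-centre host, the token handling and the use of the `requests` library are assumptions made for illustration.

```python
# Minimal sketch, not the WPSS implementation: a helper that lists the surveys
# visible to a Qualtrics API token. The data-centre host is a placeholder and
# token handling is simplified for illustration.
import requests

QUALTRICS_BASE = "https://yourdatacenterid.qualtrics.com/API/v3"  # placeholder host


def list_surveys(api_token: str) -> list[dict]:
    """Return basic metadata for every survey the API token can see."""
    response = requests.get(
        f"{QUALTRICS_BASE}/surveys",
        headers={"X-API-TOKEN": api_token},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["result"]["elements"]
```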
SciencesPo has delivered a software infrastructure that was used during a pilot test (2020). This test provided decisive feedback for the development process and allowed the teams to improve and refine the software product. Multilingual and scalable, trustworthy and secure, and providing flexible contact modes, the infrastructure has been in use since September 2021 by 12 countries and has served harmonised questionnaires to more than 10,000 panelists, sending coordinated emails and short text messages. The presentation will propose a detailed view of the software features, especially the custom survey and message dashboards, and how such features can support quality survey methods and contribute to overall data quality.