
ESRA 2023 Glance Program


All time references are in CEST

Push-to-Web Surveys: Challenges and Opportunities 4

Session Organisers: Dr Olga Maslovskaya (University of Southampton)
Dr Peter Lugtig (Utrecht University)
Time: Thursday 20 July, 16:00 - 17:30
Room: U6-23

We live in a digital age with widespread use of technologies in everyday life. Technologies change very rapidly and affect all aspects of life, including surveys and their designs. Online data collection is now common in many countries, and some surveys have started employing a push-to-web approach (for example, the Generations and Gender Surveys in some countries, including the UK), in which offline contact modes are used to encourage sample members to go online and complete a web questionnaire. This method of data collection is typically used when sampling frames do not include email addresses for all members of the target population. It is important to address the different methodological challenges and opportunities associated with push-to-web surveys.

This session welcomes submissions of papers on different methodological issues associated with push-to-web surveys in both cross-sectional and longitudinal contexts. Examples of topics of interest include:

• Coverage issues
• Data quality issues
• Unit nonresponse
• Response rates and nonresponse bias
• Mobile device use
• Questionnaire length
• Other topics

We encourage papers from researchers with a variety of backgrounds and across different sectors, including academia, national statistics institutes, data collection organisations, and research agencies.

This session aims to foster discussion, knowledge exchange and shared learning among researchers and methodologists around issues related to push-to-web surveys. The format of the session will be designed to encourage interaction and discussion between the presenters and audience.

Keywords: online surveys, push-to-web surveys, nonresponse bias, response rates, data quality

Papers

The innovative ‘ePhenome’ tool: Remotely assessing phenotypes for the very large, whole-of-state Generation Victoria (GenV) cohort of parents and newborns.

Dr Susan Clifford (Murdoch Children's Research Institute, University of Melbourne) - Presenting Author
Dr Fanhong Shang (Murdoch Children's Research Institute, University of Melbourne)
Ms Katie McBain (Murdoch Children's Research Institute, University of Melbourne)
Ms Bran Ranjithakumaran (Murdoch Children's Research Institute)
Professor Richard Saffery (Murdoch Children's Research Institute, University of Melbourne)
Professor Sharon Goldfeld (Murdoch Children's Research Institute, University of Melbourne, Royal Children's Hospital Melbourne)
Professor Melissa Wake (Murdoch Children's Research Institute, University of Melbourne, University of Auckland)

BACKGROUND: A major barrier to mounting very large, geographically dispersed cohorts is the need for high-throughput, cheap, low-burden phenotypic measurement. The whole-of-state Generation Victoria (GenV) cohort, launched during the COVID-19 pandemic, is Australia’s largest child and adult research project. We developed an ‘ePhenome’ digital platform, accessible on all devices, to measure key cohort-relevant phenotypes over time.

METHODS: The ‘GenV & Me’ ePhenome custom App and website were co-developed with a digital health company and have been piloted and refined over nine incremental software releases so far. Using only the participant’s smartphone, tablet or computer, the platform remotely captures survey items, diagnoses, life events, photos, videos, adaptive tests and direct assessments. Selection of measures was guided by GenV’s Outcomes Framework, Core Outcome Sets, collaborators, the literature, major cohorts and a parent panel; it spans cognition, physical and mental health, functioning, growth, dysmorphology, hearing and vision. Measures were allocated to assessments sent to parents (up to four per year); each is brief, low burden, intuitive, appealing and compatible across devices.

RESULTS: Since deployment in May 2022, we have been sending hundreds of surveys daily. Parents have completed 15,000 surveys across children’s first 9 months of life and uploaded >1200 baby videos. Data quality and participant engagement are tracked in near real time. Additional assessments, more sophisticated data capture methods (e.g. interactive cognition and hearing assessments) and user experience features (e.g. data playback, multiple languages, enhanced App security) launch in 2023. We will showcase the breadth of GenV’s phenotypic measures and the innovative survey, software and participant management methods developed to enable remote, large-scale phenotypic assessment.

CONCLUSIONS: High-throughput, wholly digital data collection requiring only a participant’s own device can support measurement that previously required face-to-face assessment, enabling observational and interventional research at very large scale and across dispersed geographies.


Transitioning an Employee Panel Survey from Telephone to Online and Mixed-Mode Data Collection

Mr Jan Mackeben (Institut für Arbeitsmarkt- und Berufsforschung) - Presenting Author
Mr Joe Sakshaug (Institut für Arbeitsmarkt- und Berufsforschung)

Employee (panel) surveys, which are essential for measuring ongoing labor market developments, face significant challenges in respondent recruitment and retention. Even interviewer-administered panel surveys, historically considered the gold standard of data collection, face high costs and nonresponse problems that threaten their sustainability and inferential capabilities. Nevertheless, many employee (panel) surveys still use costly interviewer-administered modes to reach this special population, and certain employee subgroups may be especially hard to reach using these modes. Supplementing a single-mode telephone employee survey with online data collection is a popular way of reducing costs and may increase coverage and reduce nonresponse. However, the effects of introducing online data collection in an ongoing panel survey of the employed population have received little attention. We address this research gap by analyzing a mode design experiment embedded in the fourth wave of a German employee panel survey. Individuals were randomly assigned to either the standard telephone-only design or a sequential web-telephone mixed-mode design. An invitation letter experiment was also conducted to test the effect of mentioning the telephone follow-ups in the web survey invitation. Introducing the mixed-mode design led to higher response rates in both the refreshment and panel groups, higher (refreshment group) or similar (panel group) levels of nonresponse bias, and lower costs compared with the single-mode design. The likelihood of web participation varied across certain employee subgroups, including by occupation type and employment contract. Mentioning the telephone follow-ups had no effect on participation in the web starting mode or in the full mixed-mode design. Implications of these findings for survey practice are discussed.
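[Editor's note] As background for readers less familiar with the nonresponse bias comparisons mentioned above, the sketch below is a generic illustration, not the authors' code or data, of one common way such bias is assessed when the sampling frame provides register variables for respondents and nonrespondents alike: the respondent mean of a frame variable is compared with the mean over the full gross sample, separately for each mode design. All variable names and values are hypothetical.

# Generic illustration only -- not the authors' method or data.
# Nonresponse bias on a frame variable: respondent mean minus full-sample mean,
# computed separately for each experimental mode design.
from statistics import mean

def nonresponse_bias(frame_values, responded):
    """frame_values: register variable for every sampled person (gross sample);
    responded: parallel list of booleans indicating survey completion."""
    full_mean = mean(frame_values)
    resp_mean = mean(v for v, r in zip(frame_values, responded) if r)
    return resp_mean - full_mean

# Hypothetical toy data for two experimental arms (values are invented):
telephone_only = ([30, 45, 52, 38, 61], [True, False, True, False, False])
mixed_mode     = ([33, 47, 50, 40, 59], [True, True, True, False, True])

for label, (values, resp) in [("telephone-only", telephone_only),
                              ("web-telephone mixed-mode", mixed_mode)]:
    print(f"{label}: estimated bias on frame variable = {nonresponse_bias(values, resp):+.2f}")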


The Generations and Gender Survey (GGS) in the UK: Results from user testing and methodological experiments

Dr Olga Maslovskaya (University of Southampton) - Presenting Author
Miss Grace Chang (University of Southampton)
Professor Brienna Perelli-Harris (University of Southampton)
Professor Ann Berrington (University of Southampton)

The Generations and Gender Survey (GGS) is part of a global data collection infrastructure focused on population and family dynamics. The GGS collects demographic, economic, and social attitude data on young and mid-life adults (aged 18-59) as they enter adulthood, form partnerships, and raise children. In this presentation, we describe the design of the UK GGS, which has been conducted online using a push-to-web approach to data collection. We reflect on the challenges of conducting probability-based online data collection in the UK context, including the absence of an individual-level sampling frame. Our project included an enhancement component that developed and tested new data entry formats and summary screens to improve the quality of life history data. The results of user testing suggest that respondents found a split date-entry format (entering the year manually but selecting the month from a list of options) the most user-friendly. Based on the user testing, the decision was taken not to proceed with implementation of the summary screens, which had been designed in the hope of improving the quality of life history data. The reasons for this decision will be presented and discussed.
We also present the results of two methodological experiments: an incentives experiment and a QR codes experiment. The results of the incentives experiment, conducted in stage 1 of the data collection, informed the incentive structure for stage 2, which will finish in March 2023. The QR codes experiment will be conducted in this second stage of the data collection.


Learning From the Past, Building the Future: Changing the Way We Think About Web-Only Surveys.

Mr Kris Pate (Internal Revenue Service) - Presenting Author
Dr Scott Leary (Internal Revenue Service)
Mrs Brenda Schafer (Internal Revenue Service)
Mr Rizwan Javaid (Internal Revenue Service)
Mr Pat Langetieg (Internal Revenue Service)

“When we dedicate ourselves to a plan and it isn’t going as we hoped, our first instinct isn’t usually to rethink it. Instead, we tend to double down and sink more resources in the plan” (Grant, 2021). What happens when your survey printer goes out of business, and you don’t have the time or money to get a new printer through the standard process?

The Internal Revenue Service (IRS) Taxpayer Burden Survey program gathers data from taxpayers and businesses about the burden of complying with federal tax reporting requirements. These data are collected through the administration of 18 different surveys. Historically, 13 of these surveys have been administered in a multi-mode protocol (paper and web), while the remaining 5 have been administered as web-only. All materials (survey invitation letters, survey packets, reminder letters) for these surveys were printed and mailed by privately contracted printing firms. In 2022, the contracted printer for all of our Fiscal Year 2023 surveys unexpectedly went out of business five weeks before the first survey launch date. It was not possible to obtain a new external print contract in time for a successful data collection.

We quickly partnered with an internal IRS printing resource that could only print letters. Because we could not print survey booklets, all multi-mode surveys had to be converted to a web-only protocol. We used prior survey and web test data to redesign the data collection protocol.
This paper explores the impact of the pivot to web-only by analyzing response rates and potential nonresponse bias from surveys that were converted from multi-mode to web-only and comparing them to burden surveys that have always had web-only protocols. What we learn will shape the future of the entire IRS Burden Survey Program.


Surveys of health care organizations in times of crisis. Insights from an online survey of nursing, palliative and hospice care facilities in Germany during the COVID-19 pandemic

Ms Diana Wahidie (Witten/Herdecke University) - Presenting Author
Dr Yüce Yilmaz-Aslan (Witten/Herdecke University)
Professor Patrick Brzoska (Witten/Herdecke University)

Background
The COVID-19 pandemic has had a significant impact on health care. Compliance with hygiene regulations and the associated increased workload also affect the successful implementation of research projects. The aim of this study was to demonstrate the challenges and opportunities of (online) surveys of health care organizations during the COVID-19 pandemic, using nursing, palliative, and hospice care facilities in Germany as an example.

Methods
Between February and June 2021, the facility managers of all inpatient nursing facilities (N=10086) and all palliative and hospice care facilities (N=632) in Germany were invited by email to participate in an online survey, followed by four reminders. Facilities that did not want to participate in the online survey were asked to complete a non-response form.

Results
The willingness to participate in the study was low. This can be attributed, among other factors, to the dynamic nature of the research topic, the heavy workload experienced by stakeholders in the nursing care sector, and the associated lack of time. However, each reminder produced a substantial increase in responses. In total, 1561 facilities responded to the survey (response rate: 14.6%) and 266 facilities (2.5%) participated in the non-responder survey. The most common reasons given for nonparticipation were lack of time, the facility not being affected by the study topic, and other reasons such as too many parallel surveys focusing on COVID-19.
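[Editor's note] For readers checking the figures, the quoted rates follow directly from the invited sample sizes reported in the Methods section; the short sketch below (not part of the abstract, and assuming all invited facilities count as eligible cases) reproduces them.

# Sketch reproducing the quoted rates from the numbers in the abstract,
# assuming all invited facilities are eligible cases.
invited_nursing = 10086            # inpatient nursing care facilities invited
invited_palliative_hospice = 632   # palliative and hospice care facilities invited
invited_total = invited_nursing + invited_palliative_hospice  # 10718 facilities

survey_respondents = 1561          # facilities that responded to the main survey
nonresponse_forms = 266            # facilities that completed the non-response form

print(f"Survey response rate:   {survey_respondents / invited_total:.1%}")  # 14.6%
print(f"Non-response form rate: {nonresponse_forms / invited_total:.1%}")   # 2.5%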

Discussion
Research methods such as the present online survey have advantages in times of a pandemic, such as surveying a large sample within a short time frame at low cost, but their limitations must be considered when interpreting the results. In addition to the low response rate, these include high dropout rates, limited representativeness due to sociodemographic differences among online participants, nonparticipation by facilities heavily affected by COVID-19 infections, and technical issues.