ESRA 2023 Program

All time references are in CEST

Push-to-Web Surveys: Challenges and Opportunities 5

Session Organisers: Dr Olga Maslovskaya (University of Southampton)
Dr Peter Lugtig (Utrecht University)
Time: Friday 21 July, 09:00 - 10:30
Room: U6-01e

We live in a digital age, with widespread use of technology in everyday life. Technologies change rapidly and affect all aspects of life, including surveys and their designs. Online data collection is now common in many countries, and some surveys have started employing a push-to-web approach (for example, the Generations and Gender Surveys in some countries, including the UK), in which offline contact modes are used to encourage sample members to go online and complete a web questionnaire. This method of data collection is typically used when sampling frames do not include email addresses for all members of the target population. It is therefore important to address the methodological challenges and opportunities associated with push-to-web surveys.

This session welcomes submissions of papers on different methodological issues associated with push-to-web surveys in both cross-sectional and longitudinal contexts. Examples of topics of interest include:

• Coverage issues
• Data quality issues
• Unit nonresponse
• Response rates and nonresponse bias
• Mobile device use
• Questionnaire length
• Other topics

We encourage papers from researchers with a variety of backgrounds and across different sectors, including academia, national statistics institutes, data collection organisations, and research agencies.

This session aims to foster discussion, knowledge exchange and shared learning among researchers and methodologists around issues related to push-to-web surveys. The format of the session will be designed to encourage interaction and discussion between the presenters and audience.

Keywords: online surveys, push-to-web surveys, nonresponse bias, response rates, data quality

Papers

Two Experiments in a Small Probability-Based Online Panel

Dr Markus Hahn (Australian National University) - Presenting Author
Professor Nicholas Biddle (Australian National University)

Two important qualities of longitudinal surveys are high representativeness and low attrition. Here, we present results from two experiments in which we tested different recruitment messages and longitudinal incentives in a small probability-based online panel.

About 600 adults were recruited in 2022 from the Australian electoral roll and from the population of Australian cellphone owners (through random digit dialling, RDD). Initial contacts occurred through letters, postcards, and SMS text messages; subsequent contacts occurred through email. Panel members were asked to complete an online questionnaire in each wave, with web completion the only option. Incentives were offered in every wave upon questionnaire completion; the electoral roll sample also received an unconditional incentive with their invitation. Respondents enrolled in the panel after completing wave 1. The recruited panel compared favourably to typical benchmarks.

The first experiment varied the recruitment message. One treatment emphasised university sponsorship and non-commerciality; another emphasised the longitudinal nature of the panel, stressing the importance of participating in multiple waves. Results were mixed. Detailed information reduced wave 1 response rates in the SMS sample but increased them in the electoral roll sample. Attrition, however, was lower for those who received more detail, regardless of the recruitment mode.

The second experiment, conducted in wave 2, again stressed the longitudinal nature of the panel, but also offered some participants a bonus incentive for participating in the next three waves. Stressing the importance of ongoing participation reduced response rates. The promise of a bonus after three waves offset that reduction somewhat, but not completely.

Our results have implications for the establishment of online panels: providing detailed information on the nature of the panel may reduce recruitment, but it may also reduce attrition.


Developing the QCEW Business Supplement (QBS)

Mrs Sharon Stang (Bureau of Labor Statistics)
Mr Demetrio Scopelliti (Bureau of Labor Statistics) - Presenting Author

The U.S. Bureau of Labor Statistics (BLS) surveys approximately 1.2 million business establishments every year for the Annual Refiling Survey (ARS). The ARS is a short (five-minute), web-only survey that asks respondents to review and update their business mailing address, physical location address, and main business activity. These data keep the BLS business register (the sampling frame for BLS surveys) current. The survey is conducted on a three-year cycle, with one-third of establishments surveyed each year. Establishments are contacted via email (when an email address is available) or by postal mail and asked to respond online. The ARS is an ideal survey to append additional questions to because it is relatively short and because establishments are contacted only every three years, so there is little risk of overburdening respondents with the additional questions.

Beginning in 2018, BLS began testing a supplemental survey instrument, appended to the end of the ARS, that would allow for the collection of new data at little or no additional cost. In 2020, the platform developed for this purpose was used to conduct a seven-question survey on how businesses were responding to the COVID-19 pandemic. As a follow-up, in 2021, BLS conducted a longer 21-question survey on COVID-19 pandemic impacts; then in 2022, as businesses began to emerge from the pandemic, a 22-question survey was fielded to learn about telework, hiring, and vacancies at establishments.

This paper will cover the QBS online platform design, sampling strategies for supplemental surveys, email and postal mail solicitation techniques, response rates by method of solicitation, the data available from these three surveys, and future plans for the platform in 2023 and beyond.


Study member preferences for questionnaire mode in the MRC National Survey of Health and Development

Ms Maria Popham (MRC Unit for Lifelong Health and Ageing at UCL) - Presenting Author
Dr Andrew Wong (MRC Unit for Lifelong Health and Ageing at UCL)

The MRC National Survey of Health and Development (NSHD) is the oldest of the British birth cohort studies and has actively assessed study members since their birth in 1946. Until 2020, online questionnaires had never been used.

Restrictions due to COVID-19 meant that online questionnaires were essential for collecting data during the pandemic. This gave NSHD study members the opportunity to trial online questionnaires and indicate their preferences for future sweeps. We report quantitative data on responses via different modes (online (O) or postal (P)) across multiple sweeps, supplemented with qualitative information from Advisory Panel study members describing the reasons for their preferred mode.

The first online survey was sent out in May 2020, during the COVID-19 pandemic, when participants were aged 74, with a postal option sent to non-responders shortly after (June 2020). For this first COVID-19 questionnaire, 67% completed the online version (O: n=1,260; P: n=616). This rate increased to 74% (O: n=1,569; P: n=547) for the second COVID-19 questionnaire (Sept 2020 online; Nov 2020 postal) and was maintained at 73% (O: n=1,399; P: n=514) for the third (Feb 2021 online; May 2021 postal). By March 2022, however, the percentage of study members indicating a preference to continue receiving online questionnaires had fallen to 33% (O: n=540; P: n=1,108). The decline continued, with only 15% (O: n=300; P: n=1,661) completing the online version of the 2022 questionnaire (a regular data collection sweep). When asked to discuss their reasons, the ability to annotate the questionnaire was the most commonly cited reason for preferring postal questionnaires.
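The percentages above are simply the online completes divided by total completes in each sweep. A minimal Python sketch, using only the counts reported above (the function name and sweep labels are ours, for illustration), reproduces the reported figures:

    # Online completion share per sweep: online / (online + postal).
    # Counts are those reported in the abstract; labels are illustrative.
    def online_share(online, postal):
        return online / (online + postal)

    sweeps = [
        ("First COVID-19 questionnaire (May 2020 O / June 2020 P)", 1260, 616),
        ("Second COVID-19 questionnaire (Sept 2020 O / Nov 2020 P)", 1569, 547),
        ("Third COVID-19 questionnaire (Feb 2021 O / May 2021 P)", 1399, 514),
        ("Mode preference (March 2022)", 540, 1108),
        ("2022 regular sweep", 300, 1661),
    ]
    for label, o, p in sweeps:
        print(f"{label}: {online_share(o, p):.0%}")  # 67%, 74%, 73%, 33%, 15%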

Our experience has shown that paper questionnaires remain an essential tool for future sweeps in this age group.


Adaptive design for sensitive surveys: the experience of a National Statistical Office (CSO, Ireland)

Dr Jessica M Coyne (Central Statistics Office) - Presenting Author
Mr Tony Kelleher (Central Statistics Office)

The objective of the Safety of the Person (SOP) survey was to design and build a mobile-first, multi-mode survey to collect data on a sensitive topic for the production of national statistics, while simultaneously developing a full survey life cycle to cater for an adaptive design and a respondent-focused instrument for data collection. This was a unique project with a considerable ethical element, owing to the sensitive content of the instrument, and it involved contributions from many stakeholders. Our sample consisted of 13,000 randomly selected individuals from a respondent-based register, with a back-up sample of 7,000. We offered respondents three modes: a computer-assisted web interview (CAWI); a computer-assisted personal interview (CAPI) with a computer-assisted self-interview (CASI) element; or a pen-and-paper instrument (PAPI).

Our data collection period was seven months (June 2022 to December 2022). Our field force consisted of 26 interviewers and three coordinators; the interviewers carried out follow-up calls to respondents who had not participated after receiving the invitation letters. The process developed for the interviewers was termed 'knock to nudge': they encouraged respondents to complete the survey online before offering the other modes. To date, the response rate is 38%, with the following breakdown by mode: 78% CAWI, 17% CAPI/CASI, and 5% PAPI. On average, we received 22 responses per day throughout the collection period, with the highest returns on Tuesdays, Wednesdays, and Thursdays of each week.
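As a rough cross-check of how the reported figures relate, the following Python sketch works out the approximate completes implied by the quoted rates. It assumes the 38% response rate applies to the 13,000-person main sample (the back-up sample is ignored); the variable names are ours, not the CSO's:

    # Approximate completes implied by the reported figures (assumption:
    # the 38% response rate applies to the 13,000 main sample only).
    sample_size = 13_000
    response_rate = 0.38
    completes = sample_size * response_rate  # ~4,940 responses

    mode_shares = {"CAWI": 0.78, "CAPI/CASI": 0.17, "PAPI": 0.05}
    for mode, share in mode_shares.items():
        print(f"{mode}: ~{completes * share:,.0f} responses")

    # A seven-month field period is roughly 210 days, implying ~24
    # responses/day, broadly consistent with the reported average of 22.
    field_days = 7 * 30
    print(f"Implied daily average: ~{completes / field_days:.0f} responses/day")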


Now please do it like this: Pushing university administrators to an online, establishment-based census of student housing

Mr Alfred Tuttle (none) - Presenting Author

The Census Bureau enumerates everyone living in the United States every ten years. A special operation, the Census of Group Quarters, enumerates people living in various types of group quarters (GQs): places where people live or stay in a group living arrangement provided by an organization, such as university housing. The enumeration of GQ residents involves coordination by administrators of GQ organizations, and a suite of options is offered to allow them to select the enumeration procedures that work best for them and their residents. Traditional options include, among others, the distribution and collection of self-administered paper questionnaires completed by residents and in-person interviews conducted by field staff. For the 2020 Census of Group Quarters, a new electronic instrument, eResponse, was developed to enable administrators to enumerate their residents from institutional records using electronic spreadsheet templates.

The coronavirus pandemic lockdown temporarily disrupted 2020 Census field operations, and universities sent most of their students away. Thus, self- and interviewer-administered enumeration methods were no longer viable. At the request of the Census Bureau, many universities opted to use eResponse to enumerate the students who were expected to be living in university housing to ensure that the census estimates accurately reflected the student populations of their communities.

How did the adoption of the new centralized, records-based response option affect universities? eResponse imposed response tasks more like those of an establishment data collection than of a population survey. In this presentation I will discuss the processes and challenges universities faced in enumerating their student housing as establishments rather than facilitating individual responses, and the applicability of establishment survey methods to the collection of population data.