Friday 21st July, 11:00 - 12:30 Room: N AUD4


What do we tell them, and how? Reviewing current practices for communicating about surveys with respondents 1

Chair: Mr Alfred Tuttle (US Census Bureau)

Session Details

Surveys generally involve more than just the interaction between a respondent and the survey instrument or interviewer. Surveyors must also communicate information about a survey that prepares respondents for response – what types of information will be collected, how to access the survey, the time frame for completion, the availability of modes, etc. In addition, surveyors may provide information intended to convey the importance of the survey – planned and past uses of results, names of important sponsors and other stakeholders, etc. In short, surveyors attempt to affect respondents’ decisions 1) to respond, 2) in a timely manner, and 3) with the effort needed to ensure the reporting of accurate data.

This long-standing challenge is complicated by rapid technological and social change. Today’s surveyors, faced with new difficulties such as declining response rates, changes in the communication behaviors of target populations, competition from telemarketers for respondents’ attention, and emerging cyber-security threats, must adapt their communications to address these changes in the survey environment.

This session will explore surveyors’ experiences with developing, implementing, and evaluating survey communications. Topics of particular interest include (but are not limited to):
• Empirical evaluations of the effectiveness of survey communications
• Development and evaluation of messages related to privacy, data security, use of administrative records, and other recent topics of concern to respondents and surveyors
• The use of newer communication modes such as email, SMS, social media, etc., as well as novel uses of traditional modes
• Research into respondent behaviors with regard to processing survey materials and making decisions about cooperation
• Application of theoretical approaches to crafting messages to positively affect respondents’ decision to participate, e.g., social exchange theory, cost-benefit analysis, principles of psychological influence, etc.
• The impacts of imposed requirements – legal statutes, informed consent policies, institutional review boards, etc. – and how surveyors adapt their communications to meet these requirements

Paper Details

1. What should we (and should we not) tell them? Qualitative testing of different approaches and materials to recruit members of a new Australian online probability panel
Ms Karen Kellard (Australian National University)

The Social Research Centre (owned by the Australian National University) has recently established the country’s first nationally representative online probability panel – Life in Australia (LinA). As part of the early planning, qualitative research was conducted to explore and test various options for recruitment to the panel. This paper reports on the techniques used and how respondent behaviours influenced the subsequent design of LinA.

Background
The Social Research Centre (SRC) team responsible for developing the Life in Australia (LinA) panel produced draft recruitment and survey materials – all designed for the purpose of randomly recruiting panel members to LinA. The team recognised that qualitative exploratory testing was essential to assess people’s general responses to the different messaging and modes, so that subsequent ‘opt-in’ and participation in the panel could be maximised. A qualitative sample of 17 respondents participated in either a focus group or an individual face-to-face in-depth interview; this was found to be sufficient in terms of saturation of themes and consistency of views.
Instruments and approaches tested
The main instruments that were tested on research respondents included:
• Introductory recruitment script: a script prepared for interviewers at the Social Research Centre’s call centre to use when ‘cold-calling’ potential research respondents. Respondents were asked for their impressions of the introduction, the opening statement, and the explanation of the research. Researchers tested three different versions of the opening statement.
• Primary Approach Letter (PAL): a letter intended to be sent to a sample of potential respondents to explain the research and what is expected of panellists if they join. Feedback was obtained on the content, layout and credibility of the letter, and on comprehension of its content.
• Pre-approach SMS: respondents were asked how they would be likely to respond if they received a text message telling them they had been selected to participate, their impressions of the SMS, and their perceptions of the authenticity of the text message.
• Logos and strapline: different study ‘straplines’ were tested to determine preferences.
• Envelopes: two options were created for respondents to select their preference.
The draft ‘Life in Australia’ instruments were tested using a concurrent probing technique: respondents were asked to listen to or read the materials, and then responded to a series of probes exploring the cognitive aspects of the questions (i.e. comprehension, retrieval, etc.).
The qualitative interviews and focus group also covered the following topics:
• Overall views of the concept (being invited to be part of the panel, views on terms such as ‘research’, ‘panel’, and ‘study’)
• Views on conversational approaches to recruitment
• Views on incentives, including frequency, amount and type.
The aim of the research was to assess whether the materials that had been developed were effective in engendering participation in the ‘Life in Australia’ panel. To avoid any advance conditioning, respondents were not fully briefed on the purpose of the research prior to attendance, other than being advised that they would be assisting in the development of a new ‘product’.


2. Improving mail contact communications: results from an experimental test of a new approach
Dr Don Dillman (Washington State University)
Mr Pierce Greenberg (Washington State University)

Interest in using mail contacts, either to push respondents to the web or to obtain mail-back responses to household surveys, is increasing because of the high coverage of registration lists and Postal Service residential addresses. As a replacement for RDD telephone contacts, mail contact and the use of postal questionnaire materials make it possible to develop and use multiple communications to encourage response far more effectively than can now be done by telephone. Yet one of the most understudied aspects of mail contact is the nature of the communication attempts made with sample members. In this presentation we report results from a mail-only survey experiment that evaluates a comprehensive and integrated design of all communication components sent by mail. The communication process is conceptualized as including all contents of these contacts, including the outside of the mail envelope, the contents of the envelope, the design of the questionnaire cover pages, and the initial questionnaire content. Results of this approach are compared against a standardized approach that incorporates common government survey practices across multiple contacts. The mail-back experiment uses Postal Service address-based sampling of households in a lower-internet-penetration area of the United States. Our ultimate goal is to provide better insight into how communications with respondents affect response behavior and how they might be integrated into designing more effective communications for the web-push survey methods now used in many countries.


3. The effectiveness of introductory motivational messages for response quality improvement in web surveys
Dr Nejc Berzelak (University of Ljubljana, Faculty of Social Sciences)
Dr Ana Villar (City University London)
Ms Elena Sommer (City University London)

Emphasising the importance of a study on the welcome page of a web survey is a common attempt to stimulate survey participation. Less is known about how to improve data quality, and in particular how to encourage respondents to provide answers that are as accurate as possible. This study investigates the effectiveness of introductory motivational messages and explicit commitment statements on various indicators of response quality in a probability-based online panel.

The existing literature on improving respondent engagement through motivational messages has typically focused on two approaches: 1) emphasising the value of respondents’ contribution to the research, and 2) asking respondents to explicitly commit to providing accurate answers. Respondents may be more likely to put sufficient effort into the response process if they perceive their participation as a valuable contribution to the research, and an explicit commitment may further strengthen this effect by increasing their sense of obligation to accomplish what they have agreed to. If these approaches are successful, respondents will be more motivated to invest effort in responding, which could in turn reduce response shortcutting behaviours (such as nondifferentiation) and improve the reliability and validity of the data.

The literature exploring introductory motivational messages and commitment statements in web surveys is scarce. Two existing studies found positive, if relatively small, effects on response quality. Their results suggest that the improvement may only be observed among specific groups of respondents, such as the more highly educated and those who provide answers of decent but not optimal quality. Further research is needed to understand thoroughly how these approaches perform among various groups of respondents, in terms of response quality indicators, throughout survey participation.

This paper focuses on an experimental study conducted on the CROss-National Online Survey (CRONOS) panel, a probability-based online panel established by the European Social Survey in Estonia, Slovenia, and the UK. Panel members were randomly assigned to one of three experimental groups: 1) a standard introduction to the web questionnaire, 2) an introduction including a motivational text emphasising the importance of obtaining accurate answers, and 3) an introduction including the motivational text and requesting respondents to explicitly state their commitment to providing accurate answers.
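For illustration only, a minimal sketch of how such a three-arm random assignment might be implemented is given below; the condition labels, balancing scheme, and panel IDs are assumptions, not the CRONOS panel’s documented procedure.

```python
import random

# Illustrative labels for the three experimental conditions described above.
CONDITIONS = [
    "standard_intro",                # 1) standard introduction
    "motivational_intro",            # 2) + text stressing accurate answers
    "motivational_plus_commitment",  # 3) + explicit commitment request
]

def assign_conditions(panel_ids, seed=1):
    """Randomly assign panel members to conditions in roughly equal groups."""
    rng = random.Random(seed)
    ids = list(panel_ids)
    rng.shuffle(ids)
    # Cycle through the conditions over the shuffled list to keep groups balanced.
    return {pid: CONDITIONS[i % len(CONDITIONS)] for i, pid in enumerate(ids)}

# Example: assign nine hypothetical panel members.
print(assign_conditions(range(101, 110)))
```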

The study considers a comprehensive set of response quality indicators to assess the various potential effects of motivational messages and commitment statements. These include item nonresponse, non-substantive answers, survey break-off rates and locations, response times for individual questions, and analyses of reliability and latent structures. Additional variables are used as moderators to understand how the effects vary across respondents: in addition to basic socio-demographics, the effects are compared between countries, between early and late respondents, by type of device and level of IT literacy, and by whether respondents were multitasking during survey participation. The performance of the studied motivational approaches is evaluated among the same respondents in two data collection waves, which contributes to understanding their usefulness and feasibility for panel studies.
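To make some of these indicators concrete, the following minimal Python sketch computes item nonresponse, non-substantive answer rates, and a simple nondifferentiation score from a small hypothetical response matrix; the variable names and answer codes (98/99 for ‘don’t know’/‘refused’) are assumptions, not the study’s actual coding scheme.

```python
import numpy as np
import pandas as pd

# Hypothetical wide-format data: one row per respondent, one column per item.
# Assumed codes: NaN = item nonresponse; 98/99 = non-substantive answers
# ("don't know" / "refused"). Grid (battery) items share the prefix "grid_".
NON_SUBSTANTIVE = {98, 99}

df = pd.DataFrame({
    "resp_id": [1, 2, 3],
    "q1":      [4, np.nan, 99],
    "q2":      [3, 2, 98],
    "grid_a":  [5, 3, 4],
    "grid_b":  [5, 2, 4],
    "grid_c":  [5, 4, 4],
})

items = [c for c in df.columns if c != "resp_id"]

# Item nonresponse rate: share of items a respondent left unanswered.
item_nonresponse = df[items].isna().mean(axis=1)

# Non-substantive answer rate: share of answered items coded don't know/refused.
non_substantive = df[items].isin(NON_SUBSTANTIVE).sum(axis=1) / df[items].notna().sum(axis=1)

# Simple nondifferentiation score on a grid battery: the standard deviation of
# a respondent's answers across the grid items (0 indicates straightlining).
grid_items = [c for c in items if c.startswith("grid_")]
nondifferentiation = df[grid_items].std(axis=1)

print(pd.DataFrame({
    "item_nonresponse": item_nonresponse,
    "non_substantive": non_substantive,
    "grid_sd": nondifferentiation,
}))
```

In this toy example the first respondent straightlines the grid (standard deviation of zero), while the third relies heavily on non-substantive codes; real analyses would of course use many more items and indicators.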