
ESRA 2019 full program




Response Option Order Effects in Cross-Cultural Contexts

Session Organisers Dr Ana Villar (Facebook)
Dr Yongwei Yang (Google)
Time Friday 19th July, 13:30 - 14:30
Room D09

The order of response options may affect how people answer rating scale questions. A response option order effect is present when changing the order of the response options of a rating scale leads to differences in the distribution or functioning of individual questions or groups of questions. Theoretical interpretations, notably satisficing, memory bias and anchor-and-adjustment, have been used to explain and predict this effect under different conditions. Recent advances in visual design principles concerning “interpretive heuristics” (especially “left and top mean first” and “up means good”) add further insight into how the positioning of response options may affect answers when scales are presented visually. A number of studies have investigated the direction and size of response option order effects, but they present a complex picture, and most were conducted in a mono-cultural fashion. However, the presence and extent of response option order effects may be affected by cultural factors in several ways. First, interpretive heuristics such as “left means first” may work differently due to varying reading conventions (e.g., right-to-left in Arabic or Hebrew). Furthermore, people within cultures with multiple primary languages (e.g., Hebrew and English) and multiple reading conventions (e.g., in Japan, where text may be read left-to-right, top-down horizontally or top-down, right-to-left vertically) may respond to positioning heuristics differently. Respondents from different countries may also have varying degrees of exposure and familiarity with a specific type of visual design. Considering how the internet is consumed across countries, with much web content presented left-to-right regardless of language, it is conceivable that heavy users of online content are less susceptible to the impact of reading conventions. This session brings together papers on research investigating rating scale response option order effects across countries with different reading conventions and industry norms for answer scale designs, considering the device of completion as a moderating factor. The research will consider the effects of vertical vs. horizontal display, survey topic area, number of scale points, age, gender, and education level. The works will reflect a range of analytical approaches: distributional comparisons, latent structure modeling, and analysis of response latency. The session will provide guidance on best practices for presenting rating scales on smartphones, as well as for comparative analysis involving data obtained with different rating scale response option orders.
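As a purely illustrative sketch (not drawn from any of the session papers), the following Python snippet shows the simplest kind of distributional comparison the abstract mentions: testing whether answers to a single 5-point rating item differ between two response option orders. The data, scale length, and condition labels are all assumptions.

# Illustrative sketch only: simulated answers to one 5-point item under
# two response option orders, compared with a chi-square test on the
# response distributions and a Mann-Whitney U test as an ordinal check.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical response probabilities for the two orders
order_a = rng.choice([1, 2, 3, 4, 5], size=500, p=[0.05, 0.10, 0.20, 0.35, 0.30])
order_b = rng.choice([1, 2, 3, 4, 5], size=500, p=[0.08, 0.12, 0.25, 0.33, 0.22])

# Chi-square test on the full 2 x 5 table of response counts
counts = np.array([np.bincount(order_a, minlength=6)[1:],
                   np.bincount(order_b, minlength=6)[1:]])
chi2, p_chi2, _, _ = stats.chi2_contingency(counts)

# Mann-Whitney U test treating the scale as ordinal
u_stat, p_u = stats.mannwhitneyu(order_a, order_b, alternative="two-sided")

print(f"chi-square p = {p_chi2:.3f}, Mann-Whitney p = {p_u:.3f}")

The chi-square test compares the shape of the two response distributions, while the Mann-Whitney U test compares their ordinal location; the latent structure modeling and response latency analyses mentioned in the abstract go beyond this short sketch.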

Keywords: Cross-cultural survey research

Effects of Response Styles for Group-Differences in Psychosocial Scales: Evidence from the HRS

Miss Fernanda Alvarado-Leiton (University of Michigan) - Presenting Author
Miss Sunghee Lee (University of Michigan)

The use of Likert-type scales in survey research is extensive because of their ease of design and implementation. Nonetheless, there are caveats to their use, such as response styles (RS), which occur when respondents answer in a particular way regardless of the content of the question. This becomes a problem when different groups of respondents exhibit stronger (or weaker) RS tendencies, complicating comparisons across subpopulations. In cross-cultural research, comparisons of scale-derived scores are an everyday task; in fact, several surveys are designed with the sole goal of providing comparisons between different cultural groups or countries. With the aim of extending the evidence on how multi-culturalism affects measurement, we propose to examine the effect of RS (specifically extreme and acquiescent RS) on race-ethnicity subgroup comparisons in the United States context. The United States encompasses a multi-cultural and multi-ethnic population, and previous studies have documented differences in measurement between ethnic groups, finding evidence of acquiescent RS among Hispanics and extreme RS among African Americans. Consequently, this research aims to (1) assess the presence and prevalence of RS across racial groups in Likert scales, (2) apply statistical adjustments (for example, confirmatory factor analysis, latent class analysis, and OLS regression) to these scales to obtain adjusted mean score estimates at the racial group level, and (3) assess how using a particular adjustment impacts group comparisons. To achieve this, I will examine three scales from the Health and Retirement Study (HRS) self-administered questionnaire (SAQ) in two waves of data collection (2014 & 2016). The contribution of this research lies in extending the definition of culture by examining subgroups within a country. Furthermore, it provides evidence of the reach and limitations of different methods to adjust scale scores, prompting researchers to examine carefully the tools available for this goal.
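As a purely illustrative sketch with simulated data (not the HRS), the snippet below shows one simple form the adjustments described above could take: count-based extreme (ERS) and acquiescent (ARS) response style indices entered as covariates in an OLS regression of a scale score on group membership. The 5-point coding, group labels, and variable names are assumptions; the paper's actual adjustments (confirmatory factor analysis, latent class analysis) are more elaborate.

# Illustrative sketch only: simple ERS/ARS indices and an OLS-adjusted
# comparison of group means on a Likert scale score (hypothetical data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n, k = 600, 8  # respondents, Likert items (1 = strongly disagree ... 5 = strongly agree)
items = pd.DataFrame(rng.integers(1, 6, size=(n, k)),
                     columns=[f"item{i}" for i in range(k)])
group = rng.choice(["A", "B", "C"], size=n)  # placeholder group labels

df = pd.DataFrame({
    "group": group,
    "scale_score": items.mean(axis=1),
    # ERS: share of answers at either endpoint; ARS: share of agreement answers
    "ers": items.isin([1, 5]).mean(axis=1),
    "ars": items.isin([4, 5]).mean(axis=1),
})

unadjusted = smf.ols("scale_score ~ C(group)", data=df).fit()
adjusted = smf.ols("scale_score ~ C(group) + ers + ars", data=df).fit()
print(unadjusted.params, adjusted.params, sep="\n")

Comparing the group coefficients with and without the RS covariates illustrates point (3) of the abstract: how a particular adjustment can change the apparent group differences.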


Response Option Order Effects in Cross-Cultural Context: An Experimental Investigation

Dr Yongwei Yang (Google, Inc.) - Presenting Author
Dr Rich Timpone (Ipsos Science Center)
Dr Mario Callegaro (Google, Inc.)
Ms Marni Hirschorn (Ipsos)

A response option order effect occurs when different orders of rating scale response options lead to different distributions or functioning of survey questions. Theoretical interpretations, notably satisficing, memory bias (Krosnick & Alwin, 1987) and anchor-and-adjustment (Yan & Keusch, 2015), have been used to explain such effects. Visual interpretive heuristics (especially “left-and-top-mean-first” and “up-means-good”) may also provide insight into how the positioning of response options affects answers (Tourangeau, Couper, & Conrad, 2004, 2013). Most existing studies of response option order effects were conducted in mono-cultural settings. However, the presence and extent of the effect may be influenced by “cultural” factors in several ways. First, interpretive heuristics such as “left-means-first” may work differently due to varying reading conventions (e.g., left-to-right vs. right-to-left). Furthermore, people within cultures with multiple primary languages and multiple reading conventions might possess different positioning heuristics. Finally, respondents from different countries may have varying degrees of exposure and familiarity with a specific type of visual design. In this experimental study, we investigate rating scale response option order effects across three countries with different reading conventions and industry norms for answer scale designs: the US, Israel, and Japan. The between-subject factor of the experiment consists of four combinations of scale orientation (vertical and horizontal) and the positioning of the positive end of the scale. The within-subject factors are question topic area and the number of scale points. The effects of device (smartphone vs. desktop computer/tablet), age, gender, education, and the degree of exposure to left-to-right content will also be evaluated. We incorporate a range of analytical approaches: distributional comparisons, analysis of response latency and paradata, and latent structure modeling. We will discuss implications for choosing response option orders for mobile surveys and for comparing data obtained with different response option orders.
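As a purely illustrative sketch (not the authors' implementation), the snippet below lays out the four between-subject conditions described in the abstract, scale orientation crossed with the position of the positive end, and a simple balanced random assignment within each of the three countries. Respondent counts, seeds, and the assignment scheme are assumptions.

# Illustrative sketch only: 2 x 2 between-subject conditions and a
# balanced random assignment within each country.
import itertools
import random

conditions = list(itertools.product(
    ["vertical", "horizontal"],            # scale orientation
    ["positive_first", "positive_last"],   # position of the positive end of the scale
))

def assign(respondent_ids, seed=1):
    """Randomly assign respondents to the four conditions in (near-)equal shares."""
    rng = random.Random(seed)
    ids = list(respondent_ids)
    rng.shuffle(ids)
    return {rid: conditions[i % len(conditions)] for i, rid in enumerate(ids)}

for i, country in enumerate(["US", "Israel", "Japan"]):
    plan = assign(range(1000), seed=i)  # placeholder sample size per country
    counts = {c: sum(v == c for v in plan.values()) for c in conditions}
    print(country, counts)

Crossing the resulting condition labels with the within-subject factors (topic area, number of scale points) and the moderators named in the abstract (device, age, gender, education, exposure to left-to-right content) would then set up the comparisons the study describes.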