Thursday 18th July 2013, 14:00 - 15:30, Room: No. 7

The Use of Probing Questions to Evaluate Items in Intercultural Research

Convenor: Professor Michael Braun (GESIS - Leibniz Institute for the Social Sciences)
Coordinator 1: Dr Dorothée Behr (GESIS - Leibniz Institute for the Social Sciences)

Session Details

Equivalence of measurement across countries is a necessary prerequisite for intercultural research. The traditional way to establish measurement invariance is to apply one or more data-analytic approaches. However, most of these approaches (e.g., multigroup structural equation modeling) only help to decide whether measurement invariance holds or not; they do not get at the causes of problems with functional equivalence.
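
As a rough sketch of what such a data-analytic check involves (standard multigroup confirmatory factor analysis notation assumed here, not taken from the session description), each observed item i in group g is modeled as

    x_{ig} = \tau_{ig} + \lambda_{ig} \xi_{g} + \varepsilon_{ig}

and invariance is tested by comparing nested models: a configural model (same factor structure in every group), a metric model that constrains the loadings (\lambda_{ig} = \lambda_i for all g), and a scalar model that additionally constrains the intercepts (\tau_{ig} = \tau_i for all g). A significant loss of fit signals that invariance fails, but says nothing about why it fails - which is precisely the gap that probing techniques address.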

When it comes to identifying the causes of non-equivalence, probing techniques are an ideal device. After all, they allow researchers to discover what respondents - across countries - have in mind when answering survey questions. Knowing the potential causes of non-equivalence is an important step towards improving measurement instruments for future use in intercultural research.

For this session, we invite papers on probing in intercultural research. Such probing may take place as part of source questionnaire development or as part of translation testing. Alternatively, probing may come in as a follow-up to an already fielded survey to shed light on statistically suspicious data. Furthermore, the probing may take place during cognitive interviewing or through embedded follow-up questions at the main data collection stage itself (regardless of mode). Papers are welcome both on substantive findings and on methodological challenges and considerations.


Paper Details

1. Probing in cross-cultural survey research - status quo and outlook

Dr Dorothée Behr (GESIS)

Equivalence of measurement is a prerequisite for the sound use of cross-cultural survey data. Equivalence cannot be assumed, though, but must be both assured and assessed. Probing, understood in this context as follow-up open-ended questions asked by an interviewer to collect additional information on respondents' answers, is a suitable means both to assure equivalence when developing cross-cultural measurement instruments and to assess equivalence of the collected "real" survey data. Probes are asked in cognitive interviews (Willis, 2005), in respondent debriefing sessions (DeMaio & Rothgeb, 1996), as part of "random probes" (Schuman, 1966), or as part of web probing (Behr, Braun, Kaczmirek, & Bandilla, 2012), to give a rough classification of different probing scenarios. In this paper, I will first examine these different probing scenarios and compare them in terms of goals, actors involved, case numbers, etc. Second, I will look at the status quo of these probing scenarios in cross-cultural survey research. Third, I will discuss the potential of probing in cross-cultural research against the backdrop of calls for more mixed-method studies in intercultural research (van de Vijver & Chasiotis, 2010).


2. Integrating cognitive interviewing and DIF analysis to uncover causes of non-equivalence in cross-cultural research: A mixed methods research approach

Dr Jose-luis Padilla (University of Granada)
Dr Isabel Benitez (University of Granada)

Cognitive interviewing (CI) is a qualitative method used to fix survey questions or to capture what a question is truly measuring. Differential Item Functioning (DIF) analysis, on the other hand, is one of the methods most frequently used to determine whether metric equivalence has been achieved in cross-cultural assessments. However, DIF results are frequently hard to interpret. The aim of this paper is to present a study in which we conducted mixed-methods research integrating findings from cognitive interviewing and expert appraisals with DIF results. The evidence was combined to draw conclusions about differences between American and Spanish participants who responded to different versions of the Student Questionnaire of the Program for International Student Assessment (PISA, OECD, 2006).
DIF analyses were applied to 9,800 responses (4,900 from American participants and 4,900 from Spanish participants) using Differential Step Functioning (DSF) and Ordinal Logistic Regression (OLR). The DIF results pointed to 8 items with large DIF across conditions. In addition, 44 cognitive interviews (20 in the USA and 24 in Spain) were conducted, and 8 experts also provided qualitative evidence on problematic elements in items that could threaten comparability across groups. The results provided information about the types of differences across groups, which were generally related to the translation process and specifically to differences in concept characteristics, concept meanings, and contextual issues. Finally, the contributions of mixed-methods research to uncovering causes of non-equivalence in cross-cultural research will be discussed.
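
For orientation, the Ordinal Logistic Regression step mentioned above is commonly run as a likelihood-ratio comparison of nested proportional-odds models (matching score, group, and their interaction). The sketch below is a minimal illustration on simulated data, not the authors' PISA analysis; the data frame layout, the built-in DIF effect, and the helper loglik are assumptions made for the example.

    import numpy as np
    import pandas as pd
    from scipy.stats import chi2
    from statsmodels.miscmodels.ordinal_model import OrderedModel

    # Simulated stand-in for item-level data (assumption for illustration):
    #   item  - ordinal response to the studied item (0-3)
    #   total - matching score built from the remaining items of the scale
    #   group - 0 = reference country, 1 = focal country
    rng = np.random.default_rng(2013)
    n = 2000
    total = rng.normal(0.0, 1.0, n)
    group = rng.integers(0, 2, n)
    latent = 1.2 * total + 0.5 * group + rng.logistic(0.0, 1.0, n)  # uniform DIF built in
    item = pd.cut(latent, bins=[-np.inf, -1.0, 0.5, 2.0, np.inf], labels=False)
    df = pd.DataFrame({"item": item, "total": total, "group": group})
    df["total_x_group"] = df["total"] * df["group"]

    def loglik(predictors):
        # Proportional-odds (ordinal logistic) model of the item on the given predictors.
        res = OrderedModel(df["item"], df[predictors], distr="logit").fit(method="bfgs", disp=False)
        return res.llf

    ll_base = loglik(["total"])                            # matching score only
    ll_unif = loglik(["total", "group"])                   # + group: uniform DIF
    ll_nonu = loglik(["total", "group", "total_x_group"])  # + interaction: non-uniform DIF

    lr_unif = 2 * (ll_unif - ll_base)
    lr_nonu = 2 * (ll_nonu - ll_unif)
    print(f"uniform DIF:     LR chi2(1) = {lr_unif:.2f}, p = {chi2.sf(lr_unif, 1):.4f}")
    print(f"non-uniform DIF: LR chi2(1) = {lr_nonu:.2f}, p = {chi2.sf(lr_nonu, 1):.4f}")

A large likelihood-ratio statistic for the group term flags uniform DIF, and a large statistic for the interaction flags non-uniform DIF; the cognitive interviews and expert appraisals are then what explain why a flagged item behaves differently across groups.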


4. What do respondents mean when they indicate to be "citizens of the world"? Using probing questions to elucidate international differences in cosmopolitanism

Professor Michael Braun (GESIS)
Dr

Measurement of cosmopolitan attitudes has proved to be challenging. A direct measure targeting the identification of respondents ("identity measure") has sometimes been found to be inferior to a composite measure in which the stances of respondents towards a broad array of pertinent attitudes are taken into consideration. To explain unexpected findings, a mixed-methods approach has been suggested to find out what lies behind this global identity, that is, what people have in mind when they indicate that they are "citizens of the world" (e.g. Pichler 2012). To the best of our knowledge, qualitative evidence for such an item has not yet been collected.

We report results from a probing study in which the closed question of the Eurobarometer ("... to what extent do you personally feel you are ... a citizen of the world") was followed by a category-selection probe ("Please tell us why you feel [to a great extent / somewhat / not really / not at all] that you are a citizen of the world"). The data come from Web surveys conducted in Canada (English-speaking only), Denmark, Germany, Hungary, Spain, and the U.S. in 2011. Respondents were drawn from nonprobability online panels; any generalizations to the entire populations therefore have to be treated with extreme caution. However, we use an acceptable replication of the country patterns found in the Eurobarometer data (for the European countries only) as a precondition for using the Web survey to shed light on the Eurobarometer data.