
ESRA 2023 Program

All time references are in CEST

Analyzing Open-Ended Questions in Survey Research 1

Session Organiser: Dr Alice Barth (Department of Sociology, University of Bonn)
Time: Wednesday 19 July, 09:00 - 10:30
Room: U6-05

Open-ended questions in surveys provide information on respondents’ personal perspectives and on their interpretation and understanding of concepts. In addition, respondents get an opportunity to give individualized feedback. Open-ended questions are almost indispensable for collecting data on issues that are too diverse for standardized questions, e.g. job designations. While many researchers used to refrain from open-ended questions due to the arduous process of transcribing and coding responses, computer-assisted data collection and software solutions for analyzing large amounts of text data can now accelerate this work considerably. Nevertheless, in dealing with open-ended questions, researchers need a specific methodological toolbox adapted to analyzing unstructured text data.

This session aims to discuss methods for processing and analyzing responses to open-ended questions. Topics of particular interest include, for example,
- coding responses to open-ended questions (manual, semi-automated or automated coding)
- text mining / natural language processing approaches
- qualitative content analysis of responses
- data quality and mode effects in open-ended questions
- open-ended probes as a means of evaluating comparability, validity and comprehension of questions
- analyzing respondent feedback on the survey process
- using information from open-ended questions to complement or contradict results from standardized questions
We look forward to contributions that highlight the methodological and/or substantive potential of open-ended questions in survey research.

Keywords: open-ended questions; data quality; text mining; text data; content analysis

Papers

Perception of the Social Composition in the Residential Environment

Professor Sören Petermann (Ruhr University Bochum) - Presenting Author

How people perceive similarity to and difference from their fellow human beings is often debated in the social sciences, for example in the context of in-group and out-group attributions or ethnic demarcations. These perceptions gain social relevance when they are incorporated into cooperative or discriminatory action. One example is Schelling's segregation model, according to which household mobility behaviour is explicitly based on the social composition of the neighbourhood. Closed survey questions usually presuppose a group classification, e.g. migrants or Muslims, and are therefore criticised because decisive processes of individual demarcation are presupposed rather than investigated.
This is the point of departure for this presentation, which deals with the subjective perception of the social composition of the living environment among residents of large cities in western Germany. First, to what extent do residents perceive homogeneity and heterogeneity in their living environment? Second, what do these perceptions refer to? Segregation models usually invoke ethnic and social lines of differentiation - are these reflected in the perceptions, or are demographic and religious differences important? Third, against the background of ecological reliability, do the subjective perceptions correspond to objective data on population composition from municipal statistics as well as to the assessments of primary researchers?
The survey data analysed were collected as part of a study on diversity and social interactions in German cities. Empirically, it is shown that migration-related diversity in particular is perceived, but that the perceptions of residents of the same neighbourhood are often not uniform and show more or less pronounced variance. For indicators of social and ethnic composition, there are clear deviations between perceptions and statistical data, not only globally but also at the neighbourhood level.


Who is considered a migrant in Eastern Europe? Evidence from Estonia and Latvia

Dr Christian Czymara (Tel Aviv University) - Presenting Author
Professor Anastasia Gorodzeisky (Tel Aviv University)
Professor Inna Leykin (The Open University of Israel)

Since the fall of the Iron Curtain, post-socialist Central and Eastern European countries have received increasing attention from scholars of international migration in Europe. One of the unique characteristics of the region is that historical migration within the former socialist federations - that is, within one sovereign state - has often been retroactively interpreted as movement across independent nation-states. Such patterns may in turn be reflected in citizens' perceptions of immigrants. This paper analyzes whom people in two independent post-socialist states, namely Estonia and Latvia, imagine as an immigrant, based on novel, nationally representative survey data collected in Fall 2022. We use quantitative text analysis on 829 unique responses to an open-ended survey question asking which group comes to a respondent’s mind first when thinking about immigrants in their country. As can be expected, results show that Russia’s war against Ukraine and the resulting migration streams are heavily reflected in the image of an immigrant in these countries: 41 percent of all respondents mentioned terms related to Ukraine. Yet other characteristics respondents frequently pointed out include a search for ‘work’ or ‘a better life’. Algorithm-based topic modeling reveals that the war in Ukraine was significantly more present in the Latvian data, while descriptions of immigrants based on skin color occurred more often in Estonia. Humanitarian concerns were mentioned about equally in both countries, but the topic of Russian emigration was more present in Latvia. The Soviet Union/USSR was mentioned by relatively few respondents in both countries. In sum, there is considerable heterogeneity in Estonians’ and Latvians’ images of immigrants. Next steps include comparing these findings with closed-ended items, analyzing differences between educational and age groups, examining temporal variation, and adding data currently being collected in Lithuania.


Understanding the Impact of Open-Ended White and Black Origin Write-In Response Coding on Respondent Race and Ethnicity Profiles

Mr Todd Hughes (University of California Los Angeles) - Presenting Author
Dr Ninez Ponce (University of California Los Angeles)
Mr Andrew Juhnke (University of California Los Angeles)
Ms Parneet Ghuman (University of California Los Angeles)
Ms Jiyoun Yoo (University of California Los Angeles)

Starting with the 2021 version of the California Health Interview Survey (CHIS), all respondents who self-identified as White or Black in the initial race question were asked a follow-up, open-ended question about their origins, following the model used for the 2020 United States Census. These detailed entries sometimes complemented or contradicted their responses to the checkbox categories, leading to modifications of their race group responses during data processing. Using CHIS 2021 data, we examine CHIS’ race coding process for the White and Black origin write-in questions to determine the impact of these new questions on the final race profile of CHIS respondents. Almost 1.5% of CHIS adult respondents had their race variable values changed from White or Black to a different race based on entries to the open-ended follow-up questions. The CHIS 2021 data also show an over 9% increase in Hispanic/Latino respondents identifying as more than one Hispanic/Latino subtype due to the coding of the White and Black write-in responses.

We also explore the scope of information obtained from these write-in questions. Nearly half of those who identified as White provided at least two origins, while only around 4% of those who identified as Black did so. Around 14% of White and 8% of Black respondents did not provide an origin response, or provided one that could not be coded properly.

Overall, we analyze how the inclusion of the White and Black origin write-in questions and the race variable processing decisions led to changes in the racial/ethnic profile of CHIS 2021 respondents. This may also add context to analyses of 2020 U.S. Census race data and of other surveys collecting information on respondents’ race and ethnicity.


How Far Does the Method of Web Probing Travel? Applying the Approach in India and the U.S.

Ms Ingrid Arts (Utrecht University) - Presenting Author
Dr Katharina Meitinger (Utrecht University)
Professor Rens van de Schoot (Utrecht University)

Open-ended questions can be a valuable addition to closed questions, especially in web surveys. When asked as cognitive probes in the context of web probing, they provide information on the validity and comparability of target questions.

However, previous methodological research on web probing was predominantly conducted in Western countries, such as European countries and the US. It remains an open question whether established web probing approaches also work in non-Western countries, such as India.
The current probing approach may not perform as well in these contexts due to differences in, for example, communication styles.

We report findings from a nonprobability web survey on attitudes towards environmental change conducted in India and the U.S. in December 2022 (N=1,200). In this survey, we replicated questions from the World Values Survey (5th wave) on attitudes towards environmental change. We asked respondents closed questions about how they perceived different local and global environmental problems and whether they were willing to pay money to prevent or solve environmental problems. Respondents received category selection, comprehension, and specific probes throughout the survey.

In this presentation, we compare the performance of established web probing approaches in India and the U.S. using different indicators, such as the number of themes, the degree of elaboration (i.e. provision of examples), item nonresponse, and response time. We will also assess whether performance differs by probe type.