
ESRA 2023 Program

              



All time references are in CEST

Analyzing Open-Ended Questions in Survey Research 3

Session Organiser: Dr Alice Barth (Department of Sociology, University of Bonn)
Time: Wednesday 19 July, 14:00 - 15:00
Room: U6-21

Open-ended questions in surveys provide information on respondents’ personal perspectives, their interpretation and understanding of concepts. In addition, respondents get an opportunity for individualized feedback. Open-ended questions are almost indispensable in collecting data on issues that are too diverse for standardized questions, e.g. job designations. While many researchers used to refrain from open-ended questions due to the arduous process of transcribing and coding responses, the rise of computer-assisted data collection and software solutions for analyzing large amounts of text data can now accelerate the work process. Nevertheless, in dealing with open-ended questions, researchers need a specific methodological toolbox that is adapted to analyzing unstructured text data.

This session aims at discussing methods for processing and analyzing responses to open-ended questions. Topics of particular interest are, for example,
- coding responses to open-ended questions (manual, semi-automated or automated coding)
- text mining / natural language processing approaches
- qualitative content analysis of responses
- data quality and mode effects in open-ended questions
- open-ended probes as a means of evaluating comparability, validity and comprehension of questions
- analyzing respondent feedback on the survey process
- using information from open-ended questions to complement or contradict results from standardized questions
We are looking forward to contributions that highlight the methodological and/or substantive potential of open-ended questions in survey research.

Keywords: open-ended questions; data quality; text mining; text data; content analysis

Papers

Comparison between open-ended and standardised responses in an email longitudinal survey

Professor Alessandra Decataldo (Università degli studi di Milano Bicocca)
Professor Brunella Fiore (Università degli studi di Milano Bicocca)
Dr Noemi Novello (Università degli studi di Milano Bicocca)
Mr Federico Paleardi (Università degli studi di Milano Bicocca) - Presenting Author
Dr Sara Recchi (Università degli studi di Milano Bicocca)

This contribution aims to discuss the methodological and substantive implications of open-ended responses, in comparison with standardised responses, within a longitudinal study on tutors of host organisations involved in Projects for Transversal Competences and Orientation (PCTOs), an Italian education policy. PCTOs offer high-school students a period of practical experience within an organisation, educating and orienting them by promoting on-the-job training. The email survey, carried out in two waves one year apart, investigates the perspective of tutors in organisations and institutions on PCTO design, objectives, implementation, and practical functioning.
In this paper we consider two specific questions: one relates to the main activities carried out by the host organisation; the other refers to the professional role of the respondent. Methodologically, the choice was motivated by the impossibility of capturing the broad spectrum of possible activities in a structured question, which would have resulted in the loss of valuable information. The limited population (367 projects) allowed manual coding of the answers in both waves. In addition, the second wave introduced standardised response options alongside the open-ended questions: within a triangulation framework, the aim is to analyse convergence and discrepancies in responses. Having two different response modalities gives us the chance to assess the quality of the manual coding procedure. The comparison is also substantively relevant, as it allows us to assess variations over time, in both the manually coded and the standardised answers, regarding the activities carried out within host organisations. Drawing on in-depth interviews with a sub-sample of 15 tutors, possible discrepancies in responses can also be addressed with additional qualitative data. This provides insights into the respondents' professional paths, particularly in their role as tutors. Furthermore, tutors' activities are compared with other variables regarding the PCTOs' implementation processes.
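
Purely as an illustration of the kind of convergence check described above (not the authors' analysis; the data frame, category labels, and variable names are invented), one could cross-tabulate the manually coded open-ended answers against the standardised answers and quantify their agreement, for instance with Cohen's kappa:

```python
import pandas as pd
from sklearn.metrics import cohen_kappa_score

# Hypothetical wave-2 data: one row per tutor, with the manually coded
# open-ended answer and the standardised answer on the same variable
# (main activity of the host organisation); categories are invented.
df = pd.DataFrame({
    "coded_open":   ["manufacturing", "services", "services", "public_admin"],
    "standardised": ["manufacturing", "services", "public_admin", "public_admin"],
})

# Cross-tabulation shows where the two response modalities converge or diverge
print(pd.crosstab(df["coded_open"], df["standardised"]))

# Cohen's kappa as a chance-corrected agreement measure between modalities
print("kappa:", round(cohen_kappa_score(df["coded_open"], df["standardised"]), 2))
```

In a check of this kind, a low kappa or an asymmetric cross-table would flag categories where the manual coding scheme and the standardised options diverge.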


Exploring the structure of text data in open-ended survey questions with multiple correspondence analysis and topic modeling

Ms Rebekka Atakan (University of Bonn, Department of Sociology) - Presenting Author

Open-ended survey questions are a great way for survey participants to express their opinions in their own words. However, analyzing the responses can be very time-consuming. In recent years, forms of automated text analysis have been introduced to speed up the analysis of open-ended questions.

Topic modeling is a popular method for the automated analysis of unstructured text data: it clusters similar words and thereby detects abstract topics that occur in the document corpus. Topic modeling was originally developed for the analysis of longer documents; therefore, a debate has emerged around adequate topic modeling procedures for the time-saving automated analysis of short texts such as responses to open-ended survey questions (Roberts et al. 2014; Finch et al. 2018; Pietsch and Lessmann 2018; Nanda et al. 2021).

In this presentation, we compare topic modeling to an alternative but less frequently discussed method for analyzing open-ended survey questions: multiple correspondence analysis (MCA). While topic modeling aims at clustering similar words into topics, MCA is a scaling technique for categorical variables (Le Roux and Rouanet 2010; Husson and Josse 2014). In addition to detecting similarities, it therefore allows the relational structure of the terms used by the participants to be visualized. With MCA, it is possible to map the relationship of all terms to each other, meaning that contrasting subject matters can also be identified.

Using data from the fifth wave of the Cologne Dwelling panel, we will analyze the responses to two open-ended questions on respondents' perception of their neighborhood with topic modeling and MCA. The strengths and weaknesses of both procedures will be compared, and we will discuss ways to integrate the results.
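
As a rough, purely illustrative sketch of the two approaches compared above (not the authors' pipeline: the example responses, parameter choices, and the use of scikit-learn's LDA and the prince package's MCA interface are all assumptions), short open-ended answers could be fed both to a topic model and to an MCA on their term indicators:

```python
import pandas as pd
import prince  # MCA implementation; interface assumed from the prince package
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Invented example responses to an open-ended neighborhood question
responses = [
    "quiet street, friendly neighbours, lots of green space",
    "too much traffic and noise, but good public transport",
    "friendly neighbours and many playgrounds for children",
    "noise from the main road, hardly any green space",
]

# Document-term matrix (bag of words)
vec = CountVectorizer(stop_words="english")
dtm = vec.fit_transform(responses)
terms = vec.get_feature_names_out()

# --- Topic modeling: LDA clusters co-occurring terms into latent topics ---
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(dtm)
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"topic {k}: {top}")

# --- MCA: scale the binary term indicators and map terms and respondents ---
# Each term becomes a categorical (present/absent) variable per respondent.
indicators = pd.DataFrame((dtm.toarray() > 0).astype(str), columns=terms)
mca = prince.MCA(n_components=2).fit(indicators)
print(mca.column_coordinates(indicators))  # term map
print(mca.row_coordinates(indicators))     # respondent map
```

In a sketch like this, the LDA output groups co-occurring terms into topics, while the MCA coordinates place terms and respondents in a common low-dimensional space, so contrasting subject matters can end up on opposite sides of an axis.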