
ESRA 2019 full program




Usability Testing in a Changing Survey Environment 2

Session Organiser: Mrs Emily Geisen (RTI International)
Time: Tuesday 16th July, 16:00 - 17:00
Room: D02

Technology is constantly evolving, which promotes innovation and advancements in the way we collect survey data. Due to these technological advances, the ways that respondents and interviewers interact with surveys are changing. For example, modern web surveys include features such as touch-screen input, videos/images, GPS, sensors, voice recognition and text-to-speech, dynamic error messages, and other capabilities. Each of these features changes the respondent-survey interaction, which can affect the quality of the data collected. As a result, usability testing is critical to ensure that respondents can complete surveys accurately, efficiently and with satisfaction. Technological advances have also affected the methods available for conducting usability testing, such as improved eye-tracking equipment and remote, unmoderated user testing. This session invites presentations that either showcase usability testing of surveys with technological advances or demonstrate innovative methods for conducting usability testing on surveys. We particularly invite presentations addressing (1) usability testing of surveys with advanced technological features (e.g., sensors, machine learning), (2) usability testing of new survey platforms (e.g., Blaise 5), (3) innovations or advances in existing usability methods, or (4) usability testing in multicultural contexts. We are also interested in studies that empirically demonstrate the utility and benefit of usability testing.

Keywords: Usability testing, eye-tracking, user experience, remote, unmoderated

A Multiple Method Approach to Testing a Complex Web-Based Establishment Survey Instrument

Miss Aryn Hernandez (U.S. Census Bureau) - Presenting Author
Miss Temika Holland (U.S. Census Bureau)
Miss Rebecca Keegan (U.S. Census Bureau)
Mrs Amy Anderson Riemer (U.S. Census Bureau)

Every five years the U.S. Census Bureau conducts an Economic Census (EC). This mandatory, self-administered establishment-based survey collects a wide range of financial and production data, such as payroll, employment, revenue by product type, and expenses. Between 2012 and 2017, the survey instrument for the EC was radically transformed from a downloaded software application to an entirely web-based platform. This new instrument was designed to meet the needs of the widest variety of companies, from small sole proprietorships with only one location to businesses with tens of thousands of locations. The respondent-facing portion of the instrument is composed of two sites, the Respondent Portal and the survey instrument known as Centurion, connected by a program that provides a seamless link. The Portal delivers options to respondents that were previously available only through telephone contact with survey clerks. It also offers respondents the ability to share the survey with others within their business, a useful feature for business surveys, where data are often gathered from multiple sources. In addition, the Centurion instrument offers multiple reporting methods, user-friendly review features, and ‘how-to’ videos.
To facilitate the leap to this complex, web-based instrument, researchers conducted multiple rounds of usability testing and respondent debriefings. The usability testing consisted of a mix of traditional methods, such as having respondents perform tasks while thinking aloud, and more modern methods, such as screen recordings. Additionally, the ‘how-to’ videos underwent separate cognitive testing. Respondent debriefings, conducted via phone and internet, provided insight into remaining issues experienced during actual reporting. Finally, researchers are utilizing paradata to identify features that performed successfully and those that may benefit from further testing. This presentation will highlight innovative interface design and system functionality, as well as selected methodology and findings from the various rounds of testing.


Usability Testing in Cross-National Surveys - Experiences with Different Approaches

Dr Sabine Springer (European Union Agency for Fundamental Rights) - Presenting Author
Dr Vida Beresneviciute (European Union Agency for Fundamental Rights)
Mr Sami Nevala (European Union Agency for Fundamental Rights)

The European Union Agency for Fundamental Rights (FRA) has extensive experience in conducting cross-national surveys, covering all EU Member States, to produce evidence-based advice on fundamental rights. Web surveys are used to collect reliable and comparable data on, for example, the discrimination experiences of hard-to-reach or hard-to-sample populations such as the Jewish community. FRA is currently preparing a survey that will give the general population the opportunity to provide information on their daily life experiences in relation to fundamental rights. The survey uses online data collection (CAWI) in some countries and computer-assisted self-interviewing (CASI) in others.

High-quality data can only be obtained if completing the survey on different devices, without help from an interviewer, does not affect the content or quality of the answers. In cross-national surveys, country-specific differences in how devices are used or in how the instructions for completing the survey are understood pose an additional risk to the quality and comparability of the data.

In preparation for FRA’s second survey on discrimination and hate crime against Jews, usability tests on different devices took place in six countries. For the Fundamental Rights Survey, different modes were tested during the cognitive testing phase in six countries, looking for country-specific and socio-demographic differences in the use of CAWI, CAPI and CATI. Before the pilot, which covered 28 EU Member States, a usability test took place in the UK. During the pilot implementation in all EU Member States, observations from the interviewers present during the CASI part were collected.

Based on these multiple experiences with web-based surveys, whether through CAWI or CASI, FRA has been able to test and evaluate different approaches to usability testing in a cross-national setting. The presentation offers an overview of the pros and cons of each of these approaches.


Cognitive and Usability Pretesting via Web-Conferencing Platforms: Comparison of Cost and Capabilities across Platforms

Mrs Emily Geisen (RTI International) - Presenting Author
Ms Amanda Smith (RTI International)
Ms Herschel Sanders (RTI International)
Mr Patrick Hsieh (RTI International)

Web-conferencing platforms, such as GoToMeeting, Cisco WebEx, Zoom, and BlueJeans, are useful for conducting cognitive and usability testing of computerized surveys. Benefits include: (1) remote testing (where participant and moderator are in different locations), which facilitates recruitment of geographically dispersed participants; (2) remote observation (where observers are in a different location from participants or moderators); and (3) screen recording. However, web-conferencing platforms vary considerably in the features and capabilities they offer. For example, some platforms require extensive downloads before users (e.g., cognitive/usability testing participants) are able to access the platform, whereas others simply require clicking on a link. Some platforms allow for sharing of mobile device screens – which is critical for pretesting mobile surveys – while others only allow screen sharing from laptops and desktops. Given the number of web-conferencing platforms available and the considerable variability across them, it can be difficult for researchers to determine which solution would work best for their survey pretesting needs. The purpose of this presentation is to identify a comprehensive list of web-conferencing platforms and summarize the key advantages and limitations of each with respect to cost and capability. We will specifically examine technical requirements (e.g., whether users must download extensions or can simply visit a website), accessibility (e.g., mobile vs. laptop/desktop, web-only vs. application-based), support for picture-in-picture (simultaneous face cam and screen view), audio quality and options, screen recording to the cloud, mouse tracking, security, and other capabilities to determine how well each platform performs in the context of survey pretesting. We will also examine publicly available user feedback ratings. Researchers can use this information to consider which web-conferencing features are needed for their study and to make an informed decision about which platform will best suit their needs and budget.