Abstract
Introduction: Conducting research online has become common in human participant research, notably in the field of human-computer interaction (HCI). Many researchers have used English-language, Western participant pools and recruitment platforms such as Amazon Mechanical Turk and Prolific, whose panel quality and representativeness are known to vary greatly. Less is known about non-English, non-Western options. We consider Japan, a nation that produces a significant portion of HCI research. We report on an evaluation of the widely used Yahoo! Crowdsourcing (YCS) recruitment platform.
Methods: We evaluated 65 data sets comprising N = 60,681 participants, primarily focusing on the 42 data sets with complete metadata from studies requiring earnest participation (n = 29,081).
Results: We found generally high completion (77.6%) and retention (70.1%) rates. Notably, studies using multimedia stimuli exhibited higher completion (97.7%) and retention (91.9%) rates. We also found that the “general” participant setting attracted middle-aged men.
Discussion: We offer guidelines for best practice, such as online questionnaire design strategies to increase data quality and filtering to capture a more representative audience. We reveal the nature, power, and limitations of YCS for HCI and other fields conducting human participant research online.
Information
Book title: Frontiers in Psychology
Volume: 16:1588579
Date of issue: 2025/08/21
Date of presentation: 2025/08/21
Citation: Katie Seaborn, Satoshi Nakamura. Quality and representativeness of research online with Yahoo! Crowdsourcing, Frontiers in Psychology, Vol. 16:1588579, 2025.