
Can Crowdsourcing Improve HCI?

Author(s)

Serge M. Egelman

Abstract

Much of HCI research focuses on improving the user experience using data from human subjects experiments. Designing a laboratory study, observing participants, and compensating them is expensive in both time and money. Because of these costs, sample sizes tend to be relatively small, which in turn limits how precisely effect sizes can be estimated. However, new crowdsourcing platforms, such as Amazon's Mechanical Turk, allow researchers to conduct human subjects experiments in much less time, with much larger sample sizes, and at lower cost. In this paper I describe several studies that I have performed using crowdsourcing, some prior to joining NIST, and explain how they would have been time and cost prohibitive without crowdsourcing technologies.
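The abstract's point about sample size and precision can be illustrated with a standard confidence-interval calculation (not taken from the paper; the success rates and group sizes below are hypothetical):

```python
import math

def ci_half_width(p1, p2, n, z=1.96):
    """Approximate 95% CI half-width for a difference in task-success
    proportions between two groups of n participants each
    (normal approximation)."""
    se = math.sqrt(p1 * (1 - p1) / n + p2 * (1 - p2) / n)
    return z * se

# Hypothetical success rates for two interface designs.
p_a, p_b = 0.60, 0.75

lab   = ci_half_width(p_a, p_b, n=30)    # typical lab-study group size
crowd = ci_half_width(p_a, p_b, n=1000)  # crowdsourced group size

print(f"n=30:   +/- {lab:.3f}")    # roughly +/- 0.23
print(f"n=1000: +/- {crowd:.3f}")  # roughly +/- 0.04
```

With 30 participants per condition, the uncertainty around the observed 15-point difference is larger than the difference itself; at crowdsourced scale the estimate becomes usable, which is the trade-off the paper highlights.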
Proceedings Title
Proceedings of The Human Computer Interaction Consortium (HCIC) 2011 Workshop
Conference Dates
June 14-18, 2011
Conference Location
Monterey, CA
Conference Title
The Human Computer Interaction Consortium (HCIC) 2011 Workshop

Keywords

Crowdsourcing, HCI methods, user studies

Citation

Egelman, S. (2011), Can Crowdsourcing Improve HCI?, Proceedings of The Human Computer Interaction Consortium (HCIC) 2011 Workshop, Monterey, CA (Accessed October 31, 2024)

Issues

If you have any questions about this publication or are having problems accessing it, please contact reflib@nist.gov.

Created June 18, 2011, Updated February 19, 2017