@uniwue_info3

MicroTrails: Comparing Hypotheses about Task Selection on a Crowd Sourcing Platform

International Conference on Knowledge Technologies and Data-driven Business (I-KNOW), Graz, Austria, October 2015

Abstract

To optimize the workflow on commercial crowdsourcing platforms like Amazon Mechanical Turk or Microworkers, it is important to understand how users choose their tasks. Current work usually explores the underlying processes through user studies based on surveys with a limited set of participants. In contrast, we formulate hypotheses based on the findings of these studies and, instead of verifying them through user feedback, compare them directly on data from a commercial crowdsourcing platform. For evaluation, we use a Bayesian approach called HypTrails, which allows us to give a relative ranking of the corresponding hypotheses. The hypotheses considered are based, for example, on task categories, monetary incentives, or the semantic similarity of task descriptions. We find that, in our scenario, hypotheses based on employers as well as on task descriptions work best. Overall, we objectively compare different factors influencing users when choosing their tasks. Our approach enables crowdsourcing companies to better understand their users in order to optimize their platforms, e.g., by incorporating the gained knowledge about these factors into task recommendation systems.
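HypTrails compares hypotheses by expressing each one as a Dirichlet prior over the transition probabilities of a first-order Markov chain of user moves, then ranking hypotheses by the marginal likelihood (evidence) of the observed transitions. The sketch below illustrates this idea on toy data; the counts, the "same employer" hypothesis matrix, and the concentration factor `k` are all hypothetical and not taken from the paper.

```python
from math import lgamma

def log_evidence(transitions, prior, k=1.0):
    """Log marginal likelihood of observed transition counts under a
    Dirichlet prior elicited from a hypothesis matrix (rows sum to ~1),
    scaled by a concentration factor k, in the spirit of HypTrails."""
    total = 0.0
    for n_row, p_row in zip(transitions, prior):
        # Hypothesis beliefs become Dirichlet pseudo-counts.
        alpha = [k * p + 1.0 for p in p_row]
        a0, n0 = sum(alpha), sum(n_row)
        # Dirichlet-multinomial evidence for this row of the chain.
        total += lgamma(a0) - lgamma(a0 + n0)
        for a, n in zip(alpha, n_row):
            total += lgamma(a + n) - lgamma(a)
    return total

# Toy scenario: 3 tasks; observed transitions favor same-employer tasks.
counts = [[8, 1, 1], [1, 8, 1], [2, 2, 6]]
hyp_employer = [[0.8, 0.1, 0.1],
                [0.1, 0.8, 0.1],
                [0.1, 0.1, 0.8]]        # hypothetical "same employer" belief
hyp_uniform = [[1 / 3] * 3] * 3         # uninformed baseline

# The hypothesis matching the data yields higher evidence.
print(log_evidence(counts, hyp_employer, k=10)
      > log_evidence(counts, hyp_uniform, k=10))  # → True
```

Comparing evidences at several values of `k` (the strength of belief in a hypothesis) is what yields the relative ranking the abstract refers to.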

Links and resources

Tags

community