Abstract
We conduct an experimental analysis of a dataset comprising over 27 million microtasks, performed by over 70,000 workers, issued to a large crowdsourcing marketplace between 2012 and 2016. Using this data, never before analyzed in an academic context, we shed light on three crucial aspects of crowdsourcing: (1) Task design: helping requesters understand what constitutes an effective task, and how to go about designing one; (2) Marketplace dynamics: helping marketplace administrators and designers understand the interaction between tasks and workers, and the corresponding marketplace load; and (3) Worker behavior: understanding worker attention spans, lifetimes, and general behavior, for the improvement of the crowdsourcing ecosystem as a whole.
| Original language | English (US) |
|---|---|
| Pages (from-to) | 829-840 |
| Number of pages | 12 |
| Journal | Proceedings of the VLDB Endowment |
| Volume | 10 |
| Issue number | 7 |
| DOIs | |
| State | Published - 2017 |
| Event | 43rd International Conference on Very Large Data Bases, VLDB 2017, Munich, Germany. Duration: Aug 28 2017 → Sep 1 2017 |
ASJC Scopus subject areas
- Computer Science (miscellaneous)
- Computer Science (all)