TY - JOUR
T1 - Optimal crowd-powered rating and filtering algorithms
AU - Parameswaran, Aditya
AU - Boyd, Stephen
AU - Garcia-Molina, Hector
AU - Gupta, Ashish
AU - Polyzotis, Neoklis
AU - Widom, Jennifer
PY - 2014
Y1 - 2014
N2 - We focus on crowd-powered filtering, i.e., filtering a large set of items using humans. Filtering is one of the most commonly used building blocks in crowdsourcing applications and systems. While solutions for crowd-powered filtering exist, they make a range of implicit assumptions and restrictions, ultimately rendering them not powerful enough for real-world applications. We describe two approaches to discard these implicit assumptions and restrictions: one that carefully generalizes prior work, leading to an optimal but oftentimes intractable solution, and another that provides a novel way of reasoning about filtering strategies, leading to a sometimes suboptimal but efficiently computable solution (that is asymptotically close to optimal). We demonstrate that our techniques lead to significant reductions in error for fixed cost over prior work in a novel crowdsourcing application: peer evaluation in online courses.
AB - We focus on crowd-powered filtering, i.e., filtering a large set of items using humans. Filtering is one of the most commonly used building blocks in crowdsourcing applications and systems. While solutions for crowd-powered filtering exist, they make a range of implicit assumptions and restrictions, ultimately rendering them not powerful enough for real-world applications. We describe two approaches to discard these implicit assumptions and restrictions: one that carefully generalizes prior work, leading to an optimal but oftentimes intractable solution, and another that provides a novel way of reasoning about filtering strategies, leading to a sometimes suboptimal but efficiently computable solution (that is asymptotically close to optimal). We demonstrate that our techniques lead to significant reductions in error for fixed cost over prior work in a novel crowdsourcing application: peer evaluation in online courses.
UR - http://www.scopus.com/inward/record.url?scp=84901773447&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84901773447&partnerID=8YFLogxK
U2 - 10.14778/2732939.2732942
DO - 10.14778/2732939.2732942
M3 - Article
AN - SCOPUS:84901773447
VL - 7
SP - 685
EP - 696
JO - Proceedings of the VLDB Endowment
JF - Proceedings of the VLDB Endowment
SN - 2150-8097
IS - 9
ER -