Resolvable vs. irresolvable disagreement: A study on worker deliberation in crowd work

M Schaekermann, J Goh, K Larson, E Law - Proceedings of the ACM on Human-Computer Interaction, 2018 - dl.acm.org
Crowdsourced classification of data typically assumes that objects can be unambiguously classified into categories. In practice, many classification tasks are ambiguous due to various forms of disagreement. Prior work shows that exchanging verbal justifications can significantly improve answer accuracy over aggregation techniques. In this work, we study how worker deliberation affects resolvability and accuracy using case studies with both an objective and a subjective task. Results show that case resolvability depends on various factors, including the level and reasons for the initial disagreement, as well as the amount and quality of deliberation activities. Our work reinforces the finding that deliberation can increase answer accuracy and the importance of verbal discussion in this process. We contribute a new public data set on worker deliberation for text classification tasks, and discuss considerations for the design of deliberation workflows for classification.
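The aggregation baseline the abstract refers to is commonly instantiated as majority voting over independent worker labels. As a minimal sketch (not the paper's own pipeline, and with hypothetical label names), a majority-vote aggregator simply picks the most frequent response, discarding the reasoning behind minority answers that deliberation would instead surface:

```python
from collections import Counter

def majority_vote(labels):
    """Return the most frequent label among independent worker responses."""
    return Counter(labels).most_common(1)[0][0]

# Three workers label one ambiguous item; majority vote resolves the
# disagreement mechanically, without exchanging any justifications.
workers = ["relevant", "relevant", "not_relevant"]
print(majority_vote(workers))  # relevant
```

This illustrates why deliberation can outperform aggregation on ambiguous items: the vote hides whether the minority worker had a legitimate alternative reading of the item.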