Incentives for Truthful Reporting in Crowdsourcing
AAMAS
A challenge with the programmatic access to human talent via crowdsourcing platforms is the specification of incentives and the checking of the quality of contributions. Methodologies for checking quality include providing a payment if the work is approved by the task owner and hiring additional workers to evaluate contributors’ work. Both of these approaches place a burden on people and on the organizations commissioning tasks, and may be susceptible to manipulation by workers and task owners. Moreover, neither a task owner nor the task market may know the task well enough to evaluate worker reports. Methodologies for incentivizing workers without external quality checking include rewards based on agreement with a peer worker or with the final output of the system. These approaches are vulnerable to strategic manipulation by workers. Recent experiments on Mechanical Turk have demonstrated the negative influence of manipulation by workers and task owners on crowdsourcing systems [3]. We address this central challenge by introducing incentive mechanisms that promote truthful reporting in crowdsourcing and discourage manipulation by workers and task owners, without introducing additional overhead.
We focus on a large class of crowdsourcing tasks that we refer to as consensus tasks. Consensus tasks aim to determine a single correct answer, or a set of correct answers, to a question or challenge based on reports collected from workers. These tasks include numerous applications where multiple reports collected from people are used to make decisions. We adapt the peer prediction rule [4] to formulate a payment rule that incentivizes workers to contribute truthfully to consensus tasks. The rule pays a worker according to how well her report predicts another worker’s report for the same task. To address several shortcomings of the peer prediction rule, we introduce a novel payment rule, called the consensus prediction rule. This payment rule couples payment computations with planning to generate a robust signal for evaluating worker reports. The consensus prediction rule rewards a worker based on how well her report predicts the consensus of the other workers, incentivizing truthful reporting while providing better fairness than peer prediction rules.
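As a minimal illustration, the two kinds of payment rules can be sketched for binary reports using a logarithmic proper scoring rule, one standard choice in the peer prediction literature. The posterior model here (`posterior[r]`, the probability assigned to a peer reporting 1 given one's own report `r`) is a hypothetical input introduced only for this sketch; the rules studied in [2] are defined over a full probabilistic model of worker signals and consensus outcomes.

```python
import math

def log_score(predicted_prob, outcome):
    """Logarithmic proper scoring rule: reward for assigning
    probability `predicted_prob` to the realized binary outcome."""
    p = predicted_prob if outcome == 1 else 1.0 - predicted_prob
    return math.log(p)

def peer_prediction_payment(report, peer_report, posterior):
    """Peer-prediction-style payment: score the posterior over a
    peer's report that is induced by the worker's own report.
    `posterior[r]` = P(peer reports 1 | own report is r) under the
    assumed signal model."""
    return log_score(posterior[report], peer_report)

def consensus_prediction_payment(report, other_reports, posterior):
    """Consensus-prediction-style payment: score the induced
    posterior against the consensus (here, simple majority) of the
    other workers' reports rather than a single peer's report."""
    consensus = 1 if sum(other_reports) > len(other_reports) / 2 else 0
    return log_score(posterior[report], consensus)
```

Because the consensus of many reports is a less noisy signal than any single peer's report, scoring against it tends to reward two workers who report identically more evenly, which is the fairness advantage noted above.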
A more detailed presentation of the ideas investigated in this work, including a comparison with existing payment rules, an investigation of considerations in applying payment rules in real-world applications, and a detailed empirical evaluation can be found in [2].