Identifying unreliable and adversarial workers in crowdsourced labeling tasks
Publication: 4637009
zbMATH Open: 1435.68264
MaRDI QID: Q4637009
FDO: Q4637009
Authors: Srikanth Jagabathula, Lakshminarayanan Subramanian, Ashwin Venkataraman
Publication date: 17 April 2018
Full work available at URL: http://jmlr.csail.mit.edu/papers/v18/15-650.html
Recommendations
- Budget-Optimal Task Allocation for Reliable Crowdsourcing Systems
- Bandit-based task assignment for heterogeneous crowdsourcing
- Quality-aware online task assignment mechanisms using latent topic model
- Eliminating spammers and ranking annotators for crowdsourced labeling tasks
- More for less: adaptive labeling payments in online labor markets
Classification
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Random graphs.
- Semi-matchings for bipartite graphs and load balancing.
- Spectral methods meet EM: a provably optimal algorithm for crowdsourcing
- Budget-Optimal Task Allocation for Reliable Crowdsourcing Systems
- Eliminating spammers and ranking annotators for crowdsourced labeling tasks
- Optimal Inference in Crowdsourced Classification via Belief Propagation
- Identifying unreliable and adversarial workers in crowdsourced labeling tasks
Cited In (3)
Uses Software