Abstract
We show how deep learning methods can be applied in the context of crowdsourcing and unsupervised ensemble learning. First, we prove that the popular model of Dawid and Skene, which assumes that all classifiers are conditionally independent, is equivalent to a Restricted Boltzmann Machine (RBM) with a single hidden node. Hence, under this model, the posterior probabilities of the true labels can instead be estimated via a trained RBM. Next, to address the more general case, where classifiers may strongly violate the conditional independence assumption, we propose applying an RBM-based Deep Neural Network (DNN). Experimental results on various simulated and real-world datasets demonstrate that our proposed DNN approach outperforms other state-of-the-art methods, in particular when the data violates the conditional independence assumption.
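To make the RBM connection concrete: in an RBM with a single hidden node, the posterior of the hidden unit given the visible units has the standard closed form p(y=1 | x) = sigmoid(b + wᵀx), where the visible units x are the binary classifier outputs and the hidden unit plays the role of the true label. The sketch below is illustrative only and is not code from the paper; the weights and bias are hypothetical values standing in for parameters that would be learned by RBM training.

```python
import numpy as np

def rbm_label_posterior(x, w, b):
    """Posterior of the single hidden node (the unknown true label)
    in an RBM, given binary visible units x (the classifiers' votes):
    p(y = 1 | x) = sigmoid(b + w . x)."""
    return 1.0 / (1.0 + np.exp(-(b + np.dot(x, w))))

# Hypothetical example: votes of 3 conditionally independent classifiers.
x = np.array([1.0, 1.0, 0.0])   # binary predictions of the 3 classifiers
w = np.array([2.0, 1.5, 0.5])   # illustrative visible-to-hidden weights
b = -1.0                        # illustrative hidden-unit bias
p = rbm_label_posterior(x, w, b)  # sigmoid(-1 + 2 + 1.5) = sigmoid(2.5)
```

Under the Dawid-Skene equivalence, a larger weight w_i corresponds to a more reliable classifier, so its vote shifts the posterior more strongly.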
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 33rd International Conference on Machine Learning, ICML 2016 |
| Editors | M. F. Balcan, K. Q. Weinberger |
| Pages | 30-39 |
| Number of pages | 10 |
| Volume | 48 |
| State | Published - 19 Jun 2016 |
| Event | 33rd International Conference on Machine Learning (ICML 2016), New York, United States, 19 Jun 2016 → 24 Jun 2016 |