Abstract
In this paper we propose and test a methodology for evaluating the statements of a multi-viewpoint ontology by crowdsourcing. The workers' task was to assess each given statement as a true statement, a controversial viewpoint statement, or an error. Typically, crowdsourcing experiments ask workers for their personal opinions on a given subject. In our case, however, we also examine their ability to objectively assess the opinions of others. We conducted two large-scale crowdsourcing experiments with about 750 ontological statements originating from diverse single-viewpoint ontologies. Our results show substantially higher evaluation accuracy for the objective assessment approach than for the experiment based on personal opinions.
| Original language | English |
|---|---|
| Pages (from-to) | 1-4 |
| Number of pages | 4 |
| Journal | Proceedings of the Association for Information Science and Technology |
| Volume | 52 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 2015 |
Keywords
- Multi-viewpoint ontology
- crowdsourcing
- ontology statement classification
All Science Journal Classification (ASJC) codes
- General Computer Science
- Library and Information Sciences