CoRefi: A Crowd Sourcing Suite for Coreference Annotation

Aaron Bornstein, Arie Cattan, Ido Dagan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Coreference annotation is an important, yet expensive and time-consuming, task, which often involves expert annotators trained on complex decision guidelines. To enable cheaper and more efficient annotation, we present COREFI, a web-based coreference annotation suite, oriented for crowdsourcing. Beyond the core coreference annotation tool, COREFI provides guided onboarding for the task as well as a novel algorithm for a reviewing phase. COREFI is open source and directly embeds into any website, including popular crowdsourcing platforms.

COREFI Demo: aka.ms/corefi
Video Tour: aka.ms/corefivideo
GitHub Repo: https://github.com/aribornstein/corefi

Original language: English
Title of host publication: EMNLP 2020 - Conference on Empirical Methods in Natural Language Processing, Proceedings of Systems Demonstrations
Editors: Qun Liu, David Schlangen
Publisher: Association for Computational Linguistics (ACL)
Pages: 205-215
Number of pages: 11
ISBN (Electronic): 9781952148620
State: Published - 2020
Event: 2020 System Demonstrations of the 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020 - Virtual, Online
Duration: 16 Nov 2020 - 20 Nov 2020

Publication series

Name: EMNLP 2020 - Conference on Empirical Methods in Natural Language Processing, Proceedings of Systems Demonstrations

Conference

Conference: 2020 System Demonstrations of the 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020
City: Virtual, Online
Period: 16/11/20 - 20/11/20

All Science Journal Classification (ASJC) codes

  • Computational Theory and Mathematics
  • Computer Science Applications
  • Information Systems

