Denoising Word Embeddings by Averaging in a Shared Space

Avi Caciularu, Ido Dagan, Jacob Goldberger

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We introduce a new approach for smoothing and improving the quality of word embeddings. We consider a method of fusing word embeddings that were trained on the same corpus but with different initializations. We project all the models to a shared vector space using an efficient implementation of the Generalized Procrustes Analysis (GPA) procedure, previously used in multilingual word translation. Our word representation demonstrates consistent improvements over the raw models as well as their simplistic average, on a range of tasks. As the new representations are more stable and reliable, there is a noticeable improvement in rare word evaluations.
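The core idea of the abstract — projecting several independently trained embedding models into a shared space via Generalized Procrustes Analysis (GPA) and then averaging them — can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the function names, the fixed iteration count, and the choice of the first model as the initial reference are assumptions for the sketch.

```python
import numpy as np

def procrustes_align(X, Y):
    # Orthogonal Procrustes: find the rotation W minimizing ||X @ W - Y||_F,
    # given by W = U @ Vt where U, S, Vt = svd(X.T @ Y).
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

def gpa_average(models, n_iters=10):
    # Generalized Procrustes Analysis (sketch): repeatedly rotate each
    # embedding matrix toward the current mean, then recompute the mean.
    # All matrices are assumed row-aligned (same vocabulary, same order).
    mean = models[0].copy()  # assumption: seed the mean with the first model
    for _ in range(n_iters):
        aligned = [X @ procrustes_align(X, mean) for X in models]
        mean = np.mean(aligned, axis=0)
    return mean
```

As a sanity check, if the input models are exact orthogonal rotations of one another, the fused matrix `gpa_average` returns is itself a rotation of the common underlying matrix, so averaging loses nothing; with real embeddings trained from different initializations, the average instead smooths out initialization noise.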

Original language: English
Title of host publication: *SEM 2021 - 10th Conference on Lexical and Computational Semantics, Proceedings of the Conference
Editors: Lun-Wei Ku, Vivi Nastase, Ivan Vulic
Publisher: Association for Computational Linguistics (ACL)
Pages: 294-301
Number of pages: 8
ISBN (Electronic): 9781954085770
DOIs
State: Published - 2021
Event: 10th Conference on Lexical and Computational Semantics, *SEM 2021 - Virtual, Bangkok, Thailand
Duration: 5 Aug 2021 - 6 Aug 2021

Publication series

Name: *SEM 2021 - 10th Conference on Lexical and Computational Semantics, Proceedings of the Conference

Conference

Conference: 10th Conference on Lexical and Computational Semantics, *SEM 2021
Country/Territory: Thailand
City: Virtual, Bangkok
Period: 5/08/21 - 6/08/21

All Science Journal Classification (ASJC) codes

  • Computational Theory and Mathematics
  • Computer Networks and Communications
  • Computer Science Applications
  • Information Systems
  • Theoretical Computer Science
