Improving quality and efficiency in plan-based neural data-to-text generation

Amit Moryossef, Ido Dagan, Yoav Goldberg

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We follow the step-by-step approach to neural data-to-text generation we proposed in Moryossef et al. (2019), in which the generation process is divided into a text-planning stage followed by a plan-realization stage. We suggest four extensions to that framework: (1) we introduce a trainable neural planning component that can generate effective plans several orders of magnitude faster than the original planner; (2) we incorporate typing hints that improve the model’s ability to deal with unseen relations and entities; (3) we introduce a verification-by-reranking stage that substantially improves the faithfulness of the resulting texts; (4) we incorporate a simple but effective referring expression generation module. These extensions result in a generation process that is faster, more fluent, and more accurate.
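To make the pipeline described above concrete, here is a minimal toy sketch of the generate-then-verify flow: a planner orders the input facts, a realizer verbalizes the plan into candidate texts, and a verification-by-reranking step prefers the candidate that preserves the most input entities. All neural components are replaced by trivial stand-ins, and every function name here is illustrative, not from the paper.

```python
# Hypothetical sketch of a plan-based data-to-text pipeline with
# verification-by-reranking. Toy stand-ins replace the neural planner
# and realizer; names are illustrative only.

def plan(triples):
    # Trivial "planner": keep the input order.
    # (A trainable neural planner would score alternative orderings.)
    return list(triples)

def realize(text_plan):
    # Template realizer producing one candidate per simple template.
    candidates = []
    for joiner in (" and ", ", and also "):
        sentences = [f"{s} {p} {o}" for (s, p, o) in text_plan]
        candidates.append(joiner.join(sentences) + ".")
    return candidates

def faithfulness(text, triples):
    # Verification: count input entities that appear verbatim in the text.
    entities = {x for (s, _, o) in triples for x in (s, o)}
    return sum(e in text for e in entities)

def generate(triples):
    text_plan = plan(triples)
    candidates = realize(text_plan)
    # Rerank candidates by faithfulness and keep the best one.
    return max(candidates, key=lambda t: faithfulness(t, triples))

triples = [("John", "was born in", "London"),
           ("London", "is located in", "England")]
print(generate(triples))
```

The key design point mirrored here is that faithfulness is enforced after generation: instead of trusting the realizer, the system generates several candidates and reranks them by how well they cover the input.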

Original language: English
Title of host publication: INLG 2019 - 12th International Conference on Natural Language Generation, Proceedings of the Conference
Publisher: Association for Computational Linguistics (ACL)
Pages: 377-382
Number of pages: 6
ISBN (Electronic): 9781950737949
DOIs
State: Published - 1 Jan 2019
Event: 12th International Conference on Natural Language Generation, INLG 2019 - Tokyo, Japan
Duration: 29 Oct 2019 – 1 Nov 2019

Publication series

Name: INLG 2019 - 12th International Conference on Natural Language Generation, Proceedings of the Conference

Conference

Conference: 12th International Conference on Natural Language Generation, INLG 2019
Country/Territory: Japan
City: Tokyo
Period: 29/10/19 – 1/11/19

All Science Journal Classification (ASJC) codes

  • Software
