Semantic parsing via paraphrasing

Jonathan Berant, Percy Liang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

A central challenge in semantic parsing is handling the myriad ways in which knowledge base predicates can be expressed. Traditionally, semantic parsers are trained primarily from text paired with knowledge base information. Our goal is to exploit the much larger amounts of raw text not tied to any knowledge base. In this paper, we turn semantic parsing on its head. Given an input utterance, we first use a simple method to deterministically generate a set of candidate logical forms with a canonical realization in natural language for each. Then, we use a paraphrase model to choose the realization that best paraphrases the input, and output the corresponding logical form. We present two simple paraphrase models, an association model and a vector space model, and train them jointly from question-answer pairs. Our system PARASEMPRE improves state-of-the-art accuracies on two recently released question-answering datasets.
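
The pipeline in the abstract can be sketched as follows. This is a minimal, hypothetical illustration only: the candidate logical forms and canonical realizations are hard-coded (the real system generates them deterministically from the utterance), and the toy word-overlap score stands in for the paper's trained association and vector space models.

```python
# Sketch of paraphrase-based semantic parsing: score each candidate
# (logical form, canonical realization) pair against the input
# utterance and return the logical form of the best paraphrase.

def association_score(utterance, canonical):
    """Toy paraphrase score: Jaccard word overlap. A stand-in for the
    trained association/vector-space models described in the paper."""
    u = set(utterance.lower().split())
    c = set(canonical.lower().split())
    return len(u & c) / max(len(u | c), 1)

def parse(utterance, candidates):
    """Choose the logical form whose canonical realization best
    paraphrases the input utterance."""
    best = max(candidates, key=lambda pair: association_score(utterance, pair[1]))
    return best[0]

# Hypothetical candidates for a birthplace question.
candidates = [
    ("Type.Person AND PlaceOfBirth.Seattle", "person born in Seattle"),
    ("Type.Person AND PlaceOfDeath.Seattle", "person who died in Seattle"),
]
print(parse("who was born in Seattle", candidates))
# → Type.Person AND PlaceOfBirth.Seattle
```

In the actual system, the paraphrase models are trained jointly from question-answer pairs rather than fixed as a similarity heuristic.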

Original language: English
Title of host publication: Long Papers
Publisher: Association for Computational Linguistics (ACL)
Pages: 1415-1425
Number of pages: 11
ISBN (Print): 9781937284725
State: Published - 2014
Externally published: Yes
Event: 52nd Annual Meeting of the Association for Computational Linguistics, ACL 2014 - Baltimore, MD, United States
Duration: 22 Jun 2014 - 27 Jun 2014

Publication series

Name: 52nd Annual Meeting of the Association for Computational Linguistics, ACL 2014 - Proceedings of the Conference
Volume: 1

Conference

Conference: 52nd Annual Meeting of the Association for Computational Linguistics, ACL 2014
Country/Territory: United States
City: Baltimore, MD
Period: 22/06/14 - 27/06/14

All Science Journal Classification (ASJC) codes

  • Language and Linguistics
  • Linguistics and Language

