Learning Semantic Correspondences with Less Supervision

Percy Liang, Michael I. Jordan, and Dan Klein. Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language Processing of the AFNLP, pages 91–99, Suntec, Singapore. Association for Computational Linguistics, August 2009.

Abstract

A central problem in grounded language acquisition is learning the correspondences between a rich world state and a stream of text which references that world state. To deal with the high degree of ambiguity present in this setting, we present a generative model that simultaneously segments the text into utterances and maps each utterance to a meaning representation grounded in the world state. We show that our model generalizes across three domains of increasing difficulty—Robocup sportscasting, weather forecasts (a new domain), and NFL recaps.
