Article

A hybrid architecture for robust MT using LFG-DOP

Journal of Experimental & Theoretical Artificial Intelligence, 11(3): 441–471 (1999)

Abstract

We develop a model for machine translation (MT) based on data-oriented parsing (DOP) allied to the syntactic representations of lexical-functional grammar (LFG). We begin by showing that none of the main paradigmatic approaches to MT currently performs, in itself, to the standard required. Nevertheless, each of these approaches contains elements which, if properly harnessed, should lead to an overall improvement in translation performance. It is in this new hybrid spirit that our search for a better solution to the problems of MT should be seen. We summarize the original DOP model of Bod, as well as Poutsma's DOT model of translation, which builds on it. We demonstrate that DOT is not guaranteed to produce the correct translation, even though it provably derives the most probable translation. We go on to evaluate critically previous attempts at LFG-based MT, commenting briefly on particular problem cases for such systems. We then show how the LFG-DOP model of Bod and Kaplan can be extended to serve as a novel hybrid model for MT which promises to improve both upon DOT and upon purely LFG-based translation models.
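The point that the most probable translation need not be the correct one is easiest to see with a toy example. The sketch below is an illustration only (Python, with invented sentences and fragment-derivation probabilities), not code from the paper: in DOT the probability of an output string is the sum over all derivations that produce it, and the system returns the string with the highest total.

from collections import defaultdict

# Hypothetical toy data: (target_sentence, derivation_probability).
# Several derivations may yield the same output string.
derivations = [
    ("la maison verte", 0.10),
    ("la maison verte", 0.15),   # second derivation of the same output
    ("la verte maison", 0.30),   # single high-probability derivation
]

def most_probable_translation(derivs):
    """Sum derivation probabilities per output string and return the argmax."""
    totals = defaultdict(float)
    for target, p in derivs:
        totals[target] += p
    return max(totals.items(), key=lambda kv: kv[1])

print(most_probable_translation(derivations))
# ('la verte maison', 0.3): the output with the highest summed probability
# wins, even though it is not the correct translation here.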
