
Files in This Item:
File: ipsjjip.20.512.pdf (598.25 kB, Adobe PDF)
Title: Joint Phrase Alignment and Extraction for Statistical Machine Translation
Authors: Neubig, Graham
Watanabe, Taro
Sumita, Eiichiro
Mori, Shinsuke
Kawahara, Tatsuya (ORCID: https://orcid.org/0000-0002-2686-2296)
Keywords: statistical machine translation
phrase alignment
non-parametric Bayesian statistics
inversion transduction grammars
Issue Date: 2012
Publisher: Information Processing Society of Japan
Journal title: Journal of Information Processing
Volume: 20
Issue: 2
Start page: 512
End page: 523
Abstract: The phrase table, a scored list of bilingual phrases, lies at the center of phrase-based machine translation systems. We present a method to directly learn this phrase table from a parallel corpus of sentences that are not aligned at the word level. The key contribution of this work is that while previous methods have generally only modeled phrases at one level of granularity, in the proposed method phrases of many granularities are included directly in the model. This allows for the direct learning of a phrase table that achieves competitive accuracy without the complicated multi-step process of word alignment and phrase extraction that is used in previous research. The model is achieved through the use of non-parametric Bayesian methods and inversion transduction grammars (ITGs), a variety of synchronous context-free grammars (SCFGs). Experiments on several language pairs demonstrate that the proposed model matches the accuracy of the more traditional two-step word alignment/phrase extraction approach while reducing its phrase table to a fraction of its original size.
Rights: © 2012 by the Information Processing Society of Japan
URI: http://hdl.handle.net/2433/167749
DOI (Published Version): 10.2197/ipsjjip.20.512
Appears in Collections: Journal Articles
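To make the abstract's central idea concrete: an ITG derivation licenses phrase pairs at several granularities simultaneously, because every node in the derivation tree covers a span that is itself a bilingual phrase. The following Python toy is a minimal illustrative sketch, not the authors' implementation; the tree encoding and the Japanese-English example phrases are hypothetical.

def extract_pairs(node):
    """Collect (source, target) phrase pairs from a toy ITG derivation.

    A node is a terminal pair ('T', src, tgt), a straight node
    ('S', left, right) that keeps the children's target order, or an
    inverted node ('I', left, right) that swaps it.
    """
    kind = node[0]
    if kind == 'T':  # terminal: a minimal aligned phrase pair
        _, src, tgt = node
        return src, tgt, [(src, tgt)]
    _, left, right = node
    ls, lt, lp = extract_pairs(left)
    rs, rt, rp = extract_pairs(right)
    src = ls + ' ' + rs  # source order is always left-to-right
    tgt = lt + ' ' + rt if kind == 'S' else rt + ' ' + lt
    return src, tgt, lp + rp + [(src, tgt)]  # the parent span is also a pair

# Toy derivation: two terminal pairs combined by an inverted rule.
tree = ('I',
        ('T', 'hon o', 'the book'),
        ('T', 'yonda', 'read'))
_, _, pairs = extract_pairs(tree)
for src, tgt in pairs:
    print(src, '->', tgt)
# hon o -> the book
# yonda -> read
# hon o yonda -> read the book

Every span in the derivation, from the minimal pairs up to the whole sentence, yields a candidate phrase-table entry, which is how phrases of many granularities enter the model directly; in the paper, non-parametric Bayesian priors over such derivations determine which spans receive entries.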
