Tree-to-Sequence Attentional Neural Machine Translation

@article{Eriguchi2016TreetoSequenceAN,
  title={Tree-to-Sequence Attentional Neural Machine Translation},
  author={Akiko Eriguchi and Kazuma Hashimoto and Yoshimasa Tsuruoka},
  journal={ArXiv},
  year={2016},
  volume={abs/1603.06075},
  url={https://api.semanticscholar.org/CorpusID:12851711}
}
This work proposes a novel end-to-end syntactic NMT model that extends a sequence-to-sequence model with source-side phrase structure. Its attention mechanism enables the decoder to generate a translated word while softly aligning it with phrases as well as words of the source sentence.
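The core idea of attending over phrases as well as words can be sketched as attention over a pooled set of word-level and phrase-level encoder states. This is a minimal NumPy illustration, not the paper's architecture: the paper uses a tree-LSTM encoder over the parse tree, and the dot-product scoring function and all variable names here are assumptions for illustration.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def tree_attention_context(word_states, phrase_states, decoder_state):
    # Pool word-level and phrase-level encoder states so the decoder
    # can softly align with phrases as well as words (sketch only).
    states = np.vstack([word_states, phrase_states])  # (n_words + n_phrases, d)
    scores = states @ decoder_state                   # dot-product scores (assumed form)
    weights = softmax(scores)                         # soft alignment distribution
    context = weights @ states                        # attention-weighted context vector
    return context, weights

# Toy example with random states.
rng = np.random.default_rng(0)
d = 4
word_states = rng.normal(size=(5, d))    # one state per source word
phrase_states = rng.normal(size=(3, d))  # one state per source phrase node
decoder_state = rng.normal(size=d)
ctx, w = tree_attention_context(word_states, phrase_states, decoder_state)
print(ctx.shape)  # context has the encoder state dimension
```

The alignment weights form a single distribution over words and phrases together, so a decoder step can place most of its mass on a phrase node rather than any individual word.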
