The OP paper proposes a different architectural approach to the translation task addressed in the 2014 NIPS paper by Ilya Sutskever, Oriol Vinyals, and Quoc Le, "Sequence to Sequence Learning with Neural Networks":

http://papers.nips.cc/paper/5346-sequence-to-sequence-learning-with-neural-networks

which "uses a multilayered Long Short-Term Memory (LSTM) to map the input sequence to a vector of a fixed dimensionality, and then another deep LSTM to decode the target sequence from the vector." The task there was English-to-French translation.

Meng et al. (2015) (the OP) translate Chinese sequences to English using a network based on the Neural Turing Machine (NTM) and built from LSTM units; they name this novel architecture the Neural Transformation Machine (NTram).

The Neural Turing Machine was proposed by DeepMind's Alex Graves, Greg Wayne, and Ivo Danihelka. It couples a neural network controller (itself often an LSTM) with an external memory bank to produce a differentiable, and therefore trainable, analogue of a Turing Machine or von Neumann architecture, demonstrated on tasks like copying, sorting, and associative recall: an exploration of whether neural networks can learn basic computing functions.

"Neural Turing Machines" (2014) by Alex Graves, Greg Wayne, Ivo Danihelka:

http://arxiv.org/abs/1410.5401v2
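To make the encoder-decoder idea above concrete, here is a minimal numpy sketch of the seq2seq scheme. All sizes and token ids are illustrative, the weights are random and untrained, and plain tanh RNN cells stand in for the deep LSTMs the paper actually uses:

    import numpy as np

    def rnn_step(x, h, Wx, Wh, b):
        # one vanilla RNN step; the paper uses multilayer LSTM cells here
        return np.tanh(Wx @ x + Wh @ h + b)

    rng = np.random.default_rng(0)
    d_emb, d_hid = 8, 16            # toy sizes, chosen arbitrarily
    V_src, V_tgt = 20, 20           # hypothetical vocabulary sizes
    E_src = rng.normal(size=(V_src, d_emb))   # source embeddings
    E_tgt = rng.normal(size=(V_tgt, d_emb))   # target embeddings
    Wx_e, Wh_e, b_e = (rng.normal(size=(d_hid, d_emb)),
                       rng.normal(size=(d_hid, d_hid)), np.zeros(d_hid))
    Wx_d, Wh_d, b_d = (rng.normal(size=(d_hid, d_emb)),
                       rng.normal(size=(d_hid, d_hid)), np.zeros(d_hid))
    W_out = rng.normal(size=(V_tgt, d_hid))   # hidden state -> target vocab scores

    src = [3, 7, 1]                 # toy source token ids
    h = np.zeros(d_hid)
    for tok in src:                 # encoder: fold the whole source sequence...
        h = rnn_step(E_src[tok], h, Wx_e, Wh_e, b_e)
    # ...into one "vector of a fixed dimensionality"; the decoder starts from it

    y, out = 0, []                  # assume token id 0 is a start-of-sentence marker
    for _ in range(4):              # unroll a few decoder steps
        h = rnn_step(E_tgt[y], h, Wx_d, Wh_d, b_d)
        y = int(np.argmax(W_out @ h))   # greedy choice; the paper uses beam search
        out.append(y)
    print(out)                      # gibberish here, since nothing was trained

The key point is the bottleneck: everything the decoder knows about the source sentence has to squeeze through that single fixed-size vector, which is part of what memory-augmented architectures like the NTM try to relax.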
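And a similarly hedged sketch of what makes the NTM's memory "differentiable": reads and writes are soft, attention-weighted operations over every memory slot, so gradients can flow through them. The sizes and the key-strength value are again arbitrary, and only content-based addressing is shown (the full NTM adds location-based shifts on top of this):

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    rng = np.random.default_rng(1)
    N, M = 8, 4                       # 8 memory slots of width 4 (toy sizes)
    memory = rng.normal(size=(N, M))

    key = rng.normal(size=M)          # content key emitted by the controller
    beta = 5.0                        # key strength: sharpens the focus

    # content-based addressing: cosine similarity -> softmax -> soft weights
    sims = memory @ key / (np.linalg.norm(memory, axis=1)
                           * np.linalg.norm(key) + 1e-8)
    w = softmax(beta * sims)

    read = w @ memory                 # read: weighted sum over ALL slots at once

    erase = rng.uniform(size=M)       # erase/add vectors, also controller outputs
    add = rng.normal(size=M)
    memory = memory * (1 - np.outer(w, erase)) + np.outer(w, add)

Because every step is built from smooth operations, with no hard pointer jumps, the whole read/write mechanism can be trained end-to-end with backpropagation; that is the sense in which it is a trainable analogue of a Turing Machine's tape.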