TechEcho


Neural Transformation Machine: Sequence-To-Sequence Learning

49 points by groar over 9 years ago

1 comment

deepnet over 9 years ago

The OP paper proposes a different architectural approach to the translation task of the 2014 NIPS paper by Ilya Sutskever, Oriol Vinyals & Quoc Le, "Sequence to Sequence Learning with Neural Networks":

http://papers.nips.cc/paper/5346-sequence-to-sequence-learning-with-neural-networks

which "uses a multilayered Long Short-Term Memory (LSTM) to map the input sequence to a vector of a fixed dimensionality, and then another deep LSTM to decode the target sequence from the vector." The task there was English-to-French translation.

Meng et al. (2015) (the OP) translate Chinese sequences to English using a network based on the Neural Turing Machine (NTM) built from LSTM units; they name this novel architecture the Neural Transformation Machine (NTRam).

The Neural Turing Machine was proposed by DeepMind's Alex Graves, Greg Wayne & Ivo Danihelka. It couples a neural net controller (often an LSTM) with an external memory to produce a differentiable, and thus trainable, analogue of a Turing machine or von Neumann architecture that can perform copying, sorting and associative recall. It is an exploration of whether neural networks can be put to basic computing functions.

"Neural Turing Machines" (2014) by Alex Graves, Greg Wayne, Ivo Danihelka:

http://arxiv.org/abs/1410.5401v2
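The quoted encoder-decoder idea can be sketched in a few lines of PyTorch. This is a toy illustration, not the paper's model: the dimensions are made up (Sutskever et al. used 4-layer LSTMs with 1000 units and large vocabularies), and the class and variable names are my own.

```python
import torch
import torch.nn as nn

# Hypothetical toy sizes for illustration only.
VOCAB_SRC, VOCAB_TGT, EMB, HID, LAYERS = 50, 60, 32, 64, 2

class Seq2Seq(nn.Module):
    """An encoder LSTM compresses the source sentence into its final
    hidden/cell states (the fixed-dimensional vector the paper describes);
    a separate decoder LSTM is initialised from that state and unrolled
    over the target sentence to predict each next token."""
    def __init__(self):
        super().__init__()
        self.src_emb = nn.Embedding(VOCAB_SRC, EMB)
        self.tgt_emb = nn.Embedding(VOCAB_TGT, EMB)
        self.encoder = nn.LSTM(EMB, HID, LAYERS, batch_first=True)
        self.decoder = nn.LSTM(EMB, HID, LAYERS, batch_first=True)
        self.out = nn.Linear(HID, VOCAB_TGT)

    def forward(self, src, tgt):
        # state = (h, c): the fixed-size summary of the whole source sequence
        _, state = self.encoder(self.src_emb(src))
        dec_out, _ = self.decoder(self.tgt_emb(tgt), state)
        return self.out(dec_out)  # per-step logits over the target vocabulary

src = torch.randint(0, VOCAB_SRC, (4, 7))   # batch of 4 source sentences, length 7
tgt = torch.randint(0, VOCAB_TGT, (4, 5))   # shifted target sentences, length 5
logits = Seq2Seq()(src, tgt)
print(logits.shape)  # torch.Size([4, 5, 60])
```

At training time the decoder input is the gold target shifted by one token (teacher forcing); at inference the fixed vector is all the decoder gets, which is exactly the bottleneck later attention and memory-based models set out to relax.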
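What makes the NTM's memory "differentiable, thus trainable" is that reads and writes are soft: instead of indexing one memory slot, the controller emits a key and attends over all slots at once. A minimal NumPy sketch of the content-based addressing step (after Graves et al.; the function name and toy values are mine):

```python
import numpy as np

def content_address(memory, key, beta):
    """Soft addressing by content: cosine-similarity scores between the
    controller's key and every memory row, sharpened by beta and passed
    through a softmax, so the resulting read is a smooth, differentiable
    function of both the key and the memory contents."""
    sims = memory @ key / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    )
    e = np.exp(beta * (sims - sims.max()))  # stable softmax
    return e / e.sum()

M = np.array([[1.0, 0.0],    # three memory rows
              [0.0, 1.0],
              [0.7, 0.7]])
w = content_address(M, key=np.array([1.0, 0.0]), beta=10.0)
read = w @ M   # the read vector is a weighted blend of all rows
print(w.argmax())  # 0: the first row matches the key best
```

Because every operation here is smooth, gradients flow from the translation loss back through the read weights into the controller, which is what lets the whole machine be trained end to end like an ordinary network.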