Deep learning meets vector-symbolic AI

91 points by sayonaraman, over 3 years ago

5 comments

uoaei, over 3 years ago
Seems like a variant of a Siamese network which uses binarized embedding vectors for predictions instead of the raw embedding vectors. What exactly is the novelty presented here?
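A minimal sketch of the kind of comparison described in this comment, not the paper's actual model: it contrasts cosine similarity on raw embeddings with sign-agreement on binarized ones, as a Siamese-style setup might. The 256-dimensional vectors and noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(size=256)             # embedding of input A (illustrative)
b = a + 0.3 * rng.normal(size=256)   # a noisy variant of the same input

# Raw-embedding route: cosine similarity.
cos = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Binarized route: sign each vector to {-1, +1}, then score by the
# fraction of positions where the signs agree (1 - normalized Hamming).
a_bin, b_bin = np.sign(a), np.sign(b)
match = float(np.mean(a_bin == b_bin))

print(f"cosine on raw embeddings:      {cos:.3f}")
print(f"sign agreement after binarize: {match:.3f}")
```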
toisanji, over 3 years ago
I would like to see deep learning working with embodied cognition somehow: http://www.jtoy.net/blog/grounded_language.html
axiosgunnar, over 3 years ago
Can any experts chime in? Is this any good?
sayonaraman, over 3 years ago
You might also be interested in the recent work on the "resonator networks" VSA architecture [1-4] from the Olshausen lab at Berkeley (P. Kanerva, who created the influential SDM model [5], is one of the lab members).

It's a continuation of Plate's [6] and Kanerva's work in the 90s, and of Olshausen's groundbreaking work on sparse coding [7], which inspired the popular sparse autoencoders [8].

I find it especially promising that they found this superposition-based approach to be competitive with the optimization so prevalent in modern neural nets. Maybe backprop will die one day and be replaced with something more energy-efficient along these lines.

[1] https://redwood.berkeley.edu/wp-content/uploads/2020/11/frady2020resonator.pdf

[2] https://redwood.berkeley.edu/wp-content/uploads/2020/11/kent2020resonator.pdf

[3] https://arxiv.org/abs/2009.06734

[4] https://github.com/spencerkent/resonator-networks

[5] https://en.wikipedia.org/wiki/Sparse_distributed_memory

[6] https://www.amazon.com/Holographic-Reduced-Representation-Distributed-Structures/dp/1575864304

[7] http://www.scholarpedia.org/article/Sparse_coding

[8] https://web.stanford.edu/class/cs294a/sparseAutoencoder.pdf
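A minimal sketch of the basic VSA operations these references build on (bipolar hypervectors, elementwise-multiply binding, superposition, and codebook cleanup), not the resonator-network algorithm itself; the dimensionality and codebooks are illustrative assumptions.

```python
import numpy as np

D = 4096                            # hypervector dimensionality (assumption)
rng = np.random.default_rng(1)
rand_vec = lambda: rng.choice([-1, 1], size=D)

# Codebooks of atomic symbols.
colors = {name: rand_vec() for name in ["red", "green", "blue"]}
shapes = {name: rand_vec() for name in ["circle", "square", "star"]}

# Bind a color to a shape (elementwise multiply), then superpose two bound
# pairs into a single vector by summing and taking the sign.
scene = np.sign(colors["red"] * shapes["circle"]
                + colors["blue"] * shapes["star"])

# Query: which shape was bound to "red"?  Unbind by multiplying with the
# (self-inverse) color vector, then clean up against the shape codebook.
noisy_shape = scene * colors["red"]
best = max(shapes, key=lambda k: noisy_shape @ shapes[k])
print(best)   # "circle" with high probability for large D
```

For large D, the unbound vector's dot product with the correct codebook entry dominates the crosstalk from the other bound pair, which is the superposition property the comment refers to; resonator networks extend this idea to factorizing bound vectors iteratively rather than by a single cleanup step.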
gibsonf1, over 3 years ago
What makes us humans intelligent and able to learn so quickly is our reasoning faculty, especially our conceptual reasoning capabilities. There is no intelligence and no learning without that, just sophisticated ml/dl pattern matching / perception. Symbolic AI led to the first AI winter because a symbol is just an object that represents another object. That's not a lot to work with.

The AI industry needs to finally discover conceptual reasoning to actually achieve any understanding. In the meantime, huge sums of money, energy and time are being wasted on ml/dl on the idea that given enough data and processing power, intelligence will magically happen.

This IBM effort doesn't even remotely model how the human brain works.