TechEcho
A tech news platform built with Next.js, providing global tech news and discussions.

© 2025 TechEcho. All rights reserved.

Short-term Hebbian learning can implement transformer-like attention

47 points by tester89, about 1 year ago

2 comments

light_hue_1, about 1 year ago
Overpromises and underdelivers.

> Hebbian learning, can implement a transformer-like attention computation if the synaptic weight changes are large and rapidly induced

Well, this has no analog in the brain either.

In any case, how is this transformer-like attention? Not all attention mechanisms lead to a transformer; the two are certainly not synonymous.

> While it is tempting to assume that cortical rhythms might play a role, we have found that match windows around 0.5–2 s are necessary for reducing noise in the spike train comparisons, a timescale much longer than the cycles of theta, beta or gamma rhythms found in the cortex.

Well, that's totally the wrong timescale.

There's a glut of these "abuse some random feature of the brain that already serves a different purpose to implement something that clearly doesn't work but is vaguely reminiscent of something that happens in machine learning, so we'll call the two the same" papers. They contribute nothing.

The few somewhat worthwhile ones actually show a working network on a real task. The real breakthrough will be a paper that actually works for real, on real data, and can be implemented in the brain; we've got nothing like that. This isn't one of the good ones.
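For context on the correspondence being debated: a rapid, large Hebbian weight change (an outer product of a "value" and a "key") followed by a linear readout computes exactly unnormalized linear attention. This is a minimal numerical sketch of that identity, not the paper's spiking model; all variable names and sizes here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8          # feature dimension (illustrative)
T = 5          # number of stored key/value pairs

keys = rng.standard_normal((T, d))
values = rng.standard_normal((T, d))
query = keys[2] + 0.1 * rng.standard_normal(d)   # noisy probe of key 2

# Rapid Hebbian update: accumulate the outer product value_i * key_i^T
# for each pair, as a stand-in for fast, large synaptic changes.
W = np.zeros((d, d))
for k, v in zip(keys, values):
    W += np.outer(v, k)

# Hebbian readout: W @ q = sum_i values_i * (keys_i . q)
hebbian_out = W @ query

# The same computation written as attention scores over stored values.
scores = keys @ query            # dot-product similarities (no softmax)
attention_out = scores @ values

assert np.allclose(hebbian_out, attention_out)
```

The two readouts agree term by term; what the identity omits is the softmax normalization of standard transformer attention, which is one reason "attention-like" and "transformer attention" are not interchangeable.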
adamnemecek, about 1 year ago
This is not surprising considering the paper "Hopfield Networks is All You Need" (https://arxiv.org/abs/2008.02217). Hopfield networks learn via Hebbian learning.
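The Hebbian-learning claim in this comment can be illustrated with a classical Hopfield network: weights are a sum of outer products of stored ±1 patterns, and asynchronous sign updates relax a corrupted probe toward a stored memory while the network energy never increases. A minimal sketch with illustrative sizes (not tied to either paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64                                        # neurons (illustrative)
patterns = rng.choice([-1, 1], size=(3, n)).astype(float)

# Hebbian storage: weights are a normalized sum of pattern outer products.
W = (patterns.T @ patterns) / n
np.fill_diagonal(W, 0.0)                      # no self-connections

def energy(s):
    # Hopfield energy; asynchronous updates never increase it.
    return -0.5 * s @ W @ s

# Corrupt a stored pattern, then relax it.
state = patterns[0].copy()
flip = rng.choice(n, size=6, replace=False)
state[flip] *= -1
e0 = energy(state)

for _ in range(5):                            # a few sweeps over all neurons
    for i in range(n):
        h = W[i] @ state                      # local field at neuron i
        state[i] = 1.0 if h >= 0 else -1.0

# Overlap with the original pattern; 1.0 means perfect recall
# (recall typically succeeds at this load, though it is not guaranteed).
overlap = abs(state @ patterns[0]) / n
```

The "Hopfield Networks is All You Need" connection is that replacing this binary energy with a continuous one whose update rule is a softmax over stored patterns recovers transformer-style attention as the retrieval step.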