Short-term Hebbian learning can implement transformer-like attention

47 points by tester89 about 1 year ago

2 comments

light_hue_1 about 1 year ago
Overpromises and underdelivers.

> Hebbian learning, can implement a transformer-like attention computation if the synaptic weight changes are large and rapidly induced

Well, this has no analog in the brain either.

In any case, how is this transformer-like attention? Not all attention mechanisms lead to a transformer; the two are certainly not synonymous.

> While it is tempting to assume that cortical rhythms might play a role, we have found that match windows around 0.5–2 s are necessary for reducing noise in the spike train comparisons, a timescale much longer than the cycles of theta, beta or gamma rhythms found in the cortex.

Well, that's totally the wrong timescale.

There's a glut of these "abuse some random feature of the brain that already serves a different purpose to implement something that clearly doesn't work but is vaguely reminiscent of something that happens in machine learning, so we'll call the two the same" papers. They contribute nothing.

The few somewhat worthwhile ones actually show a working network for a real task. The real breakthrough will be a paper that actually works for real, on real data, and can be implemented in the brain; we've got nothing like that. This isn't one of the good ones.
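For readers unfamiliar with the claim being quoted, here is a minimal sketch (not from the paper or the comment above) of the general idea it gestures at: rapidly induced Hebbian outer-product weight changes acting as a "fast weight" memory whose readout matches unnormalized linear attention over the stored key/value pairs. The dimensions, data, and variable names are illustrative assumptions, and this is a toy correspondence, not the paper's spike-based model.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16   # feature dimension (arbitrary)
T = 8    # number of stored (key, value) pairs (arbitrary)

keys = rng.standard_normal((T, d))
values = rng.standard_normal((T, d))
query = rng.standard_normal(d)

# "Short-term Hebbian learning": each pair adds a large, rapidly induced
# outer-product change to a fast synaptic weight matrix W.
W = np.zeros((d, d))
for k, v in zip(keys, values):
    W += np.outer(v, k)

# Readout through the fast weights...
hebbian_out = W @ query            # = sum_i v_i (k_i . q)

# ...equals unnormalized linear attention over the same pairs.
scores = keys @ query              # k_i . q
linear_attn_out = values.T @ scores

print(np.allclose(hebbian_out, linear_attn_out))   # True
# Softmax attention would weight the values by softmax(scores) instead,
# which a plain Hebbian sum like this does not reproduce on its own.
```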
adamnemecek about 1 year ago
This is not surprising considering the paper "Hopfield Networks is All You Need" (https://arxiv.org/abs/2008.02217). Hopfield networks learn via Hebbian learning.
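For context, a small sketch of the correspondence that linked paper draws: one retrieval step of a modern continuous Hopfield network is a softmax attention lookup over the stored patterns, with the patterns acting as both keys and values and the inverse temperature beta playing the role of the attention scaling. The dimensions and the value of beta below are arbitrary illustrative choices.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(1)
d, n = 32, 10
patterns = rng.standard_normal((n, d))   # stored patterns (rows)
state = rng.standard_normal(d)           # query / initial state
beta = 2.0                               # inverse temperature (arbitrary)

# Modern Hopfield update: state <- patterns^T softmax(beta * patterns @ state)
attn_weights = softmax(beta * patterns @ state)
retrieved = patterns.T @ attn_weights    # attention-weighted recall, shape (d,)

print(retrieved.shape)
```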