
Hopfield Networks Is All You Need

184 points by meiji163 about 4 years ago

12 comments

einpoklum about 4 years ago
Brief abstract for the lay person (like me):

1. Hopfield Networks are also known as "associative memory networks", a neural network model developed decades ago by a guy named Hopfield.

2. It's useful to plug these in somehow as layers in Deep Neural Networks today (particularly, in PyTorch).

I hate non-informative titles!
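For a sense of what "plugging these in as layers" amounts to, here is a minimal NumPy sketch of the update rule the paper proposes, ξ_new = X softmax(β Xᵀξ), where the columns of X are the stored patterns. Variable names and the value of β are illustrative choices, not taken from the authors' code:

    import numpy as np

    def hopfield_update(X, state, beta=8.0):
        """One retrieval step: attend over stored patterns (columns of X)."""
        logits = beta * (X.T @ state)   # similarity of the state to each pattern
        logits -= logits.max()          # shift for numerical stability
        weights = np.exp(logits)
        weights /= weights.sum()        # softmax over the stored patterns
        return X @ weights              # weighted average of the patterns

    # Store three random patterns, then retrieve from a noisy cue.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((64, 3))    # 3 patterns of dimension 64
    cue = X[:, 0] + 0.3 * rng.standard_normal(64)
    for _ in range(3):                  # typically converges in one step
        cue = hopfield_update(X, cue)
    print(np.allclose(cue, X[:, 0], atol=0.1))  # True: cue snaps to pattern 0

Wrapped in learnable projections of queries, keys, and values, this same update is what the paper identifies with a transformer attention head.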
ArtWomb about 4 years ago
Trending as John Hopfield is scheduled to present his "biologically plausible" response to the Modern Hopfield Network at ICLR next week:

Large Associative Memory Problem in Neurobiology and Machine Learning

https://arxiv.org/abs/2008.06996

MHNs seem ideal for prediction problems based purely on data, such as chemical reactions and drug discovery:

Modern Hopfield Networks for Few- and Zero-Shot Reaction Prediction

https://arxiv.org/abs/2104.03279
kdavis about 4 years ago
“Sooner or later, everything old is new again.” -Stephen King
scrubs about 4 years ago
Quoting: "We introduce a new energy function and a corresponding new update rule which is guaranteed to converge to a local minimum of the energy function."

Is this a minimum in a local area, or local in the range of some function? I could see that perhaps being an advantage if you happen to know that local part of the range.

In contrast, we're usually looking for a global min/max, say with annealing algorithms. How is local better than global in the context of this paper?
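For reference, the energy in question is a function of the query state ξ, with the stored patterns as the columns of X. A sketch from the paper's definitions, with notation lightly simplified:

    E(\xi) = -\operatorname{lse}\bigl(\beta,\, X^{\top}\xi\bigr) + \tfrac{1}{2}\,\xi^{\top}\xi + C,
    \qquad
    \operatorname{lse}(\beta, z) = \beta^{-1}\log\sum_{i=1}^{N} e^{\beta z_i}

    \xi^{\mathrm{new}} = X\,\operatorname{softmax}\bigl(\beta\, X^{\top}\xi\bigr)

Here N is the number of stored patterns and C collects constants (β⁻¹ log N plus ½M², with M bounding the pattern norms). "Local minimum" is local over the state ξ, not over a training loss: each well-separated stored pattern sits in its own basin, the update monotonically decreases E, and retrieval means landing in the basin nearest the query. A global minimum would send every query to the same state, so global optimization in the annealing sense is not what retrieval wants here.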
aparsons about 4 years ago
I’ve seen a lot of efforts to add a notion of associative memory into neural networks. Have any exciting applications of such architectures been publicised?
zibzab about 4 years ago
I looked at the paper but it was way over my head.

Can anyone explain it in simpler terms to a person who barely understands attention models and has no idea what associative memory means here?
mark_l_watson about 4 years ago
Nice paper! I used Hopfield networks in the 1980s. I hope I can clear a few hours this week to work through this. I admit that, for machine learning, I have fallen into the “deep learning for everything” pit in the last six or seven years. Probably because DL is what I usually get paid for.
tediousdemise about 4 years ago
Off-topic, but does anyone know what Jekyll theme this is? Absolutely beautiful formatting and color scheme.
gyre007 about 4 years ago
This reminded me of a very old fun side project of mine [1] that made me look at neural networks from a different perspective.

[1] https://github.com/milosgajdos/gopfield
EVa5I7bHFq9mnYK about 4 years ago
If I understood them correctly, they store all the training samples and then select the one most similar to a given input.
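That reading can be checked directly: the inverse temperature β decides between "select the single most similar sample" and "blend similar samples". A toy sketch, with names and values purely illustrative:

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.standard_normal((32, 5))    # 5 stored samples of dimension 32
    query = X[:, 2] + 0.5 * rng.standard_normal(32)

    for beta in (0.01, 1.0):
        logits = beta * (X.T @ query)
        logits -= logits.max()          # shift for numerical stability
        w = np.exp(logits)
        w /= w.sum()                    # softmax weight per stored sample
        print(beta, np.round(w, 2))     # beta=1.0 is nearly one-hot on sample 2

At high β one update step is essentially nearest-neighbour lookup by inner product; at low β it returns an average of similar samples, which is part of why the paper relates the update to attention rather than to a plain lookup table.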
SneakyTornado29 about 4 years ago
Are*
komalghori22 about 4 years ago
Amazing