Learning Concepts with Energy Functions

166 points, by stablemap, over 6 years ago

4 comments

nerdponx, over 6 years ago

From the abstract of the article they linked:

"Energy-Based Models (EBMs) capture dependencies between variables by associating a scalar energy to each configuration of the variables. Inference consists in clamping the value of observed variables and finding configurations of the remaining variables that minimize the energy. Learning consists in finding an energy function in which observed configurations of the variables are given lower energies than unobserved ones. The EBM approach provides a common theoretical framework for many learning models, including traditional discriminative and generative approaches, as well as graph-transformer networks, conditional random fields, maximum margin Markov networks, and several manifold learning methods.

Probabilistic models must be properly normalized, which sometimes requires evaluating intractable integrals over the space of all possible variable configurations. Since EBMs have no requirement for proper normalization, this problem is naturally circumvented. EBMs can be viewed as a form of non-probabilistic factor graphs, and they provide considerably more flexibility in the design of architectures and training criteria than probabilistic approaches."

Seems like a really interesting unification of the wide variety of techniques out there in statistics and machine learning, analogous to the "everything is a computation graph, as long as it's differentiable" revolution. I like it when this kind of thing has its day. Would be interesting to see how well it works on non-robotics problems.
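The abstract's split between inference (clamp what you observe, minimize energy over the rest) and learning (push observed configurations below unobserved ones) maps directly onto a few lines of code. Here is a minimal sketch, assuming PyTorch; `EnergyNet`, `infer`, and `contrastive_loss` are illustrative names, not anything from the paper:

```python
import torch

class EnergyNet(torch.nn.Module):
    """Assigns a scalar energy to each (x, y) configuration."""
    def __init__(self, x_dim, y_dim, hidden=64):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(x_dim + y_dim, hidden),
            torch.nn.ReLU(),
            torch.nn.Linear(hidden, 1),
        )

    def forward(self, x, y):
        # Lower energy = more compatible (x, y) pair.
        return self.net(torch.cat([x, y], dim=-1)).squeeze(-1)

def infer(model, x, y_dim, steps=50, lr=0.1):
    """Inference: clamp the observed x, gradient-descend the energy over y."""
    y = torch.zeros(x.shape[0], y_dim, requires_grad=True)
    opt = torch.optim.SGD([y], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        model(x, y).sum().backward()
        opt.step()
    return y.detach()

def contrastive_loss(model, x, y_obs, y_neg, margin=1.0):
    """Learning: push observed configurations at least `margin` below unobserved ones."""
    return torch.relu(margin + model(x, y_obs) - model(x, y_neg)).mean()
```

The margin loss here is only one of many training criteria the framework admits; per the quoted abstract, any loss that ranks observed configurations below unobserved ones fits the EBM picture.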
IIAOPSW, over 6 years ago

Setting up some energy function and then finding the lowest energy state sounds a lot like adiabatic quantum computing. Assuming this research lives up to the hype, quantum computers might be able to run this algorithm faster. Quantum machine learning is already a thing, but it's nice to see it fit so congruently with a classical counterpart.
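For the analogy: adiabatic quantum computing encodes the problem in a Hamiltonian and deforms it slowly enough that the system stays in its ground state, while simulated annealing is the usual classical heuristic counterpart. A minimal sketch using only the standard library (all names illustrative):

```python
import math
import random

def simulated_annealing(energy, neighbor, state, t0=1.0, t_min=1e-3, cooling=0.99):
    """Anneal toward a low-energy state by cooling a temperature schedule."""
    t, e = t0, energy(state)
    while t > t_min:
        candidate = neighbor(state)
        e_new = energy(candidate)
        # Accept downhill moves always; uphill moves with Boltzmann probability.
        if e_new < e or random.random() < math.exp((e - e_new) / t):
            state, e = candidate, e_new
        t *= cooling
    return state, e

# Toy usage: find a minimum of a double well with minima at s = +/-1.
best, e = simulated_annealing(
    energy=lambda s: (s * s - 1) ** 2,
    neighbor=lambda s: s + random.gauss(0, 0.1),
    state=5.0,
)
```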
arashout33, over 6 years ago

I have no idea what's going on in this article. Is there a good resource or video for understanding this stuff?
yters, over 6 years ago

Every new technique is able to "quickly learn X." Something must not be so quick, otherwise why aren't these techniques turning into AGI?

I think the problem is that the goal is not well defined, so increased velocity has no bearing on velocity towards the target.

A side question: why is there no research into whether human intelligence is computable? The assumption in AI is that human intelligence is computable, but I've never seen any good argument or evidence that this is true. It seems very unscientific to pour so much energy into this research direction without validating the fundamental assumption.

For example, the one instance I know of that defines AGI in a quantitative manner is Solomonoff induction (SI), but it is not computable. If SI is representative of human intelligence, then AGI is impossible.
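For readers unfamiliar with Solomonoff induction, its prior fits in one line (a standard formulation, not from the linked article):

```latex
% Universal prior of a binary string x under a universal prefix
% machine U: sum over all programs p whose output begins with x.
M(x) = \sum_{p \,:\, U(p) = x\ast} 2^{-|p|}
% Prediction: the probability that x is followed by bit b is
% M(xb) / M(x).
```

The sum is only lower semicomputable: deciding which programs eventually print an extension of x amounts to the halting problem, which is the sense in which SI, taken literally, cannot be run.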