
Ensmallen – Flexible C++ library for efficient mathematical optimization

95 points by forrrealman, over 6 years ago

8 comments

montecarl (over 6 years ago)
I've written several frameworks for optimization and the approach here of defining special objective functions works quite well. I like the look of this and would love to try it out soon.

This is a small thing, but I'm really happy to see that their L-BFGS line search algorithm supports a maximum step size. Many numerical optimization libraries do not offer this simple feature, which is critical if your goal is to find a minimum that is close to the initial starting location. This is important in atomistic geometry optimization.
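
A minimal sketch of the pattern described above, assuming ensmallen's Evaluate()/Gradient() objective-function interface: the SquaredDistance class is invented for illustration, and the MaxStep() accessor and its value are assumptions about the L-BFGS parameters rather than something checked against the current release.

    #include <ensmallen.hpp>

    // Toy differentiable objective: f(x) = ||x - target||^2.
    // (Hypothetical example class, not part of ensmallen itself.)
    class SquaredDistance
    {
     public:
      explicit SquaredDistance(const arma::mat& target) : target(target) { }

      // Objective value at x.
      double Evaluate(const arma::mat& x)
      {
        return arma::accu(arma::square(x - target));
      }

      // Gradient of the objective at x.
      void Gradient(const arma::mat& x, arma::mat& g)
      {
        g = 2.0 * (x - target);
      }

     private:
      arma::mat target;
    };

    int main()
    {
      arma::mat target(3, 1, arma::fill::randu);
      arma::mat x(3, 1, arma::fill::zeros);

      ens::L_BFGS lbfgs;
      // Cap the line-search step so the search stays near the start point
      // (accessor name assumed; the value 0.5 is arbitrary).
      lbfgs.MaxStep() = 0.5;

      SquaredDistance f(target);
      lbfgs.Optimize(f, x);

      x.print("optimum:");
    }

Since ensmallen is header-only, building this should only require the Armadillo headers and a linked BLAS/LAPACK, typically something like g++ -O2 example.cpp -larmadillo.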
cozzyd (over 6 years ago)
In terms of C++ matrix libraries, I've found eigen3 to be better than armadillo (although armadillo is easier to use).
nolps (over 6 years ago)
Github is https://github.com/mlpack/ensmallen
svantana (over 6 years ago)
This is looking very nice, can't wait to try it out. But I feel it's a missed opportunity to not provide some simple examples [1], as well as a benchmark to quickly estimate the amount of flops that can be expected. The main reason to use a library like this one over PyTorch or autograd/scipy is speed, after all.

[1] The test cases can be seen as examples, obviously, but they are not written in an easily accessible form IMHO.

EDIT: Looking more closely, it seems you need to provide an explicit gradient, no autodiff included, which makes it a complete nonstarter for me.
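
As a rough sketch of the kind of quick benchmark asked for above, the example below times a single L-BFGS run on a hand-written 2-D Rosenbrock function with its analytic gradient. The Rosenbrock class and timing harness are ad hoc, not something shipped with the library, and wall-clock time is only a crude proxy for a flops estimate.

    #include <chrono>
    #include <iostream>
    #include <ensmallen.hpp>

    // 2-D Rosenbrock with an analytic gradient, used only as a timing target.
    class Rosenbrock
    {
     public:
      double Evaluate(const arma::mat& x)
      {
        const double a = x(1) - x(0) * x(0);
        const double b = 1.0 - x(0);
        return 100.0 * a * a + b * b;
      }

      void Gradient(const arma::mat& x, arma::mat& g)
      {
        g.set_size(2, 1);
        g(0) = -400.0 * x(0) * (x(1) - x(0) * x(0)) - 2.0 * (1.0 - x(0));
        g(1) = 200.0 * (x(1) - x(0) * x(0));
      }
    };

    int main()
    {
      Rosenbrock f;
      arma::mat x = {{-1.2}, {1.0}};  // standard Rosenbrock starting point

      ens::L_BFGS opt;
      const auto start = std::chrono::steady_clock::now();
      const double finalValue = opt.Optimize(f, x);
      const auto stop = std::chrono::steady_clock::now();

      std::cout << "final objective: " << finalValue << "\n"
                << "wall time: "
                << std::chrono::duration<double, std::milli>(stop - start).count()
                << " ms\n";
      return 0;
    }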
mempko (over 6 years ago)
Looked at the documentation, had to do a double take: "Is this mlpack??? a fork??" Then checked, yup, it's a header-only version of mlpack's optimization library. Very cool.
glalonde (over 6 years ago)
Anyone know how this would compare to IPOPT for constrained, sparse nonlinear optimization?
ipunchghosts (over 6 years ago)
How does this differ from Ceres?
hughes (over 6 years ago)
Sounds like the name is based on the perfectly cromulent word "embiggen" [1], but somewhat confusingly uses the first part of the word "enlarge".

[1] https://www.merriam-webster.com/dictionary/embiggen