科技回声

A technology-news platform built with Next.js, providing global tech news and discussion.


© 2025 科技回声 (Tech Echo). All rights reserved.

Deep Learning outperforms Mathematica on symbolic integration and solving ODEs

2 points by htfy96 over 5 years ago

1 comment

htfy96, over 5 years ago:

Author's original tweet: https://twitter.com/GuillaumeLample/status/1202178956063064064

> Although neural networks struggle on simple arithmetic tasks such as addition and multiplication, we show that transformers perform surprisingly well on difficult mathematical problems such as function integration and differential equations.

> We define a general framework to adapt seq2seq models to various mathematical problems, and present different techniques to generate arbitrarily large datasets of functions with their integrals, and differential equations with their solutions.

> On samples of randomly generated functions, we show that transformers achieve state-of-the-art performance and outperform computer algebra systems such as Mathematica.

> We show that beam search can generate alternative solutions for a differential equation, all equivalent, but written in very different ways. The model was never trained to do this, but managed to figure out that different expressions correspond to the same mathematical object.

> We also observe that a transformer trained on functions that SymPy can integrate is able, at test time, to integrate functions that SymPy is not able to integrate, i.e. the model was able to generalize beyond the set of functions integrable by SymPy.

> A purely neural approach is not sufficient, since it still requires a symbolic framework to check generated hypotheses. Yet, our models perform best on very long inputs, where computer algebra systems struggle. Symbolic computation may benefit from hybrid approaches.