
Why Do Neural Networks Need an Activation Function?

7 points | by strikingloo | almost 6 years ago

2 comments

ml_thoughts, almost 6 years ago
The posted article isn't particularly fascinating, but for a bit of fun, there's an OpenAI project where they demonstrate that, due to the non-linear rounding of Float32 values, you can actually train "non-linear" linear networks: https://openai.com/blog/nonlinear-computation-in-linear-networks/
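The rounding non-linearity this comment refers to is easy to observe directly. The map x ↦ (x · c) · (1/c) is mathematically the identity, but in float32 the intermediate product can underflow to zero for tiny inputs, so the "linear" operation is not actually linear. A minimal NumPy sketch (illustrative only, not the OpenAI post's construction):

```python
import numpy as np

c = np.float32(1e-8)
inv_c = np.float32(1e8)

def scale_unscale(x):
    """Mathematically the identity: (x * c) * (1/c)."""
    return (x * c) * inv_c

# For normal-range values the identity holds (up to rounding):
print(scale_unscale(np.float32(1.0)))    # ~1.0

# For tiny values the intermediate product underflows to 0.0,
# so the same "linear" map destroys the input entirely:
print(scale_unscale(np.float32(1e-40)))  # 0.0
```

Underflow is one of several rounding effects that make float32 arithmetic only approximately linear; the OpenAI post exploits exactly this kind of deviation to compute non-linear functions with nominally linear layers.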
united893, almost 6 years ago
You don't need advanced math to answer this question. If there's no activation function, then the weight matrices of consecutive layers can be multiplied together, and the whole network is just a linear classifier.
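The collapse argument can be verified in a few lines: two weight matrices with nothing between them are equivalent to their single product, while inserting an activation (ReLU is used here purely as an illustration) breaks that equivalence. A quick NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))   # layer 1 weights
W2 = rng.normal(size=(8, 3))   # layer 2 weights
x = rng.normal(size=(5, 4))    # a batch of 5 inputs

# Without an activation, two layers collapse into one linear map:
two_layers = (x @ W1) @ W2
one_layer = x @ (W1 @ W2)
assert np.allclose(two_layers, one_layer)

# With a ReLU between the layers, the collapse no longer works:
relu = lambda z: np.maximum(z, 0.0)
assert not np.allclose(relu(x @ W1) @ W2, one_layer)
```

The same argument extends to any depth: a stack of purely linear layers, however deep, has exactly the expressive power of a single matrix multiplication.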