
Brains scale better than CPUs. So Intel is building brains

20 points by truth_seeker almost 6 years ago

3 comments

arathore almost 6 years ago
Though neuromorphic architectures are interesting, the Loihi is essentially an ASIC for Spiking Neural Networks [1]. The lower power consumption than GPUs is due to the limited nature of computation that can be performed on these chips (which is generally true for ASICs).

[1] https://en.wikichip.org/wiki/intel/loihi
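For readers wondering what the "limited" workload of a spiking-neural-network chip actually looks like, here is a minimal leaky integrate-and-fire sketch in Python. The function name, constants, and random inputs are illustrative assumptions, not Loihi's actual neuron model; the point is that the per-step arithmetic is narrow (decay, accumulate, threshold), which is the kind of restriction that lets an ASIC beat a general-purpose GPU on power.

```python
import numpy as np

def simulate_lif(spike_inputs, weights, dt=1.0, tau=20.0,
                 v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neurons driven by binary input spikes.

    spike_inputs: (timesteps, n_inputs) array of 0/1 spikes.
    weights:      (n_inputs, n_neurons) synaptic weights.
    Returns a (timesteps, n_neurons) array of output spikes.
    """
    n_steps, _ = spike_inputs.shape
    n_neurons = weights.shape[1]
    v = np.zeros(n_neurons)                  # membrane potentials
    out_spikes = np.zeros((n_steps, n_neurons))

    for t in range(n_steps):
        # Leak: potentials decay toward zero each timestep.
        v *= np.exp(-dt / tau)
        # Integrate: incoming spikes add weighted current.
        v += spike_inputs[t] @ weights
        # Fire: neurons crossing threshold emit a spike and reset.
        fired = v >= v_thresh
        out_spikes[t, fired] = 1.0
        v[fired] = v_reset
    return out_spikes

# Toy usage: 100 timesteps, 5 random inputs, 3 neurons.
rng = np.random.default_rng(0)
inputs = (rng.random((100, 5)) < 0.2).astype(float)
w = rng.normal(0.3, 0.1, size=(5, 3))
print(simulate_lif(inputs, w).sum(axis=0))   # spike counts per neuron
```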
billconan almost 6 years ago
I don't quite understand brain simulation. Most articles talk about how neurons work and how they are connected. It seems I could write a simulation easily: https://medium.com/@bennashman/modeling-your-brain-on-a-computer-with-computational-neuroscience-57596c919c70

What I don't get is that the interconnected neurons don't seem to be differentiable. How do they learn? How do you provide feedback to the system? How do you train them the way we train an ANN?
Comment #20481357 not loaded
Comment #20481079 not loaded
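One common answer to billconan's question is that spiking networks are usually not trained by backpropagating through the spikes at all: they use local learning rules such as spike-timing-dependent plasticity (STDP), where a synapse strengthens when the presynaptic spike precedes the postsynaptic one and weakens in the reverse order, so no differentiability is needed. (More recent work also trains SNNs offline with surrogate gradients that replace the spike nonlinearity with a smooth approximation.) Below is a minimal pair-based STDP sketch; the trace formulation, constants, and toy data are illustrative assumptions, not any particular chip's rule.

```python
import numpy as np

def stdp_update(pre_spikes, post_spikes, w, lr_plus=0.01, lr_minus=0.01,
                tau_plus=20.0, tau_minus=20.0, dt=1.0, w_max=1.0):
    """Pair-based STDP: strengthen synapses whose presynaptic spike
    precedes a postsynaptic spike, weaken the reverse ordering.

    pre_spikes:  (timesteps, n_pre)  binary spike trains.
    post_spikes: (timesteps, n_post) binary spike trains.
    w:           (n_pre, n_post) weights, updated in place and returned clipped.
    """
    n_steps = pre_spikes.shape[0]
    x_pre = np.zeros(pre_spikes.shape[1])    # presynaptic traces
    x_post = np.zeros(post_spikes.shape[1])  # postsynaptic traces

    for t in range(n_steps):
        # Traces decay exponentially, then jump on each spike.
        x_pre = x_pre * np.exp(-dt / tau_plus) + pre_spikes[t]
        x_post = x_post * np.exp(-dt / tau_minus) + post_spikes[t]

        # Post spike now, pre spiked recently -> potentiate (LTP).
        w += lr_plus * np.outer(x_pre, post_spikes[t])
        # Pre spike now, post spiked recently -> depress (LTD).
        w -= lr_minus * np.outer(pre_spikes[t], x_post)

    return np.clip(w, 0.0, w_max)

# Toy usage: post fires one step after pre, so the pre->post ordering
# dominates and the single synapse is potentiated on average.
rng = np.random.default_rng(1)
pre = (rng.random((200, 1)) < 0.1).astype(float)
post = np.roll(pre, 1, axis=0)
w = np.full((1, 1), 0.5)
print(stdp_update(pre, post, w))             # weight drifts upward from 0.5
```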
smachiz almost 6 years ago
I look forward to my brain leaking memory to other threads.