
Rain Neuromorphics trained a neural net on an analog chip – an array of memristors

2 points | by hyperluz | over 2 years ago

1 comment

hyperluz · over 2 years ago
"Backpropagation, the training algorithm used almost exclusively in AI systems today, is incompatible with analog hardware since it is sensitive to the small variabilities and mismatches in on-chip analog devices. While compensation techniques have been used to make analog inference chips, these techniques have yet to prove successful for backpropagation-based training. Rain’s approach, which uses activity difference techniques, calculates local gradients instead of backpropagation’s repeated use of global gradients. The technique builds on previous work on equilibrium propagation training algorithms and is mathematically equivalent to backpropagation; in other words, it can be used to train mainstream deep learning networks."