TechEcho
A tech news platform built with Next.js, providing global tech news and discussions.

© 2025 TechEcho. All rights reserved.

50x Speed Improvements on Deep Learning Networks Using Brain-Derived Algorithms

5 points | by monkeypilot, over 4 years ago

2 comments

monkeypilot, over 4 years ago

The end result is a sparse network that more closely "mimics the brain". Is this as significant as it sounds, or is there something I am missing here?

Original paper here: https://numenta.com/assets/pdf/research-publications/papers/Sparsity-Enables-50x-Performance-Acceleration-Deep-Learning-Networks.pdf
p1esk, over 4 years ago

Seems like Numenta is out of ideas about how the brain works. What happened to them?

In this work they simply took a standard neural network, sparsified the weights and activations, trained it with standard gradient descent and backpropagation, and called it a day.

This would have been unthinkable 10 years ago, when they were after biological plausibility of their algorithms.
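The recipe the comment describes — a standard dense network with sparsified weights and activations — can be sketched in a few lines. This is a minimal NumPy illustration of the general idea (magnitude-based weight pruning plus k-winners-take-all activations), not Numenta's actual implementation; the layer sizes and sparsity levels here are arbitrary assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def topk_mask(x, k):
    """Boolean mask keeping the k largest-magnitude entries of x."""
    flat = np.abs(x).ravel()
    if k >= flat.size:
        return np.ones_like(x, dtype=bool)
    thresh = np.partition(flat, -k)[-k]  # k-th largest magnitude
    return np.abs(x) >= thresh

def kwinners(a, k):
    """k-winners-take-all: keep the k largest activations per row, zero the rest."""
    out = np.zeros_like(a)
    idx = np.argsort(a, axis=1)[:, -k:]
    np.put_along_axis(out, idx, np.take_along_axis(a, idx, axis=1), axis=1)
    return out

# A dense 256->128 layer with 80% of the weights pruned to zero.
W = rng.standard_normal((256, 128))
weight_mask = topk_mask(W, k=int(0.2 * W.size))
W *= weight_mask  # sparse weights (the mask would stay fixed during training)

# Forward pass with sparse activations: only 16 of 128 units fire per input.
x = rng.standard_normal((4, 256))
h = kwinners(x @ W, k=16)
```

During training, gradients would simply be multiplied by the same fixed mask, so the pruned weights stay at zero — which is what makes the approach compatible with ordinary backpropagation.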