
“Everything that works works because it's Bayesian” Why Deep Nets Generalize?

40 points by fhuszar almost 8 years ago

2 comments

yakult almost 8 years ago
If a DNN is just a crappy approximation of some kind of Bayesian inference, then where are the better approximations that beat it on all the metrics we care about? And if that magical thing does exist, why aren't people using it to beat the pants off the DNN people and take their lunch money?
wodenokoto almost 8 years ago
I'm surprised about the NN that memorizes the data. I'd imagined there would not be enough units to memorize everything.

But if we have a network that has essentially memorized a random dataset, how is it functionally different from a nearest neighbor algorithm?
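
That question can be made concrete with a small experiment. Here is a minimal sketch (mine, not from the thread), assuming numpy and scikit-learn are available: fit a small MLP on randomly labeled data until it memorizes the training set, then compare its predictions on fresh points against a 1-nearest-neighbor classifier fit on the same data. The sizes, seed, and architecture are arbitrary choices for illustration.

    # Memorizing random labels with an MLP vs. a 1-NN classifier.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 20))        # random inputs
    y = rng.integers(0, 2, size=500)      # random (meaningless) labels

    # A wide-enough MLP can typically drive training error to zero even on noise.
    mlp = MLPClassifier(hidden_layer_sizes=(512,), max_iter=2000).fit(X, y)
    knn = KNeighborsClassifier(n_neighbors=1).fit(X, y)

    print("MLP train accuracy:", mlp.score(X, y))   # ~1.0 once it has memorized
    print("1-NN train accuracy:", knn.score(X, y))  # 1.0 by construction

    # On unseen random points, both models can only extrapolate from the
    # memorized data; the agreement rate measures how similar their induced
    # decision rules actually are.
    X_new = rng.normal(size=(200, 20))
    agreement = np.mean(mlp.predict(X_new) == knn.predict(X_new))
    print("agreement on unseen points:", agreement)

Whether the agreement comes out high or low is an empirical question; the point is only that "memorizes the training set" and "behaves like nearest neighbor away from it" are separable claims you can actually test.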