Toy Models of Superposition (2022)

46 points by ZeljkoS over 1 year ago

4 comments

DigitalNoumena over 1 year ago
I am curious: why isn't the network always in an "interference" state, but sometimes collapses into a "restricted" ONB (orthonormal basis)?

> *the neural networks we observe in practice are in some sense noisily simulating larger, highly sparse networks*

This seems somewhat related to a point made by Ilya Sutskever here [1]: NNs can be thought of as an approximation to the Kolmogorov compressor. Speculating, one could say any network is a projection of the ideal compressor (which arguably represents all n features perfectly in an n-dimensional ONB) into a lower-dimensional space, hence the interference. But why is there not always such interference?

[1] https://www.youtube.com/watch?v=AKMuA_TVz3A
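(A minimal sketch, not from the paper or the talk, of the interference being discussed: if more features than dimensions share an embedding space, the off-diagonal entries of the feature Gram matrix cannot all be zero. All names and sizes here are illustrative.)

```python
import numpy as np

rng = np.random.default_rng(0)

n_features, n_dims = 10, 4  # more features than embedding dimensions

# One n_dims-dimensional direction per feature, normalized to unit length.
W = rng.normal(size=(n_dims, n_features))
W /= np.linalg.norm(W, axis=0)

# Gram matrix: diagonal entries are each feature's self-overlap (1);
# off-diagonal entries are the pairwise "interference" between features
# forced to share the same low-dimensional space.
gram = W.T @ W
interference = np.abs(gram - np.eye(n_features))

print("max pairwise interference:", interference.max())

# If n_features <= n_dims, the columns of W could be chosen orthonormal
# and every off-diagonal entry would be exactly zero -- the "restricted
# ONB" case. With n_features > n_dims, some nonzero interference is
# unavoidable; per the paper, feature sparsity is what makes tolerating
# that interference worthwhile.
```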
jiggawatts over 1 year ago
This is one of the best presentations of such a complex topic, or *any* topic for that matter, that I've seen in a long time!

To the authors, if they happen to find themselves here, I say: bravo!
naillo over 1 year ago
Anthropic is awesome
dustingetz over 1 year ago
Guys. The mobile layout. People consume hobby content on mobile.