TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.


Show HN: Two strange useless things to do with a neural net

22 points, by curuinor, over 6 years ago

3 comments

murftown, over 6 years ago
Cool, I'll bite. I don't know some of your mathy terms like "putative extensive transduction" or "anisotropy," so go easy on me.

But I hear you say things like "Neural nets are nonlinear iterated function systems by construction," and that resonates with me. I've often tried to draw a parallel between the recursive tree structure of a program and the matrix structure of a neural network, which as I understand it is mathematically still a graph of functional nodes feeding into one another; it's just that treating it as a matrix makes the programs much more performant, by using low-level structures like numpy arrays.

In any case, I often think about the tree structure of programs, and whether we could get to a place where programs can analyze, understand, and evolve the abilities of other programs. This could be seen as the old-school "symbolic AI" mindset. And then I think about the similarities and differences between that and the matrix-based neural network revolution of recent years. I wonder about possible homomorphisms and interfaces between those two paradigms.

I don't doubt that I barely skimmed the surface of what you were actually saying. The very general and meta way you were talking about information processing is intriguing to me, and I'd love to understand more. Thanks for sharing!
(Comment #19236418 not loaded.)
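[Editor's note: murftown's point that the node-graph view and the matrix view of a network are the same computation can be sketched in a few lines of plain Python. The weights below are made up for illustration, and the pointwise nonlinearity is an arbitrary choice (sigmoid); this is a toy, not anyone's actual model.]

```python
import math

# Toy 2-layer net evaluated two ways. Weights are arbitrary.
W1 = [[0.5, -0.2], [0.1, 0.3]]  # hidden-layer weights (2x2)
W2 = [0.7, -0.4]                # output-layer weights (1x2)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# View 1: a graph of functional nodes, each node a small function
# of the outputs of the nodes feeding into it.
def node(weights, inputs):
    return sigmoid(sum(w * x for w, x in zip(weights, inputs)))

def net_as_graph(x):
    hidden = [node(row, x) for row in W1]
    return node(W2, hidden)

# View 2: the same computation as matrix-vector products with a
# pointwise nonlinearity, i.e. a composition f2(W2 · f1(W1 · x)).
def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def net_as_matrices(x):
    hidden = [sigmoid(h) for h in matvec(W1, x)]
    return sigmoid(sum(w * h for w, h in zip(W2, hidden)))

x = [1.0, 2.0]
assert abs(net_as_graph(x) - net_as_matrices(x)) < 1e-12
```

The two functions perform identical arithmetic in the same order; the matrix form just exposes the structure that vectorized libraries exploit.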
CormacB, over 6 years ago
> Neural nets are nonlinear iterated function systems by construction. I tend to believe that the progression of the weights in weight space is a slice of another nonlinear iterated function system, also by construction. So I would tend to believe that the overall landscape of the optimization is suffused with directions with positive Lyapunov exponent, because if it's a fractal and an attractor, one considers it a strange attractor and begins suspecting that the dynamical process that creates it is chaotic. But that induces anisotropies in the optimization surface.

Any technology that is sufficiently advanced is indistinguishable from a text generator.
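[Editor's note: the "positive Lyapunov exponent" claim quoted above can be illustrated with the simplest nonlinear iterated function system, the logistic map x → r·x·(1−x). The Lyapunov exponent is the orbit average of log|f′(x)|; positive means nearby trajectories diverge exponentially, the usual signature of chaos. This sketch says nothing about neural-net weight space itself, only about what the term means.]

```python
import math

def lyapunov(r, x0=0.2, n=5000, burn_in=200):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x)."""
    x = x0
    for _ in range(burn_in):          # discard the transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        x = r * x * (1.0 - x)
        d = abs(r * (1.0 - 2.0 * x))  # |f'(x)| at the current point
        total += math.log(max(d, 1e-12))  # floor avoids log(0) at the critical point
    return total / n

# r = 4: chaotic regime, exponent is positive (analytically ln 2).
assert lyapunov(4.0) > 0
# r = 2.5: orbit settles on a stable fixed point, exponent is negative.
assert lyapunov(2.5) < 0
```

A positive exponent is what makes the attractor "strange": the iterated map keeps stretching nearby points apart while folding them back into a bounded, fractal set.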
curuinor, over 6 years ago
As I said on the repo, I welcome discussion for a while, probably until end of day today on HN, but I stay bookmarked on the SA CSPAM thread, so you can ask me questions there whenever. This is probably the only way to get me to actually explain the thing in plain English; I gave up trying to preemptively explain things.