
Algebraic Simplification of Neural Nets

3 points by daly about 1 year ago
It seems that there should be a way to algebraically simplify neural nets.

Trivially, a node that has a zero weight can be removed, as can any links to/from that node.

It should also be possible to eliminate nodes that have a full-value weight (i.e. '1' on the 0-1 scale).

I have also seen work where the matrix multiplies during training can have columns "collapsed".

The ultimate question might be applying an "algebraic simplification" to a post-trained network used for inference.

The idea is to take a path through a network, construct the equation for that path, and reduce it to a shorter path by combining nodes and weights.

It is certain that a node participates in several (hundred?) paths. In this case it might be useful to "copy" the node so it can be part of a path reduction without affecting other paths.

I believe that in theory some neural networks can be reduced to a single hidden layer [1]. The game would be to algebraically reduce network depth.

[1] Lee et al., "On the ability of neural nets to express distributions", https://arxiv.org/pdf/1702.07028.pdf (2017)
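A minimal NumPy sketch of two of these simplifications, under a big assumption: the layers are purely linear (biases and nonlinearities omitted), which is the only case where the algebra is exact. The helper names are illustrative, not from any library. It shows removing a hidden unit whose weights are zero, and collapsing a linear path into a single matrix:

    import numpy as np

    def prune_zero_units(W_in, W_out, tol=1e-12):
        # A hidden unit contributes nothing if its incoming column in W_in
        # is all zero OR its outgoing row in W_out is all zero. This is
        # exact for a linear layer; with a nonlinearity f where f(0) != 0,
        # only the zero-outgoing case is safe to remove.
        dead = (np.abs(W_in).max(axis=0) < tol) | (np.abs(W_out).max(axis=1) < tol)
        keep = ~dead
        return W_in[:, keep], W_out[keep, :]

    def collapse_linear_path(*Ws):
        # (x @ W1) @ W2 == x @ (W1 @ W2): consecutive linear layers are
        # algebraically one layer, so a whole linear path reduces to one
        # matrix -- the "combine nodes and weights along a path" idea.
        out = Ws[0]
        for W in Ws[1:]:
            out = out @ W
        return out

    rng = np.random.default_rng(0)
    W0 = rng.normal(size=(8, 16))   # input -> hidden
    W1 = rng.normal(size=(16, 4))   # hidden -> output
    W0[:, 3] = 0.0                  # hidden unit 3: all-zero incoming weights

    W0p, W1p = prune_zero_units(W0, W1)        # 16 hidden units -> 15
    collapsed = collapse_linear_path(W0p, W1p) # one (8, 4) matrix

    x = rng.normal(size=(5, 8))
    assert np.allclose(x @ W0 @ W1, x @ collapsed)

The collapse step is what breaks once a nonlinearity sits between the layers, which is presumably why reducing depth in general, as in [1], takes more than pure algebra.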

no comments
