Graph Neural Networks use graphs when they shouldn't

130 points by Pseudomanifold over 1 year ago

4 comments

ajb117 over 1 year ago

Besides the title being a bit exaggerated, it's been long acknowledged in the GNN community that graph rewiring often helps learning; see [1, 2] for example. Also, at first skim, I'm surprised there's no discussion of oversmoothing/oversquashing in this paper. It seems like what they're calling "overfitting" on the graph structure could be interpreted as the GNN's node representations saturating/starving because of poor graph topology.

[1] - https://arxiv.org/abs/2111.14522
[2] - https://arxiv.org/abs/2210.02997
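The rewiring idea can be sketched in a few lines. This is a hypothetical toy version (all names and the similarity criterion are made up for illustration) that adds edges between feature-similar non-neighbors; the papers cited above use more principled criteria, such as curvature, rather than raw feature similarity:

```python
import numpy as np

def rewire_by_similarity(adj, feats, k=1):
    """Toy graph rewiring: connect each node to its k most feature-similar
    non-neighbors (cosine similarity). Illustrative only -- not the method
    from the cited papers."""
    n = adj.shape[0]
    unit = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    sim = unit @ unit.T                     # pairwise cosine similarity
    new_adj = adj.copy()
    for i in range(n):
        cand = sim[i].copy()
        cand[i] = -np.inf                   # never self-loop
        cand[adj[i] > 0] = -np.inf          # skip existing neighbors
        for j in np.argsort(cand)[::-1][:k]:
            if np.isfinite(cand[j]):        # only genuinely new edges
                new_adj[i, j] = new_adj[j, i] = 1.0
    return new_adj
```

On a 4-node path graph where the two endpoints have identical features, this adds a shortcut edge between them, which is exactly the kind of bottleneck-bypassing edge rewiring is meant to create.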
eachro over 1 year ago

Isn't the natural solution to just use attention layers instead of graph convolution layers then? The attention mechanism will end up learning the useful graph structure via the attention weights?
[Comment #37572590 not loaded]
[Comment #37598239 not loaded]
[Comment #37571985 not loaded]
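The contrast eachro is drawing can be sketched in plain numpy: a GCN-style layer aggregates only over the fixed adjacency matrix, while a self-attention layer lets every node attend to every other node, so the effective "graph" lives in the learned attention weights. A minimal sketch (layer shapes and names are assumptions, not from the paper):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gcn_layer(adj, feats, w):
    # Graph convolution: mean-aggregate features over the *fixed* neighborhood.
    a_hat = adj + np.eye(adj.shape[0])          # add self-loops
    deg = a_hat.sum(axis=1, keepdims=True)      # per-node degree
    return (a_hat / deg) @ feats @ w

def attention_layer(feats, wq, wk, wv):
    # Full self-attention: no adjacency input at all; every node can attend
    # to every other node, and the attention weights play the role of a
    # learned, soft adjacency matrix.
    q, k, v = feats @ wq, feats @ wk, feats @ wv
    weights = softmax(q @ k.T / np.sqrt(k.shape[1]))  # rows sum to 1
    return weights @ v
```

The trade-off is that the attention layer pays O(n²) for ignoring the given graph, which is part of why graph transformers and adjacency-masked attention variants exist in between these two extremes.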
tbalsam over 1 year ago

I'm curious how this relates to MDL-tied regularization. I don't see any explicit reference to the L2 norm here, and I'm not too smart about graph neural networks.

But it seems like overfitting is almost necessary unless some kind of obscure variational technique is used? They did have one paper where they used a progressively compressed GNN to isolate an approximation for a cosmological equation smaller and more accurate than had previously been achieved. I'll update here if I find it.

Edit: Ah, here we are: https://arxiv.org/abs/2006.11287 I believe there was some very good follow-up work on it posted this year, which I saw during the 3-5 months that I was really active on Twitter (brief as that time was). It seems like an extremely promising area, and one I should learn if I truly do care about the Kolmogorov complexity of a problem.
dpflan over 1 year ago

Anyone working with GNNs have any experience with this? Has it affected your work?
[Comment #37572418 not loaded]
[Comment #37572134 not loaded]
[Comment #37573219 not loaded]
[Comment #37572156 not loaded]
[Comment #37575316 not loaded]