TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.


© 2025 TechEcho. All rights reserved.

Factor Graphs and Tensor Networks

1 point by cgadski 12 months ago

1 comment

cgadski 12 months ago
Tensor network notation is really useful for differentiating with respect to tensors. For example, where F is a real function of a matrix variable, think of how you'd differentiate F(A X) with respect to X. Conceptually this is easy, but I used to have to slow down to write it in Einstein notation. Thinking in terms of tensor diagrams, I just see F', X and A strung together in a triangle. Differentiating with respect to X means removing it from the triangle. The dangling edges are the indices of the derivative, and what's left is a matrix product of A and F' along the index that doesn't involve X.

This blog post made me realize that tensor diagrams are the same as the factor graphs we talk about in random field theory. Indices of a tensor network become variables of a factor graph, and tensors become factors. The contraction of a tensor network with positive tensors is the partition function of a corresponding field, and so on.
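Both claims in the comment are easy to check numerically. The sketch below (an illustration, not from the original post; the choice of F and the factor shapes are my own assumptions) verifies that for a scalar F of a matrix argument, the gradient of F(A X) with respect to X is Aᵀ F'(A X), and that contracting a network of positive tensors over all indices equals a brute-force partition-function sum.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
X = rng.standard_normal((3, 5))

# Example scalar function F(M) = sum(sin(M)); its matrix of partials
# dF/dM_ij is cos(M) elementwise.
def F(M):
    return np.sin(M).sum()

def F_prime(M):
    return np.cos(M)

# Tensor-diagram result: remove X from the triangle and contract A with F'
# along the index that doesn't involve X, i.e. grad = A^T F'(A X).
grad_analytic = A.T @ F_prime(A @ X)

# Central finite-difference check, entry by entry.
eps = 1e-6
grad_numeric = np.zeros_like(X)
for i in range(X.shape[0]):
    for j in range(X.shape[1]):
        E = np.zeros_like(X)
        E[i, j] = eps
        grad_numeric[i, j] = (F(A @ (X + E)) - F(A @ (X - E))) / (2 * eps)

assert np.allclose(grad_analytic, grad_numeric, atol=1e-5)

# Second claim: contracting positive tensors gives a partition function.
# A 3-variable chain x1 - x2 - x3 with two positive pairwise factors:
phi = rng.random((2, 2)) + 0.1
psi = rng.random((2, 2)) + 0.1

# Full contraction of the network (sum over every index) ...
Z_contract = np.einsum('ij,jk->', phi, psi)

# ... equals the brute-force sum over all variable assignments.
Z_brute = sum(phi[a, b] * psi[b, c]
              for a in range(2) for b in range(2) for c in range(2))
assert np.isclose(Z_contract, Z_brute)
```

The finite-difference loop is deliberately naive; the point is only that the one-line diagrammatic answer `A.T @ F_prime(A @ X)` matches it.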