This was a great YouTube video on the subject: <a href="https://www.youtube.com/watch?v=a-767WnbaCQ" rel="nofollow">https://www.youtube.com/watch?v=a-767WnbaCQ</a><p>A submission to 3blue1brown's SoME (Summer of Math Exposition) competition.
I was recently struggling with the best way to randomly construct a well-connected, recurrent topology of neurons until I encountered percolation theory.<p>There is a basic natural-log scaling rule that essentially guarantees a well-connected topology (even with random connections) as long as you assign a minimum number of connections to each element: roughly ln(n) connections per node, the classic Erdős–Rényi connectivity threshold.<p>The required fanout at each order of magnitude of network size goes something like:<p><pre><code>             10:  ~3 connections
            100:  ~5 connections
          1,000:  ~7 connections
         10,000: ~10 connections
        100,000: ~12 connections
      1,000,000: ~14 connections
100,000,000,000: ~26 connections
</code></pre>
I've been able to avoid a lot of complicated code by leveraging this.
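To make the rule concrete, here is a minimal sketch in plain Python (the helper names build_random_topology and is_connected are just illustrative, not from any library): each node gets at least ceil(ln(n)) random partners, and a BFS then verifies reachability. Above the ln(n) threshold the check almost always passes.<p><pre><code>import math
import random
from collections import deque

def build_random_topology(n, rng=random):
    """Give each node at least ceil(ln(n)) random partners,
    matching the natural-log scaling rule described above."""
    k = max(1, math.ceil(math.log(n)))
    adj = [set() for _ in range(n)]
    for i in range(n):
        while len(adj[i]) < k:
            j = rng.randrange(n)
            if j != i:
                adj[i].add(j)
                adj[j].add(i)  # treat links as undirected for the reachability check
    return adj

def is_connected(adj):
    """BFS from node 0; the graph is connected iff every node is reached."""
    seen, queue = {0}, deque([0])
    while queue:
        for v in adj[queue.popleft()] - seen:
            seen.add(v)
            queue.append(v)
    return len(seen) == len(adj)

adj = build_random_topology(10_000)  # k = ceil(ln 10000) = 10, matching the table
print(is_connected(adj))             # True with high probability
</code></pre>
Running this repeatedly at n = 10,000 with k = 10 almost never produces a disconnected graph, while dropping k well below ln(n) quickly starts leaving isolated components.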
Christensen is one of the authors of "Complexity and Criticality", which I highly recommend to anyone interested in percolation theory, the Ising model, self-organized criticality, and related models [0].<p>[0] <a href="https://www.worldscientific.com/worldscibooks/10.1142/p365#t=aboutBook" rel="nofollow">https://www.worldscientific.com/worldscibooks/10.1142/p365#t...</a>
Awesome primer on percolation: <a href="https://www.youtube.com/watch?v=a-767WnbaCQ" rel="nofollow">https://www.youtube.com/watch?v=a-767WnbaCQ</a>
Relevant to the study of phase transitions in machine learning; e.g., <a href="https://openreview.net/forum?id=0pLCDJVVRD" rel="nofollow">https://openreview.net/forum?id=0pLCDJVVRD</a>