It would be interesting to view the evolution over long periods of time.

This simulation is 2D, but it's similar to what happens in globular star clusters, where there's a phenomenon called the "gravothermal catastrophe".

The particles (stars, sponge bits) relax to thermal equilibrium, where a particle's kinetic energy follows a distribution whose probability declines exponentially in energy/temperature. Some of the particles will have enough energy to escape to infinity (to "evaporate"). When they leave, the remaining particles are more tightly bound, so the cluster shrinks. The particles then move faster (by the virial theorem, the total kinetic energy is always 1/2 the negative of the gravitational potential energy). Evaporation accelerates until the cluster basically explodes.

Why this doesn't happen to actual star clusters was eventually traced to three-body encounters that form binary stars; the binaries then inject energy into passing stars (their own orbits shrinking in the process). This energy injection reheats the cluster, inflating it again and preventing runaway evaporation.

I'm not clear whether the simulation here can handle formation of such binaries.
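For anyone who wants the bookkeeping behind that virial-theorem step (this just restates the argument above, no new assumptions):

    Virial theorem for a relaxed, bound cluster:   2K + U = 0
    Total energy:                                  E = K + U = -K
    Evaporating stars carry energy away, so E drops (becomes more negative),
    which forces K up: the cluster heats as it loses energy (a negative heat
    capacity), and that feedback is the runaway.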
Correct me if I'm wrong, but they're probably defining initial conditions for the particles: positions (to draw the SpongeBob shape, although I wish there were a way to convert an image into these n bodies), velocities, and masses.
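For the image-to-particles wish, a minimal sketch of one common approach: threshold the image and sample particle positions from its dark pixels. This assumes Pillow and NumPy; the function name and parameters here are just illustrative, not from the original project.

    import numpy as np
    from PIL import Image

    def image_to_particles(path, n_particles=2000, threshold=128):
        """Sample particle positions from the dark pixels of an image."""
        img = np.asarray(Image.open(path).convert("L"))        # grayscale array
        ys, xs = np.nonzero(img < threshold)                    # dark pixels form the shape
        idx = np.random.choice(len(xs), size=n_particles, replace=True)
        pos = np.column_stack([xs[idx], -ys[idx]]).astype(float)  # flip y so it isn't upside down
        pos -= pos.mean(axis=0)                                  # center on the origin
        return pos

You'd then assign masses and small (or zero) initial velocities to the sampled positions and hand them to the integrator.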
Then you set up the gravitational interactions between them and iteratively update their positions and velocities over time using some numerical integration method (Euler's method? Runge-Kutta?) to simulate their motion; a rough sketch of what that loop might look like is below.
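Something like this, assuming a plain O(N^2) direct-sum force with softening and a leapfrog (kick-drift-kick) step; I don't know which integrator the project actually uses:

    import numpy as np

    def leapfrog_step(pos, vel, mass, dt, G=1.0, eps=1e-3):
        """One kick-drift-kick step. pos, vel: (N, 2); mass: (N,).
        eps softens close encounters so the force stays finite."""
        def accel(p):
            d = p[None, :, :] - p[:, None, :]           # d[i, j] = vector from particle i to j
            r2 = (d ** 2).sum(axis=-1) + eps ** 2        # softened squared distances
            np.fill_diagonal(r2, np.inf)                 # no self-force
            return (G * mass[None, :, None] * d / r2[..., None] ** 1.5).sum(axis=1)

        vel = vel + 0.5 * dt * accel(pos)                # half kick
        pos = pos + dt * vel                             # drift
        vel = vel + 0.5 * dt * accel(pos)                # half kick
        return pos, vel

Note that a fixed-timestep scheme like this can't resolve very tight binaries, which ties back to the question above about whether the simulation can capture binary formation at all.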
This is cool!

...but 1 second per time step is a lot. I wonder how fast it would've been if it wasn't in Python. I think we as a society are doing a whole lot of people (especially physicists) a disservice by mainly teaching them Python rather than languages which are literally hundreds of times faster in some cases. Python works well when you just want to glue together existing fast C or Fortran libraries with Python APIs, but it quickly proves limiting.

I've personally been caught by the Python trap, where the easiest way to do something was to write a Python script, and it worked, but then I wanted to process more data or whatever, and suddenly Python was a huge limiting factor. I then spent more time parallelizing the Python code to make it run faster, and it became a beast that's hard to debug, maxes out 32 CPU cores, and is still 10x slower than what a single-threaded Rust program would've been, and I regretted my choice of language.

EDIT: Also, this is in no way anti-Python; I think it's a nice language and there are many uses where it is wholly appropriate.