I worked on this movie; I was at DNEG at the time. One of the standout things I remember is that this particular simulation was toxic to the fileserver it was being stored on.<p>From what I recall, it wasn't running on that many machines at once, mainly because it required the high-memory nodes, which were expensive. I <i>think</i> it was only running on ~10, possibly 50, machines concurrently. But I could be wrong.<p>What it did have was at least one dedicated fileserver, though. Each of the file servers at the time was some dual-proc Dell 1U thing with as much RAM as you could stuff into it at the time (384 gigs, I think). They were attached by SAS to a single 60-drive 4U RAID array (a Dell PowerVault MD3460 or something along those lines; they were rebadged by Dell and were the first practical hotswap enclosure that took normal 3.5" SAS drives and didn't cost the earth).<p>The array was formatted into 4 RAID6 groups, LVM'd together on the server, and then shared out by NFS over bonded 10gig links.<p>Anyway. That simulation totally fucked the disks in the array. By the time it finished (I think it was a 2-week run time) it had eaten something like 14 hard drives. Every time a new disk was inserted, another would start to fail. It was that close to fucking up the whole time.<p>I had thought the simulation was a plugin for Houdini, or one of the other fluid-simulation engines we had kicking around, rather than a custom 40k-line C++ program.
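For anyone curious what "4 RAID6 groups, LVM'd together, shared out by NFS" looks like in practice, here's a rough sketch. All the device names, volume names, export paths, and mount options are invented for illustration; the hardware RAID6 groups would actually be carved out in the MD3460's own controller, not in Linux.

```shell
# Hypothetical reconstruction of the setup described above.
# The array presents its four RAID6 groups to the host as block devices:
pvcreate /dev/mapper/raid6_a /dev/mapper/raid6_b /dev/mapper/raid6_c /dev/mapper/raid6_d
vgcreate vg_sim /dev/mapper/raid6_a /dev/mapper/raid6_b /dev/mapper/raid6_c /dev/mapper/raid6_d

# Join the four groups into one big logical volume and put a filesystem on it:
lvcreate -l 100%FREE -n lv_sim vg_sim
mkfs.xfs /dev/vg_sim/lv_sim
mount /dev/vg_sim/lv_sim /export/sim

# Export over NFS (the traffic then rides whatever bonded 10GbE
# interface the server has, e.g. an LACP bond0):
echo '/export/sim *(rw,async,no_root_squash)' >> /etc/exports
exportfs -ra
```

One property of this layout worth noting: because LVM just concatenates or stripes the groups, a double-disk failure plus one more in any single RAID6 group takes out the whole volume, which is why losing 14 drives over two weeks was such a close call.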
There was a pretty good talk[1] on this at SIGGRAPH 2015. Speakers included Kip Thorne, who later won the Nobel Prize.
I tried to find any recordings but had no luck. (Presumably it's on the conference DVD)<p>[1] <a href="https://history.siggraph.org/learning/double-negative-presents-the-visual-effects-of-interstellar-by-thorne-franklin-james-and-tunzelmann/" rel="nofollow">https://history.siggraph.org/learning/double-negative-presen...</a>
ScienceClic did an excellent video about the accuracy with a better recreation:
<a href="https://youtu.be/ABFGKdKKKyg" rel="nofollow">https://youtu.be/ABFGKdKKKyg</a>
If you want to avoid opening Twitter (now known as X), here is the link to the paper: <a href="https://arxiv.org/pdf/1502.03808" rel="nofollow">https://arxiv.org/pdf/1502.03808</a><p>TL;DR:
“A typical IMAX image has 23 million pixels, and for Interstellar we had to generate many thousand images, so DNGR had to be very efficient. It has 40,000 lines of C++ code and runs across Double Negative’s Linux-based render-farm. Depending on the degree of gravitational lensing in an image, it typically takes from 30 minutes to several hours running on 10 CPU cores to create a single IMAX image. Our London render-farm comprises 1633 Dell-M620 blade servers; each blade has two 10-core E5-2680 Intel Xeon CPUs with 156GB RAM. During production of Interstellar, several hundred of these were typically being used by our DNGR code.”
I could do it with Photoshop, two cans of Red Bull, and a 5K USD payment in my account.<p>Jokes aside, this is the best sci-fi movie ever! It’ll become one of those movies one’s got to rewatch every 5 years or so.
There’s an interesting discussion related to this in this video: <a href="https://youtube.com/watch?v=Z4oy6mnkyW4" rel="nofollow">https://youtube.com/watch?v=Z4oy6mnkyW4</a>
It is a pity they didn't spend comparable effort on the story. I thought it was a really disappointing film, especially after they banged on so much about how good the physics in it was (apart from the black hole rendering, it wasn't).