If I'm reading that graph right, the red line is at 10k msec, i.e. 10 seconds?<p>Incremental build times are near and dear to my heart; I spent a lot of time making the Chrome incremental build fast, resulting in this tool: <a href="http://martine.github.io/ninja/" rel="nofollow">http://martine.github.io/ninja/</a> . In developing Ninja I was surprised to discover that Linux stat() with a warm disk cache is very fast -- well under 100ms to stat the ~40k source files Chrome uses in its build (see the "node stat" lines here: <a href="https://github.com/martine/ninja/wiki/Timing-Numbers" rel="nofollow">https://github.com/martine/ninja/wiki/Timing-Numbers</a> ). At its best point I think we got the one-file-changed build/compile/link cycle of Chrome (a ~70MB C++ binary) to around 5 seconds.<p>Of course, Facebook's problem is surely very different -- at their scale there could be many more files, and perhaps the programs their engineers run while developing cause their disk caches to flush more frequently. Just found it interesting to worry about the cost of stats.
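<p>If you want to check the warm-cache stat() claim on your own machine, here's a rough sketch (not Ninja's actual code -- just an illustration): create a batch of stub files, stat them once to warm the dentry/inode caches, then time a second pass. Scale N up toward ~40k to approximate a Chrome-sized build graph.

```python
import os
import tempfile
import time

# Rough sketch of measuring warm-cache stat() throughput.
# N is kept small here; raise it toward ~40k to mimic Chrome's source tree.
N = 1000

with tempfile.TemporaryDirectory() as d:
    paths = []
    for i in range(N):
        p = os.path.join(d, f"src_{i}.cc")
        with open(p, "w") as f:
            f.write("// stub\n")
        paths.append(p)

    # First pass warms the kernel's dentry/inode caches.
    for p in paths:
        os.stat(p)

    # Time the warm second pass -- this is the cost a build tool pays
    # on every no-op or one-file-changed incremental build.
    start = time.perf_counter()
    for p in paths:
        os.stat(p)
    elapsed = time.perf_counter() - start

print(f"stat() x {N}: {elapsed * 1000:.2f} ms "
      f"({elapsed / N * 1e6:.2f} us per file)")
```

On a typical Linux box the per-file cost lands in the low single-digit microseconds, which is consistent with ~40k stats fitting well under 100ms.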