I enjoyed this article a lot.

One thing that seemed glossed over (so maybe it's obvious for their use case) is the trade-off between compressing once and distributing many times.

When looking at how long it takes to compress vs. transmit, the optimisation was done to make the sum of the two as small as possible: minimise(time(compress) + time(transmit)).

Instead, it seems like what you want is: minimise(time(compress) + expected_transmissions * time(transmit)).

For any reasonable number of distributed copies of a game, the transmit term will quickly come to dominate the total time involved.

I suspect, however, that compression time grows extremely quickly for not much additional compression, so the potential improvement is probably tiny even if you expect to be transmitting to millions of clients.
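To make that concrete, here's a rough sketch of the weighted objective, with entirely made-up compression times, sizes, and link speed (none of these numbers are from the article). The point is just that the optimal compression level shifts once the transmit term is multiplied by the expected number of downloads:

    # Each tuple: (compression level, compress time in seconds, compressed size in GB).
    # All figures are illustrative assumptions, not measurements.
    levels = [
        ("fast",   60,   12.0),
        ("medium", 600,  10.5),
        ("slow",   7200, 10.0),
    ]

    LINK_GBPS = 1.0  # assumed per-client download speed (Gbit/s)

    def transmit_seconds(size_gb):
        # Time to send one copy over the assumed link.
        return size_gb * 8 / LINK_GBPS

    for n_clients in (1, 1_000, 1_000_000):
        best = min(levels,
                   key=lambda l: l[1] + n_clients * transmit_seconds(l[2]))
        print(f"{n_clients:>9} clients -> {best[0]}")

With these particular numbers, "fast" wins for a single transfer, "medium" at a thousand, and "slow" at a million, which is the shift in optimum the weighted objective is meant to capture. Whether that shift is worth anything in practice depends on how flat the size-vs-time curve really is at the top end.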