<p><pre><code> However, although the resulting file size is smaller,
 once gzipped it actually leads to larger files than
 by only using minification, therefore this is a
 totally useless article.
</code></pre>
No, it's not. Mobile devices have a relatively small cache size, and assets are cached in non-gzipped form, so the non-gzipped file size does matter (on mobile, that is).
The problem might be NP-hard, but with modern commercial (CPLEX, Gurobi, Xpress) and open-source (Cbc, SCIP) solvers for discrete optimization, you can often find a provably optimal solution in a relatively short time, even for instances with thousands of nodes.<p>The disadvantage of genetic algorithms is that they are heuristics, with no guarantees on the quality of the solution found. Exact solvers such as those above also provide an estimate of how far you are from the optimal solution (the optimality gap) if you decide to stop them before they terminate.
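To illustrate the incumbent-plus-bound idea, here is a self-contained toy branch-and-bound for a 0/1 knapsack, standing in for both a real solver (CBC, SCIP, etc.) and the actual CSS problem; all names are illustrative. Because it tracks a best-so-far solution and a fractional upper bound, stopping it early would still leave you with a provable optimality gap, which is exactly what a genetic algorithm cannot give you.

```python
def knapsack_bb(values, weights, capacity):
    """Exact 0/1 knapsack via branch-and-bound (toy illustration)."""
    # Sort items by value density so the fractional bound is tight.
    items = sorted(zip(values, weights),
                   key=lambda vw: vw[0] / vw[1], reverse=True)
    best = 0  # incumbent: best feasible value found so far

    def bound(i, value, room):
        # LP-relaxation upper bound: greedily fill remaining capacity,
        # allowing one fractional item at the end.
        for v, w in items[i:]:
            if w <= room:
                value, room = value + v, room - w
            else:
                return value + v * room / w
        return value

    def search(i, value, room):
        nonlocal best
        best = max(best, value)
        if i == len(items) or bound(i, value, room) <= best:
            return  # prune: this subtree cannot improve the incumbent
        v, w = items[i]
        if w <= room:
            search(i + 1, value + v, room - w)  # branch: take item i
        search(i + 1, value, room)              # branch: skip item i

    search(0, 0, capacity)
    return best
```

At any point, `bound(0, 0, capacity) - best` is a certificate of how far the incumbent can possibly be from the optimum; production solvers report exactly this kind of gap.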
What about rerunning the genetic algorithm with the fitness function being the size of the final minified+gzipped file? Maybe that could, counter-intuitively, reorder the CSS so that there are more long runs of repetitive rules, resulting in better compression of the final file.
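If you wanted to try this, a minimal sketch in Python, using a simple (1+1) hill-climber with swap mutations as a stand-in for a full genetic algorithm, and the gzipped size of the concatenated rules as the fitness (all names and parameters here are illustrative, not from the article):

```python
import gzip
import random

def gzipped_size(rules):
    # Fitness as the comment proposes: the size of the reordered
    # stylesheet after gzip, not the raw minified size.
    return len(gzip.compress("".join(rules).encode()))

def mutate(order):
    # Swap two random rules to produce a neighboring ordering.
    a, b = random.sample(range(len(order)), 2)
    order = order[:]
    order[a], order[b] = order[b], order[a]
    return order

def optimize(rules, iters=2000, seed=0):
    # (1+1) hill-climbing: keep the candidate whenever it gzips
    # at least as small as the current best ordering.
    random.seed(seed)
    best, best_size = rules[:], gzipped_size(rules)
    for _ in range(iters):
        cand = mutate(best)
        size = gzipped_size(cand)
        if size <= best_size:
            best, best_size = cand, size
    return best, best_size
```

Since the search starts from the original ordering and only accepts non-worsening moves, the result can never be larger after gzip than the input; whether it finds the long repetitive runs you describe depends on the rules.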
Is the best practice for serving .css files to gzip them? Is this done at the file level, or on the fly by the web server? In either case, why is it considered best practice? (I ask as a sysadmin, not a web developer.)