Author of crass here.<p>I wouldn't recommend doing what the author suggests. crass doesn't squeeze out extra bytes when you reprocess its output because it already does that for you on the first pass.<p>If you combine multiple minifiers, the bugs in one minifier can end up being amplified by the others. For instance, one minifier might not consider !important when moving around CSS. Another minifier might then take that output and perform an optimization that's no longer just unsafe, but now incorrect. It might even delete rulesets that it believes are not used, ruining your stylesheet.<p>There are suites that test the correctness of minifiers, and many don't do great. CSS still "works" when the syntax is invalid, so invalid stylesheets produce undefined output when you minify. Between bugs and undefined behavior, I wouldn't recommend mixing and matching just to save one or two TCP packets, especially with gzip/brotli on the wire.
I worry that this is the chaos that comes from folks not understanding compiler theory. This is the result: every minifier is a compiler, but none of these minifiers boasts properties like idempotence, and none of them is sufficiently correct to achieve its goal on the first try. I don't necessarily expect scholarly papers, but I would never expect a minifier to improve on its own output when run multiple times in a row!
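To make "idempotence" concrete: an idempotent minifier changes nothing when run on its own output. A rough way to check that, sketched in Python (the minify-tool command is a placeholder, not any real CLI):

    # Rough idempotence check: minify once, minify the output again, compare.
    # "minify-tool" is a placeholder command, not a real tool.
    import subprocess

    def minify(css):
        return subprocess.run(["minify-tool"], input=css, capture_output=True,
                              text=True, check=True).stdout

    original = open("styles.css").read()
    once = minify(original)
    twice = minify(once)
    print("idempotent" if once == twice else "second pass changed the output")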
Your site is very nicely styled and easy to read... I like how large and legible and clean-looking the font is.<p>Also, your writing style is funny, nice work.
If you really want to get fancy you could use Selenium to get screenshots and compare them to check that the remynified CSS produces the same layout as the original CSS.<p><a href="http://www.seleniumhq.org/" rel="nofollow">http://www.seleniumhq.org/</a>
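A minimal sketch of that with Python's Selenium bindings might look like the following (the two URLs are hypothetical, and it assumes chromedriver is installed and both pages are served locally):

    # Render the same page with the original and remynified CSS and compare screenshots.
    from selenium import webdriver

    URLS = {
        "original": "http://localhost:8000/index-original.html",
        "remynified": "http://localhost:8000/index-remynified.html",
    }

    driver = webdriver.Chrome()
    driver.set_window_size(1280, 1024)  # fixed size so the screenshots are comparable

    shots = {}
    for name, url in URLS.items():
        driver.get(url)
        shots[name] = driver.get_screenshot_as_png()
    driver.quit()

    # Byte-for-byte equality is the strictest check; a real pipeline would
    # probably diff the decoded pixels and allow a small tolerance.
    print("identical" if shots["original"] == shots["remynified"] else "layouts differ")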
Transforming .a-really-long-class-name-or-id into something shorter, like .x, would save a lot more bytes.<p>Another thing is stripping out unnecessary properties, which can save even more bytes, but with web apps becoming so dynamic right now, that would be hard or really awkward for developers to work with too.
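As a toy illustration of the renaming idea (the map and CSS below are made up; a real renamer needs a proper CSS parser and also has to rewrite every HTML/JS reference to the same classes, which is the hard part):

    # Toy illustration only: rename known class selectors to short ones and count bytes.
    import re

    rename_map = {"a-really-long-class-name-or-id": "x",
                  "another-long-component-name": "y"}

    css = (".a-really-long-class-name-or-id{color:red}"
           ".another-long-component-name{margin:0}")

    short = re.sub(r"\.([\w-]+)",
                   lambda m: "." + rename_map.get(m.group(1), m.group(1)),
                   css)

    print(len(css), "->", len(short), "bytes")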
I really can't tell if this is satire. Either everyone is in on the joke and out to get naive people like me (in that case, well played), or Hacker News doesn't seem to think it is.<p>I don't think there is one single use case, at any scale, anywhere in the world, where saving up to 17% on a minified css file before gzip would matter in the slightest, let alone while adding 20 minutes to your build cycle :)
I am not a fan of css minimizers.<p>I'm more a fan of writing compact, clean, logical css in the first place.<p>A couple months ago I worked on a project and re-wrote a 129k minified css file someone created as a clean un-minified 12k css file that had 100% of the original functionality plus some additional UI improvements.<p>You can only get these improvements if you understand what you are writing and stop using sass to write bloated files.
`rm bootstrap.css`: look, a 100% reduction! And it led to a better website.<p>`gzip goodfile.css` and there's an improvement several times more effective than even the best minifier. And it keeps your source code legible in the browser and doesn't require a slow/buggy asset pipeline to test changes.<p>Yes, yes, I know minify+gzip can save like an additional 1% over what gzip alone does. To me, that's just not worth the cost to the developer.
What really reduces css filesize for me is uncss[0]. It removes the unused styles and then I run the css through a minifier to finish the job.<p>[0] <a href="https://github.com/giakki/uncss" rel="nofollow">https://github.com/giakki/uncss</a>
Great work! There are several directions in which you could make this more substantial. For one, it would be interesting to see the marginal benefit of each additional minification pass (this could be shown in a graph). Does Remynification approach an asymptotic size, or will further minification actually increase the size instead of decreasing it?
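One rough way to get that curve, assuming some minifier CLI that reads CSS on stdin (the command and file names here are placeholders):

    # Apply one minification pass at a time, record the size after each pass,
    # and stop when the output stops shrinking (or starts growing).
    import subprocess

    def minify(css):
        return subprocess.run(["some-minifier"], input=css, capture_output=True,
                              text=True, check=True).stdout

    css = open("bootstrap.css").read()
    sizes = [len(css)]
    while True:
        css = minify(css)
        sizes.append(len(css))
        if sizes[-1] >= sizes[-2]:  # no further gain
            break
    print(sizes)  # plot these to see whether the curve flattens out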
Also, it would be interesting to take screenshots using Selenium, as suggested by sbierwagen, and see whether Remynification actually preserves the initial layout.
Lastly, it would be interesting to see a theory proposed as to why such a method works, and what future minifiers can learn from it.
I would be extremely curious to find out what modifications were made by the minifiers in round 2 and onwards that couldn't have been made the first time...?<p>JS or HTML I suppose I could see, because they're more complicated, but CSS by itself is very simple, so I'm wondering what actually got left out.<p>Do you have any diffs or info on what it was removing or doing?
I found that rearranging by desired output seemed to work better, especially once you consider that compression works best on repeated longer strings. With minification as it stands you're forced to have the longer strings once, with the short strings (the CSS properties) repeated a lot.<p>This is exactly backwards from what you want. You want the short strings to appear infrequently, and the longer strings a lot.<p>CSS reset sheets are a bad example for this kind of thing, as they're strongly sorted by desired output property, but for general CSS for something with a lot of components, for example, or a CSS sheet with page-template-specific styling, it seems like it should minify and gzip a lot better.<p>Plus, you can group your CSS by relevant section, i.e. keep your colours separate from your alignment, from your fonts, etc.<p>Except the problem is that doing it this way requires a bit of rearranging of the rules, which may cause some trouble in a fairly small number of cases, so that's why it's out as an automated way of minifying things.
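The claim is easy to measure: gzip the same declarations grouped both ways and compare. The tiny made-up example below only shows how you'd do the measurement; it's far too small to prove anything either way:

    # Compare raw and gzipped sizes of the same declarations grouped two ways.
    # The CSS is invented purely to illustrate the measurement.
    import gzip

    by_selector = (
        ".header{color:#333;font-family:Helvetica,Arial,sans-serif;margin:0}"
        ".footer{color:#333;font-family:Helvetica,Arial,sans-serif;margin:0}"
        ".sidebar{color:#666;font-family:Helvetica,Arial,sans-serif;margin:0}"
    )
    by_property = (
        ".header,.footer{color:#333}.sidebar{color:#666}"
        ".header,.footer,.sidebar{font-family:Helvetica,Arial,sans-serif;margin:0}"
    )

    for name, css in [("by selector", by_selector), ("by property", by_property)]:
        print(name, len(css), "raw,", len(gzip.compress(css.encode())), "gzipped")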
Does your "reminified" CSS actually compress better under GZIP than any of the independent minifiers? The variance you were seeing looks to be irrelevant in comparison.
"It took 25 extra minutes to save this 261 bytes. Worth it? You decide."<p>I think I've decided it's obviously not worth it, even if we built a time machine and sent the css back to 1955 when they'd appreciate 261 bytes difference. (Granted they had no use for CSS in 1955).
I totally read the title as "Remnification" and "Remnyfication". I didn't see the right spelling until I got to section 5! "OH! RE-minification"... ;)<p>Very nice, this is a fun project and a nice write up. I would definitely worry about lossy minification on production code, I've bumped into many minifier bugs that broke my valid CSS.<p>Also, pretty sure you could get Bootstrap.css down to a couple k-bytes and really truly pwn the file size leaderboard if you could dead-strip all the rules not actually referenced in your HTML & JS.
The problem with such a tree optimization is that you will only find a locally optimal solution.<p>If compressor1 generates the smallest result in step 1, all the other compressors will only try to minimize that result. But maybe compressor1 did something the other compressors are not optimized for. So you'll only find a locally optimal solution for a starting point of compressor1.
But maybe it would have been better to start with compressor3, because its result is smaller after step 2 than starting with compressor1.
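If the set of minifiers is small, one brute-force way around the local optimum is to try every ordering instead of greedily keeping the best intermediate result. A sketch with placeholder commands (n minifiers means n! pipelines, so this only scales to a handful):

    # Brute-force every ordering of the minifiers and keep the smallest output.
    # The command names are placeholders for whatever tools you're chaining.
    import itertools, subprocess

    MINIFIERS = {
        "minifier-a": ["minifier-a-placeholder"],
        "minifier-b": ["minifier-b-placeholder"],
        "minifier-c": ["minifier-c-placeholder"],
    }

    def run(cmd, css):
        return subprocess.run(cmd, input=css, capture_output=True,
                              text=True, check=True).stdout

    source = open("styles.css").read()
    best = (len(source), "no minifier")
    for order in itertools.permutations(MINIFIERS):
        css = source
        for name in order:
            css = run(MINIFIERS[name], css)
        if len(css) < best[0]:
            best = (len(css), " -> ".join(order))
    print(best[0], "bytes via", best[1])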
This just makes no sense. If the minimizer wasn't able to process it any further, maybe that's because it was already quite thoroughly processed and you will lose some information if you continue? That's a totally unsound approach. What do those extra few bytes of savings do for loading time anyway? Probably very little, yet you risk very real degradation and broken behavior.<p>Not to mention the author even admitted to not being very proficient in CSS and doesn't even want to learn JS because he "dislikes it too much"... The whole description of the process is basically dragging something out of very little substance.<p>In general, good laughing material though, just as he acknowledged at the end of the article, I guess.
This is pretty cool!<p>> Are the current CSS minifiers correct?<p>> I handle crashing minifiers, as well as ones that loop forever.<p>It could actually be useful to know the exact CSS that was fed to the minifier that crashed/got stuck. One could check with a CSS validator whether it is correct CSS in the first place (if not, one of the previous minifiers screwed up), and if so, inform that minifier's maintainers that their tool crashes on this particular valid input.
"IT...COULD...WORK!!!"
- Dr. Frederick Frankenstein<p>Cool idea. Especially not reinventing wheels but chaining them together instead. Good work.
I wonder what the impact of (re|)minification is on actual network performance.<p>For example: consider a minified file X. X is likely getting gzipped when served. Is the reminified version smaller than the original? The same size? Bigger?<p>You might expect the obvious answer, but did anyone do the actual measurements?
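Measuring it is cheap enough to just do, e.g. with Python's gzip module at a typical server compression level (the file names below are placeholders):

    # Measure what actually goes over the wire: gzip both files and compare.
    import gzip

    for path in ["bootstrap.min.css", "bootstrap.remynified.css"]:
        data = open(path, "rb").read()
        print(path, len(data), "raw,",
              len(gzip.compress(data, compresslevel=6)), "gzipped")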
I'm curious though: how do these minifications, or CSS minification in general, affect runtime performance? Reducing the file size is great, and a goal in itself (though where brotli is supported it matters more than gzip), but another important goal is rendering the page fast. Some CSS selectors are faster than others; is this something minifiers take into account? Or is it not significant at all?
>> There is a catch: Remynification takes forever and may break your CSS, even though this isn't really my fault.<p>Stopped reading after this line.
Looking at modern websites, very few use the Accept-Encoding: gzip support of HTTP/1.1.<p>That alone could «minify the served CSS» a lot :)<p>(OK, client side it will always be as big, but the server side is often paying for its traffic, and the client side may too, so it reduces traffic anyway and means great savings.)<p>For public static websites, the savings from gzip compression totally justify not using http2 when traffic matters.
Good… (read in Palpatine's voice)<p>Now rewrite all your selectors in optimum precedence and specificity for file size.
After that, remember that it will all go through gzip, so let's see how reordering properties affects compression.
A smaller file size does not necessarily mean better performance. On today's hardware I believe (but I have not tested this...) that opening a zipped 1MB file is often slower than just opening it directly; opening it directly is basically just shoving it into RAM.
Now I know why most of my colleagues in software development do not have a degree: academia really, really closes your mind inside some strange bubble and shields you from the real world...<p>This article - I don't even...