This could be an interesting companion to Helium. Helium-css can be run on live sites to find unused CSS; combine it with this project to remove duplicates too. https://github.com/geuis/helium-css
Personally I use Chrome DevTools, which has a feature called 'CSS Selector Profiles' under the Profiles tab.

You can 'record' yourself using a website, and Chrome counts how many 'hits' each CSS rule gets.

This is better than Dust-Me and all the other static CSS rule detectors, because it lets you use your site in a dynamic way and test all the edge CSS cases (like resizing your browser down small, or triggering an error message, etc.).

After using my SaaS for an hour, I found hundreds of rules that simply never got used.

It also tells you how many times each rule got used. So I found a number of rules that literally only got used once, and I could often rewrite them into one or two bigger rules, further reducing my CSS overhead. A sketch of that kind of consolidation is below.
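To illustrate the consolidation (the selectors here are made up, not from my actual stylesheet): two rules that each got a single hit and share most of their declarations can be folded into one shared rule plus a small override.

    /* Before: two rules that each matched only once in the profile */
    .signup-error  { color: #c00; font-size: 13px; padding: 4px 8px; }
    .billing-error { color: #c00; font-size: 13px; padding: 4px 8px; font-weight: bold; }

    /* After: one shared rule, one small override */
    .signup-error, .billing-error { color: #c00; font-size: 13px; padding: 4px 8px; }
    .billing-error { font-weight: bold; }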
From the makers of the phenomenal Chosen jQuery plugin.

http://harvesthq.github.io/chosen/

Hopefully this is indicative of the quality of this library! I'll definitely try it out this morning.
If you're a Ruby developer, I strongly encourage you to check out the source code and the parsing in particular. It makes heavy use of parslet [1] to build a CSS parser [2]. I'm sure there are edge cases I have missed, but the codebase is still relatively small and fairly readable.

Stay away from the RedundancyAnalyzer though. There be dragons.

[1] http://kschiess.github.io/parslet/

[2] https://github.com/zmoazeni/csscss/tree/ae2f22f4416bca35f903970c15aa1685a8d237cd/lib/csscss/parser
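P.S. If you just want to run it rather than read it, basic usage looks something like this (the paths and matched selectors are illustrative; check the README for the exact flags and output format):

    $ gem install csscss
    $ csscss app/assets/stylesheets/screen.css
    {.contact .content} AND {.home .content} share 4 rules
    {h1} AND {.header .title} share 3 rules

Passing -v should also print the shared declarations themselves.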
CSSO [1] is a tool that removes duplicate declarations during minification instead of just warning about them. It also performs more advanced structural optimizations.

[1] http://bem.info/tools/csso/
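For example, given two rulesets with identical declarations, CSSO merges them into one. A minimal sketch of that kind of structural optimization (the exact output may differ):

    /* input */
    .one { color: red; margin: 0; }
    .two { color: red; margin: 0; }

    /* output, roughly */
    .one, .two { color: red; margin: 0; }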
Here is something I came across at PyCon. This technique uses a genetic algorithm to minimize CSS, and the author claims a 10% improvement over a standard CSS minifier.

Links:

- https://us.pycon.org/2013/schedule/presentation/178/

- https://github.com/ryansb/genetic-css
It seems that when run in SCSS mode, it expands mixins before running the redundancy check. For instance, two of my selectors each include the same three mixins, which expand into 23 rules, and CSSCSS reports 25 shared rules between them. (Further inspection confirms that exactly two ordinary rules are shared.)

I would argue that using mixins should already count as having eliminated the redundancy. A simple fix would be to ignore mixin-generated rules; better still, the redundancy check could treat a mixin itself as a normal rule, so that including the same set of mixins in multiple places would be detected as redundancy in its own right. A minimal case that reproduces this is below.
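Minimal case (the mixin names are made up):

    @mixin rounded { border-radius: 4px; }
    @mixin padded  { padding: 8px; }

    .card  { @include rounded; @include padded; color: #333; }
    .panel { @include rounded; @include padded; color: #666; }

    // After expansion, .card and .panel share border-radius and padding,
    // so the checker flags them as redundant even though the SCSS source
    // already factors that duplication out into mixins.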
We really need this; our CSS has grown into a monster over the years.

This may also be of interest:

Dust-Me Selectors is a Firefox extension that scans HTML pages to find unused CSS selectors. http://www.brothercake.com/dustmeselectors/