> Around the same time we switched from an (outdated) manually created critical CSS file to an automated system that was generating critical CSS for every template — homepage, article, product page, event, job board, and so on — and inline critical CSS during the build time. Yet we didn’t really realize how much heavier the automatically generated critical CSS was.<p>Is reducing the total amount of CSS per page so you don't have to calculate the critical CSS at all an option?<p>To throw my own page into the ring, here's a non-trivial product website of mine where the homepage is 0.3MB total over the wire and renders in 0.4 seconds for me (includes custom fonts, payments, analytics, a large screenshot and a real-time chat widget):<p><a href="https://www.checkbot.io/" rel="nofollow">https://www.checkbot.io/</a><p>The major tricks I'm using are keeping the website CSS small (CSS for homepage + rest of site gzips to less than 8KB), inlining all CSS, rendering default fonts before the custom font has loaded, SVG for all images (this saves a ton), and not using JavaScript for content (which blocks rendering).<p>The screenshot in the header is in SVG format and inlined directly into the page along with the CSS, so the moment the HTML arrives the browser can display all above-the-fold content. Logos are another good one for the SVG + inline treatment.
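The inline-everything trick above boils down to a string substitution at build time. Here's a minimal sketch in Node — the file names, markup, and `inlineAssets` helper are all hypothetical, not the commenter's actual build script:

```javascript
// Build-time sketch: replace a stylesheet link and an <img> reference
// with the literal CSS and SVG, so the first HTML response can paint
// above-the-fold content with zero extra requests.
const inlineAssets = (html, css, svg) =>
  html
    .replace('<link rel="stylesheet" href="site.css">', `<style>${css}</style>`)
    .replace('<img src="hero.svg" alt="Screenshot">', svg);

const page = inlineAssets(
  '<head><link rel="stylesheet" href="site.css"></head>' +
    '<body><img src="hero.svg" alt="Screenshot"></body>',
  'body{font-family:system-ui}',
  '<svg viewBox="0 0 10 10"><rect width="10" height="10"/></svg>'
);

console.log(page); // HTML with <style> and <svg> inlined
```

In a real build you'd read the CSS/SVG from disk and run this over every template; the point is that inlining is cheap to automate and removes render-blocking round trips.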
It's still way too slow. It's a big page of text. It should load in an instant.<p>In my browser, I see 1.75 MB sent over the wire and a 2.5 second load time. My big pages of text [1] need 105 kB and load in 0.4 seconds. Their compressed critical CSS is the same size as my entire uncompressed CSS file. They send more CSS bytes than I send bytes in total.<p>If you want to make a content website fast, it's quite simple: send just the content.<p>[1] <a href="https://allaboutberlin.com/guides/german-health-insurance" rel="nofollow">https://allaboutberlin.com/guides/german-health-insurance</a>
Nice to see another company covering all these steps and validating the work we've done at my company. Unfortunately we weren't as successful, or at least, our results were not as fruitful. A good score for our site (tracker.gg) is 70 on mobile. Turns out it's pretty hard to optimize the bootstrapping of an application that can render 20 different websites! Mobile devices spend 1200ms on the main thread. It will be interesting to see how these changes impact our page rank when Google starts incorporating Core Web Vitals into its algorithm this year.
Not fantastic: <a href="https://www.websitecarbon.com/website/smashingmagazine-com-2021-01-smashingmag-performance-case-study/" rel="nofollow">https://www.websitecarbon.com/website/smashingmagazine-com-2...</a>
Snarky TLDR: half the JS load time was ad scripts. IMO, most of the performance increase came from specifying image heights, using facades for third-party embeds, and optimizing around ad and analytics scripts.<p>Nice write-up, but not a big surprise to anyone who blocks analytics tracking, ads, and third-party embeds.
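A facade, for anyone unfamiliar: render a lightweight placeholder with explicit dimensions (so the layout doesn't shift), and only load the heavy third-party embed when the user interacts. A minimal sketch — the `embedFacade` helper and its markup are illustrative, not from the article:

```javascript
// Generate a cheap placeholder with fixed width/height instead of a
// heavy third-party iframe. In the browser, a click handler would swap
// this button for the real embed.
const embedFacade = (videoId, width, height) =>
  `<button class="video-facade" data-video-id="${videoId}" ` +
  `style="width:${width}px;height:${height}px" ` +
  `aria-label="Play video">&#9654;</button>`;

const html = embedFacade('abc123', 560, 315);
console.log(html);
```

Because the placeholder carries the same dimensions as the eventual embed, swapping them causes no layout shift, and the third-party script never runs for users who don't click.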