> <i>the longest post on my site, takes 92 KiB instead of 37 KiB. This amounts to an unnecessary 2.5x increase in load time</i><p>Sure, if you ignore latency. In reality it's an unnecessary 0.001% increase in load time, because that size increase isn't enough to matter vs the round-trip time. And the time you save transmitting 55 fewer KiB is probably less than the time lost to decompression. :p<p>While fun, I would expect this specific scenario to actually be worse for the user experience, not better. Speed will be a complete wash and compatibility will be worse.
> Why readPixels is not subject to anti-fingerprinting is beyond me. It does not sprinkle hardly visible typos all over the page, so that works for me.<p>> keep the styling and the top of the page (about 8 KiB uncompressed) in the gzipped HTML and only compress the content below the viewport with WebP<p>Ah, that explains why the article suddenly cut off after a random sentence, with an empty page following. I'm using LibreWolf, which disables WebGL, and I use Chromium for random web games that need WebGL. The article worked just fine with WebGL enabled; neat technique, to be honest.
It is actually possible to use Brotli directly in the web browser... with caveats of course. I believe my 2022 submission to JS1024 [1] is the first ever demonstration of this concept, and I also have proof-of-concept code [2] for arbitrary compression (which sadly didn't work for the original size-coding purpose). The main caveats are that you are effectively limited to ASCII characters, and that it is highly sensitive to the rendering stack for the obvious reason---it no longer seems to function in Firefox right now.<p>[1] <a href="https://js1024.fun/demos/2022/18/readme" rel="nofollow">https://js1024.fun/demos/2022/18/readme</a><p>[2] <a href="https://gist.github.com/lifthrasiir/1c7f9c5a421ad39c1af19a9c4f060743" rel="nofollow">https://gist.github.com/lifthrasiir/1c7f9c5a421ad39c1af19a9c...</a>
Chromies got in the way of it for a very long time, but zstd is now coming to the web too, as it's finally landed in Chrome - now we've gotta get Safari onboard.
I work on Batch Compress (<a href="https://batchcompress.com/en" rel="nofollow">https://batchcompress.com/en</a>) and recently added WebP support, then made it the default soon after.<p>As far as I know, it was already making the smallest JPEGs of any of the web compression tools, but WebP was coming out at only ~50% of the size of the JPEGs, so making WebP the default was an easy decision.<p>Quite a lot of people use the site, so I was anticipating some complaints after the switch, but it's been about a month and so far there has been only one complaint/enquiry about WebP. It seems that almost all tools and browsers support WebP well these days; I've only encountered one website recently where uploading a WebP image wasn't handled correctly and blocked the next step.
While peeking at the source, I noticed that the doctype declaration is missing a space. It currently reads <!doctypehtml>, but it should be <!doctype html>.
I've used this trick before! Oddly enough I can't remember <i>what</i> I used it for (perhaps just to see if I could), and I commented on it here: <a href="https://gist.github.com/gasman/2560551?permalink_comment_id=4431196#gistcomment-4431196" rel="nofollow">https://gist.github.com/gasman/2560551?permalink_comment_id=...</a><p>Edit: I found my prototype from way back, I guess I was just testing heh: <a href="https://retr0.id/stuff/bee_movie.webp.html" rel="nofollow">https://retr0.id/stuff/bee_movie.webp.html</a>
This page is broken, at least in the Sailfish OS browser; there is a long empty space after this paragraph:<p>> Alright, so we’re dealing with 92 KiB for gzip vs 37 + 71 KiB for Brotli. Umm…<p>That said, the overhead of gzip vs Brotli HTML compression is nothing compared with the amount of JS/images/video current websites use.
I personally don't much care for the format. If I save an image and it ends up WebP, I have to convert it before I can edit or use it in any meaningful way, since it's not supported by much of anything other than web browsers. It just adds extra steps.
In the same vein, you can package HTML pages as self-extracting ZIP files with SingleFile [1]. You can even include a PNG image to produce files compatible with HTML, ZIP and PNG [2], and for example display the PNG image in the HTML page [3].<p>[1] <a href="https://github.com/gildas-lormeau/SingleFile?tab=readme-ov-file#file-format-comparison">https://github.com/gildas-lormeau/SingleFile?tab=readme-ov-f...</a><p>[2] <a href="https://github.com/gildas-lormeau/Polyglot-HTML-ZIP-PNG">https://github.com/gildas-lormeau/Polyglot-HTML-ZIP-PNG</a><p>[3] <a href="https://github.com/gildas-lormeau/Polyglot-HTML-ZIP-PNG/raw/main/demo.png.zip.html">https://github.com/gildas-lormeau/Polyglot-HTML-ZIP-PNG/raw/...</a>
> <i>Typically, Brotli is better than gzip, and gzip is better than nothing. gzip is so cheap everyone enables it by default, but Brotli is way slower.</i><p>Note that <i>way slower</i> applies to speed of compression, not decompression. So Brotli is a good bet if you can precompress.<p>> <i>Annoyingly, I host my blog on GitHub pages, which doesn’t support Brotli.</i><p>If your users all use modern browsers and you host static pages through a service like Cloudflare or CloudFront that supports custom HTTP headers, you can implement your own Brotli support by precompressing the static files with Brotli and adding a <i>Content-Encoding: br</i> HTTP header. This is kind of cheating because you are ignoring proper content negotiation with <i>Accept-Encoding</i>, but I’ve done it successfully for sites with targeted user bases.
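<p>For the precompression step, a minimal Node.js sketch (the file list here is a placeholder; the header config lives in your CDN settings):

  const fs = require('fs');
  const zlib = require('zlib');

  // Compress each static file once at build time. Quality 11 is slow to
  // compress but decompresses as fast as any other Brotli stream.
  for (const file of ['index.html', 'style.css', 'app.js']) {
    const br = zlib.brotliCompressSync(fs.readFileSync(file), {
      params: { [zlib.constants.BROTLI_PARAM_QUALITY]: 11 },
    });
    fs.writeFileSync(file + '.br', br);
  }

Then point the CDN at the .br files and attach <i>Content-Encoding: br</i> plus the original <i>Content-Type</i> as custom headers.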
> A real-world web page compressed with WebP? Oh, how about the one you’re reading right now? Unless you use an old browser or have JavaScript turned off, WebP compresses this page starting from the “Fool me twice” section. If you haven’t noticed this, I’m happy the trick is working :-)<p>Well, it didn't work in Materialistic (I guess their WebView disables JS), and the failure mode is really not comfortable.
If only we hadn't lost Jan Sloot's Digital Coding System [1], we'd be able to transmit GB in milliseconds across the web!<p>[1] <a href="https://en.wikipedia.org/wiki/Sloot_Digital_Coding_System" rel="nofollow">https://en.wikipedia.org/wiki/Sloot_Digital_Coding_System</a>
I very much enjoyed reading this. Quite clever!<p>But...<p>> Annoyingly, I host my blog on GitHub pages, which doesn’t support Brotli.<p>Is the <i>glaringly</i> obvious solution to this not as obvious as I think it is?<p>TFA went through a lot of roundabout work to get (some) Brotli compression. Very impressive Yak Shave!<p>If you're married to the idea of a Git-based automatically published web site, you could <i>at least</i> replicate your code and site to GitLab Pages, which has supported precompressed Brotli since 2019. Or use one of Cloudflare's free-tier services. There are a variety of ways to solve this problem before the first byte is sent to the client.<p>Far too much of the world's source code already depends exclusively on GitHub. I find it distasteful to also have the small web do the same while blindly accepting an inferior experience and worse technology.
Lots of nice tricks in here, definitely fun! Only minor nitpick is that it departs fairly rapidly from the lede ... which espouses the dual virtues of an accessible and js-optional reading experience ;)
From the linked GitHub issue giving the rationale for why Brotli is not available in the CompressionStream API:<p>> As far as I know, browsers are only shipping the decompression dictionary. Brotli has a separate dictionary needed for compression, which would significantly increase the size of the browser.<p>How can the decompression dictionary be smaller than the compression one? Does the latter contain something like a space-time tradeoff in the form of precalculated most-efficient representations of given input substrings, or something similar?
I didn't know canvas anti-fingerprinting was so rudimentary. I don't think it increases uniqueness (the noise is different every run), but bypassing it seems trivial: run the read n times and take the per-pixel mode. With so little noise, 4 or 5 runs should be more than enough.
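<p>Something like this for a 2D canvas, say (a rough sketch; it assumes, per the above, that the noise is re-rolled on every read):

  // Read the pixels n times and keep the most common value per byte.
  function readPixelsMode(ctx, w, h, n = 5) {
    const reads = [];
    for (let i = 0; i < n; i++) {
      reads.push(ctx.getImageData(0, 0, w, h).data);
    }
    const out = new Uint8ClampedArray(reads[0].length);
    for (let i = 0; i < out.length; i++) {
      const counts = new Map();
      let best = reads[0][i], bestCount = 0;
      for (const r of reads) {
        const c = (counts.get(r[i]) || 0) + 1;
        counts.set(r[i], c);
        if (c > bestCount) { bestCount = c; best = r[i]; }
      }
      out[i] = best;
    }
    return out;
  }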
It's impressive how close this is to Brotli even though Brotli has this massive pre-shared dictionary. Is the actual compression algorithm just worse, or does the dictionary matter much less than I think?
I <i>loved</i> that (encoding stuff in <i>webp</i>) but my takeaway from the figures in the article is this: brotli is so good I'll host from somewhere where I can serve brotli (when and if the client supports brotli ofc).
On the fingerprinting noise: this sounds like a job for FEC [1]. It would increase the size but allow using the Canvas API; a rough sketch follows the link below. I don't know if this would solve the flicker, though (not a front-end expert here).<p>Also, it's a long shot, but could the combo of FEC (+size) and lossy compression (-size) be a net win?<p>[1]
<a href="https://en.m.wikipedia.org/wiki/Error_correction_code" rel="nofollow">https://en.m.wikipedia.org/wiki/Error_correction_code</a>
Things I seek in an image format:<p>(1) compatibility<p>(2) features<p>WebP still seems far behind on (1) to me, so I don't care about the rest. I hope it gets there, though, because folks like this seem pretty enthusiastic about (2).
Is there a tool or some other way to easily encode a JPG image so it can be embedded into HTML? I know there is something like that, but is it easy? Could it be made easier?
What a fun excursion :) You can also use the ImageDecoder API: <a href="https://developer.mozilla.org/en-US/docs/Web/API/ImageDecoder" rel="nofollow">https://developer.mozilla.org/en-US/docs/Web/API/ImageDecode...</a> and VideoFrame.copyTo: <a href="https://developer.mozilla.org/en-US/docs/Web/API/VideoFrame/copyTo" rel="nofollow">https://developer.mozilla.org/en-US/docs/Web/API/VideoFrame/...</a> to skip canvas entirely.
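<p>Something along these lines (a sketch; both are WebCodecs APIs and not available in every browser yet):

  // Decode image bytes straight to raw pixels, no canvas involved.
  async function decodePixels(bytes, type = 'image/webp') {
    const decoder = new ImageDecoder({ data: bytes, type });
    const { image } = await decoder.decode();  // image is a VideoFrame
    const buf = new Uint8Array(image.allocationSize());
    await image.copyTo(buf);  // layout given by image.format, e.g. "RGBA"
    const { codedWidth: width, codedHeight: height, format } = image;
    image.close();
    decoder.close();
    return { buf, width, height, format };
  }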
They did all this and didn't even measure time to first paint?<p>What is the point of doing this sort of thing if you don't even test how much faster or slower it made the page load?
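<p>For reference, paint timings are a few lines with the Performance API, if anyone wants to compare:

  // Logs "first-paint" and "first-contentful-paint", in ms since navigation.
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      console.log(entry.name, entry.startTime.toFixed(1), 'ms');
    }
  }).observe({ type: 'paint', buffered: true });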