I don't like this suggestion of "best practice" without any numbers. Progressive JPEG uses more RAM (a baseline JPEG can be streamed, so the decoder only has to buffer a row of JPEG blocks at a time, whereas progressive needs the whole image in memory) and a lot more CPU (up to 3x).<p>Most of the compression benefit can be obtained by setting the "optimized Huffman" flag on your compressor: a baseline JPEG will typically save 10-20% when "optimized", and progressive very rarely achieves a double-digit win on top of that.<p>MobileSafari eats cycles every time you inject a large image into the DOM, so (while I haven't benchmarked progressive) it seems like this new-found love of progressive JPEG is beating up one of the slower code paths in the mobile browsers. And to me, it doesn't look that good!
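If you want numbers for your own images instead of taking anyone's word for it, here's a rough sketch in Python (assuming the IJG jpegtran binary is on your PATH; the file names are just placeholders) that compares a baseline file against Huffman-optimized and progressive repackings of itself:<p><pre><code>
# Compare baseline vs. Huffman-optimized vs. progressive file sizes.
# Assumes `jpegtran` is installed; both transforms are lossless.
import os
import subprocess
import sys

def jpegtran(src, dst, *flags):
    # -copy all keeps metadata so only the entropy coding changes
    subprocess.run(["jpegtran", *flags, "-copy", "all", "-outfile", dst, src],
                   check=True)
    return os.path.getsize(dst)

src = sys.argv[1]  # e.g. photo.jpg
baseline = os.path.getsize(src)
optimized = jpegtran(src, "opt.jpg", "-optimize")
progressive = jpegtran(src, "prog.jpg", "-progressive")
for name, size in [("baseline", baseline), ("optimized", optimized),
                   ("progressive", progressive)]:
    print(f"{name}: {size} bytes ({100 * (baseline - size) / baseline:.1f}% smaller)")
</code></pre>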
I'm nitpicking, but<p><pre><code> When images arrive, they come tripping onto the page, pushing other elements around and triggering a clumsy repaint.
</code></pre>
This is easily avoided by defining the image dimensions in your stylesheet.
<p><pre><code> their Mod_Pagespeed service
</code></pre>
mod_pagespeed is an open source Apache module, not a service. Google also runs PageSpeed Service, an optimizing proxy. Both support automatic conversion to progressive jpeg.<p><pre><code> SPDY does as well, translating jpegs that are over
10K to progressive by default
</code></pre>
The author has SPDY and mod_pagespeed confused; this is a mod_pagespeed feature.<p>(I work at Google on ngx_pagespeed.)
> <i>Plotting the savings of 10000 random baseline jpegs converted to progressive, Stoyan Stefanov discovered a valuable rule of thumb: files that are over 10K will generally be smaller using the progressive option.</i><p>At first I thought: "What, you're opening JPEGs and saving them again? Don't you lose image quality every time you open and save in a lossy format?"<p>But then I read the actual source [1], and it says that `jpegtran` can open baseline JPEGs and save them losslessly as progressive JPEGs. That sounds useful!<p>Does anyone know whether other image-editing software does the same thing to JPEGs that are re-saved without modification? What about Photoshop? GIMP? MS Paint?<p>[1] <a href="http://www.bookofspeed.com/chapter5.html" rel="nofollow">http://www.bookofspeed.com/chapter5.html</a>
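To make that concrete, here's a small sketch of the lossless jpegtran conversion plus a pixel-level check that nothing got re-encoded (assuming jpegtran is on the PATH and Pillow is installed; the file names are made up):<p><pre><code>
# Losslessly repack a baseline JPEG as progressive, then verify that the
# decoded pixels are unchanged (i.e. no generation loss from re-saving).
import subprocess
from PIL import Image, ImageChops

subprocess.run(
    ["jpegtran", "-progressive", "-copy", "all",
     "-outfile", "photo_progressive.jpg", "photo_baseline.jpg"],
    check=True,
)

before = Image.open("photo_baseline.jpg").convert("RGB")
after = Image.open("photo_progressive.jpg").convert("RGB")
diff = ImageChops.difference(before, after)
print("pixel-identical:", diff.getbbox() is None)  # expect True
</code></pre>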
I would not call this a "best practice", but simply an alternative.<p>Progressively loading photos in that manner is not a good user experience either. The first paint is often quite jarring, especially since, as pointed out in the article, over 90% of photos simply don't load this way.<p>For content such as photos that are contained within some sort of layout, it would be better to have a placeholder there with the same dimensions as the final image, then have the final image appear upon completion.
This is sort of tangential, but that browser chart reminded me of something I've been curious about for a while: Why do certain browsers only render foreground JPGs progressively? Is it a rendering engine limitation or an intentional one (perhaps for usability during page load)?
Oh, no no no. Ill-advised idea. We've gone through this before already, and it was resolved in favor of baseline with optimization. How was it resolved? By website visitors, who hated-hated-hated progressive JPEGs.
Boy, those who ignore history... cue the "Doom Song" from Invader Zim, sung by Gir.<p>(edited)
Assuming the image has a small thumbnail embedded in the EXIF data, maybe it would be even faster to use that resource (already included with the image), scaled to fill the image space and replaced when the image loads.<p>Essentially the same effect, with some back-end code. I suppose we'd need a benchmark test to find the real numbers. The main advantage would be sticking with existing file types already in use.
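The back-end part is cheap. A minimal sketch, assuming the third-party piexif library (any EXIF parser would do) and that the file actually carries an embedded thumbnail:<p><pre><code>
# Back-end sketch: pull the thumbnail already embedded in a JPEG's EXIF
# data so it can be served as a tiny placeholder. Uses the third-party
# piexif library (an assumption -- any EXIF parser would do); not every
# JPEG carries an embedded thumbnail, so handle the missing case.
import piexif

def extract_exif_thumbnail(jpeg_path, thumb_path):
    exif = piexif.load(jpeg_path)
    thumb = exif.get("thumbnail")  # raw JPEG bytes, or None if absent
    if not thumb:
        return False
    with open(thumb_path, "wb") as f:
        f.write(thumb)
    return True

if extract_exif_thumbnail("photo.jpg", "photo_thumb.jpg"):
    print("wrote placeholder thumbnail")
else:
    print("no embedded thumbnail in this file")
</code></pre>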
><i>and progressive jpegs quickly stake out their territory and refine (or at least that’s the idea)</i> //<p>Your browser stakes out the image area if you set height and width attributes for the image, thus avoiding reflows.<p>Also, the bracketed comment makes it sound like this feature doesn't work well?
>They are millions of colors and pixel depth is increasing.<p>Um, no. Nobody is serving anything more than 8 bpc on a web page, and nobody has a monitor that could show it to them if they did.
Good article! I'm quite surprised mobile browsers basically ignore this though. Showing the low quality image when zoomed out, and only the full quality when zoomed in would avoid the CPU issues.
I'm usually eager to jump on board with recommendations from Stoyan’s performance calendar, but Anne’s description of baseline loading doesn't comport with my experience in WebKit browsers (and possibly others). They don't "display instantly after file download", but display almost line by line as the file is downloading, with a partial image appearing from top to bottom as information becomes available. Since this particular trick is about perception -- users being given some visual indication that the image is loading, and data as soon as possible -- the difference between progressive and baseline loading seems like it should be subtler than the article suggests.