Bravo. I love JPEG. Amazing that it's been 23 years since its release and it remains as useful as ever.

I remember what it was like to watch a 320*200 JPEG image slowly build up on a 386SX PC with a VGA card. Today, an HD frame compressed with JPEG can be decoded in milliseconds. This highlights the secret to JPEG's success: it was designed with enough foresight and a sufficiently well-bounded scope that it keeps hitting a sweet spot between computing power and bandwidth.

Did you know that most browsers support JPEG video streaming using a plain old <img> tag? It also works on iOS and Android, but unfortunately not in IE.

It's triggered by the "multipart/x-mixed-replace" content type header [0]. The HTTP server leaves the connection open after sending the first image, and then simply writes new images as they come in, as if it were a multipart file download. A compliant browser will update the image element's contents in place. (There's a minimal server sketch below.)

[0] http://en.wikipedia.org/wiki/MIME#Mixed-Replace
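For the curious, here's what the server side of that push roughly looks like, as a sketch using only Python's standard library. The frameN.jpg filenames are placeholders for whatever actually produces the frames (camera, encoder, etc.):

    # Minimal sketch of motion-JPEG push over multipart/x-mixed-replace.
    # Frame files (frame0.jpg .. frame3.jpg) are hypothetical placeholders.
    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer

    BOUNDARY = "jpgframe"

    class MJPEGHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # One long-lived response; the browser keeps replacing the <img> contents.
            self.send_response(200)
            self.send_header("Content-Type",
                             f"multipart/x-mixed-replace; boundary={BOUNDARY}")
            self.end_headers()
            for i in range(100):                      # send 100 frames, then stop
                with open(f"frame{i % 4}.jpg", "rb") as f:
                    jpeg = f.read()
                self.wfile.write(f"--{BOUNDARY}\r\n".encode())
                self.wfile.write(b"Content-Type: image/jpeg\r\n")
                self.wfile.write(f"Content-Length: {len(jpeg)}\r\n\r\n".encode())
                self.wfile.write(jpeg)
                self.wfile.write(b"\r\n")
                time.sleep(0.1)                       # ~10 fps

    if __name__ == "__main__":
        HTTPServer(("", 8080), MJPEGHandler).serve_forever()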
This is very promising. Images by far dominate a web page, both in number of requests and total number of bytes sent [1]. Optimizing image size by even 5-10% can have a real effect on bandwidth consumption and page load times.

JPEG optimization using open source tools is an area that really needs focus.

There are a number of lossless JPEG optimization tools, but most are focused on stripping non-graphical data out of the file, or converting the image to a progressive JPEG (since progressive JPEGs rearrange the pixel data, you can sometimes get better compression because there may be more redundancy in the rearranged data). Short of exceptional cases where you can remove massive amounts of metadata (Adobe products regularly stick embedded thumbnails and the entire "undo" history into an image), lossless optimization usually only reduces file size by 5-15%.

Lossy JPEG optimization has much more potential. Unfortunately, beyond proprietary encoders, the most common lossy JPEG optimization is simply to reduce the JPEG quality setting. This always felt like killing flies with a tank, so advances in this area would be awesome. (A rough sketch contrasting the two approaches is below.)

I've written extensively about lossy optimization for JPEGs and PNGs, and spoke about it at the Velocity conference. A post and my slides are available [2].

[1] http://httparchive.org/trends.php

[2] http://zoompf.com/blog/2013/05/achieving-better-image-optimization-with-lossy-techniques
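To make the contrast concrete, here's a hedged sketch of a lossless repackaging pass (strip metadata, rebuild Huffman tables, go progressive) versus the blunt quality knob. It assumes jpegtran and Pillow are installed; photo.jpg is a placeholder input:

    # Lossless vs. lossy JPEG optimization, side by side.
    import os
    import subprocess
    from PIL import Image

    src = "photo.jpg"   # placeholder input image

    # Lossless: pixel data is not re-encoded, only repackaged.
    subprocess.run(["jpegtran", "-copy", "none", "-optimize", "-progressive",
                    "-outfile", "lossless.jpg", src], check=True)

    # Lossy: decode and re-encode at a lower quality setting.
    Image.open(src).save("lossy.jpg", "JPEG", quality=75, optimize=True)

    for name in (src, "lossless.jpg", "lossy.jpg"):
        print(f"{name:>14}: {os.path.getsize(name)} bytes")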
JPEG has shown amazingly good staying power. I would have assumed "JPEG is woefully old and easy to beat", but Charles Bloom did a good series of blog posts looking at it, and my (non-expert and probably hopelessly naive) takeaway is that JPEG still holds its own for a 20+ year old format.

http://cbloomrants.blogspot.com/2012/04/04-09-12-old-image-comparison-post.html
For improving general-purpose gzip / zlib compression, there is the Zopfli project [1] [2]. It also has (alpha quality) code for the PNG file format; since this functionality wasn't originally included, there are also third-party projects [3].

You might be able to shave a percent or so off the download size of compressed assets. (A quick way to measure that is sketched below.)

[1] https://news.ycombinator.com/item?id=5316595

[2] https://news.ycombinator.com/item?id=5301688

[3] https://github.com/subzey/zopfli-png
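If you want to gauge what Zopfli buys you on a given asset, something like this works. It assumes the zopfli command-line tool is installed and that its -c flag writes the compressed stream to stdout (as I recall it does); app.js is a placeholder file:

    # Compare plain zlib level 9 against the zopfli CLI on one asset.
    import subprocess
    import zlib

    path = "app.js"
    data = open(path, "rb").read()

    zlib_size = len(zlib.compress(data, 9))      # ballpark for gzip -9
    zopfli_size = len(subprocess.run(["zopfli", "-c", path],
                                     capture_output=True, check=True).stdout)

    print(f"zlib level 9 : {zlib_size} bytes")
    print(f"zopfli       : {zopfli_size} bytes "
          f"({100 * (zlib_size - zopfli_size) / zlib_size:.1f}% smaller)")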
Now if only they'd do a mozpng.

(For context: libpng is a "purposefully-minimal reference implementation" that avoids features such as, e.g., Animated PNG decoding. And yet libpng is the library used by Firefox, Chrome, etc., because it's the one implementation with a big standards body behind it. If Mozilla just forked libpng, their version would instantly have far more developer eyes on it than the original does...)
We've been using http://www.jpegmini.com/ to compress JPEGs for our apps. It worked OK, although we didn't get the enormous reductions they advertise. However, 5-10% does still make a difference.

We've been using the desktop version. I'd love to use something similar on a server, but JPEGmini is overpriced for our scenario (I won't keep a dedicated AWS instance running just to compress images every second day or so). Will definitely check out this project :)
I noticed that optimizing JPEG images using jpegoptim (http://www.kokkonen.net/tjko/projects.html) reduces the size by a similar factor, but at the expense of decoding speed.

In fact, on a JPEG-heavy site that I was testing with FF 26, there was such a degradation in responsiveness that transitions would stutter whenever a new image was decoded in the background (while preloading).

That made the effort to save 2-4% in size a waste, traded for a worse user experience.
If my goal were to compress, say, 10,000 images and I could include a dictionary or some sort of common database that the compressed data for each image would reference, could I not use a large dictionary shared by the entire catalog and therefore get much smaller file sizes?

Maybe images could be encoded with reference to a common shared database holding the most repetitive data: perhaps 10 MB, 50 MB or 100 MB of common bits that the compression algorithm could reference. You would build this dictionary by analyzing many, many images. The same type of approach could work for video.
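The shared-dictionary idea already exists for generic data in zstd; a hedged sketch with the python-zstandard bindings (assumed installed) is below. The caveat is that JPEG bytes are already entropy coded, so cross-file redundancy is low; the scheme would likely pay off more on a less-compressed representation (e.g. quantized DCT coefficients):

    # Train one dictionary on a sample of the catalog, then compress each
    # file against it. "catalog/*.jpg" is a placeholder path.
    import glob
    import zstandard as zstd

    samples = [open(p, "rb").read() for p in glob.glob("catalog/*.jpg")[:1000]]

    # 100 KB shared dictionary learned from the sample set.
    shared_dict = zstd.train_dictionary(100 * 1024, samples)

    compressor = zstd.ZstdCompressor(dict_data=shared_dict, level=19)
    decompressor = zstd.ZstdDecompressor(dict_data=shared_dict)

    blob = compressor.compress(samples[0])
    assert decompressor.decompress(blob) == samples[0]
    print(f"{len(samples[0])} -> {len(blob)} bytes with the shared dictionary")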
Data compression and image compression are a great way to improve the overall internet, in both bandwidth and speed. It's maybe as important as new protocols like SPDY, JS/CSS minification, and CDN hosting of common libraries.

As long as ISPs/telcos don't go back to the AOL days of network-wide compression that reduces everything to low quality, I'm all for this at the service level, like Facebook/Dropbox uploads. I hope this inspires more work in the area. Games also get better with better textures in less space.

To this day, I am amazed at the small file sizes Macromedia (now Adobe) was able to obtain with Flash/SWF/ASF; even high-quality PNGs would compress well. Yes, we all have lots of bandwidth now, but crunching data down to the smallest representation of the same thing is still a good thing. With cable company caps and other artificial bandwidth shortages, that focus might resurge a bit.
JPEG 2000 exists, but decoding is still too slow to be useful.

http://en.wikipedia.org/wiki/JPEG_2000
I'm actually disappointed. I had hoped they would develop a still image format from Daala. Daala has significant improvements such as overlapping blocks, differently sized blocks, and a predictor that works not just on luma or chroma alone, but on both together.
I like that Mozilla is improving the existing accepted standard, but using modern (mostly patented) codec techniques we could get lossy images to under half the current size at the same quality and decode speed, or to a much higher quality at the same size.

The pace of the modern web concerns me. The standards are not moving forward. We still use HTML, CSS, JavaScript, JPEG, GIF, and PNG. GIF especially is a format where we could see similar-quality moving images at 1/8th the file size if we supported algorithms similar to those found in modern video codecs.

In all of these cases, they aren't "tried and true" so much as "we've had so many problems with each that we've built a huge suite of half-hacked solutions to pretty much everything you could want to do". We haven't moved forward because we can't. WebP is a good example of a superior format that never stood a chance, because front-end web technology is not flexible.
I have heard similar things about GIF (that there are optimizations that most encoding software does not properly take advantage of), but I haven't seen any efforts or cutting-edge software that actually follow through on that promise. The closest I've seen is gifsicle, which is a bit disappointing.

What would be great is some way for an animated GIF's frame delays to opt in to being interpreted literally by the browser. That is, a 0-delay frame really would display with no delay, so optimization strategies involving splitting image data across multiple frames could be used, and when read by a browser, all the frames would be overlaid instantly, modulo loading time. (A rough sketch of how per-frame delays get written is below.)

What other things can be done to further optimize animated GIF encoding?
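To make the frame-delay mechanics concrete, here's a tiny Pillow sketch (assumed installed) that writes an animated GIF with an explicit per-frame delay list in milliseconds. Browsers typically clamp very small delays up to around 100 ms, which is exactly what breaks the split-one-image-across-0-delay-frames trick:

    # Write an animated GIF with explicit per-frame delays (ms).
    # Frames here are just generated grey swatches for illustration.
    from PIL import Image

    frames = [Image.new("L", (64, 64), color=c).convert("P")
              for c in (10, 80, 150, 220)]

    frames[0].save(
        "anim.gif",
        save_all=True,
        append_images=frames[1:],
        duration=[0, 100, 100, 100],   # first frame asks for 0 ms; viewers differ
        loop=0,
    )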
It's not clear from the article whether, in their "comparison of 1500 JPEG images from Wikipedia", they just ran the entropy coding portion again or whether they requantized. (I suspect they did just the entropy coding portion, but it's hard to tell.)

Getting better encoding by changing the quantization method can't be judged purely as a function of file size; traditionally PSNR measurements (see below) as well as visual quality come into play.

Good to see some work in the area; I will need to check out what is new and novel.

That said, a company I worked for many moons ago came up with a method whereby, through reorganization of coefficients post-quantization, you could easily get about a 20% improvement in encoding efficiency, but the result was not JPEG compatible.

There is a lot that can be played with.
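For anyone unfamiliar, PSNR is just a log-scaled mean squared error. A small numpy sketch of the standard 8-bit definition (generic, nothing mozjpeg-specific):

    import numpy as np

    def psnr(reference: np.ndarray, test: np.ndarray) -> float:
        """Peak signal-to-noise ratio in dB for 8-bit images."""
        mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
        if mse == 0:
            return float("inf")        # identical images
        return 10.0 * np.log10(255.0 ** 2 / mse)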
When I optimize JPEGs or PNGs I usually use ScriptJPG and ScriptPNG from http://css-ig.net/tools/

They are shell scripts that run many different optimizers.
"support for progressive JPEGs is not universal" <a href="https://en.wikipedia.org/wiki/JPEG#JPEG_compression" rel="nofollow">https://en.wikipedia.org/wiki/JPEG#JPEG_compression</a><p>e.g. the hardware decoder in the Raspberry Pi
<a href="http://forum.stmlabs.com/showthread.php?tid=12102" rel="nofollow">http://forum.stmlabs.com/showthread.php?tid=12102</a>
Hey everyone, after some testing we have just deployed mozjpeg to our web interface at: https://kraken.io/web-interface

You can test it out by selecting the "lossless" option and uploading a JPEG. Enjoy!
At first glance this seems wasteful. I don't think anyone has a problem using JPEG as it is. Then again, before the invention of any new tool, who ever had a problem using the old ones?
Has somebody translated the JPEG library to JavaScript? Besides encoding and decoding JPEG, it has some useful modules that would be nice to have in the web browser.
This is so dumb. There are a million JPEG crushers in existence, but instead of advocating the use of one of those, Mozilla writes their own? Why not support WebP, rather than dismissing it over compatibility and wasting time doing what has been done before?