
Introducing the ‘mozjpeg’ Project

409 points by joshmoz about 11 years ago

27 comments

pavlov about 11 years ago
Bravo. I love JPEG. Amazing that it's been 23 years since its release and it remains as useful as ever.

I remember what it was like to watch a 320*200 JPEG image slowly build up on a 386SX PC with a VGA card. Today, an HD frame compressed with JPEG can be decoded in milliseconds. This highlights the secret to JPEG's success: it was designed with enough foresight and a sufficiently well-bounded scope that it keeps hitting a sweet spot between computing power and bandwidth.

Did you know that most browsers support JPEG video streaming using a plain old <img> tag? It also works on iOS and Android, but unfortunately not in IE.

It's triggered by the "multipart/x-mixed-replace" content type header [0]. The HTTP server leaves the connection open after sending the first image, and then simply writes new images as they come in, as if it were a multipart file download. A compliant browser will update the image element's contents in place.

[0] http://en.wikipedia.org/wiki/MIME#Mixed-Replace
(7 replies not loaded)
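For the curious, a minimal sketch of the multipart/x-mixed-replace trick described above, using Python's standard http.server; the boundary token, frame directory, and frame rate are arbitrary placeholders, and a real server would pace frames from a live source and handle disconnects:

    # Minimal multipart/x-mixed-replace ("MJPEG over HTTP") streaming sketch.
    # Each new JPEG written to the still-open connection replaces the previous
    # one inside a plain <img> element in a supporting browser.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    import glob, time

    BOUNDARY = "frame"  # arbitrary boundary token

    class MJPEGHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type",
                             f"multipart/x-mixed-replace; boundary={BOUNDARY}")
            self.end_headers()
            # Here the "video" is just a directory of JPEG frames, sent once.
            for path in sorted(glob.glob("frames/*.jpg")):
                with open(path, "rb") as f:
                    frame = f.read()
                self.wfile.write(f"--{BOUNDARY}\r\n".encode())
                self.wfile.write(b"Content-Type: image/jpeg\r\n")
                self.wfile.write(f"Content-Length: {len(frame)}\r\n\r\n".encode())
                self.wfile.write(frame)
                self.wfile.write(b"\r\n")
                time.sleep(1 / 15)  # roughly 15 fps

    HTTPServer(("", 8000), MJPEGHandler).serve_forever()

Pointing a plain <img src="http://localhost:8000/"> at this server should show the frames updating in place in browsers that support the content type.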
billyhoffman about 11 years ago
This is very promising. Images by far dominate a web page, both in number of requests and total number of bytes sent [1]. Optimizing image size by even 5-10% can have a real effect on bandwidth consumption and page load times.

JPEG optimization using open source tools is an area that really needs focus.

There are a number of lossless JPEG optimization tools, but most are focused on stripping non-graphical data out of the file, or converting the image to a progressive JPEG (since progressive JPEGs have rearranged pixel data, you can sometimes get better compression because there may be more redundancy in the rearranged data). Short of exceptional cases where you can remove a massive amount of metadata (Adobe products regularly stick embedded thumbnails and the entire "undo" history into an image), lossless optimization usually only reduces file size by 5-15%.

Lossy JPEG optimization has much more potential. Unfortunately, beyond proprietary encoders, the most common lossy JPEG optimization is simply to reduce the JPEG quality. This always felt like killing flies with a tank, so advances in this area would be awesome.

I've written extensively about lossy optimization for JPEGs and PNGs, and spoke about it at the Velocity conference. A post and my slides are available [2].

[1] http://httparchive.org/trends.php

[2] http://zoompf.com/blog/2013/05/achieving-better-image-optimization-with-lossy-techniques
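For the lossless pass described here (strip metadata, rewrite as progressive, re-optimize the Huffman tables), one common route is libjpeg's jpegtran tool; a small wrapper sketch, assuming the jpegtran binary is installed and on PATH, with placeholder input/output paths:

    # Lossless JPEG optimization sketch: strip metadata, re-optimize Huffman
    # tables, and rewrite as progressive. Assumes the jpegtran binary
    # (shipped with libjpeg / libjpeg-turbo) is installed and on PATH.
    import os, subprocess, sys

    def optimize_lossless(src: str, dst: str) -> None:
        with open(dst, "wb") as out:
            subprocess.run(
                ["jpegtran", "-copy", "none",   # drop EXIF, thumbnails, etc.
                 "-optimize",                   # optimized Huffman tables
                 "-progressive",                # progressive scan script
                 src],
                stdout=out, check=True)

    if __name__ == "__main__":
        src, dst = sys.argv[1], sys.argv[2]
        optimize_lossless(src, dst)
        before, after = os.path.getsize(src), os.path.getsize(dst)
        print(f"{before} -> {after} bytes "
              f"({100 * (before - after) / before:.1f}% smaller)")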
IvyMike about 11 years ago
JPEG has shown amazingly good staying power. I would have assumed "JPEG is woefully old and easy to beat", but Charles Bloom did a good series of blog posts looking at it, and my (non-expert and probably hopelessly naive) takeaway is that JPEG still holds its own for a 20+ year old format.

http://cbloomrants.blogspot.com/2012/04/04-09-12-old-image-comparison-post.html
(4 replies not loaded)
csense about 11 years ago
For improving general-purpose gzip / zlib compression, there is the Zopfli project [1] [2]. It also has (alpha quality) code for the PNG file format; since this functionality wasn't originally included, there are also third-party projects [3].

You might be able to shave a percent or so off the download size of compressed assets.

[1] https://news.ycombinator.com/item?id=5316595

[2] https://news.ycombinator.com/item?id=5301688

[3] https://github.com/subzey/zopfli-png
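A rough way to see what that "percent or so" looks like on a single asset is to compare standard gzip output against the Zopfli CLI; the `zopfli -c` invocation below is an assumption about the tool's flags (writing the compressed result to stdout) and should be checked against its --help:

    # Rough comparison of gzip-level deflate vs. Zopfli for one asset.
    # Assumes the `zopfli` command-line tool is installed; "-c" is assumed
    # to write the compressed result to stdout (verify with `zopfli --help`).
    import gzip, subprocess, sys

    path = sys.argv[1]
    data = open(path, "rb").read()

    gzip_size = len(gzip.compress(data, compresslevel=9))   # best standard level
    zopfli_out = subprocess.run(["zopfli", "-c", path],
                                capture_output=True, check=True).stdout
    print(f"gzip -9 : {gzip_size} bytes")
    print(f"zopfli  : {len(zopfli_out)} bytes "
          f"({100 * (gzip_size - len(zopfli_out)) / gzip_size:.1f}% smaller)")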
derefr about 11 years ago
Now if only they'd do a mozpng.

(For context: libpng is a "purposefully-minimal reference implementation" that avoids features such as, e.g., Animated PNG decoding. And yet libpng is the library used by Firefox, Chrome, etc., because it's the one implementation with a big standards body behind it. Yet, if Mozilla just forked libpng, their version would instantly have way more developer eyes on it than the source...)
(4 replies not loaded)
CookWithMe about 11 years ago
We've been using http://www.jpegmini.com/ to compress JPEGs for our apps. It worked OK, although we didn't get the enormous reductions they advertise. However, 5-10% does still make a difference.

We've been using the desktop version. We would love to use something similar on a server, but jpegmini is overpriced for our scenario (I'm not going to keep a dedicated AWS instance running just to compress images every second day or so). Will definitely check out this project :)
(1 reply not loaded)
tenfingers about 11 years ago
I noticed that optimizing JPEG images using jpegoptim (http://www.kokkonen.net/tjko/projects.html) reduces the size by a similar factor, but at the expense of decoding speed.

In fact, on a JPEG-heavy site that I was testing with Firefox 26, there was such a degradation in responsiveness that transitions would stutter whenever a new image was decoded in the background (while preloading).

It made the effort to save 2-4% in size a waste, traded for a worse user experience.
(2 replies not loaded)
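That size-versus-decode-speed trade-off is straightforward to measure; a rough sketch with Pillow, assuming baseline.jpg and progressive.jpg are the same picture saved both ways (the file names are placeholders):

    # Rough decode-speed comparison: baseline vs. progressive JPEG.
    # Requires Pillow (pip install Pillow); the two files are assumed to be
    # the same image saved as baseline and as progressive.
    import time
    from PIL import Image

    def decode_time(path, runs=20):
        start = time.perf_counter()
        for _ in range(runs):
            with Image.open(path) as im:
                im.load()               # force a full decode, not just the header
        return (time.perf_counter() - start) / runs

    for path in ("baseline.jpg", "progressive.jpg"):
        print(f"{path}: {decode_time(path) * 1000:.1f} ms per decode")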
ilaksh about 11 years ago
If my goal were to compress, say, 10,000 images and I could include a dictionary or some sort of common database that the compressed data for each image would reference, could I not use a large dictionary shared by the entire catalog and therefore get much smaller file sizes?

Maybe images could be encoded with reference to a common database we share that holds the most repetitive data. So perhaps 10 MB, 50 MB, or 100 MB of common bits that the compression algorithm could reference. You would build this dictionary by analyzing many, many images. The same type of approach could work for video.
(6 replies not loaded)
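Ordinary deflate already has a small version of this idea: zlib can be primed with a preset dictionary built from material common to a whole corpus, so shared bytes cost almost nothing per item. A toy sketch (the dictionary and item contents are made up, and JPEG's already entropy-coded bytes would benefit far less than text-like data):

    # Toy shared-dictionary compression using zlib's preset-dictionary support.
    # Compressor and decompressor are primed with the same `shared` bytes, so
    # material common to the whole catalog is stored only once, in the dictionary.
    import zlib

    shared = b"common boilerplate shared across the whole catalog " * 20
    item = b"common boilerplate shared across the whole catalog plus unique bits"

    plain = zlib.compress(item, 9)

    co = zlib.compressobj(9, zlib.DEFLATED, zlib.MAX_WBITS, zdict=shared)
    with_dict = co.compress(item) + co.flush()

    do = zlib.decompressobj(zdict=shared)
    assert do.decompress(with_dict) == item   # round-trips correctly

    print(f"without dictionary: {len(plain)} bytes")
    print(f"with dictionary   : {len(with_dict)} bytes")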
rwmj about 11 years ago
Why don't they just contribute the jpgcrush-like C code back to libjpeg-turbo?

Edit: A good reason is given in the reply by joshmoz below.
(2 replies not loaded)
drawkbox about 11 years ago
Data compression and image compression are a great way to improve the overall internet, both bandwidth and speed; maybe as important as new protocols like SPDY, js/css minification, and CDN hosting of common libraries.

As long as ISPs/telcos don't go back to the days of AOL-style network-wide compression that cuts quality to save bandwidth, I am for this at the service level, as with Facebook/Dropbox uploads. I hope this inspires more work in this area. Games also get better with better textures in less space.

To this day, I am amazed at the small file sizes Macromedia (now Adobe) was able to obtain with Flash/SWF/ASF; even high-quality PNGs would compress. Yes, we all have lots of bandwidth now, but crunching data down while representing the same thing is a good thing. With cable company caps and other artificial bandwidth shortages, that focus might resurge a bit.
(1 reply not loaded)
cjensen about 11 years ago
JPEG 2000 exists, but decoding is still too slow to be useful.

http://en.wikipedia.org/wiki/JPEG_2000
(3 replies not loaded)
United857 about 11 years ago
What about WebP? Isn't that intended to be an eventual replacement for JPEG?
(6 replies not loaded)
1ris about 11 years ago
I'm actually disappointed. I had hoped they would develop a still-image format from Daala. Daala has significant improvements such as overlapping blocks, differently sized blocks, and a predictor that works not just on luma or chroma alone, but on both.
(2 replies not loaded)
Taek about 11 years ago
I like that Mozilla is improving the existing accepted standard, but using modern (mostly patented) codec techniques we could get lossy images to under 1/2 of the current size at the same quality and decode speed, or to a much higher quality at the same size.

The pace of the modern web concerns me. The standards are not moving forward. We still use HTML, CSS, JavaScript, JPEG, GIF, and PNG. GIF especially is a format where we could see similarly sized, similar-quality moving images at 1/8th the file size if we supported algorithms like those found in modern video codecs.

In all of these cases, they aren't "tried and true" so much as "we've had so many problems with each that we've got a huge suite of half-hacked solutions to pretty much everything you could want to do". We haven't moved forward because we can't. WebP is a good example of a superior format that never stood a chance because front-end web technology is not flexible.
(1 reply not loaded)
TheZenPsycho about 11 years ago
I have heard similar things about GIF (that there are optimisations most encoding software does not properly take advantage of), but I haven't seen any efforts, or cutting-edge software, that actually follow through on that promise. The closest I've seen is gifsicle, which is a bit disappointing.

What would be great is if there were some way for an animated GIF's frame delays to opt in to being interpreted literally by the browser. That is, a 0-delay frame really would display with no delay, so optimisation strategies involving splitting image data across multiple frames could be used; when read by a browser, all frames would be overlaid instantly, modulo loading time.

What other things can be done to further optimise animated GIF encoding?
(1 reply not loaded)
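For reference, the knobs gifsicle does expose are its -O optimization levels (frame-difference and transparency optimization) and palette reduction; a small wrapper sketch, assuming the gifsicle binary is installed and treating the paths and color count as placeholders:

    # Animated-GIF re-optimization sketch using gifsicle's frame optimization
    # (-O3) and optional palette reduction (--colors).
    # Assumes the gifsicle binary is installed and on PATH.
    import os, subprocess, sys

    def optimize_gif(src, dst, colors=None):
        cmd = ["gifsicle", "-O3"]
        if colors:
            cmd += ["--colors", str(colors)]   # optional lossy palette reduction
        cmd += [src, "-o", dst]
        subprocess.run(cmd, check=True)

    src, dst = sys.argv[1], sys.argv[2]
    optimize_gif(src, dst, colors=128)
    print(os.path.getsize(src), "->", os.path.getsize(dst), "bytes")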
jmspring about 11 years ago
It's not clear from the article whether, in their "comparison of 1500 JPEG images from Wikipedia", they just ran the entropy coding portion again or requantized. (I suspect they did just the entropy coding portion, but it's hard to tell.)

Getting better encoding by changing the quantization method can't be judged purely as a function of file size; traditionally, PSNR measurements as well as visual quality come into play.

Good to see some work in the area. I will need to check out what is new and novel.

That said, a company I worked for many moons ago came up with a method whereby, by reorganizing coefficients post-quantization, you could easily get about a 20% improvement in encoding efficiency, but the result was not JPEG compatible.

There is a lot that can be played with.
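On the measurement point: PSNR is the usual first-order check that a smaller file has not simply traded away quality. A short sketch with numpy and Pillow, where the file names are placeholders and the reference should be a lossless source:

    # PSNR sketch: higher is better; identical images give infinity.
    # Requires numpy and Pillow; compares a re-encoded image to its source.
    import numpy as np
    from PIL import Image

    def psnr(original_path, encoded_path):
        a = np.asarray(Image.open(original_path).convert("RGB"), dtype=np.float64)
        b = np.asarray(Image.open(encoded_path).convert("RGB"), dtype=np.float64)
        mse = np.mean((a - b) ** 2)
        if mse == 0:
            return float("inf")
        return 10 * np.log10(255.0 ** 2 / mse)

    print(f"PSNR: {psnr('original.png', 'reencoded.jpg'):.2f} dB")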
transfire about 11 years ago
If only JPEG supported transparency.
(1 reply not loaded)
Matrixik about 11 years ago
When I optimize JPG or PNG files I usually use ScriptJPG and ScriptPNG from http://css-ig.net/tools/

They are shell scripts that run many different optimizers.
morganw about 11 years ago
"Support for progressive JPEGs is not universal": https://en.wikipedia.org/wiki/JPEG#JPEG_compression

e.g. the hardware decoder in the Raspberry Pi: http://forum.stmlabs.com/showthread.php?tid=12102
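Whether a particular file will hit such a decoder's unsupported path can be checked by reading its start-of-frame marker: baseline JPEGs use SOF0 (0xC0), progressive ones SOF2 (0xC2). A small sketch that walks the marker segments:

    # Reports whether a JPEG is baseline (SOF0), progressive (SOF2), etc.,
    # by walking marker segments up to the first start-of-frame marker.
    import struct, sys

    SOF_NAMES = {0xC0: "baseline", 0xC1: "extended sequential", 0xC2: "progressive"}

    def jpeg_kind(path):
        with open(path, "rb") as f:
            data = f.read()
        assert data[:2] == b"\xff\xd8", "not a JPEG"
        i = 2
        while i < len(data) - 1:
            while data[i] == 0xFF:          # skip fill bytes up to the marker id
                i += 1
            marker = data[i]; i += 1
            if marker == 0xD8 or marker == 0x01 or 0xD0 <= marker <= 0xD7:
                continue                    # standalone markers carry no length
            if marker in (0xDA, 0xD9):
                break                       # start of scan / end of image: no SOF seen
            (length,) = struct.unpack(">H", data[i:i + 2])
            if 0xC0 <= marker <= 0xCF and marker not in (0xC4, 0xC8, 0xCC):
                return SOF_NAMES.get(marker, "other SOF type")
            i += length                     # jump over the segment payload
        return "no SOF marker found"

    print(jpeg_kind(sys.argv[1]))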
kraken-io about 11 years ago
Hey everyone, after some testing we have just deployed mozjpeg to our web interface at https://kraken.io/web-interface

You can test it out by selecting the "lossless" option and uploading a JPEG. Enjoy!
kllrnohj about 11 years ago
So... version 1.0 is basically a shell script that calls libjpeg-turbo followed by jpgcrush?
(3 replies not loaded)
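For reference, reproducing such a pipeline by hand looks roughly like the sketch below; the jpgcrush invocation is an assumption (a script that takes a JPEG filename and rewrites it in place) rather than a description of mozjpeg's actual internals, and the quality setting is arbitrary:

    # Rough "encode, then crush the scan script" pipeline sketch.
    # Assumes cjpeg (libjpeg-turbo) is on PATH and that jpgcrush accepts a
    # JPEG filename and rewrites it in place; verify both invocations locally.
    import subprocess, sys

    src_ppm, out_jpg = sys.argv[1], sys.argv[2]

    # 1. Plain libjpeg-turbo encode at quality 75.
    subprocess.run(["cjpeg", "-quality", "75", "-outfile", out_jpg, src_ppm],
                   check=True)

    # 2. Losslessly rewrite the progressive scan script for a smaller file.
    subprocess.run(["jpgcrush", out_jpg], check=True)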
sp332 about 11 years ago
Any chance of incorporating other psy improvements, instead of just targeting SSIM?
Momentum about 11 years ago
At first glance this seems wasteful; I do not think anyone has a problem using JPEG today. But then, in many cases, before a thing is invented nobody has a problem using the old tools!
SimHacker about 11 years ago
Has somebody translated the jpeg library to JavaScript? Besides encoding and decoding jpeg, it has some useful modules that would be nice to have in the web browser.
callesgg about 11 years ago
A bit too soon to start announcing the project, but I like the initiative and hope the project manages to improve things.
(2 replies not loaded)
davidgerard about 11 years ago
What license are they doing this under? Hopefully they're aiming to upstream this to libjpeg.
jimbones about 11 years ago
This is so dumb. There are a million JPEG crushers in existence, but instead of advocating the use of one of them, Mozilla writes their own? Why not support WebP, rather than dismissing it due to compatibility concerns and wasting time doing what has been done before?