I'm a bit surprised there's no mention of image optimization proxies/services like thumbor[0] (which is open source). Instead of pre-processing all your images, it lets you worry about it later. You can compose different transformations and filters (e.g. add a watermark, resize, crop, etc.). This is especially useful when things on the website change. It lets you keep the originals at full size and transform them as you need.<p>There are some commercial services in this space, as well as other similar open source services.<p>If you're looking for a quick way to get thumbor up and running with Docker, I'd plug <a href="https://github.com/minimalcompact/thumbor" rel="nofollow">https://github.com/minimalcompact/thumbor</a><p>[0] <a href="https://github.com/thumbor/thumbor" rel="nofollow">https://github.com/thumbor/thumbor</a>
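As a rough sketch of what that looks like in practice (the port mapping is an assumption; check the image's README for the port thumbor actually listens on inside the container):<p><pre><code># run the thumbor container (the container port here is an assumption)
docker run -d -p 8888:80 minimalcompact/thumbor

# request a 300x200, smart-cropped, quality-80 version of a remote image
# (assumes the default config with unsafe URLs enabled)
curl -o thumb.jpg \
  "http://localhost:8888/unsafe/300x200/smart/filters:quality(80)/example.com/images/photo.jpg"
</code></pre>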
As someone else mentioned, it's much easier to just use ImageOptim if you're on a Mac. There's also a CLI and an accompanying npm package that bundles it with ImageAlpha and JPEGmini: <a href="https://www.npmjs.com/package/imageoptim-cli" rel="nofollow">https://www.npmjs.com/package/imageoptim-cli</a><p>But one thing I'd caution is that WebP is not a panacea for image optimization. It's only supported in Chrome. If you want to fully leverage next-gen image formats cross-browser, you'll also need JPEG 2000 and JPEG XR... and even if you do all of that, you still won't get support in Firefox.<p>srcset and lossy compression are also viable options: <a href="https://userinterfacing.com/the-fastest-way-to-increase-your-sites-performance-now/" rel="nofollow">https://userinterfacing.com/the-fastest-way-to-increase-your...</a>
Coincidentally, when I was a freshman in high school in the late '90s, this was a topic our instructor drilled into us. I remember trying to shave every little kilobyte off .gif and .jpg files to make my personal website load as quickly as possible over a modem, with a reasonable amount of quality.<p>From my perspective, it seems that everything has gotten way more bloated and there is an assumption that everyone has unlimited data and bandwidth. I used to have a 1 GB data cap on my phone that I would blow out in a couple of weeks just from reading news websites. For example, bloomberg.com shouldn't need to make nearly ~300 requests and download 18 MB of data just to load the front page.
A long time ago I "automated" the optimization of images on my site by running `optipng` in a cronjob. Every file that touches my server gets optimized.<p>I wrote about it here: <a href="https://ma.ttias.be/optimize-size-png-images-automatically-webserver-optipng/" rel="nofollow">https://ma.ttias.be/optimize-size-png-images-automatically-w...</a><p>Benefits:<p>- Don't have to think about it<p>- Optipng is really good at reducing PNGs to their bare minimum<p>Downsides:<p>- Doesn't resize images (if a 1024x768 is displayed as a 10x8, it'll still download the 1024x768)<p>- Only does PNG<p>- If your images are stored in git (and you didn't pre-optimize before committing/deploying), you can get merge conflicts<p>Still, better than nothing.
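A minimal sketch of that kind of cron job (the web root path, schedule and optimization level are assumptions, not the exact setup from the post):<p><pre><code># crontab entry: every night at 03:00, losslessly optimize PNGs modified in the last day
0 3 * * * find /var/www -name '*.png' -mtime -1 -exec optipng -quiet -o2 {} \;
</code></pre>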
<picture>
<source srcset="sample_image.webp" type="image/webp">
<source srcset="sample_image.jpg" type="image/jpg">
<img src="sample_image.jpg" alt="">
</picture><p>Didn't know you could wrap images in a <picture> tag and browsers (except for IE) will automatically download the .webp version if they support it. I used to do this via JavaScript. I like on-demand scaling where you pass scaling parameters in the URL, such as: /200x200/sample_image.jpg.
If you’re developing on a Mac, ImageOptim can handle all of the image compression (JPG, PNG, etc): <a href="https://imageoptim.com/mac" rel="nofollow">https://imageoptim.com/mac</a><p>For SVGs, svgo (brew install svgo) usually produces the best results for me.
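For reference, a typical svgo invocation looks something like this (flags per svgo 1.x; the file names are placeholders):<p><pre><code># optimize a single SVG with multiple passes
svgo --multipass -i logo.svg -o logo.min.svg

# or optimize a whole folder in place
svgo -f ./assets/icons
</code></pre>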
I just ran a quick comparison on two big ecommerce sites I'm familiar with. I know for a fact they performed as much optimization as they possibly could in their respective file formats.<p>The WebP and .jpg files both had similar dimensions, picture detail/complexity, and DPI. The WebP format came out to a 50% smaller file size.<p>I didn't have enough of a sample and/or enough tests, though.<p>I personally don't think image optimization with WebP should be a thing, though. The lack of native web support is one issue; the lack of native support on Windows is another.<p>Two things IMO are most important about image optimization for image-heavy sites. One is lazy loading (e.g. lazyloading.js, a frontend library), by specifying a class on the backend for images past a threshold browser height. This is most notably used on many ecommerce sites, but on analysis Amazon doesn't seem to be using it.<p>Next would be sprite compression of common social links. A great example of this is actually Amazon; check out this image I extracted from their webpage.<p><a href="https://images-na.ssl-images-amazon.com/images/G/01/gno/sprites/nav-sprite-global_bluebeacon-V3-2x_optimized._CB474516457_.png" rel="nofollow">https://images-na.ssl-images-amazon.com/images/G/01/gno/spri...</a>
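As an illustration of the sprite idea, here's roughly how a sprite sheet could be stitched together from individual icons with ImageMagick's montage (the file names are placeholders, and this is obviously not Amazon's actual pipeline):<p><pre><code># stitch three social icons into a single horizontal sprite with no padding
montage facebook.png twitter.png instagram.png \
  -tile 3x1 -geometry +0+0 -background none sprite.png
</code></pre>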
I've been using this great ImageMagick script for optimizing images for the past few years. Works like a charm. Any images that are going to be served from my websites first get optimized via the script.<p><pre><code> # $1 = input image, $2 = target size (e.g. 300 for 300px wide), $3 = output directory
 smartresize() {
   mogrify -path "$3" -filter Triangle -define filter:support=2 -thumbnail "$2" -unsharp 0.25x0.08+8.3+0.045 -dither None -posterize 136 -quality 82 -define jpeg:fancy-upsampling=off -define png:compression-filter=5 -define png:compression-level=9 -define png:compression-strategy=1 -define png:exclude-chunk=all -interlace none -colorspace sRGB "$1"
 }
</code></pre>
Usage:<p><pre><code> smartresize image.png 300 outdir/
</code></pre>
Looks like I must have found it here <a href="https://www.smashingmagazine.com/2015/06/efficient-image-resizing-with-imagemagick/" rel="nofollow">https://www.smashingmagazine.com/2015/06/efficient-image-res...</a>
If you are using webpack, I highly recommend imagemin-webpack-plugin [0] (although I might be a bit biased as I created it...)<p>It will run a slew of image optimizers by default using imagemin, and has support for a wide range of others.<p>It also supports caching and optimization of images that aren't being directly imported through webpack (thanks to some awesome contributors) so it's a great way to set it and forget it and never have to worry about sending 3mb images to your users by accident.<p>[0] <a href="https://github.com/Klathmon/imagemin-webpack-plugin" rel="nofollow">https://github.com/Klathmon/imagemin-webpack-plugin</a>
If you are using Ruby, I can recommend the image_optim[0] gem together with image_optim_pack (which packs the binaries). It is maintained by a great person I only know by his handle, "toy".<p>I used to give him a few dollars per week when Gratipay was still up and running; sadly, I don't know of an alternative now.<p>[0] <a href="https://github.com/toy/image_optim" rel="nofollow">https://github.com/toy/image_optim</a>
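The gem also ships an image_optim command-line tool; a rough sketch of using it (the flag and paths are from memory of the README, so double-check with image_optim --help):<p><pre><code># install the gem plus the precompiled binaries
gem install image_optim image_optim_pack

# optimize every image under ./public, recursing into subdirectories
image_optim --recursive ./public
</code></pre>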
Another option, if your images are simple enough, is to use <a href="https://github.com/fogleman/primitive" rel="nofollow">https://github.com/fogleman/primitive</a> to convert them to SVG. It might not be worth the effort, though, as the space savings may be too small to justify the artifacts produced. Still, it's a neat effect for small numbers of shapes.
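For the curious, a basic primitive invocation looks something like this (the shape count and mode are arbitrary example values):<p><pre><code># approximate the input with 100 triangles and write the result as an SVG
primitive -i photo.png -o photo.svg -n 100 -m 1
</code></pre>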
Sometimes I visit a project and drop all of its public images (logos, icons, stock photos, whatever) into ImageOptim. A 75% reduction in file size for some images is not uncommon.
One of the nicest additions to the GitHub Marketplace is a bot that will optimise your images automatically: <a href="https://github.com/marketplace/imgbot" rel="nofollow">https://github.com/marketplace/imgbot</a><p>(No connection to me - just think it's a great idea, and totally free.)
Making sure that grayscale/monochrome images are encoded as such (oftentimes they aren't) can also shave the size down. I use ImageMagick for that.
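A sketch of how that might look with ImageMagick (the file names are placeholders):<p><pre><code># re-encode an RGB image that is actually monochrome as true grayscale, dropping metadata
convert input.png -colorspace Gray -strip output.png
</code></pre>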
Set up metrics that you'd like to hit for your pages.
When I was a kid putting up a fan site for games (back on GeoCities), I was mostly stuck on a 28.8k modem and aimed to have it finish loading in 30s.<p>Now, I was quite brutal with the compression and probably should have backed off a bit to avoid artifacts.<p>It might be nice to have an automated test that throttles bandwidth, loads your pages, and checks them against the load times you'd like to hit.
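One way to approximate that today is Lighthouse's CLI with simulated throttling (a sketch; the flags are from Lighthouse's CLI options and the URL is a placeholder):<p><pre><code># audit the page with simulated network/CPU throttling, performance category only
lighthouse https://example.com --only-categories=performance \
  --throttling-method=simulate --output=json --output-path=./report.json

# a CI step could then fail the build if the performance score or load metrics
# in report.json fall below thresholds you pick
</code></pre>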
Is there a robust, automated, and universal process for gauging image optimization results and comparing them to a required image quality level? I've always done it manually, and I have a keen sense of image degradation as I increase the compression.
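Not universal, but objective metrics like PSNR or SSIM can automate part of it; for example, with ImageMagick's compare tool (SSIM support depends on the ImageMagick version, so PSNR is used here):<p><pre><code># compare the optimized image against the original; a higher PSNR means it is closer to the original
compare -metric PSNR original.png optimized.png difference.png
</code></pre>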
I personally have been working around web technologies and performance for the last 14 years. I understand where this article is coming from, and I actually have to give it some credit: it has a good click-bait title, but it's an old solution. I'm tired of reading about this type of solution; there are thousands of articles exactly like this one.<p>Reducing the size of your images is just the first step, and there are many things you need to consider in order to make your website faster, things you need to solve:
- Format: deliver the images in the right format for each browser (e.g. using WebP for Chrome)
- Size: What size for each image? What happens on mobile, tablet, desktop and the different screen sizes and pixel ratios (it's not only retina or not)
- Quality: Is your image being resized by the browser? Are you using raw files to generate the optimized images?
- Thumbnails: Are you also generating thumbnails for listings or smaller versions of your images? How are you linking those thumbs to your original images? Do you need to use a database?
- Storage: Where are you going to store those images?
- Headers: Caching static assets is key for returning users. Are you using Apache or Nginx? Is your setup working well?
- CDN: Are you using a CDN to deliver those assets? CloudFlare is great but it's not the fastest way to deliver images. What about setting the right configuration for that CDN? How much are you going to spend?<p>So what's next? Going to one of the API services that optimize images and reading their 500 pages of docs just to resize and crop an image? Adding complex plugins to your backend and ending up with a tightly coupled integration?<p>I mean, if you like to add more dependencies to your project, maintain more code, spend hours rebuilding scripts and running cron jobs to update your images, go for it.<p>That's why about 9 months ago we started working on a new concept, solving all these problems with a service that integrates as easily as a lazy-loading plugin and solves EVERYTHING about image optimization (and yes, everything that you're talking about in every comment on this HN post).<p>Don't get me wrong, we have a lot to improve and there are many details of our product that we need to polish, but we believe we've built a solution that handles all the most important parts of image optimization and delivery well. It's not about reducing the image size by 1 KB more; it's about everything else and understanding the big picture.<p>We love feedback and our backlog is prioritized based on our customers' needs, so let us know what you think.<p>Here's the link to our startup's website: <a href="https://piio.co" rel="nofollow">https://piio.co</a>
After seeing people forget to do basic optimization steps on images at our respective jobs, a friend and I built <a href="https://www.shrink.sh" rel="nofollow">https://www.shrink.sh</a>. The goal of this tool is to be a catch-all system. It also saves you from installing tools that slow down your builds or deploys even more and that you would need to maintain forever.
For one of my websites (a static page with some pretties), I challenged myself to remove as much cruft as possible without degrading the experience.<p>I used Fontello to strip out unnecessary FontAwesome icons and uncss to remove unused Bootstrap styles, replaced some Bootstrap JS with vanilla JS, and made use of SVGs (optimised with SVGOMG) for backgrounds and the logo.<p>The resulting site is a total of 178 KB when viewed in Chrome (down from over 1 MB), including Bootstrap, analytics, some screenshots, a custom font and an animated logo. There's plenty more I could do to trim off size, but I had more important things to do.<p>There are so many ways to make webpages smaller and more efficient, and it can be a really fun learning experience.
<a href="https://github.com/DarthSim/imgproxy/" rel="nofollow">https://github.com/DarthSim/imgproxy/</a> imgproxy I've not seen mentioned, it is probably the fastest image processing I've used yet. It uses libvips, <a href="https://github.com/jcupitt/libvips/wiki/HOWTO----Image-shrinking" rel="nofollow">https://github.com/jcupitt/libvips/wiki/HOWTO----Image-shrin...</a> which not only handles resizing & other basic img needs but optimizes on top of being very light weight on memory & CPU cycles compared to most other implementations.
There's some unexamined hooey in this post. For example, you can't really compress a JPEG, but you can re-encode it at a lower quality, which can have a dramatic effect on the file size. That's what mozjpeg is doing for this person.<p>But they could have just done the same thing in Photoshop, Preview, the MS Office image tool, etc. JPEG is a standard; file size does not depend on what tool you use to create it. It's strictly dependent on the image itself and the render settings you choose. Same with PNG.<p>In fact, you'll get better quality for the file size if you go directly from the original image straight to your final resolution in one step. Rendering to a high-quality JPEG, then re-rendering on the server to shrink the file size, will give you worse image quality than going straight from the original file to the final one in a single render.<p>WebP looks promising but is not yet well supported. Most sites can go a long way just by caring about, testing for, and adjusting image rendering defaults to optimize for file size.<p>EDIT to add a bit more:<p>If you are optimizing images as part of a pre-deploy build process, you can use whatever library you want. The only things that really matter are your choice of format (JPG or PNG) and the render settings. Or you can hand-optimize the images and drop them into your repo to deploy as-is.<p>If you're running a CMS where non-developers are going to be uploading images through an admin UI (like Wordpress), your CMS should be using a server-side library to render optimized versions of the uploaded images and then serve those optimized versions. You can adjust the settings of the server's image library, although that might require a plugin, a module, or custom code, depending on the CMS.<p>Missing this is a common killer mistake in page load times. I visited a site the other day that served a <i>16 MB JPEG file</i> for the "hero" image on the homepage. My guess is that it was the JPEG straight out of a high-resolution camera.<p>This is also good for user privacy, as the server-side rendering should remove the IPTC and EXIF data that would otherwise get served with the original image.
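As a concrete illustration of the "one render straight from the original" point, something like this with ImageMagick (the sizes and quality here are example values, not the author's settings):<p><pre><code># go from the camera original to the final web size in a single render,
# stripping EXIF/IPTC metadata for privacy along the way
convert camera-original.jpg -resize 1600x -strip -quality 80 hero.jpg
</code></pre>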
You can save another 7.6% on that png by passing it through advpng+pngout. ImageOptim is fantastic for this: <a href="https://imageoptim.com/mac" rel="nofollow">https://imageoptim.com/mac</a>
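If you'd rather run those two from the command line, it's roughly this (a sketch; advpng comes from the advancecomp package, and both tools work losslessly):<p><pre><code># recompress the PNG's deflate stream at the highest level
advpng -z -4 image.png

# then let pngout try to shave off a bit more
pngout image.png
</code></pre>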
We at Gumlet also provide image optimisation that just works: <a href="https://www.gumlet.com" rel="nofollow">https://www.gumlet.com</a> It's accompanied by a client-side JavaScript library: <a href="http://github.com/gumlet/gumlet.js" rel="nofollow">http://github.com/gumlet/gumlet.js</a>
Cloudinary is another option to get optimized images/videos without having to manually optimize everything. Definitely a good option if you have lots of user uploaded content. Running an optimization script within your app will likely use more resources than it's worth unless you're already operating at scale.
The 2016 Google developers conference had something on this: <a href="https://m.youtube.com/watch?v=r_LpCi6DQME" rel="nofollow">https://m.youtube.com/watch?v=r_LpCi6DQME</a>
We recently moved all our images to S3 and created a Lambda function that compresses the images using Guetzli. It's very slow, but the results are good.
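For reference, the Guetzli CLI itself is a one-liner (Guetzli only accepts quality values of 84 or higher, and it is notoriously slow and memory-hungry):<p><pre><code># re-encode a JPEG with Guetzli's perceptual encoder; expect this to take a while
guetzli --quality 85 input.jpg output.jpg
</code></pre>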
This article like many others is full of fallacies.<p>Image formats are not used wisely: [1] is PNG not JPEG and [2] is JPEG not PNG.<p>> I found that setting quality (mozjpeg) to 70 produces good enough images for the most part, but your mileage may vary.<p>You can get away with this setting for hidpi sizes but 1x will look horrible [1]. If you care about quality, the mileage is actually 75-95.<p>> (Pngquant) quality level of 65-80 to provide a good compromise between file size and image quality<p>Again, it may only be applied to hidpi sizes, and it will easily ruin any gradients or previously quantized images.<p>Pngquant is a great color quantization tool but it does not actually perform any lossless PNG optimizations, which can save you at least 5% more, and up to 90% in some cases.<p>All of these tools will also blindly strip metadata (but it's not guaranteed!) along with color profiles and Exif Orientation resulting in color shifts and image transformations respectively.<p>Most importantly, none of them are good enough for automatic lossy compression. Guetzli is the closest but it still has some severe issues [3]. I'm also trying to build a real thing, and it is hard.<p>> there’s value in using WebP formats where possible<p>WebP lossless and WebP lossy are quite different formats. WebP lossy being always 4:2:0 is not a good replacement for JPEG [4] especially at higher quality. On the contrary WebP lossless has evolved into a decent alternative for PNG including lossy [5].<p>Proper responsive images would give you considerably smaller page weight and improve performance on mobile devices. BTW Google treats oversized images as unoptimized [6].<p>[1] <a href="https://freshman.tech/assets/dist/images/http-status-codes/everything-ok-683.jpg" rel="nofollow">https://freshman.tech/assets/dist/images/http-status-codes/e...</a><p>[2] <a href="https://freshman.tech/assets/dist/images/articles/freshman-1600-original.png" rel="nofollow">https://freshman.tech/assets/dist/images/articles/freshman-1...</a><p>[3] <a href="https://github.com/google/guetzli/issues" rel="nofollow">https://github.com/google/guetzli/issues</a><p>[4] <a href="https://research.mozilla.org/2014/07/15/mozilla-advances-jpeg-encoding-with-mozjpeg-2-0/" rel="nofollow">https://research.mozilla.org/2014/07/15/mozilla-advances-jpe...</a><p>[5] <a href="https://twitter.com/jyzg/status/958629795692150790" rel="nofollow">https://twitter.com/jyzg/status/958629795692150790</a><p>[6] <a href="https://developers.google.com/speed/pagespeed/insights/?url=https%3A%2F%2Ffreshman.tech&tab=desktop" rel="nofollow">https://developers.google.com/speed/pagespeed/insights/?url=...</a>
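On the lossless-PNG point above: a lossless optimizer such as zopflipng can be chained after pngquant to claw back those extra few percent (a sketch; the file names are placeholders):<p><pre><code># 1) lossy color quantization
pngquant --quality 65-80 --output quantized.png image.png

# 2) lossless recompression of the quantized result
zopflipng -m quantized.png final.png
</code></pre>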
There are better ways that put usability first. By usability I mean that there is nothing for the content creator to do and nothing for the frontend developer to do.<p>I use mod_pagespeed - there are versions for nginx and Apache that do all of the heavy lifting.<p>With mod_pagespeed you get all of the srcset images at sensible compression levels. All you need to do is mark up your code with width= and height= values for each img.<p>With this in place the client can upload multi-megabyte images from their camera without having to fiddle in Photoshop etc. It just works, and the hard part is abstracted out to mod_pagespeed.<p>By taking this approach there is no need to use fancy build tools. However, a background script to 'mogrify' your source images is a nice complement to mod_pagespeed; if you want your images to be in Google Image Search then 1920x1080 is what you need.<p>The really good thing about taking the mod_pagespeed route is that you do get 'infinite zoom' on mobile, e.g. pinch and zoom and it fills in the next srcset size. Keep going and you eventually get the original, which you have background-converted to 1920x1080.<p>There is also the option to optimise images perceptually, so you are not just mashing everything down to 70% (or 84%).<p>On your local development box you can run without mod_pagespeed and just have the full-resolution images.<p>Or you can experiment with more advanced features such as lazy loading - this also comes for free with mod_pagespeed.<p>If you want your images to line up in nice squares then you might add whitespace to the images, maybe taking time in Photoshop to do this. However, it is easier to just 'identify' the image heights/widths and set something sensible for them, keeping the aspect ratio correct. Then you can use modern CSS to align them in figure elements and let mod_pagespeed fill out the srcsets.<p>Icons and other images that are needed are best manually tweaked into cut-down SVG files and then put into CSS as data URLs, thereby reducing a whole load of extra requests (even if it is just one for a fiddly 'sprite sheet').<p>Oh, a final tweak: if you are running a script to optimise uploaded images and to restrict max size, then you can also use 4:2:0 colour sampling. This is where the image still has the dots but the colours are 'halved in resolution'. This is not noticeable in a lot of use cases and is particularly good if you are using PNGs to get that transparency.<p>As mentioned, mod_pagespeed reduces project complexity by offloading the hard work to the server, keeping cruft out of the project and keeping the build tools out of the way. It can also be configured to inline some images and plenty else to get really good performance.<p>Mileage may vary if the decision has been made to use a CDN where such functionality is not possible. However, if serving a local market then a faux CDN is pretty good, i.e. a static domain on HTTP/2 where the cache is set properly and no cookies are sent up/down the wire to get every image.<p><a href="https://www.modpagespeed.com/doc/filter-image-optimize" rel="nofollow">https://www.modpagespeed.com/doc/filter-image-optimize</a>
<a href="https://www.modpagespeed.com/doc/filter-image-responsive" rel="nofollow">https://www.modpagespeed.com/doc/filter-image-responsive</a>
Venkatraman Santanam made deep learning improve thumbnail representation.<p>Facebook (or Google) showed up and gave him six to eight zeros to the left of the decimal, preceded at the far left by a $, in O(1).<p><a href="https://arxiv.org/pdf/1612.03268.pdf" rel="nofollow">https://arxiv.org/pdf/1612.03268.pdf</a>