I try to always keep the size of the images I publish on the web to a minimum. I had been using the smush.it service a lot, but since Y! took over the service it hasn't been working too well -- for example, I can't smush locally stored images any more.

Also, minifying and merging JS and CSS is a good trick to keep loading times down.

Now I'm interested to know: which webapps, tools, or tricks do you use to optimize your web content?
1) The YSlow presentations.

2) YSlow itself.

3) Cheating creatively. For example, the front page of Bingo Card Creator has one middle-weight screenshot on it, which links to a hefty full-sized version in a lightbox, accompanied by some text exhorting users to download or sign up now. I want opening that lightbox to feel faster than instant, but the screenshot is huge. So I cheat: by the simple expedient of placing <img src="/blah.jpg" style="display: none;" /> somewhere in the front page's markup, everybody who hits the page starts loading the image more or less immediately. Then, when they trigger the lightbox some seconds later, the image displayed in it (<img src="/blah.jpg" />) doesn't come from my server at all; it comes straight from their browser cache and renders almost instantly.

Related trick: I load the critical assets for my purchasing page on every page of the site. Thus, someone accessing the purchasing page (which I have clear reason$ to want to be fast) will almost always see it stupidly fast, despite it having a fair bit of Javascript and whatnot to render the cart.

I also abuse the browser cache for "live" previews in my web app. Switching an image's source to an image which hasn't been downloaded yet is a bad idea, because it typically causes both a delay and unsightly flicker. So instead of switching the source of the visible image directly, you switch the source of a hidden image, then use a Javascript onload callback to switch the visible image to the hidden image's source once the hidden image finishes loading.
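A minimal sketch of that double-image swap, assuming a visible #preview image and a hidden #loader image (the IDs and helper name are illustrative, not from his actual code):

    <img id="preview" src="/previews/default.jpg">
    <img id="loader" src="" style="display: none;">

    <script>
      // Load the new image into the hidden #loader first; only when it has
      // fully arrived do we point #preview at it, so the swap is flicker-free
      // and the visible image is served straight from the browser cache.
      function swapPreview(newSrc) {
        var loader = document.getElementById('loader');
        loader.onload = function () {
          document.getElementById('preview').src = newSrc;
        };
        loader.src = newSrc; // kicks off the download
      }
    </script>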
- Host your CSS, JS and images on a CDN
-- Serve them from a different domain. Otherwise every asset request requires cookies to be sent in the headers -- wasted overhead for end users.

- Combine as many JS and CSS files as possible. You will also want to compress them.

- Use memcached to cache backend elements, to speed up page loads and hit the database less (the pattern is sketched after this list).

- If you aren't using a CDN, at least use a reverse proxy so you don't create unnecessary overhead for your webserver.

- Get a profiler for your app and see which areas are slower than others. Find a database profiler and fix queries that are slow. If you're using MySQL, use the slow query log to see where the holdups are.

- Make sure you have the proper indexes on your database. Overdoing it can waste memory.

- Use gzip if possible. Most browsers support it and it can give a noticeable speed increase.
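A minimal sketch of that memcached-style cache-aside pattern in Javascript, with a plain object standing in for a real memcached client (the key scheme and query helper are illustrative):

    // Cache-aside: check the cache first, fall back to the database,
    // then store the result so the next request skips the database.
    var cache = {}; // stand-in for a real memcached client

    function getUser(id, queryDatabase) {
      var key = 'user:' + id;
      if (cache[key]) {
        return cache[key];          // cache hit: no database work
      }
      var user = queryDatabase(id); // cache miss: one database round trip
      cache[key] = user;            // a real client would also set an expiry
      return user;
    }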
A little trick from FLOSS Podcast's session with John Resig: instead of serving jQuery from your domain, use the Google-hosted one at http://ajax.googleapis.com/ajax/libs/jquery/1.3/jquery.min.js -- you get the benefit of Google's CDN, and there's a fair chance it's already cached in the user's browser.
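In markup, that's just a script tag pointing at Google's copy instead of your own (same URL as above):

    <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3/jquery.min.js"></script>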
Since the YSlow recommendations and practices are almost universally used, wouldn't it be possible to automate the process? Could someone write a servlet filter that incorporated the optimization rules and automatically applied them to outgoing content on the fly? You could also make it an Apache module so that it could sit at the web server layer instead of at the container layer.

This would also address the issues that arise when you're using a product or framework that assembles the page dynamically, and you're not in control of how the CSS links are constructed.
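Not the servlet filter imagined above, but the same idea in miniature: a Node reverse proxy that applies one YSlow rule (gzip textual responses) to whatever the backend emits, without the backend knowing. The backend host and ports are hypothetical, and a real filter would also check the client's Accept-Encoding header first.

    var http = require('http');
    var zlib = require('zlib');

    var BACKEND = { host: 'localhost', port: 8080 }; // hypothetical app server

    http.createServer(function (clientReq, clientRes) {
      var proxyReq = http.request({
        host: BACKEND.host,
        port: BACKEND.port,
        path: clientReq.url,
        method: clientReq.method,
        headers: clientReq.headers
      }, function (backendRes) {
        var type = backendRes.headers['content-type'] || '';
        if (type.indexOf('text/') === 0) {
          // The "filter" part: compress textual responses on the way out.
          clientRes.writeHead(backendRes.statusCode, {
            'Content-Type': type,
            'Content-Encoding': 'gzip'
          });
          backendRes.pipe(zlib.createGzip()).pipe(clientRes);
        } else {
          clientRes.writeHead(backendRes.statusCode, backendRes.headers);
          backendRes.pipe(clientRes);
        }
      });
      clientReq.pipe(proxyReq);
    }).listen(8081);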
I've found YSlow very useful in identifying problems.

My lazy, three-step optimization process:

1. turn on gzip encoding in Apache

2. verify that caching headers (ETag/Last-Modified) are being set -- a quick check is sketched below

3. tail -f mysql-slow.log
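One quick way to do step 2, as a throwaway Node script; the asset URL is a placeholder for one of your own:

    // Fetch a static asset and report whether caching headers came back.
    // Node lowercases header names, hence the lowercase lookups.
    var http = require('http');

    http.get('http://example.com/css/site.css', function (res) {
      console.log('ETag:             ' + (res.headers['etag'] || 'MISSING'));
      console.log('Last-Modified:    ' + (res.headers['last-modified'] || 'MISSING'));
      console.log('Content-Encoding: ' + (res.headers['content-encoding'] || 'none'));
    });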
I do the following:

1. Use a Python profiler and look for problems, which usually means rewriting DB calls to trade off storage space against CPU time, and/or front-loading tasks so they're heavier on write than read

2. Write memcache layers wherever possible

3. OptiPNG, or cut the backgrounds out of GIFs

4. Clean up redundant CSS/JS

5. Minify

6. Set really long expiry headers, but use a version number that I can increment in my static files' query strings (sketched below)
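A minimal sketch of that versioning trick (item 6); the helper name and version constant are illustrative:

    // Far-future expiry headers make assets cacheable "forever"; bumping
    // ASSET_VERSION on deploy changes every URL and busts the cache.
    var ASSET_VERSION = 42; // increment on each deploy

    function staticUrl(path) {
      return path + '?v=' + ASSET_VERSION;
    }

    // staticUrl('/css/site.css') -> '/css/site.css?v=42'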
* In Photoshop, "Save for Web" and let it figure out what it thinks is optimal

* YSlow for Firebug, to see if anything is loading really slowly / being gross

* Like you mentioned, YUI Compressor to minify/compress CSS and JS

* We use Symfony, so the Symfony dev toolbar to profile slow SQL queries and generally see what is causing pages to render slowly

* Lately, Firebug's Net panel to monitor which requests are taking the most time
For images, I haven't found a better solution than to fire up Photoshop and run a batch "Save for Web" process on all the images. The quality looks the same, and the images are WAY smaller.

I've used pngcrush before, but it's just for PNGs, and it didn't come close to the results Photoshop was giving me.

Photoshop does cost ~$700, but if you already have it, definitely give it a whirl.