How does this shit make it to the front page?

First and foremost, everyone needs caching. It's what makes computers fast. That RAM you have? Cache. The memory in your CPU? Cache. The memory in your hard drive? Cache.

Your filesystem has a cache. Your browser has a cache. Your DNS resolver has a cache. Your web server's reverse proxy [should] have a cache. Your database [should] have a cache. Every place you can conceivably shove in another cache, it can't hurt. Say it with me now: Cache Rules Everything Around Me.

First you should learn how web servers work, why we use them, and how to configure them. The reason your Apache instance was running slow is probably because you never tuned it. Granted, five years ago its asynchronous capabilities were probably haggard and rustic. It's gotten a lot more robust in recent years, but that's beside the article's point. Nginx is an apple, CloudFront is an orange.

Next you should learn what CDNs are for. Mainly it's to handle lots of traffic reliably and provide a global presence for your resources, as well as shielding your infrastructure from potential problems. Lower network latency is just a happy side effect.
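And none of those downstream caches can help unless the app tells them what's cacheable. A minimal sketch of that (plain WSGI, no particular framework assumed; the port and max-age are illustrative, not recommendations):

    # Minimal WSGI app that marks its response as cacheable, so the browser,
    # any reverse proxy, and a CDN in front can all reuse it without hitting
    # the app again.
    from wsgiref.simple_server import make_server

    def app(environ, start_response):
        body = b"hello from the origin\n"
        start_response("200 OK", [
            ("Content-Type", "text/plain"),
            ("Content-Length", str(len(body))),
            # "public" lets shared caches store it; max-age is the reuse window.
            ("Cache-Control", "public, max-age=3600"),
        ])
        return [body]

    if __name__ == "__main__":
        make_server("", 8000, app).serve_forever()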
More generally: once you adopt any of the various schemes for having an inbound proxy/front-end cache (Fastly, CloudFlare, CloudFront, or an in-house Varnish/Squid/etc.), are all the optimizing habits of moving static assets to a dedicated server now superfluous?

I think those optimizing habits *are* now obsolete: best practice is to have a front-end cache.

A corollary is that we usually needn't worry about a dynamic framework serving large static assets: the front-end cache ensures it happens rarely.

Unfortunately, it's still the doctrine of some projects that a production project will always offload static-serving. So, for example, the Django docs are filled with much traditional discouragement around using the staticfiles serving app in production, including vague 'this is insecure' intimations. In fact, once you're using a front-end cache, there's little speed/efficiency reason to avoid that practice. And if it is specifically suspected to be insecure, any such insecurity should be fixed rather than overlooked simply because "it's not used in production".
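For concreteness, a minimal sketch of what that looks like in Django, wiring the staticfiles view so the app itself answers the (rare) cache misses. This is my sketch, not the officially blessed setup, and it uses modern URL syntax (a 2012-era project would spell the pattern with url() instead of re_path):

    # urls.py -- with a front-end cache/CDN in front, almost every request
    # for /static/ is served from the cache; the app only sees the odd miss.
    from django.urls import re_path
    from django.contrib.staticfiles.views import serve

    urlpatterns = [
        # ... your regular views ...
        # insecure=True is the switch the docs warn about; the point above is
        # that behind a front-end cache the performance objection mostly
        # disappears.
        re_path(r"^static/(?P<path>.*)$", serve, {"insecure": True}),
    ]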
Wait, so if I pay more for a CDN to deliver my static data, that will work better than when I try to save money and do it myself?

[Insert Oscar-winning Face of Shock here]
nginx still buys you SSI (which allows you to, for example, cache the same page for all users and have nginx swap out the username with a value stored in memcache), complex rewrite rules, fancy memcache stuff with the memc module (e.g. view counters), proxying to more than ten upstream servers, FastCGI, and lots of other fancy stuff.

CloudFront is a replacement for Varnish, not nginx.
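Roughly what that SSI-plus-memcache trick looks like in nginx configuration (the directives are real nginx ones; the backend address, key scheme, and /_username location are illustrative):

    # Serve one cached page for everyone, but let nginx splice in the
    # per-user username at request time via an SSI include.
    location / {
        proxy_pass http://backend;   # or a cached/static copy of the page
        ssi on;                      # the HTML contains:
                                     # <!--# include virtual="/_username" -->
    }

    location = /_username {
        internal;                                  # only reachable via subrequest
        set $memcached_key "username:$cookie_sessionid";
        memcached_pass 127.0.0.1:11211;            # stock memcached module;
                                                   # the memc module adds more ops
    }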
I was once told by somebody wise that if a post asks a question, then the answer is usually no.

e.g.: Is Mountain Lion going to kill Windows 8? ...etc.
Does anyone have experience with using nginx as a caching proxy? I've used Varnish and swear by it; it's just an amazing piece of software. How well can nginx replace Varnish?
I think not. Requirements change, and locking myself into a front-end cache is not appealing. I may also have things which I can't or won't let others cache for me, so I want my local stack to be optimized anyway. You won't see me serving everything out of WEBrick anytime soon just because I have a cloud cache.

It's nice to be able to defer decisions, especially optimizations, but making performance someone else's problem entirely seems like it could promote sloppy thinking and poor work. It's the difference between augmenting a solid platform when the need arises and front-loading dependencies because it's okay to be lazy.
There's a good post from late 2011, in the context of 12-factor deployment on Heroku, where the author muses about just using a pure Python server behind a CDN to serve static content:

*...and yeah, I think I should bloody use this server as a backend to serve my [static files] in production.*

http://tech.blog.aknin.name/2011/12/28/i-wish-someone-wrote-django-static-upstream-maybe-even-me/
Sure, it's obsolete, who needs databases and live, changing data? All we need is static pages. Besides, who needs to build his own infrastructure? It's 2012, right? Let's buy it.
If you want to serve static files cheaply and are moving less than 10TB/mo, you will find that CloudFront is an order of magnitude more expensive than a bunch of VPSes with lots of monthly bandwidth.

The viability of this depends heavily on the use case, but if you're moving funny pictures of cats then you won't be generating lots of income and will want to optimize bandwidth costs.
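A back-of-the-envelope way to see the gap; every price and bandwidth figure below is a made-up placeholder (not CloudFront's or any provider's actual rates), so plug in real quotes before deciding anything:

    # Rough cost comparison for ~10 TB/month of static traffic.
    # All numbers are hypothetical placeholders, not real price lists.
    tb_per_month = 10
    gb_per_month = tb_per_month * 1000

    cdn_price_per_gb = 0.12                 # placeholder per-GB transfer price
    cdn_cost = gb_per_month * cdn_price_per_gb

    vps_monthly_price = 20                  # placeholder VPS price
    vps_included_tb = 5                     # placeholder bandwidth per VPS
    vps_needed = -(-tb_per_month // vps_included_tb)   # ceiling division
    vps_cost = vps_needed * vps_monthly_price

    print("CDN: ~$%d/month" % cdn_cost)     # ~$1200 with these placeholders
    print("VPS: ~$%d/month" % vps_cost)     # ~$40 with these placeholders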
Before implementing that, be aware that CloudFront doesn't support custom SSL certificates. If you have any user sessions in your app, you don't want users logging in at https://efac1bef32rf3c.cloudfront.net/login
CloudFront is pretty good; just make sure you are able to configure your asset source in one line. Otherwise you have to use a tool to invalidate the CloudFront cache frequently during development, and invalidation isn't instant.
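One way to keep that asset source to a single line is to bake a version into the URL, so a deploy changes the URL instead of requiring an invalidation. A small sketch; the host, version string, and helper name are made up for illustration:

    # Build asset URLs from one host + one version setting. Bumping the
    # version on deploy gives the CDN a fresh cache key, so stale copies are
    # simply never requested again and no invalidation is needed.
    ASSET_HOST = "https://dxxxxxxxxxx.cloudfront.net"   # hypothetical distribution
    ASSET_VERSION = "20120715a"                          # bump on each deploy

    def asset_url(path):
        return "%s/%s/%s" % (ASSET_HOST, ASSET_VERSION, path.lstrip("/"))

    print(asset_url("css/site.css"))
    # -> https://dxxxxxxxxxx.cloudfront.net/20120715a/css/site.css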