Low latency measured from a handful of Pingdom monitoring nodes sitting in data centers does not necessarily translate into the "fastest repo on the web". We've tested CloudFlare since their launch using thousands of real users, and based on that testing its performance tends to be on the low end compared to traditional static-content CDNs. CloudFlare is more of a website proxy than a CDN; because it assumes full control of your website's DNS, it stands out more for add-on features like security. Here is a link to some real-user performance analysis I've compiled for various CDNs, including CloudFlare:
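If you want to collect that kind of real-user data yourself, a minimal sketch using the browser's Resource Timing API looks roughly like this; the /rum-beacon endpoint and the hostname filter are placeholders, not anything taken from the report above.

    // Report how long a CDN-hosted resource took to fetch from this visitor's
    // browser. "/rum-beacon" is a hypothetical collection endpoint.
    (function () {
      if (!window.performance || !performance.getEntriesByType) return; // unsupported browser
      window.addEventListener('load', function () {
        var entries = performance.getEntriesByType('resource');
        for (var i = 0; i < entries.length; i++) {
          var e = entries[i];
          if (e.name.indexOf('cdnjs.cloudflare.com') !== -1) {
            var beacon = new Image();
            beacon.src = '/rum-beacon?url=' + encodeURIComponent(e.name) +
                         '&ms=' + Math.round(e.duration);
          }
        }
      });
    })();

The point is that the timings come from whatever networks and browsers your actual visitors use, rather than from a few well-connected monitoring nodes.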
<a href="http://dl.dropbox.com/u/20765204/feb12-cdn-report/index.html" rel="nofollow">http://dl.dropbox.com/u/20765204/feb12-cdn-report/index.html</a>
This shouldn't be hosted on the cloudflare.com domain. Since I am a customer, every request sends my CloudFlare session cookie and a bunch of Google Analytics cookies.<p>Not only is that 2.5KB of extra header data on every request, but I also don't think CloudFlare should know which websites its customers have been visiting.
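A quick, rough way to see how much cookie data the browser attaches to every request to a domain is to run this in the console while on a page from that domain; note that HttpOnly cookies don't show up here, so the real Cookie header can be even larger.

    // document.cookie only exposes non-HttpOnly cookies, so this is a lower
    // bound on the Cookie header sent with each request to this hostname.
    var cookieBytes = document.cookie.length;
    console.log('~' + cookieBytes + ' bytes of cookies sent with every request to ' +
                location.hostname);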
For analytics and tag generation of the libraries hosted by CDNJS, take a look at: <a href="http://www.scriptselect.com" rel="nofollow">http://www.scriptselect.com</a> It's a weekend project I built a couple of weeks ago using d3 and Backbone. You can select libraries, view the size of your selection, and copy the generated script tags for the libraries you've chosen. Just a small tool to make using CDNJS a little more convenient. If there's enough demand, I'll add other CDNs.<p>Thanks to Ryan, Thomas, and CloudFlare for a very cool service!
Quick question... I thought the best part of CDN-hosted JS files was that they were more likely to already be cached on the client, not so much the speed of delivery.<p>So wouldn't it be better to go with the most popular CDN rather than the fastest?
+1 to joshfraser - cache hits always beat requests. I remember seeing a stat that n% of the top 100 sites use the Google CDN for jQuery; it was by far the most popular. Stick with Google for popular libraries.
They say this is 'peer-reviewed', but is that all? If someone sends them a pull request for an update to a widely used but perhaps smaller library, will they review it, or does it just get merged into the CDN? It seems like a good way to get access to millions of browser sessions. Is anyone at CloudFlare taking responsibility for checking that the code comes from the authoritative repo and not joeblow/underscore.js?<p><a href="https://github.com/cdnjs/cdnjs#pull-requests-steps" rel="nofollow">https://github.com/cdnjs/cdnjs#pull-requests-steps</a><p>While Google and Microsoft are slower to update their libraries, we can assume that they download releases from official sources.
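Nothing stops you from spot-checking a mirrored file yourself. Here is a rough Node sketch that hashes the CDNJS copy and a copy from the project's own repo and compares them; both URLs are only illustrative, so point them at whatever you consider the authoritative source for the release in question.

    var https = require('https');
    var crypto = require('crypto');

    // Hash the body of a URL and hand the hex digest to the callback.
    function sha1Of(url, cb) {
      https.get(url, function (res) {
        if (res.statusCode !== 200) {
          return console.error('Unexpected status ' + res.statusCode + ' for ' + url);
        }
        var hash = crypto.createHash('sha1');
        res.on('data', function (chunk) { hash.update(chunk); });
        res.on('end', function () { cb(hash.digest('hex')); });
      }).on('error', console.error);
    }

    // Example URLs: the CDNJS mirror vs. the project's own tagged release.
    var cdnUrl = 'https://cdnjs.cloudflare.com/ajax/libs/underscore.js/1.3.1/underscore-min.js';
    var upstreamUrl = 'https://raw.github.com/documentcloud/underscore/1.3.1/underscore-min.js';

    sha1Of(cdnUrl, function (cdnHash) {
      sha1Of(upstreamUrl, function (upstreamHash) {
        console.log(cdnHash === upstreamHash ? 'Files match.' : 'MISMATCH, inspect by hand.');
      });
    });

It's no substitute for review at merge time, but it at least answers "does the file on the CDN match the official release?" for the versions you ship.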
The fastest request is the one that never happens. One of the biggest benefits of using hosted libraries is that browsers cache those files locally. By sharing the same URL for your copy of jQuery with thousands of other sites, you increase your odds of getting a local browser cache hit. For popular libraries like jQuery you're probably best off using Google, since it has the most adoption. That said, I think CloudFlare's CDN is an interesting idea and could grow into something genuinely useful, especially for less popular libraries.
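For what it's worth, the usual way to get the shared-cache benefit without betting the page on a single host is a public URL plus a local fallback. A sketch (the version number and the local path are placeholders):

    <!-- Try the widely shared Google-hosted URL first; if it fails, fall back
         to a self-hosted copy at a hypothetical local path. -->
    <script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
    <script>
      window.jQuery || document.write('<script src="/js/jquery-1.7.1.min.js"><\/script>');
    </script>

The same pattern works with CDNJS URLs; the only thing that changes the cache-hit odds is how many other sites reference the exact same URL.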
The question I have [but not the answer] is:<p>Usually every project has a bunch of .js files [jQuery, Backbone, etc.] and .css files. Good practice is not only to minify and compress them, but also to bundle some or all of them into a few combined files to save on extra HTTP requests.<p>So my question is: which is better, (1) serving separate files from such a CDN [or any public CDN], or (2) combining the files and serving them yourself via nginx/AWS?<p>Not a developer, feel free to correct any mistakes :-)
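For reference, option (2) is usually just a small build step that concatenates your already-minified files into one bundle, for example (file names are placeholders):

    // Node sketch: concatenate minified files so the page makes one request.
    var fs = require('fs');

    var files = ['vendor/jquery.min.js', 'vendor/backbone-min.js', 'app.min.js'];
    var bundle = files.map(function (f) {
      return fs.readFileSync(f, 'utf8');
    }).join(';\n'); // the ';' guards against files missing a trailing semicolon

    fs.writeFileSync('public/bundle.js', bundle);
    console.log('Wrote public/bundle.js (' + bundle.length + ' bytes)');

The trade-off in the question still stands: bundling saves requests, while a public CDN URL may already be cached in the visitor's browser from another site.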
I've recently started using CDNJS for my projects. Thanks to Ryan, Thomas, and CloudFlare for this awesome service!<p>I'm even happier to see that you host the CSS and images for the common libs. I'll switch my Bootstrap CSS hosting over to yours soon.
Is the Chrome extension [1] working? I was just thinking that I might start using it if there were a browser extension that makes sure those requests always stay local.
Otherwise it's rather slow to use any remote resource while developing a page locally.<p>[1] <a href="https://github.com/cdnjs/browser-extension" rel="nofollow">https://github.com/cdnjs/browser-extension</a>
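The core of such an extension is just a blocking webRequest redirect from the CDN host to a local copy. A stripped-down sketch of the background page (the localhost path is a placeholder; the manifest also needs the "webRequest", "webRequestBlocking" and matching host permissions):

    // background.js: during local development, rewrite CDNJS requests to a
    // local static server so pages don't wait on the network.
    chrome.webRequest.onBeforeRequest.addListener(
      function (details) {
        var path = details.url.split('/ajax/libs/')[1];
        return { redirectUrl: 'http://localhost:8000/libs/' + path };
      },
      { urls: ['*://cdnjs.cloudflare.com/ajax/libs/*'] },
      ['blocking']
    );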