
Pushing Files to the Browser Using Delivery.js, Socket.IO and Node.js

50 points by liamk, over 13 years ago

6 comments

alexhaefner, over 13 years ago
This is something we looked into a while ago. There are a number of disadvantages to this approach that were not highlighted.<p>(1) Base64-encoding files makes them larger; this will almost always eliminate the advantage of smaller headers. Specifically:<p>"Very roughly, the final size of Base64-encoded binary data is equal to 1.37 times the original data size + 814 bytes (for headers)."<p>Source: <a href="http://en.wikipedia.org/wiki/Base64#MIME" rel="nofollow">http://en.wikipedia.org/wiki/Base64#MIME</a><p>If you want to send/get binary data and manipulate it on the client side, XHR requests can handle binary data, which can be placed into JavaScript typed arrays.<p>(2) Concurrency is limited with WebSocket requests. You can only push one file per socket at a time, and if you want to push multiple files concurrently you'd need to open more WebSocket connections. I understand that you can push files one after another through the same socket, but that's not concurrency. On the back end, the infrastructure to send different files through different WebSocket connections and manage the concurrency can get really messy really quickly. With HTTP requests, you can usually do two requests concurrently from any one domain, and then you can load-balance across a set of domains.<p>(3) When Socket.io falls back to HTTP polling, you may end up consuming a lot of bandwidth on headers alone.<p>(4) If you're working with something that has cross-domain issues, e.g. a WebGL application, base64-encoded URLs will not work. They cannot be used; resources have to come from a CORS-accepted domain.<p>In the end it's simpler and more practical to just push your files through HTTP requests with their built-in concurrency.
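The size overhead in point (1) is easy to verify in Node.js. The sketch below (not from the thread) shows that plain Base64 inflates a payload by 4/3, and that MIME's 76-character line wrapping pushes the ratio toward the ~1.37x figure quoted above:

```javascript
// Sketch of the Base64 size overhead (Node.js).
const original = Buffer.alloc(100000, 0xab); // 100 KB of arbitrary bytes

// Plain Base64: every 3 input bytes become 4 output characters.
const encoded = original.toString('base64');
const ratio = encoded.length / original.length;
console.log(ratio.toFixed(2)); // "1.33"

// MIME wraps lines at 76 chars and adds CRLF, nudging the ratio
// toward ~1.37x before the fixed header bytes are counted.
const mimeRatio = ratio * (78 / 76);
console.log(mimeRatio.toFixed(2)); // "1.37"
```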
rhplus, over 13 years ago
<i>The most apparent disadvantage would be the fact that it bypasses traditional caching methods.</i><p>And this should be considered a fairly big disadvantage if what you're pushing is publicly cacheable. Consider the places where an HTTP URI might be cached: client memory, client disk, forward proxy, CDN/reverse proxy, server memory, server disk. A WebSocket delivery mechanism would miss out on half of these caches. Of course, if the files are private and requested infrequently by the client, then a push mechanism might well be preferred.<p><i>In-browser file zipping could have a positive impact on transfer speeds</i><p>Yes, you really should be compressing your content on the wire (although compressed images, even in base64, might not benefit much), but I'm skeptical that a JS gzip library could compete with the browser's native decompression code. Has anyone done any profiling of JS gzip libraries?
iamleppert, over 13 years ago
Some problems with this approach:<p>1. Concurrency. Regular HTTP and pipelining allow the transfer of more than one resource at a time, often many more.<p>2. Caching. The author mentions this, but fails to mention browser-side caching, which is the most significant form of caching: getting a resource without having to go out over the network at all. This could perhaps be addressed with local storage, but his node module doesn't take that into consideration.<p>3. Compression. Regular HTTP requests support gzip; I'm not sure whether WebSockets do. An initial Google search wasn't promising, and support seems to be experimental. He mentioned in-browser unzipping, which is interesting, but a more standards-based approach would probably be via the content-accept header on a WebSocket connection.
kwamenum86, over 13 years ago
"With zip.js files could be deflated, or inflated, within the browser."<p>That's what gzip is for.
ammmir, over 13 years ago
Base64-encoding binary files is just wrong when you could just as well "push" the files' URLs over a WebSocket connection. That way you can take advantage of gzip compression and caching on the client.<p>It also looks like the server-side code uses the blocking fs.readFileSync() call to read the entire file into memory... but maybe it could have some uses for small dynamically generated data. If so, I'm not sure a "file" abstraction is needed :)
wavephorm, over 13 years ago
This is an anti-pattern. The HTTP spec already sends files to web browsers, and it can send many files in parallel.