
Scaling node.js to 100k concurrent connections

64 points by dworrad almost 13 years ago

13 comments

ericz almost 13 years ago
Before everyone gets excited about these big numbers, I would like to remind you that even higher concurrency can be achieved with even lower CPU and memory usage using Erlang. These numbers are good for Node, but don't use this as evidence that Node is magical and much better at handling large numbers of connections than other systems.
forgotAgain almost 13 years ago
Garbage collection is disabled. How is this then relevant to any real-world usage?
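
For context, a rough sketch of how one might check what garbage collection actually costs under load, using node's real --expose-gc flag; the allocation workload and numbers below are made up for illustration, not taken from the benchmark:

    // Run with:  node --expose-gc gc-cost.js
    // Synthetic allocation load, then time a forced full collection.
    const objs = [];
    for (let i = 0; i < 1e6; i++) {
      objs.push({ id: i, payload: 'x'.repeat(32) });
    }

    const before = process.hrtime();
    global.gc();  // only defined when node is started with --expose-gc
    const [s, ns] = process.hrtime(before);

    console.log(`full GC took ${(s * 1e3 + ns / 1e6).toFixed(1)} ms`);
    console.log(`heapUsed: ${Math.round(process.memoryUsage().heapUsed / 1048576)} MB`);
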
gaius almost 13 years ago
Isn't this really scaling the underlying C runtime to 100k connections?
babuskov almost 13 years ago
I use Node in production. The main thing I like about it is that, looking at system usage graphs while the number of users grows, the only thing going UP is bandwidth ;)

I'd really like to see a story from someone really having 100k connected browsers. My online game currently peaks at about 1000 concurrent connections, and the node process rarely lasts longer than 2 hours before it crashes. Of course, using a db like Redis to keep user sessions makes the problem almost invisible to users, as the restart is instantaneous. I'm using socket.io, express, the crypto module, etc.

I'd really like to see real figures for node process uptime from someone with 5000+ concurrent connections.
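
A minimal sketch of the pattern described above: keeping per-user session state in Redis instead of process memory, so a node restart loses nothing. The key names and session shape are made up for illustration, not the commenter's actual code; it uses the `redis` npm client (v4 API).

    const { createClient } = require('redis');

    const redis = createClient();  // defaults to localhost:6379
    redis.on('error', (err) => console.error('redis error', err));

    async function saveSession(userId, session) {
      // JSON-encode and let the key expire if the user never comes back.
      await redis.set(`session:${userId}`, JSON.stringify(session), { EX: 3600 });
    }

    async function loadSession(userId) {
      const raw = await redis.get(`session:${userId}`);
      return raw ? JSON.parse(raw) : null;
    }

    async function main() {
      await redis.connect();
      await saveSession('user42', { room: 'lobby', score: 17 });
      console.log(await loadSession('user42'));  // survives a process restart
    }

    main();
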
antihero almost 13 years ago
Can uwsgi/nginx be configured similarly?

Is it common practice to have node face the web without nginx?
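
For reference, a common setup is to put nginx in front of node and pass websocket upgrades through the proxy. A minimal sketch, where port 3000 and example.com are placeholders:

    upstream node_app {
        server 127.0.0.1:3000;
    }

    server {
        listen 80;
        server_name example.com;

        location / {
            proxy_pass http://node_app;
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
            proxy_set_header Host $host;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        }
    }
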
decad almost 13 years ago
Link to his next post showing him breaking 250k - http://blog.caustik.com/2012/04/10/node-js-w250k-concurrent-connections/
devmach almost 13 years ago
It's a shame that he didn't mention kernel tuning. Without custom settings (like net.ipv4.tcp_mem), I think it's very difficult to reach these numbers.
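
For context, the kind of sysctl settings usually raised for tests with this many sockets look roughly like the following; the specific values are illustrative guesses, not the author's configuration:

    # Illustrative values only -- not the author's configuration.
    fs.file-max = 1000000                      # system-wide open file descriptor ceiling
    net.core.somaxconn = 65535                 # max listen() backlog
    net.core.netdev_max_backlog = 65535        # packets queued before the kernel starts dropping
    net.ipv4.ip_local_port_range = 1024 65535  # more ephemeral ports for the test clients
    net.ipv4.tcp_mem = 786432 1048576 1572864  # TCP memory pressure thresholds (pages)
    net.ipv4.tcp_rmem = 4096 87380 16777216    # receive buffer: min default max
    net.ipv4.tcp_wmem = 4096 65536 16777216    # send buffer: min default max

The per-process descriptor limit (ulimit -n) usually has to be raised as well, or node hits EMFILE long before the kernel runs out of anything.
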
nivertech almost 13 years ago
I did 3M/node on physical servers, 800K/node on EC2 instances.

We mostly use Erlang on the server side and node.js + CoffeeScript on the client side (where they rightfully belong ;)
nicolast almost 13 years ago
It struck me that the author runs his apps as root (in the screenshots). But then I remembered he's using node.js to handle "thousands of concurrent connections".
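
A minimal sketch of the usual alternative to running as root: bind the privileged port first, then drop privileges. The 'nobody' user and port 80 are assumptions for illustration, not anything from the original post.

    const http = require('http');

    const server = http.createServer((req, res) => {
      res.end('hello\n');
    });

    server.listen(80, () => {
      // Once the socket is bound, the process no longer needs root.
      if (process.getuid && process.getuid() === 0) {
        process.setgid('nobody');  // drop the group first, then the user
        process.setuid('nobody');
      }
      console.log('listening as uid', process.getuid());
    });
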
dotborg2 almost 13 years ago
Looks like the author is not aware of some concurrency problems, deadlocks, etc. The backend/database might not scale to 100k concurrent connections so easily.
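
One generic way to keep that from becoming a problem is to cap in-flight backend work, so 100k client connections don't turn into 100k simultaneous database connections. A sketch, not tied to any particular driver; `queryDatabase` is a hypothetical placeholder for whatever call is actually used:

    const MAX_CONCURRENT = 100;
    let inFlight = 0;
    const waiting = [];

    // Run `task` when a slot is free; queue it otherwise.
    function withDbSlot(task) {
      return new Promise((resolve, reject) => {
        const run = () => {
          inFlight++;
          Promise.resolve(task())
            .then(resolve, reject)
            .finally(() => {
              inFlight--;
              const next = waiting.shift();
              if (next) next();
            });
        };
        if (inFlight < MAX_CONCURRENT) run();
        else waiting.push(run);
      });
    }

    // Usage (hypothetical): withDbSlot(() => queryDatabase('SELECT 1')).then(console.log);
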
ericmoritz almost 13 years ago
I would really love to know what he did to tune that Rackspace VM. I had a terrible time trying to get node.js and others past 5,000 concurrent websocket connections on an m1.large EC2 instance or on Rackspace.
mariuz almost 13 years ago
I wonder what happens at 100k database connections. I will give it a try with Firebird and the node.js driver.
bluesmoon almost 13 years ago
I remember seeing this on HN back in April.