> one effective way that we get high performance out of node.js is to avoid doing computation in node.js.

And therein lies the problem with Node. It's great for rapid prototyping, but things fall apart pretty quickly with CPU-bound tasks. That, and the single-threaded nature of Node (technically of JS) can be a wrecking ball when one task takes too long.
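To make that concrete, here's a rough sketch (plain Node, no external packages; `heavySum` and the 1e9 iteration count are just illustrative) of a synchronous task starving the event loop:

```js
const http = require('http');

// A purely synchronous, CPU-bound function. While it runs, the
// event loop cannot service any other request, timer, or I/O callback.
function heavySum(n) {
  let total = 0;
  for (let i = 0; i < n; i++) total += i;
  return total;
}

http.createServer((req, res) => {
  if (req.url === '/heavy') {
    // Blocks the whole process for seconds; every other
    // connection waits until this returns.
    res.end(String(heavySum(1e9)));
  } else {
    // Normally instant, but stalls whenever /heavy is running.
    res.end('ok');
  }
}).listen(3000);
```

Hit `/heavy` in one tab and `/` in another and you can watch every request queue up behind the single busy thread.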
I had never read about that 1.5GB heap limit. I've come across that same error message - "FATAL ERROR: JS Allocation failed - process out of memory" - several times in my application, but googling never gave me a good answer as to why it was happening, especially since I had plenty of spare memory on the server. Very good thing to know...
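For anyone else hitting this: the limit comes from V8's old-generation heap, and it can be raised with a real Node/V8 flag (the size is in megabytes; 4096 here is just an example value, and `server.js` is a placeholder for your entry point):

```sh
# Raise V8's old-space heap cap to ~4 GB for this process.
node --max-old-space-size=4096 server.js
```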
Seems to me like there's very little new here to anyone paying attention to the space. They didn't raise any limits - they just fixed their own badly written code.

The 1.5 gig limit is not widely known (it's a limit of V8 and, correct me if I'm wrong here, I believe it's improved in later versions due to some changes to the GC), but if you're hitting it you're doing something *massively* wrong, like they were here.

Splitting data up into chunks for the event loop to process is priority one in any Node app that deals with data processing. It has been for a long time. This is true *anywhere* you run JavaScript. It's where libraries like [Highland](http://highlandjs.org/) excel and why Node has the concept of Buffers.

Chunking data should be a no-brainer, and it's frankly a little strange that Jut weren't doing this in the first place. It raises questions about what else is not being done correctly under the hood.

There's no meat here (aside from learning that NPM uses them).
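For anyone unfamiliar with the pattern, here's a rough sketch of the chunking idea in plain Node, no libraries (`processInChunks`, `items`, `chunkSize`, and `processItem` are made-up names, not anything from the article): do a small batch of synchronous work, then yield back to the event loop with `setImmediate` so timers and I/O callbacks can still run between batches.

```js
// Process a large array in small batches, yielding to the event
// loop between batches so the process stays responsive.
function processInChunks(items, chunkSize, processItem, done) {
  let i = 0;
  function nextBatch() {
    const end = Math.min(i + chunkSize, items.length);
    for (; i < end; i++) {
      processItem(items[i]); // synchronous work, kept short per batch
    }
    if (i < items.length) {
      setImmediate(nextBatch); // schedule the rest on a later tick
    } else {
      done();
    }
  }
  nextBatch();
}

// Example: sum a million numbers 10k at a time.
const data = Array.from({ length: 1e6 }, (_, n) => n);
let sum = 0;
processInChunks(data, 10000, (x) => { sum += x; }, () => {
  console.log('sum =', sum);
});
```

Streams and Buffers give you the same property for data that doesn't fit in memory at all, which is the regime the article is really about.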
> In fact, as of this writing, the JPC depends on 103 NPM packages

It's really scary that you need so many packages to build a web app using Node.js.