While node is doing the calculation, it won't be doing anything else (like serving the next request). A more traditional server doing the same calculation will spin up another process to handle the next request.<p>If all you do is calculate Fibonacci numbers, you can get roughly (number of CPUs) times the performance that way. You could use multinode for the same effect, but that is additional work.<p>In the end, it's a matter of what kind of service you are building. If it's a Fibonacci generator, you'd use something better suited than node/JS or any other scripting language.<p>If you are doing something I/O heavy (which is probably the majority of today's web applications), node or other scripting languages might be better suited because they are easier to work with for most of us.<p>It's just tools, not religion. I wouldn't use a hammer to remove a screw. And I definitely wouldn't write hateful articles (I'm referring to the original one) because somebody was using a screwdriver instead of a hammer to remove the screw.
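To make the "(number of CPUs) times" point concrete, here is a minimal Python sketch of farming CPU-bound work out to a pool of worker processes, which is roughly what a process-per-request server buys you (the fib() helper is just a hypothetical stand-in for any heavy computation):<p><pre><code>from multiprocessing import Pool

def fib(n):
    # Deliberately naive recursive Fibonacci, standing in for CPU-bound work.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

if __name__ == "__main__":
    # Pool() defaults to one worker process per CPU core.
    with Pool() as pool:
        print(pool.map(fib, [30, 31, 32, 33]))
</code></pre>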
The original "Node.js is Cancer" article is a silly troll.<p>But it's totally ridiculous how, in response, people keep writing these terrible, straw-man Python servers to try to prove that Python is horribly slow.<p>If you want to write Python web apps, there is a correct way to do it, and it isn't to use SimpleHTTPServer. Write a WSGI application and serve it with any of a number of decent WSGI servers (for starters, try uwsgi behind nginx; but if you really insist on serving requests directly out of a server written in an interpreted language, you could try gevent).
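For anyone who hasn't seen one, a complete WSGI application is only a handful of lines; this is a minimal sketch (the file name app.py is just an assumption), and any WSGI server can run it:<p><pre><code># app.py -- a minimal WSGI application (sketch)
def application(environ, start_response):
    body = b"Hello, world!\n"
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]
</code></pre>
Something like `uwsgi --http :8080 --wsgi-file app.py` serves it for a quick test; for production you'd put nginx in front, as the parent suggests.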
<p><pre><code>try:
    my_var
except NameError:
    pass
else:
    if my_var is not None:
        # Ted needs better examples
        ...
</code></pre>
When would you <i>EVER</i> need to use this code? There is no situation in Python in which you should need to use a variable that may or may not be defined. While Ted's example may seem like a cheap shot, it does highlight a real problem with JavaScript: all the craziness around things that aren't "real" types, like undefined, arguments, and Array.
IMHO, node.js sucks because it forces you to manually pass callbacks around. Can't it remember my call site for me and use coroutines, call/cc, yield, or something? Even fork() exists on UNIX (and so does pthread_create()). Why is passing callbacks around the answer? It's like using GOTO.
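For what it's worth, the difference is easy to sketch in Python (a toy example, not any real framework's API): with callbacks you hand over the continuation yourself, while a generator-based coroutine suspends at `yield` and the driver resumes it at the call site for you.<p><pre><code># Toy sketch: explicit callback vs. generator coroutine (not a real framework).

def read_data_cb(callback):
    # Callback style: the caller must pass its continuation in by hand.
    callback("some data")

def handler_cb():
    def on_data(data):
        print("callback got:", data)
    read_data_cb(on_data)

def handler_coro():
    # Coroutine style: yield suspends here; the driver resumes us in place.
    data = yield "need data"
    print("coroutine got:", data)

def run(coro):
    next(coro)                  # run until the first yield
    try:
        coro.send("some data")  # resume at the suspension point
    except StopIteration:
        pass

handler_cb()
run(handler_coro())
</code></pre>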
As a cancer survivor, just wanted to let you know that the poor taste exhibited here is pretty sad. When you want to make your point next time, use a title that doesn't include something that kills people. Thank you.
It's the equivalent of tying a giant boulder to the back of a Ferrari 599 and scoffing that Ferrari dares to call it a "high performance" car. Stop trying to drag giant boulders around.
It's not the main point of the article, but I just thought I'd point out that node already does have a WSGI/Rack-style library for folks who like that kind of thing. It's called Strata (<a href="http://stratajs.org" rel="nofollow">http://stratajs.org</a>). Disclaimer: I'm the author.
I think one thing the article hints at is still relevant: "developers" nowadays throw around terms like "scalability" and other hype phrases and think that because they know the slightest thing about some new technology, they're a real developer. Sadly, this is incredibly naive. Any script kiddie / code monkey can throw together some application in the latest over-hyped framework, say "HEY LOOK, IT CAN HAS NOSQL, IT CAN HAS SCALABILITY, IT DOES BIGGGGDATA", and write inefficient code for it. The truth is that many don't even understand basic data structures or how a computer processes information at a lower level. If you don't understand something as basic as algorithmic complexity and can't look at your code from a more scientific, critical point of view, don't call yourself a freaking developer. Pick up a book and learn what REAL computer science is, not just what the latest over-hyped framework is called.
Is it just me, or has Brian Beck misunderstood Python idioms? What's with the try/except block? There's no auto-vivification in Python, so you just don't try to catch a NameError. You let it blow up in your tests and fix your code afterwards. I'm really tired of these straw-man examples.
The article uses two examples to demonstrate that V8 is in fact fast. However, when using Python or Ruby to create a web server, the server will actually handle requests in parallel (multiple threads), so the average waiting time could be lower than with the node.js version.
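A threaded server is available from the standard library alone; here is a minimal sketch in current Python (in the 2.x of the era, the modules were called BaseHTTPServer and SocketServer). One caveat: CPython's GIL still serializes pure-Python CPU work, so the parallelism mostly helps handlers that block on I/O or release the GIL.<p><pre><code>from http.server import BaseHTTPRequestHandler, HTTPServer
from socketserver import ThreadingMixIn

class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
    daemon_threads = True  # handler threads won't block shutdown

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Each incoming request is handled in its own thread.
        body = b"hello\n"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    ThreadedHTTPServer(("127.0.0.1", 8000), Handler).serve_forever()
</code></pre>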
I read your benchmarks on Ruby, but you didn't list the implementation or version of Ruby you used.<p>I'd guess you used MRI 1.8.x.<p>I decided to benchmark other versions (and implementations) of Ruby.<p>= JRuby 1.6.4 (7.3 seconds)<p><pre><code> user system total real
7.388000 0.000000 7.388000 ( 7.349000)
</code></pre>
= Rubinius 1.2.4 (a little under 6 seconds)<p><pre><code> user system total real
5.940015 0.006878 5.946893 ( 5.842485)
</code></pre>
= CRuby 1.9.2 (38 seconds)<p><pre><code> user system total real
38.250000 0.090000 38.340000 ( 38.376857)
</code></pre>
= CRuby 1.8.7 (a little under 137 seconds)<p><pre><code> user system total real
136.960000 0.240000 137.200000 (137.437748)
</code></pre>
Thanks!
Node has a mantra which addresses this whole thread, right back to the beginning: "Everything in node runs in parallel, except your code." Manuel Kiessling has written a great tutorial (<a href="http://nodebeginner.org" rel="nofollow">http://nodebeginner.org</a>) that shows how node noobs fall into, and can escape, the blocking pitfall. Node is the right tool when your requests block on I/O. I'm not convinced about CPU-heavy apps yet.
I wonder:<p>1) Is node.js async I/O any different from Haskell I/O?
2) Does the author know anything about strongly-typed languages, or did he deliberately ban them from the server side? IMHO, Dziuba was trying to drop a hint about strongly-typed languages, not Python or Ruby.
Guys, guys, guys. You do realize Ted is trolling us all, right? (Go look at his twitter @dozba right now; I think he is having a great time.) He is a pro at this (<a href="http://teddziuba.com/2011/07/the-craigslist-reverse-programmer-troll.html" rel="nofollow">http://teddziuba.com/2011/07/the-craigslist-reverse-programm...</a>)<p>Also, if his thoughts on node.js don't annoy you enough, go take a look at his archive: <a href="http://teddziuba.com/archives.html" rel="nofollow">http://teddziuba.com/archives.html</a>. He blogs/trolls/thinks about NoSQL, OS X, twisted/tornado, python, queues and more.
The nice thing about node (or any other event-based platform) is that memoizing the Fibonacci function is trivial, whereas in a multi-threaded implementation it would be tricky and error-prone.
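For illustration (in Python, since that's what the rest of the thread is using), memoizing the naive recursive Fibonacci is essentially a one-line change; the point above is that in a single-threaded event loop the cache needs no locking:<p><pre><code>from functools import lru_cache

@lru_cache(maxsize=None)  # cache every result ever computed
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(100))  # returns instantly once sub-results are cached
</code></pre>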
This is where GCD can help. <a href="http://en.wikipedia.org/wiki/Grand_Central_Dispatch" rel="nofollow">http://en.wikipedia.org/wiki/Grand_Central_Dispatch</a>
I'm getting 0.020s tops for that Fibonacci code on node (`time curl <a href="http://localhost/" rel="nofollow">http://localhost/</a>`), even going up to `1.1210238130165696e+167` (the 800th number). OS X Lion on a 2.3GHz Core 2 Duo.<p>Python 2.7.1 took 1m25.259s (no server).<p>Am I doing something wrong? Or is there some incredibly optimized code path for OS X?<p>Edit: even weirder, `time node fibonacci.js` without a server takes 0.090s.
I like how every rebuttal turns into "how fast can you compute a Fibonacci number". I was looking for a function that burned a nontrivial amount of CPU; the choice of Fibonacci was arbitrary. Let's move on from that.<p>What I was showing is that if your request handler does a nontrivial amount of CPU work, it will hold up the event loop and kill any "scalability" you think you're getting from Node.<p>If you Node guys were really that irritated by this, you're going to be super pissed when you learn how computers work.<p>I ain't even mad.
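The effect is easy to reproduce in any single-threaded event loop; here is a toy Python sketch (asyncio, but the model is the same): CPU-bound work in one handler stalls every other pending task until it returns.<p><pre><code>import asyncio
import time

def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

async def cpu_heavy():
    fib(32)  # no await inside: this blocks the whole event loop

async def quick(i, start):
    print(f"quick task {i} finished after {time.perf_counter() - start:.2f}s")

async def main():
    start = time.perf_counter()
    # The "quick" tasks cannot run until cpu_heavy yields control,
    # which it never does until the computation is done.
    await asyncio.gather(cpu_heavy(), *(quick(i, start) for i in range(3)))

asyncio.run(main())
</code></pre>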
Dziuba is pretty awesome; Beck didn't really address the underlying issue. If you have any real computation, node.js is not your solution. I figured this out about event-driven I/O back when I was in school 15 years ago.