The Node benchmark is flawed though. Add something like

    require('http').globalAgent.maxSockets = 64;

at the top of the Node script if you want a fair comparison with the async PHP version. The bottleneck here is bandwidth, not the runtime.

On my laptop, the author's original script took 35 seconds to complete. With maxSockets = 64, it took 10 seconds.

Edit: And who is downvoting this? I just provided actual numbers and a way to reproduce them. If you don't like how the universe works, don't take it out on me.
Long-term PHP guy here (I maintained APC for years, slowly given up now), so I've worked a lot with ~2k-3k request-per-second PHP websites.

The real trick here is async processing. A lot of slow PHP code comes from people not writing async data patterns. If you use synchronous calls in PHP - mc::get or mysql or curl calls - then PHP's performance absolutely sucks.

Node.js automatically trains you around this with its massive use of callbacks for everything. There, that is the canonical way to do things, while in PHP blocking single-threaded calls are what everyone uses.

The most satisfying way to actually get PHP to perform well is to use async PHP with a Future result implementation. Being able to do a get() on a future result was the only sane way to mix async data flows with PHP. For instance, I had a curl implementation which fetched multiple HTTP requests in parallel and essentially let the UI wait for each webservice call at the HTML block where it was needed:

https://github.com/zynga/zperfmon/blob/master/server/web_ui/include/curl_prefetch.php

There was a similar async Memcache implementation, particularly for the cache writebacks (memcache NOREPLY), plus Memcache multi-get calls to batch together key fetches, and so on.

The real issue is that this is engineering work on top of the language instead of being built into the "one true way". So often I would have to dig in and rewrite massive chunks of PHP code to hide latencies and get near the absolute packet limits of the machines - getting closer to ~3500-4000 requests per second on a 16-core machine (*sigh*, all of that might be dead and bit-rotting now).
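A minimal sketch of that future-result pattern built on curl_multi (the HttpFuture class and method names here are mine, purely illustrative, not the zperfmon code):

    <?php
    // Future-result pattern over curl_multi: fire requests early,
    // block only at the point where each result is actually needed.
    class HttpFuture {
        private static $finished = array();   // every handle curl has reported done
        private $mh, $ch, $body;

        public function __construct($mh, $url) {
            $this->mh = $mh;
            $this->ch = curl_init($url);
            curl_setopt($this->ch, CURLOPT_RETURNTRANSFER, true);
            curl_multi_add_handle($mh, $this->ch);
            curl_multi_exec($mh, $running);    // kick off the transfer, non-blocking
        }

        // Block only until *this* transfer finishes; the others keep downloading.
        public function get() {
            while (!in_array($this->ch, self::$finished, true)) {
                curl_multi_exec($this->mh, $running);
                while ($msg = curl_multi_info_read($this->mh)) {
                    self::$finished[] = $msg['handle'];   // remember all completions
                }
                if (!in_array($this->ch, self::$finished, true)) {
                    curl_multi_select($this->mh, 0.1);    // wait for socket activity
                }
            }
            if ($this->body === null) {
                $this->body = curl_multi_getcontent($this->ch);
                curl_multi_remove_handle($this->mh, $this->ch);
            }
            return $this->body;
        }
    }

    // Fire the requests up front, then get() each one where the page needs it.
    $mh = curl_multi_init();
    $a = new HttpFuture($mh, 'http://example.com/service/a');
    $b = new HttpFuture($mh, 'http://example.com/service/b');
    echo strlen($a->get()), "\n";   // both downloads overlap while we wait here
    echo strlen($b->get()), "\n";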
I get sick of these language wars, especially the constant stream of PHP ridicule that never seems to end. The positive I try to take away from all of it is that there are a lot of people who are extremely passionate about software development and are striving for better tools and ways to express themselves. I want to believe that, beneath the vitriol in some of these articles, there are people genuinely trying to improve the technologies rather than engaging in some kind of programming-language apologetics. In regards to PHP, I think the ridicule has led to improvements in the language, but the overall tone of some of these articles is still a turn-off for me.
I'm sick of all of these generic SPEED benchmarks. Let me tell you some of the BIGGEST and REAL benefits of Node.js where PHP SUCKS:

1. Takes 1 minute to install on any platform (*nix, Windows, etc.)

2. A modern package manager (npm) that works seamlessly on all platforms.

3. All libraries started from scratch with async baked in from day one.

4. No need for any 3rd-party JSON serialize/deserialize libs.

5. And above all, it's Atwood's Law: "any application that can be written in JavaScript, will eventually be written in JavaScript".

http://www.codinghorror.com/blog/2009/08/all-programming-is-web-programming.html
Benchmarking is very hard because even the same language can show different results. For example:

    for ($i = 0; $i < count($list); $i++)

vs.

    $count = count($list);
    for ($i = 0; $i < $count; $i++)
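To see the gap concretely, a minimal timing harness (a sketch; the array size is arbitrary and the absolute numbers are machine-dependent):

    <?php
    // Compare count() re-evaluated each iteration vs. hoisted out of the loop.
    $list = range(1, 100000);

    $t = microtime(true);
    for ($i = 0; $i < count($list); $i++) {}   // count() called every iteration
    $inLoop = microtime(true) - $t;

    $t = microtime(true);
    $count = count($list);
    for ($i = 0; $i < $count; $i++) {}         // count() called once
    $hoisted = microtime(true) - $t;

    printf("in-loop: %.4fs  hoisted: %.4fs\n", $inLoop, $hoisted);

count() on an array is cheap in PHP, so most of the difference is function-call overhead, but it compounds in hot loops.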
Most of the time benchmarks prove how capable a programmer is, not the speed of the language used.
Good test.

At some point you'd expect these arbitrary this-vs.-that comparisons to die off. They haven't, and I'm guessing they won't.

Basically, it comes down to picking the tool that best supports your use case, or being okay with a compromise. Like the recent SQL/NoSQL discussions: use a tool poorly and you get poor results.
I've made a similar observation to the original post - in my case, moving a bit of functionality from PHP to Node.js gave me 100x better performance: https://servercheck.in/blog/moving-functionality-nodejs-increased-server

But the reason wasn't that Node/JS is faster than PHP; it was that I could write the Node.js app asynchronously, while the PHP version was making hundreds of synchronous requests (this is the gist of the OP).

The issue I have is that Node.js makes asynchronous HTTP calls relatively easy, whereas in PHP, using curl_multi_exec is kludgy and few libraries support asynchronous requests. The situation is changing, but the fact remains that asynchronous code is the norm in Node.js while blocking code is the norm in PHP, which makes it more difficult (as of this writing) to do any non-trivial asynchronous work in PHP.
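To illustrate the kludginess, here's roughly the hand-rolled dispatch loop you end up writing with raw curl_multi (a sketch; error handling omitted, URLs are placeholders):

    <?php
    // Fetching several URLs in parallel with raw curl_multi: the boilerplate
    // below is what passes for "async HTTP" in stock PHP.
    $urls = array('http://example.com/a', 'http://example.com/b', 'http://example.com/c');

    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }

    // The kludgy part: pump the multi handle yourself until everything finishes.
    do {
        curl_multi_exec($mh, $running);
        if ($running > 0) {
            curl_multi_select($mh, 1.0);  // block until there's socket activity
        }
    } while ($running > 0);

    foreach ($handles as $ch) {
        echo strlen(curl_multi_getcontent($ch)), " bytes\n";
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);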
I agree that the comparisons between languages/frameworks are often unfair, and I agree with everything Phil says, but there is a lot to be said for language-level non-blocking constructs.

I am really enjoying reading Go code and seeing how people use concurrency, and they all do it the same way. When I would read Ruby, I would have to know the particulars of a library like Celluloid or EventMachine, which made it harder.
The "Thoughts" section was the most informative part of the benchmark which underscores the way I, when I was working with PHP, operated. When I started with PHP(2005), the frameworks were terrible, I would cobble together many random coding examples from stuff I found on the web and just make my own Framework up. I don't think PHP from a performance standpoint is any better or worse, but the default examples that you generally see in the ecosystem provide significantly worse performance. The one thing that Node clearly has an upper hand on PHP with is the ecosystem. It's a lot easier for a developer new to the Node ecosystem to hit that Node target than it would be for someone of the same skill to hit the PHP target in terms of hours spent.<p>One funny thing is that the ReactPHP[1] site is visually similar to the Node[2] homepage.<p>[1] - <a href="http://reactphp.org/" rel="nofollow">http://reactphp.org/</a>
[2] - <a href="http://nodejs.org/" rel="nofollow">http://nodejs.org/</a>
I have used RollingCurl (non-blocking cURL) to fetch multiple API requests at once from PHP. It's really easy to use: the library is a single simple class, and its example shows how you could build a simple, efficient scraper.
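For reference, the usage pattern looks something like this (a sketch from memory of Josh Fraser's RollingCurl class; treat the method names and callback signature as assumptions and check the class itself):

    <?php
    require 'RollingCurl.php';  // https://github.com/joshfraser/rolling-curl (assumed path)

    // Called as each request completes, while the others are still in flight.
    function handle_response($response, $info, $request) {
        echo $request->url, ' -> ', strlen($response), " bytes\n";
    }

    $rc = new RollingCurl('handle_response');
    $rc->window_size = 10;  // cap the number of concurrent connections

    foreach (array('http://example.com/a', 'http://example.com/b') as $url) {
        $rc->get($url);     // queue the request; nothing blocks yet
    }

    $rc->execute();         // run all queued requests concurrently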
Node.js is usually the obvious choice for web scraping: scraping with jQuery syntax, such as

    // e.g. with cheerio: var $ = require('cheerio').load(html);
    var names = [], surnames = [];
    $('table tr').each(function (ix, el) {
        names.push($(el).find('td').eq(0).text());
        surnames.push($(el).find('td').eq(1).text());
    });

is more familiar to most web developers than the PHP syntax. Even if Node were 5x slower than PHP, I would still go for Node because of its easy jQuery syntax.
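For contrast, a rough PHP equivalent of the same scrape using the built-in DOM extension (a sketch; assumes $html already contains the fetched page):

    <?php
    // Same table scrape with PHP's built-in DOMDocument/DOMXPath.
    $doc = new DOMDocument();
    @$doc->loadHTML($html);              // @ silences warnings on sloppy markup
    $xpath = new DOMXPath($doc);

    $names = array();
    $surnames = array();
    foreach ($xpath->query('//table//tr') as $tr) {
        $tds = $tr->getElementsByTagName('td');
        if ($tds->length >= 2) {
            $names[]    = trim($tds->item(0)->textContent);
            $surnames[] = trim($tds->item(1)->textContent);
        }
    }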