Your site apparently got ~2250 visits per day (so, less than two per minute) at the height of your "surge", and seems to consist of three pages (/, /about, and /open-source). Most people are only going to look at /, so let's say 3000 pageviews. The day after was still seeing a good amount of traffic, so it wasn't some kind of momentary "all 3000 hit the site in the same minute" situation: it seems like a fairly benign decay. How could you possibly have been dealing with 287 concurrent users?

My website (saurik.com) is seriously written in JavaScript. I was doing this long before node was popular, and so it is designed "horribly sub-optimally": it is using Rhino, which is not known for speed. I use XSL/T as a templating engine to build the page layouts, which is also not known for speed. Every request is synchronously logged to a database. I get over 50k HTML pageviews a day, most for one recent article which I posted a few weeks ago: when I posted it, I was getting well over 3k pageviews per hour.

I do not do any caching: I generate each page dynamically every time it is accessed. I seriously dynamically generate the CSS on every request (there are some variables). Even at 3k HTML pageviews per hour, that's less than one complex request per second. How does one even build a website that can't handle that load? That is what I'd seriously be interested in seeing: not "how do I handle being #1 on Hacker News", but "why is it that so many websites are unable to handle two requests per minute".
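For anyone who wants to check the arithmetic behind those rates, here is a minimal sketch. The numbers are the ones quoted above (2250 visits/day for the "surge", 3000 pageviews/hour for my article's peak); nothing here is measured, it is just unit conversion.

```python
# Back-of-envelope request rates from the figures quoted in the comment.

def average_rate(count, window_seconds):
    """Average requests per second over the given window."""
    return count / window_seconds

# ~2250 visits/day during the "surge", expressed per minute:
surge_per_minute = 2250 / (24 * 60)

# ~3000 HTML pageviews/hour at my article's peak, expressed per second:
peak_per_second = average_rate(3000, 60 * 60)

print(f"{surge_per_minute:.2f} requests/minute")  # ~1.56: "less than two per minute"
print(f"{peak_per_second:.2f} requests/second")   # ~0.83: "less than one per second"
```

Even the peak figure leaves a full second of CPU time per request, which is why dynamic generation with no caching, on a slow runtime, is still nowhere near a bottleneck at this scale.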