Hopefully-constructive criticism/additions:

* If your ssh connection took forever, it doesn't really matter whether it's a load issue or a bandwidth issue: the site is ungodly slow, and so will be your admin access. Kill the site, put up an "under construction" sign, fix the performance issue, then bring the site back. Link to a Google cache, Web Archive, or other mirror in the meantime if possible.

* Don't waste time running ps. If you're running Apache, just grep for MaxClients in the error log (quick check sketched at the end of this comment). Actually, there's really no point in checking: if you're being slashdotted, you've hit MaxClients, I guarantee it.

* top isn't going to give you an accurate per-process memory figure. You'd need to check smaps and some other things, and all of that will take too long. Remove every module except what you need to serve whatever static content you want to get out there during the slashdotting. You definitely do want to reduce memory any way you can if you're constantly swapping and it's loading you to hell (confirm with vmstat/mpstat/iostat).

* `killall -9 httpd` works faster.

* Your estimates of RAM per client go up with box size when they should be constant (25MB for 512MB of RAM but 54MB for 4GB?). If you're lucky your app won't even use up all this valuable memory: copy-on-write shares as many pages as it can until an individual process dirties them or reserves anonymous memory of its own. Once you've unburdened yourself of extra modules (run 'ldd' on the individual Apache modules if you want to see all the shit they can load into your box at runtime), run Apache with one or two processes to test, look at the memory use, and go from there (see the sketch at the end).

* I'm kind of on the fence about this one, but in some circumstances it can help a little to reduce MaxRequestsPerChild to something stupidly low, like 100-1000. You risk overloading on i/o when a process is reaped and a new one loads up, but if your processes keep swelling with memory as they run (hi mod_perl!), killing them off and starting new ones may help you (example config at the end).

* Honestly, in a slashdotting situation, use 'wget' or 'curl' to take a snapshot of your dynamic page(s) and put those in place as static files to be served to users. If you don't have a proper caching layer, don't even worry about your database, because you will almost invariably kill it with queries, which will kill your webservers. If you want a 'dynamic' version of your site to show people that updates regularly, set up a cron job to wget the dynamic pages every 1-2 minutes and overwrite the static copy (but for god's sake make it back up the old copy and only move the new one into place if it's not empty or an error page; a sketch of such a script is at the end).

* Looking for the biggest files is good. You can also grep and sort the Apache logs to see which files are being requested the most, and staticize/shrink them however possible: CSS/JS can have excess whitespace removed with various tools, images can be shrunk with 'convert', dynamic pages can be made static as above, etc. (a couple of examples at the very end). `sed -e 's/.*] "/"/' $LOG_DIR/access_log | sort | uniq -c | sort -g | tail` (sort doesn't print the count with 'sort -u' ... somebody should add that). Oh yeah, and anything that writes a log? Disable it now, before /tmp or /var fills up.
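
The MaxClients check is one line; a minimal sketch, assuming a stock RHEL-style log path (adjust to your ErrorLog setting):

  # Log path is an assumption; point it at wherever ErrorLog writes.
  grep -i maxclients /var/log/httpd/error_log | tail -5
  # Under a slashdotting you'll see Apache's own warning:
  #   "server reached MaxClients setting, consider raising the
  #    MaxClients setting"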
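
For sizing memory per child, a rough sketch of what I mean; the module path is an assumption and varies by distro:

  # See what a single module drags into every child at runtime:
  ldd /etc/httpd/modules/mod_ssl.so
  # Start a stripped-down instance with one or two children, then
  # eyeball resident size per process:
  ps -o pid,rss,vsz,args -C httpd
  # Remember RSS overstates things (shared pages counted per process),
  # which is why /proc/<pid>/smaps is the accurate-but-slow route.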
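
If you do try the low MaxRequestsPerChild trick, it goes in the prefork section of httpd.conf; the numbers here are illustrative, not a recommendation:

  <IfModule prefork.c>
      StartServers          5
      MaxClients           20
      MaxRequestsPerChild 500   # reap children before they swell up
  </IfModule>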
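
And for the cron'd snapshot, a minimal sketch with the backup-and-sanity-check behavior I described; the URL and paths are made up, and the error-page test is the crudest thing that could work:

  #!/bin/sh
  # Fetch the dynamic page, keep the old static copy, and only swap
  # the new one in if it's non-empty and doesn't look like an error.
  URL='http://localhost/index.php'     # assumed
  DEST=/var/www/html/index.html        # assumed
  TMP=$DEST.new.$$
  if wget -q -O "$TMP" "$URL" && [ -s "$TMP" ] \
     && ! grep -qi 'internal server error' "$TMP"; then
      cp -p "$DEST" "$DEST.bak" 2>/dev/null
      mv "$TMP" "$DEST"
  else
      rm -f "$TMP"
  fi
  # crontab entry, every 2 minutes:
  # */2 * * * * /usr/local/bin/snapshot.sh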
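
On shrinking and silencing, two examples of the sort of thing I mean (filenames illustrative; 'convert' is ImageMagick):

  # Re-encode a heavyweight image at half the size and lower quality:
  convert header.png -resize 50% -quality 60 header.jpg
  # And in httpd.conf, comment out access logging until the storm passes:
  #CustomLog logs/access_log combined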