Roughly, a somewhat lackluster response to a somewhat lackluster DDoS attempt.

They tried blocking specific IP addresses, which didn't work because the attack was somewhat distributed. They then just turned on some caching, which allowed the site to function, albeit with an unknown excess bandwidth charge pending.

And the DDoS itself can't have been terribly impressive, as all it took to mitigate it was a bit of caching. He mentions 10 requests/sec as the scale of the attack.
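For what it's worth, blocking sources by hand is usually one firewall rule per address, which is exactly why it stops working the moment the attack gets even slightly distributed. A rough sketch of what that step tends to look like on a Linux box (the address is made up):

    # drop traffic from one attacking address (hypothetical IP)
    iptables -A INPUT -s 203.0.113.45 -j DROP
    # repeat per address... hopeless once there are hundreds of sources

Caching scales better here because every source then gets the same cheap pre-rendered response instead of hitting the full application stack.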
The webpage[0] seems to be having issues. The best I could do was the Google cache[1] or the Markdown source[2].

[0]: http://lologhi.github.io/symfony2/2016/04/04/DDoS-attack-for-ransom/

[1]: https://webcache.googleusercontent.com/search?q=cache:J7lca_k5dWcJ:ghirardotti.fr/symfony2/2016/04/04/DDoS-attack-for-ransom/+&cd=2&hl=en&ct=clnk&gl=us

[2]: https://github.com/lologhi/lologhi.github.com/blob/master/_posts/2016-05-04-DDoS-attack-for-ransom.md
This is an amazingly weak DDoS. Put your site behind CloudFlare or a similar free service and go take a nap. They'll tank this without raising an eyebrow.
Ummmm... a cache layer is a must-have for any web application; perhaps he could have avoided the attack altogether if it had been in place since day one?

That holds for this kind of attack, at least; a more serious DDoS won't be tamed by "just adding cache".
For next time, so you don't have to copy and paste. No need for sed:

    cat <file> | cut -d ' ' -f1 | sort | uniq -c | sort -nr
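For a standard nginx/Apache access log the first space-delimited field is the client IP, so this prints a request count per source, highest first. Roughly (made-up addresses and counts):

    $ cat access.log | cut -d ' ' -f1 | sort | uniq -c | sort -nr | head -3
       9321 198.51.100.7
       8874 203.0.113.45
        112 192.0.2.10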
Well, a typical server-side SLA is 500 ms; that gives you a chance of loading the whole page in under 3 seconds, which is what Google's usability findings recommend.

villa-bali is not even close to this. My bet is that you (or your ORM) are making too many requests to the database. Try recording ALL the requests to the database during page rendering; I bet you'll find about a hundred.
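If the Symfony debug toolbar isn't available, one quick-and-dirty way to record every query for a single page load is the MySQL general query log; a sketch, assuming MySQL and shell access to the database server (the log path is illustrative):

    # switch the general query log on, load the page once, then switch it off
    mysql -u root -p -e "SET GLOBAL general_log_file='/tmp/queries.log'; SET GLOBAL general_log='ON';"
    # ... request the page in a browser ...
    mysql -u root -p -e "SET GLOBAL general_log='OFF';"
    grep -c ' Query' /tmp/queries.log   # rough count of statements issued

(With the Symfony profiler enabled, the Doctrine panel in the toolbar shows the same query count with much less ceremony.)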
Check out the following test results:

8 test agents: http://loadme.socialtalents.com/Result/ViewById/57341f645b5f160adca6c1bc - 5% of users have to wait more than 2 seconds
16 test agents: http://loadme.socialtalents.com/Result/ViewById/57341f1a5b5f160adca6c19b - 5% of users have to wait more than 4 seconds.

Clearly, any bot can nuke this website easily.
I wonder what would happen if GET / only returned a redirect to somewhere else (either via an HTTP status code or an HTML page with window.location='http://yoursite.com/new_page').
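If anyone wants to try it, a throwaway way to see whether the attack traffic follows redirects is to answer every request with a bare 302 and watch what comes back. A sketch with netcat (port and target URL are made up; the listen flags differ between the GNU and OpenBSD flavors):

    # answer every connection with nothing but a redirect
    while true; do
      printf 'HTTP/1.1 302 Found\r\nLocation: http://yoursite.com/new_page\r\nContent-Length: 0\r\n\r\n' | nc -l 8080
    done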