I would think Transfer-Encoding would be a better choice than Content-Encoding. It's processed at a lower level of the stack and <i>must</i> be decoded – Content-Encoding is generally only decoded if the client is specifically interested in whatever's inside the payload. (Note that you don't have to specify the large Content-Length in this case as it is implied by the transfer coding.)<p>Also worth trying is an XML bomb [1], though that's higher up the stack.<p>Of course you can combine all three in one payload (since it's more likely that lower levels of the stack implement streaming processing): gzip an XML bomb followed by a gigabyte of space characters, then gzip that followed by a gigabyte of NULs, then serve it up as application/xml with both Content-Encoding and Transfer-Encoding: gzip.<p>(Actually now that I think of it, even though a terabyte of NULs compresses to 1 GiB [2], I bet <i>that</i> file is itself highly compressible, or could be made to be if it's handcrafted. You could probably serve that up easily with a few MiB file using the above technique.)<p>EDIT: In fact a 100 GiB version of such a payload compresses down to ~160 KiB on the wire. (No, I won't be sharing it as I'm pretty sure that such reverse-hacking is legally not much different than serving up malware, especially since black-hat crawlers are more likely than not running on compromised devices.)<p>[1] <a href="https://en.wikipedia.org/wiki/Billion_laughs" rel="nofollow">https://en.wikipedia.org/wiki/Billion_laughs</a><p>[2] <a href="https://superuser.com/questions/139253/what-is-the-maximum-compression-ratio-of-gzip/579290" rel="nofollow">https://superuser.com/questions/139253/what-is-the-maximum-c...</a>
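A small-scale sketch of that nesting trick in Python (hedged: runs of NULs here stand in for the gigabyte payloads, the XML-bomb layer is omitted, and the exact ratios depend on the gzip implementation):

```python
import gzip

# 10 MiB of NULs as a stand-in for the gigabyte-scale runs described above.
payload = b"\x00" * (10 * 1024 * 1024)

# First pass: long zero runs compress roughly 1000:1.
inner = gzip.compress(payload, compresslevel=9)

# Second pass: the deflate stream for a zero run is itself highly
# repetitive (repeated back-reference symbols), so it compresses again.
outer = gzip.compress(inner, compresslevel=9)

print(len(payload), len(inner), len(outer))
```

The doubly-compressed payload is what you'd put on the wire; anything that naively inflates both layers pays the full expanded cost.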
How come when I posted this (my blog post) here I only got 2 points? <a href="https://news.ycombinator.com/item?id=14704462" rel="nofollow">https://news.ycombinator.com/item?id=14704462</a> :D
Reminds me of a time I once wrote a script in Node to send an endless stream of bytes at a slow & steady pace to bots that were scanning for vulnerable endpoints. It would cause them to hang, preventing them from moving on to their next scanning job; some remained connected for as long as weeks.<p>I presume the ones that gave out sooner were manually stopped by whoever maintains them, or they hit some sort of memory limit. Good times.
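A minimal sketch of that slow-drip idea (in Python rather than the Node original; the port, pacing, and advertised Content-Length are arbitrary choices for illustration):

```python
import socket
import threading
import time

# Advertise a huge body, then trickle it out one byte at a time so the
# client sits waiting for a response that never finishes.
HEADER = b"HTTP/1.1 200 OK\r\nContent-Type: text/html\r\nContent-Length: 100000000\r\n\r\n"

def drip(conn, delay=1.0):
    conn.sendall(HEADER)
    try:
        while True:
            conn.sendall(b" ")   # one byte at a time, forever
            time.sleep(delay)
    except OSError:
        pass                     # client finally gave up or was killed
    finally:
        conn.close()

def serve(port=8081):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", port))
    srv.listen(16)
    while True:
        conn, _ = srv.accept()
        threading.Thread(target=drip, args=(conn,), daemon=True).start()
```

One thread per victim keeps many bots hanging at once; the cost to the server is a byte per second per connection.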
Interesting and related re attacks on a Tor hidden service: <a href="http://www.hackerfactor.com/blog/index.php?/archives/762-Attacked-Over-Tor.html" rel="nofollow">http://www.hackerfactor.com/blog/index.php?/archives/762-Att...</a><p>And the follow up: <a href="http://www.hackerfactor.com/blog/index.php?/archives/763-The-Continuing-Tor-Attack.html" rel="nofollow">http://www.hackerfactor.com/blog/index.php?/archives/763-The...</a>
Wait a minute... He is doing the exact same thing as the former RaaS (ransomware as a service) operator Jeiphoos (he operated Encryptor RaaS).
It's known that Jeiphoos is from Austria. Exactly one year after the shutdown of the service, someone from Austria is publishing exactly the same thing an Austrian ransomware operator was doing a year ago.
Does anyone know if this kind of white hat stuff has been tested by law?<p>Because it seems in the realm of possibility that if a large botnet hits you and your responses crash a bunch of computers you could do serious time for trying it. I'm hoping there's precedent against this...
This is why web crawlers are built with upper boundaries on <i>everything</i>!<p>Nobody malicious brings down crawlers. It's just unexpected things you find out on the internet.
The article says that 42.zip compresses 4.5 petabytes down to 42 bytes. It should say 42 <i>kilobytes</i>.<p>I don't see a way to comment on the article itself, but hopefully the author reads this.
I don't think this "defends" your website. If anything, it draws attention to it.<p>Might also be used for some kind of reflection attack. Want to kill some service that lets users provide a URL (for an avatar image or something) - point it to your zip bomber.
A friend of mine has a very useful little service that tracks attempts to breach servers from all over the world:<p><a href="https://www.blockedservers.com/" rel="nofollow">https://www.blockedservers.com/</a><p>It's a lot more effective to kill the connection rather than to start sending data if you're faced with a large number of attempts.
This is like the soft equivalent of leaving a USBKill device in your backpack, to punish anyone who successfully steals it and tries to comb through your data.
This would be an entertaining way of dealing with MITM agents as well, over HTTP. As long as the client knows not to decode the payload, you could trade bombs back and forth while the MITM spy wastes tons of resources unpacking them.
Another method is wasting attackers' time by sending out a character per second or so. It works so well against spam that OpenBSD includes such a <i>spamd</i> honeypot.
We need some legal advice in this thread.<p>What if the compressed file is plausibly valid content? How could intent be malicious if a request is served with actual content?
Reminds me a bit of Upside-Down-Ternet: <a href="http://www.ex-parrot.com/pete/upside-down-ternet.html" rel="nofollow">http://www.ex-parrot.com/pete/upside-down-ternet.html</a>
This could also be seen as a bug on the browser side. I'd also be interested in the browser results for the petabyte version.<p>I wonder if there's room to do this with other protocols? Ultimately we want to crash whatever tool the scriptkiddy uses.
About a month ago one of my websites was being scraped. They were grabbing JSON data from a mapping system.<p>I replaced it with a GZIP bomb. It was very satisfying to watch the requests start slowing down, and eventually stop.
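A hedged sketch of that swap in Python's stdlib HTTP server (the `/map.json` path is an invented stand-in, and the bomb here inflates to only 100 MiB to keep the example tame):

```python
import gzip
import http.server

# Build the bomb once at startup: ~100 KiB on the wire, 100 MiB inflated.
BOMB = gzip.compress(b"\x00" * (100 * 1024 * 1024), compresslevel=9)

class BombHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/map.json":
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Encoding", "gzip")
            self.send_header("Content-Length", str(len(BOMB)))
            self.end_headers()
            self.wfile.write(BOMB)   # scraper inflates this to 100 MiB
        else:
            self.send_error(404)

def serve(port=8080):
    http.server.HTTPServer(("0.0.0.0", port), BombHandler).serve_forever()
```

Because the payload is precomputed, each request costs the server a ~100 KiB write while the scraper pays the full decompression bill.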
Interesting!<p>That also crossed with another thought about pre-compressing (real!) content so that Apache can serve it gzipped entirely statically with sendfile() rather than using mod_deflate on the fly, so unless I've misunderstood I think that bot defences can be served entirely statically to minimise CPU demand. I don't mind a non-checked-in gzip -v9 file of a few MB sitting there waiting...<p><a href="http://www.earth.org.uk/note-on-site-technicals.html" rel="nofollow">http://www.earth.org.uk/note-on-site-technicals.html</a>
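The pre-compression step could be sketched roughly like this (hedged: the directory layout and skipped extensions are invented; the idea is just to write a `.gz` sibling for each static file so the server can hand it out via sendfile() with Content-Encoding: gzip, with no per-request compression):

```python
import gzip
import shutil
from pathlib import Path

# Write a maximally-compressed .gz sibling next to each servable file.
# Already-compressed formats are skipped since gzipping them is wasted work.
def precompress(root):
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix not in (".gz", ".png", ".jpg"):
            gz = path.with_name(path.name + ".gz")
            with open(path, "rb") as src, gzip.open(gz, "wb", compresslevel=9) as dst:
                shutil.copyfileobj(src, dst)
```

Run it as a build step; the web server then only has to map `foo.html` to `foo.html.gz` for clients that accept gzip.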
Similar topic a couple of months ago:<p><a href="https://news.ycombinator.com/item?id=14280084" rel="nofollow">https://news.ycombinator.com/item?id=14280084</a>
Both the ZIP and GZIP file formats record the uncompressed size – ZIP in its file headers, GZIP in a 4-byte trailer (the size mod 2^32). You could stream and check those fields to determine whether a zip bomb is being delivered, though a hostile server can forge them, so a hard cap on decompressed output is the more robust check. Obviously something script-kiddies aren't going to do, but the scripts they use can be improved and redistributed fairly easily.
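A sketch of the cap-everything defense in Python (the 10 MiB budget is an arbitrary example): inflate the gzip stream incrementally and bail out as soon as the output exceeds the budget, rather than trusting any size field.

```python
import zlib

def bounded_gunzip(data, limit=10 * 1024 * 1024):
    # wbits = MAX_WBITS | 16 tells zlib to expect a gzip header/trailer.
    d = zlib.decompressobj(wbits=zlib.MAX_WBITS | 16)
    out = bytearray()
    buf = data
    while buf and not d.eof:
        # max_length bounds how much output one call may produce;
        # unprocessed input is handed back in unconsumed_tail.
        out += d.decompress(buf, limit + 1 - len(out))
        buf = d.unconsumed_tail
        if len(out) > limit:
            raise ValueError("decompressed size exceeds limit: likely a bomb")
    return bytes(out)
```

This rejects a bomb after inflating at most `limit` bytes, no matter what the headers claim.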
Do browsers protect against media served with Content- or Transfer-Encoding like this? If you use something that lets you embed images, what's to stop you from crashing the browser of anyone who happens to visit the page your "image" is on?
A similar "slow bomb" could be created for attempted ssh connections to a host using an sshrc script. For example, for clients which do not present a key, just keep them connected and feed them garbage from time to time. Or rickroll them.
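A standalone variant of that idea (not sshrc itself, but a decoy listener in the spirit of SSH tarpits): the SSH protocol allows the server to send arbitrary banner lines before its version string, so a fake listener can feed garbage lines forever and the client never reaches key exchange. Port and pacing below are arbitrary choices.

```python
import random
import socket
import threading
import time

def _drip_banner(conn, delay):
    try:
        while True:
            # Any line not starting with "SSH-" is treated as a pre-version
            # banner line, so the client keeps reading indefinitely.
            conn.sendall(b"%x\r\n" % random.getrandbits(32))
            time.sleep(delay)
    except OSError:
        pass
    finally:
        conn.close()

def ssh_tarpit(port=2222, delay=2.0):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", port))
    srv.listen(16)
    while True:
        conn, _ = srv.accept()
        threading.Thread(target=_drip_banner, args=(conn, delay), daemon=True).start()
```

Point the decoy at a non-standard port (or forward port 22 scans to it) and brute-force bots stall there instead of hammering the real sshd.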
Interesting, on FF54 the test link pegs a CPU but the memory doesn't rise. Eventually it stops and CPU returns to normal. But then I did a 'view source', and the memory use rose until the browser got oomkilled (20GB free ram + swap)