If you expose a web server to the internet today, you'll get roughly ten malicious requests for every legitimate one.

This constant, unrelenting beating at your doors doesn't go away unless you add perimeter protection. The options are:

1) Block the IPs and CIDR ranges that are giving you trouble

2) Silently scan the connection request and block it when things look fishy

3) Return a challenge in the response that is difficult for bots to complete

Most of the bot protection on the internet is #2, where you don't notice you've been verified as a human and the site just loads. People hate #3, completing a challenge, but the alternative is #1, where the site doesn't load at all.

I'd argue that bots are breaking the internet.
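Option #1 is the bluntest of the three. A minimal sketch in nginx, assuming you maintain the list yourself (the ranges below are RFC 5737 documentation prefixes standing in for real offenders):

    # hypothetical blocklist; replace with the ranges actually hitting you
    location / {
        deny  203.0.113.0/24;
        deny  198.51.100.0/24;
        allow all;
    }

The trouble is that the list never stops growing, which is what pushes people toward #2 and #3.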
Another thing that annoys me so much is that a lot of websites offer RSS feeds, and then their feeds are broken because of Cloudflare.

If your feed reader periodically requests a feed, Cloudflare starts showing its JavaScript-based "checking your browser" interstitial.
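You can watch it happen from the command line: the challenged fetch comes back as an HTML interstitial instead of the feed XML. Something like this, assuming cf-mitigated is still the header Cloudflare sets on challenged responses:

    curl -sI https://example.com/feed.xml | grep -iE '^(HTTP|content-type|cf-mitigated)'
    # a broken feed fetch looks something like:
    #   HTTP/2 403
    #   content-type: text/html; charset=UTF-8
    #   cf-mitigated: challenge

A feed reader can't run the JavaScript, so it just sees a 403 forever.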
The site owner has complete control over this in the CF dashboard, and can easily disable it or lower the threshold. Myself, I'm quite happy with stopping bad traffic (about 20% of the requests to my sites) at the edge with CF and keeping my hosting costs down.
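For the RSS case specifically, the owner can carve out an exception. A sketch of a custom WAF rule expression, paired with a Skip action or a configuration rule that lowers the security level for those paths (the paths are hypothetical, and exactly what you can skip varies by plan):

    (http.request.uri.path contains "/feed") or (http.request.uri.path eq "/rss.xml")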
I've never used Cloudflare, so apologies for what is probably documented somewhere. Can site owners not set JS requirements per-URL? I ask because the same hidden JS browser tests can be added to Nginx and HAProxy using Lua scripting, and it can be done by ACL for specific URLs: e.g. no JS for static content and URLs fetched with GET, but require passing the hidden JS browser test before reaching any page that takes a POST. That is just one example of the myriad of possibilities. Can that not be set up in CF? Or is it all-or-none?

For people not using a CDN and wanting to keep bots off static content, this can *for now* be partially accomplished by doing two things: forcing HTTP/2.0, and one raw-table iptables rule to drop TCP SYN packets that do not have an MSS in the desired range. Most poorly written bots do not even bother to set MSS. I'd wager this is something CF looks at in their eBPF logic. Blocking non-HTTP/2.0 requests will drop all search engine crawlers except for Bing.
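A sketch of the per-URL ACL idea in OpenResty, assuming a cookie named js_ok that a JS snippet on the GET pages would set (made up here; a real deployment would verify a signed, expiring token rather than a bare cookie):

    location /comment {
        access_by_lua_block {
            -- hypothetical gate: POSTs require the cookie the JS check set
            if ngx.req.get_method() == "POST"
               and not ngx.var.cookie_js_ok then
                return ngx.exit(ngx.HTTP_FORBIDDEN)
            end
        }
    }

And the MSS trick is one rule with the tcpmss match; the 1220:1460 range and the interface name are assumptions for a typical Ethernet path, and since the MSS option only appears on SYNs, that's the only place the match is meaningful:

    # drop SYNs whose MSS option is absent or outside the expected range
    iptables -t raw -A PREROUTING -i eth0 -p tcp --syn \
        -m tcpmss ! --mss 1220:1460 -j DROP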
Cloudflare, as of this month, shows propaganda on the captcha page, like "40% of the internet was historically bots" (as if that matters). It fits right in with the common sentiment that the old internet was bad: welcome to the new internet, where nothing is allowed unless it's a legitimate commercial use. This is getting out of hand.
How exactly do you imagine bot/attack protection (Cloudflare's main product) working without JS? Even bypassing a captcha by using your browser to assert trust requires JS.

Are captchas and DDoS bot protection ruining the web?
Cloudflare is not the one breaking the internet; bots are. Cloudflare is just providing a solution to deal with the bot problem.

This is also controlled by the Cloudflare customer. If I'm having issues with my server due to fake/hostile traffic coming to my website, you're dang right I will do what it takes to stop it.
It's annoying to some of us and will only result in escalation; browser plugins will prolly be made to run JS only in this context and not in the final render.
Some more wasted processing power that might block unwanted requests, but apart from DDoSes these requests shouldn't be a threat anyway. Maybe DoS zombie agents will be updated to run a bit of JS, if it's worth the hassle.

Every day we stray further from what the web could have been if we could have nice things.
Technically this is just breaking the web, not the internet; none of the other protocols are being interfered with.

Even Cloudflare's DNS product speaks the standard DNS protocol, sitting behind network-level DDoS protections. It's only HTTP where they tamper with the application layer.
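You can see the split per protocol against a Cloudflare-fronted zone (example.com standing in for one here): DNS answers normally, while HTTP may hand back the challenge page:

    # DNS: plain answers, no application-layer interference
    dig +short example.com

    # HTTP: this is where the challenge can appear
    curl -sI https://example.com/ | head -n 1
    # may print: HTTP/2 403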
This is JS specifically on Cloudflare's own domain, is it not? I don't think you need to keep JS enabled afterwards. So a JS/cookie-blocking setup should be versatile enough to allow only Cloudflare's JS and cookies, which are then self-destructed.
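If the challenge assets do come from challenges.cloudflare.com (an assumption; some deployments seem to serve them from the site's own /cdn-cgi/ path instead), a uBlock Origin dynamic filtering rule can scope the exception:

    * challenges.cloudflare.com * noop

Note that the clearance cookie itself is set on the site's domain, I believe, so that's the one to let self-destruct when the tab closes.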