The hard part of building a CDN is knowing when you need one. 99.9% of all websites using a CDN don't need it.
Serving static files consumes so few resources that a single server can serve billions of users, as long as you don't put a script in the file-serving path. The most cost-effective solution, which also has the lowest latency, is to never use a CDN at all. If your hosting provider charges you a lot for traffic, you are better off switching providers.
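To illustrate the "no script in the serving path" point, here is a minimal nginx sketch that serves files straight off disk; the paths and values are illustrative, not a recommendation:

```nginx
server {
    listen 80;
    root /var/www/static;       # hypothetical document root

    location / {
        sendfile   on;          # kernel hands the file to the socket directly
        tcp_nopush on;          # send full packets
        expires    7d;          # let clients and proxies cache
        try_files  $uri =404;   # plain file lookup, no backend involved
    }
}
```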
This is nicely written, and a lot of it mirrors my experience using nginx as a pseudo-CDN. Other areas worth exploring might be HTTP/3, SSL session caching, and general latency/TTFB optimizations.
This is very cool! One thing I would definitely like to see is domain name resolution.
Shopify, Dukaan, and Vercel all make a big deal out of it, going all the way to BGP.

https://twitter.com/subhashchy/status/1536769406801309696
Is it possible for CDNs to cache per URL per user? I'm thinking of something like /favorites, where one URL would list something different for everyone. When I've set up caching on the backend, it was keyed off the user.

This was a very informative read!
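One common way this is done is to fold a user identifier (session cookie, auth header) into the cache key, so the same URL maps to a distinct cache entry per user. A rough sketch in Python; the key scheme and helper are hypothetical, not any particular CDN's API:

```python
import hashlib

def cache_key(url, user_id=None):
    """Build a per-user cache key: the same URL with different
    users yields different cache entries. Real CDNs express this
    as a custom cache key or a Vary on a session header/cookie."""
    raw = "{}|{}".format(url, user_id or "anonymous")
    return hashlib.sha256(raw.encode()).hexdigest()

# Same URL, different users -> distinct cache entries:
print(cache_key("/favorites", "alice") != cache_key("/favorites", "bob"))  # True
```

The trade-off is that per-user keys fragment the cache, so hit rates drop sharply compared to shared keys.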
The hard part of building a CDN is scaling it. The best approach, IMO, is to use Fly.io to host an anycast IP (with horizontal scaling) and store cache files on disk.

Fly.io also has a built-in Grafana dashboard for your machines.
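The "store cache files on disk" part can be sketched as a simple read-through cache. This is illustrative only, not Fly.io-specific code; `origin` stands in for whatever fetches from your upstream, and there is no TTL, eviction, or Cache-Control handling:

```python
import hashlib
import os
import tempfile

CACHE_DIR = tempfile.mkdtemp()  # each edge node keeps its own local cache

def fetch(url, origin):
    """Read-through disk cache: serve from local disk if cached,
    otherwise call the origin and persist the body."""
    path = os.path.join(CACHE_DIR, hashlib.sha256(url.encode()).hexdigest())
    if os.path.exists(path):
        with open(path, "rb") as f:
            return f.read()
    body = origin(url)
    with open(path, "wb") as f:
        f.write(body)
    return body
```

With anycast in front, every node runs this independently, so each point of presence warms its own cache on first request.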
I'm curious if any HNers have opinions on Prometheus vs. other time-series databases like InfluxDB?

I periodically consider a Grafana & backend setup for when Datadog becomes cost-prohibitive for metrics with several tags.
Another example of a project duped into thinking Lua is "powerful". It is small; that is it. Lua has near-zero useful built-in functionality and makes the developer reinvent the same things over and over again.

https://media1.giphy.com/media/TFO2mwVPIFoOJcuTSC/giphy.gif