Large tech companies seriously do not care. They say they do: they point to all these heuristics and optimizations, to Chrome's dev tools where you can simulate slow connections, and so on. Great.

The problem is, they're taking an experience that is fundamentally, ridiculously heavy and then spending thousands of man-hours trying to optimize it. No one even *considers* that maybe it's the experience itself that is too heavy, and that no amount of optimization can fix that.

Take the YouTube home page. Load it up and you'll find it's making over 200 requests and transferring megabytes of data (a quick console check is sketched at the end of this comment). Google's most obvious solutions: let's speed up TLS so each request goes faster, let's invent new image and video compression algorithms to shrink each response, let's batch requests to reduce latency. Technology, complexity, more code, more code.

No one actually takes a step back and asks whether the YouTube home page should be making 200 requests at all. What if it only made 20? We've got to load some thumbnails, so there's bound to be a bunch of requests there, but otherwise what the heck is all this JS?

TLS on one request isn't the problem. The problem is the hundreds of requests a typical page makes.

Uncomfortable opinion: the only reason the internet has survived this long is Moore's Law. We built all of this technology and SDLC process in an era where another 20% jump in performance was always just around the corner, so who cares if it's slow today. Yeah, that era is done. And we, as an industry, are completely fucked. It's no exaggeration to call this a "back to the fundamentals" moment, and engineering for it is going to cost us billions of collective dollars.
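
For anyone who wants to reproduce the request count themselves, here's a rough sketch using the browser's Resource Timing API. The exact numbers are my assumption and will vary with region, login state, and ad load; cross-origin entries also report a transferSize of 0 unless the server sends Timing-Allow-Origin, so treat the byte total as a lower bound. It's written as TypeScript; drop the `as` cast to paste it straight into the console.

    // Rough page-weight check via the Resource Timing API.
    // Run this in the console after the page has finished loading.
    // Note: the default resource-timing buffer holds ~250 entries, so a very
    // chatty page may need performance.setResourceTimingBufferSize() called early.
    const resources = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
    const totalBytes = resources.reduce((sum, r) => sum + r.transferSize, 0);
    console.log(
      `${resources.length} requests, ` +
      `${(totalBytes / 1048576).toFixed(1)} MB transferred (excluding the HTML document itself)`
    );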