If I lose my internet connection I no longer have access to new links or past sites I've visited.<p>The sites I've visited had their HTML rendered, and that could be saved for later use.<p>Browsers should offer to re-visit one of the 10 last cached versions of visited sites instead of offering nothing.<p>New content should be fetched periodically so there's something to read in times of network issues.<p>Having nothing to read can be boring.
I guess the reason is that the Internet works so well nowadays.<p>A long time ago I was running a proxy at home behind a dialup (or was it ISDN?) connection that would cache the contents of websites and serve the old contents if it was unable to connect to refresh its copy. It worked <i>ok</i> for the time, before the proliferation of SSL. I also modified it to <i>first</i> serve the cached copy while updating the page in the background, which sometimes led to breakage, but things weren't rock solid otherwise at the time ;-).<p>I guess something like that would be nice even today, but it would probably need to be built into a browser as an extension to most conveniently handle the SSL issue. I think I would use it to archive all the pages I've visited and then easily visit older versions of them, in addition to having a way to search them.
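<p>The serve-stale-then-refresh core of that proxy is small, for what it's worth. A minimal sketch in Node.js, plain HTTP only (so it sidesteps rather than solves the SSL issue); the in-memory cache, port, and fallback content type are all illustrative:<p>    // Minimal stale-while-revalidate proxy sketch (HTTP only; names illustrative).
    const http = require('http');
    const cache = new Map(); // url -> { type, body }

    http.createServer((req, res) => {
      const url = req.url; // proxy requests carry the absolute URL
      const hit = cache.get(url);
      if (hit) {
        // Serve the stale copy immediately...
        res.writeHead(200, { 'content-type': hit.type });
        res.end(hit.body);
      }
      // ...then refresh in the background (or serve fresh on a cache miss).
      http.get(url, (upstream) => {
        const chunks = [];
        upstream.on('data', (c) => chunks.push(c));
        upstream.on('end', () => {
          const entry = {
            type: upstream.headers['content-type'] || 'text/html',
            body: Buffer.concat(chunks),
          };
          cache.set(url, entry);
          if (!hit) {
            res.writeHead(upstream.statusCode || 200, { 'content-type': entry.type });
            res.end(entry.body);
          }
        });
      }).on('error', () => {
        // Offline and nothing cached: nothing useful to serve.
        if (!hit) { res.writeHead(504); res.end('offline and no cached copy'); }
      });
    }).listen(8080);
<p>Point the browser's HTTP proxy setting at localhost:8080 and it serves whatever it last saw when the connection drops.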
So many sites rely on backend functionality such as XHRs, session management, auth, etc., to render the front end. Sure, I can cache and re-render the frontend from some historical snapshot, but the same JavaScript that came down in the original load will execute again, with uncertain and probably not great results.<p>Providing functionality like that seems like a good idea on its face, but implementing it would be yet another set of test cases and problems for site/webapp developers, and nobody wants that in 2024, when internet connections are ubiquitous in Western, developed nations. In the case of non-developed nations, the money to implement such a thing just isn't a viable spend.<p>Doable? Sure. Worth the expense relative to other clear and present customer-facing problems? I couldn't justify it.
JS service workers can do a great job of intelligent caching, and I have stuff of my own I can use on the tube (no internet). I don't know why they aren't used more. Sadly they only intercept their own site; see the sketch below.
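<p>The stale-while-revalidate pattern is only a few lines in a service worker. A minimal sketch (the cache name and file layout are illustrative; it only caches GETs, and as noted it only applies within the registering origin's scope):<p>    // sw.js -- serve from cache immediately, refresh in the background.
    const CACHE = 'offline-v1'; // illustrative cache name

    self.addEventListener('fetch', (event) => {
      if (event.request.method !== 'GET') return; // only GETs are cacheable
      event.respondWith(
        caches.open(CACHE).then(async (cache) => {
          const cached = await cache.match(event.request);
          const refreshed = fetch(event.request)
            .then((resp) => {
              if (resp.ok) cache.put(event.request, resp.clone());
              return resp;
            })
            // Offline: fall back to the cached copy, else a network error.
            .catch(() => cached || Response.error());
          return cached || refreshed;
        })
      );
    });
<p>Register it from the page with navigator.serviceWorker.register('/sw.js') and repeat visits keep working with no connection.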