I have noticed that sites have become incredibly buggy over the past couple of years, and they seem to be taking longer to load.<p>Yesterday, literally every interaction I had with a site resulted in some bug. Often the bug was severe enough to prevent me from even accessing the page. And these weren't small, insignificant sites; these were web applications with billions in funding.<p>Furthermore, it seems like there has been an uptick in the idea that companies can do the same amount of work with fewer engineers. It would be nice to have an index to measure this sentiment against the actual performance of these sites over time.
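Even a back-of-the-envelope version of that index could be a cron job and a CSV file. The sketch below is only a rough illustration: the site list, filename, and hourly cadence are placeholders, and a bare HTTP GET skips JavaScript execution and rendering, so it misses most of what a real browser experiences. It would still show trends in error rates and response times.

    # Rough sketch of a site-reliability "index": poll a few sites on a schedule,
    # record status code and response time, and append rows to a CSV for later plotting.
    # Placeholder sites and cadence; a real measurement would use a full browser.
    import csv
    import time
    from datetime import datetime, timezone

    import requests

    SITES = ["https://example.com/", "https://example.org/"]  # placeholder list


    def sample(url):
        """Fetch one URL and return (status_code, seconds_elapsed)."""
        start = time.perf_counter()
        try:
            resp = requests.get(url, timeout=30)
            return resp.status_code, time.perf_counter() - start
        except requests.RequestException:
            return 0, time.perf_counter() - start  # 0 = request failed outright


    def run_once(path="site_health.csv"):
        with open(path, "a", newline="") as f:
            writer = csv.writer(f)
            for url in SITES:
                status, elapsed = sample(url)
                writer.writerow(
                    [datetime.now(timezone.utc).isoformat(), url, status, round(elapsed, 3)]
                )


    if __name__ == "__main__":
        while True:
            run_once()
            time.sleep(3600)  # once an hour; adjust as needed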
There are also search engines for the small web or indie web, such as <a href="https://ooh.directory/" rel="nofollow">https://ooh.directory/</a> (blogs) and <a href="https://wiby.me/" rel="nofollow">https://wiby.me/</a>. Sites like that don't tend to have a lot of ads, surveillance, and other things that slow pages down. I haven't tried Million Short, which ignores results from the 1,000,000 most popular sites.
I start by treating JavaScript as whitelist-only. If a site is blank without JavaScript, I ask myself what the chances are that it's worth enabling scripts one by one. HTML, CSS, and reasonably sized images are quick; it's the videos, scripts, and calls to databases that are slow.
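For a rough sense of where a page's weight comes from, here is a sketch (not my actual setup) that fetches a page's HTML and tallies the reported sizes of its stylesheets, images, and external scripts. It assumes the requests and beautifulsoup4 packages are installed and relies on servers sending Content-Length, so the totals are estimates only.

    # Estimate how much of a page's referenced weight is scripts vs. HTML/CSS/images.
    # Sizes come from Content-Length headers, so missing headers undercount the total.
    from urllib.parse import urljoin

    import requests
    from bs4 import BeautifulSoup


    def resource_size(url):
        """Return the reported Content-Length in bytes, or 0 if unavailable."""
        try:
            resp = requests.head(url, allow_redirects=True, timeout=10)
            return int(resp.headers.get("Content-Length", 0))
        except (requests.RequestException, ValueError):
            return 0


    def weigh_page(page_url):
        html = requests.get(page_url, timeout=10)
        soup = BeautifulSoup(html.text, "html.parser")

        buckets = {"html": len(html.content), "css": 0, "images": 0, "scripts": 0}
        for tag in soup.find_all("link", rel="stylesheet", href=True):
            buckets["css"] += resource_size(urljoin(page_url, tag["href"]))
        for tag in soup.find_all("img", src=True):
            buckets["images"] += resource_size(urljoin(page_url, tag["src"]))
        for tag in soup.find_all("script", src=True):
            buckets["scripts"] += resource_size(urljoin(page_url, tag["src"]))
        return buckets


    if __name__ == "__main__":
        for category, size in weigh_page("https://example.com/").items():
            print(f"{category:8s} {size / 1024:8.1f} KiB")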