The amount of SEO spam I encounter in my searches has grown noticeably.<p>I'm a bit afraid that this will make it difficult, if not impossible, to reach good webpages. Personal blogs seem like they're not prioritized anymore. I have to append operators like `site:*.github.io` or `site:reddit.com` to get good results now.<p>Does anyone have any ideas on how we could mitigate this degradation of search results?<p>Manually maintained lists of the best resources, blog posts, Discord servers, articles, videos, etc. related to a topic seem like a viable solution to me. Some subreddits maintain their own wikis that already achieve this for certain topics, and so do the awesome-* repos on GitHub, but these are few in number.<p>I want to see more effort poured into this, but I see no community actively working on it with clear goals. Do there exist any that I have missed? Or is anyone interested in creating one? I'd be happy to help!
You can exclude domains from your search using `my_search_string -site:example.com`. A better long-term solution is to switch to DuckDuckGo[1] for search.<p>1. <a href="https://duckduckgo.com/" rel="nofollow">https://duckduckgo.com/</a>
Check out '<a href="https://millionshort.com" rel="nofollow">https://millionshort.com</a>'<p>It gives you the option to remove the top N most popular sites from your results.
See this thread:<p><a href="https://news.ycombinator.com/item?id=23202850" rel="nofollow">https://news.ycombinator.com/item?id=23202850</a>