Quality degradation of search results (on popular search engines) has been a trending topic on HN multiple times. Below are a few of those discussions:
https://news.ycombinator.com/item?id=29392702
https://news.ycombinator.com/item?id=25538586
https://news.ycombinator.com/item?id=22107823
https://news.ycombinator.com/item?id=29552322

Most of these discussions converge on the same conclusion: ad-driven search engines create a feedback loop that forces content creators to optimize for the wrong parameters. This business model has pushed many creators behind walled gardens and led app developers to defend their data at all costs.

Content discoverability has been one of the web's big pain points since its inception (and still is). I don't understand why we still don't have a web standard that addresses it. I believe interoperable search engines are a solution: interoperability would force search services to adopt business models that match user needs, and it would also free up more information for public discovery.

This is a problem the major web standards organizations could solve (and, frankly, should have solved already). Since nobody is doing it, I've decided to spend some time on it, and I'm now working on it much more seriously.
Here is the full idea as a whitepaper: https://github.com/Aquila-Network/whitepaper/blob/master/AquilaDB_white_paper.pdf
I'm looking forward to your suggestions for improvement.
The current implementations live here: https://github.com/Aquila-Network

NB: I'm not shilling a shitcoin whitepaper.