
Ask HN: Indexing a Set of Personal Sites

9 points by vardhanw over 2 years ago
I am sure all of us have bookmarked/collected many URLs over time for various reasons. As this collection grows it becomes difficult to keep track of what made each URL interesting (why I saved it), which makes some sort of search into the *contents* of those URLs useful.

My question is: are there any existing tools, or suggestions for building scripts, that would let me search *only* my private URL collection? I wouldn't like to store the contents on my local machine, but maybe some sort of indexing could be used?

Couldn't get much from a simple DDG search, so asking here.
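For concreteness, the kind of script being asked about might look like the minimal sketch below: it walks a plain-text list of saved URLs, fetches each page, and feeds the text into a contentless SQLite FTS5 index, so only the search index (not a copy of each page) lives locally. The file names, table layout, and example query are illustrative assumptions, not an existing tool.

  #!/usr/bin/env python3
  # Sketch: full-text index over a private URL collection without keeping
  # full page copies locally. Assumes a file "bookmarks.txt" with one URL
  # per line; all names here are illustrative.
  import html
  import re
  import sqlite3
  import urllib.request

  db = sqlite3.connect("bookmarks.db")
  db.executescript("""
  CREATE TABLE IF NOT EXISTS urls(id INTEGER PRIMARY KEY, url TEXT UNIQUE);
  -- content='' makes the FTS5 table "contentless": only the inverted index
  -- is stored, not the page text itself.
  CREATE VIRTUAL TABLE IF NOT EXISTS pages USING fts5(body, content='');
  """)

  def fetch_text(url):
      # Crude tag stripping; a real script would use a proper HTML parser.
      with urllib.request.urlopen(url, timeout=10) as resp:
          raw = resp.read().decode("utf-8", errors="replace")
      return html.unescape(re.sub(r"<[^>]+>", " ", raw))

  for line in open("bookmarks.txt"):
      url = line.strip()
      if not url:
          continue
      cur = db.execute("INSERT OR IGNORE INTO urls(url) VALUES (?)", (url,))
      if cur.rowcount:  # newly added URL -> fetch and index it
          try:
              db.execute("INSERT INTO pages(rowid, body) VALUES (?, ?)",
                         (cur.lastrowid, fetch_text(url)))
          except OSError as exc:
              print("skipping", url, exc)
  db.commit()

  # Search only the private collection: match the index, map rowids to URLs.
  for (url,) in db.execute(
          "SELECT urls.url FROM pages JOIN urls ON urls.id = pages.rowid "
          "WHERE pages MATCH ?", ("static site generator",)):
      print(url)

A library like Whoosh or a self-hosted search engine would fill the same role; the key point is that only an index is stored, not the page contents.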

2 comments

beauHD over 2 years ago
Well I have several browsers and do CTRL+D when I find something interesting, and the bookmarks pile up over the course of a year. After that, I export them through the browser to an HTML file that I then review. Anything interesting, I keep for posterity. It helps if the URL is still active and not subject to link rot. I avoid cloud services like Raindrop or Pinboard for my bookmarks since they could shutter without notice, and I prefer a local copy. Over time, I've organized all the URLs into neat categories so I can visit them at my leisure.
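As an aside on the link-rot point: the exported file is typically the Netscape-style bookmarks HTML that major browsers produce, so it is easy to script a periodic dead-link check against it. A small sketch, with the file name and timeout as assumptions:

  #!/usr/bin/env python3
  # Sketch: read a browser bookmarks export and flag links that no longer
  # resolve. "bookmarks.html" and the 10s timeout are illustrative.
  import urllib.request
  from html.parser import HTMLParser

  class BookmarkParser(HTMLParser):
      def __init__(self):
          super().__init__()
          self.urls = []

      def handle_starttag(self, tag, attrs):
          # Bookmark exports list each saved page as an <a href="..."> entry.
          if tag == "a":
              href = dict(attrs).get("href", "")
              if href.startswith("http"):
                  self.urls.append(href)

  parser = BookmarkParser()
  parser.feed(open("bookmarks.html", encoding="utf-8").read())

  for url in parser.urls:
      req = urllib.request.Request(url, method="HEAD")
      try:
          with urllib.request.urlopen(req, timeout=10) as resp:
              print(resp.status, url)
      except OSError as exc:
          print("DEAD?", url, exc)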
nhellman over 2 years ago
A few months ago an HN user recommended using the YaCy search engine to index personal bookmarks: https://news.ycombinator.com/item?id=31848210#31848566