科技回声
A tech news platform built with Next.js, providing global tech news and discussion.

© 2025 科技回声. All rights reserved.

Ask HN: Indexing a Set of Personal Sites

9 points | by vardhanw | over 2 years ago
I am sure all of us have bookmarked/collected many URLs over time for various reasons. As this collection grows, it becomes difficult to remember what made each URL interesting (why I saved it), which makes some sort of search into the *contents* of those URLs useful.

My question is: are there any existing tools, or suggestions for building scripts, that would let me search *only* my private URL collection? I wouldn't like to store the contents on my local machine, but maybe some sort of indexing can be used?

Couldn't get much from a simple DDG search, so asking here.
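One way to get a searchable index over a private URL collection (not something from the thread — a hedged sketch using Python's built-in `sqlite3`, assuming it was compiled with the FTS5 full-text extension, as most modern builds are): fetch each page's text once, store it in a full-text table, and query that.

```python
import sqlite3


def build_index(db_path, pages):
    """Build a full-text index over (url, text) pairs.

    pages: iterable of (url, extracted_page_text) tuples.
    Returns an open connection to the index database.
    """
    con = sqlite3.connect(db_path)
    con.execute("CREATE VIRTUAL TABLE IF NOT EXISTS pages USING fts5(url, body)")
    con.executemany("INSERT INTO pages(url, body) VALUES (?, ?)", pages)
    con.commit()
    return con


def search(con, query):
    """Return URLs whose indexed text matches the FTS5 query, best first."""
    rows = con.execute(
        "SELECT url FROM pages WHERE pages MATCH ? ORDER BY rank", (query,)
    )
    return [row[0] for row in rows]
```

Note that a plain FTS5 table does store the indexed text; a contentless table (`content=''`) keeps only the inverted index, at the cost of not being able to retrieve the original text back out. Fetching and extracting the page text (e.g. with `urllib` plus an HTML-to-text step) is left out here to keep the sketch self-contained.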

2 comments

beauHD | over 2 years ago
Well, I have several browsers and press Ctrl+D when I find something interesting, and the bookmarks pile up over the course of a year. After that, I export them through the browser to an HTML file that I then review after a year. Anything interesting, I keep for posterity. It helps if the URL is still active and not subject to link rot. I avoid cloud services like Raindrop or Pinboard for my bookmarks since they could shutter without notice, and I prefer a local copy. Over time, I've organized all the URLs into neat categories so I can visit them at my leisure.
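For a workflow like this, the exported bookmarks file is ordinary HTML, so the URLs can be pulled out with Python's stdlib `html.parser` — a minimal sketch (the function names are illustrative, not from the comment):

```python
from html.parser import HTMLParser


class BookmarkParser(HTMLParser):
    """Collect the href of every <a> tag in a browser bookmark export."""

    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        # html.parser lowercases tag and attribute names for us.
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.urls.append(href)


def extract_urls(html_text):
    """Return all bookmark URLs found in an exported bookmarks HTML string."""
    parser = BookmarkParser()
    parser.feed(html_text)
    return parser.urls
```

The same list of URLs could then feed whatever indexing or archiving step comes next.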
nhellman | over 2 years ago
A few months ago an HN user recommended using the YaCy search engine to index personal bookmarks: https://news.ycombinator.com/item?id=31848210#31848566