Well, I have a big problem.<p>I need to reliably save several links per day, and the saved copy must survive even if the site goes down, disappears, or the article gets edited.<p>I tried Evernote Web Clipper: good, but very limited. Even with a paid account I get only 1GB, which is not enough.<p>The example page I tried to save is 3.2MB (uncached).
With an html2pdf solution I got a 1.5MB PDF, which is good, but I think there might be a better solution.<p>A tool that only saves the URL is out of the question.<p>Do you have any ideas?
For Firefox: <a href="http://maf.mozdev.org/" rel="nofollow">http://maf.mozdev.org/</a> ...saves sites into a single 'MAFF' file and adds MHT support to the browser.<p>...and it's open source (hg clone <a href="http://hg.mozdev.org/maf/" rel="nofollow">http://hg.mozdev.org/maf/</a>)
- Print to PDF. With articles from online magazines, choosing "Print" to get the whole article on a single page and then printing to PDF is my current preferred way of snapshotting (a scripted sketch follows below). The result is easily searched using Spotlight on my Mac.<p>- Instapaper still works well for me.<p>- Of late I've also been using archive.is to snapshot pages. This is useful for article permalinks to cite in research, in case the original URL disappears or its content changes after the conclusion of said research.
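If you want to script that print-to-PDF step, the wkhtmltopdf command-line tool renders a URL to a PDF with its own WebKit engine; a minimal sketch (the URL is a placeholder):

  wkhtmltopdf 'http://example.com/long-article' article.pdf

The single-page/print view of an article usually gives the smallest and most readable PDF.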
I just rolled my own: it saves the URL, then later converts the page to a MOBI file using Calibre.<p>I guess it could easily be changed to do the conversion there and then, rather than waiting, which is what it does at the moment.<p>It will also email the converted ebooks to your Kindle, which is how I realize every two weeks that I find too many things interesting to read.
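The Calibre half of a pipeline like that can be driven from its bundled ebook-convert command; a rough sketch, assuming the page has already been downloaded to page.html (filenames and title are placeholders):

  ebook-convert page.html page.mobi --title 'Saved article'

Calibre also ships calibre-smtp, which can handle the email-to-Kindle step.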
If you're not aware of it, wget[1] lets you save individual pages, mirror whole sites, filter by file type, etc.<p>[1] <a href="http://www.thegeekstuff.com/2009/09/the-ultimate-wget-download-guide-with-15-awesome-examples/" rel="nofollow">http://www.thegeekstuff.com/2009/09/the-ultimate-wget-downlo...</a>
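For the single-article case in the question, a minimal wget sketch (the URL is a placeholder): -p pulls in the page's images/CSS, -k rewrites links to point at the local copies, -E appends .html where needed, and -H allows fetching assets hosted on other domains:

  wget -E -H -k -p 'http://example.com/article'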
<a href="http://kippt.com" rel="nofollow">http://kippt.com</a> might be a good site to look at. They store the website contents on the kippt site.