"nearly all of the textual content of the English Wikipedia" = "1GB"<p>I find that hard to believe. Other wiki readers' dumps are a multiple of that. Eg aarddict for en is ~8GB.
Seriously mis-titled, since it's nowhere even close to the "Entire Wikipedia" – it's a tiny subset of the English-language Wikipedia from what I can tell.
Nice job, this looks really useful - would certainly help for the times when I'm stuck with no internet access and need to look something up.<p>One minor niggle - when I changed the file I wanted to use in settings, there was no confirmation or notification to let me know it was downloading the new file. I ended up stopping the download, erasing the data and starting again, to be sure. It might be worth adding in a confirmation to let users know it was changed OK, and is being re-downloaded.
Could this technology be used for API documentation across various languages/frameworks?<p>I could definitely use a bit of a productivity boost (by turning off web access).
Also, there was a Wikipedia/git project a while back (offline editing). All of the revision history was dumped into git.<p><a href="http://scytale.name/blog/2009/11/announcing-levitation" rel="nofollow">http://scytale.name/blog/2009/11/announcing-levitation</a><p><a href="http://www.readwriteweb.com/hack/2011/08/gitjs-a-git-implementation-in.php" rel="nofollow">http://www.readwriteweb.com/hack/2011/08/gitjs-a-git-impleme...</a><p>Why does MediaWiki have its own version control system, anyway?
This is cool, but the one thing I miss from all Wikipedia dumps so far is images. They're essential for a lot of articles. Last time I checked, images were excluded from dumps because of license issues, "fair use" in particular. How about a dump of just the images with suitable licenses? Does anyone here know why this is not available?
Random article results in a 404 maybe one time in four. Here is a suggestion for an improvement: a link on 404 pages for making that article available offline. Then if I go looking for a specific page that isn't offline, I can fetch it and read it later.
An offline Wikitravel would be incredibly useful for travelers. I couldn't find one, so I built an offline Wikitravel for Android:
<a href="https://market.android.com/details?id=com.heliod.eutravelguide&hl=en" rel="nofollow">https://market.android.com/details?id=com.heliod.eutravelgui...</a>
Unfortunately, none of the formulas render correctly, and the tables are quirky. Example: <a href="http://offline-wiki.googlecode.com/git/app.html?Permeability_(electromagnetism)" rel="nofollow">http://offline-wiki.googlecode.com/git/app.html?Permeability...</a>
It says it was tested in Firefox 10, which is a little surprising, since it doesn't work at all in Firefox 10. The IndexedDB spec changed, and Firefox changed to align with the spec between versions 9 and 10, but the page still uses the old API.
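For anyone hitting this: the main breaking change was that the draft-era <code>setVersion()</code> call for schema changes was replaced by a version parameter on <code>open()</code> plus an <code>onupgradeneeded</code> event. A minimal sketch of the new-style open (the "wiki" database name and "articles" store are made-up examples, not the app's actual schema; Firefox 10 still prefixes the factory as <code>mozIndexedDB</code>, hence the feature detect):

```javascript
// Feature-detect the IndexedDB factory (prefixed in Firefox 10).
const idb = globalThis.indexedDB || globalThis.mozIndexedDB;

function openWikiDb(onReady) {
  if (!idb) {
    console.log("IndexedDB not available in this environment");
    return;
  }
  // Spec-aligned API: pass a version number to open() and create object
  // stores inside onupgradeneeded, instead of a separate setVersion()
  // transaction as in the old Firefox <= 9 draft API.
  const request = idb.open("wiki", 1);
  request.onupgradeneeded = (event) => {
    const db = event.target.result;
    if (!db.objectStoreNames.contains("articles")) {
      db.createObjectStore("articles", { keyPath: "title" });
    }
  };
  request.onsuccess = (event) => onReady(event.target.result);
  request.onerror = (event) => console.error("open failed:", event);
}
```

Supporting both browser generations basically means branching on whether <code>open()</code> fires <code>onupgradeneeded</code> or you have to fall back to <code>setVersion()</code>.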
Does this app grab the files from Wikipedia directly? It doesn't seem very nice to create an app that pulls down gigabytes of data from a web service you do not own nor have permission from.<p>EDIT: It appears my concern was unwarranted.
Sorry for being obtuse, but I DLd the 1GB repository - where is it stored and how do I access it?<p>I see I can go to the index, from this page - is this index served up from the 1GB DL I just did?<p>How can I transfer this to [device]?
Thanks so much for this. This will be incredibly useful for me (behind the GFW, which gets moody about Wikipedia pretty often). Could it easily update itself periodically to grab fresh versions of articles? That would be a great feature, especially if it could update on an article-by-article basis instead of pulling down the whole database each time.
Absolutely amazing. This technology can be used for many other offline databases. He provides the tools for indexing, compressing and everything needed for the reader. Make sure to read his corresponding blog post: <a href="http://antimatter15.com/wp/2011/12/offline-wiki-redux/" rel="nofollow">http://antimatter15.com/wp/2011/12/offline-wiki-redux/</a>
Amazing project, just what I was searching for. A few recommendations:<p>Could you expand the available download options to include an option to download all of Wikipedia, not just a subset of the most popular articles?<p>Right now, mathematical and other kinds of formulae aren't rendered correctly. Is there any way you could fix that?<p>An option to include pictures (maybe compressed or low-res versions) would be neat.<p>Thanks!