
Maintaining Large Binary Resources in Git

48 points by jgorham over 13 years ago

3 comments

yason over 13 years ago
I've long held the opinion that the right way to go is to store all versions of the binary files on a separate disk/share/website and use git to version the paths to those files. Then you only have to keep one local copy of the binary archive, and you can reference the individual files a thousand times via git without bloating the repos.

It seems that git-annex does exactly something like that, and apparently even more, while being more sophisticated.
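A minimal sketch of the manual path-versioning approach described above, assuming a mounted blob store at /mnt/blobstore (a hypothetical path) and content-addressing by SHA-256; the file names are illustrative:

    # Copy the binary into the shared blob store, named by its content
    # hash, then commit only a small pointer file to git.
    hash=$(sha256sum video.mp4 | cut -d' ' -f1)
    cp video.mp4 "/mnt/blobstore/$hash"
    echo "$hash  video.mp4" > video.mp4.ref
    git add video.mp4.ref
    git commit -m "Track video.mp4 via blob-store pointer"

Checking out an old revision then only requires fetching the one blob named in the pointer file, rather than every historical version of the binary.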
obtu over 13 years ago
Git is getting better with large files (now that changes to those are streamed to packfiles), but there's still some way to go to make it scale to large numbers of tracked files; https://git.wiki.kernel.org/articles/g/i/t/GitTogether10_bup_bf08.html has some ideas for making the index more scalable.
emillon over 13 years ago
git-annex is the way to go! I use it to manage my media library and it's 95% awesome (the nitpick being that you can't easily edit files once they're checked in - there ain't no such thing as a free lunch, I guess).
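The editing nitpick comes from git-annex replacing checked-in files with read-only symlinks into .git/annex. A brief sketch of the unlock-edit-recommit cycle it forces on you, using standard git-annex commands (the file name is illustrative):

    git annex unlock song.mp3   # replace the read-only symlink with a writable copy
    # ...edit song.mp3 as needed...
    git annex add song.mp3      # hand the modified content back to the annex
    git commit -m "Update song.mp3"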