科技回声

Maintaining Large Binary Resources in Git

48 points | by jgorham | over 13 years ago

3 comments

yason · over 13 years ago
I've long held the opinion that the right way to go is to store all versions of the binary files on a separate disk/share/website and use git to version only the paths to those files. Then you only have to keep one local copy of the binary archive, and you can reference the individual files a thousand times via git without bloating the git repos.

It seems that git-annex does exactly something like that, and apparently even more, while being more sophisticated.
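The pointer-file scheme described above can be sketched with plain git. Everything here is illustrative: the blob-store directory, file names, and the `.ptr` suffix are made-up conventions, not part of git or git-annex.

```shell
#!/bin/sh
set -e

# Hypothetical locations: a blob store outside git, and a fresh repo.
store=$(mktemp -d)
repo=$(mktemp -d)

# The large binary lives only in the external store.
printf 'big binary payload' > "$store/video.mp4"

# Git tracks a small pointer file holding the path to the binary.
cd "$repo"
git init -q
mkdir assets
echo "$store/video.mp4" > assets/video.mp4.ptr
git add assets/video.mp4.ptr
git -c user.name=demo -c user.email=demo@example.com \
    commit -qm "track video.mp4 via pointer file"

# "Checking out" the real file means dereferencing the pointer.
cp "$(cat assets/video.mp4.ptr)" video.mp4
```

git-annex automates the same idea: the checked-in file becomes a symlink into an annex of content-addressed blobs, and the content itself is fetched on demand rather than stored in the git object database.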
obtu · over 13 years ago
Git is getting better with large files (now that changes to those are streamed to packfiles), but there's still some way to go to make it scale to large numbers of tracked files; https://git.wiki.kernel.org/articles/g/i/t/GitTogether10_bup_bf08.html has some ideas for making the index more scalable.
emillon · over 13 years ago
git-annex is the way to go! I use it to manage my media library and it's 95% awesome (the nitpick being that you can't easily edit files once they're checked in; there ain't no such thing as a free lunch, I guess).