I've long held the opinion that the right approach is to store all versions of the binary files on a separate disk/share/website and use git to version only the paths to those files. Then you only need one local copy of the binary archive, and you can reference the individual files a thousand times via git without bloating the repos.

It seems that git-annex does exactly something like that, and apparently even more, while being more sophisticated.
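A minimal sketch of that pointer-file idea, for illustration only: the store location, pointer-file layout, and function names below are assumptions, not how git-annex actually does it (git-annex uses symlinks into .git/annex and a content-addressed key scheme).

```python
# Sketch: keep binary contents in a shared content-addressed store,
# commit only tiny pointer files to git.
import hashlib
import shutil
from pathlib import Path

BINARY_STORE = Path("/mnt/share/binary-store")  # assumed location of the shared archive

def check_in(binary_path: Path, repo_dir: Path) -> Path:
    """Copy a binary into the store and write a small pointer file,
    which is what actually gets committed to git."""
    digest = hashlib.sha256(binary_path.read_bytes()).hexdigest()
    stored = BINARY_STORE / digest
    if not stored.exists():                     # one copy per unique content
        shutil.copy2(binary_path, stored)
    pointer = repo_dir / (binary_path.name + ".ptr")
    pointer.write_text(f"{binary_path.name}\n{digest}\n")
    return pointer                              # a few bytes, so git stays small

def check_out(pointer: Path, dest: Path) -> None:
    """Resolve a pointer file back to the real binary from the store."""
    _, digest = pointer.read_text().split()
    shutil.copy2(BINARY_STORE / digest, dest)
```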
Git is getting better with large files (now that changes to those are streamed to packfiles), but there's still some way to go to make it scale to large numbers of tracked files; https://git.wiki.kernel.org/articles/g/i/t/GitTogether10_bup_bf08.html has some ideas for making the index more scalable.
git-annex is the way to go! I use it to manage my media library and it's 95% awesome (the nitpick being that you can't easily edit files once they're checked in - there ain't no such thing as a free lunch, I guess).