Our development team is facing this issue right now: we need to include some larger binaries (pip and Debian install files) with our GitHub repo. They're unlikely to change much over the coming years, but they're necessary because we run our updates offline and need the packages easily accessible for testing and development. To test and release, right now we simply click the green "Download ZIP" button on GitHub.

So we're balancing two issues: putting the binaries right in the repo will eventually bloat it, while putting them aside and packaging them separately means an extra step for testing, and someone will eventually forget or screw up that step.

Anyone have any experience with this? We've looked at the following:

1. Submodules - This seems almost perfect, but requires some change in developer behavior: we're used to downloading a test zip package for offline testing from the GitHub website. (Rough sketch of the workflow after the list.)
2. Batch/Bash File Packaging - Writing scripts that automatically package the larger files together with the repo before we use it for testing or release. (Sketch after the list.)
3. LFS - Git Large File Storage - We haven't really looked into the pros and cons here much. Any experience? (Basic setup sketched after the list.)
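For option 1, here's a minimal sketch of the submodule workflow; the repo names and the vendor/binaries path are hypothetical. One caveat that matters for your current process: GitHub's "Download ZIP" archive does not include submodule contents, only an empty directory, so testers would have to clone instead:

    # One-time setup: add the binaries repo as a submodule
    # ("our-org/offline-binaries" and vendor/binaries are made-up names)
    git submodule add https://github.com/our-org/offline-binaries.git vendor/binaries
    git commit -m "Track offline binaries as a submodule"

    # Developers clone with submodules instead of using Download ZIP:
    git clone --recurse-submodules https://github.com/our-org/main-repo.git

    # Or, to fetch the submodule contents in an existing clone:
    git submodule update --init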
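For option 2, a rough sketch of what the packaging script could look like, assuming the binaries live in a shared directory outside the repo; every path and name here is made up. It automates the extra step, but as you say, it only helps if people reliably run it (or it's wired into CI):

    #!/usr/bin/env bash
    # package.sh - bundle the repo plus the offline binaries into one zip
    # (all paths are hypothetical)
    set -euo pipefail

    BINARIES_DIR=/mnt/shared/offline-binaries
    STAGE=$(mktemp -d)
    OUT=$PWD/release.zip

    # Export the current HEAD of the repo into a staging directory
    git archive HEAD | tar -x -C "$STAGE"

    # Drop the binaries alongside the code
    cp -r "$BINARIES_DIR" "$STAGE/binaries"

    # Produce a single self-contained zip for offline testing
    (cd "$STAGE" && zip -rq "$OUT" .)
    rm -rf "$STAGE"
    echo "Wrote $OUT"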
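For option 3, the basic setup is small; these are the standard git-lfs commands, though the tracked patterns and file names are assumptions on my part. Two things to check before committing to it: GitHub charges for LFS storage and bandwidth beyond a free quota, and unless the repository is configured to include LFS objects in archives, "Download ZIP" may give you small pointer files instead of the actual binaries:

    # One-time per machine
    git lfs install

    # Track the large file types; this writes patterns to .gitattributes
    git lfs track "*.deb" "*.whl"
    git add .gitattributes

    # From here on, adds and commits work as usual; LFS stores the content
    git add packages/example.deb    # hypothetical file
    git commit -m "Add offline packages via LFS"
    git push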