TechEcho

Ask HN: How do you handle large files in Git Repos under development?

2 points | by johncole | over 7 years ago
Our development team is facing this issue right now: we need to include some larger binaries (pip and Debian install files) in our GitHub repo. They're not likely to change much over the coming years, but they're necessary since we run our updates offline and need the packages to be easily accessible for testing and development. To test and release, right now we simply click the green "Download ZIP" button on GitHub. So we're balancing two issues: putting the binaries right in the repo will eventually bloat it, while keeping them aside and packaging them separately means an extra step for testing, and someone will eventually forget or botch that step.

Anyone have any experience with this? We've looked at the following:

1. Submodules - This seems almost perfect, but requires some change in developer behavior. We are used to downloading a test zip package for offline testing from the GitHub website.

2. Batch/Bash File Packaging - Writing batch or bash scripts that automatically package the larger files with the repo before we use it for testing or release.

3. LFS - Git Large File Storage - We haven't really looked at the pros and cons here much. Any experience?
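For option 3, a minimal sketch of what LFS tracking amounts to: `git lfs track "*.deb"` just writes patterns into `.gitattributes`. The demo below writes those lines by hand so it only needs core git (actually storing files via LFS requires the git-lfs extension to be installed), then uses `git check-attr` to confirm which files would be routed through the LFS filter. The filenames are hypothetical.

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q

# `git lfs track "*.deb" "*.whl"` would normally generate these lines;
# written by hand here so the demo needs only core git.
cat > .gitattributes <<'EOF'
*.deb filter=lfs diff=lfs merge=lfs -text
*.whl filter=lfs diff=lfs merge=lfs -text
EOF

# Confirm a Debian package would be routed through the LFS filter.
git check-attr filter -- installer.deb
```

One caveat for the workflow described above: by default, GitHub's "Download ZIP" archives contain the small LFS pointer files rather than the actual binaries, so LFS alone wouldn't fully replace the click-to-download-and-test flow.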

1 comment

stephenr | over 7 years ago
... why do you want .deb files in git? Build them and put them in an apt repo.
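A minimal sketch of that suggestion, with a hypothetical package name and metadata: build the `.deb` with the standard `dpkg-deb` tool, then publish it to an apt repository instead of committing it to git. The `reprepro` publish step is shown as a comment since it needs a configured repo on the server side.

```shell
set -e
work=$(mktemp -d)
mkdir -p "$work/mypkg/DEBIAN"

# Minimal control file; package name and maintainer are made up for illustration.
cat > "$work/mypkg/DEBIAN/control" <<'EOF'
Package: mypkg
Version: 1.0
Architecture: all
Maintainer: Dev Team <dev@example.com>
Description: Example offline package
EOF

# Build the .deb from the staged directory tree.
dpkg-deb --build "$work/mypkg" "$work/mypkg_1.0_all.deb"

# Then publish to an apt repo (e.g. with reprepro) rather than git:
#   reprepro -b /srv/apt includedeb stable mypkg_1.0_all.deb
```

Offline machines can then pull from a mirror of that repo, which keeps the binaries versioned and accessible without bloating the git history.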