A lot of discussions about how git repos are supposed to be small are totally missing the point. This storage quota applies to everything, including release artifacts, containers, etc. Forget containers or CI artifacts on every commit; let's look at a very common scenario: using goreleaser to build binaries and deb/rpm/etc. packages for multiple architectures on every release. That way a moderately sized Go project can easily consume 50-100MB or more per release, which gives you at most 50-100 releases total across all your projects.<p>Using hosted GitLab for open source projects is looking less and less appealing.<p>I also posted about issue trackers on gitlab.com not allowing search without signing in a while back: <a href="https://news.ycombinator.com/item?id=32252501" rel="nofollow">https://news.ycombinator.com/item?id=32252501</a><p>Edit: An open source program that upgrades the quota is mentioned elsewhere in the thread: <a href="https://about.gitlab.com/solutions/open-source/" rel="nofollow">https://about.gitlab.com/solutions/open-source/</a> I don’t use hosted GitLab for my open source work, so no idea how many people get approved.
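For anyone who hasn't used goreleaser: a minimal `.goreleaser.yaml` sketch of that scenario (the matrix and package formats here are illustrative choices, not from any particular project). Three OSes times two architectures yields six archives per release before the Linux packages are even counted:

```yaml
# Illustrative goreleaser matrix: 6 binaries per release, plus deb/rpm packages.
builds:
  - goos: [linux, darwin, windows]
    goarch: [amd64, arm64]
nfpms:
  - formats: [deb, rpm]
```

With a typical Go binary weighing in at several MB even stripped, the 50-100MB per release figure follows quickly.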
I guess this is the next step to reduce costs after the brakes were put on the leaked "let's delete old OSS repositories" plan.<p>For comparison, I think GitHub just has a cap of 100MB on any single individual file, plus:<p>> We recommend repositories remain small, ideally less than 1 GB, and less than 5 GB is strongly recommended. Smaller repositories are faster to clone and easier to work with and maintain. If your repository excessively impacts our infrastructure, you might receive an email from GitHub Support asking you to take corrective action. We try to be flexible, especially with large projects that have many collaborators, and will work with you to find a resolution whenever possible.<p>Which is a bit wishy-washy, but sounds like there's room for discretion / exceptions to be made there rather than a hard cap at 5GB.
If you build an image for testing on every commit and don't have a retention policy set up, you could be using a massive amount of space without realizing it. I can see why they did this.
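For job artifacts at least, the retention knob is a one-liner in `.gitlab-ci.yml`; here's a sketch (the job name, paths, and one-week window are illustrative choices):

```yaml
# Sketch of a per-commit build job whose artifacts clean themselves up
# instead of accumulating forever.
build:
  script:
    - make dist
  artifacts:
    paths:
      - dist/
    expire_in: 1 week
```

Note that container registry images aren't covered by `expire_in`; those need a cleanup policy configured separately in the project's registry settings.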
GitLab team member here. Impacted users are being notified via email, and in-app notifications will begin 2022-08-22; so far we've contacted 30,000 users. Only GitLab SaaS users are impacted: the limits do not apply to self-managed instances.
Honestly, for me the bigger issue is the upcoming bandwidth limit.<p>There is no clear date as to when it's going to be enforced.<p>But its definition is so wide it's crazy: it's basically any egress data except the web interface and shared runners.<p>AFAIK it will also include git clones, so if your project suddenly gets popular, your users' clones will cost you money too.<p>Also, if you use your own runner, cloning the repo to the runner will be counted against your bandwidth limit as well.<p>And since it should apply to GitLab Pages, Pages becomes useless for anything you expect to get more than a few visits.<p>Since GitLab is behind Cloudflare, you might as well just use Cloudflare Pages at this point.
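To put the 10GB/month free-tier egress figure (mentioned elsewhere in the thread) in perspective, a back-of-the-envelope sketch, where the repo size is an assumption:

```python
# Rough egress budget: how many full clones fit in a 10 GB monthly cap?
egress_cap_gb = 10
clone_size_mb = 50          # assumed pack size of a modest repo
clones_per_month = (egress_cap_gb * 1000) // clone_size_mb
print(clones_per_month)     # 200 clones per month, and CI checkouts count too
```

Two hundred clones a month is not "suddenly popular" territory; a single front-page link could burn through that in an afternoon.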
Am I reading it right that the original Free Tier had a quota of 45,000GB? That seems absurdly high and not very sustainable (hence the change I assume).
Git excels at tracking human keyboard output. A productive developer might write 100KB of code annually, so a git repo can represent many developer-years of collaborative effort in just a few MB. That is, unless you require git to track large media files, third-party BLOBs, or build output.<p>However, sometimes tracking these things is necessary, and since there isn't an obvious companion technology to git for caching large media assets ("blob hub?") or tracking historical build output ("release hub?"), devs abuse git itself.<p>I wish there were a widely accepted stack that would make it easy to keep the source in the source repo and track the multi-GB blobs by reference.
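Git LFS is the closest existing piece of that "track by reference" stack. Running `git lfs track` writes entries like these into `.gitattributes`, so the repo versions small pointer files while the blobs live in separate LFS storage (the patterns below are just examples):

```
# Lines that `git lfs track "*.psd" "*.mp4"` writes into .gitattributes;
# the repository then stores small pointer files instead of the blobs.
*.psd filter=lfs diff=lfs merge=lfs -text
*.mp4 filter=lfs diff=lfs merge=lfs -text
```

It's not a full answer, though: hosts typically count LFS storage against the same quotas, so it helps clone size more than it helps the bill.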
What's the point of tapering it down in stages like this? Between the October 19th quotas and the October 20th quotas, if you wait until the last minute, you have 24 hours to move 37.5TB of data, then 4 more days to move another 7TB; does that actually help anyone? Getting that much data out at that speed seems unrealistic. Why not just say "the quota will be 5GB on November 9th" and be done with it?
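The sustained transfer rate that 24-hour window implies makes the point; a quick sketch:

```python
# What sustained rate does "move 37.5 TB in 24 hours" require?
data_tb = 37.5
seconds = 24 * 3600
gbit_per_s = data_tb * 1e12 * 8 / seconds / 1e9
print(round(gbit_per_s, 1))   # ~3.5 Gbit/s, sustained for the full day
```

That's saturating a multi-gigabit link for 24 hours straight, before any rate limiting on GitLab's side.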
A few notes<p><pre><code> - If I tag a docker image with multiple tags, and then push it to Gitlab, each tag counts towards the storage limits even though the SHAs are identical. E.g. a 100MB container tagged with "latest" and "v0.5" uses 200MB of storage.
- The storage limit is not per repository, but per namespace. So 5GB free combined for all repositories under your user. If you create a group, then you get 5GB free combined for that group. Does this include forks? Does this include compression server side?
- The 10GB egress limit per month includes egress to self-hosted Gitlab Runners in free tier. Consider this with the 400 minutes per month limit on shared runners.
</code></pre>
These limits feel less like curbing abuse and more like squeezing to see who will jump to premium while reducing operating costs. Is this a consequence of Gitlab hosting on GCP, with its associated egress and storage costs? Is this a move to improve financials / justify a market cap with fiscal storm clouds on the horizon? Is this being incentivized by $67m in awarded stock between the CFO and 2 directors?<p>Stock history over last year for GTLB (since IPO in 2021?): <a href="https://yhoo.it/3QaExCs" rel="nofollow">https://yhoo.it/3QaExCs</a><p>From the golden era of 2015: <a href="https://about.gitlab.com/blog/2015/04/08/gitlab-dot-com-storage-limit-raised-to-10gb-per-repo/" rel="nofollow">https://about.gitlab.com/blog/2015/04/08/gitlab-dot-com-stor...</a><p>> To celebrate today's good news we've permanently raised our storage limit per repository on GitLab.com from 5GB to 10GB. As before, public and private repositories on GitLab.com are unlimited, don't have a transfer limit and they include unlimited collaborators.
I didn't even know there was no storage limit - that seems like an immediate way to get your platform used to store non-code data in very large quantities.
5GB isn't much different than the storage limits of other services, but their storage pricing is atrocious. I've seen the writing on the wall for a while and watched as GitLab went from being the cool open source alternative to GitHub to becoming a bloated oversized mess. I know several popular open source projects were offered premium tier upgrades for free. I am curious to see if these changes, especially transfer limits, will impact them enough to move away.
My Qt/C++ cross-platform FOSS Wallpaper Engine project[1] currently uses 47GB of storage. This is because I compile for every platform and store the artifacts for 4 weeks. Not sure what I will do in the future, because having older builds around to try out without recompiling is always nice.<p>[1] <a href="https://gitlab.com/kelteseth/ScreenPlay" rel="nofollow">https://gitlab.com/kelteseth/ScreenPlay</a>
Perhaps now is a good time to recommend the ever-popular BFG to anyone unaware:
<a href="https://rtyley.github.io/bfg-repo-cleaner/" rel="nofollow">https://rtyley.github.io/bfg-repo-cleaner/</a><p>Also my team's biggest repo is a 2.5 GB checkout but gitlab (self-managed) reports it as 185MB "files" and 353 MB "storage" (no CI/CD artifacts).
Seems like a buried lede here is that limits also now apply to paid accounts. Just checked my team’s name space: we have 700GB of storage used, and gitlab is going to start charging us $0.50/month/GB for everything in excess of 50 GB. On top of the hundreds of $/month we’re already paying in per-seat pricing. That seems absurdly expensive.
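At the stated overage price, that works out to (numbers from this comment):

```python
# Monthly overage bill at $0.50/GB past the 50 GB included in a paid namespace.
used_gb, included_gb, per_gb = 700, 50, 0.50
overage_cost = (used_gb - included_gb) * per_gb
print(overage_cost)   # 325.0 dollars/month, on top of per-seat pricing
```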
I'm assuming this is only for the ones they host and not the self-hosted solution. It's insane that anyone uploads terabytes of data into GitLab; is there an actual valid use case that isn't illegal content or a questionable backup strategy? Is there some big ass GIS open source project out there that could use the attention of GitLab before they nuke some vital data somehow?
Anybody found out which projects on gitlab exceed the 45 TB limit?<p>I'm curious what kind of project would even need such a repository size. From a distant view this sounds like heavily mismanaged build artifacts in the project's git history; or abused storage for free CDN of video data or similar.
This doesn't affect me, but a better way to handle this would be to sell extra storage at, say, double GitLab's cost. Digital Ocean sells 250 GB object storage at $5/month and $0.02/GB beyond that.
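For comparison, the per-GB numbers (Digital Ocean pricing as quoted in this comment, GitLab's overage rate as quoted elsewhere in the thread):

```python
# GitLab's quoted $0.50/GB overage vs. Digital Ocean object storage base pricing.
gitlab_per_gb = 0.50
do_per_gb = 5 / 250                       # $5/month for 250 GB
print(do_per_gb)                          # 0.02
print(round(gitlab_per_gb / do_per_gb))   # 25x DO's base rate
```

Even doubling DO's rate to cover margin would leave GitLab an order of magnitude cheaper than what they're proposing to charge.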
So, we either go to Github, where our licenses are abused for their shitty ML.<p>Or we pay $20/month to Gitlab. And I can't figure out how the quotas will intersect with "professional", if at all.<p>For us Open Source devs, neither is a good option. Although I have heard good things about sr.ht / sourcehut. And for the service, it appears to be fair <a href="https://sourcehut.org/pricing/" rel="nofollow">https://sourcehut.org/pricing/</a>
Gitlab just can't stop shooting themselves in the foot.<p>Driving away individuals is apparently their strategy now.<p>Sad. I used to get new developers started with Gitlab Pages.
Reading between the lines, it also says they're going to enforce a 10GB limit on paid tiers.<p>> Namespaces on a GitLab SaaS paid tier (Premium and Ultimate) have a storage limit on their project repositories. A project’s repository has a storage quota of 10 GB.<p>Even though it's not mentioned as a change or in the timeline, that limit does not exist currently.
With this change, the 5 user limit[1], and original intent to delete dormant repositories[2][3], it seems as though GitLab is no longer able to support the free side of its business. GitLab has been touted as more OSS-friendly than GitHub, but a large part of the OSS ecosystem depends on free repositories. With these changes and this trajectory, I can't see myself putting another OSS project on GitLab.<p>It's a shame it's come to this, but I'm confident GitLab didn't make this choice lightly. It must be done in order for them to stay afloat.<p>Thank you GitLab team for your efforts. I hope you guys are successful in your future endeavors.<p>[1] <a href="https://about.gitlab.com/blog/2022/03/24/efficient-free-tier/" rel="nofollow">https://about.gitlab.com/blog/2022/03/24/efficient-free-tier...</a><p>[2] <a href="https://www.theregister.com/2022/08/04/gitlab_data_retention_policy/" rel="nofollow">https://www.theregister.com/2022/08/04/gitlab_data_retention...</a><p>[3] <a href="https://www.theregister.com/2022/08/05/gitlab_reverses_deletion_policy/" rel="nofollow">https://www.theregister.com/2022/08/05/gitlab_reverses_delet...</a>