Having a limit of 5 million files is perfectly reasonable. Failing to document that such a limit exists and refusing to publicly confirm it (which apparently is STILL the case) is extraordinarily poor customer service/communication.<p>Google KEEPS setting new records for poor customer communication, to the point where I (and much of the HN crowd) now expect it. Android developer banned from the app store? There is no meaningful way to appeal but you'll probably never be able to find out why. Your best hope is to post on HN and hope someone with power at Google notices.<p>Leadership at Google ought to recognize this; they ought to make an effort to improve the channels by which "customers" can communicate with Google. But I see no signs that they are even aware of the issue; I see no effort to change anything.<p>I would try to tell them but... there's no communication channel. Maybe I should post about it on HN.
Ha!<p>“a safeguard to prevent misuse of our system in a way that might impact the stability and safety of the system.”<p>Google: We have identified modern web development as a threat to our systems, and have taken measures to ensure npm users cannot store their node_modules directories on Google Drive. Please consider rewriting your Node.js projects in Go.
Hmm, there was a HN thread about this a few days ago [1] where everyone seemed to attack people for even considering the idea of storing 5M files in a cloud storage solution, going so far as to argue that even <i>disclosing</i> such a limit would be unreasonable to expect.<p>In this thread, the prevailing thought seems to be that having a 5M file limit is unreasonable and adding it without disclosing it is egregious.<p>Just a curious thing I noticed.<p>[1]: <a href="https://news.ycombinator.com/item?id=35329135" rel="nofollow">https://news.ycombinator.com/item?id=35329135</a>
I pay for 5 TB and planned to use the drive to store a copy of my data.<p>Things I store that have lots of files:<p>- The frames for my timelapse videos = 400,000 files<p>- The files in my Eagle app photo database = 400,000 files<p>- Other image files, my programming repositories, documents, music, Stable Diffusion Deforum frames = 400,000 files<p>80% of these files I've accumulated in the last 12 months, and I can see myself easily hitting this 5,000,000 file limit well before I run out of TBs.<p>So now that I know I will never be able to use all the space I'm paying for, I'm going to stop uploading my files and instead search for a proper backup service, something I should have researched in the first place.<p>Anyone here have any recommendations for a backup service?
If the number of users affected is as 'vanishingly small' as a Google spokesman indicated then you'd think they'd be able to contact them - at least the paying customers?
“In practice, the number of impacted users here is vanishingly small."<p>Well, yeah, I imagine they’re moving elsewhere.<p>Seriously though, do people actually trust them not to randomly intentionally break stuff at this point?
I see it speculated, downthread, that this is a response to modern web-dev and node (?) creating millions of files, etc.<p>I can’t comment on that but I <i>do know</i> that modern, encrypted, archive tools such as duplicity and borg and restic “chunk” your source files into thousands (potentially millions) of little files.<p>We see tens of billions of files on our zpools and have “normal” customers breaking 1B … and the cause is typically duplicity or borg.
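For a sense of the mechanics, here is a toy sketch of content-defined chunking, the general idea behind borg's and restic's chunkers. Their real implementations use rolling hashes (buzhash, Rabin fingerprints), and duplicity splits into fixed-size volumes instead; everything below is illustrative, not any tool's actual algorithm:

```python
import hashlib
import os

MIN_SIZE = 2048            # never cut a chunk shorter than this
AVG_BITS = 20              # ~1 MiB average chunk size (2**20 bytes)
MASK = (1 << AVG_BITS) - 1

def chunks(data: bytes):
    """Cut data at content-defined boundaries: a chunk ends wherever the
    low AVG_BITS bits of a running hash are all ones, so identical content
    after a cut point produces identical chunks."""
    start, h = 0, 0
    for i, b in enumerate(data):
        h = (h * 31 + b) & 0xFFFFFFFF   # toy hash, resets at each boundary
        if i - start >= MIN_SIZE and (h & MASK) == MASK:
            yield data[start:i + 1]
            start, h = i + 1, 0
    if start < len(data):
        yield data[start:]

# Each chunk becomes its own object, named by content hash -- which is
# exactly how one big source file turns into many small stored files.
data = os.urandom(64 * 1024 * 1024)     # stand-in for a 64 MiB source file
blobs = {hashlib.sha256(c).hexdigest(): c for c in chunks(data)}
print(len(blobs), "chunk objects from one source file")
```

At ~1 MiB per chunk, a 1 TiB backup is already about a million stored objects, so hitting a 5M file cap takes no hoarding at all.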
Good reminder once again that "the cloud is just someone else's computer"!<p>In my experience, GDrive is a piece of crap with a lot of weird behaviors and easy ways to lose your data if you sync your computer with it.<p>The worst part here, as multiple people have said, is not that a limit exists. A limit on their service is fair. It is that the limit is undocumented, while their key selling point is to shout everywhere that if you pay you will have "unlimited" storage, and that it will scale more easily than your own "not cloud" backups.
Once again: Don’t use Google for anything crucial or critical. Not Google Cloud, Google Docs, Google Drive; even Gmail is becoming a liability.<p>Real Engineering involves developing forward-looking designs and maintaining backwards compatibility. It involves a release schedule. It involves communication channels and release notes. It’s hard. It’s unsexy.<p>Google treats their product lineup with the seriousness of a social media platform. They don’t care about your puny business; even if it means the world to you, it means nothing to them.
"Vanishingly small": a number of users small enough to be downplayed, but large enough so that neither an individual approach to the problem would work, nor that the problems could be ignored. Suspected to be a complex number.
Does anyone know how this works legally? You buy a service, and suddenly, without notice, the service changes features. Does the small print allow for that? And how is this 'ok' in software but probably not anywhere else? (Pretty sure a service contract for an elevator doesn't allow the service company to just say "we're going to limit the number of times your elevator goes up and down to 100 times a day now.")
Some people will let technical limitations define a product. Others will have the product dictate the technical design. This, to me, is an example of the former.<p>I don't know the serverside implementation of Google Drive but imagine the files on your Drive correspond to files on something like an ext4 filesystem. In this scenario, each file has a cost (eg in the inode table) and there is wastage depending on what your block size is. Whatever the case, Drive seems to treat files as first-class objects.<p>Compare this to something like Git. In Git (and other DVCSs like Mercurial), you need to track changes between files so the base unit is not files, it's the repo itself. It maps to your local filesystem as files but Git is really tracking the repo as a whole.<p>So if you were designing Google Drive, you could seamlessly detect a directory full of small files and track that directory as one "unit" if you really wanted to. That would be the way you make the product dictate the design.
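As a rough illustration of the "directory as one unit" idea (a hypothetical format, nothing like Drive's or Git's actual on-disk layout), packing a directory of small files into a single addressable blob plus an index is not much code:

```python
import json
import pathlib

def pack_dir(root: str) -> bytes:
    """Bundle every file under root into one blob: an 8-byte header length,
    a JSON index of {relative_path: (offset, size)}, then the raw payload."""
    index, payload = {}, bytearray()
    for path in sorted(pathlib.Path(root).rglob("*")):
        if path.is_file():
            data = path.read_bytes()
            index[str(path.relative_to(root))] = (len(payload), len(data))
            payload += data
    header = json.dumps(index).encode()
    return len(header).to_bytes(8, "big") + header + bytes(payload)

def read_file(pack: bytes, name: str) -> bytes:
    """Random access to one logical file inside the packed blob."""
    hlen = int.from_bytes(pack[:8], "big")
    index = json.loads(pack[8:8 + hlen])
    offset, size = index[name]
    body = pack[8 + hlen:]
    return body[offset:offset + size]
```

Reads of individual files still work through the index, but the storage backend only ever sees one object per directory, which is roughly what Git's packfiles do for loose objects.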
Very interesting that Google chose to do this instead of fixing the software that caused the limitation. No wonder their products are seen as a joke in the business world.
The challenge with running cloud storage is that you have to think around the corners for usage and shape customer behavior with pricing. Seems like Google didn't want to do this or was too lazy (sorry). Millions of files will always be a problem: the metadata costs more for these users, it's impossible to manage, hard to clean up, etc.<p>The problem with Google is that when they fuck up their service, they make it the customer's problem. At other places a fuckup is viewed more as a one-way door: you can sunset an old offering (in this case, unlimited file counts), but you never impose a new restriction on existing customers.
My initial thought is that the ones who are surprised are exactly the people for whom a notification email would have gone unnoticed anyway.<p>Like my dad, who has 300+ unread emails with who knows how many gigs of attachments.
I wonder if you could create a block-level virtual filesystem backed by Google Drive so that you could store many small logical files in one physical remote "block" (file).
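Something like this, perhaps — a minimal sketch where `upload`/`download` are hypothetical stand-ins for whatever Drive client you'd actually use, with a loopback filesystem image or FUSE layer sitting on top of the device:

```python
BLOCK_SIZE = 4 * 1024 * 1024  # one remote file per 4 MiB logical block

class DriveBlockDevice:
    """Map fixed-size logical blocks onto remote files. The filesystem
    above this layer can hold millions of small files; the remote side
    only ever sees one file per block."""

    def __init__(self, upload, download):
        self.upload = upload        # upload(name: str, data: bytes) -> None
        self.download = download    # download(name: str) -> bytes | None
        self.cache = {}             # naive write-through cache

    def _name(self, n: int) -> str:
        return f"block-{n:08d}.bin"

    def read_block(self, n: int) -> bytes:
        if n not in self.cache:
            remote = self.download(self._name(n))
            self.cache[n] = remote or b"\0" * BLOCK_SIZE  # sparse default
        return self.cache[n]

    def write_block(self, n: int, data: bytes) -> None:
        assert len(data) == BLOCK_SIZE
        self.cache[n] = data
        self.upload(self._name(n), data)
```

At 4 MiB per block, even a completely full 5 TB drive is only about 1.3 million block files, comfortably under the cap no matter how many logical files live inside the image.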
Seems like an Engineering issue more than a User issue. They could just take the node_modules folders and zip them up behind the scenes without changing the user interaction.
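Conceptually it's a few lines on the sync-client side. A sketch, with the 10,000-file threshold and archive naming made up for illustration:

```python
import pathlib
import zipfile

BUNDLE_THRESHOLD = 10_000  # arbitrary cutoff for "too many small files"

def prepare_for_sync(dirpath: pathlib.Path) -> pathlib.Path:
    """Return what the sync client should upload: the directory itself,
    or a single zip archive if the directory would explode the file count."""
    files = [p for p in dirpath.rglob("*") if p.is_file()]
    if len(files) < BUNDLE_THRESHOLD:
        return dirpath                                   # sync as-is
    archive = dirpath.parent / (dirpath.name + ".bundle.zip")
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for p in files:
            zf.write(p, p.relative_to(dirpath))          # keep relative paths
    return archive                                       # one object, not thousands
```

The user still sees their folder; the backend sees one object per bundled directory.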
This is why despite G Suite being in many ways a superior product, it's made almost no inroads in Corporate America vs. Microsoft Office. Enterprises need to be able to specify a business workflow and depend on it, and if there are nasty surprises it fucks with their money.<p>Microsoft software is much worse than many competitors but it's documented, the behavior doesn't change suddenly, and it's backwards compatible.
Rclone users noticed this new limit back in February.<p>Here is a thread discussing it on the rclone forum:<p><a href="https://forum.rclone.org/t/new-limit-unlocked-on-google-drive/36136" rel="nofollow">https://forum.rclone.org/t/new-limit-unlocked-on-google-driv...</a><p>It would be nice to have official confirmation of the limit rather than relying on speculation.
I know Google employees are reading this ... don't you recognise this sh!tshow and communicate it internally?<p>I'll never understand how such a large organisation can let this kind of stuff happen.
Wait, is this new?<p>I believe Google Drive for Workspace has always had a file count limit, and IIRC it's as low as 500k or something, despite having "unlimited" capacity.<p>To be totally fair to Google, I know this precisely because there are communities of data hoarders that actively abuse various cloud storage services. In Google Drive's case, they have ways to create "free" Google Workspace accounts via registration exploits at various institutions. People use them to store PB-level data.<p>(For the interested, there are also ways to apply for free MS developer accounts that are supposed to expire in 3 months but can be refreshed indefinitely. This comes with 5TB of "free" cloud storage x 5 (10?) separate sub-accounts.)
I wonder what jury-rigged solution might lead to hitting the 5M limit? I can't believe it's just digital hoarding. In the end, hoarders know better and keep things in zip archives.
I wonder how that works for companies using Google Workspace. My company has close to six figures of Workspace users, I believe; I'd think we collectively store way more than a few million files.
Timely reminder of this great GitHub repo:<p><a href="https://github.com/awesome-selfhosted/awesome-selfhosted">https://github.com/awesome-selfhosted/awesome-selfhosted</a>
So just don’t be one of the vanishingly small number of paying users this affects. Easy.<p>I wonder what vanishingly small means here. Even 0.001% of a billion users is still ten thousand people.