Just got this in my inbox. They haven't updated the FAQs page yet, as far as I can tell.<p>> Hi-<p>We’re improving the Terms of Service that apply to your Colab Pro or Colab Pro+ subscription making them easier for you to understand and improving the ways you can use Colab. The changes will take effect on September 29.<p>The [updated Terms of Service](https://research.google.com/colaboratory/tos_v3.html) include changes that will allow you to have more control over how and when you use Colab, allowing us to offer new services and features that will enhance your experience using Colab.<p>We will increase transparency by granting paid subscribers compute quota via compute units which will be visible in your Colab notebooks, allowing you to understand how much compute quota you have left. These compute units are granted monthly and will expire after 3 months. You will be entitled to a certain number of compute units based on your subscription level and will have the ability to purchase more compute units as needed.<p>Additionally, we will allow paid subscribers to exhaust their compute quota at a much higher rate. This will result in paid subscribers having more flexibility in accessing resources. Read more about these changes at our [FAQ](https://research.google.com/colaboratory/faq.html#compute-units).<p>If you would like to cancel your Colab Pro or Pro+ subscription, you can do that by going to pay.google.com and clicking Subscriptions and services. If you have any trouble canceling, you can email colab-billing@google.com for assistance. Please include an order number from one of your receipt emails if you email us for assistance.<p>-The Colab team
You can really tell, in the comment sections on changes like these, who is speaking from a professional/business perspective vs. a personal use-case.<p>Individuals tend to be upset, while professionals are happy that individual free-riders will no longer be sucking up undue amounts of compute power, and so QoS on the system will improve for them.
Have been experimenting lately with GPUs off vast.ai. It has worked well for Stable Diffusion experiments and is cheap!<p>Any other suggestions for where to rent cheap GPUs? I've heard about Hetzner (<a href="https://www.hetzner.com/sb?search=gpu" rel="nofollow">https://www.hetzner.com/sb?search=gpu</a>), but they only offer 1080s.
Am I the only one who thinks it’s nice they’re being explicit about how much they’re giving you? I found the original ‘however much we have available and feel like giving to you’ plan limit highly unprofessional.<p>I got an A100 after I subscribed, so it worked out for me, but it’s still annoying that you don’t know what you’ll get.
I deeply appreciate Colab. I bought a nice home GPU rig a few years ago, but seldom use it. When I am lightly using Colab I use it for free, and when I have more time for personal research the $10/month plan works really well. I can see occasionally paying for the $50/month plan as the need arises in the future.<p>I am working on an AI book in Python. (I usually write about Lisp languages.) About half the examples will be Colab notebooks and half will be Python examples to be run on laptops.<p>In any case, I like the soon-to-be-implemented changes; it sounds like a good idea to get credits and see a readout of usage and what you have left.
I like the transparency, but this doesn't feel like the right way to do it. Computation should be free (or nearly free) if there's idle capacity, paid if Google is near capacity, and expensive/bidding if Google is above capacity.<p>Flat compute units seem simple, but result in a lot of waste.
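A minimal sketch of the utilization-based pricing being suggested here (purely hypothetical numbers and function, not anything Google has announced):<p><pre><code>def price_per_compute_unit(utilization: float, base_price: float = 0.01) -> float:
    # utilization: fraction of cluster capacity currently in use (0.0 to 1.0+)
    # base_price: assumed cost per compute unit when the cluster is busy
    if utilization < 0.5:        # plenty of idle capacity
        return 0.0               # free (or nearly free)
    elif utilization < 1.0:      # approaching capacity
        return base_price        # flat paid rate
    else:                        # above capacity: surge pricing / bidding
        return base_price * (1 + 2 * (utilization - 1.0))
</code></pre>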
At Paperspace we've long offered an alternative to Google Colab that includes free CPU, GPU, and (recently released) IPU machines.<p>Free notebooks can be run for 6 hours at a time.<p>More info available in docs: <a href="https://docs.paperspace.com/gradient/machines/#free-machines-tier-list" rel="nofollow">https://docs.paperspace.com/gradient/machines/#free-machines...</a>
At last! I love Colab, but the vague promises around availability and quota made it impossible to recommend for my team to use professionally.<p>I even tried and failed to get it up and running with a Google Cloud GPU recently, before just switching to Lambda, which worked the first time (but has since hit availability issues).
Question for the Colab team:<p>The restrictions listed at <a href="https://research.google.com/colaboratory/tos_v3.html" rel="nofollow">https://research.google.com/colaboratory/tos_v3.html</a> differ slightly from the limits listed at <a href="https://research.google.com/colaboratory/faq.html" rel="nofollow">https://research.google.com/colaboratory/faq.html</a>; specifically, tos_v3.html does not mention these items from the FAQ:<p><pre><code> * using a remote desktop or SSH
* connecting to remote proxies
</code></pre>
I can appreciate why those were added - I've read posts and notebooks explaining how you can use ngrok or Cloudflare to do those things in violation of the restrictions in the FAQ, and clearly many people aren't using Colab as intended.<p>Speaking as someone who has been playing around with the Colab free tier with the expectation of moving to a paid service once I know what I'm really doing, I'd like to know if it's likely these restrictions will be eased a bit with the move to a compute credit system.<p>I'm still learning and haven't had a need to do those things <i>yet</i>, but I believe remote SSH access would greatly simplify managing things. The Jupyter interface and integrated Colab debugger are good for experimenting, but I'm worried that as I get closer to production I'll need a way to observe and change the state of long-running Colab processes the way I could with SSH, Ansible, or other existing tooling.<p>Clearly I can build that myself or use something like Anvil Works <a href="https://anvil.works" rel="nofollow">https://anvil.works</a>, but that's time and effort I'd rather avoid if possible. So I'm hoping that the Colab team will ease the SSH restriction for people like me who want to use it for more traditional ops/monitoring of long-running tasks.<p>Do you anticipate any change or easing of the SSH restriction?
I really like the increase in transparency; I found it somewhat disturbing to pay for what feels like a random amount of stuff. How should I know whether I need Pro or Pro+ if there is no estimate out there of what either might get me? The update does not seem to change that, though.
I would love to see a plotted distribution of how much compute I might expect, or at least min/average/max run time until disconnect (right now, only the max is known).
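For example, a rough sketch of the kind of summary I mean, using made-up session lengths that a user would have to log themselves, since Colab doesn't expose this data:<p><pre><code>import statistics

# hypothetical recorded session durations before disconnect, in hours
observed_runtimes = [2.5, 6.0, 11.8, 4.2, 9.1]

print(f"min: {min(observed_runtimes):.1f} h")
print(f"avg: {statistics.mean(observed_runtimes):.1f} h")
print(f"max: {max(observed_runtimes):.1f} h")
</code></pre>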
From the Google Colab product lead:<p>> This has been planned for months, it's laying the groundwork to give you more transparency in your compute consumption, which is hidden from users today.<p><a href="https://twitter.com/thechrisperry/status/1564806305893584896" rel="nofollow">https://twitter.com/thechrisperry/status/1564806305893584896</a>
For those affected who want to run their Stable Diffusion notebooks more, you can always spin up a notebook on Lambda Cloud with A100s for only $1.10/hr. PyTorch, TensorFlow, and Jupyter notebooks are pre-installed:<p><a href="https://lambdalabs.com/service/gpu-cloud" rel="nofollow">https://lambdalabs.com/service/gpu-cloud</a>
Good. Hopefully this will reduce the randomness of GPU-type assignment on the Pro plan.<p>I fine-tuned a model on Colab Pro earlier this year, and having to launch and quit 6 or 7 times to get a faster graphics card so the job would finish within the time limit sucked.<p>I hope this will also give more transparency into whether you are assigned a whole card or a virtual slice of one - something I could never work out before!
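In the meantime, one way to at least see which card and how much memory your session was given (standard PyTorch tooling, nothing Colab-specific):<p><pre><code>import torch

if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # e.g. "Tesla T4" or "A100-SXM4-40GB"
    total_gb = torch.cuda.get_device_properties(0).total_memory / 1e9
    print(f"{total_gb:.1f} GB visible to this runtime")
else:
    print("No GPU assigned to this runtime")
</code></pre>
(Though whether the memory you see reflects a whole card or just a virtual slice of one is exactly the part that still isn't obvious.)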