Hi,<p>These days when I do literature reviews on trending research, I find that most of the useful models need an insane amount of GPU power to train. As a student in a small lab, we don't have much GPU capacity. Is it still meaningful for me to stay in this lab and learn deep learning? I feel like all my work is just a toy... feeling lost, any advice?
You need GPUs to train models. Right now that's just the way it is; you can run inference on weaker hardware, but training is plainly unforgiving.<p>Thankfully, you can get pretty good results finetuning on a much smaller and cheaper GPU like a 3060 Ti. Going from GPU-poor to GPU-rich is easier than you might think.
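<p>For what it's worth, here is a rough sketch of the kind of parameter-efficient finetuning (LoRA via the Hugging Face peft library) that fits on a consumer card; the base model name and hyperparameters below are placeholders, not a specific recommendation:<p>
    # Minimal LoRA finetuning sketch, assuming an 8 GB consumer GPU.
    # Model name and LoRA hyperparameters are illustrative placeholders.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import LoraConfig, get_peft_model

    model_name = "gpt2"  # swap in whatever base model your lab actually uses
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16)

    # LoRA freezes the base weights and trains small low-rank adapters,
    # so gradients and optimizer state stay tiny compared to full finetuning.
    lora_config = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.05, task_type="CAUSAL_LM")
    model = get_peft_model(model, lora_config)
    model.print_trainable_parameters()  # usually well under 1% of total params
<p>From there you plug the wrapped model into whatever training loop or Trainer you already have; the point is that the trainable footprint is small enough that VRAM stops being the bottleneck.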
Apparently this is true of nearly all academic labs, e.g. <a href="https://twitter.com/Yuchenj_UW/status/1789319531430912042" rel="nofollow">https://twitter.com/Yuchenj_UW/status/1789319531430912042</a>.