I still remember seeing the announcement of the 3090 during the keynote presentation. People who aren't waist deep in this area and haven't struggled to get a GPU that can actually handle large models (language models, in my case) didn't realize how great a deal the 3090 was. 24 GB of high-speed memory in your desktop GPU for ~$2k is just a remarkably cheap increase to productivity. If your access to cloud resources is constrained, it doesn't need to be the absolute fastest GPU, but you absolutely do need the model to fit in memory; otherwise your training problem typically remains intractable.

Getting one for MSRP is a different story, but I was lucky enough to get one from the EVGA queue just a couple months after release for right around $2k. Interestingly, it also might be my most slowly depreciating piece of computer hardware ever. It's just so overpowered for a consumer that I probably won't need a new GPU for the better part of a decade.