You can see the most downloaded models on the Hugging Face transformers hub here: <a href="https://huggingface.co/spaces/huggingface/transformers-stats" rel="nofollow">https://huggingface.co/spaces/huggingface/transformers-stats</a><p>If I'm not mistaken, these are some of the best SOTA language models to try:<p>- Flan-T5<p>- OPT<p>- DistilBERT<p>- GPT-J<p>Look into PEFT to make fine-tuning fit on a consumer-grade device. You'll have to roll your own RLHF though. :)
Been asked here before, but this was interesting:<p><a href="https://old.reddit.com/r/programming/comments/szqq5m/gptj_is_selfhosted_opensource_analog_of_gpt3_how/" rel="nofollow">https://old.reddit.com/r/programming/comments/szqq5m/gptj_is...</a>