vast.ai is a popular and economical option. A single 3090 (24 GB) will host a 70B reasonably well with partial offloading; two will fit a quantized 70B completely in VRAM.

Another thing I suggest is hosting on the AI Horde with koboldcpp, if the UI/API works for you and the finetune is appropriate for public use. You get priority access to your own host, and fulfilling other people's prompts in its spare time earns you kudos, which you can spend to try other models people are hosting or to get more burst throughput.

https://lite.koboldai.net/#
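
If you go the Horde route, the kudos flow on the client side is just an async request plus polling. Below is a rough sketch, assuming the public AI Horde v2 text endpoints (/generate/text/async and /generate/text/status/{id}) and the anonymous "0000000000" key; the field names and the commented-out model name are placeholders from my reading of the API docs, so double-check them against the current docs before relying on them.

    # Minimal sketch: submit a prompt to the AI Horde and poll for the result.
    # Endpoint paths and fields are assumptions based on the public v2 API docs.
    import time
    import requests

    HORDE = "https://aihorde.net/api/v2"
    HEADERS = {
        "apikey": "0000000000",           # anonymous key; use your own to earn/spend kudos
        "Client-Agent": "example:1:anon", # identify your client to the Horde
    }

    # Submit an async text generation job. "models" is optional; omitting it
    # lets any worker pick the job up. The model name here is a placeholder.
    payload = {
        "prompt": "Once upon a time",
        "params": {"max_length": 80, "max_context_length": 1024},
        # "models": ["some-70b-finetune"],
    }
    job = requests.post(f"{HORDE}/generate/text/async", json=payload, headers=HEADERS).json()
    job_id = job["id"]

    # Poll until a worker has fulfilled the request.
    while True:
        status = requests.get(f"{HORDE}/generate/text/status/{job_id}", headers=HEADERS).json()
        if status.get("done"):
            break
        time.sleep(2)

    for gen in status.get("generations", []):
        print(gen["text"])

Workers running koboldcpp pick jobs like this up and get paid in kudos, which is where the extra burst throughput comes from when you need it.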