Salad just launched a waitlist for their new Inference API, running atop distributed infrastructure with tens of thousands of GPUs available.

The network is built from consumer nodes and comes at a fraction of the cost of any other provider:
- 6x the inferences per dollar for Stable Diffusion (RTX 3090s)
- 4x the inferences per dollar for BERT (CPUs)

$1k/$100 in credits are available through the waitlist:
salad.com/salad-inference-endpoints
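For a sense of what a call might look like, here's a minimal Python sketch, assuming a REST endpoint that takes a prompt and returns a base64-encoded image. The URL, header names, and payload fields are hypothetical placeholders, not Salad's documented API; check the docs that come with your waitlist access for the real schema.

    # Hypothetical sketch of calling a hosted Stable Diffusion endpoint.
    # Endpoint URL and payload fields are illustrative guesses, not
    # Salad's documented API.
    import base64
    import requests

    API_KEY = "your-api-key"  # issued once you clear the waitlist
    ENDPOINT = "https://api.salad.com/v1/inference/stable-diffusion"  # hypothetical

    resp = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"prompt": "a bowl of salad, studio lighting", "steps": 30},
        timeout=120,
    )
    resp.raise_for_status()

    # Assuming the image comes back base64-encoded in the response body.
    with open("out.png", "wb") as f:
        f.write(base64.b64decode(resp.json()["image"]))

-----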
Salad also has a fully managed container service in beta, accessible at portal.salad.com, but account verification is required to get a container quota above zero (you go through the full setup and are then prompted for more info).