
10k+ GPUs for Inference Jobs (waitlist giveaway)

3 points by bobjmiles over 2 years ago

2 comments

bobjmiles over 2 years ago
Salad just launched a waitlist for their new Inference API running atop distributed infrastructure, with tens of thousands of GPUs available.

The network is built from consumer nodes and comes at a fraction of the cost of any other provider:
- 6x inferences per dollar for Stable Diffusion (3090)
- 4x inferences per dollar for BERT (CPUs)

$1k/$100 in credits are available on the waitlist: salad.com/salad-inference-endpoints

Salad also has a fully managed container service in beta, accessible at portal.salad.com, but account verification is required to get a container quota >0 (you go through the full setup and are then prompted for more info).
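For readers wondering how a figure like "6x inferences per dollar" is derived, here is a minimal sketch: sustained throughput converted to inferences per hour, divided by the hourly price. All throughput and price numbers below are hypothetical placeholders, not actual Salad or public-cloud rates.

```python
# Hypothetical sketch: comparing "inferences per dollar" between providers.
# All prices and throughput figures are made-up placeholders.

def inferences_per_dollar(throughput_per_sec: float, hourly_price_usd: float) -> float:
    """Inferences completed per dollar spent, given sustained throughput."""
    inferences_per_hour = throughput_per_sec * 3600
    return inferences_per_hour / hourly_price_usd

# Placeholder numbers for a Stable Diffusion-style workload on a consumer 3090
# versus a generic public-cloud GPU instance.
salad_ipd = inferences_per_dollar(throughput_per_sec=0.5, hourly_price_usd=0.20)
cloud_ipd = inferences_per_dollar(throughput_per_sec=0.5, hourly_price_usd=1.20)

print(f"Distributed consumer nodes (placeholder): {salad_ipd:.0f} inferences/$")
print(f"Public cloud (placeholder):               {cloud_ipd:.0f} inferences/$")
print(f"Ratio: {salad_ipd / cloud_ipd:.1f}x")
```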
boxerbk over 2 years ago
Do you have an easy way to see the cost differential between this and public cloud alternatives?
Comment #34149188 not loaded.