
TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.


© 2025 TechEcho. All rights reserved.

Ask HN: You have 50k USD and want to build an inference rig without GPUs. How?

3 points | by 4k | 3 months ago
This is more of a thought experiment; I am hoping to learn about the other developments in the LLM inference space that are not strictly GPUs.

Conditions:

1. You want a solution for LLM inference and LLM inference only. You don't care about any other general- or special-purpose computing.

2. The solution can use any kind of hardware you want.

3. Your only goal is to maximize (inference speed) × (model size) for 70b+ models.

4. You're allowed to build this with tech most likely available by end of 2025.

How do you do it?
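For condition 3, a useful starting point is that single-stream LLM decoding is typically memory-bandwidth bound: every generated token requires streaming roughly the whole model's weights from memory. A back-of-envelope sketch (the bandwidth and quantization figures below are illustrative assumptions, not a claim about any specific product):

```python
# Rough model: at batch size 1, decode speed is limited by how fast the
# hardware can read the model weights, so
#   tokens/sec ≈ usable memory bandwidth / bytes read per token (≈ model size).
def tokens_per_sec(bandwidth_gb_s: float, params_billions: float,
                   bytes_per_param: float) -> float:
    model_gb = params_billions * bytes_per_param  # total weight bytes in GB
    return bandwidth_gb_s / model_gb

# Hypothetical example: a 70B-parameter model quantized to ~4 bits
# (~0.5 bytes/param) on a machine with ~800 GB/s of usable bandwidth.
speed = tokens_per_sec(800, 70, 0.5)
print(f"~{speed:.1f} tokens/sec")  # ~22.9 tokens/sec
```

This is why non-GPU answers to the question tend to revolve around maximizing aggregate memory bandwidth per dollar (high-channel-count CPUs, unified-memory machines, or accelerators with on-chip SRAM) rather than raw FLOPS.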

1 comment

sitkack | 3 months ago
You wait until someone posts an answer here: https://www.reddit.com/r/LLMDevs/comments/1if0q87/you_have_roughly_50000_usd_you_have_to_build_an/

https://www.phind.com/search/cm6lxx6hw00002e6gioj41wa5