
Ask HN: You have 50k USD and want to build an inference rig without GPUs. How?

3 points | by 4k | 3 months ago
This is more like a thought experiment, and I am hoping to learn about other developments in the LLM inference space that are not strictly GPUs.

Conditions:

1. You want a solution for LLM inference and LLM inference only. You don't care about any other general- or special-purpose computing.

2. The solution can use any kind of hardware you want.

3. Your only goal is to maximize (inference speed) × (model size) for 70b+ models.

4. You're allowed to build this with tech most likely available by end of 2025.

How do you do it?
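One way to frame condition 3 before picking hardware: at batch size 1, decoding a dense model is memory-bandwidth bound, because every weight must be streamed from memory for each generated token, so tokens/sec is roughly bandwidth divided by model size in bytes. Below is a minimal sketch of that arithmetic; the bandwidth figures and the 8-bit quantization choice are illustrative assumptions, not recommendations from the thread.

```python
# Back-of-envelope decode-speed estimate for bandwidth-bound inference.
# Assumption: single-stream decoding streams all weights once per token
# (KV-cache traffic and compute overhead are ignored for simplicity).

def est_tokens_per_sec(params_b: float, bytes_per_param: float,
                       bandwidth_gbs: float) -> float:
    """Estimate decode tokens/sec for a dense model of params_b billion
    parameters, given memory bandwidth in GB/s."""
    model_gb = params_b * bytes_per_param  # weight bytes streamed per token
    return bandwidth_gbs / model_gb

# A 70B model at 8-bit quantization (~70 GB of weights) on two
# hypothetical CPU configurations (bandwidth numbers are assumptions):
for name, bw in [("12-channel DDR5 server (~460 GB/s)", 460),
                 ("dual-socket 24-channel server (~920 GB/s)", 920)]:
    print(f"{name}: ~{est_tokens_per_sec(70, 1.0, bw):.1f} tok/s")
```

Under these assumptions the sketch gives roughly 6.6 and 13.1 tok/s, which is why answers to this kind of question tend to center on maximizing aggregate memory bandwidth per dollar rather than raw FLOPs.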

1 comment

sitkack | 3 months ago
You wait until someone posts an answer here: https://www.reddit.com/r/LLMDevs/comments/1if0q87/you_have_roughly_50000_usd_you_have_to_build_an/

https://www.phind.com/search/cm6lxx6hw00002e6gioj41wa5