TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.

How much would you pay for local LLMs?

4 points by brody_slade_ai, 6 months ago
I want to build a private AI setup for my company. I'm thinking of hosting our model locally instead of in the cloud, using a server at the office that my team can access. Has anyone else done this and had success with it?

This setup will be used internally for uncensored chat, coding, image gen and analysis.

We're thinking of using a combo of hardware:

- RTX 4090 GPU
- Threadripper Pro 5955WX (anyone used this one before?)
- SSD NVMe 1TB

What are your picks for a local AI setup? And what's the minimum budget to achieve it?
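One way to sanity-check a parts list like this is a rough VRAM estimate: an RTX 4090 has 24 GB, so the model size you can serve is bounded by parameter count times effective bits per weight (after quantization) plus overhead for KV cache and activations. The sketch below is a back-of-envelope calculator, not a precise sizing tool; the default 4.5 effective bits and the 1.2 overhead factor are assumptions, and real usage varies with context length and runtime.

```python
def vram_gb(params_b: float, bits: float = 4.5, overhead: float = 1.2) -> float:
    """Rough VRAM estimate for serving a quantized LLM.

    params_b  -- parameter count in billions (e.g. 7 for a 7B model)
    bits      -- effective bits per weight after quantization (assumption: ~Q4)
    overhead  -- fudge factor for KV cache and activations (assumption)
    """
    bytes_for_weights = params_b * 1e9 * bits / 8
    return bytes_for_weights * overhead / 2**30  # GiB

# Quick check against a 24 GB card like the RTX 4090:
for size in (7, 13, 32, 70):
    est = vram_gb(size)
    fits = "fits" if est < 24 else "does not fit"
    print(f"{size}B @ ~Q4: ~{est:.1f} GiB ({fits} in 24 GB)")
```

By this estimate a ~30B-class model at 4-bit quantization is roughly the ceiling for a single 4090, while 70B-class models would need multiple GPUs, heavier quantization, or partial CPU offload.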

3 comments

GianFabien, 6 months ago
Have you considered a Mac mini M4 Pro?
blinded, 6 months ago
I got an Nvidia Jetson Orin; works great. It's not super beefy, but it's a nice little dev rig.
cpach, 6 months ago
dupe: https://news.ycombinator.com/item?id=42307524