TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.


Ask HN: AI at home with multiple graphics cards

2 points | by TheGamerUncle | 7 months ago
I was wondering if anyone has been testing and running AI locally. What applications and models have you been using, and what have you found useful? Currently I have a not-so-old pseudo-server with a 3060 Ti and a 2060, as well as a second computer with a 3060. What should or can I run? Are there any models I can use over my local network that can benefit from feedback between two computers (agentic models)?

1 comment

talldayo | 7 months ago
According to this, you should be able to leverage multi-GPU machines using stock-and-standard llama.cpp: https://github.com/ggerganov/llama.cpp/pull/1703
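For context, the linked PR added multi-GPU support to llama.cpp's CUDA backend, controlled via a tensor-split option. A minimal sketch of how the asker's setup might use it, assuming a CUDA-enabled llama.cpp build; the model path is a placeholder, and the 8,6 split reflects the 3060 Ti's 8 GB versus the 2060's 6 GB of VRAM:

```shell
# Sketch (not from the thread): split one model across the 3060 Ti and 2060
# in the same box. --n-gpu-layers 99 offloads all layers; --tensor-split
# divides the weights across the two cards roughly 8:6 by VRAM.
./llama-cli -m ./models/model.Q4_K_M.gguf \
    --n-gpu-layers 99 \
    --tensor-split 8,6 \
    -p "Hello"

# For the second computer (the lone 3060), run llama.cpp's bundled HTTP
# server and point clients on the local network at it:
./llama-server -m ./models/model.Q4_K_M.gguf \
    --n-gpu-layers 99 \
    --host 0.0.0.0 --port 8080
```

This doesn't make the two machines cooperate on one inference pass; it just lets agent-style frontends on the LAN call each box as a separate model endpoint.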