Ask HN: What desktop to buy for home-based AI/ML in 2023

5 points by ImageXav over 2 years ago
This year I started running some AI/ML experiments at home using Google Colab. However, I want to start using my own compute for some of these. I was thinking of buying the following PC: https://www.palicomp.co.uk/amd-velocity-fac10 and upgrading to a 3080 and i7, as I do a lot of image analysis and it would be helpful to be able to parallelize some operations.

What does HN think of this and these specs? Any recommendations? I'm trying to keep my budget under £2000, unless going a bit over that would allow for a real increase in processing power.
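For concreteness, here is a minimal sketch of the kind of batched, GPU-parallel image operation the post is describing. It assumes PyTorch and a CUDA-capable card (such as the 3080 mentioned); the batch size, image dimensions, and blur kernel are hypothetical.

```python
# Minimal sketch: apply one filter to a whole batch of images on the GPU.
# Assumptions: PyTorch is installed and a CUDA-capable GPU is present.
import torch
import torch.nn.functional as F

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A hypothetical batch of 64 single-channel 512x512 images.
images = torch.rand(64, 1, 512, 512, device=device)

# A simple 3x3 blur kernel, applied to the whole batch in one call;
# the GPU parallelizes the work across all 64 images and all pixels.
kernel = torch.tensor([[1., 2., 1.],
                       [2., 4., 2.],
                       [1., 2., 1.]], device=device) / 16.0
kernel = kernel.view(1, 1, 3, 3)

blurred = F.conv2d(images, kernel, padding=1)
print(blurred.shape)  # torch.Size([64, 1, 512, 512])
```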

4 comments

KuriousCat over 2 years ago
I would recommend a card with higher RAM, maybe a 3090 Ti coupled with an AMD Ryzen 9 processor. Recently, I got one for around 2000 USD. https://store.nvidia.com/en-gb/geforce/store/gpu/?page=1&limit=9&locale=en-gb&category=GPU&gpu=RTX%203090,RTX%203090%20Ti
Comment #34068547 not loaded.
augasur over 2 years ago
I would pick an AMD Ryzen CPU and any Nvidia GPU with more RAM.

If budget allows, I would pick an RTX 3090; if budget is too tight, look at RTX 3080 or RTX 3080 Ti 12 GB VRAM models. 10 GB VRAM models can be too small in the future, or if you plan to use them for image processing.

I am also working mainly with image processing and segmentation; we mainly use RTX 3080 and RTX 3080 Ti GPUs.

It should also be cheaper to pick the components and build the PC yourself.
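As a quick way to act on the VRAM advice above, here is a short sketch (assuming PyTorch and an NVIDIA driver are installed) that queries how much memory the card actually reports:

```python
# Minimal sketch: report the name and total VRAM of the first CUDA device.
# Assumption: PyTorch with CUDA support is installed.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    total_gb = props.total_memory / 1024**3
    print(f"{props.name}: {total_gb:.1f} GB VRAM")
else:
    print("No CUDA-capable GPU detected")
```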
version_five over 2 years ago
I strongly recommend using Colab or a cheap GPU for experimentation and just paying for compute when you want to run jobs that need it. Most tinkering doesn't need any special GPU, so whatever you buy is going to mostly sit idle. Paying for cloud GPUs obviously costs more per hour, but then you only pay for what you need and can scale however you want.
Comment #34068447 not loaded.
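To make the rent-vs-buy trade-off above concrete, here is a back-of-the-envelope break-even sketch; the purchase price and hourly cloud rate are assumed placeholders, not quotes from any provider.

```python
# Break-even sketch for the rent-vs-buy argument.
# All prices below are hypothetical placeholders.
local_gpu_cost = 2000.0      # assumed one-off cost of a local workstation (GBP)
cloud_rate_per_hour = 1.50   # assumed hourly rate for a comparable cloud GPU

break_even_hours = local_gpu_cost / cloud_rate_per_hour
print(f"Local purchase pays off after ~{break_even_hours:.0f} GPU-hours")
# If most tinkering leaves the card idle, you may never accumulate that many
# hours of real GPU work, which is the commenter's point.
```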
navjack27 over 2 years ago
Are you planning on making money doing this? Invest in a 7950X and all the RAM you can, and either a:

4090
3090
3090 Ti
A30
A40
A100
H100
Comment #34073910 not loaded.