TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.

© 2025 TechEcho. All rights reserved.

Can I run this LLM? Check if an LLM will work on your hardware

1 point by AJRF, 4 months ago

1 comment

AJRF, 4 months ago
Hey HN,

I've recently started running local LLMs, and one problem I encountered was inconsistent information on whether a model will run within a certain amount of VRAM.

I created a simple calculator that helps determine if a model can run on your hardware. It will also tell you how much VRAM you need at different quantization levels.

It doesn't work with all models yet, but I'm working on building a more stable dataset to pull from.

Feedback is appreciated!
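The kind of estimate such a calculator produces can be sketched roughly as follows. This is a minimal illustration, not the author's actual method: it assumes weight memory is simply parameter count times bytes per parameter, with a flat overhead factor (a hypothetical 20% here) for KV cache and activations; real tools account for architecture-specific details.

```python
# Rough VRAM estimate for loading an LLM's weights at various quantization levels.
# Assumption (not from the post): memory ~= params * bytes_per_param * overhead,
# where overhead (here 1.2) loosely covers KV cache and activation buffers.

BITS_PER_PARAM = {"fp16": 16, "int8": 8, "int4": 4}

def vram_gb(params_billions: float, quant: str = "fp16", overhead: float = 1.2) -> float:
    """Return an approximate VRAM requirement in GB."""
    bytes_per_param = BITS_PER_PARAM[quant] / 8
    return params_billions * bytes_per_param * overhead  # billions of params * bytes each = GB

# e.g. a 7B-parameter model at 4-bit quantization:
print(f"{vram_gb(7, 'int4'):.1f} GB")   # roughly 4.2 GB
print(f"{vram_gb(7, 'fp16'):.1f} GB")   # roughly 16.8 GB
```

Comparing the result against your GPU's VRAM gives a first-pass answer to "will it fit?", which is the question the calculator automates.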