
TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.



© 2025 TechEcho. All rights reserved.

Ask HN: What laptop would you recommend for running LLM models locally?

3 points by josefslerka, about 2 years ago

1 comment

ftxbro, about 2 years ago
I'm not an Apple fanboy, but I would consider one of the Apple laptops, because they have unified memory shared between their AI accelerators and main RAM. This lets them load bigger models than usual, and available memory is often the limiting factor when running LLMs locally.
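The "bigger models" point comes down to simple arithmetic: weight storage scales with parameter count times bytes per parameter. A rough sketch (the byte-per-parameter figures are common approximations for typical quantization formats, and runtime overhead such as the KV cache is ignored):

```python
# Back-of-envelope estimate of RAM needed just to hold an LLM's weights.
# Approximate bytes per parameter for common precisions; real loaders add
# overhead (KV cache, activations), so treat these as lower bounds.
BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weights_gib(params_billions: float, precision: str) -> float:
    """Approximate GiB required to store the model weights alone."""
    total_bytes = params_billions * 1e9 * BYTES_PER_PARAM[precision]
    return total_bytes / 2**30

for precision in ("fp16", "int8", "int4"):
    print(f"7B model @ {precision}: ~{weights_gib(7, precision):.1f} GiB")
```

At fp16 a 7B-parameter model already needs roughly 13 GiB for weights alone, which is why a machine whose accelerator can address all of system RAM can fit models that would not fit in a discrete GPU's smaller dedicated VRAM.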