TechEcho
A tech news platform built with Next.js, providing global tech news and discussions.

Ask HN: What stack to use for LLM on my local Mac?

2 points | by djangovm | about 1 year ago
For learning purposes, I want to solve the following problem: take my notes --> feed them into a model that can run on my Mac (32GB M1 Pro, 10 cores) --> ask questions before my next meeting to get some context based on my notes.

What stack should I use, and how do I start? Please assume zero knowledge on my part besides Python, Java, and a theoretical understanding of vector search and basic ML.
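What the post describes is essentially a small retrieval-augmented generation (RAG) loop: split the notes into chunks, retrieve the chunks most relevant to the question, and pass them to a local model as context. A minimal, dependency-free sketch of the retrieval half, using naive word overlap as a stand-in for real vector embeddings (an assumption purely for illustration; the example notes and question are made up):

```python
# Naive retrieval over meeting notes. Word-overlap scoring stands in
# for real embedding similarity, which keeps this runnable anywhere.

def chunk_notes(text, max_words=50):
    """Split notes into fixed-size word-bounded chunks."""
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

def top_chunks(question, chunks, k=2):
    """Rank chunks by how many question words they share."""
    q_words = set(question.lower().split())
    scored = sorted(chunks,
                    key=lambda c: len(q_words & set(c.lower().split())),
                    reverse=True)
    return scored[:k]

# Hypothetical notes and question for illustration.
notes = ("Acme renewal meeting: budget capped at 40k. Sarah owns "
         "the contract. Next step is a security review.")
question = "budget for the Acme renewal"

context = top_chunks(question, chunk_notes(notes, max_words=8))
print(context[0])  # → "Acme renewal meeting: budget capped at 40k. Sarah"
```

In a real setup you would swap the word-overlap scorer for embeddings (e.g. from a local embedding model) plus a vector index, but the chunk/score/select shape stays the same.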

2 comments

givenkiban1 | about 1 year ago
Use ollama to manage and download models locally.

There's a python ollama module that lets you run inference with these models.

That's the first part.
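A sketch of what the comment suggests, using the `ollama` python package (`pip install ollama`). It assumes the Ollama server is running locally and that a model has already been pulled (the model name `llama3.2` here is an assumption; substitute whatever `ollama list` shows):

```python
# Sketch: ask a local Ollama model a question with your notes inlined
# as context. Assumes `pip install ollama`, a running Ollama server,
# and a pulled model -- the name "llama3.2" is an assumption.

def build_context_prompt(notes, question):
    """Inline the notes as context ahead of the question."""
    return f"Here are my meeting notes:\n{notes}\n\nQuestion: {question}"

prompt = build_context_prompt(
    "Budget capped at 40k; Sarah owns the contract.",
    "Who owns the contract?",
)

try:
    import ollama  # requires: pip install ollama
    reply = ollama.chat(model="llama3.2",
                        messages=[{"role": "user", "content": prompt}])
    print(reply["message"]["content"])
except Exception as exc:  # ollama missing or server not running
    print(f"ollama not available here: {exc}")
```

For the asker's use case, the retrieved note chunks would go into `notes` before each question, so the model only ever sees a prompt-sized slice of the full notes.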
(Comment #39732499 not loaded.)
friendlynokill | about 1 year ago
I use lmstudio: https://lmstudio.ai/
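LM Studio can also serve whatever model it has loaded over an OpenAI-compatible HTTP API on localhost (by default at port 1234). A stdlib-only sketch of querying it, where the running server and the model name are assumptions:

```python
# Sketch: query LM Studio's local OpenAI-compatible server (default
# http://localhost:1234/v1) using only the stdlib. A running server
# and the model name are assumptions.
import json
import urllib.request

def make_request(question, model="local-model"):
    """Build the chat-completions payload the server expects."""
    return {
        "model": model,  # placeholder; LM Studio lists the real names
        "messages": [{"role": "user", "content": question}],
    }

payload = make_request("Summarize my notes before the meeting.")

try:
    req = urllib.request.Request(
        "http://localhost:1234/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        body = json.load(resp)
    print(body["choices"][0]["message"]["content"])
except OSError as exc:  # server not running / unreachable
    print(f"LM Studio server not reachable: {exc}")
```

Because the endpoint speaks the OpenAI chat-completions shape, the same client code works against Ollama's compatible endpoint too, which makes it easy to try both tools without rewriting anything.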