
Does the size of your llama matter? What's Ollama? Can I plug it into Chroma? (lu.ma)

9 points by torrmal over 1 year ago
If you have not seen it, this is a pretty cool open source project, Ollama: it lets you run big LLMs on computers as small as your MacBook!

https://lu.ma/yourdatayourmodel

Ollama is launching support for Linux-based machines this Wednesday at the SF AI Collective (come: https://lu.ma/yourdatayourmodel), alongside the founder of Chroma and a few more interesting people. Join us in socializing and brainstorming about the future of all this!
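To make the "plug it into Chroma" part concrete, here is a minimal sketch of one way the two could be wired together: a locally running Ollama server produces embeddings over its HTTP API, and Chroma stores and queries them. The port, endpoint path, model name, and collection name are assumptions for illustration, not something described in the post.

    # Hypothetical sketch: embed text with a local Ollama server, store/query it in Chroma.
    # Assumes Ollama is running on its default port and exposes an embeddings endpoint,
    # and that the `requests` and `chromadb` packages are installed.
    import requests
    import chromadb

    OLLAMA_URL = "http://localhost:11434"  # default Ollama port (assumption)

    def embed(text: str, model: str = "llama2") -> list[float]:
        """Ask the local Ollama server for an embedding of `text`."""
        resp = requests.post(
            f"{OLLAMA_URL}/api/embeddings",
            json={"model": model, "prompt": text},
            timeout=60,
        )
        resp.raise_for_status()
        return resp.json()["embedding"]

    # Store a couple of documents in an in-memory Chroma collection.
    client = chromadb.Client()
    collection = client.create_collection("ollama_demo")  # hypothetical name

    docs = ["Ollama runs LLMs locally.", "Chroma is an embedding database."]
    collection.add(
        ids=[str(i) for i in range(len(docs))],
        documents=docs,
        embeddings=[embed(d) for d in docs],
    )

    # Query the collection with an embedding of a question.
    result = collection.query(
        query_embeddings=[embed("Can I plug Ollama into Chroma?")],
        n_results=1,
    )
    print(result["documents"])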

1 comment

iain1992 over 1 year ago
My Llama is huge! 70B parameters....Maybe I should try Ollama...
Comment #37653127 not loaded.