TechEcho
Self-host LLMs like Mixtral 8x7B on the edge and across devices.
1 point
by 3Sophons over 1 year ago
1 comment
3Sophons, over 1 year ago
A fully portable inference app, only 2MB. Try it on your Mac with a single command: <a href="https://www.secondstate.io/run-llm/" rel="nofollow">https://www.secondstate.io/run-llm/</a>