4 Command lines to run open source LLMs across devices with 2MB inference app
1 point by 3Sophons over 1 year ago | no comments