
Show HN: We built a knowledge hub for running LLMs on edge devices

13 points by alanzhuly, 9 months ago
Hey HN! Alex and Zack from Nexa AI here. We are excited to share a project our team has been passionately working on recently, in collaboration with Jiajun from Meta, Qun from San Francisco State University, and Xin and Qi from the University of North Texas.

Running AI models on edge devices is becoming increasingly important. It's cost-effective, ensures privacy, offers low-latency responses, and allows for customization. Plus, it's always available, even offline. What's really exciting is that smaller-scale models are now approaching the performance of large-scale closed-source models for many use cases like "writing assistant" and "email classifier".

We've been immersing ourselves in this rapidly evolving field of on-device AI - from smartphones to IoT gadgets and even that Raspberry Pi you might have lying around. It's a fascinating field that's moving incredibly fast, and honestly, it's been a challenge just keeping up with all the developments.

To help us make sense of it all, we started compiling our notes, findings, and resources into a single place. That turned into this GitHub repo: https://github.com/NexaAI/Awesome-LLMs-on-device

Here's what you'll find inside:

- A timeline tracking the evolution of on-device AI models
- Our analysis of efficient architectures and optimization techniques (there are some seriously clever tricks out there)
- A curated list of cutting-edge models and frameworks we've come across
- Real-world examples and case studies that got us excited about the potential of this tech

We're constantly updating it as we learn more. It's become an invaluable resource for our own work, and we hope it can be useful for others too - whether you're deep in the trenches of AI research or just curious about where edge computing is heading.

We'd love to hear what you think. If you spot anything we've missed, have some insights to add, or just want to geek out about on-device AI, please don't hesitate to contribute or reach out. We're all learning together here!

This is a topic we are genuinely passionate about, and we are looking forward to some great discussions. Thanks for checking it out!
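For anyone who wants a feel for what running an LLM on an edge device actually looks like, here is a minimal sketch using llama-cpp-python, one widely used local inference framework. The model file name, thread count, and prompt are illustrative assumptions, not something from the post or the repo:

    # Minimal sketch: run a small quantized LLM locally with llama-cpp-python.
    # Assumptions (not from the post): the package is installed via
    # `pip install llama-cpp-python` and a GGUF-quantized model file has
    # already been downloaded into ./models/.
    from llama_cpp import Llama

    # Load a quantized model; keep the thread count close to the device's core count
    # (e.g. a Raspberry Pi-class CPU).
    llm = Llama(
        model_path="./models/small-instruct-q4.gguf",  # hypothetical local file
        n_ctx=2048,   # context window
        n_threads=4,  # match the device's cores
    )

    # A simple "writing assistant" style prompt, one of the use cases the post mentions.
    result = llm(
        "Rewrite this sentence more politely: Send me the report now.",
        max_tokens=64,
        temperature=0.7,
    )

    print(result["choices"][0]["text"].strip())

Everything in this sketch runs offline on the local CPU, which is the cost, privacy, and latency story described above.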

no comments
