Hi HN,<p>I'm excited to share Bodhi App, a tool designed to simplify running open-source Large Language Models (LLMs) locally on your laptop. We currently support M2 Macs and plan to support other platforms as our community grows.<p># Problem<p>To use LLMs, you typically need to purchase a subscription from providers like OpenAI or Anthropic, or use OpenAI API credits with a compatible Chat UI. These options not only cost money, but also raise data security and privacy concerns.<p>Many laptops are capable of running powerful open-source LLMs, but setting them up can be challenging for non-tech users.<p># Solution: Bodhi App<p>Bodhi App lets you run LLMs on your own hardware, keeping your data private and saving you money. Our goal is to bring the power of LLMs to everyone.<p>Built with non-tech users in mind, Bodhi App ships with a simple Chat UI, making it easy to start conversing with an LLM. It also exposes OpenAI-compatible APIs, so other apps can use local LLM services without relying on external providers.<p># Features<p>Bodhi App currently supports:<p>1. Running GGUF-format open-source LLMs from the Huggingface repository.<p>2. An in-built Chat UI to quickly start conversations with an LLM.<p>3. A powerful and familiar CLI to download and configure models from the Huggingface ecosystem.<p>4. Exposing LLMs as OpenAI-compatible APIs for use by other apps (a sample request is sketched at the end of this post).<p>--<p># Feature Comparison: Bodhi App vs. Ollama<p>## Bodhi App<p>- Targeted at non-tech users.<p>- Includes a simple Chat UI to get started quickly.<p>- Integrates well with the Huggingface ecosystem:<p><pre><code> - Use the Huggingface repo/filename to run a model.
- Use tokenizer_config.json for chat templates.
</code></pre>
- Currently supports only M2 Macs.<p>## Ollama<p>- Requires some technical know-how.<p>- No inbuilt Chat UI.<p>- Requires baking the model using a custom process:<p><pre><code> - Modelfile
- Golang template to specify chat templates.
</code></pre>
- Supports various OS platforms.<p>Bodhi App leverages the Huggingface ecosystem, avoiding the need to reinvent the wheel with Modelfiles and the like, making it easier to get started quickly.<p># Quickstart<p>Try Bodhi App today by following these simple steps:<p><pre><code>brew tap BodhiSearch/apps
brew install --cask bodhi
bodhi run llama3:instruct
</code></pre><p># Documentation and Tutorials<p>- Technical docs on GitHub (README.md, docs folder): <a href="https://github.com/BodhiSearch/BodhiApp">https://github.com/BodhiSearch/BodhiApp</a><p>- YouTube playlist covering Bodhi App features in detail: <a href="https://www.youtube.com/playlist?list=PLavvg7KIktFI1ZaFc2nLeZtcfCeN09MnT" rel="nofollow">https://www.youtube.com/playlist?list=PLavvg7KIktFI1ZaFc2nLe...</a><p># Conclusion<p>We would love for HN to try out Bodhi App and provide feedback. You can reach us through:<p>- Raising an issue on GitHub: <a href="https://github.com/BodhiSearch/BodhiApp">https://github.com/BodhiSearch/BodhiApp</a><p>- Connecting with the developer on Twitter: <a href="https://twitter.com/AmirNagri" rel="nofollow">https://twitter.com/AmirNagri</a><p>- Leaving a comment on our YouTube tutorials: <a href="https://www.youtube.com/playlist?list=PLavvg7KIktFI1ZaFc2nLeZtcfCeN09MnT" rel="nofollow">https://www.youtube.com/playlist?list=PLavvg7KIktFI1ZaFc2nLe...</a><p>Please show your support by starring the repo on GitHub.<p>Thank you for your time!<p>Best,
The Bodhi Team
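P.S. For anyone who wants to integrate against the OpenAI-compatible API rather than the Chat UI, here is a minimal sketch using curl. It assumes the standard OpenAI chat-completions route and reuses the model alias from the Quickstart; the port is a placeholder rather than a confirmed default, so check the docs for the address your local instance actually listens on.<p><pre><code># Minimal sketch: chat completion against Bodhi App's OpenAI-compatible API.
# BODHI_PORT is a placeholder -- set it to the port your local instance listens on (see docs).
curl "http://localhost:${BODHI_PORT}/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3:instruct",
    "messages": [
      {"role": "user", "content": "Say hello in one sentence."}
    ]
  }'
</code></pre><p>Existing OpenAI client libraries should also work by pointing their base URL at the local server instead of api.openai.com.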