We're launching Helix 1.0 today: a local GenAI stack that runs on open source models. We're a bootstrapped business and have made $100K in revenue in 9 months.

We put a lot of effort into making installation as simple as possible, so you can run it on Linux or Windows with an NVIDIA GPU, on a Mac alongside Ollama, or against an external LLM API.

Here's a demo of what you can do with it: https://www.youtube.com/watch?v=6QcOXq3VFpc

In the demo:

* Helix Apps: version-controlled configuration for LLM-based applications

* Knowledge: continuously updated RAG from a URL (a rough sketch of the general pattern is below)

* API integrations, so your app can call an API to fetch up-to-date information when needed

* New Helix App Editor UI

* New easy installer with support for Helix on macOS (alongside Ollama), on Windows with an NVIDIA GPU, and on Linux with Docker and Kubernetes
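If you're wondering what talking to a stack like this looks like from application code, here's a minimal sketch assuming an OpenAI-compatible chat completions endpoint sitting in front of the local models. The base URL, API key, and model name are placeholders for illustration, not the exact values a Helix install uses.

    # Minimal sketch: calling a local, OpenAI-compatible endpoint with the
    # official Python client. Base URL, API key, and model name are placeholders.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:8080/v1",  # placeholder: wherever the local stack listens
        api_key="not-needed-locally",         # placeholder: many local stacks accept any key
    )

    response = client.chat.completions.create(
        model="llama3:instruct",  # placeholder: whichever local model is configured
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Summarise what changed in the 1.0 release."},
        ],
    )

    print(response.choices[0].message.content)

The nice part of this shape is that swapping between a local deployment and an external LLM API is just a change of base URL and model name.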
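On the Knowledge point, the general pattern behind continuously updated RAG from a URL is: re-fetch the page on a schedule, re-chunk and re-index it, and pull the closest chunks into the prompt at question time. The toy sketch below shows that loop in plain Python with a throwaway bag-of-words similarity; it's an illustration of the pattern, not how Helix implements it.

    # Generic illustration of "continuously updated RAG from a URL":
    # periodically re-fetch a page, re-chunk it, and retrieve the best chunks per question.
    # Toy similarity only; not Helix's implementation.
    import math
    import re
    import urllib.request
    from collections import Counter

    def fetch_chunks(url: str, chunk_words: int = 200) -> list[str]:
        html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
        text = re.sub(r"<[^>]+>", " ", html)  # crude tag stripping
        words = text.split()
        return [" ".join(words[i:i + chunk_words]) for i in range(0, len(words), chunk_words)]

    def similarity(a: str, b: str) -> float:
        va, vb = Counter(a.lower().split()), Counter(b.lower().split())
        dot = sum(va[w] * vb[w] for w in va)
        norm = math.sqrt(sum(v * v for v in va.values())) * math.sqrt(sum(v * v for v in vb.values()))
        return dot / norm if norm else 0.0

    def retrieve(question: str, chunks: list[str], k: int = 3) -> list[str]:
        return sorted(chunks, key=lambda c: similarity(question, c), reverse=True)[:k]

    # In a real deployment the fetch/re-index step runs on a schedule so the index
    # stays in sync with the page; here it's a single pass.
    chunks = fetch_chunks("https://example.com/docs")  # placeholder URL
    context = retrieve("How do I install it on Windows?", chunks)
    prompt = "Answer using this context:\n\n" + "\n---\n".join(context)
    print(prompt[:500])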