Llamafile might just be the easiest way to run an LLM locally. I tried it out and thought it was awesome, but perhaps a bit lacking in the server UI. In the spirit of simplicity, I made a single-file web page for interacting with Llamafile that I'm calling Llamaphone. The goal of this project is to make the barrier to entry as low as possible for the average user, and it doesn't get much simpler than a single HTML file.
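
To give a feel for why a single HTML file is enough, here's a minimal sketch of the idea (not Llamaphone itself): Llamafile's built-in server exposes an OpenAI-compatible chat endpoint, which defaults to `http://localhost:8080/v1/chat/completions`, so one page with a bit of inline JavaScript can post a prompt and show the reply. The element IDs and the lack of streaming here are my own simplifications.

```html
<!DOCTYPE html>
<html>
<head><meta charset="utf-8"><title>Minimal Llamafile chat</title></head>
<body>
  <textarea id="prompt" rows="4" cols="60">Why is the sky blue?</textarea><br>
  <button id="send">Send</button>
  <pre id="reply"></pre>
  <script>
    // Llamafile's server speaks the OpenAI chat completions format;
    // port 8080 is the default when you run the llamafile binary.
    const ENDPOINT = "http://localhost:8080/v1/chat/completions";

    document.getElementById("send").addEventListener("click", async () => {
      const prompt = document.getElementById("prompt").value;
      const res = await fetch(ENDPOINT, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          messages: [{ role: "user", content: prompt }],
        }),
      });
      const data = await res.json();
      // The generated text lives in the first choice's message content.
      document.getElementById("reply").textContent =
        data.choices[0].message.content;
    });
  </script>
</body>
</html>
```

Open the file in a browser while a llamafile is running locally and that's the whole loop: no build step, no dependencies, just a page talking to the local server.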