Super cool idea, I've been hoping for something like this!<p>However, the interface is terrible.<p>Firstly, I hate blindly running shell scripts from third-party URLs.<p>Secondly, yours does too much. It would be better to document how to set up the Wasm environment and how to download the LLM separately from actually running the thing.<p>I'm laboriously separating those parts now so I can troubleshoot the ones that are failing.<p>Thanks for your efforts, though.<p>If I get TinyLlama working I'll be piping it through Piper for speech output!
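For anyone attempting the same separation, the manual steps look roughly like this. This is a sketch, not the project's official instructions: the model URL, filenames, and the `--prompt-template` value are examples I'd expect to work with a TinyLlama chat GGUF, so verify them against the LlamaEdge docs before running.

```shell
# 1. Install the WasmEdge runtime with its GGML (wasi-nn) plugin.
#    Download and INSPECT the installer from wasmedge.org before running it,
#    rather than piping it straight into bash.

# 2. Fetch a GGUF model yourself, e.g. a TinyLlama chat quantization
#    (example URL -- check Hugging Face for the file you actually want):
curl -LO https://huggingface.co/TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF/resolve/main/tinyllama-1.1b-chat-v1.0.Q5_K_M.gguf

# 3. Fetch the prebuilt chat app from the LlamaEdge releases page:
curl -LO https://github.com/LlamaEdge/LlamaEdge/releases/latest/download/llama-chat.wasm

# 4. Run it, preloading the model into the wasi-nn GGML backend:
wasmedge --dir .:. \
  --nn-preload default:GGML:AUTO:tinyllama-1.1b-chat-v1.0.Q5_K_M.gguf \
  llama-chat.wasm --prompt-template chatml
```

With the pieces split out like this, each stage (runtime install, model download, inference) can fail and be debugged on its own, and the model output on stdout can in principle be piped into a TTS tool like Piper.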
LlamaEdge is a lightweight (<5 MB), portable, and secure runtime for AI/LLM applications across diverse CPUs, GPUs, and operating systems, simplifying development and deployment from local machines to the edge and the cloud.
<a href="https://www.secondstate.io/LlamaEdge/" rel="nofollow">https://www.secondstate.io/LlamaEdge/</a>