I'm having fun with this visual editor for LLM scripts. It's almost like HyperCard for LLMs.

On my 16GB MacBook Air I did not have to set the OLLAMA_ORIGINS env variable; maybe I set it a long time ago, since I have a previous Ollama install. This is the first really fun toy/tool I've found that uses local LLMs (it also supports foundation model APIs) to do something interesting.

I'm having a ball!
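
For anyone else poking at their setup: as I understand it, OLLAMA_ORIGINS only matters for browser-origin (CORS) requests, which is presumably why a browser-based editor would care about it. Here's a minimal sketch (the default port and endpoint are just my assumptions about a stock Ollama install) that asks the local server which models it already has, as a quick sanity check before pointing a tool at it:

    # Quick check that a local Ollama server is up and list its models.
    # A plain script like this talks to localhost directly, so it does not
    # need OLLAMA_ORIGINS at all; that env var only widens the allowed
    # browser origins for CORS.
    import json
    import urllib.request

    OLLAMA_URL = "http://localhost:11434"  # Ollama's default port

    def list_local_models(base_url: str = OLLAMA_URL) -> list[str]:
        """Return the names of models the local Ollama install already has."""
        with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]

    if __name__ == "__main__":
        print(list_local_models())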