I feel like the incumbent for running LLM prompts on the CLI, including locally, is llm: https://github.com/simonw/llm?tab=readme-ov-file#installing-a-model-that-runs-on-your-own-machine

How does this compare?