Beware of ANSI escape codes in model output, which an LLM can use to hijack your terminal, aka "Terminal DiLLMa".<p><a href="https://embracethered.com/blog/posts/2024/terminal-dillmas-prompt-injection-ansi-sequences/" rel="nofollow">https://embracethered.com/blog/posts/2024/terminal-dillmas-p...</a>
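A minimal mitigation sketch: strip CSI escape sequences (ESC `[` ... final byte) from untrusted model output before it reaches the terminal. This only covers the CSI family; a real sanitizer would also need to handle OSC, DCS, and bare ESC sequences.

```shell
# Build the ESC byte portably, then filter CSI sequences with sed.
esc=$(printf '\033')
untrusted=$(printf 'hello %s[31mworld%s[0m' "$esc" "$esc")
clean=$(printf '%s' "$untrusted" | sed "s/${esc}\[[0-9;]*[A-Za-z]//g")
printf '%s\n' "$clean"   # prints: hello world
```

Piping any LLM CLI's stdout through a filter like this (or through `less -R` is *not* enough, since `-R` passes color codes through) keeps injected sequences from restyling the terminal or rewriting the clipboard.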
I feel like the incumbent for running LLM prompts on the CLI, including locally, is llm: <a href="https://github.com/simonw/llm?tab=readme-ov-file#installing-a-model-that-runs-on-your-own-machine">https://github.com/simonw/llm?tab=readme-ov-file#installing-...</a><p>How does this compare?
I did a similar curl script to ask questions of the Llama 3 model hosted at DuckDuckGo:<p><a href="https://github.com/zoobab/curlduck">https://github.com/zoobab/curlduck</a>