For what purposes do you use llama.cpp?

What tools do you use with llama.cpp?

Is there anything you recommend avoiding when it comes to llama.cpp?

I want to collect your best practices, experiences, and advice around llama.cpp. E.g., if you work with Visual Studio Code: which plugins do you recommend, and which would you avoid? Etc.
I use it to help me write text.

I don't use any tools; I run it from the command line:

  ./main -f ~/Desktop/prompts/multishot/llama3-few-shot-prompt-10.txt -m ~/Desktop/models/Meta-Llama-3-8B-Instruct-Q8_0.gguf --temp 0 --color -c 1024 -n -1 --repeat_penalty 1.2 -tb 8 --log-disable 2>/dev/null

I prefer the old `main` binary to the new `llama-cli`: when I search my shell history for "llama", I want to find the commands that ran "llama" models rather than, say, "mistral" ones, and with `llama-cli` every command matches.
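If you reuse a long invocation like that, a small wrapper script can save retyping. This is just a sketch built from the flags above; the model path, the prompt argument, and the location of the `main` binary are assumptions from my own setup, so adjust them to yours.

  #!/usr/bin/env sh
  # Sketch of a wrapper around the llama.cpp `main` binary; paths are assumptions.
  # Usage: ./llama3-write.sh path/to/prompt.txt
  MODEL="$HOME/Desktop/models/Meta-Llama-3-8B-Instruct-Q8_0.gguf"
  PROMPT="${1:?usage: $0 prompt-file}"
  ./main -f "$PROMPT" -m "$MODEL" \
    --temp 0 --color -c 1024 -n -1 --repeat_penalty 1.2 -tb 8 \
    --log-disable 2>/dev/null

Keeping the model path in one variable also makes it easy to swap models without touching the rest of the command.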