I'm imagining booting into a Linux command-line recovery console as a Linux noob and being able to bring up, on a split screen, an offline chatbot that can parse all the man pages and tools on board and walk you through solving various problems.

(But alternatively you could just make a more comprehensive, recovery-focused `tl;dr` program)
It's definitely doable right now. llama.cpp (and its frontends) can run in a CLI on most CPUs, and Mistral 7B is a "good enough" base model for most machines. Fine-tuning would help, but you can also inject the relevant on-disk man pages into the context with a vector DB or maybe even a simple grep.

One issue is that LLMs tend to lie when they don't really know a particular answer, and you usually catch that by double-checking with a web search, which may not be available from a recovery console.
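To make the grep-style injection concrete, here's a minimal sketch, assuming the llama-cpp-python bindings are installed and a Mistral 7B GGUF file sits at a path of your choosing; the model path, keyword filter, and prompt wording are all placeholders, not a polished tool.

```python
# Minimal "offline man-page chatbot" sketch.
# Assumptions: `pip install llama-cpp-python` and a local Mistral 7B GGUF
# at MODEL_PATH (hypothetical path below).
import subprocess

from llama_cpp import Llama

MODEL_PATH = "./mistral-7b-instruct.Q4_K_M.gguf"  # hypothetical local path


def man_page(topic: str) -> str:
    """Fetch a man page as plain text; returns an empty string if missing."""
    result = subprocess.run(["man", topic], capture_output=True, text=True)
    # Some systems emit overstrike characters; piping through `col -b` strips them.
    return result.stdout if result.returncode == 0 else ""


def grep_context(text: str, keywords: list[str], max_lines: int = 60) -> str:
    """Poor man's retrieval: keep only lines mentioning any keyword."""
    hits = [line for line in text.splitlines()
            if any(k.lower() in line.lower() for k in keywords)]
    return "\n".join(hits[:max_lines])


def ask(question: str, topic: str, keywords: list[str]) -> str:
    """Stuff filtered man-page excerpts into the prompt and ask the local model."""
    context = grep_context(man_page(topic), keywords)
    prompt = (
        "You are a recovery-console assistant. Answer using only the excerpts below.\n"
        f"Excerpts from `man {topic}`:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
    llm = Llama(model_path=MODEL_PATH, n_ctx=4096, verbose=False)
    out = llm(prompt, max_tokens=256)
    return out["choices"][0]["text"].strip()


if __name__ == "__main__":
    print(ask("How do I remount the root filesystem read-write?",
              topic="mount", keywords=["remount", "rw", "read-write"]))
```

Swapping `grep_context` for an embedding lookup in a small vector DB would give you fuzzier matching, but the keyword filter is enough to show the idea.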
It sounds potentially useful, but there's pretty much zero friction if you want to do this with Llama or ChatGPT today. I doubt there will be enough demand for people to keep bespoke "recovery models" on disk. If it *does* become a desired feature, I think maintainers would prefer a deterministic recovery GUI over a language model.