fusion-guide is a 12-billion-parameter, open-source model fine-tuned from Mistral Nemo. It generates Chain-of-Thought reasoning and guidance that other models can use to improve their answers. It is meant to work in tandem with a general instruction model, and early tests show that pairing it with the much smaller 8-billion-parameter Llama 3.1 Instruct model can outperform far larger models on certain tasks.

Inputs must be wrapped in <guidance_prompt>{THE PROMPT}</guidance_prompt> tags, and the generated guidance is then passed along to the downstream model. There are some limitations: fusion-guide may struggle with very long or complex prompts, and this is still an experimental approach.
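
As a rough illustration of the prompt format, here is a minimal sketch using the Hugging Face transformers library. The repository id, generation settings, and the way the guidance is handed to the downstream model are assumptions for illustration only, not confirmed details of the release:

    # Minimal sketch, assuming fusion-guide is published as a standard causal LM
    # on the Hugging Face Hub. "fusion-guide-12b" is a hypothetical repo id.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    MODEL_ID = "fusion-guide-12b"  # hypothetical; replace with the actual checkpoint

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Wrap the task in the <guidance_prompt> tags described above.
    prompt = "<guidance_prompt>Count the number of 'r's in the word strawberry.</guidance_prompt>"

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=512)

    # Decode only the newly generated tokens: the Chain-of-Thought guidance.
    guidance = tokenizer.decode(
        output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    )

    # The guidance would then be prepended to the prompt sent to the paired
    # instruction model (e.g. a Llama 3.1 8B Instruct model).
    print(guidance)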