A few weeks ago, we were tasked with making a Llama 3 model do function calling and building an API compatible with OpenAI's Python client.<p>While we're still improving our dataset and FastAPI backend server (which can't be open-sourced), we're happy to share with the community our open-source project showing how to properly fine-tune a Llama 3 model to support function calling, following OpenAI's best practices.<p>We hope this will help other people with similar tasks/needs. Happy to answer any questions.
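Since the server is OpenAI-compatible, it should be callable with the official `openai` Python client. Here's a minimal sketch of what that looks like; the base URL, model name, and tool definition are illustrative assumptions, not values from our project:

```python
# A tool definition following OpenAI's function-calling schema.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",  # hypothetical example tool
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]


def ask(question: str):
    """Send a chat request with tools to an OpenAI-compatible server.

    Requires the `openai` package; the base_url and model name below
    are placeholders for a locally served fine-tuned model.
    """
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")
    resp = client.chat.completions.create(
        model="llama3-function-calling",  # assumed served model name
        messages=[{"role": "user", "content": question}],
        tools=tools,
        tool_choice="auto",
    )
    # If the model decides to call a function, tool_calls holds the
    # function name and JSON-encoded arguments.
    return resp.choices[0].message.tool_calls
```

Because the request/response shapes match OpenAI's, swapping between the hosted API and a local fine-tuned model is just a change of `base_url` and `model`.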
In case anyone is interested, we also have a very simple notebook that tries to trick GPT-4o into revealing the structure of the prompts/responses related to function calling, located at:<p><a href="https://github.com/michaelnny/Llama3-FunctionCalling/blob/main/ideas/analyze_openai_api.ipynb">https://github.com/michaelnny/Llama3-FunctionCalling/blob/ma...</a>