Link to Hugging Face: https://huggingface.co/collections/apple/openelm-instruct-models-6619ad295d7ae9f868b759ca

I didn't have Apple releasing the weights to their AI models on my 2024 bingo card, but I'm glad to see it.
> There are eight OpenELM models in total – four pre-trained and four instruction-tuned – covering different parameter sizes between 270 million and 3 billion parameters.

2024 is shaping up to be the year of open-source models from some big players. Great steps for privacy.
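For anyone who wants to poke at these, here's a minimal sketch of loading one of the instruct checkpoints with Hugging Face transformers. It assumes the apple/OpenELM-270M-Instruct ID from the linked collection and that the model card's note about reusing the Llama 2 tokenizer (which is gated behind Meta's license) still applies, so treat the IDs as assumptions and check the collection before running it:

    # Sketch: generate text from an OpenELM instruct checkpoint.
    # Assumed IDs: apple/OpenELM-270M-Instruct (smallest instruct variant)
    # and meta-llama/Llama-2-7b-hf for the tokenizer OpenELM reuses.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "apple/OpenELM-270M-Instruct"
    tokenizer_id = "meta-llama/Llama-2-7b-hf"

    tokenizer = AutoTokenizer.from_pretrained(tokenizer_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        trust_remote_code=True,  # OpenELM ships custom modeling code on the Hub
    )

    prompt = "Once upon a time there was"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

At 270M parameters the smallest variant runs fine on a laptop CPU, which is part of what makes this release interesting for on-device use.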