With new LLMs being released every few months by the big players, I am starting to think LLMs won't be that unique.<p>Google, OpenAI, Anthropic, Mistral, Cohere and so many other companies are building their own LLMs that have started to challenge GPT-4, and I think this will continue even after GPT-5.<p>Also, given the huge cost of data collection, model training, compute and everything else involved, I feel it's a bad deal for smaller startups to build LLMs. Better to build a platform that leverages AI to deliver value to users or businesses.
It's totally infeasible for a small startup without significant funding to build a competitive LLM from scratch.<p>What is possible is fine-tuning an existing LLM.<p>I suspect that this will become popular. One can imagine open source platforms with built-in plugin registries for easily installable agents. Each agent may come with its own knowledge base and vectors or keywords for selecting it. It may be a quantized, fine-tuned LoRA that can be loaded on the fly.
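The reason LoRA adapters are cheap enough to ship and swap like plugins is that a fine-tune is stored as two small low-rank matrices rather than a full weight delta. A minimal sketch of that arithmetic (dimensions, names, and the scaling convention here are illustrative, not taken from any specific library):

```python
import numpy as np

# LoRA stores a fine-tune as two small matrices A (r x d) and B (d x r),
# with rank r << d. Merging into the base weight is a cheap matrix product,
# which is why adapters can be loaded "on the fly".
d, r = 8, 2                      # hidden size, LoRA rank (illustrative values)
rng = np.random.default_rng(0)

W_base = rng.standard_normal((d, d))    # frozen base-model weight
A = rng.standard_normal((r, d)) * 0.01  # trained down-projection
B = rng.standard_normal((d, r)) * 0.01  # trained up-projection
alpha = 16                              # scaling hyperparameter

def merge_lora(W, A, B, alpha, r):
    """Fold the low-rank update into the base weight: W' = W + (alpha / r) * B @ A."""
    return W + (alpha / r) * (B @ A)

W_merged = merge_lora(W_base, A, B, alpha, r)

# The adapter carries 2*d*r parameters instead of d*d for a full delta.
print(A.size + B.size, "adapter params vs", W_base.size, "for a full delta")
```

For d=8, r=2 the adapter is 32 parameters against 64 for a full delta; at real model scale (d in the thousands, r around 8-64) the savings are orders of magnitude, which is what makes a plugin registry of adapters plausible.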
> Better to build a platform that leverages AI to deliver value to users or businesses.<p>By focusing on "today" problems, they risk being leapfrogged by the next thing that solves more problems, known and unknown, and grants new abilities. That could spell instant death. The AI-birthers are gonna chase the fine-tune dollar anyway, and maybe we won't even see any more jaw-dropping improvements from here on out, but we will see what happens.
Not sure, but enshittification is already in progress. They are just too powerful if they work properly. It's like buying a Yamaha R1 and riding it in a 30mph zone.<p>For the record, enshittification should not be a thing for LLMs though...