The article is better than the headline might suggest. Key paragraph:<p>> It’s not just the performance of NVLM, but Nvidia’s decision to make it available as an open-source project. The likes of OpenAI, Claude, and Google aren’t expected to do that anytime soon. Nvidia’s approach could benefit AI researchers and smaller firms, as they’d get access to a seemingly powerful multimodal LLM without having to pay for it.
Sort of an obvious move to open source it: they're selling shovels and just showed everyone where the mines are.<p>This release will generate more hype and competition around AI, with data and training increasingly becoming the moat (rather than model-level innovations), which in turn means more people needing Nvidia GPUs.<p>Nonetheless, I really like how competitive OSS is in LLMs compared to other recent major innovations.
Let's check the Chatbot Arena in a bit. That's a more useful benchmark than any self-reported numbers.<p><a href="https://huggingface.co/spaces/lmsys/chatbot-arena-leaderboard" rel="nofollow">https://huggingface.co/spaces/lmsys/chatbot-arena-leaderboar...</a>