
Ask HN: Don't You Mind That LLMs Are Mostly Proprietary?

18 points | by dakiol | 6 days ago
I'm not sure if it's just me getting older or what, but something strikes me as odd about the future of programming and software engineering: LLMs are impressive, but you have to pay to use them. I can't recall another core tool or technology in the software industry—something central not just to the field, but to the world—that isn't free or open source. Think TCP/IP, the Linux kernel, Postgres, Git, ffmpeg, qemu, LaTeX, Kubernetes, and so on. Sure, there's plenty of proprietary software out there, but it's not the backbone of the internet or the computing industry.

Now, LLMs have the potential to become part of that backbone, yet nobody seems particularly concerned that they're not open source (I'm talking about GPT, Claude, Copilot, Gemini). I know there are open source alternatives, but they're not nearly as capable—and it seems most people here are perfectly fine using and paying for the proprietary ones.

I don't like a future where I have to pay for every token just to write a program. And don't tell me, "Well, just don't use LLMs"; they're going to become what Linux is today: ubiquitous.
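To put the "pay for every token" point in concrete terms, here is a back-of-the-envelope sketch; the per-token prices below are purely hypothetical placeholders, not any provider's actual rates.

```python
# Rough cost estimate for LLM-assisted coding under per-token pricing.
# All prices here are hypothetical placeholders for illustration only.

PRICE_PER_MILLION_INPUT = 3.00    # USD per 1M input tokens (assumed)
PRICE_PER_MILLION_OUTPUT = 15.00  # USD per 1M output tokens (assumed)

def session_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost of one coding session at the assumed per-token rates."""
    return (input_tokens / 1_000_000) * PRICE_PER_MILLION_INPUT + \
           (output_tokens / 1_000_000) * PRICE_PER_MILLION_OUTPUT

# e.g. a day of heavy use: 2M tokens of context sent, 500k tokens generated
print(f"${session_cost(2_000_000, 500_000):.2f}")  # -> $13.50
```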

11 comments

matt_s | 6 days ago
There was a time, back in the years after the dot-com era, when search servers were sold with proprietary software loaded for enterprises to use: a Google box running on your local network. Now there are advanced options like Elastic and other search-specific technologies you can use.

I think LLMs might follow this market pattern, where you can buy something to host yourself, and then commoditization happens to the point where open source solutions evolve to be good enough.

An idea for a disrupting company would be to open source their LLM and offer support and feature development to enterprises as the paid offering, kinda like Red Hat and others doing that model. A key difference is that running an LLM locally on decent-sized compute is fine, but it will be costly to scale on your own.
daveguy | 6 days ago
The state of LLMs right now seems analogous to the state of CAD software. There are free options available, but they aren't as capable. The good part about this analogy is that the needs of the users aren't very dynamic -- if you are using CAD software, it is for design tasks. The interface may change, but the purpose doesn't. Because of this, the quality gap between free CAD packages and commercial ones has persisted.

LLMs are natural language models (what words are likely to come next given the context), not any sort of AGI. For that purpose, the gap between open and closed models is closing much faster in LLMs than in CAD. I think LLMs will go the way of chess engines -- one of these models will become the Stockfish of LLMs, and the proprietary models will end up being a waste of money and resources.
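To make the "what words are likely to come next" point concrete, here is a minimal sketch using a small open model; GPT-2 via Hugging Face transformers is just a convenient stand-in, and any causal language model would do.

```python
# Minimal next-token prediction sketch: a language model scores every token
# in its vocabulary by how likely it is to follow the given context.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

context = "The backbone of the internet is built on open"
inputs = tokenizer(context, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits           # shape: (1, seq_len, vocab_size)

probs = torch.softmax(logits[0, -1], dim=-1)  # distribution over the next token
top = torch.topk(probs, k=5)

for p, idx in zip(top.values, top.indices):
    print(f"{tokenizer.decode([int(idx)]):>12s}  p={p:.3f}")
```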
selfhoster11 | 5 days ago
I mind, a lot. That is why I've built a cheap (in relative terms) rig that can run models up to approximately 600B parameters, although only extremely slowly once the model spills out of the GPUs. I would much rather be able to run open LLMs slowly than not at all.
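For context, a minimal sketch of what "spilling out of the GPUs" can look like with Hugging Face transformers and accelerate; the model name and memory caps are assumptions for illustration, not the commenter's actual setup.

```python
# Load a model larger than GPU memory by letting accelerate spread the weights
# across GPU, CPU RAM, and disk. Layers that don't fit on the GPU run from
# slower memory, which is why generation gets very slow.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "some-org/some-large-model"   # hypothetical; substitute a real checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",                          # fill the GPU first, then CPU/disk
    max_memory={0: "24GiB", "cpu": "128GiB"},   # assumed hardware limits
    offload_folder="offload",                   # spill remaining weights to disk
)

prompt = "Explain why open-weight models matter."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```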
notaharvardmba | 3 days ago
LLMs are just a statistical analysis of a large body of written information. The proprietary part is mostly the training. It'll end up being similar to OSM vs Google Maps vs HERE. You can get close with crowd/open sourcing, but you can get significantly better at any given point in time by investing millions or billions. At some point the commodity version will be good enough and then no one will invest any more and we'll stall out for a while until someone succeeds at the next moonshot.
Bostonian | 6 days ago
You don't have to use LLMs, of course, and when you want help coding, LLMs are vastly cheaper than human programmers.
trod1234 | 5 days ago
> I don't like a future where I have to pay for every token just to write a program...

You don't really need to worry then. There is no future like that in the long run. No future at all.

If you think a little, you'll realize AI breaks a number of pillars holding society in a working state.

The thing forces the labor value of time to zero while simultaneously eliminating economic calculation in the factor markets, and eliminating capital formation.

There is a saying: businesses sometimes win so much that they lose. This is one of those times. All that lies ahead is a maelstrom of mathematical chaos, and the only ones that will survive are those not in it.
yen223 | 5 days ago
Windows has been the backbone of desktop computing for decades, and it certainly isn't free. People relying on open-source software seems to be a relatively new phenomenon, starting in the early 2000s.

For what it's worth, I absolutely share your concerns about the fact that LLMs are now proprietary. I want to see more open-source (or at least open-weight) models being built.
minimaxir | 6 days ago
> I know there are open source alternatives, but they're not nearly as capable

They certainly are capable (DeepSeek being the obvious example); the problem is that they're still too expensive to run, and there's currently no differentiator to compete with the big players, who are likely selling inference at a loss.
paulcole | 2 days ago
I couldn’t care less as long as one of them gives me a lot of value.
baobun | 6 days ago
You're in a bit of an echo chamber, I believe. The concern is real and valid. It just doesn't get spammed to death nearly as much as the embracing narratives do.
devops000 | 6 days ago
You can host Llama yourself if you are concerned about this. If you have something so valuable, you will not give it away for free to everyone.
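As a rough sketch of what self-hosting looks like in practice, here is one way to query a locally served Llama model, assuming a local server such as Ollama or llama.cpp's server is already running and exposing an OpenAI-compatible endpoint; the port and model tag are assumptions based on Ollama's defaults.

```python
# Query a self-hosted Llama model through a local OpenAI-compatible server.
# Assumes a server such as Ollama (default port 11434) is running and has a
# Llama model pulled; no tokens leave your machine.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local server, not a hosted API
    api_key="not-needed",                  # local servers typically ignore this
)

response = client.chat.completions.create(
    model="llama3",                        # whatever model tag the server exposes
    messages=[{"role": "user",
               "content": "Write a Python function that reverses a string."}],
)
print(response.choices[0].message.content)
```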