Ollama violating llama.cpp license for over a year

199 points by Jabrov 5 days ago

14 comments

rlpb 5 days ago
I don't see how this claimed issue is valid.

https://github.com/ollama/ollama/blob/main/llama/llama.cpp/LICENSE says:

"The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software."

The issue submitter claims:

"The terms of the MIT license require that it distribute the copyright notice in both source and binary form."

But: a) that doesn't seem to be in the license text as far as I can see; b) I see no evidence that upstream arranged to ship any notice in their own binaries, so I don't see how it's reasonable to expect downstreams to do it; and c) in the distribution world (Debian, etc.), which takes great care about license compliance, patching upstreams to include copyright notices in binaries isn't a thing. It's not the norm, and not doing so is considered acceptable in our ecosystem.

Maybe I'm missing something, but the linked issue does not make the case that anything unacceptable is going on here.

Havoc 5 days ago
I'm continually puzzled by their approach - it's such self-inflicted negative PR.

Building on llama.cpp is perfectly valid, and they're adding value on ease of use here. Just give the llama.cpp team appropriately prominent and clearly worded credit for their contributions and call it a day.

levifig 5 days ago
FWIW, llama.cpp links to and fetches models from ollama (https://github.com/ggml-org/llama.cpp/blob/master/tools/run/run.cpp).

This issue seems to be the typical case of someone taking offence on someone else's behalf, because it implies there's no "recognition of source material" when there's quite a bit of symbiosis between the projects.

Koshima 5 days ago
I think it’s fair to push for clear attribution in these cases, but it’s also important to remember that the MIT license is intentionally permissive. It was designed to make sharing code easy without too many hoops. If Ollama is genuinely trying to be part of the open-source community, a little transparency and acknowledgment can avoid a lot of bad blood.
mkesper 5 days ago
Take a look at https://github.com/containers/ramalama/tree/main#credit-where-credit-is-due for comparison. Ollama really should improve their acknowledgements and check license conformance more thoroughly.

gittubaba 5 days ago
Huh, I wonder if people really follow MIT in that form. I don't remember any binary I downloaded from GitHub that contained a third_party_licenses or dependency_licenses folder with every linked library's LICENSE file...

Do any of you remember having a third_party_licenses folder after downloading a binary release from GitHub/SourceForge? I think many popular tools would be out of compliance if this were checked...

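(For concreteness, here is a minimal sketch of how a Go project could carry third-party notices in binary form with go:embed. This is not Ollama's actual build; the third_party_licenses/ directory and the --licenses flag are hypothetical, and the license files are assumed to be copied into that directory at build time.)

    // Minimal sketch, assuming a third_party_licenses/ directory exists at
    // build time: embed the license texts so they ship inside the binary and
    // can be printed back out via a hypothetical --licenses flag.
    package main

    import (
        "embed"
        "fmt"
        "io/fs"
        "os"
    )

    //go:embed third_party_licenses/*
    var licenses embed.FS

    func main() {
        if len(os.Args) > 1 && os.Args[1] == "--licenses" {
            // Walk every embedded file and print its path and contents.
            fs.WalkDir(licenses, ".", func(path string, d fs.DirEntry, err error) error {
                if err != nil || d.IsDir() {
                    return err
                }
                text, readErr := licenses.ReadFile(path)
                if readErr != nil {
                    return readErr
                }
                fmt.Printf("=== %s ===\n%s\n", path, text)
                return nil
            })
            return
        }
        // ... normal program behaviour ...
    }

(Alternatively, release archives can simply bundle the license files next to the binary; either way the notices travel with the distributed artifact.)
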
paxys 5 days ago
> The terms of the MIT license require that it distribute the copyright notice in both source and binary form.

No, MIT does not require that. The license says:

> The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

The exact meaning of that sentence has never been challenged or ruled upon. Considering that ollama's README links to llama.cpp's project page (which includes the license), I'd say the requirement has been satisfied.

gmm1990 5 days ago
The same MIT license is in the ollama project as in the llama.cpp project; is this not sufficient?

llama.cpp: https://github.com/ggml-org/llama.cpp/blob/master/LICENSE

ollama: https://github.com/ollama/ollama/blob/main/LICENSE

jjoergensen 5 days ago
I noticed this "thank you" today:

"GGML

Thank you to the GGML team for the tensor library that powers Ollama's inference – accessing GGML directly from Go has given a portable way to design custom inference graphs and tackle harder model architectures not available before in Ollama."

Source: https://ollama.com/blog/multimodal-models

immibis 5 days ago
It only matters if they sue. They won't, so it doesn't matter. Corporations have learned which open-source licenses are legal to ignore. One other project which openly declared an intention to ignore the licenses of its dependencies is SimpleX Chat, and they're getting away with it just fine.

aspenmayer 5 days ago
As an analogy, using AI/LLM generated comments is against HN guidelines, but you won't find this proviso in the HN guidelines proper. Where this information is communicated and how is left as an exercise for the reader.

alfiedotwtf 5 days ago
A year of complaining but nobody has thought to just implement it themselves and push a PR?
bethekidyouwant 5 days ago
Ollama has an MIT license. Stop eating yourselves; put up a two-line merge request rather than complaining that someone else hasn't for the last two years.

Also, am I missing something? How is this not sufficient? ollama/llama/llama.cpp/LICENSE

Der_Einzige 5 days ago
Reason #1395292 that you should be using vLLM, but given the downvotes I get for pointing this out, it appears that HN really hates lots of tok/s (yes, even with a batch size of 1 on your low-tier GPU this is true).

Why does anyone in GenAI care about copyright, licenses, etc.? (Besides being nice and getting the community to like you, which should matter for Ollama.)

This whole field is built off piracy at a scale never before seen. Aaron Swartz would blush at what Llama and other projects pulled off without anyone getting arrested. Why should I care when one piracy project messes with another?

The whole field is basically a celebration of copyright abolitionism and the creation of "dual power" a la 1917 Russia, where copyright doesn't matter. Have some consistency and stop caring about this stuff.