
How should we define ‘open’ AI?

24 points, by MilnerRoute, about 1 year ago

10 comments

enriquto, about 1 year ago
There's nothing special in "AI". Open AI is just like all open source/free software: publicly available complete training data, public weights, public training source code so that the weights can be replicated exactly, public inference code so that the weights can be used. All of this under reasonable free software licenses (i.e. FSF/OSI-approved).

Conceptually, the training data should be considered part of the source code. The weights are provided for practical purposes, because they are difficult to "compile".
jraph, about 1 year ago
> The term "open" has no agreed-upon definition in the context of AI

I'm pretty sure "open" is not clear because those big corporations decided to blur its definition. They decided that "open" sounds good and used the term liberally. They could have built a strong definition, since they use the term, but they didn't, because it's just marketing for them.

Facebook is especially guilty of using "open source" to describe something whose license restricts the number of users, however big that number is. With all the brilliant people and lawyers they must have, it's impossible they didn't do this on purpose.
Eager, about 1 year ago
Open weights is one thing, but we don't even have that with OpenAI, at least.

Even then, open weights is like me checking in a .exe and acting surprised if people look at me funny.

I'm definitely in the camp where all the artefacts are provided along with a fully reproducible build and test environment for anyone who wants to retrace the steps.

Whatever 'open' means, I don't think it is eight shell companies, not even weights provided, and closely guarded secrets around how RLHF, alignment and safety testing are carried out.

In fact, you would think that being 'open' about at least alignment and safety testing procedures would be the least one could expect.

I do understand that revealing these things may disclose zero-day exploits for bad actors, but on the other hand, being open for inspection is how things get fixed, and I've never been a fan of security through obscurity.
benreesman, about 1 year ago
It pretty much comes down to two concepts that are easily common sense and will certainly be defined rigorously at some point:

Open AI must be “available weight”: the technical public defied the powers that be over mp3 files and HDMI cables and won. This stuff is going to get hacked, leaked, torrented, and distributed, full stop, until someone brokers a mutually acceptable compromise like Jobs did. Whatever your position on the legality or morality of this, it’s happening. How much does someone want to prop bet on this?

Open AI must be “operator-aligned”: there exist laws on the books, today, for causing harm to others via computers, that many argue are already draconian. Within the constraints of the law as legislated by Congress, ruled upon by the judiciary, and handled at the utmost, unambiguous emergency by the executive apparatus, the agent must comply with the directives of the operator, bounded only by the agent’s capability and the operator’s budget.

The legal and regulatory framework will take years. We can start applying common sense now.
stale2002, about 1 year ago
IMO this debate about what "open" means itself obfuscates the issue significantly.

This is because if you just say "Well, technically Llama 2 doesn't fit the traditional definition of open source!", it implies that there is some sort of significant caveat or difference that makes it significantly more restrictive than other open source projects.

This, of course, isn't true. Almost everyone can use Llama 2 for almost whatever they want. Yes, there are some restrictions, but the restrictions are so small that making a big deal over them incorrectly implies that there is some huge restriction, when there isn't.
bee_rider, about 1 year ago
IMO it shouldn’t be called open unless the thing being shared is human-understandable. Like open source programs: you get the source code, which you can inspect, and figure out if you trust it. This ability to inspect (and modify) is what matters about open source.

When I look at ML weights, I don’t understand them; they just look like some random matrix to me. I think we need to have access to the training set and a description of the steps in the training process (like a makefile).

If you want to share inscrutable weights after processing, call it what it is: Shareware. Shareware was great! But it isn’t open.
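The "makefile for the training process" idea above can be made concrete with a minimal, purely hypothetical sketch. Every file name, script, and flag here is invented for illustration (not taken from any real project); the point is only that weights become a build artifact derived from public inputs by a recorded, seeded process.

```makefile
# Hypothetical reproducible-training makefile. All targets, scripts,
# and flags are illustrative assumptions, not a real project's layout.

# Fetch the (published) training corpus.
data/corpus.txt:
	./scripts/fetch_training_data.sh > $@

# Train deterministically: fixed seed, versioned config.
weights.bin: data/corpus.txt train.py config.yaml
	python train.py --data data/corpus.txt --config config.yaml \
	    --seed 42 --out $@

# Reproducibility check: retraining from the same inputs should
# yield identical weights.
verify: weights.bin
	python train.py --data data/corpus.txt --config config.yaml \
	    --seed 42 --out rebuilt.bin
	cmp weights.bin rebuilt.bin
```

In practice, bit-exact reproduction is hard (GPU nondeterminism, library versions), so a real `verify` step might compare weights within a tolerance instead, but the principle is the same: if the recipe and inputs are public, anyone can rebuild and audit the artifact.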
throwaway13337, about 1 year ago
The purity of good ideas always gets co-opted by cynical actors wearing the clothes of the ideals without having the ideals themselves. At the core, all these good ideas have in them a spirit of cooperation and trust. The trust is eroded over time to exploit the cooperation inherent in it while not incurring the cost of participation.

At that point, the words lose their meaning. You can see this worldwide with "democratic republic of" or, in our industry, "agile". Whatever meaning they once had is gone and will not return.

In order to avoid this problem, you need to either use the entire expanded definition to be precise or create a new word that is associated only with your community.

By expanding the definition of a shorthand like "open", you never really achieve much, because culture lives in the shorthand. Only true believer types like Stallman will insist on it for any length of time.

Therefore, whatever comes next in the non-cynical world of software would have to come from a new movement with a new vocabulary. The new values always rhyme with the old but are expressed differently, to more directly disarm the new cynical malignancies that have killed the old good ideas.

The struggle is a forever arms race against parasitic participants in the global iterative prisoner's dilemma we're all playing.

It's not about what is called "open". New words with new community values need to replace it.

Open is dead.
tomrod, about 1 year ago
Open: open weights

Reproducible: training and testing data sources, validation seeds, and production endpoints available
ByQuyzzy, about 1 year ago
Well, it's not open source, it's not open to the public, they're not open with what they're doing or what their goals are. It's just a word like wuzzle or fibblefobble. Or google.
rnd0, about 1 year ago
A better question is WHO should define open AI. My answer to that would be: anyone but the people trying to establish a moat around AI.