Open source has enabled developers to learn new patterns, build companies almost for free, and has had a hugely positive impact on the profession.<p>Recently we have seen generative AI such as ChatGPT, trained on open source code, become remarkably good at programming. This has made many of us worried about losing our jobs, or seeing them devalued. Should open source allow AI companies, which plan to make billions from this work, to train on open source code?
<a href="https://opensource.org/osd/" rel="nofollow">https://opensource.org/osd/</a><p><pre><code> 6. No Discrimination Against Fields of Endeavor
The license must not restrict anyone from making use of the program in a specific field of endeavor. For example, it may not restrict the program from being used in a business, or from being used for genetic research.
...
10. License Must Be Technology-Neutral
No provision of the license may be predicated on any individual technology or style of interface.
</code></pre>
I'm not sure how one can claim to be open source and <i>not</i> allow the code to be used for AI training.<p>There are certainly license / fair use / derived-work issues to be resolved... but for open source software, "allow AI training" probably has to be a "yes."
I understand the concern and think it can even be broadened. Generative AI also threatens the pay and livelihood of graphic designers, artists, journalists, and copywriters -- people who tend to be far more poorly paid than tech workers already. This application of generative AI to code-writing is really just automation coming home to roost. The question is, why should software engineers be some special class protected from the dangers of automation?<p>I guess my point is that there has to be some consistent standard here. Software engineers have no greater claim to their public-facing work than writers and artists do, and are paid far more. If there's an argument to exclude open-source code from training sets, there should also be an argument to exclude essays, journalism, digital drawings, and music from those training sets.