It depends on what you're developing and possibly for whom, but I think in general not many people will care whether you used AI or any other tool.<p>Nobody bats an eye when you use IntelliSense or another form of autocomplete, or a plugin with code snippets. I don't worry about the implications of using StackOverflow, or of using plugins to do a bit of codegen for me (e.g. ORM mappings for my app from a live database schema), nor would I worry about getting an informal suggestion from a friend on how to write a particular method.<p>You'd have to be in a pretty corporate environment for AI or any of that to start mattering.
Perhaps nobody needs to own the code.<p>Code should support human values, and it's not clear that human values are best served by private code ownership. AI-generated code might provide the most value as a publicly owned good.<p>That is why I write all my code under the AGPL, and why I think all AI-assisted code should automatically be open source.
Interesting question. If the Google v. Oracle case had played out today, and Google had been able to prove that an AI created an API that looked incredibly similar to Java's, would that have stood up as a defence? If code isn't owned by the developer or company that wrote it, then presumably someone else can't sue them for copyright infringement.<p>If that's true, then there'll be no effective way to ever sue a software developer over the code they wrote (or generated, or prompted, or whatever the term will be).
Well, it depends. It could be you, your company, a customer, a friend, a SaaS company that asked you to sign a CLA to contribute to their "open-source" project…
I think we should think of AI less as a person helping us and more as a search engine that deeply understands what we want right now. And if anyone owns what the AIs regurgitate, it is the authors of the training data.
If you use paint you bought at the store, who really owns the finished painting?<p>If you use a saw to cut wood, who owns the wood you cut?<p>For now, "AI" is a tool. Maybe that will change in the future when AI is indistinguishable from people and has rights and privileges, but for now it's just a tool, and tools do not transfer ownership.
If copyrightable creative source code is the toolchain precursor to an executable binary, can a creative LLM prompt be copyrighted as a toolchain precursor to non-creative source code? Should LLM prompts be versioned alongside generated source code?
I think the issue here is that it's really clear who owns the <i>liability</i>. So if your employees are coding "with AI", you need to understand that they don't know enough to know when they are putting you at risk.
If code needs an owner for legal purposes, then it should probably be the person or persons who commissioned the code, unless ownership is transferred to someone else by agreement.
The finished product is owned by whoever published it (first). Failing that, a special license for AI (mostly to differentiate spam) could be created for autonomously created/published code.
Well, of course it's the one giving the prompts. It's no different from when your boss takes all the credit because a series of his prompts led to a solution you wrote.
When you use a spellchecker to check a text, who owns the text?<p>When you use a style guide, citation book, or encyclopedia to write a text, who owns the text?<p>I'm sorry, but <i>obviously</i> if you produce something, whether with help from a tool or not, <i>you</i> are responsible.
When AI helps you code, who owns the finished product?<p>When a hammer helps you build, who owns the finished product?<p>... in other news, we've just learned that paintbrushes now have a say in art ownership disputes.
Who owns the code? I guess you should never really admit that an AI coding assistant helped you during coding, and there you go… problem solved.<p>Edit: It's actually harder than this; the best option would be to roll your own AI coding assistant, either on-prem or in your private cloud.