Absolutely not. If you work for a company that's even semi-serious about security and has to be audited, you could be fired. Honestly, the fact that you're asking this means your company has utterly failed at basic security training.

Don't send your code to any cloud you don't own, and especially don't send it to a service that's explicit about using your submissions in new training sets.
I think the interesting question is "What are the actual risks of pasting my code into ChatGPT?"

If you work for a company of any significant size, the answer from the legal team / outside counsel is almost certainly going to be "Do not share IP". While this is probably the correct answer, these teams are trained to be risk averse.

However, say you are a small startup or a solo dev. I am not certain there are significant risks beyond your code possibly showing up in future revisions of the model. In that scenario, one would think the risk of that code possibly being made public could be evaluated on a case-by-case basis.

But possibly there are other risks?
My company has a very strict no-GPT/no-Copilot policy. Check with your legal/security/compliance teams before sending your codebase to another company.
Not unless you want a promotion to customer. At my work, if you send code from a BU to GPT or any AI, they will more than likely vacuum-seal you in a test tube.