Jailbreaking(I hate the term but anyways) GPT is not that hard with API. You just need to write say few migaligned output from GPT. Their API doesn't check if the GPT response in history actually came from GPT.
I don't understand why people still spend time on jailbreaks of the proprietary models, while they can easily use uncensored open-source models these days. I feel like its kind of waste of time.