Are you sure you want to encourage people to include automated "copy and paste" content in their everyday applications? ChatGPT and other large language models are literally just "autocomplete" on steroids. You can see this at work when you use the tool: each word typed by the AI is simply the <i>next most likely word</i> statistically.<p>This isn't rocket science; it isn't even English, really, because people who speak English generally understand the content of the sentences they write. ChatGPT does not. It selects the words and phrases that are most likely to pass for a "real result," based on what you typed, and then parrots them back to you.<p>Now imagine that you've included such a capability in your text editor, blogging tool, whatever. If people use it as designed, it will merely repeat what others have written (nearly verbatim) and echo parts of your prompt (in slightly different form) in a well-composed paragraph. That's generally known among writers as both plagiarism and "not doing your homework."<p>In the worst-case scenario, ChatGPT repeats information that is simply factually incorrect. Lacking any way to ascertain veracity, the robot pipes back words to you in grammatically correct sentences -- which are not and cannot be fact-checked automatically. This is a very dangerous platform from which to begin one's own writing. A responsible human will take the time to research and understand a topic on his or her own, then craft something which is useful for a specific audience (all of ChatGPT's responses are generic; it doesn't know anything about audiences).<p>Perhaps most importantly, ChatGPT is not capable of generating or revealing any <i>new</i> knowledge -- anything that hasn't already been written down and scanned. This means that its writing cannot contribute anything at all to the advancement of human knowledge. It cannot provide any new insights or discoveries.
It cannot draw conclusions or analyze the connections between things any more than a Xerox machine can.<p>I would say, "Don't mix ChatGPT with real work," for the same reason we don't mix cannabis edibles with regular candy: they look alike, but one is dangerous for children.
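<p>For anyone who doubts the "autocomplete on steroids" framing, here's a toy sketch of the idea: a bigram model that always emits the statistically most likely next word. (This is a deliberate oversimplification -- real LLMs condition on enormous contexts and sample from a learned distribution rather than counting word pairs -- but the core loop, "pick the likeliest next token and repeat," is the same.)

```python
from collections import Counter, defaultdict

# Tiny "training corpus" for the toy model.
corpus = (
    "the cat sat on the mat . "
    "the dog sat on the rug . "
    "the cat ate the fish ."
).split()

# Count, for each word, which words follow it and how often.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the statistically most likely next word, or None if unseen."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

def autocomplete(start, length=5):
    """Greedily chain predictions: each word is just the likeliest next one."""
    words = [start]
    for _ in range(length):
        nxt = predict_next(words[-1])
        if nxt is None:
            break
        words.append(nxt)
    return " ".join(words)

print(autocomplete("the"))  # -> "the cat sat on the cat"
```

Note that the output is fluent-looking but meaningless -- the model has no idea what a cat is, only which words tend to follow which. That's the whole objection in miniature.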