That would violate OpenAI's terms of service, which require users to disclose that content was written by GPT-3.

But let's say they're quietly breaking the rules.

The main giveaway is that GPT-3 is bad at long-form writing. My favorite example is asking it to write a story about the man with the golden arm. It wrote a story about a man with a physical arm made of gold, who had enemies that wanted to steal it. So the man went into hiding and became a baseball player. It then wrote stories about the man's baseball career.

That's an AI cliche. A human cliche might be that the man was part of a secret experiment or something. It can be reined in with a human co-writer, but checking for these problems often takes more time than someone auto-generating articles is willing to spend. Someone who claims an article was written with GPT-3 probably spent a lot of time re-rolling the output.

GPT-3 can mimic structure like chapters and poetry, but it doesn't understand the rules. It can't write a real sonnet because it doesn't understand syllables and rhyme schemes, though it can produce something that looks like a sonnet. It knows the typical length and relative content of a paragraph, page, or chapter, and it knows a book has an opening and a closing chapter. But it doesn't really understand what these are for.
Why don't you just read them (or skim them to start, or whatever) and see if they provide you with something of value? What is your concern? If GPT somehow did write a good article, why would you care? (Spoiler alert: it won't, but neither will most of the people who write and post blogs.)