Take Stack Overflow as an example: we've all seen how good ChatGPT is at explaining programming concepts and generating novel snippets of code, replacing the need to go to Stack Overflow for many questions. This is because ChatGPT benefits from the entire historical corpus of Stack Overflow answers, along with information from the broader web. The incentive for Stack Overflow to exist and continuously improve is that people will visit seeking answers and click on ads, paid job postings, etc. If we are getting our answers from LLMs, how does the content ecosystem survive?
You say it yourself: it is trained on Stack, which advances every day with fresh information, while the transformer is trained once and stays the same until there is a new version.

If it is better and more polite than Stack, that would be great, but how would it ever progress beyond it?

Will we get pages where people discuss how it got stuff wrong?

Will Stack improve a lot on the questions that the transformer can't answer satisfactorily?
Wouldn't that be really great? It would move content creation to the freshest areas of information and make it even more valuable and insanely specific.

Something truly feels off with this; it is some sort of contradiction. I don't see it replacing its own training data or improving upon it.

Let's wait and see in a few years. I guess Stack will be fine and the transformers will find their place as mediocre tutors and secretaries.