Fragmentation yields a distribution of effort, which yields theoretically inferior products compared to the theoretical output of all the individual parts working towards the same goal. (Debatable, but we'll assume it.)

Centralization yields a lack of competition, which yields a lack of drive to improve inferior products, which yields stagnation and disenfranchisement. (Also debatable, but probably safe to assume here as well.)

Startup costs in this area (basic compiler stuff, usually handled by LLVM and other varied metacompiler frameworks nowadays) become vanishingly small with time, while the long tail grows ever larger (libraries, tooling, ecosystem goodness, all are expected now).

Given these observations, when a language becomes popular despite starting in a fragmented ecosystem, it slowly grows, starts to take advantage of network effects, and gains the benefits of growing centralization as it becomes "the de facto choice" in some area (usually at the cost of other players in the space, be they large or small). Eventually it stagnates, parts of the community become disenfranchised and go off to spawn their own languages and variants (taking ideas from their origin, along with their grievances and ideas from other areas, with them), and the process begins anew. But because of the ever-growing long tail, each time this cycle takes place, the time between expansion and explosion grows longer.

At least, that's how I've been led to believe systems like these tend to work.

The call to be _mindful_ of what you're doing when you make a language seems sensible: if you contribute to the cycle, you're contributing to what will eventually be 2 years of long-tail work for what will be the de facto norm for developer tooling 40 years from now. But that's a terrible way to look at it. Wouldn't you rather get in closer to the ground floor and be part of the first iterations that set the standards for the iterations that come after? Isn't thinking of it any other way just being defeatist? (Since it amounts to concluding that future generations of developers must be better at this than you, and so are worth more early-cycle-iteration time.) So what if it's indulging the ego some; if that's the cost of improvement, so be it. The languages we have today wouldn't have been made without their predecessors, and the languages of tomorrow won't exist without the ones we have today. Isn't it equally possible that deciding _not_ to make some new language delays the progress of the programming language field as a whole? Is making that value judgement within the purview of anything other than hindsight? I can't begin to imagine in what ways future languages may improve life as a developer, but if they're anything like the stark contrasts we've seen recently in certain PL areas (i.e., systems languages), it's sure to be exciting.