There's an evaporative cooling effect. Ten years ago it was obvious that Python was going to have a very hard time in the multicore world. The people who needed that performance, and knew they needed it, left at some point in the intervening years; it had been clear for a while that Python was never going to produce a general solution to that problem, no matter what it did. Now those people are no longer in the Python community.

What's left are people who don't need that performance (sometimes that's me, and in those cases I'm still happy to use Python), and people who do need that performance but *don't know it*. The latter are the ones who get into trouble.

I do wish the Python developer community were more open about the fact that Python performance is really quite bad in many cases, and that while there are some tools to peck around the edges, it is still fundamentally a low-performance language, things like NumPy notwithstanding. NumPy is, ultimately, still a "peck around the edges" tool, even if it handles the particular edge it pecks at *extremely* well, and that only makes the performance delta worse when you fall out of its accelerated code path (rough sketch of that delta below). I feel like back in the 200Xs the community as a whole was more open about that. Now "Python is slow" is perceived by the community as an attack. Maybe that's because the people who did understand the issue are mostly gone.

But in the 2020s, yes, Python ought to be eliminated from many tasks people want to do today on straight-up performance grounds. Overcoming a ~40x disadvantage in the single-core era was sometimes tough, but often doable. Nowadays that gap roughly gets multiplied by the number of cores on the system (40x across, say, 8 or 16 cores puts you several hundred times behind), and overcoming a multi-hundred-factor disadvantage is just not worth your time *IF* that is a level of performance you actually need. Python is nice, but not nice enough to pay a multi-hundred-factor performance penalty for. Static languages have come a long way since 1995.
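To make the "falling off the accelerated path" point concrete, here's a rough micro-benchmark sketch: the same reduction done once through NumPy's vectorized path and once driven element-by-element by the interpreter. The array size is arbitrary and the exact ratio will vary by machine; this is illustrative, not a rigorous benchmark.

    # Rough sketch: NumPy's compiled fast path vs. the same work done
    # one element at a time in Python bytecode. Exact numbers vary by
    # machine; the point is the order-of-magnitude gap.
    import time
    import numpy as np

    data = np.random.rand(1_000_000)

    # Fast path: the loop runs in compiled C inside NumPy.
    start = time.perf_counter()
    vectorized = np.sum(data * data)
    print(f"vectorized:       {time.perf_counter() - start:.4f}s")

    # Off the fast path: the interpreter drives the loop, boxing one
    # scalar object per element.
    start = time.perf_counter()
    total = 0.0
    for x in data:
        total += x * x
    print(f"pure Python loop: {time.perf_counter() - start:.4f}s")

On typical hardware the second version comes out somewhere in the tens-to-hundreds-of-times-slower range, and that's the delta you eat the moment your inner loop can't be expressed as whole-array operations.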