The article is all over the place; I don't really know what it's trying to say. It starts by criticizing Jon Blow's talk about the "collapse of civilization because nobody is able to do low-level programming anymore" as a tirade against abstractions, but then presents dxn as a prime (flawed) example of this "effort to remove abstractions". I mean, dxn is clearly just the wrong abstraction for the hardware we have right now; it's actually the opposite of what most low-level programmers would do. So the dxn example actually supports what Jon was saying in the talk? It's just not a good example.

Also, the example about dynamic dispatch isn't as persuasive as the author might think. Even if everyone benefits from these abstractions, what's the point when the abstraction is fundamentally slow on the hardware in the first place, no matter how much you optimize it? Apple's engineers have done everything they can to optimize objc_msgSend() down to the assembly level, but you're still discouraged from using Obj-C virtual method calls in a tight loop because of the performance cost. And we know, in both principle and practice, that languages which rely heavily on dynamic polymorphism (like Smalltalk) tend to perform much worse than languages like C, (non-OOP) C++, or Rust, which usually don't. In those languages, when performance matters, devs often use the good ol' switch statement (with enums / tagged unions / ADTs) to express different kinds of behavior, since switches are easier for the compiler to inline and branch predictors handle them better than indirect virtual calls. (Or, to go even further, you can put different types of objects in separate contiguous arrays if you frequently iterate over and query those objects; see the sketch at the end of this comment.) The problem, I think, is that most programmers don't know they can make these design choices in the first place: they learned "use virtual polymorphism to model objects" in OOP classes as dogma they must always adhere to, when a switch statement would have been better in most cases (both in performance and in code readability/maintainability). Virtual calls may be a good abstraction sometimes, but there are usually several competing abstractions that are more performant (and, arguably, simpler).

The point Jon is trying to make (though maybe not clearly enough in the talk) is that we simply need better abstractions for the hardware we have. C/C++ doesn't cut it for him, so he's building his own abstractions from scratch by writing a new language. He has often said that he dislikes "big ideas programming": the belief that if every programmer subscribes to some core "idea" of programming, everything will magically get better. He instead opts for a more pragmatic approach: writing for the hardware and the design constraints we have right now. He may seem a bit grumpy from the perspective of people outside OS/compiler/game development (he also airs some personal developer grievances in the talk), but I think his sentiment makes sense in the big picture: we have churned out heaps of abstractions that have drifted too far from the actual workings of the hardware, to the point that desktop software has generally become too slow for the features it provides to users (looking at you, Electron...).
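To make the tagged-union-plus-switch point concrete, here's a minimal C++ sketch. Everything in it (the ShapeKind/Circle/Rect types, the area() function) is made up for illustration; it's not from the article or the talk:

    #include <cstdio>
    #include <vector>

    // Hypothetical types for illustration only.
    enum class ShapeKind { Circle, Rect };
    struct Circle { float r; };
    struct Rect   { float w, h; };

    struct Shape {
        ShapeKind kind;
        union {          // tagged union: 'kind' says which member is live
            Circle circle;
            Rect   rect;
        };
    };

    // Dispatch on a plain enum: the compiler can emit a jump table or a
    // few predictable branches, and each case is direct, inlinable code,
    // with no indirect call through a vtable pointer.
    float area(const Shape& s) {
        switch (s.kind) {
            case ShapeKind::Circle: return 3.14159f * s.circle.r * s.circle.r;
            case ShapeKind::Rect:   return s.rect.w * s.rect.h;
        }
        return 0.0f; // unreachable for valid tags
    }

    int main() {
        std::vector<Shape> shapes;

        Shape c{}; c.kind = ShapeKind::Circle; c.circle = Circle{2.0f};
        Shape r{}; r.kind = ShapeKind::Rect;   r.rect   = Rect{3.0f, 4.0f};
        shapes.push_back(c);
        shapes.push_back(r);

        float total = 0.0f;
        for (const Shape& s : shapes)
            total += area(s); // hot loop: contiguous data, no virtual calls
        std::printf("total area = %f\n", total);
    }

And the separate-arrays variant just goes one step further: keep a std::vector<Circle> and a std::vector<Rect>, drop the tag entirely, and every hot loop runs over homogeneous, cache-friendly data with no dispatch at all.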