We've built up layers and layers and layers of inefficiencies in the entire OS and software stack since the gigahertz wars took us from 66 MHz to multiple GHz in the 90s.

The software industry is awful at conserving code and approaches: every five years (or less, in JavaScript's case) there's a total redo of programming languages and frameworks.

That churn also means end-to-end optimization from hardware all the way up to program execution never happens. Instead we plow through layer upon layer of both conceptual abstraction and actual software execution barriers.

Also, why the hell aren't standard libraries more ... standardized? I get that languages differ in mechanics and syntax, but a standardized library set could be optimized behind the interface repeatedly, tuned at the hardware/software boundary, and so on (see the sketch at the end of this comment).

Why have Ruby, Python, JavaScript, C#, Java, Rust, C++, etc., etc. not evolved toward an efficient common underpinning and design? Linux, Windows, Android, and iOS would need to converge on this too. It would mean less wasted space in memory, less wasted OS complexity, less wasted app complexity and size. I guess ARM/Intel/AMD would also need to get in the game to optimize down to the chip level.

Maybe that's what he means by "DSLs", but to me DSLs are an order of magnitude more complex in infrastructure and coordination if we're talking about dedicated hardware for dedicated processing tasks while still keeping general-purpose capability. DSLs just seem to constrain too much freedom.
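To make the standard-library point concrete, here's a minimal sketch of the idea: one hot primitive behind a stable C ABI that every runtime could bind to, so optimization (SIMD, chip-specific tuning) happens in one place instead of once per language. The library name "libprim" and the function are hypothetical, not any real project:

    /* libprim.c -- hypothetical shared-primitive library.
       One optimized implementation behind a stable C ABI that
       Ruby, Python, JS, etc. runtimes could all bind to. */
    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Count occurrences of a byte in a buffer -- the kind of hot
       primitive (memchr, UTF-8 validation, hashing) that today gets
       reimplemented and re-optimized separately in every runtime. */
    size_t prim_count_byte(const uint8_t *buf, size_t len, uint8_t needle) {
        size_t n = 0;
        for (size_t i = 0; i < len; i++)  /* one place to add SIMD, not ten */
            n += (buf[i] == needle);
        return n;
    }

    int main(void) {
        const uint8_t msg[] = "hello, world";
        printf("%zu\n", prim_count_byte(msg, sizeof msg - 1, 'l'));  /* prints 3 */
        return 0;
    }

Every language already does something like this for libc and zlib; the argument above is just that the shared, optimize-once surface should be much larger.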