Younger programmers may not appreciate how big a revolution the quantitative approach to computer design was. It now seems like an obvious idea: since all machines are Turing-complete, instruction sets and architecture should be optimized for performance across some representative workloads. Since the instruction set is one of the variables, the tasks must be specified in a high-level language, and the compiler becomes part of the system being optimized.

Before that, instruction sets were driven more by aesthetics and marketing than by performance. That sold chips in a world where people wrote assembly code -- instructions were added like language features. Thus instructions like REPNZ SCAS (i.e., strlen), which were sweet if you were writing string-handling code in assembler.

H&P must have been in the queue for the Turing Award since the mid-90s. There seems to be a long backlog.
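For anyone who hasn't seen that instruction in action, here is a minimal sketch of strlen built on REPNE SCASB (REPNZ and REPNE are the same prefix). It assumes x86-64 with GCC/Clang inline assembly, and the helper name strlen_scasb is made up for illustration:

```c
#include <stddef.h>
#include <stdio.h>

/* Illustrative only: strlen via the REPNE SCASB string instruction.
 * Assumes x86-64 and the SysV ABI convention that DF is clear. */
static size_t strlen_scasb(const char *s)
{
    size_t count = (size_t)-1;      /* scan "forever" until the NUL is found */
    const char *p = s;
    __asm__ volatile (
        "repne scasb"               /* compare AL (0) with [rdi], rdi++, rcx--, until match */
        : "+D"(p), "+c"(count)      /* rdi = string pointer, rcx = remaining count */
        : "a"(0)                    /* al = byte to scan for (the NUL terminator) */
        : "cc", "memory"
    );
    return (size_t)-1 - count - 1;  /* bytes scanned, minus the NUL itself */
}

int main(void)
{
    printf("%zu\n", strlen_scasb("hello"));   /* prints 5 */
    return 0;
}
```

One loop of the C version becomes a single instruction -- exactly the kind of feature that looked great to assembly programmers but that the quantitative approach later showed rarely paid off against compiled code on simpler pipelines.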
Who else studied with their book "Computer Architecture", and what are your thoughts on it?

I enjoyed that it was a simpler read than a lot of the circuits-type books that are part of an EE/CE curriculum, but I always felt there was this lack of "hard science"/physics in the book.

Perhaps that was just not a topic they felt fit the vision of what the book is supposed to be, and it was likely the better decision to abstract that part away for readability.
Two years ago, David Patterson interviewed John Hennessy for Communications of the ACM. They discuss the changing job landscape, MOOCs, and the future of education.

https://doi.org/10.1145/2880222
Really happy about this! Their book "Computer Architecture" (along with Tanenbaum's book) played a huge role in my development as an engineer.