There are some salty comments here, but I think the context is important. This paper passed across my desk in early 2008 when I was doing HFT stuff. It might be a bit of a stretch to say that the reason people are taught about cache lines in most CS programs is because of this paper, but at the time it was written, this was really specialized knowledge and groundbreaking to most software developers. This would go on to be a popular topic on C++ blogs from Important People (Boost maintainers, STL devs, etc.) for at least the next 5 years.<p>Also, if you know Ulrich Drepper at all, either from some of his talks or his mailing list presence, this is just a very fitting title from him. Pure deadpan: you think it's funny, he probably does not, and the fact that you find it amusing is just disappointing him, like a professor looking out at freshman undergrads wondering how he got stuck teaching this class.
I wish Ulrich Drepper (thank you, Mr. Drepper) would update this with a section on <i>Row Hammer</i> and also <i>Spectre</i> and <i>Meltdown</i>. Programmers need to know about memory <i>because</i> of these exploits, more so with the latter two in order to avoid creating exploitable gadgets.<p>But then I also think that <i>What Every Computer Scientist Should Know About Floating-Point Arithmetic</i> should be updated to include UNUMs. I don't think that will happen either. Also, thank you Mr. Goldberg.
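For anyone who hasn't seen what a "gadget" actually looks like, here's a minimal sketch of the classic Spectre v1 bounds-check-bypass shape (the array names and sizes are placeholders I made up for illustration, not anything from Drepper's paper):

    #include <cstddef>
    #include <cstdint>

    // Hypothetical globals standing in for victim data.
    size_t  array1_size = 16;
    uint8_t array1[160];          // attacker supplies an x past the end of this
    uint8_t array2[256 * 4096];   // probe array: one cache line per possible byte value

    // Spectre v1 "bounds check bypass" gadget shape.
    // If the branch predictor speculates past the check, the CPU may read
    // array1[x] out of bounds and touch a secret-dependent line of array2,
    // which an attacker can later recover by timing cache accesses.
    uint8_t victim_function(size_t x) {
        if (x < array1_size) {
            return array2[array1[x] * 4096];
        }
        return 0;
    }

The fix isn't exotic, but you only know to reach for things like index masking or speculation barriers if you understand what the cache is leaking in the first place, which is the point of the comment above.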
Is this title and content supposed to be ironic?<p>I quickly skimmed the article and I think this link should be renamed "What 99.9% of programmers don't need to know about memory."<p>I've managed to go from Associate to Principal without knowing 99% of what's covered in this document, and I'm struggling to understand why the average Java, C#, Python, Rust, <insert language here> programmer would need to know about transistor configurations or voltages, pin configurations, etc. Let alone 114 pages of low-level hardware diagrams and jargon!<p>This document is for someone working on low-level drivers for memory, or working on the hardware side. For any normal software engineer, this information is not helpful for doing their job.
I was so excited to dive into this, but ended up with the same takeaway as most other commenters. Aside: as a data scientist, I've been surprised how much I've needed to learn about the finer points of optimizing GPU utilization for training.<p>It has all been from more experienced coworkers, and I would much appreciate any resources anybody could point me to (free or paid) so that I could round out my knowledge.
For an accessible talk about the real-world implications of this, I enjoy watching Mike Acton's CppCon talk "Data-Oriented Design and C++": <a href="https://www.youtube.com/watch?v=rX0ItVEVjHc" rel="nofollow">https://www.youtube.com/watch?v=rX0ItVEVjHc</a>
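The core idea of that talk, very roughly, is to lay data out the way you actually iterate over it, so that every cache line you pull in is full of bytes you use. A toy sketch of the usual array-of-structs vs struct-of-arrays comparison (types and field names invented for illustration, not taken from the talk):

    #include <cstddef>
    #include <vector>

    // Array-of-structs: the position update drags mass, radius, etc. through
    // the cache even though the hot loop never touches them.
    struct Particle {
        float x, y, z;
        float vx, vy, vz;
        float mass, radius;   // cold data riding along in every cache line
    };

    void update_aos(std::vector<Particle>& ps, float dt) {
        for (auto& p : ps) {
            p.x += p.vx * dt;
            p.y += p.vy * dt;
            p.z += p.vz * dt;
        }
    }

    // Struct-of-arrays: the hot loop streams through densely packed floats,
    // so nearly every byte fetched per cache line is actually used.
    struct Particles {
        std::vector<float> x, y, z;
        std::vector<float> vx, vy, vz;
        std::vector<float> mass, radius;
    };

    void update_soa(Particles& ps, float dt) {
        for (std::size_t i = 0; i < ps.x.size(); ++i) {
            ps.x[i] += ps.vx[i] * dt;
            ps.y[i] += ps.vy[i] * dt;
            ps.z[i] += ps.vz[i] * dt;
        }
    }

Acton's talk walks through why the second layout tends to win on real hardware, which is basically a practical retelling of the cache chapters of Drepper's paper.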