The final pithy quote, about how a prominent Linux developer claimed no CPU would ever do such a thing, loses some of its impact when you realize that a _more_ prominent Linux developer replied saying “I think, when it comes to speculative execution, our general expectation that CPUs don't do idiotic things got somewhat weakened in the past year or so ...”
To my knowledge none of these side channels, even the original ones from several years ago, have been exploited practically in the wild. IMHO gathering the amount of detail needed to attempt such an attack, as exemplified by the demos that have been given, would itself be prohibitively difficult. Thus the impact of yet another one remains negligible for a personal computer user, though of course the cloud providers would be super-paranoid about it.

It's worth noting that the memory protection scheme, introduced with the 286, was never intended to be a strong security barrier, but rather a means of isolating bugs and making them easier to debug.
That's a great article.

I do worry sometimes that something is off with CPU development: we're tending towards more and more complicated designs, with workflows that are very hard to analyse and simulate even for the designers themselves, yet actual per-core workload execution performance isn't shifting upwards all that much, and then weird mitigations have to be applied that reduce that performance in practice.

Something makes me think that perhaps a different design paradigm should prevail, with particular attention paid to segregating workloads and partitioning cores, perhaps abandoning hyperthreading, even to the extent of having 100% physical separation of cores and their caches.

But I'm very much not an expert in the field.

A little birdie inside me wakes up every now and then and whispers, 'Is it a coincidence that these design paradigms are yielding so many vulnerabilities?'
AMD is supposed to have (or does have) very good branch predictors. What's interesting to me is that they don't re-steer away from the wrong prediction well before later instructions are issued to the back end.

Maybe their BTB is good enough that they didn't see it as worth investing in the control logic for it.
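To make concrete why that re-steering window matters, here's a minimal C sketch of the classic bounds-check-bypass (Spectre v1) gadget shape. The array names, sizes, and the training loop are illustrative assumptions, and the timing/measurement side of a real proof of concept is omitted entirely; this only shows the window a late re-steer leaves open.

    #include <stdint.h>
    #include <stddef.h>

    #define ARRAY1_SIZE 16

    uint8_t array1[ARRAY1_SIZE];
    uint8_t array2[256 * 512];   /* probe array: one cache line per possible byte value */

    /* If the predictor guesses "in bounds" for a malicious x and the front end
       is not re-steered quickly, the two loads below can execute speculatively
       and leave a secret-dependent footprint in the cache, even though the
       architectural result is discarded once the branch resolves. */
    void victim(size_t x)
    {
        if (x < ARRAY1_SIZE) {
            uint8_t value = array1[x];                    /* possibly out-of-bounds, speculatively */
            volatile uint8_t sink = array2[value * 512];  /* secret-dependent cache access */
            (void)sink;
        }
    }

    int main(void)
    {
        /* Train the predictor with in-bounds indices, then call with a bad one.
           Nothing is measured here; this shows the gadget, not the attack. */
        for (size_t i = 0; i < 1000; i++)
            victim(i % ARRAY1_SIZE);
        victim(ARRAY1_SIZE + 100);
        return 0;
    }

The sooner the core re-steers after the branch resolves (or refuses to forward the speculative load result), the smaller that window is, which is why the re-steering behaviour is worth the control logic.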
Somehow the "Final remarks" section reminded me of the more paranoid "wheels within wheels" Frank Herbert fiction.<p>Not pleasantly, the FH fiction wasn't, either.
I went AMD for the first time with my new laptop, some 8-core Ryzen. Games crash all the time; I'm not sure if it's the architecture's fault. And I can't properly virtualize Windows 98 (which I like to run just for fun/nostalgia), apparently due to the architecture. I feel like I'm sticking with Intel from now on, like my old gut instinct said to.