I have a soft spot for side-channel attacks; they are often a beautiful example of out-of-the-box thinking. This whitepaper is no exception, in particular the second part about (ab)using SVG filters.<p>One thing I noticed (it doesn't help much in mitigating this attack, of course): they calculate <i>average</i> rendering times over several repeats of the same operation. When profiling performance timings, it's usually much more accurate to take the <i>minimum</i> execution time. The constant timing that you want to measure is part of the lower bound on the total time; any random OS process or timing glitch is going to <i>add</i> to that total, but it will not somehow make the timespan you are interested in randomly run faster. There might be some exceptions to this, though, in which case I'd go for a truncated median or a percentile average or something (see the first sketch at the end of this comment).<p>I also had some ideas to improve performance of the pixel-stealing, as well as the OCR-style character reading. For the latter, one could use Bayesian probabilities instead of a strict decision tree; that way it's more resilient to occasional timing errors, so you don't need to repeat as often to ensure that <i>every</i> pixel is correct: just keep reading out high-entropy pixels and adjusting the probabilities until there is sufficient "belief" in a particular outcome (second sketch below).<p>But as I understand from the concluding paragraphs of this paper, these vulnerabilities are already patched or very much on the way to being patched; otherwise I'd love to have a play with this :)
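<p>A minimal sketch (TypeScript) of the minimum-of-N timing idea above; <i>renderOnce</i> is a hypothetical stand-in for whatever operation is being timed (forcing a repaint through the SVG filter chain, say), not anything from the paper:

    // Estimate the constant cost of an operation by taking the minimum
    // over repeats rather than the mean: OS scheduling, GC pauses etc.
    // can only inflate a sample, never deflate it.
    function timeOnce(renderOnce: () => void): number {
      const start = performance.now();
      renderOnce();
      return performance.now() - start;
    }

    function minTiming(renderOnce: () => void, repeats = 50): number {
      let best = Infinity;
      for (let i = 0; i < repeats; i++) {
        best = Math.min(best, timeOnce(renderOnce));
      }
      return best;
    }

    // For the "exceptions": a low percentile is a robust drop-in for the raw min.
    function percentileTiming(renderOnce: () => void, repeats = 50, p = 0.1): number {
      const samples = Array.from({ length: repeats }, () => timeOnce(renderOnce));
      samples.sort((a, b) => a - b);
      return samples[Math.floor(p * (samples.length - 1))];
    }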
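<p>And a sketch of the Bayesian readout: rather than a decision tree that has to be right at every step, keep a per-pixel belief and update it with each noisy measurement. The error rate and the <i>samplePixel</i> oracle (returning the attacker's noisy guess for a single pixel) are made-up values/names for illustration:

    const EPSILON = 0.15;    // assumed per-measurement error rate
    const THRESHOLD = 0.999; // belief required before settling on an outcome

    function readPixel(samplePixel: () => boolean, maxSamples = 100): boolean {
      let pWhite = 0.5; // uninformative prior
      for (let i = 0; i < maxSamples; i++) {
        const observedWhite = samplePixel();
        // Likelihood of this observation under each hypothesis:
        const likeWhite = observedWhite ? 1 - EPSILON : EPSILON;
        const likeBlack = observedWhite ? EPSILON : 1 - EPSILON;
        // Bayes' rule:
        pWhite = (likeWhite * pWhite) /
                 (likeWhite * pWhite + likeBlack * (1 - pWhite));
        // Stop early once the belief is settled either way.
        if (pWhite > THRESHOLD || pWhite < 1 - THRESHOLD) break;
      }
      return pWhite > 0.5;
    }

<p>In a full readout you'd keep probing whichever pixels still sit near 0.5 (the high-entropy ones) and leave the settled ones alone; for the character reading, the same update works with a belief distribution over candidate glyphs instead of a binary pixel.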