This is over 4 months old and has already been patched in Python. It was discussed on HN at the time: https://news.ycombinator.com/item?id=33281106
I didn't read the whole paper, but how can this even happen? It seems like the buffer overflow would be triggered by any file larger than 4 GiB, which I assume someone has tested in the 8 years since it was released.
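As far as I can tell from the advisory, it isn't "any file over 4 GiB". The length is truncated to an unsigned int only on the partial-block path, and the later bounds check is also done in 32-bit arithmetic, so it only wraps when some bytes are already queued from a previous update() and the next call passes close to 2^32 bytes. A single hash of one large file goes down the full-block path and never hits it, which would explain why large-file tests didn't catch it. Here is a rough C simulation of just the length arithmetic (the function, constants and simplifications are mine, not the real XKCP code):

    #include <stdio.h>
    #include <stdint.h>

    /* Rough simulation of the length arithmetic in the absorb loop,
       based on my reading of the code in the paper; names and
       simplifications are mine, this is not the real XKCP code. */
    static void simulate_update(unsigned int byteIOIndex,
                                uint64_t dataByteLen,
                                unsigned int rateInBytes)
    {
        uint64_t i = 0;
        if (byteIOIndex == 0 && dataByteLen >= i + rateInBytes) {
            /* Full blocks are absorbed directly; the remainder that
               eventually reaches the queue is < rateInBytes, so a
               single huge update() is harmless. */
            printf("update of %llu bytes, queue empty: full-block path, fine\n",
                   (unsigned long long)dataByteLen);
            return;
        }
        /* The queued/partial-block path, as in the quoted line: */
        unsigned int partialBlock = (unsigned int)(dataByteLen - i);
        /* This check is done in 32-bit unsigned arithmetic and can wrap: */
        if (partialBlock + byteIOIndex > rateInBytes)
            partialBlock = rateInBytes - byteIOIndex;
        printf("update of %llu bytes with %u queued: memcpy of %u bytes "
               "into a %u-byte rate buffer%s\n",
               (unsigned long long)dataByteLen, byteIOIndex,
               partialBlock, rateInBytes,
               partialBlock > rateInBytes ? "  <-- overflow" : "");
    }

    int main(void)
    {
        unsigned int rate = 144;  /* SHA3-224 rate in bytes */
        /* One 5 GiB update into a fresh state: no problem. */
        simulate_update(0, 5ULL << 30, rate);
        /* 1 byte already queued, then an update of 2^32 - 1 bytes:
           partialBlock + byteIOIndex wraps to 0, the clamp is skipped,
           and ~4 GiB would be memcpy'd into a 144-byte buffer. */
        simulate_update(1, 4294967295ULL, rate);
        return 0;
    }

If I remember the published proof of concept correctly, it has exactly that shape: a 1-byte update followed by a 4294967295-byte one.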
To clarify, this only affects EdDSA insofar as implementations use SHA-3 to hash the message before applying the signature. The actual elliptic-curve code seems to be fine.
> partialBlock = (unsigned int)(dataByteLen - i);

The paper makes no mention of compiler warnings… but shouldn't this cast trigger a compiler warning?
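Generally no, and that may be part of the problem: GCC and Clang treat an explicit cast as a statement of intent and stay quiet about it, and the warning that would flag the implicit version (-Wconversion) isn't enabled by -Wall or -Wextra anyway. A tiny illustration of the pattern (not the XKCP code):

    #include <stdint.h>

    /* Warns with gcc/clang -Wconversion: implicit narrowing may change the value. */
    unsigned int narrow_implicit(uint64_t len) { return len; }

    /* Silent even with -Wconversion: the explicit cast is taken as intentional,
       which is exactly the pattern in the quoted line. */
    unsigned int narrow_explicit(uint64_t len) { return (unsigned int)len; }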
I wonder if this could be avoided by writing the canonical implementations in Rust, or better yet in some system with formal verification.

This is such a critical part of the software stack that we need a more reliable way to validate it than a bunch of people staring at code written in C.
I find the recent polynonce attack much worse: https://news.ycombinator.com/item?id=35048431
Is this due to stupidity or malice?

I just can't get my head around the idea that software written and reviewed by experts and submitted to the "National Institute of Standards and Technology", with a budget of a billion dollars, can fuck up this way.

I'm no mathematician, but I would have thought that implementing pure number-crunching code is not rocket science.

Buffer overflow, overwrite memory, run arbitrary code, seriously? LOL, WTF.