I rewrote the main trig functions in JavaScript to help with cross-browser determinism. They run about as quickly as the native math routines, trading accuracy down to around 10 significant digits.<p><a href="https://github.com/strainer/trigfills" rel="nofollow">https://github.com/strainer/trigfills</a>
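A minimal sketch of the general idea (not the trigfills implementation): `Math.sin` is only implementation-approximated by the spec, so different engines can return different bits, but the basic IEEE-754 operations (+, -, *, /) and `Math.round` are fully specified, so a polynomial built from them gives identical results everywhere. The function name and term count here are my own choices for illustration.

```javascript
// Deterministic sine: uses only exactly-specified double operations,
// so every JS engine produces bit-identical results.
function detSin(x) {
  // Crude range reduction to [-pi, pi]; fine for modest |x|,
  // loses accuracy for very large arguments.
  const TWO_PI = 6.283185307179586;
  x = x - Math.round(x / TWO_PI) * TWO_PI;
  // Taylor series up to x^19: roughly 10 significant digits
  // at the edges of the reduced range, better near zero.
  const x2 = x * x;
  let term = x, sum = x;
  for (let n = 1; n <= 9; n++) {
    term *= -x2 / ((2 * n) * (2 * n + 1));
    sum += term;
  }
  return sum;
}
```

A production version would use a minimax polynomial rather than a Taylor series to squeeze more accuracy out of the same number of terms.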
Excellent article. It's so easy to think "same inputs = same outputs", but that's such a simplification. There's so much more in between!
Anyone interested in this sort of thing would probably also be interested in John Gustafson's work on unums/posits. Here's a nice introductory presentation: <a href="https://www.youtube.com/watch?v=aP0Y1uAA-2Y" rel="nofollow">https://www.youtube.com/watch?v=aP0Y1uAA-2Y</a>
I’m glad it pointed to the article on the terrible accuracy of the x87 trig functions - the OS X (and, I assume, GCC) software implementations are ostensibly accurate to the last bit, which I think is done by using witchcraft. (More precisely, I think they do the argument reduction with an effectively infinite-precision pi?)
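To illustrate why extra digits of pi matter (real libms use Payne-Hanek reduction over the bits of 2/pi; this BigInt version is just my own sketch of the same idea): reducing a large x modulo 2*pi in plain doubles cancels nearly all significant bits, because the ~1e-16 error in the stored 2*pi gets multiplied by quadrillions of periods.

```javascript
// 2*pi carried to 40 decimal places as a scaled BigInt.
const SCALE = 10n ** 40n;
const TWO_PI_BIG = 62831853071795864769252867665590057683943n;

// Reduce an integer argument modulo 2*pi with ~40 digits of precision,
// then convert back to a double in [0, 2*pi).
function reduceMod2Pi(xInt) {
  const r = (BigInt(xInt) * SCALE) % TWO_PI_BIG; // exact remainder
  return Number(r) / 1e40;
}

// Naive double-precision reduction for x = 1e16: x / (2*pi) is about
// 1.6e15 periods, so the rounding error in the stored 2*pi alone
// swamps the reduced angle, and the result is essentially noise.
const x = 1e16;
const naive = x - Math.round(x / (2 * Math.PI)) * (2 * Math.PI);
console.log(Math.sin(naive), Math.sin(reduceMod2Pi(x)));
```

The naive and high-precision reductions give wildly different angles for large x, which is exactly the failure mode the x87 hardware instructions suffer from (they only carry 66 bits of pi).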