Today's pedantic minute is sponsored by Carl Friedrich Gauss.<p>The FFT algorithm was <i>re</i>discovered in the 1960s (by Cooley and Tukey). Gauss had discovered it in the early 19th century while interpolating asteroid orbits, before Fourier described his transform. He wrote it up in Latin, and the work went unnoticed until 1984.
Paper: <a href="http://arxiv.org/pdf/1201.2501v1" rel="nofollow">http://arxiv.org/pdf/1201.2501v1</a><p>It starts by dividing the spectrum into frequency bins, and assumes that each bin usually contains zero or one non-zero coefficient. It wasn't clear to me what happens when a large number of non-zero coefficients are clustered together in one bin. Can O(k log n) still be achieved when most of the frequencies are clustered?
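<p>For what it's worth, here's a quick numpy sketch of the bucketing idea (my own illustration, not the paper's actual permutation/filter machinery): subsampling the signal in time aliases the spectrum, so each of the B buckets receives the sum of all coefficients whose frequencies are congruent mod B. A bucket that caught exactly one non-zero coefficient is easy to decode; my question above is about the case where many coefficients collide in the same bucket.
<pre><code>import numpy as np

n, B = 1024, 16            # signal length and number of buckets (B divides n)
L = n // B                 # subsampling stride

# Build a sparse spectrum: a handful of non-zero coefficients.
rng = np.random.default_rng(0)
freqs = rng.choice(n, size=5, replace=False)
X = np.zeros(n, dtype=complex)
X[freqs] = rng.standard_normal(5) + 1j * rng.standard_normal(5)

x = np.fft.ifft(X)         # time-domain signal (numpy's ifft includes the 1/n factor)

# "Bin" the spectrum by subsampling in time and taking a small B-point FFT.
buckets = np.fft.fft(x[::L])

# Bucket m equals (1/L) times the sum of X[f] over all f with f % B == m.
expected = np.array([X[m::B].sum() / L for m in range(B)])
assert np.allclose(buckets, expected)

print(np.unique(freqs % B).size, "distinct buckets hit by", len(freqs), "frequencies")
</code></pre>
The paper then randomly permutes the spectrum and repeats the hashing so that, for sparse inputs, collisions are rare; my question is essentially about the regime where that assumption breaks down.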
Interesting. I've a friend in Australia who's been doing some work analysing rainfall using arrays of wire strung across long distances (details somewhere here: <a href="http://wiredlab.org/" rel="nofollow">http://wiredlab.org/</a>), and during a recent conversation he pointed me at this paper: <a href="http://www.icita.org/icita2004/abstracts/128-5.htm" rel="nofollow">http://www.icita.org/icita2004/abstracts/128-5.htm</a>, which seems to use another method (involving primes) for calculating Fourier transforms, and claims a 95% speed-up.<p>Sadly I haven't had a chance to look into it in any great detail yet, although it's "in the queue" of things to poke at. I thought it worth pointing out in case anyone else might find it useful.
Sparse transforms like this are very interesting, and these look like very good performance results. It's worth noting, though, that FFTW (<a href="http://www.fftw.org/" rel="nofollow">http://www.fftw.org/</a>) is still much faster on the transform sizes that many (most?) applications use - less than 2^15 or so entries. Also, as the number of non-zero frequencies climbs, sparse algorithms are quickly overtaken by good implementations of more standard FFT algorithms.<p>Still, very cool stuff.
As usual the article misrepresents video compression. Almost all useful coding tools are spatial, not frequency-based; the only famous frequency-based one (the DCT) was replaced with a rough integer approximation in H.264 and it works just as well.<p>This is because lots of realistic images don't really have meaningful frequency-domain content. (Imagine sampling every 8 or 16 pixels - would those values be related to each other in any way?)
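<p>For the curious, here's a small numpy sketch (my own, not from the article) of what "rough approximation" means here: H.264's 4x4 core transform uses the integer matrix below in place of the real DCT basis, with the normalisation folded into the quantiser.
<pre><code>import numpy as np

# Orthonormal 4x4 DCT-II basis.
N = 4
k = np.arange(N)[:, None]
m = np.arange(N)[None, :]
dct = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * m + 1) * k / (2 * N))
dct[0] /= np.sqrt(2.0)

# H.264's 4x4 forward core transform (normalisation is folded into quantisation).
h264 = np.array([[1,  1,  1,  1],
                 [2,  1, -1, -2],
                 [1, -1, -1,  1],
                 [1, -2,  2, -1]], dtype=float)

# Normalising each integer row shows how coarsely it tracks the cosine basis.
print(np.round(h264 / np.linalg.norm(h264, axis=1, keepdims=True), 3))
print(np.round(dct, 3))

# Transforming a 4x4 block is Y = C X C^T in both cases.
block = np.arange(16, dtype=float).reshape(4, 4)
print(np.round(dct @ block @ dct.T, 2))
print(h264 @ block @ h264.T)
</code></pre>
Rows 0 and 2 match the DCT exactly after normalisation and rows 1 and 3 are only rough, yet the coding loss is negligible, which is the point: exact frequency content isn't what makes the tool work.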
I guess this is the project page; it also contains the papers and some benchmarks:<p><a href="http://groups.csail.mit.edu/netmit/sFFT" rel="nofollow">http://groups.csail.mit.edu/netmit/sFFT</a>
For the plain ol' FFT, I've found FFTW, the "Fastest Fourier Transform in the West" (<a href="http://www.fftw.org/" rel="nofollow">http://www.fftw.org/</a>), to be an excellent library: it implements a whole bunch of different solvers and picks the appropriate ones for your data.<p>Hopefully this new algorithm can be added to their toolbox in the future.
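<p>If anyone wants to poke at the planning step from Python, here's a minimal sketch assuming the pyfftw bindings are installed (class and flag names below are pyfftw's; the same idea applies to fftw_plan_dft_1d in the C API). Treat it as a sketch, not a definitive recipe.
<pre><code>import numpy as np
import pyfftw

n = 2**14
a = pyfftw.empty_aligned(n, dtype='complex128')
b = pyfftw.empty_aligned(n, dtype='complex128')

# FFTW_MEASURE actually times several candidate solvers for this size and
# alignment and keeps the fastest; the resulting plan is then reused.
plan = pyfftw.FFTW(a, b, flags=('FFTW_MEASURE',))

a[:] = np.random.standard_normal(n) + 1j * np.random.standard_normal(n)
plan()                      # execute the chosen solver; the result lands in b
assert np.allclose(b, np.fft.fft(a))
</code></pre>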
There are lots of things you can do with practical algorithms when you start thinking in terms of data-first approaches and fit the optimizations to the probability distribution of the data, i.e. take the fastest path for the most probable or important inputs and accept a slower one for the rest. Many approximations are based on the idea that input data which is improbable, or which has little effect on the outcome, can quite safely be pruned out early.
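<p>As a toy example of that pruning idea (my own, nothing to do with the paper's algorithm): keep only the k largest Fourier coefficients of a nearly sparse signal and see how little the reconstruction suffers.
<pre><code>import numpy as np

n, k = 4096, 32
t = np.arange(n)
rng = np.random.default_rng(1)

# A few strong tones plus weak broadband noise.
signal = (np.sin(2 * np.pi * 50 * t / n)
          + 0.5 * np.sin(2 * np.pi * 400 * t / n)
          + 0.01 * rng.standard_normal(n))

X = np.fft.fft(signal)
keep = np.argsort(np.abs(X))[-k:]        # indices of the k largest coefficients
X_pruned = np.zeros_like(X)
X_pruned[keep] = X[keep]

approx = np.fft.ifft(X_pruned).real
rel_err = np.linalg.norm(signal - approx) / np.linalg.norm(signal)
print(f"kept {k}/{n} coefficients, relative error {rel_err:.3%}")
</code></pre>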