Something that blew my mind the first time I learned it:

You can think of a function f(x) as the limit of an infinitely big vector whose entries are indexed by infinitesimal steps, e.g. f = [...f(-2*dx), f(-dx), f(0), f(dx), f(2*dx)...]. The dot product of two functions (f,g) is still the usual sum[f(x)*g(x)], except the sum becomes an integral.

Sines and cosines of distinct integer frequencies happen to have a dot product of 0 (check it; there's a numeric check below), which means a change of basis into those functions (aka the Fourier transform) works out really nicely. Other than that, sin and cos are not really privileged. For example, you could expand in the basis of polynomial functions instead (aka the Taylor series, though the monomials aren't orthogonal, so there the coefficients come from derivatives at a point rather than from dot products). Any basis you can cook up would do just as well, so long as it's a complete basis. Just like in linear algebra, you have a complete basis if linear combinations can reconstruct the standard basis vectors ... [...1,0,0...], [...0,1,0...], [...0,0,1...] ... (which in function-land are the Dirac delta functions).

Differentiation is a matrix with 1/dx on the superdiagonal and -1/dx on the diagonal: that's just the forward difference (f(x+dx) - f(x))/dx written out as a linear map (see the second sketch below).
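Here's a minimal sketch of the orthogonality check in Python; the helper name `dot` and the sampling grid are my own choices, just for illustration:

    import numpy as np

    dx = 1e-4
    x = np.arange(0.0, 2 * np.pi, dx)  # one period, sampled every dx

    def dot(f, g):
        # The "infinite vector" dot product: sum f(x)*g(x),
        # scaled by dx so it converges to the integral.
        return np.sum(f * g) * dx

    # Distinct integer frequencies are (numerically) orthogonal:
    print(dot(np.sin(x), np.sin(2 * x)))      # ~0
    print(dot(np.sin(3 * x), np.cos(3 * x)))  # ~0
    print(dot(np.sin(x), np.sin(x)))          # ~pi (the squared norm, not 0)

    # Orthogonality is why the change of basis is so nice: each
    # coefficient is just a projection, dot(f, basis) / dot(basis, basis).
    f = 2 * np.sin(x) + 0.5 * np.sin(3 * x)
    for k in (1, 2, 3):
        basis = np.sin(k * x)
        print(k, dot(f, basis) / dot(basis, basis))  # recovers 2, 0, 0.5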
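And a sketch of the differentiation matrix under the same finite-dx picture (the grid size n and the test function are arbitrary choices of mine):

    import numpy as np

    n = 1000
    dx = 2 * np.pi / n
    x = np.arange(n) * dx

    # Forward-difference matrix: 1/dx on the superdiagonal, -1/dx on the
    # diagonal, so (D @ f)[i] = (f[i+1] - f[i]) / dx, the finite d/dx.
    D = (np.diag(np.ones(n - 1), k=1) - np.eye(n)) / dx

    df = D @ np.sin(x)
    # d/dx sin = cos, up to an O(dx) discretization error. The last row
    # has no superdiagonal entry, so drop it from the comparison:
    print(np.max(np.abs(df[:-1] - np.cos(x[:-1]))))  # ~dx/2, shrinks with dx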