
The Fourier Transform and its Applications

264 points by kercker over 8 years ago

10 comments

yagyu over 8 years ago
There is this hard question that is often glossed over in a first course on Fourier analysis: we learn that the Fourier transform of f(t) is an integral of exp(iwt)f(t) over all t. But when we look at a table of Fourier transform pairs, we find things like the Fourier transform of a constant function (it's the Dirac delta). The integral does not converge? WTF?

I really think Prof. Osgood hits a sweet spot of not sweeping this under the carpet, and not becoming a course on functional/integration theory. I'm a postdoc in theoretical physics, and use these transforms regularly, and I have yet to find a more honest but still useful introduction (although I understand that real mathematicians cringe at my abuse of mathematics... :)

edit: typo
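A sketch of the standard resolution, for anyone curious (this is the usual distribution-theory argument, not necessarily how the lecture presents it): the transform of a constant only makes sense against test functions, which you can see by inserting a Gaussian regulator and taking the limit,

    \int_{-\infty}^{\infty} e^{-\epsilon t^2}\, e^{-i\omega t}\, dt
      \;=\; \sqrt{\pi/\epsilon}\;\, e^{-\omega^2/(4\epsilon)}
      \;\longrightarrow\; 2\pi\, \delta(\omega) \quad (\epsilon \to 0^+),

meaning that integrating the left-hand side against any smooth test function \varphi(\omega) tends to 2\pi\, \varphi(0).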
tptacek over 8 years ago
In case this is useful to anyone else:

There's probably an easier way to do it, but if you want to get faster playback (2x is about as fast as I can comprehend this lecturer at), this line seems to do the trick:

    document.getElementById("myElement").querySelector("video").playbackRate = 2
farhanhubble over 8 years ago
This is an exceptional course. In my CS undergrad days, some 7 years ago, when I wanted to make sense of the Fourier series and transform, this course was my savior. The mathematics textbook had a dry and unintuitive treatment of the topics, and electrical engineering, physics, and computer science books mostly relegated them to a few pages in an appendix.

I found this course and printed out the PDF of the lecture notes, as I didn't have enough bandwidth for watching the videos. The lecture notes had a wealth of information about the idea of a Fourier series expansion, its connection to the transform, and the applications in diverse fields.

Prof. Osgood had great examples. In one such example he showed that the temperature of a ring must be a periodic function of position, as the structure of the ring is periodic in space, and then went on to derive an equation for the temperature of the ring that was a Fourier series.

It was from these notes that I learned a great deal about convolution too. The new website looks much better organized. I'd recommend every CS/EE student at least watch the first few lectures once.
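For anyone who hasn't seen that ring example, the rough shape of the argument (reconstructed from the standard treatment, not quoted from the notes): parametrize the ring by x in [0, 1) and write the temperature u(x, t). Periodicity in x lets you expand

    u(x, t) = \sum_{n} c_n(t)\, e^{2\pi i n x},

and substituting into the heat equation u_t = k\, u_{xx} decouples the modes, c_n'(t) = -4\pi^2 n^2 k\, c_n(t), so c_n(t) = c_n(0)\, e^{-4\pi^2 n^2 k t}: each Fourier coefficient decays independently, with the high-frequency detail smoothing out fastest.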
jomamaxx over 8 years ago
I 'took' the Stanford NLP course online; it was great.

Doing this one now.

Funny - I did Fourier in Comp. Eng. - but it was jammed together with so many other things that I felt overwhelmed by all of it. I did enough to do decently on the exam, but I feel a lot of it was lost on me, and I forgot much of it.

Using Fourier in most computer applications is very rare!

I'm excited to have a project I need to use it for - and weirdly stoked to be watching this video.

I love it when I have a reason to learn something awesome ...
hackinthebochs over 8 years ago
I went through this entire lecture series and I can't recommend it enough. Osgood is one of the most engaging professors I've ever listened to. I would happily learn anything from this guy. My only disappointment was that he doesn't teach anything else!
zaf over 8 years ago
The web page didn't work for me in Safari. YouTube works fine: https://www.youtube.com/watch?v=gZNm7L96pfY
jokoon over 8 years ago
I think I remember that the Fourier transform is used to send a signal across an analog line, for DSL modems, after it's converted from binary.

Since analog lines may not have the same frequency response when it comes to interference, the modem usually tests the analog line at various frequencies, and only keeps the intervals that are good enough to transmit on.

I wonder if the FFT is used for Wi-Fi, since Wi-Fi also passes through an analog signal.

Although I'm not even sure whether what I'm talking about is true, or relates to Fourier.
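For what it's worth, that recollection is essentially right: both DSL (as discrete multitone) and Wi-Fi's OFDM modes use an inverse FFT to spread bits across many subcarriers and an FFT to recover them. A minimal, purely illustrative numpy sketch (it skips the cyclic prefix, the channel, and the per-subcarrier bit loading that the "keep only the good intervals" step refers to):

    import numpy as np

    # Toy OFDM-style link: map bits onto subcarriers with an inverse FFT,
    # "transmit" the time-domain waveform, then recover the bits with an FFT.
    n_subcarriers = 64
    rng = np.random.default_rng(0)
    bits = rng.integers(0, 2, n_subcarriers)

    symbols = 2 * bits - 1                 # BPSK: 0 -> -1, 1 -> +1 per subcarrier
    waveform = np.fft.ifft(symbols)        # transmitter: frequency -> time
    received = np.fft.fft(waveform)        # receiver: time -> frequency
    decoded = (received.real > 0).astype(int)

    assert np.array_equal(bits, decoded)   # round trip recovers the bits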
Xcelerate over 8 years ago
30 lectures on Fourier transforms and not one mention of the Fourier transform over compact groups? (Which is arguably the most generalized/abstract formulation of the concept of a Fourier transform.) A lot of recent research has been done on FFTs over groups like the rotation group and the symmetric group, and this work has led to significant progress on problems that were previously considered intractable.
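For context (my gloss, not part of the comment above): the simplest instance of that viewpoint is the ordinary DFT, which is just the Fourier transform on the cyclic group Z/NZ,

    \hat f(k) = \sum_{n=0}^{N-1} f(n)\, e^{-2\pi i k n / N},

where the exponentials are the characters of the group. For nonabelian compact groups such as the rotation group, the characters get replaced by matrix elements of the irreducible representations, and the "fast" algorithms exploit that representation structure much as the FFT exploits the structure of Z/NZ.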
jdp123 over 8 years ago
Very, very highly recommended. Once you understand the basis of this concept, you open your mind to a whole new area of math, and things like the finite element method for PDEs and wavelet transforms follow much more naturally.
drostie over 8 years ago
One of the fun ones, and this may or may not be covered by the link, is generating the "cumulants" of probability density functions. The basic idea in (basic, continuous) probability theory is that you have these things called "random variables" X, Y, Z which need to be thought of as taking on values with various probabilities; and the way we do that is to use calculus (where the d- in "dx" is a special notation coming from the word "difference"; "dx" means "a tiny little change in x"). The basic idea is that there is some *joint density* j(x, y, z) such that the probability that simultaneously x < X < x + dx, AND y < Y < y + dy, AND z < Z < z + dz, is simply given by j(x, y, z) dx dy dz.

With a bit of calculation you conclude that if we want to analyze Z = X + Y, then the probability density for Z alone must be h(z) = ∫ dx j(x, z - x); there are a couple of ways to do this, but my favorite is to use Z = X + Y as a schematic to write a Dirac δ-function, j(x, y, z) = j(x, y) δ(z - x - y), and then the standard "what is the probability density for Z alone? integrate over both x and y!" rule comes into play. But you can also reason it out by hand, if you want.

Anyway, then you come up with this cool definition of independence: X and Y are independent if j(x, y) factors nicely into f(x) g(y); independent probabilities tend to multiply. So that's cool.

Now here's where the Fourier transform comes in; consider using a convention where we denote the Fourier transform with the same function name but using square brackets rather than parentheses for its application,

f[q] = ℱ{p → q} f(p) = ∫ dp exp(-i p q) f(p).

When we Fourier-transform h(z) for Z = X + Y we get

h[a] = ∫ dz exp(-i a z) h(z) = ∫ dx dy exp(-i a (x + y)) j(x, y),

and if the two are independent random variables we find that h[a] = f[a] g[a]: the sum of independent random variables has a product of Fourier transforms, precisely because h(z) above takes the form of a convolution.

Taking a logarithm, we have that χ(a) = log h[a] = log f[a] + log g[a], so any property of the logarithm of the Fourier transform of the density function must be additive among independent random variables. And this gives the *cumulants* of the random variables: pseudo-linear expressions (linear in independent random variables, f(X + Y) = f(X) + f(Y), with f(k X) = q(k) f(X) for some fixed q) which characterize the different "moments" of the random variables in question. In fact, if we look carefully at the definition h[a] = ∫ dz exp(-i a z) h(z), we see that the variable P = k Z, having density b(p) = h(p/k) / k, generates b[a] = h[k a].

Calculating the zeroth term, χ(0) = log h[0] = log 1 = 0. No biggie. Continuing the Maclaurin expansion gives χ'(a) = h'(a)/h(a), and at a = 0 that's -i E(Z). We knew that the expectation value was linear, so no surprise there. The next term is (-i)^2 [h''(a) h(a) - h'(a) h'(a)]/[h(a) h(a)], which at a = 0 works out to just E(Z^2) - E(Z)^2, the familiar expression for the variance; so variance is pseudo-linear (in this case q(k) = k^2). The point is, then you can just go on: there are in fact infinitely many pseudo-linear cumulants which scale like q(k) = k^n, coming from this Maclaurin series, with a pattern based entirely on the pattern of derivatives that comes out of log(f(x)).

As a freebie you get a proof of the central limit theorem. Take N independent, identically distributed random variables, each with the log-characteristic function χ(a) defined above; call them X_i and form the sum S = Σ_i X_i / N. From the properties we've already seen in this comment, its log-characteristic function must be ψ(a) = N χ(a / N). Do the Maclaurin series to 2nd order and ignore the later terms for large N; you get convergence to a parabola, which means that the Fourier transform of the probability density is... a Gaussian! And the inverse Fourier transform of that Gaussian is just another Gaussian, so for large N the result must be a normal random variable with mean μ and variance σ²/N, which of course is what they must be, because means are linear and variances are pseudo-linear.
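Not part of the comment above, just a quick numerical sanity check of the key step (the distributions and sample size here are arbitrary choices): for independent X and Y, a Monte Carlo estimate of E[exp(-i a (X + Y))] should match the product of the marginal estimates up to sampling error.

    import numpy as np

    # Check that the characteristic function of an independent sum factors:
    # E[exp(-i a (X + Y))] = E[exp(-i a X)] * E[exp(-i a Y)].
    rng = np.random.default_rng(0)
    n = 200_000
    x = rng.exponential(1.0, n)      # arbitrary independent variables
    y = rng.uniform(-1.0, 1.0, n)

    a = np.linspace(-5.0, 5.0, 11)

    def char_fn(samples, a):
        # Monte Carlo estimate of E[exp(-i a S)] for each value of a
        return np.exp(-1j * np.outer(a, samples)).mean(axis=1)

    lhs = char_fn(x + y, a)
    rhs = char_fn(x, a) * char_fn(y, a)
    print(np.abs(lhs - rhs).max())   # small, shrinking roughly like 1/sqrt(n)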