Hi HN, creator here. Lately I've been enjoying learning about the Web Audio API and wanted to apply it. I needed a quick debugging tool to inspect the sounds I was synthesizing and ended up building this. It's vanilla JS, so just view source to check it out.

Things I've ended up enjoying with it:

- Humming, whistling, and singing my favorite tunes and comparing the images

- Taking it on a nature walk to look at bird calls

- Trying to make sounds that draw a specific picture or shape

How it works:

On load, the app requests microphone access and connects that audio stream to a Web Audio AnalyserNode. The AnalyserNode provides a fast Fourier transform that is sampled repeatedly using requestAnimationFrame. For each frame, the entire contents of the canvas are shifted left by one pixel, and the current frequency bins are plotted in the rightmost column. (There's a rough sketch of this loop at the end of this post.)

Limitations:

- There are 1024 frequency bins covering a hearing range of roughly 20,000 Hz, so the resolution is good but not perfect: each dot represents a band of about 20 Hz.

- Unfortunately, it doesn't work in Firefox or Chrome on iOS; Apple doesn't offer microphone access via getUserMedia to third-party browsers.

- It has a PWA manifest, so Android users should be able to add it to their home screen. iOS users still can't, since getUserMedia is restricted in iOS PWAs.
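
For anyone curious about the drawing loop, here's a rough, simplified sketch of the "shift left, plot the newest column" idea with getUserMedia, an AnalyserNode, and a canvas. The fftSize, the greyscale color mapping, and the low-frequencies-at-the-bottom orientation here are just illustrative; check the actual source for the real details:

    // Minimal sketch of the loop described under "How it works".
    // Real code may also need audioCtx.resume() after a user gesture
    // if the AudioContext starts out suspended.
    const canvas = document.querySelector('canvas');
    const ctx = canvas.getContext('2d');

    navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
      const audioCtx = new AudioContext();
      const analyser = audioCtx.createAnalyser();
      analyser.fftSize = 2048;                    // frequencyBinCount = 1024
      audioCtx.createMediaStreamSource(stream).connect(analyser);

      const bins = new Uint8Array(analyser.frequencyBinCount);
      const binHeight = canvas.height / bins.length;

      function draw() {
        analyser.getByteFrequencyData(bins);

        // Shift the existing image one pixel to the left...
        ctx.drawImage(canvas, -1, 0);

        // ...then paint the newest FFT frame into the rightmost column,
        // low frequencies at the bottom, loudness mapped to brightness.
        for (let i = 0; i < bins.length; i++) {
          const v = bins[i];                      // 0..255
          ctx.fillStyle = `rgb(${v}, ${v}, ${v})`;
          ctx.fillRect(canvas.width - 1, canvas.height - (i + 1) * binHeight,
                       1, binHeight);
        }

        requestAnimationFrame(draw);
      }
      requestAnimationFrame(draw);
    });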