Two-dimensional programming languages like Funges are a fruitful ground for live coding, but I was unsure how to achieve that until Orca [1] appeared. Funge-98 does support threaded execution (via `t`), but its threading model is cooperative and thus requires meticulous synchronization for most uses of live coding. For example, you can "park" each instruction pointer at some position until the next beat, but you can't really control the time to the next note from there because it would be proportional to the number of instructions executed (and you need a noticeable delay to show the IP's movement).<p>Orca cleverly avoided this issue by making every instruction run simultaneously. If I understood correctly, NoiseFunge seems to have found another way out: it queues notes to be played by the next beat and moves on. As I'm not really a live coding person, I can't tell whether this approach is easier than Orca's or not. Anyway, an interesting attempt indeed.<p>[1] <a href="https://100r.co/site/orca.html" rel="nofollow">https://100r.co/site/orca.html</a>
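To make the "queue notes for the next beat" idea concrete, here's a rough sketch of how such a scheduler could work. This is my own illustration, not NoiseFunge's actual implementation; all names (`BeatScheduler`, `queue_note`, `tick`) are hypothetical:

```python
# Hypothetical sketch: each interpreter step may emit a note; notes
# accumulate in a buffer and are flushed at the next beat boundary,
# so execution speed doesn't change when a note sounds.

class BeatScheduler:
    def __init__(self, steps_per_beat):
        self.steps_per_beat = steps_per_beat
        self.step = 0
        self.pending = []   # notes queued for the upcoming beat
        self.played = []    # (beat_index, note) pairs actually played

    def queue_note(self, note):
        # The IP queues a note and moves on immediately.
        self.pending.append(note)

    def tick(self):
        # Called once per interpreter step.
        self.step += 1
        if self.step % self.steps_per_beat == 0:
            beat = self.step // self.steps_per_beat
            self.played.extend((beat, note) for note in self.pending)
            self.pending.clear()
```

With this shape, the delay between an instruction executing and its note sounding is bounded by one beat, regardless of how many instructions run in between.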
Creator here. I just updated the README and name for this project since it is the legacy implementation. I started the new implementation in Rust earlier this year.<p>If you're interested in seeing it in action, here's a recent video (there are more on my youtube page): <a href="https://www.youtube.com/watch?v=qeseWGmcIbY" rel="nofollow">https://www.youtube.com/watch?v=qeseWGmcIbY</a>
Ahh, this reminds me of my own attempt in Brainfuck!<p><a href="http://nexuist.github.io/brainsynth/" rel="nofollow">http://nexuist.github.io/brainsynth/</a><p>Been a year or two since I touched it, so I have no idea if it still works. Last I remember it was extremely slow; I think eventually I wanted to rewrite it in Vue or Svelte or even WebAssembly.<p>I think it's a fun concept, just need to work on the execution to make it painless to play with.
I have just been informed that there is now a new version written in Rust.<p>Complete with its own Rust-based Befunge compiler.<p><a href="https://github.com/revnull/noisefunge.rs" rel="nofollow">https://github.com/revnull/noisefunge.rs</a>
This is really, really good (in particular the composition). So good I'm envious I didn't think of it earlier! :)<p>Keep on the algorhythm <3<p>(I have hacked together Python scripts for a few incomplete compositions and experiments with harmony. I even make the samples from scratch using basic math and waveforms -- I encourage you to try deriving the raw samples as well! Waveform shape and phase interactions are surprisingly complex.)
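For anyone curious what "deriving the raw samples from basic math" looks like in its simplest form, here's a minimal sketch (my own illustration, nothing from the parent's scripts) that computes sine-wave PCM samples directly:

```python
import math

def sine_samples(freq_hz, duration_s, sample_rate=44100, amplitude=0.5):
    """Raw floating-point samples of a sine wave, computed from first principles."""
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * freq_hz * t / sample_rate)
            for t in range(n)]
```

From there you can sum several such lists for harmony, or swap the `sin` for other waveform functions (square, saw) to explore the shape and phase interactions mentioned above.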
I'm intrigued, but it's not 100% clear to me what this is. Is this essentially parallel to Sonic Pi? What generates the actual sound? Does it control synths via MIDI, or are synths/oscillators in the codebase? Thx