Hi! I'm a college freshman at Northeastern, and this is my first project for a major hackathon (at the MIT Media Lab). It won the Unconventional Computing track prize, so I'm pretty happy! I made this because I'm an avid conlanger, and I was wondering whether it would be possible to think in terms of sound when looking at an image. Fractals have easy-to-map parameters, so I created SHFLA, a language that takes in music and generates fractals from 0.1-second chunks of it (the chunk length is configurable). It's Turing-complete, so you can technically encode a ridiculous amount of information and computation in the system, although writing it all out as music might take a while.

Hope you find the project cool :)
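
P.S. For anyone curious what the sound-to-fractal mapping could look like, here's a rough sketch in Python. This is not the actual SHFLA implementation, just the general idea: chunk the audio into 0.1-second windows, pull the dominant frequency out of each window with an FFT, and use it to steer a Julia set's parameter. The synthetic sine-sweep input and the frequency-to-c mapping are made up for illustration.

    import numpy as np

    # Hypothetical stand-in for real audio: a 1-second sine sweep.
    SR = 44100                                       # sample rate (Hz)
    t = np.linspace(0, 1, SR, endpoint=False)
    audio = np.sin(2 * np.pi * (220 + 660 * t) * t)  # rising-pitch sweep

    CHUNK = int(0.1 * SR)                            # 0.1-second chunks

    def dominant_freq(chunk, sr=SR):
        """Return the strongest frequency in a chunk via FFT."""
        spectrum = np.abs(np.fft.rfft(chunk))
        freqs = np.fft.rfftfreq(len(chunk), d=1.0 / sr)
        return freqs[np.argmax(spectrum)]

    def julia_escape(c, size=64, max_iter=50):
        """Escape-time iteration counts for the Julia set of z^2 + c."""
        y, x = np.ogrid[-1.5:1.5:size * 1j, -1.5:1.5:size * 1j]
        z = x + 1j * y
        counts = np.zeros(z.shape, dtype=int)
        active = np.ones(z.shape, dtype=bool)        # points not yet escaped
        for i in range(max_iter):
            z[active] = z[active] ** 2 + c
            escaped = active & (np.abs(z) > 2)
            counts[escaped] = i
            active &= ~escaped
        return counts

    # Map each chunk's dominant frequency onto the Julia parameter c.
    # (This particular mapping is invented for the sketch.)
    for start in range(0, len(audio) - CHUNK + 1, CHUNK):
        f = dominant_freq(audio[start:start + CHUNK])
        angle = 2 * np.pi * (f % 1000) / 1000        # fold freq into [0, 2pi)
        c = 0.7885 * np.exp(1j * angle)              # walk c around a circle
        frame = julia_escape(c)                      # one fractal per chunk
        print(f"t={start / SR:.1f}s  f={f:6.1f} Hz  c={c:.3f}")

Each `frame` here is an array of escape counts you could render as one frame of animation, one fractal per 0.1 seconds of audio.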