Shaders gonna shade

I've been messing around a bit with audio visualisation lately. It's a very strange sort of problem space. There's a lot – I mean, a lot – of existing software for VJing and other kinds of audiovisual mapping. It looks super complex, but in a sense the processing pipeline is quite simple: audio data → analysis/feature extraction → parametric rendering → visual data. Assuming you're happy to write the plumbing yourself, you just need a feature extraction library and a way to render visuals. So I thought I'd give that a shot and see if I could get something going in the browser.

The audio feature extraction part was easy, though in the browser there aren't many options. The two I found that seemed good were Meyda and JS-Xtract. JS-Xtract supported more features, but in the end I used Meyda because its code was nicer and I'm a snob like that.
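To give a feel for the plumbing, here's a rough sketch of the Meyda side, assuming Meyda is loaded on the page and you're feeding it an `<audio>` element; the function and callback names are mine, not the library's:

```javascript
// Sketch: wire an <audio> element into Meyda and get features each frame.
// Defined but not called here, since it needs a browser's Web Audio API.
function startAnalysis(audioElement, onFrame) {
  const audioContext = new AudioContext();
  const source = audioContext.createMediaElementSource(audioElement);
  source.connect(audioContext.destination); // keep the track audible

  const analyzer = Meyda.createMeydaAnalyzer({
    audioContext,
    source,
    bufferSize: 512,                        // power of two; smaller = snappier
    featureExtractors: ['rms', 'spectralCentroid'],
    callback: onFrame,                      // gets { rms, spectralCentroid }
  });
  analyzer.start();
  return analyzer;                          // call .stop() to tear down
}
```

The callback fires once per buffer with an object of the requested features, which is all you need downstream.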

For parametric rendering, the obvious answer is graphics shaders: they run on the GPU so they're proper fast, and they can generate quite complex visuals with only a small amount of code. I found The Book of Shaders pretty helpful for the finer details of how to do that, though the possibility space of shaders is so huge it's kinda hard to get your head around.
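To show how little code a shader needs, here's a hedged sketch: a fragment shader (as a JS string) drawing concentric rings, with one audio-driven uniform. `u_level` is a name I made up, not a convention, and the compile helper is the standard WebGL boilerplate:

```javascript
// A tiny fragment shader: rings whose brightness is driven by a single
// uniform, u_level, meant to be fed from an audio feature each frame.
const FRAGMENT_SRC = `
precision mediump float;
uniform vec2 u_resolution;
uniform float u_time;
uniform float u_level;   // 0..1, e.g. smoothed loudness

void main() {
  vec2 uv = gl_FragCoord.xy / u_resolution - 0.5;
  float d = length(uv);
  float ring = 0.5 + 0.5 * sin(40.0 * d - 4.0 * u_time);
  gl_FragColor = vec4(vec3(ring * u_level), 1.0);
}`;

// Standard WebGL compile step (browser only; gl is a WebGL context).
function compileShader(gl, type, source) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    throw new Error(gl.getShaderInfoLog(shader));
  }
  return shader;
}
```

Everything interesting happens per-pixel on the GPU; the JS side just compiles it once and pokes the uniforms.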

Luckily, there's a lot of inspiration to draw from thanks to the prodigious shader community. The biggest gallery sites are Shadertoy, GLSL Sandbox, and the Interactive Shader Format (ISF) site. The last is particularly interesting because ISF shaders include parameter definitions, so you can easily wire data up to them. That said, you can do that with any shader by just picking some juicy-looking consts or magic numbers and turning them into parameters.
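For flavour, an ISF shader is essentially GLSL with a JSON header declaring its parameters, which the host turns into uniforms (and usually UI controls). A minimal sketch, with an invented `level` input:

```glsl
/*{
  "DESCRIPTION": "Brightness pulse",
  "INPUTS": [
    { "NAME": "level", "TYPE": "float", "DEFAULT": 0.5, "MIN": 0.0, "MAX": 1.0 }
  ]
}*/
void main() {
  gl_FragColor = vec4(vec3(level), 1.0);
}
```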

With those two ingredients, it's really just a matter of wiring up the feature extraction parameters to the shader parameters and away you go. By which I mean away you go into hour upon hour of parameter twiddling, endlessly nudging numbers slightly one direction or another, never sure if you're making them better or worse. With the benefit of hindsight, I now realise that the value of the off-the-shelf solutions isn't really the feature extraction and rendering, but the rapid feedback of the real-time parameter mapping between them.
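The twiddling mostly lives in that mapping layer. A sketch of the kind of helpers involved (the names, constants, and the choice of smoothing are all mine): a one-pole exponential smoother to tame jittery features, and a clamped range mapper to aim them at a uniform:

```javascript
// One-pole exponential smoothing: higher alpha reacts faster, lower is calmer.
function makeSmoother(alpha) {
  let y = 0;
  return (x) => (y = alpha * x + (1 - alpha) * y);
}

// Map a feature from its observed range onto a uniform's range, clamped.
function mapRange(x, inLo, inHi, outLo, outHi) {
  const t = Math.min(1, Math.max(0, (x - inLo) / (inHi - inLo)));
  return outLo + t * (outHi - outLo);
}

// Per-frame wiring: feature in, uniform out (gl and location are browser-side).
const smoothRms = makeSmoother(0.3);
function onFrame(features, gl, levelLocation) {
  const level = mapRange(smoothRms(features.rms), 0, 0.4, 0, 1);
  gl.uniform1f(levelLocation, level);
}
```

The endless nudging is exactly those numbers: the smoothing alpha and the input range, per feature, per uniform.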

That said, it was pretty fun to roll my own, and it's nice to know I'm only a real-time parameter mapping system away from a decent solution. The whole thing also gives me some ideas for a more intuitive way to build shaders, something where you take some simple animation and shape primitives and repeatedly combine them with higher-level operations until you get complex behaviour.

Meanwhile, the fruit of my efforts so far is the video at the top there. It's a fun little track called Cuisine by Nctrnm hooked up to a shader I mangled together out of this starfield from GLSL Sandbox.