Continuing the documentation catch-up…
Over the last few months we built a large number of animation patterns for the holohedron structure, all in the form of generator functions. We needed a way to mix between them automatically, mediated by external control cues arriving from the audio system as OSC messages.
We were also concerned about performance - the frame rate is not that high - and so we only wanted to run the generator functions whose effect would be visible at any one time, effectively “muting” the others.
In the end, we opted for a 2D mixing scheme inspired by Audiomulch’s metasurface, which lays preset parameter values out as points in 2D space, so that a cursor position in the space determines a “mix” between presets according to each preset’s distance from the cursor:
So, we laid out our animations on a 2D “terrain”, giving each of them an X/Y position and a radius (see the top image). In some cases the animations overlapped.
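The terrain lookup can be sketched roughly like this. A minimal, hypothetical version, assuming a linear falloff from each animation’s centre to its radius (Audiomulch’s actual interpolation differs, and `mixWeights` is an illustrative name, not our real API):

```javascript
// Weight each animation by the cursor's proximity, zero beyond its radius.
// Linear falloff is an assumption made for this sketch.
function mixWeights(presets, cx, cy) {
  const raw = presets.map(p => {
    const d = Math.hypot(cx - p.x, cy - p.y);
    return Math.max(0, 1 - d / p.radius); // 1 at the centre, 0 at the edge
  });
  const total = raw.reduce((a, b) => a + b, 0);
  return total > 0 ? raw.map(w => w / total) : raw; // normalise to sum to 1
}
```

Because weights go to zero at an animation’s radius, anything out of range of the cursor naturally drops out of the mix.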
Our generator functions already took parameters for spatial position (of each display point) and elapsed time. We added two more: the distance and heading of the animation relative to the cursor. An animation could choose to treat this navigational distance as an instruction to fade in or out (and in fact, we had a higher-order generator which applied a fade according to distance). Any animation out of range of the cursor was ignored for the purposes of rendering.
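The higher-order fade can be sketched as a generator that wraps another generator. This is an illustration, not our actual code: the frame shape (`t`, `dist`, `heading`) and the names `pulse` and `fadeByDistance` are invented for the example.

```javascript
// A simple animation: brightness oscillates with elapsed time. Each frame
// it is resumed with { t, dist, heading } (names illustrative).
function* pulse() {
  let frame = yield 0;
  while (true) {
    frame = yield 0.5 + 0.5 * Math.sin(frame.t); // brightness in 0..1
  }
}

// Higher-order generator: scales the wrapped animation's output by
// cursor distance, reaching zero at `radius`.
function* fadeByDistance(inner, radius) {
  inner.next(); // prime the inner generator
  let frame = yield 0;
  while (true) {
    const gain = Math.max(0, 1 - frame.dist / radius);
    frame = yield inner.next(frame).value * gain;
  }
}
```

A renderer can then skip stepping any animation whose gain is zero, which is the muting described above.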
Cues from the audio system came into the host machine via OSC; we used Max to turn each one into a web socket message in order to get it into the browser. Every cue kicked off a cursor move to a specific X/Y point, to take place over a time T. (Once again, our almost-throwaway project Twizzle was a perfect tool for the cue following.)
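The cursor glide itself is just an interpolation over time. A minimal sketch, assuming linear easing (the names and the `makeCursor` factory are invented for illustration; the real cue handling went through Twizzle):

```javascript
// A cursor that glides to a target point over a given duration.
function makeCursor(x = 0, y = 0) {
  let from = { x, y }, to = { x, y }, start = 0, duration = 0;
  return {
    // Called when a cue arrives, e.g. from the OSC-to-websocket bridge:
    // move to (tx, ty) over `t` time units, starting at `now`.
    moveTo(tx, ty, t, now) {
      from = this.at(now);
      to = { x: tx, y: ty };
      start = now;
      duration = t;
    },
    // Current interpolated position at time `now`.
    at(now) {
      if (duration <= 0 || now >= start + duration) return { ...to };
      const k = (now - start) / duration;
      return { x: from.x + (to.x - from.x) * k, y: from.y + (to.y - from.y) * k };
    },
  };
}
```

Starting each move from the interpolated current position (rather than the previous target) keeps the cursor from jumping when cues arrive mid-glide.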
The actual hardware control was via video capture, using a NovaStar MCTRL300 to send video over ethernet to the LED controllers:
To generate the pixel patterns for the controllers we put another HTML canvas into the application page, brought up a 2D drawing context, and drew into it by fast-updating an RGB byte array. The pixel animations are rather attractive in their own right, so here’s a quick screen-grab:
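The byte-array approach can be sketched like so. A hypothetical version: `renderFrame` and the per-pixel `shade` callback are names invented for the example, and the real animations computed colours from the generator mix.

```javascript
// Fill an RGBA byte array for a w-by-h canvas from a per-pixel colour
// function, ready to blit in a single putImageData call.
function renderFrame(w, h, shade) {
  const bytes = new Uint8ClampedArray(w * h * 4);
  for (let yPos = 0; yPos < h; yPos++) {
    for (let xPos = 0; xPos < w; xPos++) {
      const i = (yPos * w + xPos) * 4;
      const [r, g, b] = shade(xPos, yPos); // caller returns [r, g, b] in 0..255
      bytes[i] = r;
      bytes[i + 1] = g;
      bytes[i + 2] = b;
      bytes[i + 3] = 255; // fully opaque alpha
    }
  }
  return bytes;
}

// In the browser, the array goes to the 2D context in one call:
//   const ctx = canvas.getContext("2d");
//   ctx.putImageData(new ImageData(renderFrame(w, h, shade), w, h), 0, 0);
```

Writing the whole frame into one typed array and blitting it once is much cheaper than per-pixel `fillRect` calls, which matters when the frame has to be captured and shipped to the LED controllers in real time.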