Body>Data>Space recently had the wonderful opportunity to present Collective Reality at Société des Arts Technologiques (SAT) Montréal, as part of the Immersion Experience Symposium (with the general theme of Embodied Spaces).
Repurposing the piece for the Satosphere dome was something of a challenge, since our camera and lighting setup was not designed for overhead projection from 13 metres away. Luckily, Zack Settel lent us a high-resolution IR camera and some OpenCV tracking machinery that we could repurpose to generate the crowd-tracking events needed by the visual and sound engines. Rather than do this directly in openFrameworks, we fell back on a familiar toolchain, piping the tracker's events into a MaxMSP/ClojureScript bundle (derived from this demo project), since the higher-level tracking functions lend themselves to Clojure's strengths in data manipulation.
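As a rough illustration of what "higher-level tracking functions" might look like in this setting, here is a small ClojureScript-style sketch (the function name and event shapes are our own invention, not taken from the actual piece): each tracker frame is assumed to arrive as a vector of blob centroids, which is collapsed into a single crowd-level event for the visual and sound engines.

```clojure
;; Hypothetical sketch: reduce raw OpenCV blob centroids into one
;; crowd-level event. Assumes each blob is a map like {:x 0.4 :y 0.7}.
(defn crowd-features
  "Collapse a frame's blob centroids into {:count n :centroid [cx cy] :spread s}."
  [blobs]
  (when (seq blobs)                              ;; guard against empty frames
    (let [n      (count blobs)
          mean   (fn [k] (/ (reduce + (map k blobs)) n))
          cx     (mean :x)
          cy     (mean :y)
          ;; mean absolute deviation from the crowd centroid,
          ;; a cheap proxy for how dispersed the crowd is
          spread (/ (reduce + (map #(+ (Math/abs (- (:x %) cx))
                                       (Math/abs (- (:y %) cy)))
                                   blobs))
                    n)]
      {:count n :centroid [cx cy] :spread spread})))

;; (crowd-features [{:x 0 :y 0} {:x 2 :y 2}])
;; => {:count 2, :centroid [1 1], :spread 2}
```

This is exactly the kind of map/reduce-over-plain-data transformation that is awkward to express in a Max patcher but trivial in Clojure, which is the appeal of the hybrid toolchain.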
(We perhaps need a name for this style of Max programming, where the patcher consists of a single object encompassing a scripting environment, surrounded by little other than communication and timing links.)
(Photo: Jo Hyde.)