HvA Immersive Lab
The HvA commissioned me to create a flexible, reusable template for their Immersive Lab, which would enable students and staff to easily project content and integrate interactive elements.
To achieve this, I developed several custom patches in TouchDesigner, available in this Git repository.
The primary patch allows users to load any content into TouchDesigner and map it to the appropriate wall for projection. Additionally, there’s a splitter component that enables users to load 16:9 ratio content and seamlessly distribute it across all the walls in the room. Examples of this are shown in the slideshow.
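As a rough illustration of what the splitter does, the sketch below computes the normalized horizontal crop window each wall should sample from a single 16:9 source. The wall widths are placeholder assumptions, not the lab's real dimensions; in the actual patch, values like these drive Crop TOPs inside TouchDesigner.

```python
# Hypothetical wall widths (metres) for a three-wall room, left / front / right.
WALL_WIDTHS_M = [4.0, 6.0, 4.0]

def crop_windows(wall_widths):
    """Return (left, right) crop fractions in 0..1 for each wall, in order."""
    total = sum(wall_widths)
    windows, cursor = [], 0.0
    for width in wall_widths:
        left = cursor / total
        right = (cursor + width) / total
        windows.append((round(left, 4), round(right, 4)))
        cursor += width
    return windows

if __name__ == "__main__":
    for i, (left, right) in enumerate(crop_windows(WALL_WIDTHS_M)):
        print(f"wall {i}: crop u from {left} to {right}")
```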
I also created several interactive components for students to reuse, detailed below.
Student Works (examples)
Interactive Patches
Body Tracking
Used NVIDIA's AI-based Body Track CHOP to track the z position of the visitor. The video shows how students used this to blend between images depending on the user's position.
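A minimal sketch of the blend logic the students built on top of the tracking data: map the tracked z position (distance from the camera) to a 0..1 cross-fade weight between two images. The z range below is an assumption; in the patch, the value comes from the Body Track CHOP and would feed something like a Cross TOP.

```python
Z_NEAR = 1.0   # metres: visitor close to the wall -> show image A (assumed)
Z_FAR = 4.0    # metres: visitor far away -> show image B (assumed)

def blend_weight(z_position):
    """Clamp and normalize z into a 0..1 cross-fade weight."""
    t = (z_position - Z_NEAR) / (Z_FAR - Z_NEAR)
    return max(0.0, min(1.0, t))

print(blend_weight(2.5))  # ~0.5, halfway between the two images
```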
Motion Tracking
Since most students use Macs, which do not support the NVIDIA-based tracking, I developed a simple motion-tracking patch that detects motion in the camera view and lets users define hotspots that trigger changes in the content.
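The patch itself is built from TOPs inside TouchDesigner (frame differencing plus per-hotspot analysis); the sketch below shows the same idea in plain Python with OpenCV. The hotspot coordinates and threshold are illustrative assumptions.

```python
import cv2

HOTSPOTS = {"left": (50, 100, 200, 300), "right": (400, 100, 200, 300)}  # x, y, w, h
THRESHOLD = 12.0  # mean pixel difference that counts as "motion" (assumed)

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
if not ok:
    raise SystemExit("no camera frame")
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)  # per-pixel change since the previous frame
    prev = gray
    for name, (x, y, w, h) in HOTSPOTS.items():
        if diff[y:y + h, x:x + w].mean() > THRESHOLD:
            print(f"motion in hotspot '{name}'")  # here the patch would trigger a content change
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

cap.release()
```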
Facial Recognition
I used the Zig Sim app to leverage ARKit's face-tracking features for facial recognition, and developed triggers based on a smile, an open jaw, and closed eyes.
Communication was done wirelessly via OSC.
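As a sketch of the wireless OSC side, the snippet below uses the python-osc package to listen for a face value and fire a trigger above a threshold. In the patch this is handled by TouchDesigner's OSC In operators; the address pattern, blendshape name, and port here are assumptions for illustration and depend on how Zig Sim is configured.

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

SMILE_THRESHOLD = 0.6  # blendshape values arrive in the 0..1 range (assumed)

def on_face(address, *values):
    # Hypothetical handler: fire once the smile value passes the threshold.
    if values and values[0] > SMILE_THRESHOLD:
        print("smile trigger fired")

dispatcher = Dispatcher()
dispatcher.map("/ZIGSIM/face/smile", on_face)  # assumed address pattern

server = BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher)
server.serve_forever()  # listens for the phone on port 8000
```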
Phone Acceleration
Used the Zig Sim app to track the phone's acceleration and pass it to TouchDesigner, where it could be used to change the position of an object or modify content.
Students used this data to change the playback direction of a video and to create feedback traces in the room.
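One way the acceleration can flip playback direction, sketched below: smooth the incoming x-axis samples and use the sign of the result as the play speed. The smoothing factor and axis choice are assumptions; in the patch the samples arrive via OSC and the result drives a Movie File In TOP's play speed.

```python
SMOOTHING = 0.9  # simple exponential smoothing to suppress sensor jitter (assumed)

class PlayDirection:
    def __init__(self):
        self.filtered = 0.0

    def update(self, accel_x):
        """Feed one acceleration sample; return +1 (forward) or -1 (reverse)."""
        self.filtered = SMOOTHING * self.filtered + (1 - SMOOTHING) * accel_x
        return 1 if self.filtered >= 0 else -1

direction = PlayDirection()
for sample in [0.2, 0.1, -0.4, -0.6, -0.5]:  # fake samples; real ones come via OSC
    print(direction.update(sample))
```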