HvA Immersive Lab

The HvA asked me to build a flexible, reusable template for their Immersive Lab so that students and staff could easily project content onto its walls and add interactive elements to it.

I made several custom patches in TouchDesigner for this purpose. These can be found in this Git repository.

The main patch allows users to load any content into TouchDesigner and route it to the wall where it should be projected.
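Conceptually, the routing boils down to pointing each wall's output at the chosen source. Below is a minimal Python sketch of that idea, assuming one Select TOP per wall; the operator names are placeholders, not the template's actual ones.

# Minimal sketch of routing loaded content to a wall in TouchDesigner.
# Operator names (select_wall_*, moviefilein1) are assumptions.

WALLS = ['north', 'east', 'south', 'west']

def assign_content(wall, source_top='moviefilein1'):
    """Point the Select TOP feeding a given wall's projector at a source TOP."""
    if wall not in WALLS:
        raise ValueError(f'Unknown wall: {wall}')
    # Each wall is assumed to have a Select TOP that feeds its projector output.
    op(f'select_wall_{wall}').par.top = source_top

# Example: send the loaded movie to the north wall.
assign_content('north')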

There is also a splitter component that lets users load 16:9 content and split it across all the walls in the room. Pictures of the corners can be seen in the slideshow.
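The splitting itself can be done with one Crop TOP per wall, each taking an equal horizontal slice of the source. A rough sketch, with placeholder operator names and an assumed four-wall layout:

# Sketch of splitting a 16:9 source into equal horizontal slices, one per wall.
# Operator names and the wall count are assumptions.

NUM_WALLS = 4

def split_across_walls():
    # Each wall is assumed to have a Crop TOP wired to the loaded 16:9 source.
    for i in range(NUM_WALLS):
        crop = op(f'crop_wall{i + 1}')
        crop.par.cropleft = i / NUM_WALLS         # crop values are fractions 0-1
        crop.par.cropright = (i + 1) / NUM_WALLS

split_across_walls()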

I also developed several interactive components for students to reuse; see the list below.


Student Works (examples)

Interactive Patches

Body Tracking

I used NVIDIA's AI tracking via the Body Track CHOP to trace the visitor's z position. The video shows how students used this to blend between images depending on the user's position.
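As an illustration of the blending logic, a CHOP Execute callback could remap the tracked depth to a Cross TOP's blend value. The channel name and depth range below are assumptions, not the exact values from the student project.

# Sketch of mapping a tracked z position to a crossfade between two images.
# 'p1_tz' and the 1-4 m range are placeholders; check the Body Track CHOP's
# actual channel names and units in your network.

def onValueChange(channel, sampleIndex, val, prev):
    # Remap the assumed depth range to a 0-1 blend value.
    blend = tdu.remap(val, 1.0, 4.0, 0.0, 1.0)
    op('cross1').par.cross = max(0.0, min(1.0, blend))
    return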

Motion Tracking

Since most students use Macs, on which the NVIDIA-based operators are not available, I developed a simple motion-tracking patch that detects motion in the camera view and allows the creation of hotspots to trigger changes in the content.
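The hotspot idea can be sketched as follows: each hotspot's average motion level (for example from an Analyze TOP converted to CHOP channels) is compared against a threshold and fires a Trigger CHOP. Operator names and the threshold value are placeholders.

# Sketch of hotspot triggering from a motion mask. Assumes a CHOP called
# 'motion_levels' with one channel per hotspot, and one Trigger CHOP per
# hotspot named 'trigger_<channel name>'. Call this from a frame callback.

THRESHOLD = 0.08   # average pixel difference above which a hotspot fires

def check_hotspots():
    levels = op('motion_levels')
    for chan in levels.chans():
        if chan.eval() > THRESHOLD:
            op(f'trigger_{chan.name}').par.triggerpulse.pulse()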

Facial Recognition

I used the ZIG SIM app to leverage ARKit's face-tracking features for facial recognition. I developed triggers based on a smile, an open jaw, and closed eyes.

Communication was done wirelessly via OSC.
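As a sketch of the trigger logic, the blendshape values arriving in an OSC In CHOP can simply be thresholded. The channel names below are placeholders loosely based on ARKit's blendshape names; the exact addresses depend on the ZIG SIM configuration.

# Sketch of turning ARKit blendshape values (received via an OSC In CHOP)
# into simple triggers. Channel, operator names and thresholds are assumptions.

SMILE_THRESHOLD = 0.6
JAW_THRESHOLD = 0.5
BLINK_THRESHOLD = 0.7

def onValueChange(channel, sampleIndex, val, prev):
    osc = op('oscin_face')
    if osc['mouthSmileLeft'].eval() > SMILE_THRESHOLD:
        op('trigger_smile').par.triggerpulse.pulse()
    if osc['jawOpen'].eval() > JAW_THRESHOLD:
        op('trigger_jaw').par.triggerpulse.pulse()
    if osc['eyeBlinkLeft'].eval() > BLINK_THRESHOLD and osc['eyeBlinkRight'].eval() > BLINK_THRESHOLD:
        op('trigger_eyes').par.triggerpulse.pulse()
    return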

Phone Acceleration

I used the ZIG SIM app to track the acceleration of a phone and pass it to TouchDesigner, where it could be used to change the position of an object or modify content.

Students used this to change the playback direction of a video and to create feedback traces in the room.
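A rough sketch of that mapping, assuming the acceleration arrives as channels in an OSC In CHOP; channel and operator names are placeholders, and ZIG SIM's actual OSC addresses depend on its settings.

# Sketch of using phone acceleration to move content and flip video playback.

def onValueChange(channel, sampleIndex, val, prev):
    osc = op('oscin_phone')
    ax = osc['accel_x'].eval()
    ay = osc['accel_y'].eval()

    # Nudge an object around the wall with the horizontal acceleration.
    transform = op('transform_content')
    transform.par.tx = ax * 0.1
    transform.par.ty = ay * 0.1

    # Reverse the movie when the phone is shaken hard along x.
    movie = op('moviefilein_loop')
    movie.par.speed = -1 if ax < -1.0 else 1
    return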