FACE DJ is an experiment in building an audio controller driven by real-time face-tracking data from my iPhone. As the phone camera captures facial expressions and movements, audio clips and effects are triggered in real time, like DJing with one's face.

Technical workflow:

  1. I used ZIG SIM on my iPhone (with built-in LiDAR) to send ARKit face-tracking data to TouchDesigner on my computer via OSC.

  2. I linked TouchDesigner with Ableton Live via MIDI, where all the audio clips and effects were created and mapped.

  3. In TouchDesigner, I built a simple interface and connected the incoming data to trigger both audio clips in Ableton and virtual buttons and sliders in TouchDesigner.
