FACE DJ is an experiment in building an audio controller from real-time face-tracking data on my iPhone. As the phone's camera captures facial expressions and movements, audio clips and effects are triggered in real time, like DJing with one's face.
I used ZIG SIM on my iPhone (with built-in LiDAR) to send ARKit face-tracking data to TouchDesigner on my computer via OSC.
I then linked TouchDesigner to Ableton Live via MIDI, where all the audio clips and effects were created and mapped.
In TouchDesigner, I built a simple interface and connected the incoming data so that it both triggers audio in Ableton Live and drives virtual buttons and sliders in TouchDesigner.
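The core of this kind of mapping is converting continuous face-tracking values into two things: MIDI CC values for effects, and one-shot triggers for clips. Below is a minimal sketch in plain Python, assuming ARKit blendshape values arrive normalized to 0.0–1.0 (as ZIG SIM sends them); the blendshape name, thresholds, and class names are illustrative, not the exact ones used in the project.

```python
def blendshape_to_cc(value: float) -> int:
    """Scale a 0.0-1.0 blendshape value to a 0-127 MIDI CC value."""
    clamped = max(0.0, min(1.0, value))
    return round(clamped * 127)


class Trigger:
    """Fire once when a value rises past a threshold, with hysteresis
    so jitter around the threshold doesn't retrigger the clip."""

    def __init__(self, on: float = 0.6, off: float = 0.4):
        self.on = on
        self.off = off
        self.active = False

    def update(self, value: float) -> bool:
        if not self.active and value >= self.on:
            self.active = True   # rising edge: fire the clip once
            return True
        if self.active and value <= self.off:
            self.active = False  # fell back below: re-arm
        return False


# Hypothetical usage: a jawOpen blendshape driving a clip trigger
# while the same value is scaled to a filter-cutoff CC.
jaw_trigger = Trigger(on=0.6, off=0.4)
for value in [0.1, 0.5, 0.7, 0.65, 0.3, 0.8]:
    fired = jaw_trigger.update(value)
    cc = blendshape_to_cc(value)
```

The hysteresis (separate on/off thresholds) matters in practice: raw face-tracking data is noisy, and a single threshold would rapid-fire the clip whenever the value hovers near it.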