
Interpreter

Stephanie Jantzen, Atsunobu Takemoto, Yoonjae Choi


This work is an audiovisual soundscape that explores the semiotic meanings embedded in gestures and signs. It is structured as a semiotics-based performance in which hand movements, captured and analyzed through MediaPipe-based gesture recognition, are translated into transformations of sound and image.
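The mapping from recognized hand poses to sound and image parameters is not detailed here, but the landmark data MediaPipe Hands produces (21 normalized points per hand, with wrist at index 0 and fingertips at 4, 8, 12, 16, 20) lends itself to simple continuous measures. As a minimal sketch, assuming landmarks arrive as `(x, y)` tuples in MediaPipe's ordering, a crude "openness" value can be derived by counting fingers whose tip lies farther from the wrist than its middle joint; the function name and threshold logic are illustrative, not the piece's actual classifier:

```python
import math

# MediaPipe Hands landmark indices (stable across versions)
WRIST = 0
FINGER_TIPS = (4, 8, 12, 16, 20)   # thumb..pinky tips
FINGER_PIPS = (3, 6, 10, 14, 18)   # joint just below each tip

def dist(a, b):
    """Euclidean distance between two (x, y) landmarks."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def extended_fingers(landmarks):
    """Count fingers whose tip is farther from the wrist than its
    middle joint -- a rough 'openness' measure over the 21 normalized
    (x, y) points that MediaPipe Hands emits per detected hand."""
    wrist = landmarks[WRIST]
    return sum(
        dist(landmarks[tip], wrist) > dist(landmarks[pip], wrist)
        for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
    )
```

A value like this, updated per video frame, can then be smoothed and scaled before driving audio or visual parameters.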


The use of a semiotic language in this work stems from an interest in exploring the possibility of universal communication within a contemporary society that is increasingly both pluralistic and globally standardized. Gestures, signs, and sounds from everyday life function as mediators that are not bound to a specific language, yet remain widely recognizable across different cultural contexts. Through this, they are reconfigured into an expanded system of language.
Recognized gestures operate as linguistic units, and bodily movement generates meaning through audiovisual transformation. In this process, familiar signs and physical actions move beyond fixed meanings and emerge as a performative language that is continuously redefined through relational contexts.


This work consists of AI-generated visuals developed into a multi-layered structure to create dynamic and transformative expressions. Presented across five screens and interacting with sound, it forms a spatial, immersive experience. Based on an abstract narrative, the work begins with a dark, colorless world shaped by war and natural disasters. It then depicts a process of regeneration in which humanity merges with nature and technology. Drawing on posthumanist ideas, the work explores the concept of “unity,” encompassing both the integration of humans, nature, and technology, and the connections among people beyond geographical and cultural boundaries.


By combining gesture recognition, spatial audio, and visual collage, the work creates an immersive environment in which meaning is formed through the interaction between body, sound, and image. Ultimately, these interactions sensorially reveal the possibility of “unity” that connects differences while allowing them to remain distinct.
 

“This project was a lot of firsts for me. I hadn’t worked with TouchDesigner or dealt with networking via OSC on this scale before, so it took a lot of trust in myself and my collaborators to make this real-time ecosystem as airtight as possible and, more importantly, to make a compelling experience out of it.” – Stephanie
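The OSC networking Stephanie mentions typically means sending small UDP packets that TouchDesigner's OSC In operators can receive. The exact addresses and ports used in the installation are not documented here; as a minimal sketch using only the Python standard library, an OSC message with float arguments can be encoded by hand (null-terminated, 4-byte-padded strings followed by big-endian 32-bit floats, per the OSC 1.0 spec). The `/hand/openness` address and port 7000 below are assumptions for illustration:

```python
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *values: float) -> bytes:
    """Encode a minimal OSC message carrying float arguments.

    Suitable for sending over UDP to a listener such as a
    TouchDesigner 'OSC In' CHOP. Address and values are illustrative.
    """
    type_tags = "," + "f" * len(values)
    payload = _pad(address.encode()) + _pad(type_tags.encode())
    for v in values:
        payload += struct.pack(">f", v)  # OSC floats: big-endian 32-bit
    return payload

# Sending to an assumed local TouchDesigner instance:
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(osc_message("/hand/openness", 0.8), ("127.0.0.1", 7000))
```

Libraries such as python-osc wrap this encoding, but the raw layout above is what travels over the wire between the gesture-tracking machine and the machines driving sound and the five screens.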

Interpreter Screen Media

Interpreter_handTrackProgress_3-15.mp4
Interpreter_handTrackProgress_3-11.mp4