This piece explores the transference of gesture from physical space (motion) to sonic space (sound) to visual space (video), using custom-built software that incorporates motion tracking and generative audio and video (version 2 of Motion-Influenced Composition).
Transference examines the ways in which gesture can pass between mediums: motion is analyzed and used to generate sound, which is in turn analyzed to generate video. What is lost during this transference of gesture? What is gained?
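The chain described above (motion analyzed into sound, sound analyzed into video) can be illustrated with a minimal sketch. The function names and the specific mappings here are hypothetical illustrations of the idea, not the actual software: gesture velocity is mapped to pitch, and that pitch is then mapped to a visual brightness value.

```python
def motion_to_sound(velocity):
    # Hypothetical mapping: normalized gesture velocity (0.0-1.0)
    # becomes a pitch in Hz; faster gestures produce higher pitches.
    base_hz, span_hz = 110.0, 880.0
    return base_hz + velocity * span_hz

def sound_to_video(frequency):
    # Hypothetical second stage: the generated pitch is re-analyzed
    # and mapped to a brightness value (0-255) for the video layer.
    lo_hz, hi_hz = 110.0, 990.0
    norm = (frequency - lo_hz) / (hi_hz - lo_hz)
    return round(norm * 255)

# A single gesture passes through both transferences:
brightness = sound_to_video(motion_to_sound(0.5))
```

Each mapping is lossy in its own way, which is one concrete sense in which something is "lost" at every transfer: the sound stage discards the gesture's direction, and the video stage discards the sound's timbre.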