Music in Motion is an innovation in musical performance that builds a relationship between audio, gesture, and dynamic imagery. The basis of this project is to capture the movements of a musician during a performance and use that information to manipulate music and graphics in real time.
The performance system consists of two parts. The first part captures the movements of an electric guitar player during a performance. This motion data is then used to drive audio effects and parameters such as tone, pitch, volume, and overdrive. For example, pulling up on the guitar's neck will change the position of a wah pedal or increase gain and distortion. The second part creates a visual presentation driven by the music and motion data. This real-time dynamic visualization can be projected during a performance, either on large screens behind the performers, on the sides of the stage, or on the floor and ceiling, depending on the venue.
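The motion-to-audio mapping described above can be sketched in a few lines. The following is a minimal illustration, not the project's actual implementation: it assumes a hypothetical sensor reading of the guitar neck's pitch angle (in degrees, positive when the neck is pulled up) and maps it to normalized effect parameters.

```python
def map_neck_angle(angle_deg, max_angle=45.0):
    """Map the guitar neck's pitch angle to normalized effect parameters.

    angle_deg: hypothetical sensor value; 0 = neck level,
    positive = neck pulled up. max_angle is an assumed usable range.
    Returns parameter values in [0, 1].
    """
    # Clamp to the usable range and normalize to [0, 1].
    t = max(0.0, min(angle_deg, max_angle)) / max_angle
    return {
        "wah_position": t,         # linear sweep of the wah pedal
        "overdrive_gain": t ** 2,  # squared curve: gentler onset for distortion
    }
```

In a live setting these normalized values would be sent on each sensor frame to the audio engine (for example as MIDI control-change or OSC messages); the squared curve on the gain keeps the distortion subtle until the neck is raised well past level.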
With both parts working together, a new genre of performance is born: one that connects the audience to the musician through dynamic visuals and physical body movement. The system encourages a highly expressive performance that leaves a lasting impression.
Body Movements and Gesture Control
A motion study was conducted in 2003 by Sofia Dahl and Anders Friberg, titled "What can the body movements reveal about a musician's emotional intention?" In this study, they videotaped a marimba player performing with the intent to convey sadness, anger, happiness, and fear. From the recordings they created four different clips that focused on key areas of movement and non-verbal communication. The clips were cropped to show the torso, the head, everything but the hands, and the full frame. A threshold filter was then applied to all of the footage, rendering it as a drawing or rotoscope (thus hiding any facial expressions). Twenty subjects were asked to view the clips and rate the emotional intent of the performer on a scale from 0 to 6 (0 meaning nothing and 6 meaning very much). Using the predefined emotions of sadness, anger, happiness, and fear, Dahl and Friberg concluded that all but fear were conveyed through the musician's movements alone. The subjects also rated the movement of the performance using adjectives such as fast, jerky, and uneven. They found that anger was characterized by large, fast, uneven, and jerky movements; happiness by large and somewhat fast movements; and sadness by small, slow, even, and smooth movements.
The information gathered in this study is a great starting point for the logic behind the dynamically generated visuals in the proposed performance system. I will be able to capture these distinct motion qualities (fast, large, jerky, etc.) from the performer and then connect them to colors and images that depict the intended emotion (happy, sad, angry, etc.). A simple example would be to connect jerky movements (anger) to high contrast reds with sharp...