Playing an instrument can be an intensely physical experience. During an emotional performance, musicians may contort their faces, change their posture, or even gesture with their instruments. Many of these physical changes are natural expressions of emotion, yet they are rarely captured as musically meaningful signals.
In this project, physical gestures, particularly those that move an instrument through space, are mapped onto musical effects parameters. In the current instantiation, ubiquitous commodity devices (e.g., an iPhone or a Wiimote) link gestural and haptic controls with outboard musical effects processors.
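One way such a gesture-to-parameter mapping might work is to read a device's accelerometer and scale it to a control value understood by an outboard processor, such as a MIDI continuous controller. This is a minimal sketch, not the project's actual implementation; the tilt range, the linear scaling, and the function name `tilt_to_cc` are all assumptions for illustration.

```python
def tilt_to_cc(tilt, cc_min=0, cc_max=127):
    """Map a normalized accelerometer tilt in [-1.0, 1.0] to a MIDI CC value.

    tilt: hypothetical normalized reading from a device accelerometer axis.
    Returns an integer in [cc_min, cc_max], suitable for sending as a
    MIDI control-change message to an outboard effects processor.
    """
    # Clamp noisy sensor readings to the expected range
    t = max(-1.0, min(1.0, tilt))
    # Linearly rescale [-1, 1] onto [cc_min, cc_max]
    return cc_min + round((t + 1.0) / 2.0 * (cc_max - cc_min))

# Example: full left tilt, neutral, and full right tilt
print(tilt_to_cc(-1.0), tilt_to_cc(0.0), tilt_to_cc(1.0))  # → 0 64 127
```

In practice the resulting value would be sent over MIDI or OSC to the effects unit; the mapping curve (linear here) could equally be exponential or gated to suit the musical gesture.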