Dissertation Defense: Perceptually Coherent Mapping Schemata for Virtual Space and Musical Method

Date: Wed, 04/02/2014, 3:00pm - 5:00pm
Location: CCRMA Stage
Event Type: Other

Perceptually Coherent Mapping Schemata for Virtual Space and Musical Method

a PhD Defense by Rob Hamilton


Our expectations for how visual interactions should sound are shaped in part by our own understandings of and experiences with objects and actions, and in part by the extent to which we perceive coherence between gestures that can be identified as “sound-generating” and the sonic events they produce. Advances in technology have made sonically dynamic, computer-generated audio-visual spaces not only possible but increasingly common. In sound design and music, real-time data generated by actors and their interactions in virtual space can be used to procedurally generate sound and music, creating tight couplings between the visual and auditory modalities. As procedurally generated music and audio become more prevalent in interactive systems, composers, sound designers and programmers face a growing need to understand the low-level, cross-modal relationships between visual gesture and sonified musical result.
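As a loose illustration of this kind of coupling (a sketch, not material from the dissertation itself), motion data streaming from a game engine can be rescaled and forwarded to a sound engine as control messages. The OSC port, addresses, and mapping ranges below are assumptions chosen for the example; a minimal Python version using the python-osc library might look like this:

# Illustrative sketch only: map an avatar's speed and height to pitch and
# amplitude and forward them to a sound engine over OSC. The port (57120),
# the addresses, and the linear ranges are assumptions, not details from
# the dissertation.
from pythonosc import udp_client

def rescale(value, in_lo, in_hi, out_lo, out_hi):
    # Clamp, then linearly rescale value from [in_lo, in_hi] to [out_lo, out_hi].
    t = max(0.0, min(1.0, (value - in_lo) / (in_hi - in_lo)))
    return out_lo + t * (out_hi - out_lo)

client = udp_client.SimpleUDPClient("127.0.0.1", 57120)  # e.g. a SuperCollider server

def sonify_motion(speed_mps, height_m):
    # Faster motion -> higher pitch; greater height above ground -> louder output.
    client.send_message("/avatar/1/pitch", rescale(speed_mps, 0.0, 10.0, 110.0, 880.0))
    client.send_message("/avatar/1/amp", rescale(height_m, 0.0, 50.0, 0.1, 0.8))

Called once per rendered frame with the avatar's current state, a mapping like this keeps the sonic result tightly locked to the visual gesture that produced it.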


Taking a two-pronged approach, this dissertation first presents a series of creative methodologies for the real-time musical sonification of visual motion and gesture within expansive rendered gaming environments. Examples of composed interaction, compositional mapping, and musical sonification, drawn from a series of immersive audio-visual works and inspired by the paradigms of avian flight, avatar physiologies, and virtual choreographies, will be discussed and dissected.

In an effort to better understand the perceptual concerns governing sonified gesture and motion, this dissertation also details a series of user studies designed to evaluate the perceptual coherence, or cross-modal correspondence, between actions and gestures performed in virtual space and the procedurally generated sound processes they trigger. Subjects were asked to watch short video examples of musically sonified avatar motions and rate the perceived fit between visual and auditory events. By defining each example as a composite set of simple motion and sound attributes, the statistical analysis of correlated motion/sound mappings and their relative contributions to the perceptual coherence of audio-visual interactions lays the groundwork for predictive models linking attributes of sound and motion to perceived fit.
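As a rough sketch of what such an analysis could look like (with invented attribute names and ratings, not data from the studies described above), perceived-fit ratings can be regressed on indicators for which motion/sound pairings were present in each example:

# Toy illustration only: the motion/sound pairings and ratings are invented.
import numpy as np

# Each row marks which mappings a stimulus contained:
# [speed->pitch, speed->loudness, height->pitch]
X = np.array([
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
    [1, 0, 1],
    [0, 1, 1],
], dtype=float)
ratings = np.array([4.2, 2.8, 3.9, 4.6, 3.1])  # mean perceived fit on a 1-5 scale

# Ordinary least squares: each coefficient estimates how much a given mapping
# contributes to perceived audio-visual fit.
coeffs, *_ = np.linalg.lstsq(X, ratings, rcond=None)
for name, c in zip(["speed->pitch", "speed->loudness", "height->pitch"], coeffs):
    print(f"{name}: {c:+.2f}")

A model of this shape, fit to real ratings, is what would let attributes of sound and motion predict perceived fit for mappings that were never directly tested.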

Bio: Rob Hamilton is a PhD candidate at CCRMA working with Chris Chafe. His artistic work and research interests focus primarily on the real-time control of interactive musical systems and new approaches for the composition and performance of dynamic multi-modal environments.


FREE
Open to the Public