Measuring Musical Engagement Through Expressive Movement and EEG Brain Dynamics

Thu, 11/07/2013 - 12:00pm - 1:05pm
CCRMA Classroom - Knoll 216
A natural method for monitoring listener engagement without distracting the listener could contribute to music perception and education research, with possible wider applications to music classification, technology, and therapy. Music listeners often use words and/or gestures to describe music they enjoy. In addition to describing the music using genre or style categories (‘jazz’), they may use emotion words (‘happy’) and words related to physical movement (‘bouncy’). They may also use rhythmic hand gestures to convey the feeling and rhythmic pulse of the music. Instrumentalists use similar gestures during performance to focus their expressive intent.

We are testing the use of expressive gestures to study musical engagement, by which we mean the experience of ‘entering into’ heard or performed music: a condition in which the listener perceives the musical stimulation affectively while withdrawing his or her attention from extra-musical stimuli and concerns. We train expert and non-expert participants to communicate the feeling of music they are hearing using simple rhythmic U-shaped hand/arm 'conducting' gestures that animate the 2-D movement of a spot of light on a video display, while we record their movements and brain activity with body motion capture and EEG. The animation is intended to focus and simplify the conductor's movements.

We then ask viewers to rate the recorded 2-D spot animations of the gestures on a musical emotion rating scale, testing the extent to which the conductors' musical affective experience can be conveyed by these animations to viewers who do not hear the music. Ratings of the conductor and viewer groups were well correlated, verifying that the affective intent of the conductors' gestures can indeed be experienced by viewers. We then analyze the conductors' movement and EEG data to find movement and brain dynamic patterns supporting the expression of musical feeling.
We will also report results of training a classifier, using a filter-bank common spatial patterns (FBCSP) approach, to distinguish the Engaged and Not Engaged conditions from concurrently recorded EEG data.
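For readers unfamiliar with FBCSP, the core computation can be sketched as follows: band-pass filter the EEG into several frequency bands, learn common spatial pattern (CSP) filters per band from two-class trial covariances, and use log-variance of the spatially filtered signals as classifier features. This is a minimal generic sketch with synthetic data and standard SciPy routines, not the authors' actual pipeline; the band choices, channel counts, and function names here are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.signal import butter, filtfilt

def bandpass(x, lo, hi, fs, order=4):
    """Band-pass filter along the last (time) axis."""
    b, a = butter(order, [lo, hi], btype="band", fs=fs)
    return filtfilt(b, a, x, axis=-1)

def csp_filters(trials_a, trials_b, n_pairs=2):
    """Learn CSP spatial filters from two classes of trials.

    Each trial is a (n_channels, n_samples) array. Returns a
    (2 * n_pairs, n_channels) matrix of spatial filters.
    """
    # Average trace-normalized covariance per class
    cov_a = np.mean([t @ t.T / np.trace(t @ t.T) for t in trials_a], axis=0)
    cov_b = np.mean([t @ t.T / np.trace(t @ t.T) for t in trials_b], axis=0)
    # Generalized eigendecomposition: extreme eigenvalues give filters
    # maximizing variance for one class while minimizing it for the other
    vals, vecs = eigh(cov_a, cov_a + cov_b)
    order = np.argsort(vals)
    pick = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return vecs[:, pick].T

def log_var_features(trial, W):
    """Normalized log-variance features of the spatially filtered trial."""
    z = W @ trial                 # project onto CSP components
    var = np.var(z, axis=1)
    return np.log(var / var.sum())

# Illustrative use: 8 channels, 2 s at 128 Hz, synthetic two-class data
rng = np.random.default_rng(0)
fs = 128
engaged = [rng.standard_normal((8, 256)) for _ in range(10)]
not_engaged = [rng.standard_normal((8, 256)) for _ in range(10)]
for t in engaged:
    t[0] *= 3.0                   # class-specific variance on channel 0

band = (8.0, 12.0)                # one band of the filter bank (alpha)
fa = [bandpass(t, *band, fs) for t in engaged]
fb = [bandpass(t, *band, fs) for t in not_engaged]
W = csp_filters(fa, fb, n_pairs=2)
features = np.array([log_var_features(t, W) for t in fa + fb])
```

In full FBCSP this is repeated per frequency band, the per-band features are concatenated (often with feature selection), and a standard classifier such as LDA or an SVM is trained on the result.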
Open to the Public