Prof. Takako Fujioka on Neural correlates for prediction of musical structures and learning-related functional plasticity
What can EEG (and MEG) tell us about how we perceive polyphony, syntax, and longer-scale relationships? How does our perception of time develop as we listen to a musical piece? I think this work is important because we know precious little about how music perception works at the neural level. EEG and MEG give us tools to help us understand how we represent music internally, and how our perception develops over time. I’m very happy to announce that Prof. Takako Fujioka will be at the Hearing Seminar to update us with her latest work.
Who: Prof. Takako Fujioka (Stanford CCRMA)
What: Understanding Music Perception with EEG/MEG
When: Friday, November 13th at 10:30AM
Where: CCRMA Seminar Room
Why: Because we need to know more about the mechanisms of perception (music or not)
Abstract
Music is an important part of human culture for social communication. Music making and listening activities require complex coordination of multi-modal sensorimotor, cognitive, and affective functions. From early ages, music has been used not only for educational purposes but also for therapeutic purposes for those who are physically or mentally challenged. The nature of human musical abilities therefore traverses major subjects of cognitive neuroscience. Non-invasive neurophysiological recording techniques such as MEG and scalp EEG, with their excellent temporal resolution, allow us to investigate neural mechanisms for analyzing incoming sounds as they unfold over time, predicting what comes next, and coordinating perception and action.
In this talk I will feature studies from recent years, including those from our new EEG laboratory at CCRMA, Stanford. Our data demonstrate neural processing of polyphonic melodies, prediction of musical syntax, interaction between global and local harmonic relationships, and rhyming of spoken words. I will also discuss recent MEG data on the neural representation of internalized timing information that connects the auditory and motor systems.
Fujioka T, Ross B, Trainor LJ (in press) Beta-Band Oscillations Represent Auditory Beat and Its Metrical Hierarchy in Perception and Imagery. Journal of Neuroscience.
Fujioka T, Fidali BC, Ross B (2014) Neural correlates of intentional switching from ternary to binary meter in a musical hemiola pattern. Frontiers in Psychology 5:1257.