Malcolm Slaney on Connecting auditory, visual and motor signals
Date:
Fri, 01/20/2023 - 10:30am - 12:00pm
Location:
CCRMA Seminar Room
Event Type:
Hearing Seminar

Last summer I helped lead the auditory, visual and motor group at the Telluride Neuromorphic Engineering Cognition Workshop. This is a rather intense three-week workshop investigating projects at the intersection of neurophysiology, engineering and biology. It’s a lot of fun. (And the reason for more all-nighters than any other part of my career.)
This year the audio group looked at the connections between the auditory, motor and visual systems, using computer vision and brain decoding. Within this broad effort the work divided into two subprojects: violin and decoding.
On the violin side there was a lot of interest in connecting the appearance of a player with the sound, and vice versa. For example, can the recorded sound be used to fine-tune the finger-position estimates one might obtain with computer vision? These and related capabilities might be useful in an educational setting.
On the decoding side, we aimed to use MEG data to connect audio, visual and emotional signals to the brain. In the past we’ve been successful at connecting auditory attention to MEG and EEG signals. This year we added visual signals, through a visual model of fovea-driven saliency, pupil size and perhaps even emotion. We also explored musical auditory imagery, multi-lingual understanding, and even enculturation.
Who: Malcolm Slaney (Google)
What: Connecting auditory, visual and motor signals via the brain and computer vision
Where: CCRMA Seminar Room, top floor of the Knoll at Stanford
When: Friday January 20th at 10:30AM PST
Why: Everything connects to the brain, how do we decode it?
Come to the Hearing Seminar to hear about these pilot experiments. All enticing, I hope.
FREE
Open to the Public