Decoding Spatial Auditory Attention Using a Vestigial Pinna-Orienting System in Humans
Date:
Fri, 03/22/2019 - 10:30am - 12:00pm
Location:
CCRMA Seminar Room
Event Type:
Hearing Seminar

Daniel J. Strauss will be at CCRMA on Friday to talk about "Decoding Spatial Auditory Attention Using a Vestigial Pinna-Orienting System in Humans." Decoding auditory perception from brain signals is certainly one of the more interesting directions for auditory research. At ARO earlier this year in Baltimore, there was some interesting work showing that movements of the eyes show up as movements of the eardrum. Now Daniel Strauss is visiting from Saarland University, and he will be talking about how our pinnae might reflect our listening intent. Cool!
Who: Daniel J. Strauss (Saarland University)
What: Decoding Spatial Auditory Attention Using a Vestigial Pinna-Orienting System in Humans
When: Friday, March 22 at 10:30AM
Where: CCRMA Seminar Room on the Top Floor of the Knoll at Stanford
Why: Cats do it, can humans?
Point your pinna towards CCRMA on Friday for an interesting discussion. What might be the use of this? Do we actively use this signal while we seek to understand the auditory world around us? Come to CCRMA to find out more.
Decoding Spatial Auditory Attention Using a Vestigial Pinna-Orienting System in Humans
Prof. Dr. med. Dr. Daniel J. Strauss
Head of Systems Neuroscience & Neurotechnology Unit (SNN-Unit)
Medical Faculty and the Faculty of Engineering, HTWdS
Saarland University
In contrast to cats and dogs, we humans are commonly believed not to make pinna movements when focusing our attention. However, it has recently been suggested that we do have a pinna-orienting system, which has lain as a "neural fossil" within our brains for millions of years. In collaboration with Steven Hackley and Ronny Hannemann, we are digging for this "neural fossil," and I would like to discuss some interesting recent results. In particular, we looked at how spatial auditory attention can be decoded by means of surface electromyographic (EMG) signals from the auricular muscles. Apart from an analysis of automatic, reflex-like patterns, we found surprising but convincing evidence of sustained myographic signatures of the auricular muscles that follow the listener's voluntary attention. I will also show how far we are in employing this auricular EMG monitoring in a scheme for decoding the listening direction. As our EMG results are complemented by co-registered videos of subtle pinna motion in 3D, captured with a stereo-camera computer-vision setup, the talk has an illustrative component too.
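The abstract does not describe the actual decoding algorithm, but as a loose illustration of the general idea (not the speaker's method), a minimal left-versus-right decoder might compare windowed RMS amplitudes of EMG recorded over the two auricular muscles. Everything below — the function names, window size, and synthetic signals — is a made-up sketch:

```python
import numpy as np

def rms(x, axis=-1):
    """Root-mean-square amplitude along the given axis."""
    return np.sqrt(np.mean(np.square(x), axis=axis))

def decode_attended_side(emg_left, emg_right, win=256):
    """Guess the attended side ('left' or 'right') by comparing the
    mean windowed RMS of the two auricular EMG channels.
    emg_left / emg_right: 1-D arrays of equal length, arbitrary units.
    (Hypothetical decoder for illustration only.)"""
    n = min(len(emg_left), len(emg_right)) // win * win
    l = rms(emg_left[:n].reshape(-1, win)).mean()
    r = rms(emg_right[:n].reshape(-1, win)).mean()
    return "left" if l > r else "right"

# Synthetic demo: stronger muscle activity on the right channel.
rng = np.random.default_rng(0)
left = 0.5 * rng.standard_normal(4096)
right = 1.5 * rng.standard_normal(4096)
print(decode_attended_side(left, right))  # → right
```

A real decoder would of course need filtering, artifact rejection, and a trained classifier rather than a raw amplitude comparison, but the sketch conveys what "decoding the listening direction from auricular EMG" means in principle.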
Open to the Public