Antje Ihlefeld - Predicting spatial audio quality for AR/VR
Date:
Fri, 05/12/2023 - 10:30am - 12:00pm
Location:
CCRMA Seminar Room
Event Type:
Hearing Seminar 
Antje Ihlefeld, who now leads auditory perception research at Meta Reality Labs in Redmond, will be talking about research she has done to better understand how central nervous system function affects the perception of spatial sounds. Her abstract is below.
Who: Antje Ihlefeld (Meta Reality Labs)
What: Predicting spatial audio quality for AR/VR
When: Fri, 05/12/2023 - 10:30am - 12:00pm
Where: CCRMA Seminar Room
Why: How do we make augmented/virtual reality better?
Come to CCRMA, where we’ll use a real environment to describe how we interact with a complicated 3D sound field.
— Malcolm
Predicting Audio Quality for VR and AR
Achieving a sense of "presence" is a key goal for virtual and augmented reality (VR/AR). To truly immerse people in a virtual environment, audio experiences need to be indistinguishable from reality. However, achieving this level of audio quality can be a complex task, particularly when trying to balance computational complexity and the need for in-person perceptual studies. One solution is to develop computational metrics that rely on objective measures to predict spatial audio quality. Current approaches tend to rely on single-task sensory processing thresholds to gauge the amount of sensory information received at the auditory periphery. While these metrics can be useful, they only provide a limited picture of audio quality.
To develop more comprehensive audio quality metrics, it is necessary to consider central auditory processing. The thesis of this talk is that new paradigms are needed to better understand how listeners build up expectations and interact with virtual environments, in order to achieve a sense of presence. I will present behavioral evidence on how the brain processes and interprets audio information, as well as how expectations and interactions with the environment can shape auditory thresholds. Together, these findings suggest that by considering central auditory processing when developing audio quality metrics, we can gain a more comprehensive understanding of how to arrive at audio quality that is indistinguishable from reality.
Bio: Antje Ihlefeld applies principles of auditory neuroscience to immersive AR/VR technology. Prior to joining Reality Labs at Meta as Tech Lead for Auditory Perception, she was the principal investigator in a federally funded lab working on restoring hearing in individuals with profound hearing loss, and a professor of biomedical engineering. Antje is passionate about driving technological advances through science and maintains close ties with higher education. She is a visiting professor at the Neuroscience Institute at Carnegie Mellon University.
Suggested reading: https://www.nature.com/articles/s41598-021-00328-0
FREE
Open to the Public