Joel Snyder on Auditory and Visual Rhythm Perception

Date: Fri, 02/01/2013 - 1:15pm - 2:30pm
Location: Seminar Room
Event Type: Hearing Seminar

We've got rhythm!  But how?

I'm pleased to introduce Joel Snyder to the CCRMA community. Joel studies the mechanisms we use to track rhythm. We can bop to either a visual or an auditory rhythm. But do the two modalities use the same mechanism, or do they track their rhythms separately? And how does the clock (or clocks) work?

Who: Joel Snyder (UNLV)

What: Auditory and Visual Rhythm Perception

When: Friday Feb 1 at 1:15PM

Where: CCRMA Seminar Room (Upstairs at the Knoll)


Joel will be visiting Stanford on Friday. Let me know if you would like to meet with him and talk about your common interests. I'll set up a schedule.

Title: "Auditory and visual rhythm perception"

Abstract:
Musical skills are usually considered to rely mostly on auditory and motor processing. However, musical information can also be gathered from vision and touch, raising questions about how much these non-auditory senses might contribute to musical behavior. The perception and production of timing patterns is one musical skill that has been studied extensively in both the visual and auditory modalities, providing hints about the extent to which vision might contribute to rhythm perception, and also about the fundamental neural mechanisms of timing. My talk will therefore attempt to synthesize research that addresses the following questions: Is auditory-based timing for the time intervals relevant to music better than visual-based timing for the same-sized intervals? How robust are any observed differences to various moderating factors? To the extent that there are differences between auditory- and visual-based timing, what theories can best explain the differences, and how can we test these theories? Finally, how does the comparison between auditory and visual timing inform an important issue in the literature, namely whether there are central timing mechanisms in the brain that compute time for all the senses, as opposed to modality-specific timing mechanisms?

Bio: Dr. Snyder received a Ph.D. in Psychology from Cornell University and was a postdoctoral fellow at the University of Toronto and Harvard University before starting the Auditory Cognitive Neuroscience Laboratory at UNLV. He is an expert on auditory perception and its neurological basis and has published numerous empirical studies and literature reviews in top psychology and neuroscience journals. His research has been supported by UNLV, the National Institutes of Health, the National Science Foundation, and the Army Research Office. Dr. Snyder’s research accomplishments were recognized with the 2009 Samuel Sutton Award for Early Distinguished Contribution to Human ERPs and Cognition.

FREE
Open to the Public