Spatial layers


Project Summary

 written by Jason Sadural (jsadural@ccrma.stanford.edu)
 comments and suggestions always welcome

Our purpose is to create a methodology for understanding the maximum threshold of observable individual musical tracks that move rhythmically in space. This research will eventually lead to the creation of an auditory canvas that we can distort and perturb by stretching, bending, or even ripping. The products developed in this experiment are intended for the development of Astro-Sonification. Various psychoacoustically inspired experiments will be conducted, and data will be collected through a graphical user interface developed in PD with GEM. Experiments will primarily be conducted in the Listening Room.


GUI

The user interface mirrors the actual configuration of the Listening Room. The gray cubes represent the actual speaker locations and configuration with respect to the listener at the absolute center of the purple sphere. The solid blue spheres indicate individual output channels, which can be moved anywhere in virtual space. The user can zoom in and out of the sphere and rotate to any perspective.
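
As a point of reference, a minimal Python sketch of one possible coordinate convention for such an interface follows; the speaker angles below are placeholders, not the Listening Room's measured configuration.

 import numpy as np

 def sph_to_cart(azimuth_deg, elevation_deg, radius=1.0):
     """Azimuth 0 = front, positive to the left; elevation 0 = ear level."""
     az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
     return radius * np.array([np.cos(el) * np.cos(az),
                               np.cos(el) * np.sin(az),
                               np.sin(el)])

 # Placeholder 8-speaker layout (upper and lower rings); the real Listening
 # Room angles would be measured from the actual installation.
 speaker_angles = [(az, el) for el in (35, -35)
                   for az in (45, -45, 135, -135)]
 speakers = np.array([sph_to_cart(az, el) for az, el in speaker_angles])
 sources = {1: sph_to_cart(20.0, 10.0)}   # movable blue-sphere channels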

Spatialization

The current working driver, which enables the sound source to move anywhere along the sphere, is VBAP (vector base amplitude panning), developed by Ville Pulkki. Initial tests and subject responses have shown VBAP to localize point sources accurately within the virtual space. Further implementations involving physical modeling for PD have been added to the interface, such as a spring. In one test, 9 point sources attached along a stretchy string were moved along the sphere while the user perturbed it in real time, with convincing results.
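
For reference, the gain computation at the heart of 3-D VBAP can be sketched in a few lines of Python. This is a sketch of the published algorithm, not the actual PD external; the toy speaker triplet is invented for the example.

 import numpy as np

 def vbap_gains(p, L):
     """3-D VBAP gains for one loudspeaker triplet (after Pulkki, 1997).

     p -- unit vector toward the virtual source
     L -- 3x3 matrix whose rows are unit vectors toward three speakers
     Returns normalized gains, or None if the source is outside the triplet.
     """
     g = p @ np.linalg.inv(L)        # solve p = g1*l1 + g2*l2 + g3*l3
     if np.any(g < -1e-9):           # a negative gain means: wrong triplet
         return None
     return g / np.linalg.norm(g)    # constant-power normalization

 # Toy triplet: three mutually orthogonal speaker directions.
 L = np.eye(3)
 p = np.array([1.0, 1.0, 0.5])
 print(vbap_gains(p / np.linalg.norm(p), L))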

In Progress

  • Ambisonics

Simultaneous playback of sounds incorporating first-order Ambisonics, in order to create localization paths independent of the purple sphere. This implementation will use SuperCollider, with absolute position data communicated to PD over Open Sound Control (OSC).
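
The core of the first-order encoding, and the kind of OSC hand-off envisioned, might look like the Python sketch below. The actual implementation will live in SuperCollider; the OSC address pattern, host, and port here are assumptions, not the project's actual protocol.

 import numpy as np
 from pythonosc.udp_client import SimpleUDPClient

 def encode_b_format(signal, az, el):
     """First-order B-format encoding of a mono signal at a fixed direction
     (traditional weighting: W carries the signal scaled by 1/sqrt(2))."""
     return np.stack([signal / np.sqrt(2.0),
                      signal * np.cos(az) * np.cos(el),   # X
                      signal * np.sin(az) * np.cos(el),   # Y
                      signal * np.sin(el)])               # Z

 # Hand the absolute position to PD over OSC.
 pd = SimpleUDPClient("127.0.0.1", 9000)      # assumed host/port for PD
 pd.send_message("/source/1/xyz", [0.3, -0.5, 0.8])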

  • Real-time controllers

A wireless Bluetooth accelerometer-based sphere called BRBI, created by Woon Seung Yeo, will interact with the interface over OSC. Gestures such as rotation and shaking will be mapped to the physical models implemented in the interface, as sketched below.
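
A minimal sketch of how such a mapping could be received, assuming the BRBI streams three acceleration values per OSC message; the address pattern and the shake heuristic are assumptions.

 from pythonosc.dispatcher import Dispatcher
 from pythonosc.osc_server import BlockingOSCUDPServer

 def on_accel(address, ax, ay, az):
     # Shake strength = acceleration magnitude above 1 g (rough heuristic);
     # in the real patch this would excite the spring model.
     magnitude = (ax * ax + ay * ay + az * az) ** 0.5
     shake = max(0.0, magnitude - 1.0)
     print(f"spring excitation: {shake:.2f}")

 disp = Dispatcher()
 disp.map("/brbi/accel", on_accel)   # hypothetical BRBI address pattern
 BlockingOSCUDPServer(("127.0.0.1", 9001), disp).serve_forever()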

The Gloves of Shaolin will be a tool with which the user can perturb the auditory canvas and send projectiles through virtual space that sonically represent the inhomogeneities they encounter along their path.

  • Source path designer

For example, a figure-8 path along the sphere will be able to stretch vertically or horizontally in real time.
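
One simple way to parameterize such a path is as a 1:2 Lissajous curve over azimuth and elevation, with the stretch factors as live controls; a Python sketch, with made-up default extents:

 import math

 def figure8(t, width=60.0, height=30.0):
     """Figure-8 path as a 1:2 Lissajous curve in (azimuth, elevation),
     in degrees. Changing width/height in real time stretches the figure
     horizontally or vertically."""
     return width * math.sin(t), height * math.sin(2.0 * t)

 for i in range(8):                       # one pass around the path
     az, el = figure8(i * math.pi / 4)
     print(f"az {az:6.1f}  el {el:6.1f}")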

  • Spatial cues

Algorithms for Doppler shift, virtual wall reflections, and damping will be incorporated in accordance with the psychoacoustic experiment design.
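
For the Doppler component, the underlying relation is standard; a Python sketch of the frequency scaling (wall reflections and damping would sit on top of this):

 def doppler_factor(radial_velocity, c=343.0):
     """Frequency ratio heard by a stationary listener for a moving source:
     f_observed = f_source * c / (c - v), v > 0 when the source approaches."""
     return c / (c - radial_velocity)

 print(doppler_factor(10.0))    # source approaching at 10 m/s: pitch up ~3%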

Psychoacoustic experimentation

For canvas design it is necessary to determine several auditory thresholds. One is the physical limit to which a correlated, syncopated sound can be separated in virtual space while the user can still identify the correlation. Further discussions are needed to determine the types of sounds to be used. For the moment we will assume the sound sources to be confined to the sphere. The psychoacoustic experiment design is still at its most preliminary stages. More to come...
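
Once the stimuli are fixed, one standard way to measure such a separation threshold is an adaptive staircase; here is a Python sketch with a simulated listener standing in for real trials (the 40-degree figure is invented purely for the simulation, not a measured result):

 import random

 def trial(separation_deg):
     """Placeholder for a real listening trial: returns True if the listener
     still hears the two tracks as correlated. Simulated here with a made-up
     40-degree threshold."""
     return separation_deg < 40.0 + random.gauss(0.0, 5.0)

 def estimate_threshold(sep=10.0, step=5.0, n_reversals=8):
     """Simple 1-up/1-down adaptive staircase over angular separation."""
     going_up, reversals = True, []
     while len(reversals) < n_reversals:
         correlated = trial(sep)
         if correlated != going_up:        # direction change = one reversal
             going_up, reversals = correlated, reversals + [sep]
         sep = max(0.0, sep + (step if correlated else -step))
     return sum(reversals) / len(reversals)

 print(f"estimated threshold: {estimate_threshold():.1f} degrees")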