written by Jason Sadural (firstname.lastname@example.org) — comments and suggestions are always welcome
Our purpose is to create a methodology for determining the maximum threshold of observable individual musical tracks that move rhythmically in space. This research will eventually lead to the creation of an auditory canvas that we can distort and perturb by stretching, bending, or even ripping. The products developed in this experiment are intended for the development of Astro-Sonification. Various psychoacoustically inspired experiments will be conducted, and data will be collected through a graphical user interface developed in PD with GEM. Experiments will primarily be conducted in the Listening Room.
The developed user interface mirrors the actual configuration of the listening room. The gray cubes represent the actual speaker locations and configuration with respect to the listener at the absolute center of the purple sphere. The solid blue spheres indicate individual output channels, which can be moved anywhere in virtual space. The user can zoom in and out of the sphere as well as rotate it to any perspective.
The current working driver that enables a sound source to move anywhere along the sphere is VBAP, developed by Ville Pulkki. Initial tests and subject responses have shown VBAP to produce high accuracy for point-source localization in accordance with the virtual space. Further implementations involving physical modeling for PD, such as a spring, have been added to the interface. Tests have been run in which 9 point sources attached along a stretchy string move along the sphere while the user perturbs it in real time, with good results.
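The core of VBAP is simple to sketch: for a virtual source inside a loudspeaker triplet, the gains come from inverting the matrix of speaker direction vectors and are then power-normalized. The following is a minimal illustration, not the code of the actual vbap external used in the patch; the function name and the pure-Python matrix inverse are our own.

```python
import math

def vbap_gains(source, triplet):
    """Compute 3D VBAP gains for a virtual source inside a speaker triplet.

    source  -- unit vector (x, y, z) pointing toward the virtual source
    triplet -- three unit vectors pointing toward the loudspeakers
    Returns three power-normalized gain factors, one per speaker.
    """
    # Rows of the matrix L are the loudspeaker direction vectors.
    (a, b, c), (d, e, f), (g, h, i) = triplet
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    # Inverse of L via cofactors (L is 3x3 and assumed non-singular,
    # i.e. the three speakers are not coplanar with the origin).
    inv = [
        [(e*i - f*h)/det, (c*h - b*i)/det, (b*f - c*e)/det],
        [(f*g - d*i)/det, (a*i - c*g)/det, (c*d - a*f)/det],
        [(d*h - e*g)/det, (b*g - a*h)/det, (a*e - b*d)/det],
    ]
    # Unnormalized gains: g = p * L^-1, with p the source direction.
    gains = [sum(source[k] * inv[k][j] for k in range(3)) for j in range(3)]
    # Normalize so the three gains preserve constant power.
    norm = math.sqrt(sum(x * x for x in gains))
    return [x / norm for x in gains]
```

A source aimed exactly at one speaker gets that speaker alone; a source equidistant from all three gets equal gains, which is what the point-source localization tests above rely on.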
Simultaneous playback of sounds will incorporate First Order Ambisonics in order to create localization paths independent of the purple sphere. This implementation will be done in SuperCollider, and the absolute position data will be communicated to PD using Open Sound Control.
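For reference, first-order Ambisonic encoding of a mono source into B-format (W, X, Y, Z) reduces to a few trigonometric weights. This is a generic sketch of the standard encoding equations, not the eventual SuperCollider implementation; the function name and the conventional 1/√2 weighting on W are assumptions.

```python
import math

def encode_bformat(sample, azimuth, elevation):
    """Encode one mono sample into first-order B-format (W, X, Y, Z).

    azimuth and elevation are in radians. W is the omnidirectional
    component, carried here with the traditional 1/sqrt(2) weighting.
    """
    w = sample / math.sqrt(2.0)
    x = sample * math.cos(azimuth) * math.cos(elevation)
    y = sample * math.sin(azimuth) * math.cos(elevation)
    z = sample * math.sin(elevation)
    return w, x, y, z
```

Because the direction lives entirely in these four weights, the localization path can be steered by streaming (azimuth, elevation) updates, for example over OSC, without touching the audio graph.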
- Real-time controllers
A wireless Bluetooth accelerometer-based sphere called BRBI, created by Woon Seung Yeo, will interact with the interface over OSC. Gestures such as rotation and shaking will be mapped accordingly to the implemented physical models.
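One simple way to separate shaking from rotation in raw accelerometer data is to look at the magnitude of the acceleration vector: shaking swings it far from 1 g, while tilting keeps it steady but moves gravity off one axis. The heuristic below is our own illustration, not the BRBI's actual gesture mapping; the function name and thresholds are assumptions.

```python
import math

def classify_gesture(samples, shake_threshold=1.5):
    """Classify a short window of accelerometer readings.

    samples -- list of (ax, ay, az) tuples in units of g, gravity included.
    Returns 'shake', 'tilt', or 'rest'.
    """
    mags = [math.sqrt(ax*ax + ay*ay + az*az) for ax, ay, az in samples]
    # Large swings of the magnitude away from 1 g indicate shaking.
    if max(mags) - min(mags) > shake_threshold:
        return "shake"
    # Steady magnitude but gravity rotated off the z-axis indicates tilt.
    ax, ay, az = samples[-1]
    tilt_deg = math.degrees(math.atan2(math.hypot(ax, ay), az))
    if abs(tilt_deg) > 15.0:
        return "tilt"
    return "rest"
```

The resulting label (or the continuous tilt angle) could then be sent to PD over OSC and routed into the spring model's excitation inputs.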
The Gloves of Shaolin will be a tool with which the user can perturb the auditory canvas and send projectiles through virtual space that sonically represent the inhomogeneities they encounter along their path.
We will create a simple example with a sound moving around the audio sphere according to the graphical user interface, to check consistency. For our sounds we will mainly use short bursts, claps, snaps, etc.
- We will use a modification of pmpd/04_3D_exemple.pd, a sphere with a mass that is able to roll completely around it
- We will map the location of the sphere to the spherical position controls of VBAP
- We will test for the accuracy of observed position along different regions of the sphere
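The mapping step above amounts to a Cartesian-to-spherical conversion: the pmpd mass reports (x, y, z), while the VBAP position controls take azimuth and elevation. A minimal sketch, assuming a convention where azimuth 0° lies along +x and elevation is measured up from the horizontal plane (the actual patch's convention may differ):

```python
import math

def cartesian_to_vbap(x, y, z):
    """Convert a pmpd mass position (x, y, z) to (azimuth, elevation)
    in degrees for the VBAP position controls.
    """
    azimuth = math.degrees(math.atan2(y, x))
    elevation = math.degrees(math.atan2(z, math.hypot(x, y)))
    return azimuth, elevation
```

Since VBAP only uses direction, the radius of the rolling mass drops out; only the two angles need to be streamed to the panner each frame.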