PMCA Exhibit

From CCRMA Wiki
Revision as of 17:16, 3 November 2008 by Rob



Main Gallery Hidden Realities: The Invisible Stories Revealed by Data
January 25 - April 5, 2009

Themes: Sonification and Visualization


Possible Configurations

For the PMCA exhibit there are a number of q3osc configurations which could work. Constraints on the setup are based primarily on the physical layout of the space available in the presentation hall and on the relative levels of sound/noise in the area. The principal factors to consider are how many user terminals we will set up for museum-goers to use, whether video will be projected with a projector, and how sound will be projected in the space. What follows are a number of possible system configurations, each with its specific equipment requirements.

User-Centric vs. Space-Centric

There are two principal configurations of q3osc audio output which can be used here: 1) User-Centric audio output and 2) Space-Centric audio output.

In a User-Centric configuration, the audio for events in the virtual environment is rendered relative to the position of each individual user, in a fashion similar to traditional FPS video games. For example, a sound event occurring in the environment at a position behind and to the left of the client will be presented to the user in stereo, with its amplitude, panning, and delay calculated to sound as if it were coming from the client's left rear. If multiple clients are in an environment simultaneously, each is presented a sound world scaled around their own avatar's position in the environment. In this case, each client must wear headphones to prevent confusion and mingling of the different relative audio presentations.
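The User-Centric mapping above can be sketched roughly as follows. This is an illustrative sketch only, not q3osc's actual DSP: the function name, the equal-power pan law, and the inverse-distance roll-off are all assumptions chosen to show how amplitude, panning, and delay could be derived from an event's position relative to the listener's avatar.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, used for the distance-based delay estimate

def user_centric_pan(listener_pos, listener_yaw, event_pos):
    """Return (left_gain, right_gain, delay_s) for a sound event,
    scaled around the listener's own position and facing direction."""
    dx = event_pos[0] - listener_pos[0]
    dy = event_pos[1] - listener_pos[1]
    dz = event_pos[2] - listener_pos[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)

    # Angle of the event relative to the direction the avatar faces.
    angle = math.atan2(dy, dx) - listener_yaw

    # Equal-power stereo panning: events to the left favor the left channel.
    pan = max(-1.0, min(1.0, math.sin(angle)))   # +1 = hard left, -1 = hard right
    left = math.sqrt((1.0 + pan) / 2.0)
    right = math.sqrt((1.0 - pan) / 2.0)

    # Inverse-distance amplitude roll-off and a travel-time delay.
    atten = 1.0 / max(1.0, dist)
    delay = dist / SPEED_OF_SOUND
    return left * atten, right * atten, delay
```

So an event behind and to the left of a forward-facing avatar comes out louder in the left channel, quieter and later the farther away it is.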

In a Space-Centric configuration, audio events in the environment are spatialized across multiple point-source speaker locations which are directly correlated with positions in the environment. When sound events occur in the environment, their amplitude in each speaker is calculated from a scaled distance calculation (i.e. the distance between the event and each speaker in X,Y,Z coordinates). In this manner, the physical location of the presentation is mapped onto, or correlated with, the virtual environment, creating a powerful relationship between the two. Space-Centric presentations work well with the environment projected on video screens, so that real-world spectators can move through the physical space while watching and hearing events in the virtual environment. In this case, headphones cannot be used and sound must be projected in the physical space.
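The per-speaker amplitude calculation described above can be sketched as below. This is a minimal sketch under assumptions, not q3osc's actual routine: the function name, the 1/(1 + d) roll-off curve, and the peak normalization are illustrative choices for a "scaled distance calculation" across N speakers.

```python
import math

def space_centric_gains(event_pos, speaker_positions, rolloff=1.0):
    """Amplitude for each speaker from a scaled distance calculation:
    the farther a speaker is from the event, the quieter it plays."""
    gains = []
    for sp in speaker_positions:
        dist = math.dist(event_pos, sp)          # Euclidean distance in X,Y,Z
        gains.append(1.0 / (1.0 + rolloff * dist))
    # Normalize so the speaker nearest the event plays at full amplitude.
    peak = max(gains)
    return [g / peak for g in gains]
```

The same function works unchanged for the 3-, 4-, and 8-speaker layouts listed below; only the list of speaker coordinates changes.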

Single User-Centric: Stereo, no projection (headphones or speakers)


  • 1 cpu (laptop or desktop w/ monitor, keyboard, mouse)

Simplest setup. One user at a time moves through the environment, and their motions and projectiles are sonified in stereo based on each event's proximity to their client's location in the virtual environment. All audio processing can be done on the same machine as the game server/client, making this the easiest system to set up and maintain.

Impact: Least impact... only one user at a time means no multi-user interaction and no possible collaborative effects (e.g. two users near each other producing a pitch representing their distance). With no projector, passers-by won't be drawn in to come and watch.

Single User-Centric: Stereo with projection or large plasma display (headphones or speakers)


  • 1 cpu

Multi-User-Centric: Stereo (headphones or speakers)


Space-Centric: 3-channel/speaker Triangle configuration


Space-Centric: 4-channel/speaker configuration


Space-Centric: 8-channel/speaker configuration


Possible World Ideas

2-way Mirror: "MirrorriM"

Projector displays the q3osc world on a wall in the space... place a frame of sorts around the view location... it should look as seamless as possible, as if it were a window into the next room

The camera stays in one location... you see the environment as if through a picture frame

Two client setups whose users move around making sound in the environment...

Instrument Station-based world

Performers can move around to different "instrument" constructs in the world. Each area/shape/instrument has different musical and audio properties and different ways that users can interact with it.
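One way the station-based idea could be wired up is a simple lookup from avatar position to the instrument zone it currently occupies. Everything here is hypothetical: the zone names, spherical-zone shapes, and synthesis parameters are placeholders, since the actual instruments have not been designed yet.

```python
import math

# Hypothetical instrument zones: name -> (center, radius, synthesis params).
INSTRUMENTS = {
    "drone_pillar": ((0.0, 0.0, 0.0), 200.0, {"base_pitch": 110.0}),
    "chime_ring":   ((600.0, 0.0, 0.0), 150.0, {"base_pitch": 880.0}),
}

def instrument_at(avatar_pos):
    """Return (name, params) of the instrument construct the avatar is
    currently inside, or None when it is outside every zone."""
    for name, (center, radius, params) in INSTRUMENTS.items():
        if math.dist(avatar_pos, center) <= radius:
            return name, params
    return None
```

Each zone's params would then select the sonification behavior applied to motions and projectiles while the performer stays inside that construct.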