Intermedia




Throughout my PLOrk and CCRMA years, I've had opportunities to experiment with new modes of musical expression and to broaden my definition of it. With the help of various tools (both standard and unlikely) and quirky people, I've produced some of my weirdest and most fun musical projects, which you will find below.

Lights

Lights is an interactive sonic visualizer, implemented in C++/OpenGL, that takes the FFT of an incoming audio signal and uses the magnitude of each frequency bin to trigger certain animations in the graphics. The vision was to represent, with a bit of randomness, the movements of the sonic "particles" that together create the music we hear. Each bin of the FFT output is randomly assigned to a place on a 3-dimensional grid, and flickers there very quickly whenever its magnitude exceeds a certain threshold. Thousands of these little instances take place each moment, resulting in an aggregate of flickering "lights" that dance to the music.

The 3-D grid in the background (yellow) also flickers in response to the magnitude information, but with a threshold/sensitivity different from that of the lights. A band of pink lights appears in the middle of the screen in response to high frequencies, which often correspond to the snares, claps, and other percussive hits common in pop/dance music. The user can also contribute to the visual effects by rotating the grid at different speeds, zooming in and out, toggling the lights or the grid, and changing the colors of the lights ("party mode"). Combining these three forces (randomness, the particular characteristics of the musical input, and the creativity of the user) can produce some unique and fun visual effects.
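
If you're curious about the core idea, here is a rough C++ sketch of the bin-to-light mapping described above. It is an illustration rather than the actual Lights source: the struct, function names, grid size, and threshold are all made up for clarity.

#include <cstdlib>
#include <vector>

// each FFT bin owns one "light" somewhere on the 3-D grid
struct Light
{
    float x, y, z;   // random grid position, assigned once at startup
    bool  on;        // whether the light flickers this frame
};

// assign each of the numBins FFT bins to a random spot on the grid
std::vector<Light> makeLights( int numBins, int gridSize )
{
    std::vector<Light> lights( numBins );
    for( Light & l : lights )
    {
        l.x = (float)( rand() % gridSize );
        l.y = (float)( rand() % gridSize );
        l.z = (float)( rand() % gridSize );
        l.on = false;
    }
    return lights;
}

// called every frame with the latest FFT magnitudes
void updateLights( std::vector<Light> & lights,
                   const std::vector<float> & magnitudes,
                   float threshold )
{
    for( size_t i = 0; i < lights.size() && i < magnitudes.size(); i++ )
        lights[i].on = ( magnitudes[i] > threshold );   // flicker on loud bins
}

Everything else (the OpenGL drawing, the background grid, the pink band, party mode) layers on top of this simple threshold check.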


Credits: RtAudio by Gary P. Scavone; chuck_fft by Ge Wang & Perry R. Cook; VisualSine template by Ge Wang

Doors

Doors is a sonic performance for one performer and eight laptops, implemented using ChucK and SLOrk's unique hemispherical speakers. The performer is something of a "sonic wizard" who pulls various sounds through an invisible doorway and directs them to dance and whirl all around her at the flick of a wand (well, a Gametrak controller).
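
The spatialization itself is conceptually simple. Below is a rough C++ sketch, not the actual ChucK code, of one way to pan a sound around a ring of eight speakers based on the wand's angle, using an equal-power crossfade between the two nearest speakers. The function name and constants are illustrative assumptions.

#include <cmath>
#include <vector>

// angle is the wand's direction in radians; returns one gain per speaker
std::vector<float> panGains( float angle, int numSpeakers = 8 )
{
    const float TWO_PI  = 6.2831853f;
    const float HALF_PI = 1.5707963f;
    std::vector<float> gains( numSpeakers, 0.0f );

    // express the direction in "speaker units", wrapped into [0, numSpeakers)
    float pos = std::fmod( angle, TWO_PI ) / TWO_PI * (float)numSpeakers;
    if( pos < 0.0f ) pos += (float)numSpeakers;

    int   lo   = (int)pos % numSpeakers;     // nearest speaker on one side
    int   hi   = ( lo + 1 ) % numSpeakers;   // nearest speaker on the other
    float frac = pos - std::floor( pos );    // how far between the two

    gains[lo] = std::cos( frac * HALF_PI );  // equal-power crossfade
    gains[hi] = std::sin( frac * HALF_PI );
    return gains;
}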

This piece premiered live at the Stanford Center at Peking University (SCPKU) as part of the SLOrk in China summer program. Please note that the live performance, with the audio spatialized in eight different directions, doesn't really lend itself to a simple video capture, but I hope the video will at least give you a sense of what the project was about. Stereo headphones are highly recommended.


Cookin' by BurritoFries

Food and music have quite a bit in common. They both nourish us, are widely loved, and involve the alchemical transformation of raw ingredients into a flavorful, dynamic whole. Cookin' is an interactive performance (enabled by ChucK and Gametrak controllers) that blends music, motion, and kitchen stuff to cook up some fun.


Sardines

Sardines was inspired by the schools of sardines I saw at the Monterey Bay Aquarium in California. Their synchronized and rapidly propagating movements were absolutely mesmerizing, and I set out to create a simple musical simulation of this behavior using ChucK. The basic premise is as follows (a small code sketch follows the list):
1. There are six different "fish" represented by six different note groups (five distinct pitch classes, plus bass notes in a lower register).
2. The first fish responds only to the changes in pitch, interval, or volume made by the sixth fish; the second fish responds only to the first; the third fish responds only to the second; and so on.
3. These changes are either "composed" or randomly triggered throughout the piece, representing the school's twists and turns through space.
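
The sketch below, written in C++ for readability rather than the original ChucK, shows how a single "composed" change could ripple along the chain of six fish, one fish per step. The pitch values and the whole-step shift are made up for illustration.

#include <array>
#include <cstdio>

int main()
{
    // MIDI-style pitches for the six fish (five pitch classes plus a bass note)
    std::array<int, 6> pitch = { 60, 62, 65, 67, 69, 48 };

    int shift = 2;   // a "composed" change: move up a whole step
    // the change starts at the first fish and ripples along the chain,
    // one fish per step, each fish copying its predecessor's move
    for( int fish = 0; fish < 6; fish++ )
    {
        pitch[fish] += shift;
        printf( "step %d: fish %d moves to pitch %d\n", fish, fish, pitch[fish] );
    }
    return 0;
}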


Some years later, I took on this project once more, this time adding visual and interactive components through C++/OpenGL. Here, again, each fish is assigned a note, and the distance from a fish to its neighbors reflects the pitch-class interval between the two. The aggregate of these intervals makes up a chord, which the user can change (among major, dorian, pentatonic, or "mystery") and transpose with keystrokes. Visually, changing the chord makes the fish rearrange their positions within the flock into a new configuration (thus changing the distance, or interval, between each fish), whereas transposition corresponds to a 90-degree horizontal turn.
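
As a rough illustration of the chord-to-flock mapping, here is a small C++ sketch. The pitch-class sets and the idea of reusing neighboring intervals as spacing values are assumptions for clarity, not the actual configurations used in the piece.

#include <cstdio>
#include <vector>

// a few example pitch-class "configurations" above the root (values assumed)
std::vector<int> chordIntervals( char type )
{
    switch( type )
    {
        case 'M': return { 0, 2, 4, 7, 11 };   // major-ish
        case 'd': return { 0, 2, 3, 7, 10 };   // dorian-ish
        case 'p': return { 0, 2, 4, 7, 9 };    // pentatonic
        default:  return { 0, 1, 6, 8, 11 };   // "mystery"
    }
}

int main()
{
    int transpose = 2;   // a keystroke transposes everything up a whole step
    std::vector<int> pcs = chordIntervals( 'p' );

    for( size_t i = 0; i < pcs.size(); i++ )
    {
        int pitchClass = ( pcs[i] + transpose ) % 12;
        // the interval to the next note doubles as a spacing value in the flock
        int spacing = ( i + 1 < pcs.size() ) ? pcs[i + 1] - pcs[i] : 0;
        printf( "fish %zu: pitch class %d, distance to next fish %d\n",
                i, pitchClass, spacing );
    }
    return 0;
}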

I also streamlined the propagation logic based on Iain Couzin's research on flocking behavior. The new rules are as follows (a sketch of the predator rule appears after the list):
1. Each fish (note) maintains a constant distance (interval) away from its neighbors, within a given configuration (chord).
2. Each fish (note) only responds to its nearest neighbors' movements (transposition).
3. When a predator comes, run! (Clicking somewhere on the screen will make the fish quickly run away from that spot and then regroup somewhere else.)
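
Here is a minimal C++ sketch of rule 3, with made-up names and numbers: on a click, every fish darts straight away from the clicked point, then eases back toward the flock's new center to regroup.

#include <cmath>
#include <vector>

struct Fish { float x, y, vx, vy; };

// rule 3: on a click, every fish darts directly away from the clicked point
void scatterFrom( std::vector<Fish> & school, float clickX, float clickY, float speed )
{
    for( Fish & f : school )
    {
        float dx = f.x - clickX, dy = f.y - clickY;
        float len = std::sqrt( dx * dx + dy * dy );
        if( len < 1e-6f ) { dx = 1.0f; dy = 0.0f; len = 1.0f; }  // avoid divide-by-zero
        f.vx = speed * dx / len;
        f.vy = speed * dy / len;
    }
}

// afterwards, each fish eases back toward the flock's new center to regroup
void regroup( std::vector<Fish> & school, float easing )
{
    if( school.empty() ) return;
    float cx = 0.0f, cy = 0.0f;
    for( const Fish & f : school ) { cx += f.x; cy += f.y; }
    cx /= school.size();
    cy /= school.size();
    for( Fish & f : school )
    {
        f.vx += easing * ( cx - f.x );
        f.vy += easing * ( cy - f.y );
    }
}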


Credits: RtAudio by Gary P. Scavone; MCD-API and Raka framework by Ge Wang

Falling

Falling is a real-time interactive program, implemented in Max/MSP in conjunction with the PS2 Gametrak controller, for intuitive, movement-based audiovisual exploration. It starts off in a "default" state, where the music plays at regular speed and the balls on the screen are safely bound to the ground by gravity. The only upward force is powered by the music: changes in the fundamental frequency trigger random balls to shoot upward, only to be pulled back down by gravity eventually. Pulling the controller strings in different directions, however, causes various "anomalies" to occur: musically, these include pitch/tempo shifts, reshuffled audio, and filter effects; visually, these correspond to shifting gravity, rotating planes, and flashing colors, respectively.
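
For a sense of the default state's logic, here is a rough C++ sketch (the actual project is a Max/MSP patch) of balls bound by gravity and kicked upward when the detected fundamental frequency changes. The threshold and kick strength are made-up values.

#include <cmath>
#include <cstdlib>
#include <vector>

struct Ball { float y, vy; };

// one simulation step: gravity pulls everything down, and a noticeable change
// in the detected fundamental frequency kicks one random ball upward
void step( std::vector<Ball> & balls, float gravity, float dt,
           float prevF0, float currF0, float kick )
{
    if( !balls.empty() && std::fabs( currF0 - prevF0 ) > 10.0f )  // threshold assumed
        balls[ rand() % balls.size() ].vy += kick;

    for( Ball & b : balls )
    {
        b.vy -= gravity * dt;                           // gravity always wins eventually
        b.y  += b.vy * dt;
        if( b.y < 0.0f ) { b.y = 0.0f; b.vy = 0.0f; }   // stay bound to the ground
    }
}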

Below is a quick summary of the relationships between the music, the visuals, and the movements:

Left String
1) x-axis movement (pulling the string right/left)
   = filter cutoff frequency (right => lower, left => higher)
   = color variation of the balls (right => more uniform, left => more varied)
2) y-axis movement (pulling the string forward/backward)
   = sample playback rate (forward => higher, backward => lower)
   = gravity (forward => larger, backward => smaller)
3) z-axis movement (pulling the string up/down)
   = audio volume (up => louder, down => softer)

Right String
4) x-axis: same as 1) above
5) y-axis movement
   = row number in the 2D wavetable (forward => row 2, center => row 1, backward => row 3)
   = changing dimensions (forward => smaller x-dimension, backward => smaller z-dimension)
6) z-axis: same as 3) above

The left and right strings independently play the same music track (at slightly different speeds), so the user can experiment with alternating between the two and creating various combinations of effects.
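
To make the left-string mappings concrete, here is a hedged C++ sketch: the axis values are assumed to be normalized to [-1, 1], and all of the specific ranges are illustrative rather than the values used in the actual patch.

// parameters controlled by the left string (ranges below are assumptions)
struct LeftStringParams
{
    float cutoffHz;       // right (+x) => lower, left (-x) => higher
    float playbackRate;   // forward (+y) => faster, backward (-y) => slower
    float gravity;        // forward (+y) => stronger, backward (-y) => weaker
    float volume;         // up (+z) => louder, down (-z) => softer
};

// x, y, z are the string's position normalized to [-1, 1] on each axis
LeftStringParams mapLeftString( float x, float y, float z )
{
    LeftStringParams p;
    p.cutoffHz     = 2000.0f - x * 1500.0f;   // roughly 500 Hz .. 3500 Hz
    p.playbackRate = 1.0f + y * 0.5f;         // roughly 0.5x .. 1.5x
    p.gravity      = 9.8f + y * 5.0f;         // stronger when pulled forward
    p.volume       = 0.5f + z * 0.5f;         // 0 .. 1
    return p;
}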


The music track used is Rachmaninov's "Vocalise" from 14 Romances, Op. 34.

Cats

Cats is a simple music app (built with MobMuPlat and Pure Data) about cats, pretty colors, and a bit of randomness. It uses random number generation to trigger certain pitches (or sound bites), along with corresponding squares that appear on screen, at any given moment. Motions like sliding the bar in the middle and tilting or shaking the device change various musical parameters, such as tempo, the probability of a pitch/square location, timbre, and pitch class.
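
The triggering idea boils down to a probability check on each tick. Here is a tiny C++ sketch (the app itself is built in Pure Data), with a made-up scale, probability, and tilt value.

#include <cstdio>
#include <cstdlib>

int main()
{
    int   scale[]    = { 60, 62, 64, 67, 69 };   // a pentatonic-ish set of pitches
    float baseChance = 0.3f;                     // chance of a trigger on each tick
    float tilt       = 0.2f;                     // pretend tilt input, 0..1

    for( int tick = 0; tick < 16; tick++ )
    {
        float roll = (float)rand() / (float)RAND_MAX;
        if( roll < baseChance + 0.3f * tilt )    // tilting the device raises the odds
        {
            int pitch  = scale[ rand() % 5 ];    // pick a random pitch...
            int square = rand() % 9;             // ...and one of nine squares to light up
            printf( "tick %2d: pitch %d, square %d\n", tick, pitch, square );
        }
    }
    return 0;
}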