Emily Graber / 220b

Final Project

Readme

This project features a variety of tracks that can accompany a short animation. The tracks were generated with S.M.E.L.T., with Processing and ChucK communicating over OSC, and by traditional composition. The project also illustrates how simultaneous visual and auditory stimuli influence perception.
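To give a sense of how the OSC-driven versions work, here is a minimal Processing sketch (using the oscP5, netP5, and video libraries) that reads the brightness of two pixels from the footage each frame and forwards the values to ChucK as OSC messages. The filename, pixel coordinates, OSC address, and port below are placeholders, not the exact values used in the project.

import processing.video.*;
import oscP5.*;
import netP5.*;

Movie footage;        // the stop-frame footage
OscP5 osc;            // OSC sender
NetAddress chuck;     // where ChucK is listening

void setup() {
  size(640, 480);
  footage = new Movie(this, "lee.mov");       // hypothetical filename
  footage.loop();
  osc = new OscP5(this, 12000);               // local port (not used for sending)
  chuck = new NetAddress("127.0.0.1", 6449);  // assumed ChucK OSC port
}

void movieEvent(Movie m) {
  m.read();
}

void draw() {
  image(footage, 0, 0);
  // Sample two pixels and send their normalized brightness to ChucK.
  float b1 = brightness(get(160, 120)) / 255.0;
  float b2 = brightness(get(480, 360)) / 255.0;
  OscMessage msg = new OscMessage("/pixel/brightness");  // hypothetical address
  msg.add(b1);
  msg.add(b2);
  osc.send(msg, chuck);
}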

This project is part of a collaboration with Paul Capotosto. Paul is now getting an MFA at Syracuse University. He created the footage that you will see below as part of a stop-frame animation project.

Composed

Noise

Granular Tracking

Processing Pixel Brightness

Code

Some more explanation

Each version accomplishes something slightly different. In the Granular Tracking example, it almost seems as though Lee himself (the character in the animation) is emitting the short sounds. Interestingly, the Noise example was created with the same tracking method as the Granular Tracking example, yet the result is significantly less compelling. The Composed version is better suited to background music: it is non-diegetic, but it still reflects Lee's general emotions. Finally, the Processing version illustrates the potential of automated visual tracking; in the Processing video above, I recorded data from only two pixels. In the future, more relevant features, such as the rate at which RGB values change, could be sonified to create a more accurate representation of Lee's motion, and the sound could be panned toward whichever pixels' values change the most.
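As a rough sketch of that idea (not code used in the project), the following Processing fragment compares each frame to the previous one, measures how much the RGB values change on the left half of the image versus the right half, and maps that balance to a pan value between -1 and 1 that could be sent on to ChucK over OSC. The OSC address, port, and scaling are hypothetical.

import processing.video.*;
import oscP5.*;
import netP5.*;

Movie footage;
OscP5 osc;
NetAddress chuck;
int[] prev;   // pixels from the previous frame

void setup() {
  size(640, 480);
  footage = new Movie(this, "lee.mov");       // hypothetical filename
  footage.loop();
  osc = new OscP5(this, 12000);
  chuck = new NetAddress("127.0.0.1", 6449);  // assumed ChucK OSC port
}

void movieEvent(Movie m) {
  m.read();
}

void draw() {
  image(footage, 0, 0);
  loadPixels();
  if (prev != null) {
    float leftChange = 0;
    float rightChange = 0;
    for (int y = 0; y < height; y++) {
      for (int x = 0; x < width; x++) {
        int i = y * width + x;
        // Sum of absolute RGB differences between consecutive frames.
        float d = abs(red(pixels[i])   - red(prev[i]))
                + abs(green(pixels[i]) - green(prev[i]))
                + abs(blue(pixels[i])  - blue(prev[i]));
        if (x < width / 2) {
          leftChange += d;
        } else {
          rightChange += d;
        }
      }
    }
    float total = leftChange + rightChange;
    if (total > 0) {
      // -1 = all change on the left, +1 = all change on the right.
      float pan = (rightChange - leftChange) / total;
      OscMessage msg = new OscMessage("/motion/pan");  // hypothetical address
      msg.add(pan);
      msg.add(total / (width * height));               // overall amount of motion
      osc.send(msg, chuck);
    }
  }
  prev = new int[pixels.length];
  arrayCopy(pixels, prev);
}

On the ChucK side, the pan value could be chucked to a Pan2 object's .pan parameter, and the motion amount could drive grain density or gain.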