MUSIC 220A – Homework 5

FFT-based Real-time Tracking

Ryan Landron

 

Final Binaural-Encoded Stereo Audio File (.wav):

https://ccrma.stanford.edu/~rlandron/220a/hw5.wav

Intermediate Audio Files:

https://ccrma.stanford.edu/~rlandron/220a/HW5/hw5-score1.wav

https://ccrma.stanford.edu/~rlandron/220a/HW5/hw5-score2.wav

https://ccrma.stanford.edu/~rlandron/220a/HW5/hw5-rhythm1.wav

https://ccrma.stanford.edu/~rlandron/220a/HW5/hw5-rhythm2.wav

https://ccrma.stanford.edu/~rlandron/220a/HW5/hw5-improv1.wav

https://ccrma.stanford.edu/~rlandron/220a/HW5/hw5-improv2.wav

ChucK Source Code (.ck):

https://ccrma.stanford.edu/~rlandron/220a/HW5/hw5-playback.ck

The provided amplitude-spectrum and amplitude/frequency-tracking code was modified (target frequency and FFT tracking time) in order to produce the computer-generated sound.

Description: For the score, I decided to first focus on the birds chirping in the nature audio and graph their frequency and amplitude. I then did a second pass over the nature audio file and tracked other sounds, such as ducks quacking and the water swirling. The results can be seen here:

 

I then changed parameters in the amplitude-spectrum and amplitude/frequency-tracking code (mostly the frequency selection and smoothing time, along with some other minor modifications) to get a computer-generated sound I liked. For the live tracking, I took my air microphone from earlier in the quarter and placed it deep inside the sound hole of my guitar while playing along. For the rhythm sections, I loaded two separate drum tracks in different styles and tempos and played rhythm chords over them. I recorded all of the sections and then mixed them together in Audacity. Once I had the final mix, I ran it through DBAP4e with minimal reverb and a circular panning pattern to add a swirling effect to the audio.
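The tracking approach described above can be sketched in ChucK roughly as follows. This is a minimal illustration, not the actual hw5-playback.ck code: it assumes the standard ChucK FFT/UAna API, and the specific FFT size, smoothing factor, and oscillator mapping are placeholder values standing in for the parameters I tuned.

```chuck
// minimal sketch of FFT-based frequency/amplitude tracking
// (illustrative only -- not the actual homework code)

// analysis chain: mic input into FFT, no direct monitoring
adc => FFT fft => blackhole;
// synthesis: sine oscillator driven by the tracked peak
SinOsc s => dac;

// FFT parameters (placeholder values)
1024 => fft.size;
Windowing.hann(1024) => fft.window;
// sample rate as a float
second / samp => float srate;

// smoothing factor (0..1): higher = smoother, slower tracking
0.9 => float smooth;
float trackedFreq;
float trackedGain;

while( true )
{
    // compute one FFT frame
    fft.upchuck();

    // find the loudest bin in the magnitude spectrum
    0 => float max;
    0 => int where;
    for( 0 => int i; i < fft.size()/2; i++ )
    {
        if( fft.fval(i) > max )
        {
            fft.fval(i) => max;
            i => where;
        }
    }

    // convert bin index to frequency in Hz
    where * srate / fft.size() => float freq;

    // one-pole smoothing of frequency and amplitude
    smooth * trackedFreq + (1 - smooth) * freq => trackedFreq;
    smooth * trackedGain + (1 - smooth) * max => trackedGain;

    // drive the oscillator (gain scaling is arbitrary)
    trackedFreq => s.freq;
    Math.min( trackedGain * 4, 1.0 ) => s.gain;

    // advance time by one hop (half the FFT size)
    (fft.size()/2)::samp => now;
}
```

Restricting the bin search to a narrower band (e.g. only the high bins for bird chirps) and changing the smoothing factor and hop time are the kinds of modifications described above.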