The data sources were taken from the Time Series Data Library:
The datasets were all of the same length and general character, so they mapped naturally to three different parts. I mapped all of them to STK Flute instruments. The frequencies and note-on velocities were driven with the same enveloping techniques as in the envelope example program.
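A minimal sketch of this kind of mapping in ChucK, assuming a data series has already been loaded into an array (the values and mapping ranges here are stand-ins, not the actual TSDL data):

```
// hypothetical sonification of one data series on an STK Flute
Flute f => dac;

// stand-in values for a Time Series Data Library series
[0.42, 0.57, 0.61, 0.33] @=> float data[];

for (0 => int i; i < data.size(); i++) {
    // map each data value to a frequency and a note-on velocity
    200.0 + data[i] * 800.0 => f.freq;
    data[i] => f.noteOn;
    100::ms => now;
    1.0 => f.noteOff;
    20::ms => now;
}
```

Running three such loops on separate shreds, one per dataset, would give the three parts described above.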
As I was playing with the program I noticed an interesting phenomenon when triggering noteoff and noteon events at the sample rate: a percussive sort of beat emerges from two of the datasets, while the third remains more tonal. It gives an almost tribal feel. I was quite pleased with the effect, although it does require that the program be run at a sample rate of 48kHz.
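The sample-rate triggering can be sketched as follows; this is the general pattern, not the piece itself, and the fixed frequency is a placeholder:

```
// toggle noteOn/noteOff once per sample, producing a percussive buzz
// (the character depends on the sample rate, hence the 48kHz requirement)
Flute f => dac;
440.0 => f.freq;

while (true) {
    1.0 => f.noteOn;
    1::samp => now;
    1.0 => f.noteOff;
    1::samp => now;
}
```

Because the instrument's attack never completes before the next noteoff arrives, the output is dominated by the onset transients, which is presumably where the percussive quality comes from.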
The datasets are run through twice over the course of the piece. The first time they are played through partially, with each finishing in sequence; then they are run through again at an accelerated rate.
One other note - with all this processing it may be difficult to render in real time with the binaural processing enabled. I was able to obtain a recording by running chuck in silent mode.
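For reference, one way to capture such a recording is to write the dac to a file from within the patch and run chuck with the silent flag, which renders faster than real time (the filename here is illustrative):

```
// record the dac to disk; with --silent, chuck computes audio
// without touching the sound card
dac => WvOut2 rec => blackhole;
"piece.wav" => rec.wavFilename;
```

Then run, for example, `chuck --silent piece.ck`.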
Answers to questions: