Bioinformatic Feedbacks

This project employs a software system that uses bioinformatic data recorded from a performer in real time as a probabilistic driver for the composition and subsequent real-time generation of traditionally notated musical scores. To generate and present scores to a performer, the system combines a custom LilyPond output parser, a set of Java classes running within Cycling '74's Max environment for data analysis and score generation, and an Atmel ATmega16 microcontroller that converts analog bioinformatic sensor data into Open Sound Control (OSC) messages.
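As a rough illustration of the sensor stage, the sketch below normalizes a raw analog reading and pairs it with an OSC-style address before transmission. The address `/bio/gsr`, the 10-bit ADC range, and all class and method names are assumptions for illustration, not details taken from the project; real OSC messages are binary-encoded per the OSC specification rather than plain text.

```java
// Hypothetical sketch of the sensor-to-OSC stage. Names and the /bio/gsr
// address are illustrative assumptions, not taken from the project.
public class BioSensorMessage {
    static final int ADC_MAX = 1023; // assumes a 10-bit ADC reading

    // Clamp a raw ADC reading and normalize it to the range [0.0, 1.0].
    static float normalize(int raw) {
        return Math.min(Math.max(raw, 0), ADC_MAX) / (float) ADC_MAX;
    }

    // Render an address/value pair as text for illustration only;
    // actual OSC messages use a binary encoding.
    static String toMessage(String address, int raw) {
        return address + " " + normalize(raw);
    }

    public static void main(String[] args) {
        System.out.println(toMessage("/bio/gsr", 512));
    }
}
```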

Rather than use voluntarily generated physiological signals as active controls on the musical output, this system instead modifies compositional content in response to involuntary physiological reactions. In this model, autonomic physiological data acts as a control signal while a performer's physical gesture retains its traditional role as an expressive component of performance. In essence, the compositional decisions made by the composer act as a deterministic filter for the autonomic control signals generated by the performer.

By varying the relationship between physiological reaction and resultant compositional output, different compositional forms can be created. For instance, when an inverse response mapping is applied, where strong sensor readings generate weak or relatively simple compositional structures, a performer's physiological state can potentially be coerced into a less excited state. Under the same inverse mapping, a performer in a stable or less excited state will be presented with more active musical cells, intended to provoke a more excited state. When a direct mapping between physiological state and compositional form is applied, musical output mirrors physiological state, outputting musical cells that reinforce the current state. More complex mapping relationships can also be defined and implemented with relative ease.
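The two mappings described above can be sketched as simple functions from a normalized physiological reading to a compositional parameter. Treating excitement as a single "arousal" value in [0, 1] that drives a "density" parameter for generated musical cells is an illustrative assumption of this sketch, not a description of the project's actual parameter space.

```java
import java.util.function.DoubleUnaryOperator;

// Illustrative sketch of direct vs. inverse response mappings; the single
// "arousal" input and "density" output are assumptions for demonstration.
public class ResponseMapping {
    // Direct mapping: musical density mirrors physiological state.
    static final DoubleUnaryOperator DIRECT = arousal -> arousal;

    // Inverse mapping: strong readings yield sparse, simple material and
    // calm readings yield active material, nudging the performer toward
    // the opposite state.
    static final DoubleUnaryOperator INVERSE = arousal -> 1.0 - arousal;

    // Clamp the input to [0, 1] and apply the chosen mapping.
    static double density(DoubleUnaryOperator mapping, double arousal) {
        return mapping.applyAsDouble(Math.min(Math.max(arousal, 0.0), 1.0));
    }

    public static void main(String[] args) {
        System.out.println(density(DIRECT, 0.8));  // excited performer: dense cells
        System.out.println(density(INVERSE, 0.8)); // excited performer: sparse cells
    }
}
```

More complex relationships would simply substitute a different `DoubleUnaryOperator`, e.g. a nonlinear curve or a mapping with hysteresis.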


Programming: Rob Hamilton