[project page for 220C course with Chris Chafe, Spring 2017]
 
== Overview and Setup ==

'''Title''': ''Jouska''

'''Composer''': Chris Lortie

'''Duration''': 9'

'''Instrumentation''': Violin and live electronics (including µgic glove sensor)

'''Premiere Performance''': Mari Kimura at the 2017 SPLICE Institute on Wednesday, June 14th at 7:30pm in the Dalton Center Recital Hall, Western Michigan University.

'''Setup''': Violin >> DPA clip-on mic >> Max/MSP on a laptop >> stereo output.
During the piece, both fixed sound files and live processing of the violin input occur. Data from the violinist's glove sensor alter these effects and occasionally advance cues in the Max/MSP patch. There is no foot pedal involved in the piece.
  
== Program Notes ==
The word Jouska comes from the [http://www.dictionaryofobscuresorrows.com Dictionary of Obscure Sorrows], a compendium of invented words written by John Koenig that try to “give a name to emotions we all might experience but don’t yet have a word for.” Koenig defines Jouska as “a hypothetical conversation that you compulsively play out in your head…which serves as a kind of psychological batting cage where you can connect more deeply with people than in the small ball of everyday life, which is a frustratingly cautious game of change-up pitches, sacrifice bunts, and intentional walks.”
== Circumstance ==
''Jouska'' was written for the violinist [http://www.marikimura.com/about.html Mari Kimura] as part of a commissioned collaboration for the [http://www.splice.institute 2017 SPLICE Institute] at Western Michigan University. The only criterion for the commission was that the piece be around 7-9 minutes in length. After being introduced to my collaborator, Mari, I asked about the potential to make use of her µgic sensor, a glove sensor prototype she developed with Liubo Borissov at IRCAM. She was more than happy to accommodate this request and sent instructions by email about the nature of the data the sensor would provide. In early May, I coded a "precomposition patch" for her, which included most of the live processing used in the piece; these effects were split into 11 different "presets," each of which utilized her sensor data in a specific way. This patch later influenced how I coded the final performance patch and allowed me to better understand the results of the data-audio interaction. Mari and I finally met in person on June 11th at the SPLICE Institute; there, we made some final adjustments to the data streams in the patch and addressed some balance issues with the live processing.

== µgic sensor ==

'''The Data'''
Mari's [http://www.marikimura.com/augmented-violin.html µgic sensor] was utilized as a major component in the piece, primarily to act as a bridge between the performative gestures inherent to her playing and the resulting processing of her sound. In this way, the data from the sensor was used to drive different parameter values in the live processing. The µgic sensor is built to track the following performative motions using a 9-axis accelerometer:

1) bow stroke duration - tracks the length of a bow stroke in tens of milliseconds

2) pizzicato - a sforzando pizzicato motion can be reasonably traced

3) bow energy - the aggregate energy value for the entire sensor (on a scale from 0. to 1.)
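Before streams like these can usefully drive effect parameters, they generally need scaling, clamping, and smoothing. Below is a minimal Python sketch of that conditioning step, assuming the values arrive as plain floats in the units listed above; the clamp ranges, smoothing coefficient, and refractory window are invented for illustration (the actual patch does this work in Max).

<syntaxhighlight lang="python">
# Hypothetical conditioning of the three µgic data streams. Only the units
# (tens of milliseconds for bow duration, 0.-1. for bow energy) come from
# the list above; the ranges and constants here are illustrative guesses.

def condition_bow_duration(raw_duration: float) -> float:
    """Convert a bow-stroke duration reported in tens of milliseconds
    to seconds, clamped to a musically plausible range."""
    seconds = raw_duration * 0.01           # 1 unit = 10 ms
    return min(max(seconds, 0.05), 8.0)     # ignore glitches, cap long bows

def smooth_bow_energy(raw_energy: float, previous: float,
                      coeff: float = 0.9) -> float:
    """Clamp the aggregate energy to 0.-1. and apply a one-pole low-pass
    so that the controlled parameters do not jitter."""
    clamped = min(max(raw_energy, 0.0), 1.0)
    return coeff * previous + (1.0 - coeff) * clamped

def pizzicato_trigger(detected: bool, now: float,
                      last_trigger: float, refractory: float = 0.25) -> bool:
    """Debounce the pizzicato detector: accept one trigger per motion,
    ignoring re-detections inside a short refractory window (seconds)."""
    return detected and (now - last_trigger) > refractory
</syntaxhighlight>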
Pitch tracking and amplitude/note-onset tracking were also available as parameters to work with, although they do not require the sensor glove. For pitch tracking, I used the [http://imtr.ircam.fr/imtr/Max/MSP_externals IRCAM external yin~].
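For context, yin~ implements the YIN algorithm of de Cheveigné and Kawahara. The sketch below is a minimal Python rendering of the algorithm's core (the difference function and its cumulative mean normalization), not the external's actual implementation; the full algorithm also refines the estimate with steps such as parabolic interpolation.

<syntaxhighlight lang="python">
import numpy as np

def yin_pitch(frame: np.ndarray, sr: float, fmin: float = 60.0,
              threshold: float = 0.1) -> float:
    """Minimal YIN-style pitch estimate for one audio frame (which should
    be longer than two periods of fmin). Returns Hz, or 0.0 if unvoiced."""
    max_lag = int(sr / fmin)
    # Difference function: d(tau) = sum_j (x[j] - x[j + tau])^2
    d = np.array([np.sum((frame[:-tau] - frame[tau:]) ** 2)
                  for tau in range(1, max_lag)])
    # Cumulative mean normalized difference: d'(tau) = d(tau) * tau / sum(d)
    cmndf = d * np.arange(1, max_lag) / np.maximum(np.cumsum(d), 1e-12)
    # Take the first lag whose normalized difference dips below threshold
    candidates = np.where(cmndf < threshold)[0]
    if len(candidates) == 0:
        return 0.0
    tau = candidates[0] + 1    # +1 because lags start at 1
    return sr / tau
</syntaxhighlight>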
 
[https://youtu.be/UKRrEdS_SMI Here is a video] of Mari explaining sensor interactions in her piece "Eigenspace."
 
'''Sensor Brainstorming'''
Before beginning the piece, I thought about the most potent ways to utilize the sensor data as an organic element: I wanted the sensor to make the live electronics more congruent with the performative action, since certain movements cannot easily be tracked with amplitude or spectral algorithms alone. Mari is well known in the electroacoustic community for avoiding the use of a foot pedal to trigger cues, because 1) it destroys the illusion of interactivity, snapping the frame of reference toward the foot pedal itself for a brief moment, and 2) the foot pedal does not hold an organic performative role embedded in how the performer plays their instrument. With these arguments in mind, I brainstormed what sorts of interactive relations might fall under the criteria of:

1) events that would be difficult to align if they were structured as performer and fixed media playback (for example, a pizzicato motion could start or stop a sound file)

2) ways in which I could combine multiple elements (such as pitch detection and bow duration) to create more interesting processing that does not become stale over time

3) evolutions of effects that would seem "inorganic" if they were triggered only with a foot pedal

A list of brainstormed ideas for sensor-processing interaction can be [https://stanford.box.com/s/a3268xxcry1wa0g6s9cqy71y25gq29bq found here]. This document includes Mari's correspondence as well.

'''Final Results'''

Each of the three available data streams from the sensor was ultimately used in the piece. These uses included:

- using bow energy to crossfade between distortion and octave doubling (a sketch of this mapping follows the list)

- using bow energy to control the amplitude of the harmonizer

- using bow duration to pitch-shift notes for their individual durations

- using pizzicato motion to advance cues

- using pizzicato motion to start or end sound files

- using pizzicato motion to "capture" a pizzicato note and send it through a different effect than everything else
  
A snapshot of the data router in my Max patch can be viewed [https://stanford.box.com/s/lo9lowf4xd26gp4rf4u2zc2qfi1oop63 here]. This router was built simply from [gate] objects, each of which was turned on and off over the course of the piece by the [pattr] object. I originally used the [router] object in combination with [matrixctrl] to control these data, but ran into several issues embedded in the [router] object itself; namely, the object was sending out [https://cycling74.com/forums/router-bug ''both'' a control message and the data itself]. Parsing this control message out of the stream proved fruitless, since it was not consistent over time.
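The [gate] logic is simple enough to paraphrase outside of Max. The sketch below mimics it in Python: each stream-to-destination connection has an on/off gate, and each cue recalls a stored set of gate states, much as [pattr] recalled them in the patch. The gate names and cue contents are invented for the example.

<syntaxhighlight lang="python">
# Paraphrase of the gate-based data router. Each (stream, destination)
# pair has a boolean gate; recalling a cue swaps in a stored gate layout.

GATE_PRESETS = {
    1: {("bow_energy", "crossfade"): True,  ("bow_energy", "harmonizer"): False},
    2: {("bow_energy", "crossfade"): False, ("bow_energy", "harmonizer"): True},
}

class DataRouter:
    def __init__(self):
        self.gates = {}          # (stream, destination) -> open/closed
        self.handlers = {}       # destination -> parameter-setting function

    def register(self, destination, handler):
        self.handlers[destination] = handler

    def recall_cue(self, cue: int):
        """Open and close every gate according to the stored preset."""
        self.gates = dict(GATE_PRESETS.get(cue, {}))

    def route(self, stream: str, value: float):
        """Pass a sensor value through every open gate, and nothing else;
        no stray control messages, which was the problem with [router]."""
        for (src, dest), is_open in self.gates.items():
            if src == stream and is_open:
                self.handlers[dest](value)

router = DataRouter()
router.register("crossfade", lambda v: print("crossfade amount:", v))
router.register("harmonizer", lambda v: print("harmonizer level:", v))
router.recall_cue(1)
router.route("bow_energy", 0.42)   # drives only the crossfade in cue 1
</syntaxhighlight>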
  
In retrospect, I would have had more success using the bow duration data instead of the bow energy data in many places. The bow energy values were not scaled in a way that matched Mari's gestures well; instead they followed a more-or-less logarithmic curve, which made the energy data useful for detecting the onset of fast tremolo notes, but not for the steady increases and decreases in bowing speed that I was looking for. In future iterations of the patch, I will recode the bow energy correlations to be bow duration correlations instead.
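If the energy stream really does follow a roughly logarithmic curve, one interim fix, short of rewriting the mappings around bow duration, would be to exponentiate the value back toward a linear response before it reaches the effects. A sketch, with the curvature constant as a free parameter that would have to be tuned against actual bow strokes:

<syntaxhighlight lang="python">
import math

def linearize_energy(raw: float, k: float = 5.0) -> float:
    """Map a 0.-1. energy value through a normalized exponential to undo
    an approximately logarithmic sensor response. Larger k assumes a
    stronger log curvature; k near 0 approaches the identity mapping."""
    raw = min(max(raw, 0.0), 1.0)
    return (math.exp(k * raw) - 1.0) / (math.exp(k) - 1.0)
</syntaxhighlight>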
  
== The Max Patch ==
