OROBORO

 

A Collaborative Musical Controller with Interpersonal Haptic Feedback
Created by Jen Carlile and Bjoern Hartmann

OROBORO is a novel collaborative controller that focuses on musical performance as a social experience by exploring the synchronized actions of two musicians operating a single instrument. Each performer uses two paddle mechanisms – one for hand orientation sensing and one for servo-motor actuated feedback. We introduce a haptic mirror in which the movement of one performer's sensed hand is used to induce movement of the partner's actuated hand and vice versa.

from ouroboros, the serpent that eats its own tail
Music 220C & Music 250A/B
CCRMA @ Stanford University



Atmel ATmega128 with Procyon MotorDriver V1.0
Outside the sphere of computer music, in the physical world, the constraints of human anatomy limit the complexity of expression of any single musical instrument. A natural way to add richness to musical performance is for musicians to assemble in groups, bands, combos, ensembles or orchestras. In these communal performances, the interpersonal dynamics between musicians are responsible for much of the fascinating richness and subtlety of the music. Alas, all too often recent software tools have ignored this important interpersonal component of musicianship by concentrating solely on the single-performer, single-computer paradigm. Reconnecting to communal practices, we focus on collaborative controllers for computer music which require contributions from multiple musicians. By exploring musical space together, the performance itself can serve as a potent medium for interpersonal communication. Toward this end we have built OROBORO – a music controller that connects two players sonically and haptically.
 
 
Using a mirror metaphor, two musicians face each other across a table. On each side of the table sit two paddle mechanisms that support the performer's hands. The paddles record and induce movement in the following way: for each performer, the right hand is sensed. Its movement and finger pressure are recorded and sent to a Pure Data (Pd) patch for sound synthesis and sample playback control. From the musician's point of view, this is her active hand. The left hand is actuated, or passive from the performer's perspective. Its cradle mechanism is equipped with positioning motors that relay the orientation of the other performer's active hand.
Through this haptic mirror, performers are more deeply aware of what their partner is doing – they can see, hear and feel the collaborative effort. We consciously separated sensing and actuation into different hands to avoid a haptic tug of war between the performers; we rely on each performer to integrate incoming and outgoing signals mentally. Sensor data from both performers' active hands is combined into control parameters for sound synthesis. For left-handed musicians, the arrangement of controllers can be swapped.
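To make the mirror concrete, here is a minimal sketch of the mapping in plain C. The names, units and ranges are illustrative assumptions on my part, not the actual OROBORO firmware: I assume the sensed orientation arrives as pitch/roll angles and that the cradle servos accept a target angle in the same units.

    /* Haptic-mirror mapping sketch (hypothetical names and ranges). */
    #include <stdio.h>

    typedef struct { float pitch; float roll; } hand_pose;

    /* Clamp a value to the assumed mechanical range of the cradle. */
    static float clamp(float x, float lo, float hi) {
        return x < lo ? lo : (x > hi ? hi : x);
    }

    /* Map one performer's sensed active hand onto the partner's
     * actuated hand. Roll is negated so the partner feels a mirror
     * image across the table; pitch passes through unchanged. */
    hand_pose mirror_map(hand_pose sensed) {
        hand_pose target;
        target.pitch = clamp(sensed.pitch, -45.0f, 45.0f);
        target.roll  = clamp(-sensed.roll, -45.0f, 45.0f);
        return target;
    }

    int main(void) {
        hand_pose a_active  = { 20.0f, -30.0f };     /* performer A, sensed   */
        hand_pose b_passive = mirror_map(a_active);  /* performer B, actuated */
        printf("B's cradle target: pitch %.1f, roll %.1f\n",
               b_passive.pitch, b_passive.roll);
        return 0;
    }

Negating roll is just one plausible reading of the mirror metaphor; a direct pass-through would make the passive hand copy, rather than reflect, the partner's motion.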
 
PROJECT UPDATES:  

May 10, 2005 - - - - - - - - - - - - - - - *
In the continuing quest to get both boards talking to the same computer, I've purchased two serial-to-USB adapters to work with my Mac laptop. I've explored a few different options for getting two AVR boards to talk to the same computer without overloading the serial port hardware. This has been a great learning experience--I now feel very comfortable looking at board schematics and understanding the functions of all of the ports--but right now I need to pursue the fastest solution, as NIME is less than two weeks away. After much consideration, I came to the conclusion that I probably need something that is basically plug and play (i.e. two serial/USB adapters that will connect to my laptop), rather than trying to write a receive-OSC library for the AVR.
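For the record, here is a rough sketch of what the plug-and-play route looks like on the host side, assuming POSIX termios (as on OS X) and two USB-serial adapters; the device paths and the 9600-baud setting are placeholders, not the project's real values.

    /* Read from two USB-serial adapters at once (hypothetical paths). */
    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/select.h>
    #include <termios.h>
    #include <unistd.h>

    /* Open a serial device raw at 9600 baud, 8N1. */
    static int open_port(const char *path) {
        int fd = open(path, O_RDONLY | O_NOCTTY);
        if (fd < 0) { perror(path); return -1; }
        struct termios tio;
        tcgetattr(fd, &tio);
        cfmakeraw(&tio);
        cfsetispeed(&tio, B9600);
        cfsetospeed(&tio, B9600);
        tcsetattr(fd, TCSANOW, &tio);
        return fd;
    }

    int main(void) {
        int a = open_port("/dev/tty.usbserial-A");  /* board 1 */
        int b = open_port("/dev/tty.usbserial-B");  /* board 2 */
        if (a < 0 || b < 0) return 1;

        for (;;) {
            fd_set rd;
            FD_ZERO(&rd);
            FD_SET(a, &rd);
            FD_SET(b, &rd);
            if (select((a > b ? a : b) + 1, &rd, NULL, NULL, NULL) < 0)
                break;

            unsigned char byte;
            /* Tag each byte with its source board before handing it
             * on, e.g. to a Pd patch listening on a socket. */
            if (FD_ISSET(a, &rd) && read(a, &byte, 1) == 1)
                printf("board A: %u\n", byte);
            if (FD_ISSET(b, &rd) && read(b, &byte, 1) == 1)
                printf("board B: %u\n", byte);
        }
        return 0;
    }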
On the sound/visual front, I completed a *very* basic visualization patch that displays the movement of the controllers.
Also, I met with Elaine Chew today and we discussed the project. She suggested that I blindfold users and observe their interactions via Oroboro--a simple and elegant test to gauge the impact of interpersonal haptic feedback.
And on a final note, I've started working on a piece for Oroboro and 'tape' to be presented at the CCRMA open house. I'm using sound samples of radio static that I recorded last June while on the island of Gotland in southern Sweden. Here are some sound samples:
ex1 ex2 ex3 ex4 ex5 ex6 ex7 ex8

I particularly like ex5


May 2, 2005 - - - - - - - - - *

This week I have been working on two main areas: creating Pd patches for a new sound paradigm, and wading through the ATmega128 literature to figure out how to connect the two boards together.
The Pd patch below translates orientation data and resistance data into clicks and filtered noise swirls; a rough sketch of the underlying mapping follows the patch list.
SOME PATCHES FROM THIS WEEK:
- Clicks2.pd
- Spatialize.pd
- vibs.pd
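The patches themselves are graphical Pd, but the gist of the mapping can be sketched in C; the ranges below are made up for illustration, not the patches' actual numbers. Tilt sweeps a band-pass filter over the noise, and finger pressure sets the click density.

    /* Orientation/resistance-to-sound mapping sketch (made-up ranges). */
    #include <math.h>
    #include <stdio.h>

    /* Map tilt in [-90, 90] degrees to a filter center frequency,
     * exponentially from 100 Hz to 4 kHz so the sweep sounds even. */
    float tilt_to_cutoff(float tilt_deg) {
        float t = (tilt_deg + 90.0f) / 180.0f;  /* normalize to [0, 1] */
        return 100.0f * powf(40.0f, t);         /* 100 Hz .. 4 kHz */
    }

    /* Map finger pressure in [0, 1] to clicks per second. */
    float pressure_to_click_rate(float pressure) {
        return pressure * 30.0f;
    }

    int main(void) {
        printf("flat hand:   %.0f Hz\n", tilt_to_cutoff(0.0f));
        printf("full tilt:   %.0f Hz\n", tilt_to_cutoff(90.0f));
        printf("light touch: %.1f clicks/s\n", pressure_to_click_rate(0.2f));
        return 0;
    }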
I was hoping to get the two boards talking to each other over a 4-wire SPI interface (a master/slave interface), but the SPI ports are being used to drive the motorboard (mega128 = master, motorboard = slave). I am instead looking into communicating over the two-wire serial interface (TWI), or possibly over the USART ports.
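To give a flavor of the USART option: the ATmega128 has a second USART that stays free while SPI is tied up with the motorboard, so the two boards could be cross-wired (TX1 to RX1 and back) and exchange sensor bytes. Below is a minimal avr-libc sketch; the 16 MHz clock, the baud rate and the one-byte protocol are my assumptions, not the project's.

    /* Board-to-board link over USART1 on the ATmega128 (avr-gcc). */
    #include <avr/io.h>

    #define F_CPU 16000000UL
    #define BAUD  9600UL

    void usart1_init(void) {
        uint16_t ubrr = F_CPU / 16 / BAUD - 1;   /* = 103 at 16 MHz */
        UBRR1H = (uint8_t)(ubrr >> 8);
        UBRR1L = (uint8_t)ubrr;
        UCSR1B = (1 << RXEN1) | (1 << TXEN1);    /* enable RX and TX */
        UCSR1C = (1 << UCSZ11) | (1 << UCSZ10);  /* 8 data bits, 1 stop */
    }

    void usart1_send(uint8_t byte) {
        while (!(UCSR1A & (1 << UDRE1)))         /* wait for empty buffer */
            ;
        UDR1 = byte;
    }

    uint8_t usart1_receive(void) {
        while (!(UCSR1A & (1 << RXC1)))          /* wait for incoming byte */
            ;
        return UDR1;
    }

    int main(void) {
        usart1_init();
        for (;;) {
            /* Both boards run this same loop: stream your own sensed
             * orientation, then apply whatever arrives from the
             * partner to the servo cradle. */
            usart1_send(0x2A);                   /* placeholder sensor byte */
            uint8_t partner = usart1_receive();
            (void)partner;                       /* drive the servos here */
        }
    }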
FOR NOW, PLEASE SEE MY *OLD* 220C WEBSITE FOR PREVIOUS UPDATES ON OROBORO