Blue Glove: A gestural controller for computer music
Zhiyun (Kevin) Kuang

Dept. of Music, Stanford University
Stanford, CA 94305


In this paper, we describe the design and implementation of the Blue Glove, a percussive gestural interface for the live performance of computer music and the control of computer-based musical activities. I conclude with a summary of what I learned in Music 250A and the lessons I took from the project.


Glove, Markov chain, Pure Data (Pd), bend flex sensor.


Blue Glove is an attempt to add a more active performance element to computer music in a manner both intuitive and engaging for audiences and performers. The goal is to enable musicians to control electronic instruments through expressive gestures while maintaining or enhancing playability. Blue Glove attempts to accomplish this through an intuitive interface: a lightweight glove. The glove is equipped with a digital accelerometer that registers whether the hand is facing up, facing down, or held vertically, and that provides control of volume and pitch. Each finger is also equipped with a bend flex sensor mounted on the glove for more fine-grained control. Data are processed using an Atmel ATmega16 microprocessor and sent to a personal computer over the Open Sound Control (OSC) interface, with the core software written in Pure Data (Pd) and supplementary applications in C.


Several design ideas were tried during the development of the project, including using the glove as a piano keyboard, as a guitar chord player, as a controller for the pipa (a Chinese instrument that uses a five-note scale), and as a drum machine. To be consistent with our group's ensemble theme, in which all of us make music from the vocal sound of one of our team members, the Blue Glove ended up with two modes: vocal mode and keyboard mode.


The Blue Glove controller is a lightweight "garden" glove that allows the fingers good freedom of movement. The glove is equipped with five bend flex sensors and an ADXL digital accelerometer board. The glove is wired to an AVRmini development board, which in turn is connected via a serial cable to a personal computer running Pd.



The Atmel microprocessor converts the bending and acceleration data received from the glove into Open Sound Control (OSC) messages. All data are sent to Pd as a single 30-bit stream of OSC messages running at 57600 bits per second.
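The OSC encoding step can be sketched roughly as below. The address `/glove` and the choice of eight big-endian int32 arguments are illustrative assumptions, not the project's actual wire format; only the OSC framing rules (NUL-terminated, 4-byte-aligned strings and a comma-prefixed type tag) come from the OSC specification.

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Pad a string to a 4-byte boundary with NULs, as OSC requires.
static void pushPadded(std::vector<uint8_t>& buf, const std::string& s) {
    for (char c : s) buf.push_back(static_cast<uint8_t>(c));
    buf.push_back(0);                         // terminating NUL
    while (buf.size() % 4 != 0) buf.push_back(0);
}

// Push a 32-bit int in big-endian byte order, as OSC requires.
static void pushInt32(std::vector<uint8_t>& buf, int32_t v) {
    buf.push_back((v >> 24) & 0xFF);
    buf.push_back((v >> 16) & 0xFF);
    buf.push_back((v >> 8) & 0xFF);
    buf.push_back(v & 0xFF);
}

// Build one OSC message carrying the eight sensor readings
// (five bend sensors plus three accelerometer axes).
std::vector<uint8_t> packGloveMessage(const int32_t sensors[8]) {
    std::vector<uint8_t> buf;
    pushPadded(buf, "/glove");                // hypothetical address
    pushPadded(buf, ",iiiiiiii");             // type tag: eight int32s
    for (int i = 0; i < 8; ++i) pushInt32(buf, sensors[i]);
    return buf;
}
```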

4.2 Pure Data (Pd)

Eight parameters, five from the bend sensors and three from the accelerometer, are received in Pd via OSC. The bend sensor on the thumb is used to toggle between the two modes: vocal mode and keyboard mode.
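The thumb-driven mode switch can be sketched as follows. The hysteresis and the 0.7/0.3 thresholds on a normalized 0..1 reading are assumptions added here so that noisy readings near a single threshold do not make the mode flicker; the paper does not describe how the actual patch debounces the sensor.

```cpp
// Two-mode switch driven by the thumb bend sensor, with hysteresis.
enum class Mode { Vocal, Keyboard };

class ModeSwitch {
public:
    // thumbBend is assumed normalized to 0 (straight) .. 1 (fully bent).
    Mode update(float thumbBend) {
        if (mode_ == Mode::Vocal && thumbBend > 0.7f)
            mode_ = Mode::Keyboard;           // thumb bent: keyboard mode
        else if (mode_ == Mode::Keyboard && thumbBend < 0.3f)
            mode_ = Mode::Vocal;              // thumb released: vocal mode
        return mode_;
    }
private:
    Mode mode_ = Mode::Vocal;                 // vocal mode as the default
};
```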

4.2.1 Vocal Mode

In vocal mode, two different vocal samples are played in loops. The chunk size and the starting point of each playback sample are mapped to the bend sensors on the second and third fingers according to the character of the sound. A delay effect is added and is mapped to the fourth finger. A five-second drum sample also plays in vocal mode, with its volume controlled by the fifth finger.
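The looper controls above can be sketched as a small mapping function. The specific ranges (a 50 ms to 1 s chunk length, and a start point that keeps the loop inside the sample) are illustrative assumptions; the actual patch chose its ranges "according to the character of the sound."

```cpp
// Map two normalized (0..1) bend readings to sample-looper parameters.
struct LoopParams {
    float startSec;   // loop start point within the sample
    float chunkSec;   // loop chunk length
};

LoopParams mapLoop(float bendStart, float bendChunk, float sampleLenSec) {
    LoopParams p;
    p.chunkSec = 0.05f + bendChunk * 0.95f;              // 50 ms .. 1 s
    p.startSec = bendStart * (sampleLenSec - p.chunkSec); // stay in bounds
    return p;
}
```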

A C++-based Markov chain Pd object is used in vocal mode. The Y-axis of the accelerometer is mapped to the pitch of the vocal: turning the hand clockwise changes the pitch from low to high. The pitch information is then sent to the Markov chain, and the output of the Markov chain is mapped to the pitch of the drum. The idea is that when the vocal is played at a higher pitch, the drum also tends to play back at a higher pitch; instead of a fixed linear mapping, this gives a more interesting relationship.
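A minimal sketch of this kind of pitch-to-pitch Markov mapping is shown below. The quantized vocal pitch selects a row of a transition matrix, and a random draw from that row becomes the drum's pitch state, so higher vocal pitches tend toward higher drum pitches without determining them. The 3-state matrix in the usage example is an illustrative assumption, not the one used in the piece.

```cpp
#include <utility>
#include <vector>

// First-order Markov chain over quantized pitch states.
class PitchMarkov {
public:
    // rows[i][j] = probability of drum state j given vocal state i;
    // each row is assumed to sum to 1.
    explicit PitchMarkov(std::vector<std::vector<float>> rows)
        : rows_(std::move(rows)) {}

    // Draw the next drum state given the current vocal state.
    // r is a uniform random number in [0, 1).
    int next(int vocalState, float r) const {
        const std::vector<float>& row = rows_[vocalState];
        float acc = 0.0f;
        for (int j = 0; j < static_cast<int>(row.size()); ++j) {
            acc += row[j];                    // walk the cumulative sum
            if (r < acc) return j;
        }
        return static_cast<int>(row.size()) - 1;
    }
private:
    std::vector<std::vector<float>> rows_;
};
```

With a matrix whose diagonal is weighted, e.g. `PitchMarkov m({{0.7f,0.2f,0.1f},{0.2f,0.6f,0.2f},{0.1f,0.2f,0.7f}})`, a high vocal state (row 2) usually, but not always, yields a high drum state.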

4.2.2 Keyboard Mode

When the thumb sensor is bent, keyboard mode is on and vocal mode is muted. The other four fingers are used to control the pitch of four different instruments. Originally, I wanted to use different physically modeled sounds; the physical models in STK were successfully compiled as Pd objects, but due to time constraints I used four lowpassed phasors to simulate soprano, alto, tenor, and bass. Turning the hand clockwise also changes the pitch of every instrument.
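A "lowpassed phasor" voice can be sketched as a ramp oscillator smoothed by a one-pole lowpass filter, which is roughly what the Pd objects `phasor~` and `lop~` do in series. The sample rate and the smoothing coefficient below are assumptions; the actual patch's filter settings are not documented in this paper.

```cpp
// A ramp (phasor) oscillator followed by a one-pole lowpass smoother.
class LowpassedPhasor {
public:
    LowpassedPhasor(float freqHz, float sampleRate, float alpha)
        : inc_(freqHz / sampleRate), alpha_(alpha) {}

    // Produce one output sample in [0, 1].
    float tick() {
        phase_ += inc_;
        if (phase_ >= 1.0f) phase_ -= 1.0f;   // wrap the ramp
        // One-pole lowpass: y[n] = y[n-1] + alpha * (x[n] - y[n-1]).
        y_ += alpha_ * (phase_ - y_);
        return y_;
    }
private:
    float phase_ = 0.0f;
    float inc_;
    float alpha_;
    float y_ = 0.0f;
};
```

Four of these at different frequencies, with each frequency scaled by a finger's bend reading, would approximate the four-voice keyboard mode described above.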


Improvising with the Blue Glove is a little difficult at this stage, but it can still produce interesting output. In vocal mode, turning the hand clockwise changes the pitch of the vocal and, indirectly, the pitch of the drum, so a "rap"-like kind of music can be played. In keyboard mode, bending the fingers plays four different instruments at the same time; although it can be hard to play back the exact pitch you desire, the instrument will still sound within a certain range of that pitch, and using one hand to play four instruments is very interesting and full of surprises.

In the ensemble, our group's idea is for all of us to use the vocal sound of Aaron, a member of our group, to create music. During the ensemble, Aaron will read (sing) a poem into his vocal-effect microphone, which has buttons on it to control different effects such as delay and reverb. Another member, Steven, will use Aaron's voice as input to a granular synthesis algorithm in his unified musical instrument based on the AVRmini and a DSP EVM board. Sandy Lin will use her washboard to play back Aaron's vocal sound and will also control eight-channel surround speakers. Finally, my Blue Glove will play back Aaron's samples with drums in the background.


Playing some five-note-scale-based Chinese instruments, such as the pipa, might be interesting. Further work could also be done to refine the gestural mapping; a recent paper [1] on gesture mapping using neural networks would be a good basis for such an implementation.


As a student with a math background, Music 250A helped me enter the world of electrical engineering, computer science, and music. Building the glove itself, programming in Pd, and creating the mappings are my three major technical achievements. By thinking about how to make the glove lighter and freer to control, I learned how to make different wire connections in a clean way. Programming in C++ to build the Markov chain object for Pd helped me understand Pd on another level. And of course, much work remains to be done on mapping; experiencing some of it helped me think more deeply about how to control music expressively and efficiently.

Besides the lab work and technical issues, 250A gave me many fresh ideas and made me look at the same problem from different perspectives. I started to think of controllers as discrete or continuous, learned to observe daily activities and treat them as metaphors related to music or other objects, and considered how people can use a controller to play music expressively. I also started to put down my ideas by drawing, which helped me remember my original thoughts when I looked back at them.


If I were to build another project, I would make a more detailed plan before doing anything. I would want to know exactly what the controller will be, what music I would like to play, and how to map between them. Although you might change the "exact" result later to fit the project, this is much better than just making and trying. I spent a lot of time simply writing different Pd patches, trying different effects, and searching for different mappings to see whether I liked them. Sometimes that is necessary, but sometimes it is simply getting lost in the project.

I also found that a built-in volume control is a very useful and simple way to make the ensemble piece more musical. For the mappings, I first mapped all of them to the range 0 to 1; I found it easier to map them onward from this range.
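This normalize-then-scale pattern can be sketched in two small functions. The raw sensor bounds and target ranges in the example are placeholders, since the actual calibration values are not given in the paper.

```cpp
// Map a raw sensor reading into 0..1, clamping out-of-range values.
float normalize(float raw, float rawMin, float rawMax) {
    float n = (raw - rawMin) / (rawMax - rawMin);
    if (n < 0.0f) n = 0.0f;
    if (n > 1.0f) n = 1.0f;
    return n;
}

// Scale a normalized 0..1 value into a target parameter range,
// e.g. a delay time or a pitch range.
float scaleTo(float n, float outMin, float outMax) {
    return outMin + n * (outMax - outMin);
}
```

The benefit is that every downstream mapping only ever sees 0..1, so sensors can be swapped or recalibrated without touching the musical mappings.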


Thanks to Professor Bill Verplank and Carr Wilkerson for helping make this project possible, and thanks to my teammates for their excellent teamwork and performance.


  1. Arshia Cont, Thierry Coduys, and Cyrille Henry. Real-time Gesture Mapping in Pd Environment using Neural Networks.

  2. Robert Lugo and Damondrick Jack. Beat Boxing: Expressive Control for Electronic Music Performance and Musical Applications.