
Controllers for Computers and Musical Instruments




Updates on the Radio-Baton Program

Max V. Mathews

The radio-baton research this year has focused on the use of midifiles with the radio-baton. The conductor program has been modified to accept pure type 0 midifiles as scores. This requires some way of specifying trigger points in a midifile. Triggers have been encoded as note-on MIDI commands with key numbers 0 through 11. These key numbers are rarely used otherwise, because the pitches they produce lie below the range generally heard as music. As an alternative approach, trigger points can be added to a midifile automatically, according to the time signature of the file: a 3/4 time signature yields 3 triggers per measure, and a 4/4 time signature 4 triggers per measure. This work has been done by Andrew Einaudi.
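
A minimal sketch of this trigger convention follows, with an illustrative MidiEvent structure (not the actual conductor-program data types): a note-on command whose key number falls in the range 0 through 11 is treated as a trigger rather than a sounding note.

    #include <cstdint>

    struct MidiEvent {
        uint32_t tick;      // absolute time in midifile ticks
        uint8_t  status;    // e.g. 0x90-0x9F for note-on messages
        uint8_t  key;       // MIDI key number, 0-127
        uint8_t  velocity;  // velocity 0 acts as a note-off
    };

    // A note-on with key number 0-11 marks a conductor trigger point.
    bool isTrigger(const MidiEvent& e) {
        bool noteOn = (e.status & 0xF0) == 0x90 && e.velocity > 0;
        return noteOn && e.key <= 11;
    }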

The conductor program is currently being extended to accept type 1 midifiles as scores. This requires merging the events from the various tracks of a type 1 file into a single time-ordered event list.
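
The merging step might look like the following sketch, reusing the illustrative MidiEvent structure above and assuming each track's delta times have already been converted to absolute ticks; a stable sort preserves each track's internal ordering for events that share a tick:

    #include <algorithm>
    #include <vector>

    std::vector<MidiEvent> mergeTracks(
            const std::vector<std::vector<MidiEvent> >& tracks) {
        std::vector<MidiEvent> merged;
        for (const auto& track : tracks)
            merged.insert(merged.end(), track.begin(), track.end());
        // Stable sort: events at equal ticks keep their track order.
        std::stable_sort(merged.begin(), merged.end(),
            [](const MidiEvent& a, const MidiEvent& b) {
                return a.tick < b.tick;
            });
        return merged;
    }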

Midifiles are also being used as source material in the radio-baton Improv program. For this purpose, a midifile is parsed and read into a structure in the Improv program memory. This gives the Improv program easy access to fragments of the file. Thus, it is possible to repeat (loop) sections of the score as many times as desired under some live performance control, or to vary the tempo of the playback either with baton beats or with a knob for continuous tempo control. Several midifiles can be played at the same time, each with a separate tempo control, but with algorithms to synchronize the files in various ways. For example, in rock music a "solid" percussion track played at a constant tempo can be synchronized with a flexible-tempo solo track by repeating measures in the percussion track as necessary to keep up with the solo voice.
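
The looping and tempo-control ideas can be summarized in a small sketch. The Player structure and its fields are illustrative, not the actual Improv internals; the point is that a tempo factor scales inter-event delays while loop indices let a section repeat under live control:

    #include <cstdint>
    #include <vector>

    struct Player {                     // illustrative, not Improv's internals
        std::vector<MidiEvent> score;   // time-ordered events, as above
        std::size_t index = 0;          // next event to play
        std::size_t loopStart = 0, loopEnd = 0;
        bool looping = false;

        // Ticks to wait before the next event, scaled by a tempo knob:
        // factor > 1.0 plays faster, factor < 1.0 slower.
        double nextDelay(double tempoFactor) const {
            if (index + 1 >= score.size()) return 0.0;
            uint32_t dt = score[index + 1].tick - score[index].tick;
            return dt / tempoFactor;
        }

        void advance() {
            ++index;
            if (looping && index >= loopEnd)
                index = loopStart;      // repeat the looped section
        }
    };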

Many other ways of using midifile material are envisioned for the Improv program. Sequencers and midifiles are a powerful and widely used way of composing popular music, so we believe their use in the Improv program will be an important addition.

The vBow: A Haptic Musical Controller Human-Computer Interface

Charles Nichols

Previous electronic musical controllers have either added technology to acoustic instruments, translating the expressive qualities of the instrument into digital data, or employed systems of sensors to capture the performance gestures of the player, providing the computer musician with an expressive electronic musical controller. The advantage of the former approach is that the force-feedback of the acoustic instrument, on which the traditional player's technique and training depend, is retained. However, the translation of an acoustic signal into a digital protocol is prone to error and quantizes the expression of the instrument, limiting the expressive range of the performer. In the latter case, the expression of the performer may be translated more accurately and fully, but the haptic response of the instrument is lost. The vBow, a virtual violin bow musical controller, uses a design based on robotic technology both to capture gestural expression accurately and to provide haptic feedback to the performer.

The vBow senses the violinist's stroke through the encoder of a servo, driven by a cable attached to a capstan on the motor shaft and to either end of the virtual bow. The servo, in turn, engages the bow, driven by control software that simulates the friction and vibration of the string against the bow according to the velocity of the performer's stroke. The performance data sensed by the encoder and the haptic-feedback instructions driving the servo travel over parallel data streams to and from a data acquisition card running at a high sampling rate, resulting in minimal sampling error. The result is a first attempt at simulating the complex interaction between the violinist's bow stroke, the acoustic violin's physical response, and the violinist's reaction to the force-feedback of the acoustical system.
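
The control loop described above might be sketched as follows. The friction and vibration models here are deliberately crude stand-ins, and readEncoder()/writeServoForce() are assumed placeholders for the data acquisition card's I/O, not the vBow's actual software:

    #include <cmath>

    double readEncoder()           { return 0.0; }  // stub: DAQ position read
    void   writeServoForce(double) {}               // stub: DAQ force output

    void hapticLoop(double dt) {   // dt = seconds per control-loop cycle
        const double pi      = 3.14159265358979;
        const double mu      = 0.3;   // friction coefficient (illustrative)
        const double normal  = 1.0;   // simulated bow pressure, newtons
        const double vibGain = 0.1;   // vibration amplitude scaling
        double xPrev = readEncoder();
        double phase = 0.0;
        for (;;) {
            double x = readEncoder();
            double v = (x - xPrev) / dt;     // bow-velocity estimate
            xPrev = x;
            // Kinetic friction opposes the stroke; a velocity-scaled
            // sinusoid crudely stands in for string vibration.
            double friction  = -mu * normal * ((v > 0.0) - (v < 0.0));
            phase += 2.0 * pi * 440.0 * dt;  // illustrative 440 Hz buzz
            double vibration = vibGain * std::fabs(v) * std::sin(phase);
            writeServoForce(friction + vibration);
        }
    }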

Haptic User Interfaces for the Blind

Sile O'Modhrain and Brent Gillespie

Advances in graphic output technology have enabled the development of advanced graphical user interfaces, which have made computers increasingly inaccessible to the blind. To date, developers seeking to overcome this problem have relied on two methods of outputting information: sound and braille. Neither has provided an adequate substitute for graphics. Moreover, for certain applications, such as synthesizer controllers and digital music editing, speech output would conflict with the audio output of the system. We therefore feel it is necessary to explore other ways of presenting information in tactile form. Because haptic displays can, like graphics, create virtual objects, they present a more natural analogue than text (as in speech or braille). For example, a motorized mouse can define a button to be felt as well as seen: imagine that a particular area of the mouse pad has a different texture. Such a force-reflecting system would also usefully supplement graphical user interfaces for sighted users.
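
As a concrete illustration of the texture idea, the sketch below computes a ripple force for a rectangular region of the pad; the names and units are illustrative, not the Moose's actual API:

    #include <cmath>

    struct Region { double x0, y0, x1, y1; };  // mouse-pad rectangle

    // Texture force (x direction) for cursor position (cx, cy): inside
    // the region a sinusoidal grating makes motion feel like washboard.
    double textureForce(const Region& r, double cx, double cy) {
        bool inside = cx > r.x0 && cx < r.x1 && cy > r.y0 && cy < r.y1;
        if (!inside) return 0.0;
        const double amplitude   = 0.05;   // newtons, illustrative
        const double spatialFreq = 500.0;  // ripples per meter, illustrative
        const double pi = 3.14159265358979;
        return amplitude * std::sin(2.0 * pi * spatialFreq * cx);
    }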

With support from the Stanford Office of Technology and Licensing we have built such a powered mouse, which we call the Moose. Using this device, we have developed a prototype haptic user interface for Windows 3.1, and we have also experimented with rendering the spectrograms of soundfiles haptically. Those who have used this haptic prototype agree that we have begun to tap a very promising resource, an impression reinforced by the sudden increase in commercially available haptic display devices. We therefore feel it won't be long before haptic display becomes a viable component of standard computer systems, giving blind computer users access to applications such as sound editors and MIDI sequencers, for which speech output is extremely inadequate but haptic output is well suited.

Incorporating Haptic Feedback into Music Controllers

Sile O'Modhrain

This study investigates the role played by haptic (tactile/kinesthetic) feedback in musical instrument playing. Though musicians rely primarily on their sense of hearing to monitor and adjust the sound produced by their instrument, there exists a second path through which valuable information about the instrument's behavior can be observed: the feedback received via the haptic senses, touch and kinesthesia. A violinist, for example, uses sensitivity to pressure and vibration to control bow speed. A trombone player can "feel" where the resonant modes of the instrument are by an increase in the vibrations fed back to the lips via the mouthpiece.

In our work, we build on the musician's unconscious use of combined haptic and auditory cues to design music controllers that provide both forms of sensory feedback. We are developing a prototyping environment that allows us to design the "feel" as well as the sound of an instrument. Using a variety of haptic display devices, we can control parameters of physical models running in STK and use output from these models to generate forces or vibrations that the player can feel. We are currently running a series of studies to assess the utility of such haptic feedback in musical instrument controllers.
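
A minimal sketch of this coupling, assuming a recent STK release: a bowed-string model is ticked sample by sample, and its output is reused as a vibration signal for the haptic display. The sendVibration() stub stands in for whatever API a particular haptic device provides; the STK calls themselves are real:

    #include "Bowed.h"
    using namespace stk;

    void sendVibration(StkFloat) {}   // stub: haptic device output

    int main() {
        Stk::setSampleRate(44100.0);
        Stk::setRawwavePath("rawwaves/");  // path to STK's rawwave data
        Bowed bow;                         // bowed-string physical model
        bow.noteOn(220.0, 0.8);            // frequency in Hz, amplitude 0-1
        for (int i = 0; i < 44100; ++i) {  // one second of samples
            StkFloat s = bow.tick();       // physical-model output
            sendVibration(s);              // the same signal felt, not just heard
        }
        return 0;
    }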

Improv: Computer/Performer Interaction Programming with MIDI in C++

Craig Stuart Sapp

Improv is an environment for writing programs that enable musician/computer interaction using MIDI instruments. There are two components to Improv: a C++ programming library and a set of interaction environments built on it.

The Improv environments have been used in two Stanford courses: Introduction to Music Composition and Programming Using MIDI-Based Systems, and Topics in Interactive Computer-Music Performance. The environment was also used this past summer (1998) at a summer workshop at ZKM in Germany.

The programming library and environments are designed to be portable across operating systems. Currently, example programs can be compiled and run under Windows 95/NT and Linux on Intel Pentium CPUs (75 MHz or faster).
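
The flavor of an Improv-style program can be suggested with a short sketch. The MidiInput/MidiOutput classes below are placeholder stubs, not Improv's actual API; the point is the event-loop structure, here echoing each incoming note an octave higher:

    struct MidiMessage { unsigned char status, p1, p2; };

    // Placeholder I/O stubs, illustrative only:
    struct MidiInput  { bool getMessage(MidiMessage&) { return false; } };
    struct MidiOutput {
        void send(unsigned char, unsigned char, unsigned char) {}
    };

    void interactionLoop(MidiInput& in, MidiOutput& out) {
        MidiMessage m;
        while (true) {
            if (!in.getMessage(m)) continue;  // wait for the performer
            bool noteOn = (m.status & 0xF0) == 0x90 && m.p2 > 0;
            if (noteOn)                       // echo an octave higher
                out.send(m.status, (m.p1 + 12) % 128, m.p2);
        }
    }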

For more information about Improv, visit http://www-ccrma.stanford.edu/~craig/improv/

Alternative Controllers for Physical Model Development (and Fun!)

Gary P. Scavone

Two special-purpose MIDI controllers, the Holey Controller and the Phoney Controller, have been built around BASIC Stamp II microcontrollers from Parallax Inc. The design of these controllers was inspired by recent work of Perry Cook.

