Juan I Reyes

Research:

- Scanned Synthesis -

(Ongoing research)

In the late nineties the word "haptics" would often come up in conversations all around CCRMA's Knoll corridors. However, the term was first introduced to me by John Chowning, while referring to Sile O'Modhrain's projects and research, on his first visit to Andes University in Bogota (circa 1993). At the time, music interaction courses were spreading all around the world because of real-time systems spun off from the NeXT computer, Motorola DSP chips, and the ISPW, notably Kyma, Max/MSP, Pd, and SuperCollider, but also because of affordable sensors and embedded systems like Parallax's BASIC Stamp. A model for these courses was Tom Igoe's Physical Computing course at NYU's ITP. But the one outstanding to me was a joint course among San Jose State, Princeton, and Stanford. I was lucky: many of the ideas for this course had just been baked at Interval Research on Page Mill in Palo Alto. The roster of personalities for the course lectures could not have been better: Joe Paradiso, Richard Duda, and Don Buchla, among others, including Laurie Anderson. The course was organized by Ben Knapp, Perry Cook, Bill Verplank, and Max Mathews. It is worth mentioning that this course gave life to the Arduino revolution to a great extent, thanks to Pascal Stang's AVRLib and AVR minis with customized music additions by Scott Wilson (see NIME 2003 paper).

Max Mathews's next "thing" at the time was a secret being crafted at Interval Research. All we knew was that it dealt with the hidden magic of music control and performance, code-named "haptics". We got a sense of this technology as an added bonus at a CCRMA Colloquium packed with people from all related fields of the medium. It turned out that Bill Verplank and Max had been further researching haptic ideas and had come up with a new process for generating and manipulating sound named Scanned Synthesis. It is "scanned" because the system continuously scans a wave table containing the values of a finite model of a generalized string. As shown in their presentation, this technique was based on touch, human auditory perception, and motor control abilities.

A finite string can be modeled using springs as the elastic medium, connected to a set of masses and dampers. So-called haptic vibration frequencies fall below 30 Hz, around 15 Hz, and are the vibration rates of the system. Shapes of motion result from exciting the system's initial conditions and reflect, in proportion, the performer's gesture. The scanned synthesis model obeys the physics of the wave equation.
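
As a rough illustration, here is a minimal numeric sketch of such a finite string in Python, assuming a closed (circular) topology, Verlet-style integration, and illustrative constants of my own choosing; it is not the original Interval Research or radio baton code:

    import numpy as np

    N = 128  # number of masses along the closed string

    def step(y, y_prev, stiffness=0.1, damping=0.01, centering=0.0001):
        # Advance the mass-spring-damper string one haptic-rate time step.
        # y, y_prev: displacement of each mass now and one step ago.
        # Discrete Laplacian with wraparound neighbors (closed path).
        laplacian = np.roll(y, 1) - 2.0 * y + np.roll(y, -1)
        velocity = y - y_prev                 # finite-difference velocity
        y_next = (y + velocity                # inertia
                  + stiffness * laplacian     # spring coupling to neighbors
                  - damping * velocity        # velocity-proportional loss
                  - centering * y)            # weak pull back toward rest
        return y_next, y

    # A performer's "gesture": excite the initial conditions with a bump.
    y = np.exp(-0.5 * ((np.arange(N) - N / 2) / 8.0) ** 2)
    y_prev = y.copy()
    y, y_prev = step(y, y_prev)  # the shape now evolves at slow haptic rates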

Since these vibration frequencies cannot be heard by humans, to make them audible the "shape" of the dynamic system is scanned periodically along a closed path (think of an endless spring). Pitch is determined by the speed of the scanning and is thus independent of the dynamics of the actual conditions of the system (see article). Timbral nuances are achieved by manipulating forces on the masses, the initial conditions of the system. In Bill's and Max's first implementation this was done with the radio baton sticks: a circular string at rest was excited and damped with the baton sticks while its shape was controlled. Changes in shape meant changes in timbre.
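
The scanning stage can be sketched as follows, continuing the hypothetical table y from the sketch above; the sample rate and scanning frequency are assumed values. The point to notice is that pitch comes only from the phase increment, not from how fast the string itself vibrates:

    sr = 44100.0  # audio sample rate (assumed)

    def scan(table, n_samples, freq, phase=0.0):
        # Read the closed-path table periodically at audio rate, with
        # linear interpolation; freq (Hz) sets the perceived pitch.
        out = np.empty(n_samples)
        incr = freq * len(table) / sr         # table positions per sample
        for k in range(n_samples):
            i = int(phase)
            j = (i + 1) % len(table)          # wrap around: the path is closed
            frac = phase - i
            out[k] = (1.0 - frac) * table[i] + frac * table[j]
            phase = (phase + incr) % len(table)
        return out, phase

    audio, phase = scan(y, 512, freq=220.0)   # one block at a 220 Hz pitch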

Bill Schottstaedt was also part of the Scanned Synthesis project, in charge of the computer visualization for the radio baton system. Because of the haptic component, in addition to the controlling side, plus some of its physical-modeling features, I was attracted to this technique. With Bill Schottstaedt's support and assistance I ported the circular string code and other code of Max's to CLM. Max objected to the idea because of the lack of real-time interaction, but I proved my point by using combinations of several instances of scanned synthesis generators and by manipulating circular wave shapes on a note-by-note basis. This Scanned Synthesis instrument is available in the CLM distribution.
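
The note-by-note approach can be pictured with the two sketches above: each note renders its own string from its own initial circular shape, and several instances are simply mixed. Again, this is a hypothetical sketch, not the actual CLM instrument:

    def render_note(shape, freq, dur, block=512):
        # Alternate slow string updates with audio-rate scanning blocks.
        # (The shape advances once per block here, a crude stand-in for
        # the ~15 Hz haptic-rate updates of the model.)
        y, y_prev = shape.copy(), shape.copy()
        phase, chunks = 0.0, []
        for _ in range(int(dur * sr) // block):
            y, y_prev = step(y, y_prev)
            chunk, phase = scan(y, block, freq, phase)
            chunks.append(chunk)
        return np.concatenate(chunks)

    # Two instances with different initial shapes and scan rates, mixed.
    bump = np.exp(-0.5 * ((np.arange(N) - N / 3) / 6.0) ** 2)
    ramp = np.linspace(-1.0, 1.0, N)
    note = render_note(bump, 220.0, 1.0) + 0.5 * render_note(ramp, 330.0, 1.0)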

I had to admit that rendering a Scanned Synthesis note in CLM resulted in a periodic, continuous waveform that was difficult to change, even by using time and duration envelopes. But to my advantage, the CLM version was open to signal processing, adding features not possible in the radio baton version. To prove the features of both approaches, two compositions came into being: Chryseis, for multichannel tape, and Feather Rollerball, for live piano, radio baton, and Scanned Synthesis. From performance and composition standpoints, both approaches have been satisfactory. However, in accordance with Max's and Bill Verplank's points, Feather Rollerball has the benefits of live performance and interaction among performers. Therefore their points about live, real-time control of Scanned Synthesis were never refuted.

Scanned Synthesis is back in my research because of another talk with John Chowning (2013), who got curious about my work with Max Mathews. Using his characteristically delicate tone, he suggested that more compositions and more research needed to be done on the topic. This kind of motivation fueled another composition into being: Os Grilos, for rendered Scanned Synthesis and Ambisonics. For this composition the Scanned Synthesis instrument was debugged and updated, adding more waves and features for the initial conditions of the circular wave and for timbre manipulation.

Photo of Juan Reyes

Juan Reyes is a composer and researcher whose works tackle computer and electroacoustic music elements: their conception, processes, and craft. His research is aimed at the semantics of gesture and perception, as well as novel ways of performance and expression.