John Granzow

Assistant Professor
School of Music, Theatre & Dance
University of Michigan
jgranzow [at] umich [dot] edu

Ph.D. Dissertation


My doctoral dissertation, Digital Fabrication for Musical Applications, is available through the Stanford Library at this permanent URL.

Stanford Library: dissertation (.pdf)

Sonifying the Shive Machine

I am working with Matias Vilaplana to capture the excursions of points on kinetic sculptures invented for acoustic visualization. We use a 16-camera Qualisys motion capture system to record the dynamics of these physical wave demonstrations. So far we have built a Shive wave machine, captured its wave patterns, and sonified the resulting waveforms in MaxMSP. In this research, educational acoustic demonstrations are recast as musical controllers through motion capture.
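
The sonification itself is patched in MaxMSP; purely as a sketch of the mapping idea (the slider name, ranges, and 220 Hz center are assumptions, not values from the project), a minimal Faust program could bend the pitch of a sine oscillator with the normalized excursion of one tracked marker, updated from the mocap pipeline:

    import("stdfaust.lib");

    // Normalized vertical excursion of one tracked marker, in [-1, 1].
    // Assumed to be updated by the mocap pipeline (e.g., over OSC).
    disp = hslider("displacement", 0, -1, 1, 0.001) : si.smoo;

    // Map the excursion to pitch: one octave up or down around a 220 Hz center.
    freq = 220 * pow(2, disp);

    process = os.osc(freq) * 0.2;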

Instrument scaffolds in VR

I continue to work on Hyperreal Instruments with Anıl Çamcı. In this article, we look at how technology has facilitated the materialization of impossible instruments from the twentieth century onward. We then discuss the bridging of VR and fabrication as a new frontier in instrument design, where synthetic sounds condition an equally synthetic sensory scaffolding upon which time-varying spectra can be interactively anchored. The result is new instruments that can defy our sense of audiovisual reality while satisfying our proprioceptive and haptic expectations.

Utopia Swim Club

I am presently part of the Utopia Swim Club, an artist collective comprising Christian Sandvig, Sophia Brueckner, Catie Newell, William Calvo-Quiros, and myself. We come from schools across the University, including the School of Information, the Stamps School of Art and Design, Architecture, American Studies, and the School of Music, Theatre & Dance. The collective was established through the generous support of the Humanities Collaboratory. Our practice is to place AI into unlikely scenarios to foreground the shifting attributions we make to agents that deliver our data back to us, laden with motives to have the world unfold in a certain way. video

Digital Fabrication for Acoustics

I presented an update to my Digital Fabrication for Acoustics curriculum at the Acoustical Society of America conference in San Diego (December 2019). The increasing presence of maker spaces in academic settings provides opportunities to study acoustics through digital fabrication, coupling theory to physical play in the long tradition of acoustics education. This research explores how we can make sounding objects and acoustic demonstrations, test them against our numerical predictions, and fuse this scientific learning with creative endeavors in digital lutherie and sound art.
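
As one illustration of this prediction-then-measurement loop (a generic example, not an exercise taken from the curriculum): a tube closed at one end and open at the other has a fundamental of

    f_1 = \frac{c}{4L} \approx \frac{343~\mathrm{m/s}}{4 \times 0.171~\mathrm{m}} \approx 501~\mathrm{Hz}

so a 3D-printed closed-open tube about 17 cm long should ring near 500 Hz, a figure that can then be checked against a recording of the printed part (the open-end correction pulls the measured pitch slightly lower).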

Embedded DSP workshop, Stanford

In this annual workshop held at Stanford, participants learn how to program microcontrollers with the Faust programming language for low latency real-time audio Digital Signal Processing (DSP). Final projects consisted of hardware for musical applications such as digital guitar pedal effects and synthesizer modules. The Teensy 3.6 board was used as the main development platform. Its ARM Cortex-M4 microcontroller provides plenty of processing power to implement advanced DSP algorithms (e.g., feedback delay networks, physical models, band-limited oscillators, filter banks, etc.). Also, its various analog and digital inputs can be used for sensors acquisition. The lack of Operating System allows for the use of very low block sizes (i.e., 8 samples) offering extremely low audio latency. video
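
As a flavor of a final project (the parameter names and ranges below are illustrative, not taken from a participant's build), a complete Faust program for a small distortion-plus-tone guitar pedal looks like this; programs of this shape can be brought onto the Teensy with the faust2teensy tool, with each slider exposed as a parameter that the surrounding Teensy sketch can set from a potentiometer reading:

    import("stdfaust.lib");

    // Guitar-pedal sketch: cubic soft clipping followed by a tone control.
    drive = hslider("drive", 0.5, 0, 1, 0.01) : si.smoo;
    tone  = hslider("tone", 2000, 200, 8000, 1) : si.smoo;
    level = hslider("level", 0.7, 0, 1, 0.01) : si.smoo;

    process = _ : ef.cubicnl(drive, 0) : fi.lowpass(2, tone) : *(level);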

Axes, Sound and Music Computing Conference, Sala Unicaja de Conciertos María Cristina, Málaga

Luthiers use computer-controlled mills for the subtractive manufacture of guitar components. These machines have multiple motors stepping at variable rates to propel cutting tools along three-dimensional paths with corresponding pitch contours. Axes is a work that brings these live robotic sounds of modern guitar making into the concert space. For this piece, stepper motors are fixed to the neck, body, and soundboard of an unassembled guitar. The motors are driven in concert as the x, y, and z axes of a toolpath derived from a digital model of the instrument. The pitched and noisy motors are filtered acoustically through the guitar's components and captured via transducers to become the source for subtractive synthesis. The actuation and vibration also make the guitar components mildly kinetic. MaxMSP is used for additional processing and to generate nebulous quotations from the emerging guitar's future/past repertoire, producing a collage of fine motor skills, both machine and human. Axes is a multichannel work that can be adapted to the channel count of the space. prototype_clip
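
The processing in the piece is done in MaxMSP; purely to illustrate the subtractive step (the cutoff range and Q below are placeholders), a Faust sketch of the idea would carve the transducer signal with a swept resonant lowpass:

    import("stdfaust.lib");

    // Subtractive-synthesis sketch: the transducer signal arrives on the single
    // input and is carved by a resonant lowpass filter swept from the UI.
    cutoff = hslider("cutoff", 800, 80, 8000, 1) : si.smoo;
    q      = hslider("q", 5, 0.5, 20, 0.1);

    process = _ : fi.resonlp(cutoff, q, 0.9);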

National Theatre, Taipei

VoxVoxel, created in collaboration with Fernando Lopez-Lezcano, was included in the concert of machines (five concerts) at the National Theatre in Taipei, Taiwan, in September 2018. VoxVoxel is "composed" by designing a suitably useless 3D shape and capturing the sound of the working 3D printer with sensors. Those sounds are amplified, modified, and multiplied through live processing on a computer using Ardour and LV2/LADSPA plugins, and output in full, matching 3D sound: 3D pixels in space. video
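
The live processing in VoxVoxel runs in Ardour through existing LV2/LADSPA plugins; as a hedged sketch of the amplify/modify/multiply idea, and not one of the plugins actually used, a small Faust feedback echo that thickens a sensor signal could look like the following (Faust programs of this shape can themselves be exported as LV2 plugins with faust2lv2):

    import("stdfaust.lib");

    // Feedback echo sketch: multiplies a sensor signal into decaying copies.
    // Assumes a sample rate of 48 kHz or less (65536-sample delay line).
    time = hslider("time", 0.25, 0.01, 1, 0.01) : si.smoo;  // delay, seconds
    fb   = hslider("feedback", 0.6, 0, 0.95, 0.01);         // regeneration
    mix  = hslider("mix", 0.5, 0, 1, 0.01);                 // dry/wet balance

    echo = + ~ (de.delay(65536, int(time * ma.SR)) : *(fb));

    process = _ <: *(1 - mix), (echo : *(mix)) :> _;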