Sketching Sound: Spencer Salazar's Dissertation Defense
Date:
Thu, 03/09/2017 - 5:00pm - 6:00pm
Location:
CCRMA Stage
Event Type:
Other

Spencer Salazar will present a defense of his doctoral dissertation, titled "Sketching Sound: Gestural Interaction for Expressive Music Programming." The dissertation presents two prototype systems for programming music through mobile touchscreen interaction: miniAudicle for iPad, an environment for writing ChucK code on an iPad, and "Auraglyph," a sound design and music composition system in which users draw audio synthesis structures, such as oscillators and filters, on an open canvas using touch and hand-written stylus input.
For those who are not able to attend in person, a live stream of the defense will be available at https://www.youtube.com/watch?v=B4xZQxVVnJA. Light refreshments and other festivities will follow.
Full Abstract:
New developments in technology lead to new types of interactions in computer music performance and composition. In the realm of mobile touchscreen devices such as phones and tablet computers, a variety of research efforts and software applications have explored the musical possibilities of multitouch interaction, the physical properties of the devices themselves, the orientation and location sensing of the devices, and their persistent connection to the network. However, these interactions have been largely ignored in the space of music programming on the device itself.
We have developed two prototype systems to explore concepts employing these interactions and technologies to program music on mobile touchscreen devices. The first of these, miniAudicle for iPad, is an environment for programming ChucK code on an iPad. The second prototype is a sound design and music composition system utilizing touch and hand-written stylus input. In this system, called “Auraglyph,” users draw a variety of audio synthesis structures, such as oscillators and filters, in an open canvas. Once created, these structures may be further parameterized by touch and other hand-drawn figures. These systems and the principles they embody have been evaluated through the author's own use and through feedback from a variety of musicians and music technologists.
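To give a flavor of the kind of patch a user might draw in Auraglyph, the following is a minimal, hypothetical sketch (not code from the dissertation) modeling an oscillator connected to a filter as two chained unit generators. The function names and parameters are illustrative assumptions, not Auraglyph's actual API.

```python
import math

def sine_osc(freq, sample_rate=44100, n=1024):
    """A sine oscillator unit generator: produce n samples at the given frequency."""
    return [math.sin(2 * math.pi * freq * i / sample_rate) for i in range(n)]

def one_pole_lowpass(samples, alpha=0.1):
    """A one-pole lowpass filter unit generator: smooth the incoming signal.

    Smaller alpha means heavier smoothing (lower cutoff).
    """
    out, prev = [], 0.0
    for x in samples:
        prev = prev + alpha * (x - prev)  # y[i] = y[i-1] + alpha * (x[i] - y[i-1])
        out.append(prev)
    return out

# "Drawing" an oscillator into a filter corresponds to composing the two generators.
patch = one_pole_lowpass(sine_osc(440.0))
```

In Auraglyph the connection between the drawn oscillator and filter plays this compositional role; parameters such as frequency and cutoff would be adjusted by touch rather than by keyword arguments.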
Ultimately, we believe this research shows that the critical parameters for developing sophisticated software for new interaction technologies are consideration of the technology's inherent affordances and mindful attention to design. To this end, we have proposed a set of principles for designing these systems stemming from this research and previous research in this field.
Free
Open to the Public