Real-time AudioVisual Composition with RayTone
Overview:
This workshop offers an introduction to real-time audiovisual composition and performance using RayTone, a node-based sequencing environment developed by Eito Murakami and John Burnett. RayTone allows users to load custom ChucK files for audio programming and custom shaders for visual programming. The workshop is organized into three parts: an introduction to sequencing in RayTone, lectures on ChucK and shader programming, and hands-on activities in which attendees design original RayTone patches. On the last day, attendees will have the opportunity to perform their audiovisual compositions using RayTone. Prior musical and programming experience is helpful but not required.
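As a small taste of the audio side, below is a minimal ChucK sketch of the kind of sound-generating code covered in the workshop. The exact conventions RayTone expects from a custom ChucK file (entry points, parameter hooks, and so on) are not described here; this is just a generic ChucK example, not RayTone-specific code.

    // minimal ChucK sketch: a sine oscillator patched to the audio output
    SinOsc osc => dac;
    // set frequency (Hz) and gain
    440.0 => osc.freq;
    0.2 => osc.gain;
    // let it sound for two seconds before the program ends
    2::second => now;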
About the instructors:
Eito Murakami
Eito Murakami is a master's student at the Center for Computer Research in Music and Acoustics (CCRMA) at Stanford University. He graduated from the University of California San Diego with bachelor's degrees in Interdisciplinary Computing and the Arts Music (ICAM) and Political Science/International Relations. Eito is an electronic composer, performer, sound designer, and virtual reality developer. By combining his classical music training with proficiency in audio and graphics software, he creates digital interfaces and instruments that promote intuitive musical performance. In particular, he uses Unreal Engine to develop audiovisual infrastructure that supports networked multiplayer interaction in virtual 3D environments.
Eito is a former member of the Sonic Arts Research and Development group at UC San Diego's Qualcomm Institute. As an undergraduate researcher, he presented a virtual reality composition titled "Becoming" in the Immersive Pavilion at SIGGRAPH 2022. He also served as vice president, instructor, and marketing manager of The Deejays & Vinylphiles Club (DVC) at UC San Diego.
John Burnett
John Burnett (b. 1993) is a multimedia artist based in San Diego, California. Drawing on a background in music composition, sound design, and technology, they create technologically augmented and reactive multimedia installations, design sound and projection for dance and theater productions, and write concert works and film scores. John is also a member of the Sonic Arts research team, based in the Qualcomm Institute at UC San Diego, where they research audio spatialization and audiovisual technology. John is a graduate of Oberlin Conservatory and is currently a PhD candidate at UC San Diego.