Music 256A: Eito Murakami - RayTone


RayTone is a dynamic music sequencing environment made with Unity and ChucK. There are two categories of objects available to the user: Control Units, which generate and modify sequences of events, and Voice Units, which generate sound. The output of a Control Unit can be connected to any input node of a Voice Unit to modulate audio parameters such as frequency, playback rate, and dry/wet mix.

A circular sequencer is a special type of Control Unit whose step advances on each global clock tick to output a new value. Using the Sequencer submenu, a user can change its number of steps, clock division, minimum value, maximum value, and value interval. Other types of Control Unit include utility tools such as "Add", "Multiply", and a MIDI-to-frequency converter. While the number and variety of these tools are currently limited, I plan to prepare more Control Units for a future distribution.

The input of a Control Unit can also take the output of another Control Unit. This allows a user to build nested structures that produce dynamic and less predictable musical results.
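Since RayTone itself is built in Unity and ChucK, the following is only a hypothetical Python sketch of the Control Unit behavior described above; all class names (`Sequencer`, `Const`, `Add`, `MidiToFreq`) are my own illustration, not RayTone's actual API.

```python
class Sequencer:
    """Sketch of a circular sequencer: advances one step per clock tick and
    emits a value scaled into [min_val, max_val], snapped to a value interval."""
    def __init__(self, values, min_val=0.0, max_val=1.0, interval=0.25):
        self.values = values          # raw per-step values in [0, 1]
        self.min_val, self.max_val, self.interval = min_val, max_val, interval
        self.step = -1                # first tick lands on step 0

    def tick(self):
        # wrap around, making the sequencer circular
        self.step = (self.step + 1) % len(self.values)

    def output(self):
        raw = self.min_val + self.values[self.step] * (self.max_val - self.min_val)
        # quantize to the nearest multiple of the value interval
        return round(raw / self.interval) * self.interval

class Const:
    """A fixed value, usable as an input to other Control Units."""
    def __init__(self, v): self.v = v
    def output(self): return self.v

class Add:
    """Utility unit: sums the outputs of two upstream units (nesting)."""
    def __init__(self, a, b): self.a, self.b = a, b
    def output(self): return self.a.output() + self.b.output()

class MidiToFreq:
    """Standard MIDI-to-frequency conversion: f = 440 * 2^((m - 69) / 12)."""
    def __init__(self, src): self.src = src
    def output(self):
        return 440.0 * 2.0 ** ((self.src.output() - 69) / 12)
```

For example, a sequencer ranged 60-72 with interval 1 emits MIDI note numbers, and feeding it through `MidiToFreq` yields a frequency suitable for a Voice Unit's pitch inlet.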

- Open the Control Unit menu with the "C" key
- Open the Voice Unit menu with the "V" key
- Enter Edit Mode by clicking on a sequencer and pressing the "E" key
- While in Edit Mode, open the Sequencer submenu with the "M" key
- Connect an outlet to an inlet using the mouse
- Move units by holding the "Ctrl" key and dragging with the mouse



For this week's Milestone 2, I worked on designing the infrastructure to dynamically patch sequencers and "voice" objects using mouse and keyboard control. A sequencer can be spawned with the "S" key, and the output of each sequencer can be connected to any input node of a Voice Unit. The example above only uses a simple sine oscillator with frequency control, but I plan to write ChucK files that utilize more inputs to produce complex sounds. A user can select a sequencer and enter Edit Mode to zoom in and modify the value of each step, which is represented by its height. To maintain precise timing, I used two ChucK global events to control both the audio and graphics cues.
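The two-event timing scheme above can be sketched roughly as follows. This is a hypothetical Python illustration, not the actual ChucK/Chunity code: one master clock broadcasts an audio event (advancing sequencers and triggering voices) and a graphics event (highlighting the active step) on every tick, so both sides stay locked to the same tick count.

```python
class GlobalEvent:
    """Minimal stand-in for a ChucK global event: listeners subscribe,
    and broadcast() notifies all of them with the current tick."""
    def __init__(self): self.listeners = []
    def listen(self, fn): self.listeners.append(fn)
    def broadcast(self, *args):
        for fn in self.listeners:
            fn(*args)

class Clock:
    """Master clock firing two events per tick: audio first, then graphics,
    so the visuals always reflect the step that just sounded."""
    def __init__(self):
        self.tick_count = 0
        self.audio_event = GlobalEvent()     # advances sequencers / voices
        self.graphics_event = GlobalEvent()  # drives step-highlight animation
    def tick(self):
        self.audio_event.broadcast(self.tick_count)
        self.graphics_event.broadcast(self.tick_count)
        self.tick_count += 1
```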


My first idea for the Interactive Audiovisual Musical Sequencer utilizes glowing sea creatures whose bodies represent steps of sequences. A sequencer can be used to control various parameters, such as pitch, duration, audio effects intensity, etc. Each sequencer can be mapped to input nodes of sound generators by using the creatures’ tentacles.

My second sequencer idea is a Mario-like game where a user completes a 2D course while being chased by enemies. A melody is generated based on the type of blocks that the character destroys while running towards the goal.

My third sequencer idea is a Guitar Hero-like game where a user plays a desired sequence in the first round and can overdub onto the quantized sequence in the second round. This is a much more performance-oriented sequencer compared to the first idea.

Tutorial and Sequencer Research

The Chicken Sequencer tutorial was helpful for reviewing Chunity concepts I needed to revisit, such as using the FloatSyncer. While the tutorial did not cover exactly how to dynamically spawn and synchronize more than one sequencer, I learned how to process graphics data based on audio activity.

As a music producer and DJ, I have worked with various sequencers. The first example is the piano roll in a DAW such as Ableton, where a user can draw notes with specific pitch, duration, velocity, and probability. Another example is a hardware instrument such as the Elektron Octatrack, which lets a user control which audio sample is played on each step of the sequencer.

After doing some research on the Internet, I discovered some visual sequencer software/games that are less intuitive but perhaps more fun and aesthetically pleasing. Planetone – Music Sequencer is one such example, in which a user plants “sprouts to create melodies that disappear over time, trees to create permanent melodies, [and] mushrooms to play randomized notes”. While the musical results I found online were rather uninspiring, I did find the concept interesting and unique. Another example is CubeSequencer by Håkan Lidbo, which uses a camera to recognize the 6 colors of 16 Rubik’s Cubes and map them to different musical instruments. The vertical and horizontal positions of each colored cell represent pitch and time respectively. I found the tangibility of the interface very inspiring: it encourages a user to pick up and rotate any of the Rubik’s Cubes to instantly create a new random yet organized sequence.
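As a thought experiment, the grid-to-notes mapping CubeSequencer suggests could be sketched like this. The function name and data layout are my own assumptions for illustration, not Lidbo's implementation: each colored cell becomes a note event, with the column giving the time step, the row giving the pitch, and the color selecting the instrument.

```python
def grid_to_events(grid, base_pitch=60):
    """grid[row][col] holds a color name or None; row 0 is the lowest pitch.
    Returns note events sorted by time step."""
    events = []
    for row, cells in enumerate(grid):
        for col, color in enumerate(cells):
            if color is not None:
                events.append({"time": col, "pitch": base_pitch + row,
                               "instrument": color})
    return sorted(events, key=lambda e: e["time"])
```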