Professor: Chris Chafe (cc@ccrma.stanford.edu)
TA: Gautham J. Mysore (gautham@ccrma.stanford.edu)
Office hours by appointment
Class meetings: Tuesday and Thursday 10:00-11:50am [Class Room @ the Knoll]
This course is an opportunity for students who have completed Music 220a and Music 220b to pursue an independent research project in computer music. Students regularly present their research and project progress in a weekly seminar-style class meeting. In addition, projects in progress are documented on the web.
Part of Michael's ongoing research (supported by the Social Sciences and Humanities Research Council of Canada), the GRIP MAESTRO is the composer's latest attempt to incorporate real physical resistance and haptic feedback into an electroacoustic performance interface. The interface is a modified hand exerciser called Grip Master. By measuring the position of the pads on the device and carefully mapping them to parameters of music creation, Michael hopes that the GRIP MAESTRO will provide the feel, control, and aural feedback necessary to be an effective interface between performers and their music, and between audiences and their performers.
My project involves an electronic music composition from the "ground up." I am going to delve deeply into FM synthesis and effects by writing my own VST instrument(s). The instruments plug seamlessly into the Fruity Loops sequencer and many other software products, which I will use to actually do the composition.
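For readers unfamiliar with the technique, the core of two-operator FM synthesis is a single equation: a carrier sinusoid whose phase is modulated by a second sinusoid. A minimal numpy sketch (the function name and parameters are illustrative, not part of any VST SDK):

```python
import numpy as np

def fm_tone(carrier_hz, mod_hz, mod_index, dur=1.0, sr=44100):
    """Two-operator FM: y(t) = sin(2*pi*fc*t + I*sin(2*pi*fm*t)).

    The modulation index I controls how much spectral energy spreads
    into sidebands around the carrier; the fc:fm ratio sets whether
    the result sounds harmonic or inharmonic.
    """
    t = np.arange(int(dur * sr)) / sr
    return np.sin(2 * np.pi * carrier_hz * t
                  + mod_index * np.sin(2 * np.pi * mod_hz * t))

# A 1:1 carrier-to-modulator ratio yields a harmonic spectrum;
# raising mod_index brightens the tone.
tone = fm_tone(440.0, 440.0, mod_index=2.0, dur=0.5)
```

In a real VST instrument the same math runs per-sample inside the plugin's process callback, with mod_index typically shaped by an envelope.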
ChucK and Processing form an unholy alliance to assault the senses with unnatural textures and uncomfortable sounds.
Various compositions that reflect the eternal struggle between music and various body parts, namely the ass.
A 3-dimensional, interactive music visualizer. Created with the Unity game engine, and set in a stylized outer space.
The Loop Librarian is an application that takes in a drum loop, analyzes it, and suggests other drum loops from your database to use with it.
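One common way to compare drum loops is to reduce each to a rhythmic fingerprint and rank candidates by similarity. The sketch below (all names hypothetical; this is one plausible approach, not the Loop Librarian's actual algorithm) fingerprints a loop by the autocorrelation of its frame-wise energy envelope, which peaks at lags corresponding to the loop's beat spacing:

```python
import numpy as np

def rhythm_signature(audio, sr=44100, frame=512):
    """Frame-wise energy envelope, then its autocorrelation:
    a crude fingerprint of the loop's rhythmic structure."""
    n = len(audio) // frame
    env = np.array([np.sum(audio[i * frame:(i + 1) * frame] ** 2)
                    for i in range(n)])
    env = env - env.mean()
    ac = np.correlate(env, env, mode='full')[len(env) - 1:]
    return ac / (ac[0] + 1e-12)  # normalize so signatures are comparable

def similarity(sig_a, sig_b):
    """Cosine similarity between two signatures (truncated to equal length)."""
    m = min(len(sig_a), len(sig_b))
    a, b = sig_a[:m], sig_b[:m]
    return float(np.dot(a, b)
                 / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```

Two loops at the same tempo score higher against each other than loops at different tempos, so sorting the database by this score gives a first-pass suggestion list.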
Gods of the Dead is a musical performance based on ancient traditions, realized with digital technology. It focuses on the voyage to and return from the underworld, as well as the figures who rule over the underworld or help the departed move between states of being. For me, this is an attempt at understanding the universal similarities between all belief systems: one of the purposes of almost every religion is to prepare us for whatever happens after we leave this world.
Investigation of methods of sound source separation, including independent component analysis (ICA), for acoustic and instantaneously mixed audio signals. The final goal is to implement ICA in real time in the form of a Virtual Studio Technology (VST) audio plugin.
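For the instantaneous-mixing case, the classic batch algorithm is FastICA: whiten the mixtures, then iterate a fixed-point update with a nonlinearity until the unmixing matrix converges. A minimal numpy sketch of that standard algorithm (a batch illustration only; the real-time VST version would have to work block-by-block):

```python
import numpy as np

def fastica(X, n_iter=200, seed=0):
    """Symmetric FastICA with a tanh nonlinearity.

    X: (n_sources, n_samples) instantaneously mixed signals.
    Returns source estimates (up to permutation, sign, and scale).
    """
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=1, keepdims=True)
    # Whiten: decorrelate the mixtures and scale to unit variance.
    cov = X @ X.T / X.shape[1]
    d, E = np.linalg.eigh(cov)
    Xw = (E @ np.diag(1.0 / np.sqrt(d)) @ E.T) @ X
    n = X.shape[0]
    W = rng.standard_normal((n, n))
    for _ in range(n_iter):
        # Fixed-point update: W <- E[g(WX)X^T] - diag(E[g'(WX)]) W, g = tanh.
        G = np.tanh(W @ Xw)
        W_new = (G @ Xw.T / Xw.shape[1]
                 - np.diag((1.0 - G ** 2).mean(axis=1)) @ W)
        # Symmetric decorrelation: W <- (W W^T)^(-1/2) W.
        d2, E2 = np.linalg.eigh(W_new @ W_new.T)
        W = (E2 @ np.diag(1.0 / np.sqrt(d2)) @ E2.T) @ W_new
    return W @ Xw
```

Note that ICA recovers sources only up to permutation, sign, and scale, which is why evaluation typically matches each estimate to its best-correlated true source.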
Pilot EEG experiments to classify brain-wave data of musical material: musical intervals and short melodies, along with timbre, register, and loudness.
I am creating a patch in Pd (Pure Data) that will automatically synthesize a musical accompaniment based on the signal from an electric guitar. The core of the patch will consist of a pitch-detection algorithm and a wave-shaping algorithm.
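To illustrate the two building blocks outside of Pd: a common pitch-detection approach searches the autocorrelation of a frame for its strongest peak within a plausible lag range, and a waveshaper passes the signal through a nonlinear function to add harmonics. A numpy sketch of both (one standard technique each, not necessarily the patch's exact method):

```python
import numpy as np

def detect_pitch(frame, sr=44100, fmin=60.0, fmax=1000.0):
    """Estimate the fundamental frequency of one audio frame by finding
    the strongest autocorrelation peak between lags sr/fmax and sr/fmin."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode='full')[len(frame) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag

def waveshape(x, drive=3.0):
    """tanh waveshaper: soft clipping that enriches the spectrum,
    normalized so an input of 1.0 maps to 1.0."""
    return np.tanh(drive * x) / np.tanh(drive)
```

In the patch itself these roles would be played by Pd objects (e.g. a pitch tracker feeding a transfer-function lookup); the lag resolution of the autocorrelation method is about one sample, so low notes are estimated more precisely than high ones.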
Interactive musical applications for the monome.
Simulation of the scattering of sound waves in a forest.
Electric guitar pedals and amplifiers meet the flute, and merge together in harmonious, ridiculously awesome accompaniment sounds. A bamboo flute, decked out with sensors (and lasers), modded in ChucK and Max/MSP.
Creating teaching tools that use visual and sonic methods to demonstrate several abstract concepts used in digital signal processing.
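One DSP concept that lends itself to this kind of demonstration is aliasing: sampled at rate sr, a sinusoid at f and one at f + sr produce literally identical sample sequences, so the higher tone "folds" down and sounds like the lower one. A tiny numpy demonstration of that fact (an illustrative example, not one of the project's actual tools):

```python
import numpy as np

sr = 8000                       # sampling rate in Hz
n = np.arange(64)               # sample indices
low = np.sin(2 * np.pi * 1000 * n / sr)   # 1 kHz tone
high = np.sin(2 * np.pi * 9000 * n / sr)  # 9 kHz = 1 kHz + sr

# The 9 kHz phase advances by an extra full cycle (2*pi) per sample,
# so both signals land on exactly the same sample values.
matches = np.allclose(low, high)
```

Plotting the two continuous sinusoids over their shared sample points, or playing them back, turns this identity into the visual and sonic demonstration the project describes.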
A sound art installation that is controlled by falling streams of water (such as the fountain by Green Library). There will be sensors on the bottom that can tell if the water stream has been broken.
This project aims to explore some of the possible musical interactions between a real musician and a virtual "robot musician," at both the perceptual and behavioral levels. We will test some basic techniques for musical sequence learning, and present a music generator based on robot behavior.
In this project, we try to obtain a generalized quantitative representation of violin bowing parameters extracted from motion capture of real performances, so that synthetic gestures can be rendered from a new input score. The rendered gestures are useful for sound synthesis with violin physical models.
How to make your own homepage at CCRMA