jorgeh@ccrma

2008-2009

"A man's gotta do what a man's gotta do"


MUSIC 220C Project:
Online collaborative musical instrument

The idea is to develop a musical instrument that can be played using only a web browser (maybe also using one or more MIDI controllers), with no installation required. If possible, the idea is to enable multiple performers to play the instrument in a collaborative way.

Overview

Development platform: Flex / ActionScript. The code will be documented using ASDoc.
Synthesis (a minimal FM sketch follows this list):
  1. FM (phase modulation)
  2. Granular synthesis
Interaction between performer(s) and instrument: One or more performers will be able to interact with the instrument. How? That's not clear yet; I'm going to analyze the possibility of using MIDI controllers along with the computer's I/O devices.
Interaction between performers: The collaborative aspect of the instrument is yet to be explored. There are at least two kinds of collaboration I can imagine right now:
  1. Many performers controlling different parameters of the instrument
  2. Communication between performers (messages, symbols, etc.)
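
For the curious, here is a minimal sketch of how the phase-modulation FM can be generated in ActionScript 3 with the Flash Player 10 SampleDataEvent API (class and parameter names are illustrative, not the actual project code):

    package {
        import flash.display.Sprite;
        import flash.events.SampleDataEvent;
        import flash.media.Sound;

        // Minimal phase-modulation FM voice: a sine modulator offsets
        // the carrier's phase on every sample.
        public class FMVoice extends Sprite {
            private const RATE:Number = 44100;
            private var carFreq:Number = 220;  // carrier frequency (Hz)
            private var modFreq:Number = 110;  // modulator frequency (Hz)
            private var index:Number = 2.0;    // modulation index (radians)
            private var carPhase:Number = 0;
            private var modPhase:Number = 0;
            private var sound:Sound = new Sound();

            public function FMVoice() {
                sound.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
                sound.play();
            }

            private function onSampleData(e:SampleDataEvent):void {
                // Flash asks for 2048-8192 stereo sample frames per event.
                for (var i:int = 0; i < 4096; i++) {
                    var s:Number = Math.sin(carPhase + index * Math.sin(modPhase));
                    carPhase += 2 * Math.PI * carFreq / RATE;
                    modPhase += 2 * Math.PI * modFreq / RATE;
                    e.data.writeFloat(s * 0.5); // left channel
                    e.data.writeFloat(s * 0.5); // right channel
                }
            }
        }
    }

Phase modulation (rather than true frequency modulation) keeps the bookkeeping simple: the modulator offsets the carrier's phase directly instead of accumulating into its phase increment.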


Proposed development

  1. Implement the different synthesis methods (FM and granulation), basic filtering (LPF, HPF), and routing (a filter sketch follows this list):

    The idea is that the performer will be able to set up a custom signal flow. A possible problem at this stage is that the Flash Player may have trouble handling large amounts of data.

  2. Implement MIDI interaction (if possible):

    I'll research whether it's possible to add MIDI interaction to the instrument, either using a controller for live performance and/or using a MIDI file for pre-recorded music.

  3. Implement the collaborative performance instrument:

    Study possible ways of adding interaction between multiple performers and implement one of them. One possibility is that they all play the same instrument but control different parameters; another is that they each play a different, synchronized instrument. Some questions to be solved: who will render the sound? How to handle two performers manipulating the same parameter? (A sketch of a possible update message also follows this list.)
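
To give an idea of what the basic filtering in step 1 could look like, here is a one-pole low-pass filter in ActionScript 3 (just a sketch with illustrative names; a matching HPF can be obtained by subtracting the LPF output from the input):

    // One-pole low-pass filter: y[n] = y[n-1] + a * (x[n] - y[n-1])
    public class OnePoleLPF {
        private var a:Number;      // smoothing coefficient in (0, 1]
        private var y:Number = 0;  // previous output sample

        public function OnePoleLPF(cutoff:Number, rate:Number = 44100) {
            // Standard one-pole coefficient for the given cutoff frequency.
            a = 1 - Math.exp(-2 * Math.PI * cutoff / rate);
        }

        public function process(x:Number):Number {
            y += a * (x - y);
            return y;
        }
    }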
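
And to make the questions in step 3 concrete, this is one possible shape for a parameter-update message exchanged between clients (field names are purely hypothetical; last-writer-wins by timestamp is the simplest policy when two performers grab the same parameter):

    // Hypothetical parameter-update message for keeping clients in sync.
    public class ParamUpdate {
        public var synthId:String;  // which shared synth instance
        public var param:String;    // e.g. "carFreq" or "grainDur"
        public var value:Number;    // new parameter value
        public var sender:String;   // performer id, for conflict resolution
        public var time:Number;     // sender timestamp in milliseconds

        public function ParamUpdate(synthId:String, param:String,
                                    value:Number, sender:String) {
            this.synthId = synthId;
            this.param = param;
            this.value = value;
            this.sender = sender;
            this.time = new Date().time;
        }
    }

If every client applies these messages and renders the sound locally, nobody has to be "the" renderer; the price is keeping all the local copies consistent.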



Timeline

Week 1

Week 2

Implementation of the synthesis: the FM is working. Now it's granular synthesis time, and it's about 80% there. I still need to integrate it with real samples and add some random control for the inter-onset time (IOT) and the grain duration.

Update! - Granular Synthesis is working now. You can see the web app here. Or you can check the source code.
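
Roughly, the random control works like this (a sketch; the actual parameter names and ranges in the code differ):

    // Each grain gets a randomized inter-onset time (IOT) and duration,
    // jittered uniformly around user-set means.
    private var meanIOT:Number = 0.05;  // mean time between grain onsets (s)
    private var meanDur:Number = 0.08;  // mean grain duration (s)
    private var jitter:Number = 0.5;    // 0..1, amount of randomness

    private function nextIOT():Number {
        var r:Number = meanIOT * (1 + jitter * (Math.random() * 2 - 1));
        return Math.max(0.001, r);  // clamp so grains never pile up at 0
    }

    private function nextDuration():Number {
        var r:Number = meanDur * (1 + jitter * (Math.random() * 2 - 1));
        return Math.max(0.001, r);
    }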

Week 3

Apparently the only way of receiving MIDI in ActionScript is listening to a TCP socket, so a bridge application may be required, which goes against the idea of everything running in the browser with no installation required.
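
If I go that route, the Flash side would look something like this (host, port, and message format are assumptions; a small helper app on the performer's machine would forward the raw MIDI bytes over TCP):

    import flash.events.ProgressEvent;
    import flash.net.Socket;

    private var midiSocket:Socket = new Socket();

    private function connectBridge():void {
        midiSocket.addEventListener(ProgressEvent.SOCKET_DATA, onMidiData);
        midiSocket.connect("localhost", 9000); // bridge app on this machine
    }

    private function onMidiData(e:ProgressEvent):void {
        // Assume the bridge forwards plain 3-byte channel messages.
        while (midiSocket.bytesAvailable >= 3) {
            var status:int = midiSocket.readUnsignedByte();
            var data1:int  = midiSocket.readUnsignedByte(); // e.g. note number
            var data2:int  = midiSocket.readUnsignedByte(); // e.g. velocity
            // ... map (status, data1, data2) to synth parameters ...
        }
    }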

I'll need to install a Python CGI script to serve the client connections. I'm working on that now.

It's alive!!!!! It's only working locally and with some issues, but it's breathing.

Week 4

After solving the issues of bidirectional communication between two browsers, I began to work on the interface. I have a working concept to play around with (you can see it here). During the rest of this week and the next, the idea is to define the interface and hopefully implement it.

Some feedback from the class:

  1. What are the advantages and disadvantages of using a browser?
  2. Check the interaction design framework
  3. Is it a game? What kind of a game?
  4. What's the satisfaction for the user?
  5. Keep it simple to use (high level of abstraction)
  6. Check "The League of Automatic Music Composers" and "The HUB"

Week 5

I held a meeting with Martin Alonso, a product design student at Stanford. As a result of that meeting, some key questions/issues arose:

  1. What is the purpose of the instrument? Is it to compose a song over the internet, or is it an "experiment" that enables remote collaboration without a preconceived output (let's see what happens)?
  2. Who will use the instrument?

He pointed me to this interesting link regarding Usability Heuristics that can be used to evaluate a design.

For the interface itself, he suggested that instead of using sliders to control the synthesis (which may be familiar to music geeks but not to everyone), a more compelling interface could use the waveform itself (interact with it to change the frequency, the harmonic content, etc.).
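
As a quick thought experiment, interacting with the waveform could be as simple as drawing one cycle with the Flash drawing API and remapping vertical mouse drags to frequency (a toy sketch; the mapping is arbitrary):

    import flash.display.Sprite;
    import flash.events.MouseEvent;

    // Draws one cycle of a sine; dragging vertically rescales the pitch.
    public class WaveView extends Sprite {
        public var freq:Number = 220; // the controlled parameter

        public function WaveView() {
            addEventListener(MouseEvent.MOUSE_MOVE, onDrag);
            redraw();
        }

        private function redraw():void {
            graphics.clear();
            graphics.lineStyle(2, 0x3366cc);
            for (var x:int = 0; x <= 300; x++) {
                var y:Number = 50 - 40 * Math.sin(2 * Math.PI * x / 300);
                if (x == 0) graphics.moveTo(x, y);
                else graphics.lineTo(x, y);
            }
        }

        private function onDrag(e:MouseEvent):void {
            if (!e.buttonDown) return;
            // Higher on the panel = higher pitch (clamped to 40-2000 Hz).
            freq = Math.max(40, Math.min(2000, 2000 * (1 - e.localY / 100)));
            redraw();
        }
    }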

I'm developing a new interface proposal. You can check it here.

Week 6

After presenting my new interface, there was consensus that the two separate panels give the impression of two separate instruments. A third interface proposal is required.

Some interesting links pointed out by Visda:

Juhan proposed separating the performers by giving them different tasks (for instance, one controlling the rhythm while the other controls the melody).

Finally, Juan Pablo suggested looking into other disciplines that have developed collaborative interfaces, to search for inspiration.

Week 7

Mainly I've been working on the paper (which means that I'm also learning LaTeX and Emacs), so there's not much visible progress.

Before developing a third UI proposal, I decided to integrate all the underlying pieces of software, so that's what I've been programming. The idea is to have everything working together so I can try functional interfaces. So far I have integrated both synths and a really basic module. You can see the progress here.

I almost forgot: the instrument now has a name ... I present you the "MOLS"

Week 8

I'm still working on the integration of all the parts and pieces previously developed, before building the final UI. I finally got a working version that integrates the following parts:

  1. Local sound synthesis
  2. Inter client communication
  3. Cross client Synth creation
  4. Parameter synchronization between clients => synchronous synthesis (a sketch follows below)

It took me much more time than I thought it would. I even needed to draw a UML diagram to help me finish the implementation. You can see it here. Also, during the development process I decided to document the code using ASDoc. This is a snapshot of the documentation as of May 20th.
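
The parameter synchronization boils down to something like this on the receiving side (a sketch reusing the hypothetical ParamUpdate message from the proposal above):

    private var synths:Object = {};      // synthId -> local synth instance
    private var lastUpdate:Object = {};  // "synthId/param" -> last timestamp

    // Apply a remote update to the local mirror of the shared synth,
    // so every client renders the same sound.
    private function onRemoteUpdate(u:ParamUpdate):void {
        var synth:Object = synths[u.synthId];
        if (synth == null) return;       // synth not created on this client yet
        var key:String = u.synthId + "/" + u.param;
        // Last-writer-wins: ignore messages older than the last one applied.
        if (lastUpdate[key] != undefined && u.time <= lastUpdate[key]) return;
        lastUpdate[key] = u.time;
        synth[u.param] = u.value;        // e.g. synth["carFreq"] = 330
    }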

Week 9

The new UI was implemented. I still have lots of improvements to make, but I finally have version 0.1 of the MOLS. Right now it only works inside the CCRMA network, because I need to set up a public server to get worldwide access (I'm working on that with Carr). For now, you can see a couple of snapshots here and here.

Update! An even newer UI was implemented. This one is less flexible, but simpler to understand and play. Also, I managed to connect the UGens in series: the new UI features an FM synth that feeds a granulator. Simple but effective. You can see a snapshot here. Also, I changed the name of the instrument. I present you The Horgie.
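
Under the hood, the series connection is conceptually just this (a sketch with made-up method names: the FM voice fills a buffer and the granulator processes it in place before the samples go out):

    private function onSampleData(e:SampleDataEvent):void {
        var buffer:Vector.<Number> = new Vector.<Number>(4096);
        fm.generate(buffer);          // 1) the FM synth writes its output
        granulator.process(buffer);   // 2) the granulator granulates in place
        for (var i:int = 0; i < buffer.length; i++) {
            e.data.writeFloat(buffer[i]); // left channel
            e.data.writeFloat(buffer[i]); // right channel
        }
    }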


Final Files