"A man's gotta do what a man's gotta do"
As a side effect of this project, I've learned to use Emacs and LaTeX. Here you can find a couple of useful cheatsheets.
The idea is to develop a musical instrument that can be played using only a web browser (maybe also using one or more MIDI controllers): no installation required. If possible, the idea is to enable multiple performers to play the instrument collaboratively.
|Development platform||Flex / ActionScript. The code will be documented using ASDoc.|
|Interaction between performer(s) and instrument||One or more performers will be able to interact with the instrument. How? That isn't clear yet. I'm going to analyze the possibility of using MIDI controllers along with the computer's I/O devices.|
|Interaction between performers||The collaborative aspect of the instrument is yet to be explored. There are at least two collaboration modes I can imagine right now:
The idea is that the performer will be able to set up a custom signal flow. A possible problem at this stage is that the Flash player may have trouble handling large amounts of data.
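To make the custom signal-flow idea concrete, here is a minimal sketch (in Python rather than the project's ActionScript, and with purely illustrative class names): the performer picks an ordered chain of unit generators, and each sample is pushed through the chain in that order.

```python
# Illustrative sketch of a performer-defined signal flow.
# None of these names come from the actual project code.

class UGen:
    """Base class for a unit generator that transforms one sample."""
    def process(self, sample: float) -> float:
        raise NotImplementedError

class Gain(UGen):
    """A trivial UGen: multiplies the signal by a constant factor."""
    def __init__(self, factor: float):
        self.factor = factor

    def process(self, sample: float) -> float:
        return sample * self.factor

class SignalFlow:
    """Runs each sample through the UGens in the performer-chosen order."""
    def __init__(self, chain):
        self.chain = list(chain)

    def process(self, sample: float) -> float:
        for ugen in self.chain:
            sample = ugen.process(sample)
        return sample

# A performer could rewire this chain at runtime by reordering the list.
flow = SignalFlow([Gain(0.5), Gain(2.0)])
```

The per-sample loop is also where the data-volume concern shows up: at 44,100 samples per second, every extra UGen in the chain multiplies the work the Flash player would have to do.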
I'll research whether it's possible to add MIDI interaction to the instrument, either with a controller for live performance or with a MIDI file for pre-recorded music.
Study possible ways of adding interaction between multiple performers and implement one of them. One possibility is that they all play the same instrument but control different parameters; another is that they each play different, synchronized instruments. Some questions to be solved: who will render the sound? How do we handle two performers manipulating the same parameter?
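For the last question, one candidate policy (just a sketch of one option, not a decision) is last-write-wins resolved by message timestamp: a stale update from one performer is simply ignored if a newer one already arrived.

```python
# Hypothetical sketch of "last write wins" for a shared synthesis
# parameter. Class and field names are illustrative only.

class SharedParameter:
    """A parameter both performers can set; newest timestamp wins."""
    def __init__(self, value: float = 0.0):
        self.value = value
        self.last_ts = -1  # timestamp of the most recent applied update

    def set(self, value: float, ts: int) -> bool:
        """Apply the update only if it is newer than what we have."""
        if ts > self.last_ts:
            self.value = value
            self.last_ts = ts
            return True
        return False

cutoff = SharedParameter()
cutoff.set(0.5, ts=10)  # performer A
cutoff.set(0.2, ts=8)   # performer B's older message arrives late: ignored
```

An alternative would be per-parameter ownership (each performer locks the controls they touch), which avoids conflicts entirely at the cost of flexibility.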
Implementation of the synthesis: the FM is working. Now it's Granular Synthesis time, and it's about 80% there. I need to integrate it with real samples and also add some random control for the IOT and the duration.
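For reference, here is the math behind both pieces as a Python sketch (the real implementation is in ActionScript; parameter names and values here are illustrative): classic two-oscillator FM, plus randomized inter-onset times (IOT) for scheduling grains.

```python
import math
import random

SR = 44100  # sample rate in Hz (illustrative)

def fm_sample(t: float, fc: float = 220.0, fm: float = 110.0,
              index: float = 2.0) -> float:
    """Simple FM: sin(2*pi*fc*t + I*sin(2*pi*fm*t)).
    fc = carrier freq, fm = modulator freq, index = modulation depth."""
    return math.sin(2 * math.pi * fc * t
                    + index * math.sin(2 * math.pi * fm * t))

def grain_onsets(total_seconds: float, iot_min: float = 0.02,
                 iot_max: float = 0.08, seed: int = 42):
    """Randomized inter-onset times: each grain starts a random
    interval (between iot_min and iot_max seconds) after the last."""
    rng = random.Random(seed)
    t, onsets = 0.0, []
    while t < total_seconds:
        onsets.append(t)
        t += rng.uniform(iot_min, iot_max)
    return onsets
```

Randomizing the grain duration would work the same way as the IOT: draw each grain's length from a range instead of keeping it fixed.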
Apparently the only way of receiving MIDI in AS is to listen on a TCP socket, so a bridge application may be required, which goes against the idea of everything running in the browser with no installation required.
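The bridge would essentially forward raw MIDI bytes over that TCP socket, and the AS side (or the bridge itself) would decode them. As a sketch of the decoding step, assuming standard 3-byte MIDI channel messages (this is illustrative Python, not the bridge I'd actually ship):

```python
def parse_midi_message(data: bytes) -> dict:
    """Decode a 3-byte MIDI channel message as it would arrive
    over the bridge's TCP socket."""
    status, d1, d2 = data[0], data[1], data[2]
    kind = status & 0xF0     # high nibble: message type
    channel = status & 0x0F  # low nibble: MIDI channel (0-15)
    if kind == 0x90 and d2 > 0:
        return {"type": "note_on", "channel": channel,
                "note": d1, "velocity": d2}
    if kind == 0x80 or (kind == 0x90 and d2 == 0):
        # Note-on with velocity 0 is conventionally a note-off.
        return {"type": "note_off", "channel": channel, "note": d1}
    return {"type": "other", "status": status}
```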
I'll need to install a Python CGI script to serve the client connections. I'm working on that now.
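The core of that server is a relay: keep track of connected browser clients and forward each incoming message to all the others. A minimal sketch of that logic (illustrative names, with a fake client standing in for a real socket connection):

```python
# Hedged sketch of the server-side relay. Class and method names are
# illustrative, not the actual script.

class Relay:
    """Tracks connected clients and forwards messages between them."""
    def __init__(self):
        self.clients = []

    def connect(self, client):
        """Register a newly connected browser client."""
        self.clients.append(client)

    def broadcast(self, sender, message):
        """Forward a message from `sender` to every other client."""
        for client in self.clients:
            if client is not sender:
                client.send(message)

class FakeClient:
    """Stand-in for a real socket connection, for demonstration only."""
    def __init__(self):
        self.inbox = []

    def send(self, message):
        self.inbox.append(message)

relay = Relay()
a, b = FakeClient(), FakeClient()
relay.connect(a)
relay.connect(b)
relay.broadcast(a, "noteOn 60")  # only b receives this
```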
It's alive!!!!! It's working locally, with some issues, but it's breathing.
After solving the issues of bidirectional communication between two browsers, I began to work on the interface. I have a working concept to play around with (you can see it here). During the rest of this week and the next, the idea is to define the interface and hopefully implement it.
Some feedback from the class:
I held a meeting with Martin Alonso, a product design student at Stanford. As a result of that meeting, some key questions and issues arose:
He pointed me to this interesting link regarding Usability Heuristics, which can be used to evaluate a design.
For the interface itself, he suggested that instead of using sliders to control the synthesis (which may be familiar to music geeks, but not to everyone), a more compelling interface could use the waveform itself (interact with it to change the frequency, the harmonic content, etc.).
I'm developing a new interface proposal. You can check it here.
After presenting my new interface, there was consensus that the two separate panels give the impression of two separate instruments. A third interface proposal is required.
Some interesting links pointed out by Visda:
Juhan proposed separating the performers by giving them different tasks (for instance, one controlling the rhythm while the other controls the melody).
Finally, Juan Pablo suggested looking into other disciplines that have developed collaborative interfaces, to search for inspiration.
Mainly I've been working on the paper (which means that I'm also learning LaTeX and Emacs), so not much visible progress.
Before developing a third UI proposal, I decided to integrate all the underlying pieces of software, so that's what I've been programming. The idea is to have everything working together so I can try out functional interfaces. So far I have integrated both synths and a really basic module. You can see the progress here.
I almost forgot: the instrument now has a name ... I present to you the "MOLS".
I'm still working on the integration of all the parts and pieces previously developed, before building the final UI. I finally have a working version that integrates the following parts:
It took me much more time than I thought it would. I even needed to draw a UML diagram to help me finish the implementation. You can see it here. Also, during the development process I decided to document the code using ASDoc. This is a snapshot of the documentation as of May 20th.
The new UI is implemented. I still have lots of improvements to make, but I finally have version 0.1 of the MOLS. Right now it only works on the CCRMA network, because I need to set up a public server for worldwide access (I'm working on that with Carr). For now, you can see a couple of snapshots here and here.
Update! An even newer UI has been implemented. This one is less flexible, but simpler to understand and play. Also, I managed to connect the UGens in series: the new UI features an FM synth that feeds a Granulator. Simple but effective. You can see a snapshot here. Also, I changed the name of the instrument. I present to you The Horgie.
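The series connection is worth spelling out: the FM synth's output becomes the source buffer that the granulator chops into enveloped grains. A hedged Python sketch of that chain (illustrative names and parameter values, not The Horgie's actual code):

```python
import math

SR = 8000  # small sample rate, just for this sketch

def fm_buffer(n: int, fc: float = 220.0, fm: float = 110.0,
              index: float = 2.0):
    """Render n samples of FM output to use as the granulator's source."""
    return [math.sin(2 * math.pi * fc * i / SR
                     + index * math.sin(2 * math.pi * fm * i / SR))
            for i in range(n)]

def granulate(source, grain_len: int, hop: int):
    """Chop the source into overlapping grains, each shaped by a
    triangular envelope so grains fade in and out."""
    grains = []
    for start in range(0, len(source) - grain_len, hop):
        env = [1 - abs(2 * i / (grain_len - 1) - 1)
               for i in range(grain_len)]
        grains.append([s * e
                       for s, e in zip(source[start:start + grain_len], env)])
    return grains

# FM feeds the Granulator: the series chain in miniature.
grains = granulate(fm_buffer(4000), grain_len=400, hop=200)
```

Because the granulator only needs a buffer of samples, the same chain would work with any upstream UGen swapped in for the FM synth.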