
As a student at Stanford's CCRMA, I'd really like to further my understanding of how we can interface the old analog technologies we cherish with the new and powerful digital tools available to us. Below are some of the projects I've worked on and some ideas for projects I would like to work on.



SunMusic:

SunMusic is a music visualizer created as a project for the Music 256a course. It was programmed in C++ using the OpenGL and RtAudio libraries. RtAudio uses the default input; on a Linux machine this is the JACK Audio Connection Kit. Music can be played through SunMusic by connecting a JACK-compliant application (such as Audacity or VLC) to the RtAudio JACK client that appears in JACK's connections window (a sketch of how such an input stream is opened appears after the source link below). The idea here is that the sun, represented by a spectrogram mapped to the sphere in the center of the screen, is the source of the music, and the spirals emitted from it carry several buffers' worth of the sound file. Many parameters are alterable by the user, such as the amplitude of the waves mapped to the spirals and the amplitude of the spectrogram on the sphere. Below is a screenshot of the terminal window with these options when SunMusic is run, followed by a screenshot of the visualization window:




The above image shows the visualizer in pretty much all its glory. One thing not mentioned previously is the cube meteors that appear in this image. These are triggered to appear when the average amplitude of the input to the visualizer rises above a certain threshold, which can be set by the user. As long as the amplitude stays above that threshold, the meteors appear to zoom out of the screen at you. If you're interested in seeing the visualizer in action, below is a link to a YouTube video where it is used with the excellent Band of Horses song "The Funeral" (TO BE POSTED SOON). And if you're even more interested, the source code is available to download below. So far this has only been tested on Linux, and you must have JACK for it to work. If you know what I'm talking about and have these things, simply cd into the directory called hw3 and run make.
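The meteor trigger boils down to comparing the average magnitude of each input buffer against that threshold. A minimal sketch of the idea (illustrative only, not the actual SunMusic source):

    #include <cmath>

    // Returns true while the buffer's average magnitude exceeds the
    // user-settable threshold, i.e. while meteors should be on screen.
    bool meteorsActive(const float *buffer, unsigned int nFrames, float threshold)
    {
        float sum = 0.0f;
        for (unsigned int i = 0; i < nFrames; i++)
            sum += std::fabs(buffer[i]);
        return (sum / nFrames) > threshold;
    }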


  • Sun Music Source
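As promised above, here is a hedged sketch of how an RtAudio input stream like SunMusic's is typically opened over JACK (function and variable names are illustrative, not the actual source):

    #include "RtAudio.h"

    // Illustrative callback: RtAudio hands us one buffer of input samples,
    // which the visualizer would pass on to its spectrogram and spirals.
    int audioCallback(void *outputBuffer, void *inputBuffer, unsigned int nFrames,
                      double streamTime, RtAudioStreamStatus status, void *userData)
    {
        float *in = (float *)inputBuffer;
        // ... analyze the nFrames samples in 'in' here ...
        return 0;   // keep the stream running
    }

    int main()
    {
        RtAudio audio(RtAudio::UNIX_JACK);      // request the JACK API on Linux
        RtAudio::StreamParameters inParams;
        inParams.deviceId = audio.getDefaultInputDevice();
        inParams.nChannels = 1;
        unsigned int bufferFrames = 512;        // one buffer's worth of sound
        audio.openStream(NULL, &inParams, RTAUDIO_FLOAT32, 44100,
                         &bufferFrames, &audioCallback);
        audio.startStream();
        // ... run the OpenGL draw loop until the user quits ...
        return 0;
    }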


  • Karpluck_WQH

    Karpluck_WQH is a very simple implementation of the Karplus-Strong plucked string algorithm, created as a homework assignment for the Music 256a course. The program takes MIDI input through a specified port (included as a command line argument; if left blank, it defaults to port 0). The source code for this project is available below. Simply cd into the directory entitled wqh-hw2 and run make to compile the program. Open up your MIDI instrument of choice, connect it to the appropriate port, and listen to the beautiful sounds of a very simple plucked string model. To hear a quick demo of this program, listen to the wav file included below. A sketch of the core algorithm follows the download links.


  • karpluck_wqh source

  • Example performance using karpluck_wqh
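    For reference, here is a minimal sketch of the Karplus-Strong idea itself (illustrative only, not the karpluck_wqh source): a delay line one period long is seeded with noise, and each sample leaving the line is replaced by the average of itself and its neighbor, which damps the "string" over time.

        #include <cstdlib>
        #include <vector>

        // One plucked string: a noise burst circulating through an
        // averaging lowpass loop.
        class Pluck {
            std::vector<float> delay;
            size_t idx = 0;
        public:
            Pluck(float freq, float sampleRate)
                : delay((size_t)(sampleRate / freq))
            {
                for (float &s : delay)                  // excite with noise
                    s = 2.0f * rand() / RAND_MAX - 1.0f;
            }
            float tick()
            {
                size_t next = (idx + 1) % delay.size();
                float out = delay[idx];
                delay[idx] = 0.5f * (delay[idx] + delay[next]); // damping
                idx = next;
                return out;
            }
        };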


  • WaveDraw

    WaveDraw is an interactive wavetable synthesis/sequencing program that lets you draw whatever waveform you would like to hear and then sequence it. Up to three waves can be stored in the sequencer, two waves can be captured and sent to the cloud, and the currently drawn waveform can also be played. The motivation behind this project was the fact that in computer music we are constantly looking at the source of sound, continuously peeling back a metaphorical onion to see what causes the sounds we like and what causes the ones we don't. In the physical world, sound is essentially a series of waves that radiate spherically outward from the source. The shape of those waves is what determines the sound we hear. This is my (decent sounding) implementation of that same idea. Below is a screenshot of what the program might look like while you're using it.



    In order to implement this I used the C++ programming language and the OpenGL graphics library. Below is a brief readme explaining the usage of the program, in addition to the source tarball. The source tarball should contain everything needed to build and run WaveDraw. Note: this has only been tested on Fedora Linux systems, so whether or not it will work on other OSes is a toss-up. A sketch of the wavetable-lookup idea follows the download links.


  • Readme file (also included in the tarball)

  • The source code
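    A minimal sketch of the table-lookup playback at the core of WaveDraw (illustrative only, not the actual source): a phase accumulator scans the drawn table once per period, interpolating linearly between adjacent points.

        #include <vector>

        // Reads one sample from the user-drawn table with linear interpolation.
        // 'phase' lives in [0, 1) and advances by freq / sampleRate per sample.
        float wavetableTick(const std::vector<float> &table, double &phase,
                            float freq, float sampleRate)
        {
            double pos  = phase * table.size();
            size_t i    = (size_t)pos;
            double frac = pos - i;
            float out = (float)((1.0 - frac) * table[i]
                              + frac * table[(i + 1) % table.size()]);
            phase += freq / sampleRate;
            if (phase >= 1.0) phase -= 1.0;
            return out;
        }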


  • Musical Bop It

    Musical Bop It is a mobile music-making game based on the classic children's game. It lets the user play using four commands: Tap It, Shake It, Yell It, and Clap It. Created as part of the final project for Stanford's Music, Computing and Design II: Mobile Music course, it had two main goals. First, I wanted to create a mobile game that relied more on big gestures than on tapping the screen, although tapping is still incorporated to some degree. Second, I wanted to map the gestures that came about naturally while designing the game to something musical, or at the very least quasi-musical. As a result of these goals, the UI of the game is very bare-bones; below you can see a screenshot of the main screen. I wanted to get people thinking less about what was displayed on the screen and more about what they were putting into the recording.

    As the game progresses, commands are generated and announced from the phone's speaker, and the user must perform them. Audio is recorded throughout the game via the microphone, and the shake and tap gestures add additional sound effects to the recording. Making it farther in the game results in a crazier, more intense recording. After losing (when a player does not perform a command fast enough) or winning, players can listen to their recording and post it to a server, where other users can like, listen to, and download it. To further the Musical Bop It madness, a user who likes a recording may download it and then layer their own game recording on top of it, making Musical Bop It not only musical and interactive but collaborative as well.



    Musical Bop It was made for Android using a hodgepodge of C++, Java, and FAUST (Functional Audio Stream). Because of Android's inherent latency issues, TA Romain Michon generated a FAUST audio engine for Android that could be used to write FAUST or C++ code serving as the audio callback function. UI elements and game interaction are processed in Java, and the relevant information is then sent to FAUST to process and generate the audio recordings. Server interactions are handled server-side by Django and Python, while POST and GET requests are sent from the phone via loopj.com's Asynchronous HTTP Client for Android.
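    To give a feel for that hand-off, here is a toy, stand-in sketch of the pattern (class and method names are illustrative; the real engine was generated for the course): the FAUST-generated C++ class does the per-sample work, the Java UI layer pokes its parameters through JNI, and the audio callback simply calls compute().

        // Toy stand-in for a FAUST-generated DSP class. This one just applies
        // a gain, but the shape is the point: init/compute plus parameter
        // setters driven from the Java side.
        class BopItDsp {
            float gain = 1.0f;
        public:
            void init(int sampleRate) { (void)sampleRate; }
            void setGain(float g) { gain = g; }     // invoked via JNI from Java
            void compute(int count, float **inputs, float **outputs)
            {
                for (int i = 0; i < count; i++)     // the audio callback body
                    outputs[0][i] = gain * inputs[0][i];
            }
        };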