Software Projects


CollideFx - A Ballistic Effects Generator

Official CollideFx Homepage

CollideFx is a real-time audio effects processor that integrates the physics of real objects into the parameter space of the signal chain. As in a traditional signal chain, a user can choose a series of effects and control their parameters in real time. In this work, we introduce a means of creating tree-like signal graphs that dynamically change their routing in response to position changes of the unit generators. The unit generators are easily controllable using the click-and-drag interface and respond with familiar physics, including friction and conservation of linear and angular momentum. With little difficulty, users can design interesting effects, or alternatively, fling a unit generator into a cluster of several others for more surprising results, letting the physics engine do the decision making.
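Here is a rough sketch, in Python rather than the actual CollideFx source, of how position-based routing can work: whenever a unit generator moves, the signal tree is rebuilt by greedily attaching each free node to the nearest node already in the tree (all names below are hypothetical).

import math

class UnitGen:
    def __init__(self, name, x, y):
        self.name = name
        self.x, self.y = x, y

def dist(a, b):
    return math.hypot(a.x - b.x, a.y - b.y)

def rebuild_routing(root, ugens):
    # Grow a tree from the root: each unattached unit generator
    # connects to the closest node already in the tree.
    tree = {root.name: []}
    attached = [root]
    remaining = [u for u in ugens if u is not root]
    while remaining:
        parent, child = min(((p, c) for p in attached for c in remaining),
                            key=lambda pc: dist(*pc))
        tree.setdefault(parent.name, []).append(child.name)
        attached.append(child)
        remaining.remove(child)
    return tree

# Call this whenever a drag or collision moves a unit generator.
ugens = [UnitGen("input", 0, 0), UnitGen("delay", 1, 2),
         UnitGen("reverb", 3, 1), UnitGen("flanger", 2, 3)]
print(rebuild_routing(ugens[0], ugens))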

A Machine Learning Approach to Similar-Style Algorithmic Composition

I am building a Python application that uses my old musical compositions from the last eight or so years to automatically compose new pieces. I may borrow some ideas from David Cope's work on recombinant music, along with some variant of the incremental parsing and generation algorithm seen here. The idea is that the music can be used for a video game, where a player may remain in an area for an indeterminate amount of time. Rather than play a boring loop, the system will just keep producing new, different material.
The first step involves being able to recognize styles of playing. The classes I am looking for are Bass, Drums, Lead, Rhythm, and Fingerpicking (Acoustic). Drums are easy to identify because they have a dedicated MIDI channel. I am using several features to identify the classes, the most successful of which are the mean pitch, the average distance between two consecutive notes, and the average polyphony. Training uses a set of handpicked passages spanning many genres; the classifier is then evaluated on a test set, assigning each passage to one of the aforementioned classes. Successful recognition rates of roughly 98% were achieved, and the figure shows that the classes appear to be reasonably separable. See the attached paper for some of the methodology.

For Rhythm and Bass parts, chords and rhythms will be extracted. Lead parts will be compared to rhythm parts and analyzed for melodic information. Harmonic information can be extrapolated from any rhythm/bass parts playing at the same time. The recombinant music algorithms mentioned above can then use this data to piece together new melodies and rhythms.
Classification of Playing Styles using MIDI Information (815 kB)
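As a rough sketch of the feature extraction described in the paper (using the mido library here; the actual implementation differs in its details):

import mido  # MIDI parsing library

def style_features(path, track_idx=0):
    # The three most successful features: mean pitch, mean distance
    # between consecutive notes, and time-weighted average polyphony.
    track = mido.MidiFile(path).tracks[track_idx]
    pitches, intervals = [], []
    held, poly_time, total_time = 0, 0.0, 0.0
    for msg in track:
        total_time += msg.time
        poly_time += held * msg.time  # accumulate (voices x duration)
        if msg.type == 'note_on' and msg.velocity > 0:
            if pitches:
                intervals.append(abs(msg.note - pitches[-1]))
            pitches.append(msg.note)
            held += 1
        elif msg.type == 'note_off' or (msg.type == 'note_on' and msg.velocity == 0):
            held = max(held - 1, 0)
    return {
        'mean_pitch': sum(pitches) / max(len(pitches), 1),
        'mean_interval': sum(intervals) / max(len(intervals), 1),
        'avg_polyphony': poly_time / max(total_time, 1),
    }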
I have also implemented a grammar that builds a structure for songs. The structure may look something like "ivtcvtcbo", where the letters stand for intro, verse, transition, chorus, bridge, and outro. Sections with the same letter may use the same chord progression with a similar melody, but each will be somewhat randomized. The melody and chord selections will be based on MIDI tracks I wrote a while back, each tagged with parameters describing the mood of the song. That way, the mood of the generated piece can change as gameplay gets more intense. Using the PyAudio and python-musical libraries will hopefully let me generate real-time audio without forcing me to use MIDI for output. Maybe some 8-bit sound?
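Here is a toy version of the grammar idea; the production rules below are placeholders, not the rules I actually use.

import random

RULES = {
    'S': [['i', 'B', 'o']],                  # song = intro, body, outro
    'B': [['V', 'c', 'V', 'c', 'b', 'c'],    # a couple of verse/chorus forms
          ['V', 'c', 'V', 'c']],
    'V': [['v'], ['v', 't']],                # a verse, maybe with a transition
}
NAMES = {'i': 'intro', 'v': 'verse', 't': 'transition',
         'c': 'chorus', 'b': 'bridge', 'o': 'outro'}

def expand(symbol):
    # Recursively expand nonterminals by picking a random production.
    if symbol in RULES:
        return [s for part in random.choice(RULES[symbol])
                for s in expand(part)]
    return [symbol]

structure = expand('S')
print(''.join(structure))                    # e.g. "ivtcvcbco"
print([NAMES[s] for s in structure])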

Photo Collage Generator


I was kind of annoyed that all of these websites wanted me to pay for something that should only take a good coder a couple of hours, so I decided to make my own Python script for making photo collages. It's super easy to use and it's completely free. Here's an example of what it can do with a bunch of random backgrounds and a picture of a bear.

python collage.py -i <inputfile> -d <directory> -o <outputfile> -t <tilesize> -n <numtiles> -b <blend_amt>

Ex: python collage.py -i bear.jpg -d ~/Where/The/Little/Pictures/Are

All other options are optional. tilesize is the number of pixels per tile, numtiles is the number of tiles along the smaller dimension of the image, and blend_amt is the amount of color correction to apply (0-1). Enjoy!
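For the curious, here is a stripped-down sketch of the approach using Pillow; the real collage.py handles the extra options and edge cases, and the file names below are just for illustration.

from PIL import Image
import os, random

def make_collage(target_path, tile_dir, out_path, tile_size=40, blend_amt=0.3):
    # Cover the target image with randomly chosen tiles, then blend
    # each tile toward the average color of the region it covers
    # (blend_amt = 0 keeps tiles raw, 1 replaces them with flat color).
    target = Image.open(target_path).convert('RGB')
    tiles = [Image.open(os.path.join(tile_dir, f)).convert('RGB')
                 .resize((tile_size, tile_size))
             for f in os.listdir(tile_dir)]
    w, h = target.size
    out = Image.new('RGB', (w, h))
    for x in range(0, w - tile_size + 1, tile_size):
        for y in range(0, h - tile_size + 1, tile_size):
            region = target.crop((x, y, x + tile_size, y + tile_size))
            avg = region.resize((1, 1)).getpixel((0, 0))  # average color
            color = Image.new('RGB', (tile_size, tile_size), avg)
            tile = Image.blend(random.choice(tiles), color, blend_amt)
            out.paste(tile, (x, y))
    out.save(out_path)

make_collage('bear.jpg', 'little_pictures/', 'collage_out.jpg')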

collage.py

In The Shadows - Augmented Reality Ghost Hunting

In the Shadows is an iOS game where players can seek out paranormal activity in their own homes. It uses Occipital's Structure Sensor to build a 3D mesh of the player's current environment and place a ghost in the room with them. The ghost can lead them through their homes and help them uncover hidden virtual coins. Ghost noises are made using granular synthesis and spatialized using head related transfer functions provided here. Graphics were done using Photoshop and OpenGL ES 2.0 (man's best friend). This project is incomplete and in a somewhat stagnant state of development, but I'd like to pick it up in the future.
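The game's DSP lives in C++ on iOS, but the granular idea is easy to sketch in Python/numpy (the grain counts and pitch ranges below are made up):

import numpy as np

def granulate(source, n_grains=200, grain_len=2048, out_len=None):
    # Scatter Hann-windowed grains of the source at random positions
    # and slightly random playback rates into the output buffer.
    out_len = out_len or len(source)
    out = np.zeros(out_len)
    window = np.hanning(grain_len)
    for _ in range(n_grains):
        start = np.random.randint(0, len(source) - grain_len)
        grain = source[start:start + grain_len] * window
        ratio = 2.0 ** np.random.uniform(-0.5, 0.5)   # crude pitch shift
        idx = np.arange(0, grain_len - 1, ratio)
        grain = np.interp(idx, np.arange(grain_len), grain)
        pos = np.random.randint(0, out_len - len(grain))
        out[pos:pos + len(grain)] += grain
    return out / np.max(np.abs(out))

ghost = granulate(np.random.randn(44100))   # one second of ghostly noise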


Jewels - An Audio Effect for the Extension and Movement of Spectral Peaks

Jewels is a real-time spectral audio effect that extends the peaks in the frequency spectrum forward in time to create a reverberant effect. Unlike a traditional convolution reverb, Jewels is not attempting to simulate an environment or create the auditory illusion that the sound is occurring in a particular space. Instead, the effect uses the properties of the signal itself to create resonances or other unique effects. The spectral peaks can also be set to move along a trajectory, either constant, random, or sinusoidal. By simply manipulating the frequency shift of these bins with respect to time, a variety of different audio effects are produced, ranging from Shepard tones to noise to a shimmering effect. If we feed the effect white noise instead of a musical signal, we obtain a versatile toolkit for noise synthesis.
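Jewels itself is written in C++, but the core loop can be sketched in Python/numpy; the peak picking and drift handling here are deliberately cruder than in the real effect.

import numpy as np

def extend_peaks(x, n_fft=2048, hop=512, decay=0.9, drift=1):
    # Find the strongest bins in each FFT frame, feed them into a bank
    # of decaying resonances, and shift the held bins by `drift` bins
    # per frame (0 = static, nonzero = a glide through the spectrum).
    win = np.hanning(n_fft)
    held = np.zeros(n_fft // 2 + 1, dtype=complex)   # the sustained peaks
    out = np.zeros(len(x) + n_fft)
    for i in range(0, len(x) - n_fft, hop):
        spec = np.fft.rfft(x[i:i + n_fft] * win)
        mags = np.abs(spec)
        # local maxima above the frame's mean magnitude count as peaks
        peaks = ((mags[1:-1] > mags[:-2]) & (mags[1:-1] > mags[2:])
                 & (mags[1:-1] > mags.mean()))
        held *= decay                            # let old peaks ring out
        held = np.roll(held, drift)              # move the peaks across bins
        held[1:-1][peaks] += spec[1:-1][peaks]   # inject this frame's peaks
        out[i:i + n_fft] += np.fft.irfft(held, n_fft) * win
    return out[:len(x)]

noise_toolkit = extend_peaks(np.random.randn(44100))  # white noise in, texture out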

make && ./Jewels
Source Code (6/13/14) 135 kB
Documentation 4.5 MB

Pitch Sensing Audio Visualizer

This audio visualizer is responsive to both the computer's audio input and MIDI input. Using the OpenGL library, it displays the microphone's input waveform at the top of the window and shows a frequency representation of the signal above the middle of the screen. Bubbles appear out of the piano keys at the bottom of the screen corresponding to the best guess of the note being played. The spectral power of the signal is displayed lightly in the background as a bar graph.

The pitch tracking was loosely based upon S. Marchand's "An efficient pitch-tracking algorithm using a combination of Fourier transforms". The spectrum in the middle of the screen is the Fourier Transform of the traditional Fourier Transform's magnitude plot. The white dots that appear on peaks in this spectrum are the system's attempt to connect the peaks through time, providing a more robust pitch estimate. It is very sensitive to noise and has produced iffy results on the human voice, but it works much better on the MIDI instrument.
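A compact version of that second-transform trick (numpy here, not the app's C++; it only works on harmonic-rich signals, consistent with the iffy results on voice):

import numpy as np

def rough_pitch(x, sr=44100):
    # Harmonics spaced f0 apart form a comb in the magnitude spectrum;
    # the Fourier transform of that comb peaks at a bin whose position
    # encodes the spacing, and therefore the pitch.
    n = len(x)
    mags = np.abs(np.fft.rfft(x * np.hanning(n)))
    spec2 = np.abs(np.fft.rfft(mags))
    k = np.argmax(spec2[20:]) + 20      # skip the spectral-envelope lobe
    period_bins = len(mags) / k         # harmonic spacing, in bins
    return period_bins * sr / n         # bins -> Hz

t = np.arange(44100) / 44100.0
tone = sum(np.sin(2 * np.pi * 220 * h * t) / h for h in range(1, 9))
print(rough_pitch(tone))                # ~220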

The MIDI instrument used is a modification of the Karplus-Strong model where the strings are coupled through a resonant body. The number of strings changes dynamically, so the stability of the system is maintained via a nonlinear gain control. This produced interesting results, including feedback that changes pitch on its own and even noises resembling Jurassic Park's velociraptors.
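A simplified Python model of the coupled strings (the real synth is C++, and its gain control is more sophisticated than the per-string normalization used here):

import numpy as np

def coupled_pluck(freqs, duration=1.0, sr=44100, damp=0.996, couple=0.05):
    # Karplus-Strong strings coupled through a shared 'body': each
    # sample, a fraction of the summed string outputs feeds back into
    # every delay line. Dividing the coupling gain by the number of
    # strings is a crude stand-in for the nonlinear gain control that
    # keeps the system stable as strings come and go.
    bufs = [np.random.uniform(-1, 1, int(sr / f)) for f in freqs]
    idx = [0] * len(bufs)
    out = np.zeros(int(sr * duration))
    g = couple / len(bufs)
    for i in range(len(out)):
        outs = [buf[j] for buf, j in zip(bufs, idx)]
        body = sum(outs)
        out[i] = body
        for s, buf in enumerate(bufs):
            j, n = idx[s], len(buf)
            nxt = (j + 1) % n
            # damped two-point average (the string) plus body energy
            buf[j] = damp * 0.5 * (buf[j] + buf[nxt]) + g * (body - outs[s])
            idx[s] = nxt
    return out / np.max(np.abs(out))

strings = coupled_pluck([220, 277, 330])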

(C++ -- only tested on OS X)
Just type these commands into the terminal:
make
./audiovisualizer


Source Code (11/1/13) 120 kB
Audio Sample 506 kB

Spatial and Temporal Analysis and Visualization of Disease Outbreaks

This simulator was a winning entry in the 2012 Undergraduate Data Research Palooza. Using the datasets provided by the Vaccine Modeling Initiative, the simulation shows an image of the United States that progresses through time. Each fatality from a particular disease spawns a particle undergoing Brownian motion at a location corresponding to where the fatality occurred. The year is divided into two parts, summer and winter, and the particles are colored with warm and cool colors, respectively. A tally of deaths per year is kept and a graph is created as the simulation runs. The code utilizes the jGeocoder library for geospatial positioning. Click the image above for full-resolution images.
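The particle logic itself is simple; here is a toy Python version (the actual project renders many more particles and drives them from the real dataset):

import random

WARM, COOL = (255, 120, 40), (60, 140, 255)   # summer / winter palettes

class Particle:
    def __init__(self, x, y, month):
        self.x, self.y = x, y                 # geocoded fatality location
        self.color = WARM if 4 <= month <= 9 else COOL
    def step(self, sigma=1.5):
        # Brownian motion: independent Gaussian steps each frame.
        self.x += random.gauss(0, sigma)
        self.y += random.gauss(0, sigma)

particles = [Particle(300, 200, month=7) for _ in range(50)]
for frame in range(100):
    for p in particles:
        p.step()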

The Ants!

Since about spring of 2009, I have been crazy about ants. Why not? Ants are absolutely cool. My ant colony simulation was designed in C++ using the SDL graphics library. Two colonies of ants coexist and interact in an environment featuring regularly spawning food and unnavigable obstacles. The ants exhibit Brownian motion unless they are reacting to an external or internal stimulus. These stimuli include the detection of a pheromone trail, the presence of a food supply, or a significant decrease in the ant's own stamina. Upon discovering a food source, an ant will pick up the food and return to the anthill, leaving a pheromone trail on this return trip, the strength of which degrades over time. Foraging ants follow trails until the food is reached. After noticing that the food source has been depleted, foraging ants return to the hole, destroying pheromone trails along the way. The reproduction rate of the colony is a function of the queen's health, which directly correlates to the colony's surplus of food. The two colonies are in competition for resources and will fight over food or steal from the opposing colony's food surplus. The ants' motion and behavior are based almost entirely on probability. The simulation provides a basic model for swarm intelligence and demonstrates properties of real populations, including cyclical population dynamics and carrying capacity.
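A stripped-down Python version of one ant's decision loop (the real simulation is C++/SDL and adds stamina, combat, and a second colony):

import random

GRID = 40
pheromone = [[0.0] * GRID for _ in range(GRID)]
food = {(30, 30): 20}                  # food cell -> units remaining
ANTHILL = (5, 5)

def step_ant(ant):
    # One probabilistic decision per tick: carry food home laying a
    # trail, pick up food, follow the strongest nearby trail, or wander.
    x, y = ant['pos']
    if ant['carrying']:
        dx = (ANTHILL[0] > x) - (ANTHILL[0] < x)
        dy = (ANTHILL[1] > y) - (ANTHILL[1] < y)
        pheromone[x][y] += 1.0         # reinforce the trail home
        ant['pos'] = (x + dx, y + dy)
        if ant['pos'] == ANTHILL:
            ant['carrying'] = False
        return
    if food.get((x, y), 0) > 0:
        food[(x, y)] -= 1
        ant['carrying'] = True
        return
    moves = [(x + dx, y + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
             if (dx or dy) and 0 <= x + dx < GRID and 0 <= y + dy < GRID]
    best = max(moves, key=lambda m: pheromone[m[0]][m[1]])
    if pheromone[best[0]][best[1]] > 0 and random.random() < 0.8:
        ant['pos'] = best              # follow the trail, usually
    else:
        ant['pos'] = random.choice(moves)   # Brownian wandering

def decay_trails(rate=0.99):
    for row in pheromone:
        for i in range(len(row)):
            row[i] *= rate

ants = [{'pos': ANTHILL, 'carrying': False} for _ in range(20)]
for tick in range(500):
    for ant in ants:
        step_ant(ant)
    decay_trails()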
Computer ants weren't good enough. I got my first ant farm and ordered the ants in the mail. They were western harvester ants. This was a huge disappointment because they barely survived for two days. I fed them a little piece of strawberry and a couple drops of water and they died. I was devastated. The next ant farm was one of those bizarre space ant colonies filled with green gel. They are essentially digging in their own food. I bought western harvesters again. This time they lived for a couple of months and did some serious digging. There are pictures of the ants in the gallery above. I didn't take any after the tunnels were dug, unfortunately. I hope to make an ant farm in the future out of a fish tank and fill it with more space gel. Ants deserve to have that third dimension to dig around in. The other option is to get two huge pieces of plexiglass and make one the size of a wall. That would be really great, too.