Final Project: everylayer

By Andrew Hershberger

Stanford University, Music 256a

About

My vision for this project was to facilitate social iteration on sound. To explore this possibility, I prototyped a piece of software that combined basic sequencing functionality with the ability to share compositions, browse the work of others, and remix sound projects made by other people.

Software Design

Originally, the implementation of this project was entirely in C++ and comprised only the editing functionality.

I designed a basic scene graph (see class Renderer) that included some interesting uses of polymorphism. For example, each Renderer can have a list of child-renderers, and when render() is called on the parent, it in turn makes four calls: pushProperties(), renderSelf(), renderChildren(), and popProperties(). Of these four functions, only renderSelf() is virtual, meaning that subclasses need only worry about overriding renderSelf() and not about applying transformations or rendering their children consistently.
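To make the pattern concrete, here is a minimal sketch of the idea. Only the four method names come from the project; the child container, the addChild() helper, and the exact signatures are illustrative assumptions rather than the actual implementation.

    #include <vector>

    class Renderer {
    public:
        virtual ~Renderer() {}

        // Non-virtual entry point; subclasses never reimplement this.
        void render() {
            pushProperties();    // apply this node's transformations/state
            renderSelf();        // the only virtual step
            renderChildren();    // recurse into child renderers
            popProperties();     // restore the previous state
        }

        void addChild(Renderer* child) { children.push_back(child); }

    protected:
        // Subclasses override only this.
        virtual void renderSelf() {}

        void renderChildren() {
            for (std::size_t i = 0; i < children.size(); ++i)
                children[i]->render();
        }

        void pushProperties() { /* push transforms / OpenGL state */ }
        void popProperties()  { /* pop transforms / OpenGL state */ }

    private:
        std::vector<Renderer*> children;
    };

A concrete renderer then overrides only renderSelf(), and the base class handles state push/pop and children consistently.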

This design worked well, so I adopted a similar pattern for synthesis. The public tick() function sums the results of two protected functions, synthesizeSelf() and synthesizeChildren(). Only synthesizeSelf() is virtual, so subclasses never need to worry about synthesizing their children or combining the results of that synthesis.
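The synthesis side can be sketched the same way. Only tick(), synthesizeSelf(), and synthesizeChildren() are named above; the class name, the per-sample return type, and the child list are assumptions for illustration.

    #include <vector>

    class SoundNode {
    public:
        virtual ~SoundNode() {}

        // Public, non-virtual: sums this node's output with its children's.
        double tick() {
            return synthesizeSelf() + synthesizeChildren();
        }

        void addChild(SoundNode* child) { children.push_back(child); }

    protected:
        // Subclasses override only this.
        virtual double synthesizeSelf() { return 0.0; }

        double synthesizeChildren() {
            double sum = 0.0;
            for (std::size_t i = 0; i < children.size(); ++i)
                sum += children[i]->tick();
            return sum;
        }

    private:
        std::vector<SoundNode*> children;
    };

In this scheme, a node wrapping an STK instrument such as Plucked or Drummer would only need to override synthesizeSelf().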

The next major step was adding a Cocoa/AppKit window and controls to flesh out the social-interaction components of the vision. I encapsulated the C++ work I had done up to that point in a single Objective-C++ class, EL2OpenGLView (a subclass of Apple's NSOpenGLView). This required adapting the OpenGL setup code I had been using with GLUT to work with NSOpenGLView. Although it was a bit of work, it paid off in the end, allowing me to work at a slightly higher level for everything outside the core synthesis and real-time graphics, and to spend more time on the design and less on particulars like memory management.

Currently, the data backend for everylayer is a plist in the user's home directory named everylayer2data.plist. In principle, this data source could be extended so that the app connects to a cloud service for sharing and remixing. For this prototype, the file-based approach was the faster path to a point where I could communicate the vision of the project.

Getting Started

If you would like to try everylayer, I recommend downloading the pre-built binary (Mac application package) for Mac OS X 10.6. The Xcode project is also available, but you will need to provide or install the required Boost serialization headers (not included in the project).

Once you have the app running, you can start recording by clicking in the middle of the left side of the window (to give keyboard focus to the NSOpenGLView) and then using the keyboard to record notes in a loop. The default instrument is the Plucked synthesizer from STK, but you can switch all of the notes in a track to the STK Drummer synthesizer by pressing the right arrow key. The key commands are summarized below:

Key                        Action
Up Arrow                   Select the track one above the currently-selected track
Down Arrow                 Select the track one below the currently-selected track
Right Arrow                Toggle the selected track between the Plucked and Drummer synthesizers
fn-delete or backspace     Delete the selected track
+ (Shift =)                Add a new track

Note that the currently-selected track is blue; all other tracks are gray. This difference is subtle and, admittedly, easy to miss.

Once you have a composition ready, give it a title, add your name and photo (by clicking in the photo well), and click "Share!". Your creation will appear in the sidebar on the right. You can reload it at any time by selecting it in the list and clicking "Remix!".

Screenshots & Videos

(Screenshot: the everylayer interface)

Download

Disclaimer: I have not tested this app on any machine other than my own, so if you run into trouble and want to get it working, feel free to contact me.

Source, Binary (Mac application package)

Credits

This software uses RtAudio, the Synthesis ToolKit (STK), and the Boost serialization library. Sample code from developer.apple.com was adapted to put an OpenGL view in a Cocoa app. Ge's vector3d class is also used freely throughout the code.