

Lab 3 - Sound Flares

In this lab, we will experiment with interaction between real-time sound and graphics using OpenGL. The goal is to build a graphical world inhabited by "flares". These graphical particles will follow your fingers around as you touch the screen, and make sound corresponding to their location on screen.

OpenGL Bootup

Create a new single-view application. To streamline the process of getting graphics up and running, we will integrate code from the aptly titled "graphicsStuff" (download here). Add all of the files in graphicsStuff to the project. Also, you will need to add GLKit.framework and OpenGLES.framework to the project, by opening "(Project Name)" in the left sidebar, then "Targets - (Project Name)", then "Build Phases", then "Link Binary With Libraries".

Change this part of AppDelegate.m:

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    // Override point for customization after application launch.
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
        self.viewController = [[ViewController alloc] initWithNibName:@"ViewController_iPhone" bundle:nil];
    } else {
        self.viewController = [[ViewController alloc] initWithNibName:@"ViewController_iPad" bundle:nil];
    }
    self.window.rootViewController = self.viewController;
    [self.window makeKeyAndVisible];
    return YES;
}

to this:

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    // Override point for customization after application launch.
    self.window.rootViewController = [[GLViewController alloc] init];
    [self.window makeKeyAndVisible];
    return YES;
}

You also need to add

#import "GLViewController.h"

to the top of AppDelegate.m to be able to create the GLViewController.

This will make GLViewController, which draws itself using OpenGL, the primary view of the application. Test that everything is working by changing the background color to red using glClearColor. The GLViewController method glkView:drawInRect: is where essentially all OpenGL drawing commands should go; glClearColor is already called in this method, so you can simply change the specified color from black (0,0,0,1) to red (1,0,0,1). If you then see a blank red screen when you run the app, you'll know it's working.
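
For reference, the change is a one-liner inside glkView:drawInRect: (a sketch; your graphicsStuff version may contain additional drawing code, which should be left in place):

- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
{
    // clear to opaque red instead of black
    glClearColor( 1, 0, 0, 1 );
    glClear( GL_COLOR_BUFFER_BIT );

    // ... the rest of the existing drawing code ...
}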

Also add MoAudio as before, along with CoreAudio.framework and AudioToolbox.framework, as we will soon be mapping audio to the graphics.
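
If your MoAudio is the MoMu-style version from the previous lab, bringing it up looks roughly like the following (a sketch; SRATE, FRAMESIZE, and audioCallback are placeholder names, and the exact init/start signatures should be checked against your copy of mo_audio.h):

#import "mo_audio.h"

#define SRATE       24000
#define FRAMESIZE   512
#define NUMCHANNELS 2

// audio callback: fill 'buffer' with numFrames frames of interleaved audio
void audioCallback( Float32 * buffer, UInt32 numFrames, void * userData )
{
    for( int i = 0; i < numFrames; i++ )
        buffer[2*i] = buffer[2*i+1] = 0; // silence for now
}

// somewhere during startup, e.g. in viewDidLoad:
MoAudio::init( SRATE, FRAMESIZE, NUMCHANNELS );
MoAudio::start( audioCallback, NULL );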

Flare

Using the textured quad technique shown in lecture, render a "flare" or other object that follows a single touch as it moves around the screen. Note that, by default, GLViewController is configured so that only one touch is active at a time, so you don't have to worry about handling more than one touch.
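
A minimal sketch of the touch-handling side, assuming a hypothetical Flare class with a position property (how you convert view coordinates into your OpenGL world coordinates depends on your projection setup; worldPointForViewPoint: below is a hypothetical helper):

// in GLViewController.mm
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch * touch = [touches anyObject];
    CGPoint loc = [touch locationInView:self.view];
    // convert from view coordinates into world coordinates,
    // then move the flare there
    _flare.position = [self worldPointForViewPoint:loc];
}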

Make the flare more "lively" by animating one or more of its properties: perhaps it will oscillate in size over time, change color, or change opacity.
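
For example, a per-frame update method on the Flare class could drive the size with a sine oscillation (a sketch; call it once per frame, passing the time elapsed since the last frame):

// advance the flare's animation by dt seconds
- (void)update:(float)dt
{
    _time += dt;
    // oscillate scale between 0.8 and 1.2 at 2 Hz
    _scale = 1.0f + 0.2f * sinf( 2.0f * M_PI * 2.0f * _time );
}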

MultiFlares

You can do this or Flare Sounds first, depending on which one you think is more interesting.

Enable multiple touches in GLViewController. There are a few different ways to do this, but the easiest is to set

self.view.multipleTouchEnabled = YES;

in the viewDidLoad method.

This will cause touchesBegan, touchesMoved, and touchesEnded to provide events for multiple touches, assuming that multiple touches are in fact occurring. Use an std::map to track which flare corresponds to which touch, so that each flare moves smoothly along with the touch. If you haven't already, you would be well served to create a Flare class to encapsulate the rendering and animation logic for a given flare, which you can then instantiate as needed each time a touchesBegan event occurs.
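
One way to structure this (a sketch; the file must be compiled as Objective-C++, e.g. renamed GLViewController.mm, for std::map to be available, and Flare is your own class):

#include <map>

// keyed on the UITouch pointer, which stays the same for the
// lifetime of a touch
std::map<UITouch *, Flare *> flares;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for( UITouch * touch in touches )
        flares[touch] = [[Flare alloc] init];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for( UITouch * touch in touches )
        flares[touch].position = [touch locationInView:self.view];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for( UITouch * touch in touches )
        flares.erase( touch );
}

Remember to clear the entry in touchesCancelled:withEvent: as well, since the system can deliver that instead of touchesEnded:withEvent:.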

Flare Sounds

You can do this or MultiFlares first, depending on which one you think is more interesting.

For each flare that is on screen, have it produce a sound. This sound can be synthesized with oscillators, played back from a WAV file, or generated with any other technique you are familiar with. Make the character of the sound vary depending on the location of the touch. For example, you could change the FM modulator gain based on the X-position and the LFO frequency based on the Y-position.
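
For instance, normalizing the touch location and writing it into your synthesis parameters might look like this (a sketch; fm and its fields are placeholders for your own synthesis state, and setting them directly from the UI thread is exactly what the next paragraph addresses):

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch * touch = [touches anyObject];
    CGPoint loc = [touch locationInView:self.view];
    CGSize size = self.view.bounds.size;

    // normalize both axes to [0,1]
    float x = loc.x / size.width;
    float y = loc.y / size.height;

    // example mapping: X controls modulator gain, Y controls LFO frequency
    fm.modGain = x * 500.0f;
    fm.lfoFreq = 0.5f + y * 10.0f; // 0.5 to 10.5 Hz
}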

You'll need to make sure that you use thread-safe programming techniques to communicate between the UI thread and the audio thread, such as the method shown in lecture using circular buffers to pass messages to the audio thread.
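
A minimal sketch of that pattern, assuming a single-producer/single-consumer ring buffer (one writer, the UI thread; one reader, the audio callback; volatile here stands in for the proper atomics/memory barriers discussed in lecture):

// a simple single-producer / single-consumer message queue;
// lock-free as long as exactly one thread writes and one reads
struct Message { int param; float value; };

#define QUEUE_SIZE 64
Message queue[QUEUE_SIZE];
volatile int readIdx = 0, writeIdx = 0;

// UI thread: enqueue a parameter change
void postMessage( int param, float value )
{
    int next = ( writeIdx + 1 ) % QUEUE_SIZE;
    if( next == readIdx ) return; // queue full; drop the message
    queue[writeIdx].param = param;
    queue[writeIdx].value = value;
    writeIdx = next; // publish only after the data is written
}

// audio thread: drain pending messages at the top of the callback
void processMessages()
{
    while( readIdx != writeIdx )
    {
        Message & m = queue[readIdx];
        // ... apply m.param / m.value to the synthesis state ...
        readIdx = ( readIdx + 1 ) % QUEUE_SIZE;
    }
}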

Bonus Round (optional)

  • Introduce multiple types of flares, each with a distinctive sound, visual appearance, and animation. How should the app decide which type of flare to use when a touch occurs? (choose one at random? some sort of selection UI?)