Lab 5

Matlab: Rudimentary Pitch Tracker

For your final Matlab assignment, we’re going to build a very basic pitch tracker / resynthesizer. Our goal is to take a monophonic (one note at a time) input file and generate a synthesized version that follows the pitch of the original.

We will do this all in a single function, which will process the input signal in chunks, or blocks, and write to the output a sawtooth tone (generated with sawtoothTone()) that follows the loudest frequency in each block:

[output] = trackPitch(input, fs, windowSize)
input is the original source as a vector
fs is the original sample rate
windowSize is the size of the blocks we’ll break the input into to analyze and resynthesize, in samples. This is sometimes also called ‘blocksize’ or ‘hopsize.’ The value you choose affects the resolution of the spectrum you can generate: the frequency bins are fs/windowSize Hz apart, so the larger the window, the more accurately you can track the correct frequency (at fs = 44100 Hz, for example, a 1024-sample window gives bins about 43 Hz apart). However, it also determines the temporal resolution of the resynthesized output: the pitch of the output will only change every windowSize samples (about 23 ms for that same window).

These are the steps you will need to follow to build your trackPitch() function:

  • Create a for loop that repeats floor(length(input)/windowSize) times (we won’t worry about any extra samples we might be missing at the end).
  • For each iteration of the for loop, we need to get a block of the input signal. For example, for the first iteration we want block = input(1:windowSize), for the second we want block = input(windowSize+1:2*windowSize), and so forth. How can we use our for loop to get the right block based on which iteration we are on? (The sketch after this list shows one way.)
  • Next, we use our getSpectrum() function from lab 3 on that block. We want to get the loudest frequency in that block, so use [M,I] = max(Y), where Y is the Y returned from getSpectrum(). The I value will be the index of the biggest value in Y, so we are looking for the corresponding value in the F from getSpectrum(), which will be F(I).
  • Generate one block of your output sawtooth signal for the current iteration of your for loop. There are two ways you can do this:
    • The easier (albeit slower and less correct) way to do this is to append a sawtooth at the frequency we found to output using the [A,B] notation, which concatenates vectors A and B. You’ll need to create an empty output vector before the for loop with output = []. Then we can repeatedly append a sawtooth to our output on each iteration of the for loop by typing output = [output, sawtoothTone(fs, F(I), 0, windowSize/fs, 8)]; (Here I’ve used 8 harmonics in my sawtooth tone, but feel free to use however many you want.)
    • The cleaner and faster way to do this is to create the correct length output vector before your for loop (hint: use zeros()), and then use the same indexing within your for loop that you use to access samples in the input to assign the sawtoothTone() you generate for that block to your output.
  • Either way, you should end up with a continuous sawtooth wave output that follows the pitch of your original input.
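
Here is one way those steps can fit together. Treat this as a sketch rather than the answer: it assumes getSpectrum() from lab 3 takes (block, fs) and returns [Y, F], and that your sawtoothTone() returns exactly windowSize samples for a duration of windowSize/fs; adjust both to match your own versions.

  function [output] = trackPitch(input, fs, windowSize)
      % Report the windowSize you used for your final output in a comment here
      numBlocks = floor(length(input) / windowSize);  % ignore leftover samples
      output = zeros(numBlocks * windowSize, 1);      % preallocate (the faster way)
      for n = 1:numBlocks
          % Block n covers samples (n-1)*windowSize+1 through n*windowSize
          startIdx = (n - 1) * windowSize + 1;
          endIdx = n * windowSize;
          block = input(startIdx:endIdx);
          % Find the loudest frequency in this block
          [Y, F] = getSpectrum(block, fs);  % assumed signature from lab 3
          [M, I] = max(Y);                  % I is the index of the biggest value in Y
          % Resynthesize this block as a sawtooth at that frequency (8 harmonics here)
          output(startIdx:endIdx) = sawtoothTone(fs, F(I), 0, windowSize/fs, 8);
      end
  end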

Once you’re done with your trackPitch() function, make a short recording (10-15 seconds) of yourself singing or playing a monophonic instrument (using GarageBand, Audacity, or whatever you wish; talk to Mark if you need help doing this) and use trackPitch() to generate a digital sawtooth version. Experiment with different power-of-2 values for windowSize to see what gives you the best balance of frequency and time resolution.
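
For reference, that workflow might look like the following in MATLAB (the filenames and the windowSize of 1024 are placeholders; substitute your own):

  [x, fs] = audioread('myRecording.wav');  % your monophonic recording
  x = mean(x, 2);                          % fold stereo down to mono if needed
  y = trackPitch(x, fs, 1024);             % try different power-of-2 window sizes
  soundsc(y, fs);                          % audition the result
  audiowrite('sawtoothVersion.wav', 0.99 * y ./ max(abs(y)), fs);  % normalize and save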

Matlab Deliverables:

Submit your trackPitch() function along with your original recorded .wav file and generated digital sawtooth version. Report the value you used for windowSize for your final output in a comment in your function. You do not need to submit a separate Matlab script.

Game Development: Virtual Reality

Develop a game in VR using the Oculus DK2 or Vive in Unity. We’re intentionally leaving the rest of the assignment up to you. Now that you’ve had some experience with Unity, it’s time to explore what you can do when freed from the confines of a 2-D window into your game!

Some things to consider:

  • The player now has the freedom to look around. This adds all kinds of possibilities for gameplay elements that happen ‘offscreen,’ requiring the player to look around to discover things, interact with the game, etc. A VR game that only happens in front of the player is missing the point of VR.
  • Sight happens in front of us, but sound happens around us. Sound is a great way to tell the player they have to look somewhere. Use sound sources with spatialization to direct the player’s attention. Note: on sound sources there is a ‘Spatial Blend’ attribute that determines how much the sound source is 3D-spatialized, and the default is 2D. You’ll want to move this setting all the way to 3D on each of your sound sources to take full advantage of Unity’s built-in sound spatialization (a one-line script for this is sketched after this list). You’ll want to use headphones for the best experience.
  • Continuous movement in VR can be uncomfortable and potentially nauseating. Many early VR games suffered from causing discomfort in the player by moving them around too fast. Many more recent VR titles solve this problem by employing a ‘teleport’ locomotion mechanic - the player selects where they want to move to and they ‘blink’ to that location. This may be difficult to implement on the Oculus since you won’t have a tracked controller, but it’s something to consider. If you choose to implement more traditional movement, make sure it’s at a speed that doesn’t make the player feel sick (playing seated often helps). It’s also possible to create a game that doesn’t rely on player locomotion (i.e. the player stays in one location and only looks around).
  • Also related to movement: directions need to be considered relative to where the player is looking. If you map forward movement to the global z axis, for example, a player who is looking to their left will slide off to their right when they try to move forward, which feels unnatural. You want to base player movement / actions on the direction the camera is facing (see the movement sketch after this list).
  • The Vive can handle room-scale VR, but the Oculus cannot. It is possible to use the Oculus standing up so the player has a little more freedom to look around, but the physical player needs to stay in one spot - seated is best.
  • If you work with the Vive, you’ll have tracked controllers you can use for interaction. These are great but require some additional work to customize the interaction you want - see the tutorial below.
  • With the Oculus, you’ll need to consider how to handle player input. Keyboard input is a possibility, but keep in mind the player won’t be able to see the keyboard, and if they need to look behind them they can’t do that and keep their hands on the keyboard. Other controllers we’ve used are a possibility: the Leap Motion can work well with VR, especially when mounted on the headset (we’ll have some 3D-printed mounts that will work with the DK2). Remember to make the Leap Controller a child of the headset / camera so it moves with the player as they look around. The GameTrak or Kinect are other potentially interesting options, and if you have access to a traditional (Xbox / PS4 style) game controller, that’s another good option. Of course, you could create a game that only requires the player to look at something to interact (maybe with a timer, something we’ll discuss in class on Friday). In this case it would be nice to have something that visually represents a focal point or target that moves with the headset; a Physics.Raycast from the camera through this point would be one way to check if the player is looking at a specific object (see the gaze sketch after this list).
  • We’ve only talked about Wekinator briefly in class, but this lab is ripe for using it (and it is pretty easy to use if you haven’t before). The first thing that comes to mind is to use the headset position / orientation as inputs to Wekinator, or you could train gestures from the Leap Motion or hand position from the GameTrak. Wekinator’s a great way to use machine learning to get usable control from imprecise input. Remember you’ll want to use the OSC scripts we went over in class to send values to and from Wekinator if you decide to use it. You can find out more about Wekinator on the resources page.
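
A few sketches for the points above. First, the Spatial Blend setting: you can drag the slider in the Inspector, or set it from a script to make sure every source is fully 3D. AudioSource.spatialBlend is Unity’s standard API for this; the component name below is just a suggestion.

  using UnityEngine;

  // Attach to any GameObject with an AudioSource to force fully 3D spatialization.
  public class Force3DAudio : MonoBehaviour
  {
      void Awake()
      {
          AudioSource source = GetComponent<AudioSource>();
          source.spatialBlend = 1.0f;  // 0 = 2D (the default), 1 = fully 3D
      }
  }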
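
Second, camera-relative movement: one way to base movement on where the player is looking is to flatten the camera’s forward and right vectors onto the ground plane. This sketch assumes keyboard or gamepad axes and moves the transform directly; adapt it to whatever input and locomotion scheme you choose.

  using UnityEngine;

  // Moves the player in the direction the VR camera faces, not along global axes.
  public class LookRelativeMovement : MonoBehaviour
  {
      public Transform vrCamera;  // assign the headset camera in the Inspector
      public float speed = 1.5f;  // keep this slow so the player doesn't feel sick

      void Update()
      {
          // Project forward/right onto the horizontal plane so looking up or
          // down doesn't steer the player into the floor or the sky.
          Vector3 forward = Vector3.ProjectOnPlane(vrCamera.forward, Vector3.up).normalized;
          Vector3 right = Vector3.ProjectOnPlane(vrCamera.right, Vector3.up).normalized;

          Vector3 move = forward * Input.GetAxis("Vertical") + right * Input.GetAxis("Horizontal");
          transform.position += move * speed * Time.deltaTime;
      }
  }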
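
Third, gaze-based interaction with a timer: cast a ray from the camera each frame and trigger something when the player holds their gaze on an object long enough. Physics.Raycast and SendMessage are standard Unity API, but the dwellTime value and the OnGazeActivate message are hypothetical names for illustration.

  using UnityEngine;

  // Sends a message to whatever collider the player has looked at for dwellTime seconds.
  public class GazeSelector : MonoBehaviour
  {
      public Transform vrCamera;      // the headset camera
      public float dwellTime = 2.0f;  // seconds the player must hold their gaze
      public float maxDistance = 10.0f;

      private GameObject currentTarget;
      private float gazeTimer;

      void Update()
      {
          RaycastHit hit;
          if (Physics.Raycast(vrCamera.position, vrCamera.forward, out hit, maxDistance))
          {
              if (hit.collider.gameObject != currentTarget)
              {
                  currentTarget = hit.collider.gameObject;  // new target, restart the timer
                  gazeTimer = 0f;
              }
              gazeTimer += Time.deltaTime;
              if (gazeTimer >= dwellTime)
              {
                  // Hypothetical handler; implement it on your interactive objects.
                  currentTarget.SendMessage("OnGazeActivate", SendMessageOptions.DontRequireReceiver);
                  gazeTimer = 0f;
              }
          }
          else
          {
              currentTarget = null;
              gazeTimer = 0f;
          }
      }
  }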

Getting Started with VR in Unity

If you’re working with the Oculus, this is Unity’s official (somewhat deprecated) page on setting up Unity to work with VR, although it focuses more on non-Mac machines. This is another page that talks more specifically about setting up the DK2 on a Mac. Be aware that performance is not going to be commercial-grade, since Macs don’t officially meet the graphics performance requirements, but for the sake of this lab it should be fine.
If you are working with the Vive, I highly recommend this tutorial for getting started. The machine you’ll work on should already have the SteamVR package for Unity installed (if you have your own equipment, no doubt you’ve done so already; if not, it’s available in the Asset Store), but you’ll need to import it into your project.

If your group wants to work with the Vive and you don’t have your own equipment to use, please let Mark know as soon as possible so we can make sure things are set up for you, show you what you need to know about the hardware, and set up a schedule that accommodates everyone, since we only have one machine for the class to share.

While the Vive is undoubtedly superior to the DK2, you can create awesome games with the latter. There’s also the added convenience of being able to take the DK2 home and work on your own machine to consider, especially since we only have one week for this lab. Remember you have the option of using the Vive for your final project if you elect not to use it for this lab. There is also a little more involved in creating a game for the Vive, which you might prefer to tackle with more time than just the week; check out the Vive tutorial above to get an idea if you’re on the fence. I’m not trying to talk anyone out of using the Vive; just be realistic about what you can prepare for Tuesday, and plan ahead to make sure you have the time you need to create your game.

Game Design Deliverables:

Submit links to your pitch and playtest videos (details on the Lab Overview page if you’ve forgotten). You do not need to submit the actual Unity project. Be aware that screen capture may not be the best option for this lab.

Both the individual Matlab and group Game Development portions of Lab 5 are due Tuesday, May 16th by 6:00 PM. We will play your VR games in class that evening. If your group develops a game for the Vive, please plan on coming to class a little early to make sure you’re set up for gameplay.
