Lab 4

Matlab/Python: Basic Filtering

A filter is a system through which a signal can be passed to change the amplitude and phase of any sinusoidal component of that signal. There are four basic filter types:
1) a low-pass filter, which lets low frequencies pass through but cuts out high frequencies (also called a high-cut filter).
2) a high-pass (or low-cut) filter, which lets high frequencies pass through but cuts out low frequencies.
3) a band-pass filter, which allows frequencies through in some band in the middle of the spectrum, but filters out high and low frequencies.
4) a notch (or band-stop) filter, which allows all frequencies through except frequencies within some band in the middle of the spectrum.
The frequency at which a filter starts reducing the response is called the cutoff frequency; for band-pass and notch filters, the band is typically described by its center frequency together with its two edge (cutoff) frequencies.

We’re going to design some basic filters just to give you some practice filtering audio signals.

First, we need to create a signal we can filter. A good signal to use for this is random noise. This will also give us a chance to add a new function to our toolbox for generating noise, which can be coupled with sounds you’ve already generated to deepen your sonic palette - many sounds can be modelled as a combination of tonal components and noise components (often filtered noise).

Design a function that generates white noise:

[output] = generateNoise(fs, duration)
fs: sampling rate
duration: length of noise signal in seconds

White noise can be generated by simply creating a stream of random numbers between -1.0 and 1.0 (although we know by now it’s best to avoid hitting -1.0 and 1.0 exactly; if you want to include an amplitude argument in your generateNoise(), good on you!). Handy Matlab functions for generating random numbers are rand(), which is uniform on [0, 1] and can be mapped to [-1, 1], and randn(), which is Gaussian-distributed and will need scaling to keep the values in range.
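
If you’re taking the Python route, here is a minimal sketch of generateNoise() using NumPy. The amplitude argument is the optional extra mentioned above, not part of the required spec:

import numpy as np

def generateNoise(fs, duration, amplitude=0.9):
    # fs: sampling rate in Hz; duration: length of the noise signal in seconds.
    # amplitude < 1.0 keeps samples safely away from full scale (-1.0 / 1.0).
    numSamples = int(fs * duration)
    # Uniform random samples in [-1.0, 1.0), scaled by amplitude.
    return amplitude * np.random.uniform(-1.0, 1.0, numSamples)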

Plot the spectrum of your white noise signal using your getSpectrum() function from last lab. Try different lengths / durations. What do you see in the spectrum for your noise signal? Why does white noise make a good signal for testing filters? Answer in script comments.
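
If you haven’t ported getSpectrum() to Python (its exact signature from last lab may differ from what’s shown here), a quick stand-in can be built from NumPy’s FFT and plotted with Matplotlib:

import numpy as np
import matplotlib.pyplot as plt

def plotSpectrum(x, fs):
    # Magnitude spectrum (in dB) of a real-valued signal.
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    plt.plot(freqs, 20 * np.log10(np.abs(X) + 1e-12))  # small offset avoids log(0)
    plt.xlabel('Frequency (Hz)')
    plt.ylabel('Magnitude (dB)')
    plt.show()

fs = 44100
plotSpectrum(generateNoise(fs, 2.0), fs)  # uses the generateNoise() sketch above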

Next, look up the Matlab functions butter and filter. You’ll see that the outputs of one can be used as arguments to the other. These vectors (B and A by convention) are the coefficients of the filter transfer function’s numerator and denominator (if you want to learn more, here’s a good place to start, or take Music 320A/B and/or EE 264). Use these two functions to design and implement the following filters for the given signals (a Python sketch using the scipy equivalents follows the list):

  • White noise low-pass filtered with a cutoff frequency of 800 Hz.
  • White noise high-pass filtered with a cutoff frequency of 1500 Hz.
  • A short clip of a song of your choice band-pass filtered with parameters of your choice.
  • A short clip of a song of your choice notch filtered with parameters of your choice.
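
If you’re working in Python, scipy.signal provides near-identical counterparts to Matlab’s butter and filter (butter and lfilter). Here is a sketch of all four; the filter order (4), the band edges for the band-pass and notch examples (500 and 2000 Hz), and the clip filename are placeholder choices you should replace with your own:

import numpy as np
from scipy import signal
from scipy.io import wavfile

fs = 44100
nyq = fs / 2.0                     # cutoffs are specified as a fraction of Nyquist
noise = generateNoise(fs, 2.0)     # from the sketch above

# Low-pass white noise at 800 Hz: butter() returns the B (numerator) and
# A (denominator) coefficients; lfilter() applies them to a signal.
B, A = signal.butter(4, 800 / nyq, btype='low')
lowpassNoise = signal.lfilter(B, A, noise)

# High-pass white noise at 1500 Hz.
B, A = signal.butter(4, 1500 / nyq, btype='high')
highpassNoise = signal.lfilter(B, A, noise)

# Load a song clip (placeholder filename); fold stereo down to one channel
# and normalize so the samples sit in [-1, 1].
clipRate, clip = wavfile.read('my_clip.wav')
clip = clip.astype(np.float32)
if clip.ndim > 1:
    clip = clip[:, 0]
clip = clip / np.max(np.abs(clip))
clipNyq = clipRate / 2.0

# Band-pass and notch (band-stop) with placeholder band edges.
B, A = signal.butter(4, [500 / clipNyq, 2000 / clipNyq], btype='bandpass')
bandpassClip = signal.lfilter(B, A, clip)

B, A = signal.butter(4, [500 / clipNyq, 2000 / clipNyq], btype='bandstop')
notchClip = signal.lfilter(B, A, clip)

# scipy's counterpart to audiowrite(): write float32 .wav files.
wavfile.write('lowpass_noise.wav', fs, lowpassNoise.astype(np.float32))

Plot each result with your getSpectrum() (or the stand-in above) to confirm the filters behave as expected before writing the .wav files.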

Matlab/Python Deliverables:

Submit your generateNoise() function and test script code, which should include your implementation of the four filters. Also submit spectrum plots and .wav files of the four filtered signals listed above. Remember audiowrite() can be used to generate .wav files from signal vectors.

Game Development: Mixed Reality

Develop a game in mixed reality. You are free to use any virtual reality (VR) equipment you have, or those offered by the class. You are also allowed to design for augmented reality (AR) games built for smartphones. We’re intentionally leaving the rest of the assignment up to you. Now that you’ve had some experience with Unity, it’s time to explore what you can do when freed from the confines of a 2-D window into your game!

Some things to consider:

  • The player now has the freedom to look around. This adds all kinds of possibilities for gameplay elements that happen ‘offscreen,’ requiring the player to look around to discover things, interact with the game, etc. A mixed reality game that only happens in front of the player is missing the point of the medium.
  • Sight happens in front of us, but sound happens around us. Sound is a great way to tell the player they have to look somewhere, so use spatialized sound sources to direct the player’s attention. Note: each sound source has a ‘Spatial Blend’ attribute that determines how much the source is 3D-spatialized, and the default is 2D - you’ll want to move this setting all the way to 3D on each of your sound sources to take full advantage of Unity’s built-in sound spatialization. Use headphones for the best experience.
  • If you are building for VR gear, keep in mind that continuous movement in VR can be uncomfortable and potentially nauseating. Many early VR games caused discomfort by moving the player around too fast. Many more recent titles solve this problem with a ‘teleport’ locomotion mechanic - the player selects where they want to move and ‘blinks’ to that location. This may be difficult to implement on the Oculus since you won’t have a tracked controller, but it’s something to consider. If you choose to implement more traditional movement, make sure it’s at a speed that doesn’t make the player feel sick (playing seated often helps). It’s also possible to create a game that doesn’t rely on player locomotion at all (i.e. the player stays in one location and only looks around).
  • If you are building an AR game for smartphone, keep in mind that scaling in augmented reality can be tricky to manage. Even though Unity, ARKit, and ARCore all think in meters, motion forces won't scale with the size of your game, making it hard to generalize games to multiple spaces. Luckily, Unity has some facilities to help navigate this problem, which are detailed in this blog post.
  • Remember that directions need to be considered relative to where the player is looking! If you map forward movement to the global z axis, for example, a player looking to their left will slide to the right when they try to move forward, which feels unnatural. Base player movement and actions on the direction the camera is facing.
  • Take advantage of your device's inputs. If you're using a VR system, make good use of the variety of inputs you have from your headset and remotes. If you're building an AR smartphone game, remember that your phone has a variety of data sources including the touchscreen, microphone, cameras, accelerometers, and more. Think creatively about game mechanics!
  • This lab is ripe for using trainable controls with Wekinator (which is pretty easy to use if you haven’t before). You can train it with any of the inputs you have available, enabling you to code much more complex behaviors than would be possible in vanilla Unity. Wekinator’s a great way to use machine learning to get usable control from imprecise input. If you go this route, you’ll want to use OSC messages to send values to and from Wekinator (see the sketch below). You can find out more about Wekinator on the resources page.
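
In Unity you’d talk to Wekinator through an OSC plugin in C#, but the message traffic itself is simple. Here is a Python sketch, assuming the python-osc package and Wekinator’s default configuration (inputs sent to port 6448 at /wek/inputs, outputs received on port 12000 at /wek/outputs) - adjust if you’ve changed these in the Wekinator GUI:

from pythonosc.udp_client import SimpleUDPClient
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

# Send input values for Wekinator to train on or classify.
client = SimpleUDPClient("127.0.0.1", 6448)
client.send_message("/wek/inputs", [0.25, 0.75])  # list length must match your input count

# Listen for Wekinator's model outputs.
def handleOutputs(address, *args):
    print(address, args)

dispatcher = Dispatcher()
dispatcher.map("/wek/outputs", handleOutputs)
BlockingOSCUDPServer(("127.0.0.1", 12000), dispatcher).serve_forever()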

Getting Started with Mixed Reality (AR/VR) in Unity

For people building on VR gear: if you’re working with the Acer headset, this is Microsoft’s official page on setting up Unity to work with VR. This is another page that talks about working with MR on Windows. If you are working with the Vive, I highly recommend this tutorial for getting started.

For people building AR smartphone games: iOS and Android each have their own framework for AR development (ARKit for iOS and ARCore for Android). Luckily, Unity has developed its own cross-platform framework, called AR Foundation, to enable development in both environments. Unity has an overview of AR Foundation 2.1 (the version for Unity 2019.3 and higher) and a GitHub repo containing samples demonstrating the functionality of AR Foundation. Go through the overview to familiarize yourself with the API. If you are using Unity 2020.1, you'll need to use AR Foundation 3.0, and if you're using Unity 2020.2 you'll need AR Foundation 4.0.

AR Development Quirks

Deploying AR games to smartphones is a little complicated when working within Apple's ecosystem. Although Unity's interface implies it can build for iOS devices, it actually can't. Instead, it builds the Unity project as an Xcode project, which must then be built from Xcode to work on an iPhone; Xcode is specific to Apple computers, and does not exist for other OSes. Even then, the game will only work on a trusted device physically connected to the Mac used to build the game from Xcode.

Android devices are somewhat easier to build for. Unity will build directly for Android phones, and those games can be installed directly from a download.

Although AR Foundation is designed to be cross-platform, there are some functions in both ARKit and ARCore that are not yet included in AR Foundation; you can see a list of features here. Perhaps the most significant of these for our purposes are ARKit's 3D object tracking and body tracking functions, which are not supported by ARCore and will therefore only work properly on iOS devices. If you use this functionality, it will be more complicated to build your game for multiple platforms. However, for this assignment, you should focus on building a game for your own device, and not worry about deploying it more broadly.

Game Design Deliverables:

Submit links to your pitch and playtest videos (details on the Lab Overview page if you’ve forgotten). You do not need to submit the actual Unity project. Be aware that screen capture may not be the best option for this Lab.

See Canvas for due dates. We will play your games in class that day. Please make sure your hardware/software is ready to go for live gameplay.

Lecture

Fridays, 9:45 - 11:45 AM
CCRMA Classroom (Knoll 217)

Lab

Tuesdays, 6:00 - 7:50 PM
CCRMA Classroom (Knoll 217)

Office Hours

See Canvas
CCRMA Classroom/Discord

Questions

Post on Discord or Email

Instructors

Poppy Crum
Instructor
poppy(at)stanford(dot)edu

Lloyd May
Teaching Assistant
lloydmay(at)stanford(dot)edu