220C Sadlier – Performing with Ambisonics & VR

Introduction

This wiki page will serve as an intermediate space for weekly blog-style updates on the development of my 220C final project. I will soon move these updates to a dedicated website that will allow me to embed samples of my progress in Unity & PD.

Week 1

During the first class of week 1, I began brainstorming final project ideas and compiled a list of platforms I was most interested in working with. I have used Max 7 quite a bit in the past, and PureData's similar paradigm, plus its ease of integration with other platforms, made it a compelling choice as my primary audio platform. I have also been interested in gaining more experience with Unity's audio tools and in experimenting with the platform's basic ambisonics support. After deciding to integrate PureData with a Unity project through libpd, I quickly realized that my project would involve a VR component.
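
As a first step toward that integration, here is a minimal sketch of a Unity script that pulls audio from a Pd patch through libpd. It assumes the libpd C# bindings (the LibPD static class and LibPDBinding namespace from libpd-csharp); the patch name "synth.pd" and the "freq" receiver are hypothetical, and exact method signatures vary by binding version.

    using UnityEngine;
    using LibPDBinding;  // namespace from the libpd-csharp bindings (assumption)

    // Minimal sketch: route a Pd patch's output into Unity's audio chain.
    public class PdPatchPlayer : MonoBehaviour
    {
        private int patch;                                 // handle returned by OpenPatch
        private readonly float[] inBuffer = new float[0];  // no audio input used here

        void Start()
        {
            // 0 input channels, 2 output channels (assumes Unity's output is stereo)
            LibPD.OpenAudio(0, 2, AudioSettings.outputSampleRate);
            LibPD.ComputeAudio(true);  // equivalent of switching on DSP in Pd
            // "synth.pd" is a hypothetical patch shipped with the project
            patch = LibPD.OpenPatch(Application.streamingAssetsPath + "/synth.pd");
        }

        // Runs on Unity's audio thread; Pd fills Unity's output buffer directly.
        void OnAudioFilterRead(float[] data, int channels)
        {
            int ticks = data.Length / (64 * channels);  // Pd processes 64 frames per tick
            LibPD.Process(ticks, inBuffer, data);
        }

        // Forward a control value to a [receive freq] object in the patch (hypothetical name).
        public void SetFrequency(float hz)
        {
            LibPD.SendFloat("freq", hz);
        }

        void OnDestroy()
        {
            LibPD.ClosePatch(patch);
        }
    }

Because OnAudioFilterRead sits in an AudioSource's filter chain, the patch's output can then be spatialized like any other Unity sound.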

The initial project plan I had by the end of week 1 was to create a performance piece for the CCRMA stage in which a performer wears VR goggles and interacts with instruments in a virtual environment that produce sound at the corresponding locations on the real stage. The boundaries of the virtual space, in combination with a room-scale positioning system, guide the performer to (safely) walk around the stage (and perhaps around the audience). The audience does not see the virtual world the performer is interacting with, but it can hear the soundscape of that world in vivid detail through the stage's ambisonics array.
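
For a sense of the math behind that localization: in first-order ambisonics, a mono source at azimuth and elevation relative to the listener is encoded into four B-format channels with simple trigonometric gains, which the decoder then maps onto the speaker array. The sketch below is purely illustrative (the class name is mine); it uses the traditional FuMa convention, while ambiX/SN3D, used by most Unity spatializers, uses W = 1 instead.

    using UnityEngine;

    // Illustrative first-order (B-format) panning gains for a mono source.
    public static class FoaEncoder
    {
        // azimuth/elevation in radians, relative to the listener
        public static Vector4 Gains(float azimuth, float elevation)
        {
            float w = 1f / Mathf.Sqrt(2f);                        // omnidirectional
            float x = Mathf.Cos(azimuth) * Mathf.Cos(elevation);  // front/back
            float y = Mathf.Sin(azimuth) * Mathf.Cos(elevation);  // left/right
            float z = Mathf.Sin(elevation);                       // up/down
            return new Vector4(w, x, y, z);  // multiply each gain by the mono sample
        }
    }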

Week 2

  • Discovered the VR music creation platform LyraVR.
  • Emailed the staff at VHIL requesting a tour of their lab space. I am hoping to gain insight into their method of positional tracking for room-scale VR.

Week 3

  • Received an email response from VHIL: the staff are away for the next couple of weeks, but they will be available for a visit at the beginning of May.
  • Downloaded the Wwise SDK for Unity and for Linux & macOS development. Wwise offers ambisonics support and a suite of spatial audio plugins for Unity, and I began looking for features unique to Wwise that could accelerate and/or simplify development of the virtual soundscape I will build in Unity.
  • Downloaded FMOD Studio for Unity development. FMOD also supports ambisonic audio, albeit in a more limited scope than Wwise; it relies on Google's Resonance Audio SDK for in-the-box ambisonics support. I plan to experiment with integrating FMOD and Unity this upcoming weekend (see the sketch after this list).
  • Updated Unity and PureData to their most recent versions.
  • Began experimenting with example Unity projects that integrate libpd.
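
Ahead of that FMOD experiment, here is a minimal sketch of driving a spatialized FMOD Studio event from a Unity object's position, the basic mechanism I expect to use for the virtual instruments. The event path "event:/VirtualInstrument" is hypothetical; with Resonance Audio, the event would route through the Resonance Audio Source plugin inside FMOD Studio.

    using UnityEngine;
    using FMODUnity;

    // Minimal sketch: keep a 3D FMOD event glued to a virtual instrument's position.
    public class VirtualInstrumentVoice : MonoBehaviour
    {
        private FMOD.Studio.EventInstance voice;

        void Start()
        {
            // "event:/VirtualInstrument" is a hypothetical event path
            voice = RuntimeManager.CreateInstance("event:/VirtualInstrument");
            voice.set3DAttributes(RuntimeUtils.To3DAttributes(transform));
            voice.start();
        }

        void Update()
        {
            // re-sync the event's 3D position to this object every frame
            voice.set3DAttributes(RuntimeUtils.To3DAttributes(transform));
        }

        void OnDestroy()
        {
            voice.stop(FMOD.Studio.STOP_MODE.ALLOWFADEOUT);
            voice.release();
        }
    }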