220C Sadlier – Performing with Ambisonics & VR
This wiki page will serve as an intermediate space for weekly blog-style updates about the development of my 220C final project. I will soon move these updates to a dedicated website that will allow me to embed samples of my development progress in Unity.
During the first class of week 1, I began considering the form my final project would take and the platforms I would need to realize it. I have used Max 7 quite a bit in the past and have been interested in PureData's ease of integration (specifically through libpd). I decided I would like to use PureData, integrated through libpd.
- Discovered LyraVR, a VR music-creation platform.
- Emailed the staff at VHIL requesting a tour of their lab space. I am hoping to gain insight into their method of positional tracking for room-scale VR.
- Received an email response from VHIL: the staff are away for the next couple of weeks, but they will be available for a visit at the beginning of May.
- Downloaded the Wwise SDK for Unity and Linux & macOS development. Wwise offers ambisonics support and a suite of spatial audio plugins for Unity, and I began looking for features unique to Wwise that could accelerate and/or simplify development of the virtual soundscape I will build in Unity.
- Downloaded FMOD Studio for Unity development. FMOD also supports ambisonic audio, albeit in a more limited scope than Wwise; it relies on Google's Resonance Audio SDK for in-the-box ambisonics support. I plan to experiment with integrating FMOD and Unity this upcoming weekend.
- Updated Unity and PureData to their most recent versions.
- Began experimenting with example Unity projects that integrate libpd.
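Since the middleware comparison above centers on ambisonics support, here is a minimal sketch of the core idea those tools implement: encoding a mono source into first-order ambisonics (B-format). This is an illustrative example, not code from Wwise, FMOD, or Resonance Audio; it assumes the AmbiX convention (ACN channel ordering W, Y, Z, X with SN3D normalization), which is the format Resonance Audio works with.

```python
import math

def encode_foa(sample, azimuth, elevation):
    """Encode a mono sample into first-order ambisonics (AmbiX: ACN order, SN3D norm).

    azimuth: radians, counterclockwise from straight ahead.
    elevation: radians, up from the horizontal plane.
    Returns the four B-format channels [W, Y, Z, X].
    """
    w = sample                                              # omnidirectional component
    y = sample * math.sin(azimuth) * math.cos(elevation)    # left/right figure-eight
    z = sample * math.sin(elevation)                        # up/down figure-eight
    x = sample * math.cos(azimuth) * math.cos(elevation)    # front/back figure-eight
    return [w, y, z, x]

# A source directly in front (azimuth 0, elevation 0) puts all
# directional energy into X, with nothing in Y or Z:
print(encode_foa(1.0, 0.0, 0.0))  # → [1.0, 0.0, 0.0, 1.0]
```

The appeal of this representation for VR is that the whole encoded sound field can be rotated to follow head tracking with a single matrix applied to these channels, before decoding to speakers or binaural output.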