Monday, Oct 24, 2022
Music 256A / CS476A, Stanford University
HW2: Sound Peeking – Final Deliverables
Name of my audio visualizer
- An audio visualizer inspired by the music video of Yoasobi’s Gunjo, a song that encourages listeners to immerse themselves in what they like and express what they see.
Production build (macOS platform)
Instructions for using the audio visualizer
- Platform: macOS
- Step 1: Download the build file and save it to a local folder.
- Step 2: Right-click the file → click “Open”. Allow it to use your microphone.
- If you run into an “application cannot be opened” error, restore the executable flag by running “chmod -R +x <app name>.app/Contents/MacOS” in the Terminal, then try opening the file again.
- Step 3: Let the narrative play (for about 75 seconds).
- Step 4: Press the 1/2/3 number keys to toggle among the three main camera views, and experiment with the different spectrum/waveform visualizations using your microphone.
Where/how the spectrum history and waveform are visualized, per view (keyboard input):
- View 1 (key “1”)
  - Spectrum history: Circles (raindrops) on the window
  - Waveform: 1) The red wave along the windowsill, 2) Vertical movement of the car
- View 2 (key “2”)
  - Spectrum history: The road surface
  - Waveform: Red wires between the poles
- View 3 (key “3”)
  - Spectrum history: Circles around the rotating planet
  - Waveform: Stars expand/shrink according to the average of the absolute value of each data point in the waveform
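The star behavior in view 3 boils down to averaging the absolute value of the waveform samples each frame. The project itself is written in Unity C#; the following is a rough Python stand-in for that computation, where the function names and the scale range are illustrative assumptions, not values from the project:

```python
def waveform_loudness(samples):
    """Mean absolute value of the waveform samples -- a cheap loudness proxy.

    Python stand-in for the Unity C# code; in the project this drives the
    scale of the stars each frame.
    """
    if not samples:
        return 0.0
    return sum(abs(s) for s in samples) / len(samples)


def star_scale(samples, base=1.0, max_scale=3.0):
    """Map loudness onto a star scale between `base` and `max_scale`.

    The scale range here is illustrative, not the project's actual values.
    """
    return base + (max_scale - base) * min(waveform_loudness(samples), 1.0)
```

Silence leaves the stars at their base size; a full-scale waveform pushes them to the maximum.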
Comments on constructing the audio visualizer and difficulties encountered
- I started this project with the idea of using the audio waves to represent earthquake waves. From there, I made an earth object in Unity, with stars and planets around it. Then I took a turn and decided to play with perspective, using camera panning and zooming: I thought I could zoom into a city and then zoom out to the entire earth, capturing micro and macro motion with the spectrum history and the time-domain waveform. The most difficult part was brainstorming what in the city, or on the earth, could be aesthetically represented by the waves, which eventually led to the moving car, the raindrop ripples, the expanding stars, etc. Next, I played with moving the camera in a way that matches the dynamics of the music. Finally, I processed the music with Chunity, adding a low pass filter and a high pass filter, each set to a different random frequency once every few seconds, to make the music sound more unnatural.
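The filter trick described above, re-randomizing a low-pass and a high-pass cutoff every few seconds, can be sketched as a small scheduler. The real version lives in ChucK (via Chunity) and drives actual filter unit generators; this is a hypothetical Python stand-in, with illustrative frequency ranges and a 2-second default interval:

```python
import random


class RandomFilterCutoffs:
    """Re-randomize low-pass and high-pass cutoff frequencies every few seconds.

    Sketch of the scheduling logic only -- in the project this runs in ChucK
    via Chunity. The frequency ranges and the default interval are assumptions.
    """

    def __init__(self, interval=2.0, rng=None):
        self.interval = interval
        self.rng = rng or random.Random()
        self.elapsed = interval  # force a randomization on the first update
        self.lpf_freq = None
        self.hpf_freq = None

    def update(self, dt):
        """Advance time by dt seconds; return the current (lpf, hpf) cutoffs."""
        self.elapsed += dt
        if self.elapsed >= self.interval:
            self.elapsed = 0.0
            self.lpf_freq = self.rng.uniform(500.0, 8000.0)  # low-pass cutoff (Hz)
            self.hpf_freq = self.rng.uniform(20.0, 400.0)    # high-pass cutoff (Hz)
        return self.lpf_freq, self.hpf_freq
```

Between randomizations the cutoffs hold steady, so the filtering audibly "jumps" every few seconds rather than sweeping.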
- The FA22_Music256A_CS476A Discord chat was a big help: when I ran into issues such as build errors, classmates who had encountered the same problems helped me out.
HW2: Sound Peeking – Milestone 1
For this milestone, I:
- Completed Unity’s Roll a Ball tutorial
- Screenshots of progress
Applying input data to the Player
Displaying the count value
- Link to screen recording: https://drive.google.com/file/d/1yBn-i3H5m2gGBqB5WUz6va1UImRArUy5/view?usp=sharing
- Link to build: https://drive.google.com/file/d/1Ru1Kc3F1co8Xotp5dBcjfzvEmSMxJd6E/view?usp=sharing
- Comments on progress and experience: The Roll A Ball tutorial was very straightforward, as I had already done it a couple of years ago when I first started learning Unity. Nevertheless, it helped me brush up my Unity skills, since I had not used the game engine since the spring.
- Completed Chunity tutorials
- Screenshots of progress
Responding to Unity Events / Using Unity Variables (1)
Using Unity Variables (2): Chuck Global Variables
- Link to screen recording: https://drive.google.com/file/d/1r-pYidwEl5CPtMMRXm-DwsfJhEsm0ILP/view?usp=sharing
- Link to build: https://drive.google.com/file/d/1SbqkKE6rcO2Fm15H3K12KCCbeLbYbeZB/view?usp=sharing
- Comments on progress and experience: The Chunity tutorial was a fun and challenging experience. It was fun because it was my first experience with audiovisual design in Unity; it was challenging because I am still not very familiar with the ChucK language.
- Played with sndpeek
- Completed Ge and Kunwoo’s Artful Design TV tutorial “HelloAudioVisualizer”
- Screenshots of progress:
Step 1: Getting Audio Data
Step 2: Visualizing Waveform
Step 3: Visualizing Spectrum
- Link to screen recording: https://drive.google.com/file/d/1vdwiBLQW0Xc1TQ-1GUUJCr42FtHxb1e6/view?usp=sharing
- Link to build: https://drive.google.com/file/d/1qblk9-kVWUtULqHpZOt0hIJUwQEpCdR-/view?usp=sharing
- Comments on progress and experience: I thought this tutorial was super cool! I feel like now I am a lot more familiar with using AudioSources in Unity. The most exciting moment was when I pressed the Play button and saw the waveform move for the first time in response to my voice!
- Started working on my own visualizer in a new project using the ChunityAudioVisualizer starter code
- Brainstormed ideas for sound visualizer design (briefly described them in words and images)
- First-person journey through a tube
- The player (in first-person view) travels down a half-transparent, tube-like structure while the audio input (waveform + spectrum, sourced from the microphone or audio files) is visualized as circles around the player along the tube’s walls.
- Walk ball
- The player (in third-person view) walks inside a sphere whose surface is formed by the visualizations of the audio input.
HW2: Sound Peeking – Milestone 2
Video link: https://youtu.be/aaBuksNDJsI
For this milestone, I:
- Starting with the ChunityAudioVisualizer starter code, implemented a waterfall plot to display the spectrum and waveform over time.
- Functionalities so far:
- The spectrum is at the bottom, the waveform is at the top.
- The Unity program takes a “snapshot” of the current spectrum and waveform once every n frames (currently n = 15).
- It remembers the 512 most recent spectrum/waveform snapshots.
- The waves shift from front to back (along the z-axis) over time. In other words, the most recent spectrum/waveform is at the front, the least recent spectrum/waveform is at the back.
- Also played with color: the spectrums are random shades of blue, while the waveforms are random shades of green.
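The snapshot-every-n-frames behavior and the 512-snapshot history described above amount to a bounded ring buffer. Here is a hypothetical Python stand-in for that Unity C# logic (class and method names, and the z spacing, are illustrative); a bounded deque evicts the oldest snapshot automatically, which mirrors destroying expired prefab instances:

```python
from collections import deque

SNAPSHOT_EVERY_N_FRAMES = 15  # "n" from the write-up
MAX_SNAPSHOTS = 512           # the 512 most recent snapshots


class WaterfallHistory:
    """Keep the most recent spectrum/waveform snapshots for a waterfall plot.

    Index 0 is the most recent snapshot (front of the waterfall); higher
    indices sit further back along the z-axis.
    """

    def __init__(self, every_n=SNAPSHOT_EVERY_N_FRAMES, max_len=MAX_SNAPSHOTS):
        self.every_n = every_n
        self.frame = 0
        self.history = deque(maxlen=max_len)  # oldest entries fall off the back

    def on_frame(self, spectrum, waveform):
        """Call once per rendered frame with the current audio data."""
        if self.frame % self.every_n == 0:
            self.history.appendleft((spectrum, waveform))
        self.frame += 1

    def z_offset(self, index, spacing=0.5):
        """z position for the snapshot at `index` (spacing is illustrative)."""
        return index * spacing
```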
- Comments on my experience working on this milestone:
- I mostly had difficulties familiarizing myself with Unity and C#.
- For example, I spent a lot of time figuring out the right way to convert the given Spectrum and Waveform objects into prefabs, instantiate a batch of new prefab instances, and then destroy them when they expire.
- I had the most fun experimenting with the constant n (the number of frames between snapshots, described above), the amount of position shift along the z-axis, and randomizing the colors!
- Began planning an audio-visual narrative based on the ideas I brainstormed at the end of Milestone 1.
- The first step is to transform the current waves from horizontal lines into circles: the spectrogram should originate from the center of the screen and slowly expand outwards.
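The planned line-to-circle transformation amounts to wrapping the spectrum bins around a ring whose radius grows over time. A hypothetical Python sketch of that mapping (not code from the project; the function name and `gain` parameter are illustrative):

```python
import math


def spectrum_to_circle(spectrum, base_radius, gain=1.0):
    """Map spectrum bins onto a circle of radius `base_radius`.

    Each bin i becomes a point at angle 2*pi*i/len(spectrum), pushed outward
    from the circle by its magnitude. Growing base_radius over time makes the
    ring expand outward from the center of the screen.
    """
    n = len(spectrum)
    points = []
    for i, mag in enumerate(spectrum):
        angle = 2.0 * math.pi * i / n
        r = base_radius + gain * mag
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points
```

A flat (all-zero) spectrum yields a plain circle; louder bins bulge outward from it.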