Music-in-Motion

== Introduction ==

''Music in Motion'' (MiM) is an interactive sound-art installation that uses the motion of balloons through a performance space to modulate and move the 3D placement of synthesized sounds in real time. MiM uses a webcam and a Max patch built on the cv.jit computer-vision library to determine the location and color of balloons thrown by participants. The position data is sent from Max to Ableton Live, where it drives the first-order ambisonics panning plugin from Envelop for Live and modulates synth pitches, filter sweeps, and other effects in real time. Each balloon color is tied to a different note and timbre. In an ideal installation, MiM is tuned so that participants are clearly aware of how the motion of their balloons changes the sound in the space: a participant can throw a balloon and hear the perceived sound source of that balloon's instrument move away from them along with the balloon.
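The installation does this detection inside Max with cv.jit, but the same pipeline is easy to illustrate outside it. The sketch below is a rough Python equivalent, not the actual patch: it thresholds one balloon color in HSV space with OpenCV, finds the largest blob, and streams its normalized position and apparent radius over OSC. The HSV thresholds, the OSC address <code>/balloon/red</code>, and port 9000 are all hypothetical placeholders.

<pre>
import cv2
import numpy as np
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical OSC target: a Max (or Max for Live) patch listening on port 9000.
client = SimpleUDPClient("127.0.0.1", 9000)

# Rough HSV range for a red balloon; real thresholds must be tuned per lighting.
RED_LO = np.array([0, 120, 80])
RED_HI = np.array([10, 255, 255])

cap = cv2.VideoCapture(0)  # the webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, RED_LO, RED_HI)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        largest = max(contours, key=cv2.contourArea)
        (x, y), radius = cv2.minEnclosingCircle(largest)
        h, w = frame.shape[:2]
        # Send normalized x/y plus apparent radius (used below to guess depth).
        client.send_message("/balloon/red", [x / w, y / h, float(radius)])
</pre>

In the real installation this loop would run once per color, with one OSC stream per balloon.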

== Motivations ==

With ''Music in Motion'', I wanted to explore how users could physically interact with objects in a 3-dimensional space to influence the spatial placement of sound in that space.

== Changes ==

Over the course of the project, my plans for its scope changed slightly. Initially, I thought I would use two identical webcams in conjunction to detect depth, but this would have been both a programming and a tuning nightmare: small changes in the cameras' relative positioning could easily throw off the depth calculations. Given my relatively limited experience with computer vision techniques, I decided instead to use the apparent size of each balloon in the video feed as a rough proxy for depth (distance from the camera).
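The patch's exact size-to-depth mapping isn't documented here, but under a pinhole-camera model a balloon's apparent radius falls off roughly as 1/distance, so a single reference measurement calibrates the whole estimate. A minimal sketch, with hypothetical reference values:

<pre>
def estimate_depth(radius_px: float,
                   ref_radius_px: float = 80.0,
                   ref_distance_m: float = 1.0) -> float:
    """Estimate distance from the camera using apparent balloon size.

    Under a pinhole-camera model, apparent radius scales as 1/distance,
    so one reference measurement (a balloon at ref_distance_m whose blob
    measures ref_radius_px) calibrates the mapping for all other sizes.
    """
    if radius_px <= 0:
        raise ValueError("radius must be positive")
    return ref_distance_m * ref_radius_px / radius_px

# A balloon whose blob is half the reference size reads as twice as far away:
print(estimate_depth(40.0))  # -> 2.0 (meters)
</pre>

This is only as accurate as the assumption that all balloons are the same physical size, which is why it gives a rough guess rather than a true depth measurement.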

== Software ==

* Max (with the cv.jit computer-vision library)
* Ableton Live (with the Envelop for Live ambisonics plugins)
* Jack (for audio routing between Max and Live)

== Hardware ==

* MacBook Pro (running the software listed above)
* 4-output audio interface (Native Instruments Komplete Audio 6)
* Logitech wide-angle webcam (mounted 6.5 to 8 ft high on a stand)
* 4 loudspeakers on stands, positioned in a roughly 12 ft × 12 ft square (see the panning sketch after this list)
* 5 balloons (green, blue, purple, red, and yellow)
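Envelop for Live handles the actual first-order ambisonic panning; purely as an illustration of what such a pan does with a balloon's azimuth over the four-speaker square above, here is a minimal first-order encode plus a basic quad decode. The speaker angles and the cardioid decoder weighting are my assumptions, not Envelop's implementation:

<pre>
import math

SPEAKER_AZIMUTHS = [45.0, 135.0, 225.0, 315.0]  # square corners, in degrees

def encode_first_order(azimuth_deg: float):
    """Encode a mono source at the given azimuth into 2D B-format (W, X, Y)."""
    a = math.radians(azimuth_deg)
    return (1.0 / math.sqrt(2.0), math.cos(a), math.sin(a))

def decode_to_quad(w: float, x: float, y: float):
    """Basic decode: each speaker takes a cardioid pickup of the sound field,
    i.e. gain = 0.5 * (1 + cos(source_angle - speaker_angle))."""
    gains = []
    for az in SPEAKER_AZIMUTHS:
        p = math.radians(az)
        gains.append(0.5 * (math.sqrt(2.0) * w + x * math.cos(p) + y * math.sin(p)))
    return gains

# A balloon directly in line with the first corner speaker (45 degrees):
print(decode_to_quad(*encode_first_order(45.0)))
# -> approximately [1.0, 0.5, 0.0, 0.5]: full level at the nearest corner,
#    silence at the opposite one, so the image sits at that corner.
</pre>

As the balloon's azimuth sweeps around the square, the gains crossfade smoothly between adjacent speakers, which is what makes the perceived source appear to move with the balloon.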

== Installation at Bing, Final Thoughts ==

Picture from an outside corner of the installation, looking inward. Between the two rightmost speakers are my computer and audio interface on a table, and the webcam mounted on a stand.

== Links ==

* Project Proposal
* Milestone, 5/19
* Binaural Demonstration Video (watch with headphones!)
* Max CV patch and .zip of Ableton project (requires Jack connections and initializing Max values to set up)