User:Spencer/220c

My project is to extend the virtual instrument I developed in Music 220b using the Xbox Kinect sensor, which combines a 3D depth map with a conventional camera. The three overarching goals for this project are:

  • utilize higher-level semantic information analyzed from the Kinect data stream (e.g. identifying limbs/torso/head)
  • explore the relationship between performable instrument and installation
  • further explore overdubbed video looping concepts

Title + Program Notes

Signal Past

This piece uses the Kinect gaming interface (a three-dimensional range imaging camera) to construct a visual space out of passing gestures by a lone performer. The performer interacts with these gestures to create sound, augmenting the space with new gestures in the process. Some gestures are transient while some persist, yielding a concurrence of varyingly static and dynamic sound textures.


Cool Videos

Blog

April 17, 2011

OpenNI released official Mac OS binaries for NITE! Skeleton tracking should be easy now, right...? Not quite. Still getting errors when I run the NITE samples. Arghhhhhh.
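
Note to self: the user-tracking flow that the NITE samples are supposed to exercise looks roughly like this in the OpenNI C++ wrapper. A minimal sketch pieced together from the NiUserTracker sample, with all error checking omitted; don't take the details as gospel.

  #include <XnCppWrapper.h>  // OpenNI 1.x C++ wrapper
  #include <cstdio>

  int main() {
      xn::Context context;
      context.Init();

      // NITE supplies the middleware behind the user generator node.
      xn::UserGenerator userGen;
      userGen.Create(context);
      userGen.GetSkeletonCap().SetSkeletonProfile(XN_SKEL_PROFILE_ALL);
      // NB: older NITE builds also require registering pose-detection and
      // calibration callbacks before any user is actually tracked; omitted.

      context.StartGeneratingAll();
      for (int frame = 0; frame < 1000; ++frame) {
          context.WaitAndUpdateAll();

          XnUserID users[8];
          XnUInt16 nUsers = 8;
          userGen.GetUsers(users, nUsers);
          for (XnUInt16 i = 0; i < nUsers; ++i) {
              if (!userGen.GetSkeletonCap().IsTracking(users[i]))
                  continue;  // user seen but not yet calibrated
              XnSkeletonJointPosition head;
              userGen.GetSkeletonCap().GetSkeletonJointPosition(
                  users[i], XN_SKEL_HEAD, head);
              printf("head at (%.0f, %.0f, %.0f) mm, confidence %.2f\n",
                     head.position.X, head.position.Y, head.position.Z,
                     head.fConfidence);
          }
      }
      context.Shutdown();
      return 0;
  }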

April 14, 2011

Through Chris Platz, met with Phillip from the graphics group in the CS department. Things learned:

  • Finger tracking is hard/not practical at this point
  • Point clouds can be used to synchronize/overlay the standard camera and depth camera (see the sketch after this list)
  • In terms of easily available open-source software, OpenNI is basically the only game in town for high-level analysis
  • Kinect games are a good workout
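
On the synchronize/overlay point: if I'm reading the OpenNI docs right, this is the AlternativeViewPoint capability, which shifts the depth map into the RGB camera's viewpoint. A minimal sketch, assuming an already-initialized xn::Context named context:

  xn::DepthGenerator depthGen;
  xn::ImageGenerator imageGen;
  depthGen.Create(context);
  imageGen.Create(context);

  // Shift the depth map into the RGB camera's viewpoint so that pixel
  // (x, y) in the depth map lines up with pixel (x, y) in the image.
  if (depthGen.IsCapabilitySupported(XN_CAPABILITY_ALTERNATIVE_VIEW_POINT)) {
      depthGen.GetAlternativeViewPointCap().SetViewPoint(imageGen);
  }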

April 12, 2011

Awesome idea! Point a Kinect at the audience *and* point one at the performer! The performer can then virtually interact with the audience. Maybe pick them up and stretch them out.

Also, "human equalizer"? (thanks Mike R.!)

April 11, 2011

Discovered OpenNI. It claims to have Mac OS support, but I can't find the NITE package for Mac OS (NITE is the middleware component that does skeleton analysis/body tracking). Also, OpenNI doesn't support the Kinect out of the box; I found this open-source project that lets the two talk to each other: SensorKinect. I can get the basic sample apps running that show video + depth, but that's not really anything groundbreaking at this point. Can't get the skeleton-tracking apps to work.
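
For posterity, the "basic sample" level of functionality boils down to roughly this minimal depth-grab sketch (the real samples configure everything from an XML file, and all error checking is dropped here):

  #include <XnCppWrapper.h>
  #include <cstdio>

  int main() {
      xn::Context context;
      context.Init();  // SensorKinect exposes the Kinect as an OpenNI device

      xn::DepthGenerator depth;
      depth.Create(context);
      context.StartGeneratingAll();

      for (int frame = 0; frame < 100; ++frame) {
          context.WaitOneUpdateAll(depth);
          xn::DepthMetaData dmd;
          depth.GetMetaData(dmd);
          // dmd(x, y) is the raw depth at that pixel, in millimeters.
          XnDepthPixel center = dmd(dmd.XRes() / 2, dmd.YRes() / 2);
          printf("center depth: %u mm\n", (unsigned)center);
      }
      context.Shutdown();
      return 0;
  }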

April 9, 2011

Got Cinder and Kinect talking to each other. This is already a lot faster/less laggy than Processing.
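
The app skeleton follows the Cinder-Kinect block's basic sample, roughly like this (the Kinect wrapper calls are that block's API as I remember it, so treat them as approximate):

  #include "cinder/app/AppBasic.h"
  #include "cinder/gl/gl.h"
  #include "cinder/gl/Texture.h"
  #include "Kinect.h"  // Cinder-Kinect block (libfreenect-based)

  using namespace ci;
  using namespace ci::app;

  class KinectSketchApp : public AppBasic {
    public:
      void setup();
      void update();
      void draw();

      Kinect      mKinect;
      gl::Texture mDepthTexture;
  };

  void KinectSketchApp::setup() {
      mKinect = Kinect(Kinect::Device());  // first attached Kinect
  }

  void KinectSketchApp::update() {
      if (mKinect.checkNewDepthFrame())
          mDepthTexture = mKinect.getDepthImage();
  }

  void KinectSketchApp::draw() {
      gl::clear(Color(0, 0, 0));
      if (mDepthTexture)
          gl::draw(mDepthTexture);
  }

  CINDER_APP_BASIC(KinectSketchApp, RendererGl)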

April 7, 2011

Created this page.

Sketched out the "stretchy string" concept. Users can stretch out jumbo-sized gooey strings and pluck them. Over time these strings harden and start to sound like bar instruments. Soft strings can break in two, while hard strings can shatter.
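
One way to prototype the hardening sound might be STK's StifKarp model (a plucked string with an adjustable stretch/stiffness factor). A rough sketch; the mappings from string length to pitch and from age to stiffness are placeholders I made up:

  #include "StifKarp.h"  // Synthesis ToolKit (STK) stiff-string model
  #include <cstdio>

  using namespace stk;   // STK 4.4+ wraps everything in namespace stk

  int main() {
      Stk::setSampleRate(44100.0);

      float lengthMeters = 1.5f;                     // would come from the Kinect gesture
      StifKarp string(20.0);                         // lowest playable frequency: 20 Hz
      StkFloat freq = 880.0 / (1.0 + lengthMeters);  // made-up pitch mapping

      // Made-up hardening schedule: each second of "age" nudges the model's
      // stretch factor (roughly 0.9 = loose, 1.0 = stiff) toward bar-like.
      for (int ageSec = 0; ageSec < 10; ++ageSec) {
          string.setStretch(0.90 + 0.01 * ageSec);
          string.noteOn(freq, 0.8);  // re-pluck the aging string
          for (int i = 0; i < 44100; ++i) {
              StkFloat sample = string.tick();  // one second of audio
              (void)sample;                     // would go to the DAC here
          }
      }
      return 0;
  }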

April 5, 2011

Started sketching interaction concepts. The one I like most so far is the "racquetball" idea, in which balls bounce back and forth between foreground/background and the user can catch them or something. But really, can I come up with something that doesn't involve balls?