250a Accelerometer Lab

From CCRMA Wiki
<font size=5>Lab 5: Accelerometers, Audio Filters & Graphics</font><br>
See [https://ccrma.stanford.edu/courses/250a/schedule.html this quarter's schedule] for due dates.
  
== Set up for lab ==
  
First, start up your kit: connect your Satellite CCRMA kit to your laptop and boot it.
* Use a terminal window to log in to Satellite CCRMA as usual using the command <code>ssh -XY ccrma@192.168.105.106</code> with the password <code>temppwd</code>.
* Check that your kit has access to the internet through your laptop. You can do this by trying to ping Bing: run the command <code>ping bing.com</code>. If it is successful, it will tell you how long it takes to send a packet from your Satellite CCRMA kit to Bing and back.
  
Now copy the lab files to your kit.
* At the command line in the terminal window, type <code>wget http://ccrma.stanford.edu/courses/250a/labs/lab5.zip</code> to download the lab5 files to your Beagleboard.
* Unzip the file: change into your ~/ directory and type <code>unzip lab5.zip</code>. This should create a folder called lab5/ with all the files you need.
  
Run the command <code>arduino &</code> to start the Arduino IDE.

* Use the Arduino IDE to flash your Arduino with Standard Firmata. (See [https://ccrma.stanford.edu/wiki/Talk:250a_Microcontroller_%26_Sensors_Lab_Pd#Prepare_Arduino this link] if you forget how.)
  
Set up your accelerometer:

* If you haven't soldered your header pins to your accelerometer board, do this now (ask the TA for help if you haven't soldered before). (See [https://ccrma.stanford.edu/wiki/Talk:250a_Microcontroller_%26_Sensors_Lab_Pd#Tilt_control_with_an_Accelerometer_.28Optional.29 this previous lab] for details.)
* Connect 5V and GND to the Vin and G pins, respectively.
* Connect the X, Y & Z outputs to the Arduino analog pins A0, A1 & A2.
  
== Graphing the accelerometer output ==

We will be using GEM (which stands for Graphics Environment for Multimedia) to make a nice visual display of the real-time accelerometer readings. It was originally written by Mark Danks to generate real-time computer graphics, especially for audio-visual compositions. It is possible to do some very fancy 3D graphics using GEM, but here we are just using it to make colorful bar charts.
To start PD with GEM:

* If remotely logged in from a Windows machine: type <code>pd &</code> at the command prompt.
* If remotely logged in from a Mac OS X machine: type <code>LIBGL_ALWAYS_INDIRECT=1 pd &</code> at the command prompt. (This forces pd to transmit OpenGL commands and let your local X11 program do the real drawing; otherwise you will get errors in the pd console window and nothing will come up when you try to create a gemwindow later.)
To get the patch to work properly, add the <code>lab5/audio-filters</code> folder to the PD path:

* Go to File->Path->New, then select the <code>lab5/audio-filters</code> folder.
* Now open <code>lab5/accel.pd</code>.
After starting the patch, you'll need to:

* set the proper serial port to communicate with the Arduino,
* adjust the volume and turn on the input messages, and
* create a GEM window.
Make sure you are able to see the changing accelerometer data both in the table-graphs on the left of the patch and in the separate GEM window.

* Note which physical axes map to the X, Y, and Z inputs from your Arduino board.
* The current readings have an offset. Use [https://ccrma.stanford.edu/wiki/Gesture_Signal_Processing Offsets and Scaling] to make it so that you see no color bars when the acceleration on each axis is zero, and ~+/-1" bars when the acceleration is +/-1g (which equals 9.8 m/s^2).
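As a concrete illustration of the offset-and-scaling step, here is a Python sketch. The numbers are assumptions, not measurements: a 10-bit Arduino ADC referenced to 5 V, reading an ADXL335-style accelerometer that idles near 1.65 V at 0 g with roughly 330 mV per g. Calibrate against your own board's readings.

```python
# Hypothetical numbers for illustration only: a 10-bit Arduino ADC
# (0..1023) referenced to 5 V, reading an ADXL335-style accelerometer
# that sits near 1.65 V at 0 g and moves about 330 mV per g.
ZERO_G_COUNTS = 1.65 / 5.0 * 1023    # raw ADC reading at 0 g (~338)
COUNTS_PER_G = 0.33 / 5.0 * 1023     # change in ADC counts per g (~68)

def counts_to_g(raw):
    """Remove the offset and rescale a raw ADC count to acceleration in g."""
    return (raw - ZERO_G_COUNTS) / COUNTS_PER_G
```

In the patch you would do the same thing with a subtraction and a division (or multiplication) on each axis before drawing the bars.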
Now, experiment a little with the GEM libraries:

* Locate the help link in the upper right-hand corner. Open <code>Help->Browser->GEM->Examples->01.basic->09.teapot</code>.
* As soon as you open the file, a teapot should appear in your GEM window.
* Experiment with rotating the teapot.

<span style="color:green;">Map the accelerometer to some dimension of the teapot's rendering. Describe this in your write-up, and include a screenshot of the GEM window.</span>
== How to recognize a jerk ==

Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.

Since our accelerometer data is discrete in time (i.e. we get one value every some number of milliseconds), we can approximate the derivative by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") Look at the included delta abstraction, which simply returns the difference between subsequent input values.

After taking the difference you can detect when the difference is greater than some threshold.
  
* Still using '''lab5/accel.pd'''.
* In pd you can use 'delta' to find the difference and the object 'threshold' (or 'mapping/threshold' if it doesn't recognize 'threshold') to detect when the difference passes the threshold.
* Have your patch make a sound when the threshold has been surpassed.
* You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.
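In Python terms, the delta-and-threshold scheme amounts to something like this sketch (the threshold value and sample values are arbitrary):

```python
def make_jerk_detector(threshold):
    """Return a function that flags samples whose change from the
    previous sample (the 'delta') exceeds the threshold."""
    prev = None
    def detect(sample):
        nonlocal prev
        if prev is None:           # first sample: nothing to compare to
            prev = sample
            return False
        diff = sample - prev       # the delta step (a one-zero highpass)
        prev = sample
        return abs(diff) > threshold   # the threshold step
    return detect

detect = make_jerk_detector(threshold=0.5)
readings = [0.0, 0.02, 0.05, 0.9, 0.88]   # a sudden jump at the 4th value
hits = [detect(r) for r in readings]      # only the jump trips the detector
```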
  
 
Congratulations, you have now written a jerk detector.

* Modify the jerk detector to account for the fact that our accelerometer outputs 3.3V max.
* <span style="color:green;">Describe what you needed to do to make this work.</span>

Modify the GEM portion of the patch so that you create another set of bars next to the x, y & z axes which reflect the filtered acceleration signal.

* <span style="color:green;">Take a screenshot of your modified patch.</span>
  
 
== Audio Filtering ==

The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being comprised of frequency components. Don't worry, we'll come back to accelerometers later.

Open the pd patch '''lab5/audio-filters/filter-demo'''.
  
 
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:

* No filtering
* High pass filtering
* High pass filtering with a "cascade" of four <tt>hip~</tt> objects
* Low pass filtering
* Low pass filtering with a cascade of four <tt>lop~</tt> objects
* Band pass filtering
  
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.

Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?
  
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single <tt>hip~</tt> object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.
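To see why the cascade has "four times as much" effect: cascading four identical filters raises the magnitude response to the fourth power, which quadruples the attenuation in dB. Here is a quick numerical check using the idealized magnitude response of a one-pole lowpass (roughly the behavior of <tt>lop~</tt>; the frequencies chosen are arbitrary):

```python
import math

def one_pole_lowpass_mag(f, fc):
    """Idealized magnitude response of a one-pole (RC-style) lowpass,
    roughly what Pd's lop~ does, at frequency f with cutoff fc."""
    return 1.0 / math.sqrt(1.0 + (f / fc) ** 2)

fc = 1000.0    # cutoff frequency in Hz
f = 4000.0     # a frequency well above the cutoff

single_db = 20 * math.log10(one_pole_lowpass_mag(f, fc))       # one lop~
cascade_db = 20 * math.log10(one_pole_lowpass_mag(f, fc) ** 4) # four in series
# cascade_db is exactly 4 * single_db: four times the attenuation in dB
```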
  
 
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is because all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high frequency sine waves but has less effect on the volume of low frequency sine waves, etc.

Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.

Finally, play (some of) the oud sample(s) through various filters.
  
 
== Filtering Acceleration Data to Distinguish Tilt from Sudden Motion ==

* Open the '''lab5/guppy.pd''' patch.
* Turn on the input messages in the patch, then turn on audio, and then, holding the accelerometer at a neutral position, hit the calibrate button and wait a few seconds. You may have to do the calibration a few times until the tilt values are in the range of [-1,1].
* Now move the accelerometer around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?

The answer is filtering. Caveat: although we believe filtering is the best way to solve this gesture discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd's filtering tools work only on audio signals, so the '''guppy''' patch (in particular, the '''accel-xover''' subpatch) converts the incoming data into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low frequency component) from sudden movements (which have lots of high frequency components).
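A rough Python model of what a crossover like this does: a one-pole lowpass tracks the slow part (tilt), and subtracting it from the input leaves the fast part (sudden motion). The smoothing coefficient below is illustrative, not the patch's actual 5/20 Hz pair:

```python
def crossover(samples, alpha=0.1):
    """Split a control signal into a slow part (tilt) and a fast part
    (sudden motion) with a one-pole lowpass and its complement."""
    tilt, sudden = [], []
    low = samples[0]               # start the smoother at the first reading
    for x in samples:
        low += alpha * (x - low)   # one-pole lowpass: tracks slow changes
        tilt.append(low)
        sudden.append(x - low)     # what's left over: the fast changes
    return tilt, sudden
```

Feed it a slow ramp and the energy shows up in the tilt output; feed it a one-sample spike and the energy shows up in the sudden-motion output.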
  
 
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books: [http://www.dsprelated.com/dspbooks/filters/ filters] [http://www.dsprelated.com/dspbooks/mdft/ mdft] [http://www.dsprelated.com/dspbooks/pasp/ pasp] [http://www.dsprelated.com/dspbooks/sasp/ sasp].

* Experiment with different cutoff frequencies for the crossover.
* Examine briefly the stillness detector in the lower right corner. <span style="color:green;">How does this work?</span>
  
== Make Some (musically-expressive, gesture-controlled) Music & Graphics! ==

Put it all together. Create an interaction in which sound is controlled by physical gesture in some way that you find interesting, and render feedback graphically. You can begin by conjoining guppy and filter-demo if you like, but you are welcome to use any method for analyzing accelerometer data or creating sound.

Think about the relationship you want to enable between music, vision and sound. Are the qualities of movement reflected in the qualities of the sound? Is this important to you?
  
 
Make sure to use appropriate mappings from measured quantities to sound parameters. For example, if you are controlling the frequency of an oscillator from left/right tilt, you may want to first calculate the angle of tilt from acceleration, and then map logarithmically to frequency.
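This angle-then-logarithmic mapping might look like the sketch below, under the usual assumption that a calibrated, static accelerometer axis reads sin(tilt angle) in g. The 110-880 Hz range is arbitrary:

```python
import math

def tilt_to_freq(accel_g, f_low=110.0, f_high=880.0):
    """Map one calibrated accelerometer axis (in g) to an oscillator
    frequency: recover the tilt angle, then map it logarithmically."""
    a = max(-1.0, min(1.0, accel_g))      # clamp into asin's valid range
    angle = math.asin(a)                  # tilt angle in radians, -pi/2..pi/2
    t = (angle / (math.pi / 2) + 1) / 2   # normalize to 0..1
    return f_low * (f_high / f_low) ** t  # equal tilt steps -> equal ratios
```

With an exponential map like this, level tilt lands on the geometric mean of the range, so equal changes in tilt sound like equal musical intervals.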
  
 
Some possibilities you may want to explore:
* Invent a specific gesture, then figure out how to detect it, and then show some sort of visual feedback to indicate that it was detected.
* Include in your interaction the use of the sliders, buttons, or other Arduino inputs. You will need to figure out how to set up and receive those values from the Arduino.
* Is there a way to get velocity or position from acceleration?
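On that last question, one thing to watch out for: any small constant offset left in the acceleration readings integrates into a velocity estimate that grows without bound ("drift"). A quick numerical illustration, with made-up numbers:

```python
def integrate(samples, dt):
    """Naive rectangle-rule integration of acceleration into velocity."""
    v, out = 0.0, []
    for a in samples:
        v += a * dt
        out.append(v)
    return out

dt = 0.01                    # pretend we get a reading every 10 ms
bias = [0.05] * 200          # a small constant offset, 2 seconds' worth
drift = integrate(bias, dt)  # the velocity estimate just keeps growing
```

This is why getting position by integrating twice is even harder, and why careful calibration and highpass filtering matter if you try it.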
  
Please make a video and post your results to this new nifty [[Music 250a Lab Video Wiki]], and in your lab writeup <span style="color:green;">describe how you approached this (open-ended) design problem and what techniques you used to implement it</span>.
  
'''Optional: Use the pico projector or monitor setup in the MAXLAB to render the graphical feedback. Try projecting onto a surface that isn't flat! The easiest way to use the pico projector would be to do this part of your lab on the kit setup in the MAXLAB. (Or you could shut down your kit, plug its micro SDHC card into the kit setup in the MAXLAB, and copy your work off of there. For this, see the USB Converter Drive.) Just don't hot-swap the monitor with the pico projector.'''
  
(This lab was written by Luke Dahl on 10/13/09, and modified by Wendy Ju on 10/26/11. Huge portions were imported from Michael Gurevich's original accelerometer lab.)

Latest revision as of 10:36, 26 September 2012
