FallConcert09 (CCRMA Wiki, revised 2009-11-09 by Lukedahl)
<hr />
<div>'''Visda Goudarzi- Junkmail'''<br />
<br />
<br />
''Tech Needs''<br />
Projection, speakers: prefer stage audio system or similar<br />
<br />
''Samples:''<br />
https://ccrma.stanford.edu/~visda/junkmail2.wav<br />
<br />
''Description:''<br />
Junkmail is an audio-visual piece as a reaction to the fact that "it takes more than 100 million trees to produce the total volume of junk mail that arrives in American mailboxes each year."<br />
<br />
''Bio:''<br />
Visda Goudarzi is a computer musician interested in software development for computer music, human-computer interaction, gesture-based interfaces, computer graphics, and the application of new media in art. She <br />
recently graduated from the MA/MST program at CCRMA. She also has a Master's degree in Computer Science from TU Vienna and plays piano.<br />
<br />
<br />
'''Sweat Shop Boys - Adam Somers and Sean Price'''<br />
<br />
''Tech Needs:'' 2 Speakers/Amp, subwoofer preferred, 2 1/4" outputs, table, power. We<br />
are happy to play in a reverberant space<br />
<br />
''Website:'' http://sweatshopboys.com/<br />
<br />
''Sounds:'' http://sweatshopboys.com/?page=sounds<br />
<br />
''Description:'' Slowly evolving, dark, drone/ambient<br />
<br />
''Bio:'' The Sweat Shop Boys are a drone act formed in 2005 at CalArts by<br />
Adam Somers and Sean Price. Over the years they have refined a<br />
vocabulary for improvised noise and ambient performance using analog<br />
modular synthesizers and custom software. Sean Price currently<br />
resides in Oakland, CA and is attending Mills College in the<br />
Electronic Music MFA program. Adam Somers resides in Palo Alto and is<br />
attending Stanford University in CCRMA's MA/MST program.<br />
<br />
<br />
<br />
'''Hongchan Choi - Mirror II, Fragmenta'''<br />
<br />
* Mirror II (Interactive audiovisual installation)<br />
<br />
''Tech Needs: '' 1x MacBook Pro, 1x big-screen TV or projector, 1x microphone, speakers<br />
(a pair of regular speakers or a hemisphere would be fine)<br />
<br />
''Sample video: '' http://www.youtube.com/watch?v=Z_ppG79jnmc<br />
<br />
''Description: '' Mirror II is an interactive audiovisual installation that mimics a mirror in an uncanny way. Real-time slit-scanning and spectral delay, programmed in Max/MSP/Jitter and JavaScript, create its bizarre ambiances. Users can interact with this new breed of mirror by moving around and singing in front of it.<br />
<br />
* Fragmenta (real-time audiovisual performance OR playback recorded media)<br />
<br />
''Tech Needs: '' just a projector (with D-SUB input and an extension cable perhaps)<br />
<br />
''Sample video: '' http://www.youtube.com/watch?v=Ajz5aF8cbyQ <br />
<br />
''Description: '' "Fragmenta" is an aesthetic and experimental approach to creating audiovisual art with rich inter-media interaction. Built on the notion of an "organic binding" between audio and visual objects, the main goal of this series of experiments is to make audiences perceive the audiovisual scenes as a unified whole. The piece was implemented on two software platforms, ChucK and Processing, interconnected via OSC (Open Sound Control). <br />
<br />
* Bio<br />
<br />
A composer and programmer eager to experiment with artistic mixtures of music and visuals. After an undergraduate degree in information engineering and a master's in computer music, Hongchan is now a doctoral (D.M.A.) candidate preparing a dissertation on visual music and audiovisual art. He has created a variety of multimedia works, such as cross-modal performances and audiovisual installations, and has taken part in numerous concerts and exhibitions in Seoul, Korea. With experience in commercial music production as well as multimedia art, he also has a multi-year career teaching music technology and multimedia programming.<br />
<br />
<br />
<br />
<br />
<br />
'''Jason Sadural - Audio Tunnel playback'''<br />
<br />
''Tech Needs'' Prefers listening room<br />
<br />
''Samples''<br />
<br />
''Description''<br />
<br />
''Bio''<br />
<br />
<br />
<br />
'''Adam Sheppard, Bjorn Erlach, Xiang Zhang - title needed'''<br />
<br />
''Tech Needs'' : Stereo Speaker Set-Up with amplification. <br />
<br />
''Samples''<br />
<br />
''Description'' : Noise, Pop, Dirty Rap<br />
<br />
''Bio'' : Adam, Bjorn, and Xiang are currently students at CCRMA. They are good friends and enjoy making music together.<br />
<br />
<br />
'''Carr Wilkerson - LOLFO'''<br />
<br />
''Tech Needs'' - stereo 1/4 in out, 1/2 table space, 3 power outlets, (no video/graphics planned), it would be nice to have a sub.<br />
<br />
''Samples'' [http://dubstep.fm]<br />
<br />
''Description'' - downtempo ambient dubstep<br />
<br />
''Bio'' Carr Wilkerson is a System Administrator at CCRMA specializing in Linux and Mac OS systems. He is a controller and software-system builder and sometime performer/impresario. He has a BS in Physics from Tulane University, a Master of Arts in Music, Science and Technology from Stanford University, and a Master of Engineering in Electrical Engineering from Tulane. In a previous life, he was a US Navy Nuclear Propulsion Engineer (think Scotty).<br />
<br />
<br />
<br />
<br />
'''Steinunn Arnardottir - Put my hands in your pocket project'''<br />
<br />
''Tech Needs''<br />
Speakers, cables from mixer to speaker (XLR or 1/4" jack outputs), table<br />
<br />
''Samples''<br />
http://mp3.breakbeat.is/breakbeat/leopold/demo/put_my_hands_unfinished_demo.mp3<br />
<br />
''Description''<br />
A DJ set, possibly with some liveness added to it.<br />
<br />
''Bio''<br />
Steinunn Arnardottir received her B.Sc. degree in Electrical and<br />
Computer Engineering from the University of Iceland in 2006 and a M.A.<br />
in Music, Science and Technology from Stanford's Center for Computer<br />
Research in Music and Acoustics (CCRMA) in 2008.<br />
She is currently working toward a M.Sc. degree in Electrical<br />
Engineering at Stanford University and will graduate in Spring 2010.<br />
<br />
<br />
'''Fernando Lopez-Lezcano - title needed'''<br />
<br />
''Tech Needs'' Stage audio system<br />
<br />
''Samples''<br />
<br />
''Description''<br />
<br />
<br />
<br />
'''Cobi Van Tonder - title needed if any'''<br />
<br />
''Tech Needs''<br />
<br />
''Samples''<br />
<br />
''Description''<br />
<br />
<br />
<br />
'''Luke Dahl - The Tom Jonestown Experience'''<br />
<br />
''Tech Needs''<br />
Stereo audio transduction<br />
<br />
''Samples'' <br />
http://www.myspace.com/lukedahl<br />
<br />
''Description'' <br />
Electronic dance-like music performed live for your enjoyment.<br />
<br />
''Bio'' <br />
Luke Dahl is a PhD student at CCRMA whose research interests include physical gestures in new music instruments and musical information retrieval. He also composes and performs electronic dance music.<br />
<br />
<br />
</div>

250a Accelerometer Lab (revised 2009-10-16 by Lukedahl)
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21st at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get Connected and Get Oriented ==<br />
Download https://ccrma.stanford.edu/courses/250a/labs/lab4/lab4.zip<br />
<br />
If you are using Max/MSP and do not have the OSC objects installed, you need to get them from CNMAT. You may find it useful to get their entire suite of max objects from http://cnmat.berkeley.edu/downloads<br />
<br />
iPod Touches, like many newer portable electronic devices, contain a 3-axis accelerometer, which lets designers sense both the orientation of the device with respect to gravity and the physical gestures made with it.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to Max or Pd, where we will process the data and make sound. TouchOSC is installed on the iPods available for use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== Get the iPod talking to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network.<br />
**You may need to log your iPod into CCRMA Guestnet. To do this, open Safari and try to access a new web page. If you can, you are logged in. If you can't, you will be asked to log in.<br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC.<br />
* You are probably in the settings screen, so select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* If instead you see a Star Trek-like interface, press the small 'i' to get to settings, then follow the previous step.<br />
* Set the outgoing port to 8000.<br />
* Press 'TouchOSC' then 'Done' to get to the app. You do not have to wait for the "searching" message to finish.<br />
* Open accel_osc, and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* Use print objects to examine the incoming OSC messages.<br />
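If you are curious what those messages look like on the wire, here is a minimal Python sketch of decoding a simple (non-bundle) OSC packet by hand. It assumes TouchOSC's accelerometer address is '/accxyz' with three float arguments; in the lab itself, the CNMAT objects or Pd's OSC objects do this decoding for you.<br />

```python
import socket
import struct

def parse_osc(packet):
    """Parse a simple (non-bundle) OSC message: address, type tags, float args."""
    def read_padded_string(buf, pos):
        end = buf.index(b"\x00", pos)
        s = buf[pos:end].decode("ascii")
        # OSC strings are null-terminated and padded to a 4-byte boundary
        pos = end + 1
        pos += (-pos) % 4
        return s, pos

    address, pos = read_padded_string(packet, 0)
    tags, pos = read_padded_string(packet, pos)   # e.g. ",fff"
    args = []
    for t in tags.lstrip(","):
        if t == "f":                              # big-endian 32-bit float
            (val,) = struct.unpack(">f", packet[pos:pos + 4])
            args.append(val)
            pos += 4
    return address, args

# To listen on port 8000 (the outgoing port configured in TouchOSC):
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.bind(("", 8000))
# while True:
#     data, _ = sock.recvfrom(1024)
#     print(parse_osc(data))   # e.g. ('/accxyz', [0.01, -0.02, -0.99])
```
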
<br />
=== Get Oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
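One sanity check while you explore: if acceleration is reported in g, a device held still should read a vector of magnitude roughly 1 g no matter how it is tilted. The sketch below illustrates this, plus one way to estimate tilt angles from the gravity vector; the axis convention here (z pointing out of the screen, so a device lying flat reads about (0, 0, -1)) is an assumption for illustration, not necessarily what your iPod reports — that is exactly what you are asked to discover.<br />

```python
import math

def gravity_magnitude(x, y, z):
    """Magnitude of the measured acceleration vector, in g."""
    return math.sqrt(x * x + y * y + z * z)

def tilt_angles(x, y, z):
    """Roll and pitch (radians) estimated from gravity, assuming the device
    is held still and a hypothetical axis convention where flat = (0, 0, -1)."""
    roll = math.atan2(x, -z)
    pitch = math.atan2(y, -z)
    return roll, pitch

# Lying flat and still: magnitude ~1 g, roll and pitch ~0.
```
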
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelerometer data is discrete in time (i.e. we get one value every few milliseconds), we can approximate differentiation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") Look at the included delta abstraction, which simply returns the difference between subsequent input values. <br />
<br />
After taking the difference you can detect when the difference is greater than some threshold.<br />
<br />
*Start with accel_osc. <br />
*In max you can use 'delta' to find the difference, and then 'past' to determine when you've passed the threshold. <br />
*In pd you can use 'delta' to find the difference and the object 'threshold' (or 'mapping/threshold' if it doesn't recognize 'threshold'). <br />
*Have your patch make a sound when the threshold has been surpassed. <br />
*You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
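The delta-plus-threshold idea is easy to prototype outside Max/Pd, too. Here is a sketch (hypothetical function names, not part of the lab patches) that mirrors what delta and the threshold object do:<br />

```python
def make_jerk_detector(threshold):
    """Return a function that, fed successive acceleration samples,
    reports True when |delta| (our jerk estimate) exceeds the threshold."""
    prev = None
    def step(sample):
        nonlocal prev
        if prev is None:
            prev = sample
            return False
        delta = sample - prev   # one-zero highpass: x[n] - x[n-1]
        prev = sample
        return abs(delta) > threshold
    return step

detect = make_jerk_detector(0.5)
# Slow drift never triggers; the sudden 0.12 -> 0.9 jump does.
hits = [detect(r) for r in [0.0, 0.05, 0.1, 0.12, 0.9, 0.92]]
```
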
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being comprised of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering<br />
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
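To see numerically why the cascade has "four times as much" effect, here is a rough sketch of a one-pole lowpass in the spirit of lop~ (the coefficient formula below is a standard approximation, not necessarily Pd's exact implementation) and a crude gain measurement made by sweeping a sine through it:<br />

```python
import math

def one_pole_lowpass(cutoff_hz, sample_rate=44100.0):
    """One-pole lowpass: y[n] = y[n-1] + a*(x[n] - y[n-1])."""
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    y = 0.0
    def step(x):
        nonlocal y
        y += a * (x - y)
        return y
    return step

def gain_at(freq_hz, filters, sample_rate=44100.0, n=44100):
    """Steady-state amplitude of a unit sine after passing through a filter chain."""
    peak = 0.0
    for i in range(n):
        s = math.sin(2.0 * math.pi * freq_hz * i / sample_rate)
        for f in filters:
            s = f(s)
        if i > n // 2:          # skip the startup transient
            peak = max(peak, abs(s))
    return peak

# One filter at 500 Hz attenuates a 5 kHz sine; four in cascade
# attenuate it far more (roughly four times as many dB).
single = gain_at(5000.0, [one_pole_lowpass(500.0)])
cascade = gain_at(5000.0, [one_pole_lowpass(500.0) for _ in range(4)])
```
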
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is because all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high frequency sine waves but has less effect on the volume of low frequency sine waves, etc.<br />
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play (some of) the oud sample(s) through various filters.<br />
<br />
== Filtering Acceleration Data to Distinguish Tilt from Sudden Motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio, and then, holding the iPod in a neutral position, hit the calibrate button and wait a few seconds. You may have to do the calibration a few times until the tilt values are in the range [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answer is filtering. Caveat: although we believe filtering is the best way to solve this gesture discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd/Max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low-frequency component) from sudden movements (which have lots of high-frequency components).<br />
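The crossover idea can be sketched on plain control data, without the audio-signal detour. This is an illustration of the principle, not the accel-xover subpatch: the lowpass output tracks slow tilt, and the residual (input minus lowpass) is a complementary highpass that lights up on sudden motion.<br />

```python
def make_crossover(alpha):
    """Split a control stream into a slow 'tilt' part (one-pole lowpass)
    and a fast 'sudden motion' part (the residual). alpha in (0, 1]
    plays the role of the lowpass cutoff."""
    tilt = 0.0
    def step(x):
        nonlocal tilt
        tilt += alpha * (x - tilt)
        sudden = x - tilt        # complementary highpass
        return tilt, sudden
    return step

# Feed a steady value: it all ends up in 'tilt'.
# Then step abruptly: the jump shows up in 'sudden'.
xover = make_crossover(0.05)
```
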
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books: [http://www.dsprelated.com/dspbooks/filters/ filters] [http://www.dsprelated.com/dspbooks/mdft/ mdft] [http://www.dsprelated.com/dspbooks/pasp/ pasp] [http://www.dsprelated.com/dspbooks/sasp/ sasp].<br />
<br />
*Experiment with different cutoff frequencies for the crossover<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?<br />
<br />
== Make Some (musically-expressive, gesture-controlled) Music! ==<br />
Put it all together. Create an interaction in which sound is controlled by physical gesture in some way that you find interesting. You can begin by conjoining guppy and filter-demo if you like, but you are welcome to use any method for analyzing accelerometer data or creating sound.<br />
<br />
Think about the relationship you want to enable between movement and sound. Are the qualities of movement reflected in the qualities of the sound? Is this important to you?<br />
<br />
Make sure to use appropriate mappings from measured quantities to sound parameters. For example if you are controlling the frequency of an oscillator from left/right tilt, you may want to first calculate the angle of tilt from acceleration, and then map logarithmically to frequency.<br />
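Such a mapping might look like the sketch below. The frequency range and the axis convention are arbitrary choices for illustration, not part of any lab patch: tilt angle is estimated from the gravity vector, then mapped so that equal changes in angle give equal musical intervals.<br />

```python
import math

def tilt_to_frequency(x, z, f_min=110.0, f_max=880.0):
    """Map left/right tilt to oscillator frequency (Hz).
    Assumes a device lying flat reads roughly (x, z) = (0, -1)."""
    angle = math.atan2(x, -z)                                # radians, 0 when flat
    t = min(max((angle + math.pi / 2) / math.pi, 0.0), 1.0)  # normalize to [0, 1]
    return f_min * (f_max / f_min) ** t                      # exponential (log-frequency) map
```

With this map, a flat device lands on the geometric mean of the range, and each equal increment of tilt multiplies the frequency by a constant ratio, which sounds like equal pitch steps.<br />
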
<br />
Some possibilities you may want to explore:<br />
*Invent a specific gesture, and then figure out how to detect it.<br />
*Include in your interaction the use of the sliders, buttons, or multi-touch 2D sliders in TouchOSC. You will need to figure out what OSC messages are being sent. It is also possible to send OSC messages back to the phone and change button states.<br />
*Is there a way to get velocity or position from acceleration?<br />
<br />
Please demo your result to the instructors, and in your lab writeup describe how you approached this (open-ended) design problem and what techniques you used to implement it.<br />
<br />
<br />
(This lab was written by Luke Dahl on 10/13/09. Huge portions were imported from Michael Gurevich's accelerometer lab from 2007 and before.)</div>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21th at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get Connected and Get Oriented ==<br />
Download https://ccrma.stanford.edu/courses/250a/labs/lab4/lab4.zip<br />
<br />
If you are using Max/MSP and do not have the OSC objects installed, you need to get them from CNMAT. You may find it useful to get their entire suite of max objects from http://cnmat.berkeley.edu/downloads<br />
<br />
iPod Touches, like many newer portable electronic devices, have a 3-axis accelerometer in them, which allows designers to take into account both orientation of the device with respect to gravity as well as detecting physical gestures that are made with the phone.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to max or pd, where we will process the data and make sound. TouchOSC is installed on the iPods available to use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== Get the iPod talking to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network.<br />
**You may need to log your iPod into CCRMA Guestnet. To do this, open safari and try to access a new web page. If you can, you are logged in. If you can't you will be asked to login.<br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC.<br />
* You are probably in the settings screen, so select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* If instead you see a Star Trek-like interface, press the small 'i' to get to settings, then follow the previous step.<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc, and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* use printing to examine the incoming OSC messages.<br />
<br />
=== Get Oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelerometer data is discrete in time (i.e. we get one value every some number of milliseconds), we can approximate derivation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") Look at the included delta abstraction, which simply returns the difference between subsequent input values. <br />
<br />
After taking the difference you can detect when the difference is greater than some threshold.<br />
<br />
*Start with accel_osc. <br />
*In max you can use 'delta' to find the difference, and then 'past' to determine when you've passed the threshold. <br />
*In pd you can use 'delta' to find the difference and the object 'threshold' (or 'mapping/threshold' if it doesn't recognize 'threshold'). <br />
*Have your patch make a sound when the threshold has been surpassed. <br />
*You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being comprised of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering<br />
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is because all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high frequency sine waves but has less effect on the volume of low frequency sine waves, etc.<br />
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play (some of) the oud sample(s) through various filters.<br />
<br />
== Filtering Acceleration Data to Distinguish Tilt from Sudden Motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio, and then holding the iPod at a neutral position hit the calibrate button and wait a few seconds. You may have to do the callibration a few times until the tilt values are in the range of [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answering is with filtering. Caveat: although we believe filtering is the best way to solve this gesture discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd/max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low frequency component) from sudden movements (which have lots of high frequency components).<br />
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books: [http://www.dsprelated.com/dspbooks/filters/ filters] [http://www.dsprelated.com/dspbooks/mdft/ mdft] [http://www.dsprelated.com/dspbooks/pasp/ pasp] [http://www.dsprelated.com/dspbooks/sasp/ sasp].<br />
<br />
*Experiment with different cutoff frequencies for the crossover<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?<br />
<br />
== Make Some (musically-expressive, gesture-controlled) Music! ==<br />
Put it all together. Create an interaction in which sound is controlled by physical gesture in some way that you find interesting. You can begin by conjoining guppy and filter-demo if you like, but you are welcome to use any method for analyzing accelerometer data or creating sound.<br />
<br />
Think about the relationship you want to enable between music and sound. Are the qualities of movement reflected in the qualities of the sound? Is this important to you?<br />
<br />
Make sure to use appropriate mappings from measured quantities to sound parameters. For example if you are controlling the frequency of an oscillator from left/right tilt, you may want to first calculate the angle of tilt from acceleration, and then map logarithmically to frequency.<br />
<br />
Some possibilities you may want to explore:<br />
*Invent a specific gesture, and then figure out how to detect it.<br />
*Include in your interaction the use of the sliders, buttons, or multi-touch 2D sliders in TouchOSC. You will need to figure out what OSC messages are being sent. It is also possible to send OSC messages back to the phone and change button states.<br />
*Is there a way to get velocity or position from acceleration?<br />
<br />
Please demo your result to the instructors, and in your lab writeup describe how you approached this (open-ended) design problem and what techniques you used to implement it.<br />
<br />
<br />
(This lab was written by Luke Dahl on 10/13/09. Huge portions were imported from Michael Gurevich's accelerometer lab from 2007 and before.)</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Accelerometer_Lab&diff=8994250a Accelerometer Lab2009-10-15T00:59:37Z<p>Lukedahl: /* Get the iPod talking to your computer via Open Sound Control */</p>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21th at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get Connected and Get Oriented ==<br />
Download https://ccrma.stanford.edu/courses/250a/labs/lab4/lab4.zip<br />
<br />
If you are using Max/MSP and do not have the OSC objects installed, you need to get them from CNMAT. You may find it useful to get their entire suite of max objects from http://cnmat.berkeley.edu/downloads<br />
<br />
iPod Touches, like many newer portable electronic devices, have a 3-axis accelerometer in them, which allows designers to take into account both orientation of the device with respect to gravity as well as detecting physical gestures that are made with the phone.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to max or pd, where we will process the data and make sound. TouchOSC is installed on the iPods available to use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== Get the iPod talking to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network.<br />
**You may need to log your iPod into CCRMA Guestnet. To do this, open safari and try to access a new web page. If you can, you are logged in. If you can't you will be asked to login.<br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC.<br />
* You are probably in the settings screen, so Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* If instead you see a Star Trek-like interface, press the small 'i' to get to settings.<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc, and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* use printing to examine the incoming OSC messages.<br />
<br />
=== Get Oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelerometer data is discrete in time (i.e. we get one value every few milliseconds), we can approximate differentiation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") Look at the included delta abstraction, which simply returns the difference between successive input values.<br />
<br />
After taking the difference you can detect when the difference is greater than some threshold.<br />
<br />
*Start with accel_osc. <br />
*In max you can use 'delta' to find the difference, and then 'past' to determine when you've passed the threshold. <br />
*In pd you can use 'delta' to find the difference and the object 'threshold' (or 'mapping/threshold' if it doesn't recognize 'threshold'). <br />
*Have your patch make a sound when the threshold has been surpassed. <br />
*You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
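The delta-then-threshold recipe above can also be sketched in ordinary code. This is a hypothetical Python version of the idea, not the Pd/Max patch itself, and the threshold value is a made-up number you would tune by experiment:<br />

```python
def jerk_events(samples, threshold=0.5):
    """Naive jerk detector: flag samples where the first difference
    (the 'delta' step) of one accelerometer axis exceeds a threshold.

    samples: acceleration values, one per control tick.
    threshold: illustrative tuning value; pick it by experiment."""
    events = []
    prev = None
    for i, a in enumerate(samples):
        if prev is not None and abs(a - prev) > threshold:
            events.append(i)  # here the patch would trigger a sound
        prev = a
    return events

# Slow, gradual tilt produces small differences; a sudden jerk at
# index 4 produces a big one (in both directions):
print(jerk_events([0.0, 0.1, 0.2, 0.3, 1.5, 0.3]))  # -> [4, 5]
```

Note that a single physical jerk fires twice (once on the way up, once on the way down); handling that is part of tuning your detector.<br />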
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being comprised of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering<br />
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
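To see why the cascade has "four times as much" effect: filters in series multiply their linear gains, and multiplying linear gains adds their decibel values, so four identical stages give four times the dB attenuation at every frequency. A quick numerical check, assuming ideal identical stages and an illustrative half-amplitude gain:<br />

```python
import math

def db(gain):
    """Convert a linear amplitude gain to decibels."""
    return 20.0 * math.log10(gain)

# Suppose one lowpass stage passes some frequency at half amplitude.
single = 0.5
cascade = single ** 4   # four identical stages in series multiply their gains

print(round(db(single), 2))   # -> -6.02  (one stage)
print(round(db(cascade), 2))  # -> -24.08 (four times the attenuation)
```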
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is that all of these filters are "linear and time-invariant.") This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high frequency sine waves but has less effect on the volume of low frequency sine waves, etc.<br />
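You can check the "nothing added" claim numerically: run a sine through a simple one-pole lowpass (a rough stand-in for lop~, not its exact implementation) and the steady-state output is still a sine, with exactly the gain that the filter's frequency response predicts:<br />

```python
import math, cmath

def onepole_lowpass(x, alpha):
    """One-pole lowpass: y[n] = y[n-1] + alpha * (x[n] - y[n-1])."""
    y, state = [], 0.0
    for v in x:
        state += alpha * (v - state)
        y.append(state)
    return y

# Filter a sine wave and measure the steady-state output amplitude.
sr, freq, alpha = 1000, 50.0, 0.2
x = [math.sin(2 * math.pi * freq * n / sr) for n in range(5000)]
y = onepole_lowpass(x, alpha)
measured = max(y[4000:])                 # amplitude after the transient dies

# Frequency response of this filter: H(z) = alpha / (1 - (1-alpha) z^-1)
w = 2 * math.pi * freq / sr
predicted = abs(alpha / (1 - (1 - alpha) * cmath.exp(-1j * w)))
print(abs(measured - predicted) < 0.02)  # -> True: gain changed, nothing added
```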
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play (some of) the oud sample(s) through various filters.<br />
<br />
== Filtering Acceleration Data to Distinguish Tilt from Sudden Motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio, and then, holding the iPod in a neutral position, hit the calibrate button and wait a few seconds. You may have to do the calibration a few times until the tilt values are in the range of [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answer is filtering. Caveat: although we believe filtering is the best way to solve this gesture discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd/Max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to separate tilt (the low frequency component) from sudden movements (which have lots of high frequency components).<br />
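The lowpass half of that idea can be sketched as a one-pole smoother: the smoothed signal tracks tilt, and whatever the smoother can't follow is the sudden motion. This is an illustrative Python sketch of the concept, not the accel-xover subpatch; the smoothing coefficient here is a made-up stand-in for the patch's 5/20 Hz cutoffs:<br />

```python
def crossover(samples, alpha=0.1):
    """Split one axis of control data into a slow 'tilt' part and a
    fast 'sudden motion' part using a one-pole lowpass.

    alpha: illustrative smoothing coefficient (small = slower tilt)."""
    tilt, sudden = [], []
    low = samples[0]
    for a in samples:
        low += alpha * (a - low)   # one-pole lowpass: the slow component
        tilt.append(low)
        sudden.append(a - low)     # the residual is the fast component
    return tilt, sudden

# A steadily held tilt with one abrupt spike: the spike shows up
# almost entirely in 'sudden', while 'tilt' barely moves.
tilt, sudden = crossover([1.0, 1.0, 1.0, 5.0, 1.0, 1.0])
```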
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books: [http://www.dsprelated.com/dspbooks/filters/ filters] [http://www.dsprelated.com/dspbooks/mdft/ mdft] [http://www.dsprelated.com/dspbooks/pasp/ pasp] [http://www.dsprelated.com/dspbooks/sasp/ sasp].<br />
<br />
*Experiment with different cutoff frequencies for the crossover.<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?<br />
<br />
== Make Some (musically-expressive, gesture-controlled) Music! ==<br />
Put it all together. Create an interaction in which sound is controlled by physical gesture in some way that you find interesting. You can begin by conjoining guppy and filter-demo if you like, but you are welcome to use any method for analyzing accelerometer data or creating sound.<br />
<br />
Think about the relationship you want to enable between movement and sound. Are the qualities of movement reflected in the qualities of the sound? Is this important to you?<br />
<br />
Make sure to use appropriate mappings from measured quantities to sound parameters. For example, if you are controlling the frequency of an oscillator from left/right tilt, you may want to first calculate the angle of tilt from the acceleration, and then map it logarithmically to frequency.<br />
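For instance, assuming the readings are in units of g, the x-axis value of a device at rest is the sine of the left/right tilt angle. One plausible mapping, sketched in Python with made-up endpoint frequencies, is:<br />

```python
import math

def tilt_to_freq(ax, f_lo=220.0, f_hi=880.0):
    """Map left/right tilt to pitch.

    ax: x-axis acceleration in g (at rest, ax = sin(tilt angle)).
    f_lo, f_hi: hypothetical endpoint frequencies in Hz."""
    ax = max(-1.0, min(1.0, ax))          # clamp sensor noise outside [-1, 1]
    angle = math.asin(ax)                  # tilt angle in radians
    t = (angle + math.pi / 2) / math.pi    # normalize angle to [0, 1]
    return f_lo * (f_hi / f_lo) ** t       # logarithmic (equal-pitch) mapping

print(round(tilt_to_freq(0.0)))  # -> 440: level device sits mid-range in pitch
```

A logarithmic mapping like this makes equal tilt increments sound like equal pitch intervals, which usually feels more natural than mapping tilt linearly to Hz.<br />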
<br />
Some possibilities you may want to explore:<br />
*Invent a specific gesture, and then figure out how to detect it.<br />
*Include in your interaction the use of the sliders, buttons, or multi-touch 2D sliders in TouchOSC. You will need to figure out what OSC messages are being sent. It is also possible to send OSC messages back to the phone and change button states.<br />
*Is there a way to get velocity or position from acceleration?<br />
<br />
Please demo your result to the instructors, and in your lab writeup describe how you approached this (open-ended) design problem and what techniques you used to implement it.<br />
<br />
<br />
(This lab was written by Luke Dahl on 10/13/09. Huge portions were imported from Michael Gurevich's accelerometer lab from 2007 and before.)</div>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21th at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get Connected and Get Oriented ==<br />
Download https://ccrma.stanford.edu/courses/250a/labs/lab4/lab4.zip<br />
<br />
If you are using Max/MSP and do not have the OSC objects installed, you need to get them from CNMAT. You may find it useful to get their entire suite of max objects from http://cnmat.berkeley.edu/downloads<br />
<br />
iPod Touches, like many newer portable electronic devices, have a 3-axis accelerometer in them, which allows designers to take into account both orientation of the device with respect to gravity as well as detecting physical gestures that are made with the phone.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to max or pd, where we will process the data and make sound. TouchOSC is installed on the iPods available to use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== Get the iPod talking to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network.<br />
**You may need to log your iPod into CCRMA Guestnet. To do this, open safari and try to access a new web page. If you can, you are logged in. If you can't you will be asked to login.<br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc, and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* use printing to examine the incoming OSC messages.<br />
<br />
=== Get Oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelerometer data is discrete in time (i.e. we get one value every some number of milliseconds), we can approximate the derivative by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") Look at the included delta abstraction, which simply returns the difference between successive input values.<br />
<br />
After taking the difference you can detect when the difference is greater than some threshold.<br />
<br />
*Start with accel_osc. <br />
*In Max you can use 'delta' to find the difference, and then 'past' to determine when you've passed the threshold. <br />
*In Pd, the object 'threshold' (or 'mapping/threshold' if Pd doesn't recognize 'threshold') takes the difference and thresholds it. <br />
*Have your patch make a sound when the threshold has been surpassed. <br />
*You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
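The same delta-then-threshold idea, sketched outside of Pd/Max (the threshold value and the list-of-samples format are illustrative):<br />

```python
def jerk_events(accel_samples, threshold=0.5):
    """Flag sample indices where the first difference of acceleration
    (our approximation of jerk) exceeds a threshold."""
    events = []
    prev = accel_samples[0]
    for i, a in enumerate(accel_samples[1:], start=1):
        delta = a - prev          # one-zero highpass: x[n] - x[n-1]
        if abs(delta) > threshold:
            events.append(i)
        prev = a
    return events
```

A slow drift produces small differences and no events; a sudden jump in acceleration trips the threshold immediately.<br />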
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being comprised of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the Pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of six possible filters:<br />
* No filtering<br />
* High pass filtering<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering<br />
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
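If you want to see the recursion behind these objects, here is a one-pole lowpass in the spirit of lop~, with a highpass built as input-minus-lowpass. (The coefficient formula is a common approximation, not necessarily lop~'s exact internals.)<br />

```python
import math

def one_pole_lowpass(samples, cutoff_hz, sample_rate=44100.0):
    """One-pole lowpass: y[n] = y[n-1] + a * (x[n] - y[n-1])."""
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    y = 0.0
    out = []
    for x in samples:
        y += a * (x - y)
        out.append(y)
    return out

def one_pole_highpass(samples, cutoff_hz, sample_rate=44100.0):
    """Highpass as the input minus its lowpassed version."""
    low = one_pole_lowpass(samples, cutoff_hz, sample_rate)
    return [x - l for x, l in zip(samples, low)]
```

Applying the lowpass four times in a row mimics the cascade of four lop~ objects, giving a steeper rolloff at the same cutoff.<br />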
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other five filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high-pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low-pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is that all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high-frequency sine waves but has less effect on the volume of low-frequency sine waves, etc.<br />
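That "filters don't add anything" claim is exactly linearity, and you can check superposition numerically: filtering the sum of two sines gives the same result as filtering each sine and then summing. (This sketch uses a generic one-pole lowpass, not the patch's exact filters.)<br />

```python
import math

def lowpass(samples, a=0.1):
    """Any one-pole recursion like this is linear and time-invariant."""
    y, out = 0.0, []
    for x in samples:
        y += a * (x - y)
        out.append(y)
    return out

n = 256
s1 = [math.sin(2 * math.pi * 5 * i / n) for i in range(n)]    # low-frequency sine
s2 = [math.sin(2 * math.pi * 40 * i / n) for i in range(n)]   # higher-frequency sine

mixed = lowpass([a + b for a, b in zip(s1, s2)])
separate = [a + b for a, b in zip(lowpass(s1), lowpass(s2))]

# Superposition: filtering the sum equals summing the filtered parts
# (up to floating-point roundoff).
max_err = max(abs(m - s) for m, s in zip(mixed, separate))
```

Because of this property, the only thing a linear filter can do to each frequency component is scale it (and shift its phase); it can never create a frequency that wasn't already in the input.<br />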
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play (some of) the oud sample(s) through various filters.<br />
<br />
== Filtering Acceleration Data to Distinguish Tilt from Sudden Motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio. Then, holding the iPod in a neutral position, hit the calibrate button and wait a few seconds. You may have to do the calibration a few times until the tilt values are in the range [-1, 1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answer is filtering. Caveat: although we believe filtering is the best way to solve this gesture-discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd/Max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hz, respectively) to differentiate tilt (the low-frequency component) from sudden movements (which have lots of high-frequency components).<br />
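The crossover idea can be sketched at control rate: lowpass the signal to get the slow "tilt" part, and subtract that from the input to get the fast "sudden motion" part. (The guppy patch uses separate 5 Hz and 20 Hz cutoffs on audio-rate signals; a single shared cutoff keeps this sketch simple.)<br />

```python
import math

def crossover(samples, cutoff_hz, rate_hz):
    """Split a control signal into a slow (tilt-like) part via a one-pole
    lowpass, and a fast (sudden-motion-like) part as input minus lowpass."""
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / rate_hz)
    y = 0.0
    slow, fast = [], []
    for x in samples:
        y += a * (x - y)
        slow.append(y)       # low-frequency component: gradual tilt
        fast.append(x - y)   # high-frequency component: sudden motion
    return slow, fast
```

Feed it a step (as if you suddenly tilted the device): the fast channel spikes at the step and then decays toward zero, while the slow channel settles to the new sustained value.<br />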
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books: [http://www.dsprelated.com/dspbooks/filters/ filters] [http://www.dsprelated.com/dspbooks/mdft/ mdft] [http://www.dsprelated.com/dspbooks/pasp/ pasp] [http://www.dsprelated.com/dspbooks/sasp/ sasp].<br />
<br />
*Experiment with different cutoff frequencies for the crossover.<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?<br />
<br />
== Make Some (musically-expressive, gesture-controlled) Music! ==<br />
Put it all together. Create an interaction in which sound is controlled by physical gesture in some way that you find interesting. You can begin by conjoining guppy and filter-demo if you like, but you are welcome to use any method for analyzing accelerometer data or creating sound.<br />
<br />
Think about the relationship you want to enable between movement and sound. Are the qualities of movement reflected in the qualities of the sound? Is this important to you?<br />
<br />
Make sure to use appropriate mappings from measured quantities to sound parameters. For example, if you are controlling the frequency of an oscillator from left/right tilt, you may want to first calculate the angle of tilt from the acceleration, and then map it logarithmically to frequency.<br />
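For instance, a tilt-to-pitch mapping might first recover an angle with atan2 and then interpolate exponentially between two frequencies, so that equal changes in angle give equal musical intervals. (The axis choice and the 220-880 Hz range here are illustrative assumptions, not part of the lab patch.)<br />

```python
import math

def tilt_to_frequency(a_x, a_z, f_min=220.0, f_max=880.0):
    """Map left/right tilt to pitch: recover a tilt angle from the gravity
    components, then map it logarithmically across [f_min, f_max]."""
    angle = math.atan2(a_x, a_z)           # radians, in [-pi, pi]
    t = (angle / math.pi + 1.0) / 2.0      # normalize to [0, 1]
    return f_min * (f_max / f_min) ** t    # exponential (logarithmic-pitch) mapping
```

With a linear mapping instead, most of the tilt range would be spent in the upper octave; the exponential form spreads the pitches evenly across the range.<br />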
<br />
Some possibilities you may want to explore:<br />
*Invent a specific gesture, and then figure out how to detect it.<br />
*Include in your interaction the use of the sliders, buttons, or multi-touch 2D sliders in TouchOSC. You will need to figure out what OSC messages are being sent.<br />
*Is there a way to get velocity or position from acceleration?<br />
<br />
Please demo your result to the instructors, and in your lab writeup describe how you approached this (open-ended) design problem and what techniques you used to implement it.<br />
<br />
<br />
(This lab was written by Luke Dahl on 10/13/09. Huge portions were imported from Michael Gurevich's accelerometer lab from 2007 and before.)</div>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21th at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get Connected and Get Oriented ==<br />
Download https://ccrma.stanford.edu/courses/250a/labs/lab4/lab4.zip<br />
<br />
iPod Touches, like many newer portable electronic devices, have a 3-axis accelerometer in them, which allows designers to take into account both orientation of the device with respect to gravity as well as detecting physical gestures that are made with the phone.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to max or pd, where we will process the data and make sound. TouchOSC is installed on the iPods available to use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== Get the iPod talking to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network.<br />
**You may need to log your iPod into CCRMA Guestnet. To do this, open safari and try to access a new web page. If you can, you are logged in. If you can't you will be asked to login.<br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc, and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* use printing to examine the incoming OSC messages.<br />
<br />
=== Get Oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelerometer data is discrete in time (i.e. we get one value every some number of milliseconds), we can approximate derivation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") Look at the included delta abstraction, which simply returns the difference between subsequent input values. <br />
<br />
After taking the difference you can detect when the difference is greater than some threshold.<br />
<br />
*Start with accel+osc. <br />
*In max you can use 'delta' to find the difference, and then 'past' to determine when you've passed the threshold. <br />
*In pd the object 'threshold' (or 'mapping/threshold' if it doesn't recognize 'threshold') takes the difference and thresholds. <br />
*Have your patch make a sound when the threshold has been surpassed. <br />
*You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being comprised of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering<br />
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is because all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high frequency sine waves but has less effect on the volume of low frequency sine waves, etc.<br />
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play (some of) the oud sample(s) through various filters.<br />
<br />
== Filtering Acceleration Data to Distinguish Tilt from Sudden Motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio, and then holding the iPod at a neutral position hit the calibrate button and wait a few seconds. You may have to do the callibration a few times until the tilt values are in the range of [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answering is with filtering. Caveat: although we believe filtering is the best way to solve this gesture discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd/max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low frequency component) from sudden movements (which have lots of high frequency components).<br />
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books: [http://www.dsprelated.com/dspbooks/filters/ filters] [http://www.dsprelated.com/dspbooks/mdft/ mdft] [http://www.dsprelated.com/dspbooks/pasp/ pasp] [http://www.dsprelated.com/dspbooks/sasp/ sasp].<br />
<br />
*Experiment with different cutoff frequencies for the crossover<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?<br />
<br />
== Make Some (musically-expressive, gesture-controlled) Music! ==<br />
Put it all together. Create an interaction in which sound is controlled by physical gesture in some way that you find interesting. You can begin by conjoining guppy and filter-demo if you like, but you are welcome to use any method for analyzing accelerometer data or creating sound.<br />
<br />
Think about the relationship you want to enable between music and sound. Are the qualities of movement reflected in the qualities of the sound? Is this important to you?<br />
<br />
Make sure to use appropriate mappings from measured quantities to sound parameters. For example if you are controlling the frequency of an oscillator from left/right tilt, you may want to first calculate the angle of tilt from acceleration, and then map logarithmically to frequency.<br />
<br />
Some possibilities you may want to explore:<br />
*Invent a specific gesture, and then figure out how to detect it.<br />
*Include in your interaction the use of the sliders, buttons, or multi-touch 2D sliders in TouchOSC. You will need to figure out what OSC messages are being sent.<br />
*Is there a way to get velocity or position from acceleration?<br />
<br />
Please demo your result to the instructors, and in your lab writeup describe how you approached this (open-ended) design problem and what techniques you used to implement it.<br />
<br />
<br />
(This lab was written by Luke Dahl on 10/13/09. Huge portions were imported from Michael Gurevich's accelerometer lab from 2007 and before.)</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Accelerometer_Lab&diff=8985250a Accelerometer Lab2009-10-14T19:56:33Z<p>Lukedahl: /* Audio Filtering */</p>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21th at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get Connected and Get Oriented ==<br />
iPod Touches, like many newer portable electronic devices, have a 3-axis accelerometer in them, which allows designers to take into account both orientation of the device with respect to gravity as well as detecting physical gestures that are made with the phone.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to max or pd, where we will process the data and make sound. TouchOSC is installed on the iPods available to use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== Get the iPod talking to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network.<br />
**You may need to log your iPod into CCRMA Guestnet. To do this, open safari and try to access a new web page. If you can, you are logged in. If you can't you will be asked to login.<br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc, and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* use printing to examine the incoming OSC messages.<br />
<br />
=== Get Oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelerometer data is discrete in time (i.e. we get one value every some number of milliseconds), we can approximate derivation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") Look at the included delta abstraction, which simply returns the difference between subsequent input values. <br />
<br />
After taking the difference you can detect when the difference is greater than some threshold.<br />
<br />
*Start with accel+osc. <br />
*In max you can use 'delta' to find the difference, and then 'past' to determine when you've passed the threshold. <br />
*In pd the object 'threshold' (or 'mapping/threshold' if it doesn't recognize 'threshold') takes the difference and thresholds. <br />
*Have your patch make a sound when the threshold has been surpassed. <br />
*You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being comprised of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering<br />
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is because all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high frequency sine waves but has less effect on the volume of low frequency sine waves, etc.<br />
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play (some of) the oud sample(s) through various filters.<br />
<br />
== Filtering Acceleration Data to Distinguish Tilt from Sudden Motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio, and then holding the iPod at a neutral position hit the calibrate button and wait a few seconds. You may have to do the callibration a few times until the tilt values are in the range of [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answering is with filtering. Caveat: although we believe filtering is the best way to solve this gesture discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd/max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low frequency component) from sudden movements (which have lots of high frequency components).<br />
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books: [http://www.dsprelated.com/dspbooks/filters/ filters] [http://www.dsprelated.com/dspbooks/mdft/ mdft] [http://www.dsprelated.com/dspbooks/pasp/ pasp] [http://www.dsprelated.com/dspbooks/sasp/ sasp].<br />
<br />
*Experiment with different cutoff frequencies for the crossover<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?<br />
<br />
== Make Some (musically-expressive, gesture-controlled) Music! ==<br />
Put it all together. Create an interaction in which sound is controlled by physical gesture in some way that you find interesting. You can begin by conjoining guppy and filter-demo if you like, but you are welcome to use any method for analyzing accelerometer data or creating sound.<br />
<br />
Think about the relationship you want to enable between music and sound. Are the qualities of movement reflected in the qualities of the sound? Is this important to you?<br />
<br />
Make sure to use appropriate mappings from measured quantities to sound parameters. For example if you are controlling the frequency of an oscillator from left/right tilt, you may want to first calculate the angle of tilt from acceleration, and then map logarithmically to frequency.<br />
<br />
Some possibilities you may want to explore:<br />
*Invent a specific gesture, and then figure out how to detect it.<br />
*Include in your interaction the use of the sliders, buttons, or multi-touch 2D sliders in TouchOSC. You will need to figure out what OSC messages are being sent.<br />
*Is there a way to get velocity or position from acceleration?<br />
<br />
Please demo your result to the instructors, and in your lab writeup describe how you approached this (open-ended) design problem and what techniques you used to implement it.<br />
<br />
<br />
(This lab was written by Luke Dahl on 10/13/09. Huge portions were imported from Michael Gurevich's accelerometer lab from 2007 and before.)</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Accelerometer_Lab&diff=8984250a Accelerometer Lab2009-10-14T19:44:02Z<p>Lukedahl: /* Audio Filtering */</p>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21th at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get Connected and Get Oriented ==<br />
iPod Touches, like many newer portable electronic devices, have a 3-axis accelerometer in them, which allows designers to take into account both orientation of the device with respect to gravity as well as detecting physical gestures that are made with the phone.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to max or pd, where we will process the data and make sound. TouchOSC is installed on the iPods available to use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== Get the iPod talking to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network.<br />
**You may need to log your iPod into CCRMA Guestnet. To do this, open safari and try to access a new web page. If you can, you are logged in. If you can't you will be asked to login.<br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc, and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* use printing to examine the incoming OSC messages.<br />
<br />
=== Get Oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelerometer data is discrete in time (i.e. we get one value every some number of milliseconds), we can approximate derivation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") Look at the included delta abstraction, which simply returns the difference between subsequent input values. <br />
<br />
After taking the difference you can detect when the difference is greater than some threshold.<br />
<br />
*Start with accel_osc. <br />
*In Max you can use 'delta' to find the difference, and then 'past' to determine when you've passed the threshold. <br />
*In Pd the object 'threshold' (or 'mapping/threshold' if Pd doesn't recognize 'threshold') thresholds the difference. <br />
*Have your patch make a sound when the threshold has been surpassed. <br />
*You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
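The same delta-and-threshold logic can be sketched in a few lines of Python (the 0.5 threshold is an arbitrary placeholder; you'll want to tune yours by watching real accelerometer values):<br />

```python
class JerkDetector:
    """Difference successive accelerometer values and flag big jumps."""

    def __init__(self, threshold=0.5):
        self.threshold = threshold
        self.prev = None

    def update(self, accel):
        """Feed one acceleration sample; return True if a jerk occurred."""
        if self.prev is None:          # first sample: nothing to compare to
            self.prev = accel
            return False
        delta = accel - self.prev      # one-zero highpass: x[n] - x[n-1]
        self.prev = accel
        return abs(delta) > self.threshold

det = JerkDetector(threshold=0.5)
readings = [0.0, 0.02, 0.05, 0.9, 0.95]   # a sudden jump at the 4th sample
hits = [det.update(a) for a in readings]
```

Note that the detector fires only on the sample where the jump happens, not while the value stays high afterwards, which is exactly the "sudden change" behavior we want.<br />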
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being comprised of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the Pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering<br />
* Band pass filtering with a cascade of band pass filters<br />
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
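Why does the cascade have "four times as much" effect? Cascaded filters multiply their magnitude responses, and decibels are logarithmic, so four identical stages give exactly four times the attenuation in dB. Here's a quick numerical check, using a generic one-pole smoother (the coefficient formula is illustrative and not necessarily Pd's exact lop~ recipe):<br />

```python
import cmath
import math

def one_pole_lowpass_gain(freq_hz, cutoff_hz, sample_rate=44100.0):
    """Magnitude response of y[n] = y[n-1] + a*(x[n] - y[n-1]) at freq_hz.
    (A generic one-pole smoother; not necessarily Pd's exact lop~ recipe.)"""
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    w = 2.0 * math.pi * freq_hz / sample_rate
    # H(z) = a / (1 - (1 - a) z^-1), evaluated on the unit circle
    return abs(a / (1.0 - (1.0 - a) * cmath.exp(-1j * w)))

# Attenuation at 5 kHz for a 500 Hz cutoff, single stage vs. four in series
single = one_pole_lowpass_gain(5000.0, 500.0)
single_db = 20.0 * math.log10(single)
cascade_db = 20.0 * math.log10(single ** 4)   # four identical stages
```

Since |H|^4 in dB is 4 · 20·log10|H|, the cascade's rolloff is four times steeper on a dB plot, which is what your ears hear as a much sharper cutoff.<br />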
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is that all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high frequency sine waves but has less effect on the volume of low frequency sine waves, etc.<br />
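You can verify this sine-in, sine-out property numerically: push a sine wave through a one-pole lowpass and measure the steady-state amplitude by projecting onto sine and cosine at the same frequency. The measured amplitude matches the theoretical gain |H(e^jw)|, and nothing new appears. (The coefficient formula below is a generic one-pole smoother, used only for illustration.)<br />

```python
import cmath
import math

# Run a sine through a one-pole lowpass: the output is still a sine at the
# same frequency, just rescaled and phase-shifted (the LTI property).
sr, freq, cutoff = 44100.0, 1000.0, 500.0
a = 1.0 - math.exp(-2.0 * math.pi * cutoff / sr)   # generic one-pole coeff
w = 2.0 * math.pi * freq / sr

x = [math.sin(w * n) for n in range(20000)]
y, state = [], 0.0
for sample in x:
    state += a * (sample - state)
    y.append(state)

# Steady-state amplitude: project the tail of y onto sin and cos at freq
tail = range(10000, 20000)
c_sin = sum(y[n] * math.sin(w * n) for n in tail) * 2.0 / len(tail)
c_cos = sum(y[n] * math.cos(w * n) for n in tail) * 2.0 / len(tail)
measured = math.hypot(c_sin, c_cos)

# Theoretical gain of H(z) = a / (1 - (1 - a) z^-1) at this frequency
predicted = abs(a / (1.0 - (1.0 - a) * cmath.exp(-1j * w)))
```

If the filter were nonlinear (a distortion, say), the projection would leave a large residue at other frequencies; here it doesn't, because filtering only rebalances what's already there.<br />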
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and see how highpass cuts all the sound (as with a low-frequency sine wave), while lowpass emphasizes the "bassiness" of the sound.<br />
<br />
== Filtering Acceleration Data to Distinguish Tilt from Sudden Motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio. Then, holding the iPod in a neutral position, hit the calibrate button and wait a few seconds. You may have to do the calibration a few times until the tilt values are in the range [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answer is filtering. Caveat: although we believe filtering is the best way to solve this gesture discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd/Max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low frequency component) from sudden movements (which have lots of high frequency components).<br />
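A toy version of this crossover idea can be sketched in Python. This works on a slow list of control values rather than signal-rate audio, and the 100 Hz rate, 5 Hz cutoff, and test data are all made up for illustration; the highpass branch is computed as the complementary residue (input minus lowpass) rather than a separate filter:<br />

```python
import math

def one_pole_coeff(cutoff_hz, rate_hz):
    # Generic one-pole smoothing coefficient (illustrative formula)
    return 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / rate_hz)

def crossover(samples, rate_hz=100.0, cutoff_hz=5.0):
    """Split a control signal into a slow 'tilt' part (lowpass) and a
    fast 'sudden motion' part (the residue: a complementary highpass)."""
    a = one_pole_coeff(cutoff_hz, rate_hz)
    tilt, motion, state = [], [], samples[0]
    for s in samples:
        state += a * (s - state)   # lowpass: slow tilt component
        tilt.append(state)
        motion.append(s - state)   # highpass: what the lowpass removed
    return tilt, motion

# A steadily tilted device with one sudden shake in the middle
data = [0.3] * 50 + [1.5] + [0.3] * 50
tilt, motion = crossover(data)
```

Holding the device steady produces nothing on the motion output, while the shake shows up strongly there but barely nudges the tilt output, which is exactly the separation the guppy patch achieves at audio rate.<br />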
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books: [http://www.dsprelated.com/dspbooks/filters/ filters] [http://www.dsprelated.com/dspbooks/mdft/ mdft] [http://www.dsprelated.com/dspbooks/pasp/ pasp] [http://www.dsprelated.com/dspbooks/sasp/ sasp].<br />
<br />
*Experiment with different cutoff frequencies for the crossover.<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?<br />
<br />
== Make Some (musically-expressive, gesture-controlled) Music! ==<br />
Put it all together. Create an interaction in which sound is controlled by physical gesture in some way that you find interesting. You can begin by conjoining guppy and filter-demo if you like, but you are welcome to use any method for analyzing accelerometer data or creating sound.<br />
<br />
Think about the relationship you want to enable between movement and sound. Are the qualities of the movement reflected in the qualities of the sound? Is this important to you?<br />
<br />
Make sure to use appropriate mappings from measured quantities to sound parameters. For example if you are controlling the frequency of an oscillator from left/right tilt, you may want to first calculate the angle of tilt from acceleration, and then map logarithmically to frequency.<br />
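That tilt-to-frequency mapping might be sketched like this in Python. The choice of the x axis, the 110&ndash;880 Hz range, and treating the reading as the gravity component in g's are all assumptions for illustration; the point is that asin recovers an angle from acceleration, and the exponential mapping makes equal tilt steps give equal musical intervals:<br />

```python
import math

def tilt_to_freq(accel_x, f_low=110.0, f_high=880.0):
    """Map left/right tilt to oscillator frequency, logarithmically.

    accel_x is assumed to be the gravity component along x, in g's.
    (Axis, range, and units are placeholder assumptions.)"""
    accel_x = max(-1.0, min(1.0, accel_x))   # clamp sensor noise beyond +/-1 g
    angle = math.asin(accel_x)               # tilt angle in radians, -pi/2..pi/2
    t = (angle + math.pi / 2) / math.pi      # normalize to 0..1
    return f_low * (f_high / f_low) ** t     # exponential (pitch-linear) mapping

flat = tilt_to_freq(0.0)   # device held level: middle of the pitch range
```

With this mapping, holding the device level lands on the geometric mean of the range (an octave above the bottom of a two-octave span), rather than the arithmetic midpoint a linear mapping would give.<br />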
<br />
Some possibilities you may want to explore:<br />
*Invent a specific gesture, and then figure out how to detect it.<br />
*Include in your interaction the use of the sliders, buttons, or multi-touch 2D sliders in TouchOSC. You will need to figure out what OSC messages are being sent.<br />
*Is there a way to get velocity or position from acceleration?<br />
<br />
Please demo your result to the instructors, and in your lab writeup describe how you approached this (open-ended) design problem and what techniques you used to implement it.<br />
<br />
<br />
(This lab was written by Luke Dahl on 10/13/09. Huge portions were imported from Michael Gurevich's accelerometer lab from 2007 and before.)</div>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21th at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get Connected and Get Oriented ==<br />
iPod Touches, like many newer portable electronic devices, have a 3-axis accelerometer in them, which allows designers to take into account both orientation of the device with respect to gravity as well as detecting physical gestures that are made with the phone.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to max or pd, where we will process the data and make sound. TouchOSC is installed on the iPods available to use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== Get the iPod talking to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network.<br />
**You may need to log your iPod into CCRMA Guestnet. To do this, open safari and try to access a new web page. If you can, you are logged in. If you can't you will be asked to login.<br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc.pd (INSERT LINK AND MAX PATCH HERE), and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* use printing to examine the incoming OSC messages.<br />
<br />
=== Get Oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelermeter data is discrete in time (i.e. we get one value every some number of milliseconds), we can approximate derivation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") Look at the included delta abstraction, which simply returns the difference between subsequent input values. <br />
<br />
After taking the difference you can detect when the difference is greater than some threshold.<br />
<br />
Start with accel_osc. In Max you can use 'delta' to compute the difference and then 'past' to detect when you've crossed the threshold. In Pd the object 'threshold' (or 'mapping/threshold' if Pd doesn't recognize 'threshold') takes the difference and applies the threshold. Have your patch make a sound when the threshold has been surpassed. If you like, you can give the user additional control of the sound based on the direction and/or the magnitude of the jerk.<br />
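The delta-plus-threshold logic the patch computes per sample can be sketched outside Pd/Max like this (Python, with an arbitrarily chosen threshold value):<br />

```python
def make_jerk_detector(threshold):
    """Return a per-sample function: True whenever the jump between
    successive acceleration values exceeds the threshold."""
    prev = None
    def step(accel):
        nonlocal prev
        if prev is None:
            prev = accel
            return False
        delta = accel - prev  # one-zero highpass: difference of successive values
        prev = accel
        return abs(delta) > threshold
    return step

detect = make_jerk_detector(0.5)  # threshold chosen arbitrarily
# Slow drift stays quiet; the sudden jump from 0.2 to 1.5 fires:
print([detect(a) for a in [0.0, 0.1, 0.2, 1.5, 1.4]])  # [False, False, False, True, False]
```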
<br />
Congratulations, you have now written a jerk detector.<br />
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being composed of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering with Pd's (one-pole) hip~ object<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering with Pd's (one-pole) lop~ object<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering with Pd's bp~ object<br />
* Band pass filtering with a cascade of Pd's bp~ objects<br />
(EDIT THESE FOR MAXMSP!!!)<br />
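To build some intuition for the one-pole filters above, here is a textbook one-pole lowpass recursion sketched in Python. The coefficient formula below is one common textbook choice and is not claimed to match lop~'s exact internals; the point is the shape of the recursion and the effect of cascading stages.<br />

```python
import math

def one_pole_lowpass(samples, cutoff_hz, sample_rate=44100.0):
    """Textbook one-pole lowpass: y[n] = y[n-1] + c * (x[n] - y[n-1])."""
    c = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    y, out = 0.0, []
    for x in samples:
        y += c * (x - y)
        out.append(y)
    return out

def cascade(samples, cutoff_hz, stages=4):
    """Run the same filter several times in series, like four lop~ in a row."""
    for _ in range(stages):
        samples = one_pole_lowpass(samples, cutoff_hz)
    return samples

wiggle = [1.0, -1.0] * 500  # the fastest possible alternation (Nyquist)
single = one_pole_lowpass(wiggle, 100.0)
quad = cascade(wiggle, 100.0)
# The cascade attenuates the high-frequency wiggle far more than one stage:
print(max(abs(v) for v in single[-100:]) > max(abs(v) for v in quad[-100:]))  # True
```

This mirrors what you hear in the patch: each extra stage in the cascade steepens the rolloff, so the four-stage version has "four times as much" effect.<br />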
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is that all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high-frequency sine waves but has less effect on the volume of low-frequency sine waves, etc.<br />
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples (via the same QWERTY keyboard triggering mechanism) through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and see how highpass cuts all the sound (as with a low-frequency sine wave), while lowpass emphasizes the "bassiness" of the sound.<br />
<br />
== Filtering Acceleration Data to Distinguish Tilt from Sudden Motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio. Then, holding the iPod in a neutral position, hit the calibrate button and wait a few seconds. You may have to repeat the calibration a few times until the tilt values are in the range [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answer is filtering. Caveat: although we believe filtering is the best way to solve this gesture-discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd/Max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low-frequency component) from sudden movements (which have lots of high-frequency components).<br />
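The crossover idea can be sketched at control rate with a single smoother: the lowpassed value tracks tilt, and the residual carries the sudden motion. The smoothing coefficient below is an arbitrary stand-in, not the guppy patch's actual 5/20 Hz design.<br />

```python
def make_tilt_crossover(alpha=0.05):
    """Split incoming acceleration into a slow 'tilt' part (one-pole
    lowpass) and a fast 'sudden motion' residual."""
    tilt = 0.0
    def step(accel):
        nonlocal tilt
        tilt += alpha * (accel - tilt)  # slow component (lowpass)
        return tilt, accel - tilt       # (tilt, sudden motion)
    return step

step = make_tilt_crossover()
for _ in range(500):                     # hold the device steady at 1 g
    tilt, sudden = step(1.0)
print(round(tilt, 3), round(sudden, 3))  # tilt settles near 1.0, sudden near 0.0
tilt, sudden = step(2.0)                 # a sudden jolt
print(sudden > 0.5)                      # the jolt lands in the 'sudden' output
```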
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books: [http://www.dsprelated.com/dspbooks/filters/ filters] [http://www.dsprelated.com/dspbooks/mdft/ mdft] [http://www.dsprelated.com/dspbooks/pasp/ pasp] [http://www.dsprelated.com/dspbooks/sasp/ sasp].<br />
<br />
*Experiment with different cutoff frequencies for the crossover<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?<br />
<br />
== Make Some (musically-expressive, gesture-controlled) Music! ==<br />
Put it all together. Create an interaction in which sound is controlled by physical gesture in some way that you find interesting. You can begin by conjoining guppy and filter-demo if you like, but you are welcome to use any method for analyzing accelerometer data or creating sound.<br />
<br />
Think about the relationship you want to enable between movement and sound. Are the qualities of movement reflected in the qualities of the sound? Is this important to you?<br />
<br />
Make sure to use appropriate mappings from measured quantities to sound parameters. For example, if you are controlling the frequency of an oscillator from left/right tilt, you may want to first calculate the angle of tilt from the acceleration, and then map it logarithmically to frequency.<br />
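Such a tilt-to-frequency mapping might look like this sketch; the 110-880 Hz range and the asin-based tilt angle are illustrative choices, not requirements:<br />

```python
import math

def tilt_to_freq(ax, f_lo=110.0, f_hi=880.0):
    """Map left/right acceleration (in g) to an oscillator frequency:
    recover the tilt angle with asin, then interpolate exponentially
    so equal changes in tilt give equal musical intervals."""
    ax = max(-1.0, min(1.0, ax))          # clip sensor noise beyond +/-1 g
    angle = math.asin(ax)                 # -pi/2 .. pi/2 radians
    t = (angle + math.pi / 2) / math.pi   # normalize to 0..1
    return f_lo * (f_hi / f_lo) ** t      # log-frequency (exponential) map

# Endpoints hit 110 and 880 Hz; holding the device flat (0 g left/right)
# lands at the geometric mean, about 311 Hz:
print(tilt_to_freq(-1.0), tilt_to_freq(0.0), tilt_to_freq(1.0))
```

A linear map to frequency would cram most of the audible pitch range into one end of the tilt; the exponential map spreads it evenly in pitch.<br />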
<br />
Some possibilities you may want to explore:<br />
*Invent a specific gesture, and then figure out how to detect it.<br />
*Include in your interaction the use of the sliders, buttons, or multi-touch 2D sliders in TouchOSC. You will need to figure out what OSC messages are being sent.<br />
*Is there a way to get velocity or position from acceleration?<br />
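On that last question: in principle yes, integrate once for velocity and twice for position. A naive Euler integration sketch shows both the idea and the catch, namely that any constant bias in the accelerometer integrates into unbounded drift (the numbers below are illustrative):<br />

```python
def integrate(samples, dt):
    """Naive Euler integration: v[n] = v[n-1] + a[n] * dt."""
    v, out = 0.0, []
    for a in samples:
        v += a * dt
        out.append(v)
    return out

accel = [1.0] * 100                        # constant 1 unit/s^2 for 1 second
velocity = integrate(accel, dt=0.01)
position = integrate(velocity, dt=0.01)
print(round(velocity[-1], 3))              # ~1.0 unit/s after 1 second
# A tiny sensor bias integrates into runaway drift:
drift = integrate([0.01] * 6000, dt=0.01)  # 1 minute of a 0.01-unit bias
print(round(drift[-1], 3))                 # ~0.6 units/s of phantom velocity
```

So raw double integration is rarely usable on its own; in practice people highpass the signal or periodically re-zero the estimates.<br />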
<br />
Please demo your result to the instructors, and in your lab writeup describe how you approached this (open-ended) design problem and what techniques you used to implement it.<br />
<br />
<br />
(This lab was written by Luke Dahl on 10/13/09. Huge portions were imported from Michael Gurevich's accelerometer lab from 2007 and before.)</div>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21th at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get Connected and Get Oriented ==<br />
iPod Touches, like many newer portable electronic devices, have a 3-axis accelerometer in them, which allows designers to take into account both orientation of the device with respect to gravity as well as detecting physical gestures that are made with the phone.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to max or pd, where we will process the data and make sound. TouchOSC is installed on the iPods available to use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== Get the iPod talking to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network.<br />
**You may need to log your iPod into CCRMA Guestnet. To do this, open safari and try to access a new web page. If you can, you are logged in. If you can't you will be asked to login.<br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc.pd (INSERT LINK AND MAX PATCH HERE), and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* use printing to examine the incoming OSC messages.<br />
<br />
=== Get Oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelermeter data is discrete in time (i.e. we get one value every some number of milliseconds), we can approximate derivation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") You can use the included delta abstraction, which simply returns the difference between subsequent input values. <br />
<br />
Start with accel+osc and connect a delta object to one or more acceleration values, pick a threshold that corresponds to a satisfying level of jerkiness, and use 'threshold' in Pd (or 'mapping/threshold' if 'threshold' doesn't work) or 'past' in max to make a sound when you exceed the threshold. You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
<br />
(IN PD MAPPING/THRESHOLD DOES DIFF AND THREHSOLD. REWRITE THIS SECTION AFTER I MAKE THE MAX PATCH! -LD)<br />
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being comprised of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering with Pd's (one-pole) hip~ object<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering with Pd's (one-pole) lop~ object<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering with Pd's bp~ object<br />
* Band pass filtering with a cascade of Pd's bp~ objects<br />
(EDIT THESE FOR MAXMSP!!!)<br />
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is because all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high frequency sine waves but has less effect on the volume of low frequency sine waves, etc.<br />
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples (via the same QWERTY keyboard triggering mechanism) through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and see how highpass cuts all the sound (as with a low-frequency sine wave), while lowpass emphasizes the "bassiness" of the sound.<br />
<br />
== Filtering Acceleration Data to Distinguish Tilt from Sudden Motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio, and then holding the iPod at a neutral position hit the calibrate button and wait a few seconds. You may have to do the callibration a few times until the tilt values are in the range of [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answering is with filtering. Caveat: although we believe filtering is the best way to solve this gesture discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd/max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low frequency component) from sudden movements (which have lots of high frequency components).<br />
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books.<br />
<br />
*Experiment with different cutoff frequencies for the crossover<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?<br />
<br />
== Make Some (musically-expressive, gesture-controlled) Noise! ==<br />
Put it all together. Create an interaction in which sound is controlled by physical gesture in some way that you find interesting. You can begin by conjoining guppy and filter-demo if you like, but you are welcome to use any method for analyzing accelerometer data or creating sound.<br />
<br />
Think about the relationship you want to enable between music and sound. Are the qualities of movement reflected in the qualities of the sound? Is this important to you?<br />
<br />
Make sure to use appropriate mappings from measured quantities to sound parameters. For example if you are controlling the frequency of an oscillator from left/right tilt, you may want to first calculate the angle of tilt from acceleration, and then map logarithmically to frequency.<br />
<br />
Some possibilities you may want to explore:<br />
*Invent a specific gesture, and then figure out how to detect it.<br />
*Include in your interaction the use of the sliders, buttons, or multi-touch 2D sliders in TouchOSC. You will need to figure out what OSC messages are being sent.<br />
*Is there a way to get velocity or position from acceleration?<br />
<br />
Please demo your result to the instructors, and in your lab writeup describe how you approached this (open-ended) design problem and what techniques you used to implement it.<br />
<br />
<br />
(This lab was written by Luke Dahl on 10/13/09. Huge portions were imported from Michael Gurevich's accelerometer lab from 2007 and before.)</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Accelerometer_Lab&diff=8976250a Accelerometer Lab2009-10-14T02:05:46Z<p>Lukedahl: /* Naive Gesture Detection and Thresholding */</p>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21th at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get Connected and Get Oriented ==<br />
iPod Touches, like many newer portable electronic devices, have a 3-axis accelerometer in them, which allows designers to take into account both orientation of the device with respect to gravity as well as detecting physical gestures that are made with the phone.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to max or pd, where we will process the data and make sound. TouchOSC is installed on the iPods available to use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== Get the iPod talking to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network.<br />
**You may need to log your iPod into CCRMA Guestnet. To do this, open safari and try to access a new web page. If you can, you are logged in. If you can't you will be asked to login.<br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc.pd (INSERT LINK AND MAX PATCH HERE), and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* use printing to examine the incoming OSC messages.<br />
<br />
=== Get Oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelermeter data is discrete in time (i.e. we get one value every some number of milliseconds), we can approximate derivation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") You can use the included delta abstraction, which simply returns the difference between subsequent input values. <br />
<br />
Start with accel+osc and connect a delta object to one or more acceleration values, pick a threshold that corresponds to a satisfying level of jerkiness, and use 'threshold' in Pd (or 'mapping/threshold' if 'threshold' doesn't work) or 'past' in max to make a sound when you exceed the threshold. You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being comprised of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering with Pd's (one-pole) hip~ object<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering with Pd's (one-pole) lop~ object<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering with Pd's bp~ object<br />
* Band pass filtering with a cascade of Pd's bp~ objects<br />
(EDIT THESE FOR MAXMSP!!!)<br />
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is because all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high frequency sine waves but has less effect on the volume of low frequency sine waves, etc.<br />
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples (via the same QWERTY keyboard triggering mechanism) through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and see how highpass cuts all the sound (as with a low-frequency sine wave), while lowpass emphasizes the "bassiness" of the sound.<br />
<br />
== Filtering Acceleration Data to Distinguish Tilt from Sudden Motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio, and then holding the iPod at a neutral position hit the calibrate button and wait a few seconds. You may have to do the callibration a few times until the tilt values are in the range of [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answering is with filtering. Caveat: although we believe filtering is the best way to solve this gesture discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd/max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low frequency component) from sudden movements (which have lots of high frequency components).<br />
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books.<br />
<br />
*Experiment with different cutoff frequencies for the crossover<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?<br />
<br />
== Make Some (musically-expressive, gesture-controlled) Noise! ==<br />
Put it all together. Create an interaction in which sound is controlled by physical gesture in some way that you find interesting. You can begin by conjoining guppy and filter-demo if you like, but you are welcome to use any method for analyzing accelerometer data or creating sound.<br />
<br />
Think about the relationship you want to enable between music and sound. Are the qualities of movement reflected in the qualities of the sound? Is this important to you?<br />
<br />
Make sure to use appropriate mappings from measured quantities to sound parameters. For example if you are controlling the frequency of an oscillator from left/right tilt, you may want to first calculate the angle of tilt from acceleration, and then map logarithmically to frequency.<br />
<br />
Some possibilities you may want to explore:<br />
*Invent a specific gesture, and then figure out how to detect it.<br />
*Include in your interaction the use of the sliders, buttons, or multi-touch 2D sliders in TouchOSC. You will need to figure out what OSC messages are being sent.<br />
*Is there a way to get velocity or position from acceleration?<br />
<br />
Please demo your result to the instructors, and in your lab writeup describe how you approached this (open-ended) design problem and what techniques you used to implement it.<br />
<br />
<br />
(This lab was written by Luke Dahl on 10/13/09. Huge portions were imported from Michael Gurevich's accelerometer lab from 2007 and before.)</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Accelerometer_Lab&diff=8975250a Accelerometer Lab2009-10-14T02:02:47Z<p>Lukedahl: /* Make Some (musically-expressive, gesture-controlled) Noise! */</p>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21th at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get Connected and Get Oriented ==<br />
iPod Touches, like many newer portable electronic devices, have a 3-axis accelerometer in them, which allows designers to take into account both orientation of the device with respect to gravity as well as detecting physical gestures that are made with the phone.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to max or pd, where we will process the data and make sound. TouchOSC is installed on the iPods available to use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== Get the iPod talking to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network.<br />
**You may need to log your iPod into CCRMA Guestnet. To do this, open safari and try to access a new web page. If you can, you are logged in. If you can't you will be asked to login.<br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc.pd (INSERT LINK AND MAX PATCH HERE), and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* use printing to examine the incoming OSC messages.<br />
<br />
=== Get Oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelermeter data is discrete in time (i.e. we get one value every some number of milliseconds), we can approximate derivation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") You can use the included delta abstraction, which simply returns the difference between subsequent input values. <br />
<br />
Start with accel+osc and connect a delta object to one or more acceleration values, pick a threshold that corresponds to a satisfying level of jerkiness, and use threshold (Pd) or past (max) to make a sound when you exceed the threshold. You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
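For reference, the whole detector fits in a few lines of Python. The 0.4 g threshold below is an arbitrary starting point, not a recommended value; tune it to taste, as above:<br />

```python
class JerkDetector:
    """Fire when the jump between successive acceleration samples
    exceeds a threshold (delta-then-threshold, as in the patch)."""
    def __init__(self, threshold=0.4):   # threshold in g; tune by ear
        self.threshold = threshold
        self.prev = None

    def feed(self, accel):
        d = 0.0 if self.prev is None else accel - self.prev
        self.prev = accel
        return abs(d) > self.threshold   # True -> trigger a sound

det = JerkDetector()
slow = [det.feed(a) for a in [0.0, 0.05, 0.1, 0.15]]  # gentle tilt: no triggers
jerk = det.feed(1.0)                                  # sudden jump: trigger
```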
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being composed of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the Pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering with Pd's (one-pole) hip~ object<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering with Pd's (one-pole) lop~ object<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering with Pd's bp~ object<br />
* Band pass filtering with a cascade of Pd's bp~ objects<br />
(EDIT THESE FOR MAXMSP!!!)<br />
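To make the one-pole idea concrete, here is a sketch of the recurrences these objects implement. This is the spirit of lop~ and hip~, not a bit-exact clone; in particular, the cutoff-to-coefficient mapping below is a standard first-order approximation that may differ from Pd's internals:<br />

```python
import math

def one_pole_lowpass(samples, cutoff_hz, sample_rate=44100.0):
    """One-pole lowpass in the spirit of lop~:
    y[n] = y[n-1] + a * (x[n] - y[n-1])."""
    a = min(1.0, 2.0 * math.pi * cutoff_hz / sample_rate)  # approximate mapping
    y, out = 0.0, []
    for x in samples:
        y += a * (x - y)
        out.append(y)
    return out

def one_pole_highpass(samples, cutoff_hz, sample_rate=44100.0):
    """Highpass as input minus the lowpassed input: what the
    lowpass keeps is exactly what a highpass should remove."""
    low = one_pole_lowpass(samples, cutoff_hz, sample_rate)
    return [x - l for x, l in zip(samples, low)]
```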
<br />
Play with this patch to get a feel for the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
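The "four times as much" claim is easy to check on paper: cascaded filters multiply their frequency responses, and magnitudes in decibels are logarithms, so the attenuation in dB of four identical stages is exactly four times that of one stage. A quick numerical check, using the frequency response of the one-pole recurrence y[n] = y[n-1] + a·(x[n] − y[n-1]) (the coefficient and test frequency below are arbitrary):<br />

```python
import cmath, math

def db(h):
    """Magnitude of a (complex) response in decibels."""
    return 20.0 * math.log10(abs(h))

def one_pole_response(a, omega):
    """Frequency response of y[n] = y[n-1] + a*(x[n] - y[n-1])
    at normalized frequency omega (radians/sample)."""
    return a / (1.0 - (1.0 - a) * cmath.exp(-1j * omega))

a, omega = 0.1, 1.0                              # arbitrary example values
single = db(one_pole_response(a, omega))         # one stage's attenuation
cascade = db(one_pole_response(a, omega) ** 4)   # four identical stages
# cascade == 4 * single (in dB), i.e. four times the slope
```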
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is that all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high-frequency sine waves but has less effect on the volume of low-frequency sine waves, etc.<br />
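You can verify this numerically, too: run a sinusoid through a one-pole lowpass, wait for the transient to die out, and the output is the same sinusoid scaled by the theoretical gain |H| at that frequency (and phase-shifted), with nothing added. A sketch, using the same one-pole recurrence as above with arbitrary example values:<br />

```python
import cmath, math

a, omega = 0.2, 0.3        # filter coefficient, frequency in rad/sample

# Run the one-pole lowpass y[n] = y[n-1] + a*(x[n] - y[n-1]) on a sine
# and keep only the steady-state tail, after transients have decayed.
y, tail = 0.0, []
for n in range(20000):
    y += a * (math.sin(omega * n) - y)
    if n >= 19000:
        tail.append(y)

measured = max(abs(v) for v in tail)                          # output amplitude
predicted = abs(a / (1.0 - (1.0 - a) * cmath.exp(-1j * omega)))  # |H| at omega
# measured matches predicted: the sine comes out as a sine, just quieter
```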
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples (via the same QWERTY keyboard triggering mechanism) through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and see how highpass cuts all the sound (as with a low-frequency sine wave), while lowpass emphasizes the "bassiness" of the sound.<br />
<br />
== Filtering Acceleration Data to Distinguish Tilt from Sudden Motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio, and then, holding the iPod in a neutral position, hit the calibrate button and wait a few seconds. You may have to do the calibration a few times until the tilt values are in the range of [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answer is filtering. Caveat: although we believe filtering is the best way to solve this gesture-discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd/Max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hz, respectively) to separate tilt (the low-frequency component) from sudden movements (which have lots of high-frequency components).<br />
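The idea behind the crossover can be sketched directly on the control stream, without the detour through audio signals. This is not guppy's exact implementation (it uses separate 5 and 20 Hz filters on audio-rate signals); here a single smoothing coefficient stands in for the cutoff:<br />

```python
def crossover(samples, a=0.05):
    """Split a control stream into a slow part ('tilt', the one-pole
    lowpass output) and a fast part ('sudden motion', the residual).
    The coefficient a is a stand-in for guppy's 5/20 Hz cutoffs."""
    low, tilt, sudden = 0.0, [], []
    for x in samples:
        low += a * (x - low)
        tilt.append(low)
        sudden.append(x - low)
    return tilt, sudden

# A slow ramp (gradual tilting) shows up almost entirely in 'tilt';
# a sudden step would show up almost entirely in 'sudden'.
ramp = [n / 1000.0 for n in range(1000)]
tilt, sudden = crossover(ramp)
```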
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books.<br />
<br />
*Experiment with different cutoff frequencies for the crossover.<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?<br />
<br />
== Make Some (musically-expressive, gesture-controlled) Noise! ==<br />
Put it all together. Create an interaction in which sound is controlled by physical gesture in some way that you find interesting. You can begin by conjoining guppy and filter-demo if you like, but you are welcome to use any method for analyzing accelerometer data or creating sound.<br />
<br />
Think about the relationship you want to enable between movement and sound. Are the qualities of movement reflected in the qualities of the sound? Is this important to you?<br />
<br />
Make sure to use appropriate mappings from measured quantities to sound parameters. For example, if you are controlling the frequency of an oscillator from left/right tilt, you may want to first calculate the angle of tilt from the acceleration, and then map it logarithmically to frequency.<br />
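One possible version of that tilt-to-frequency mapping, as a sketch (the 110-880 Hz range is an arbitrary choice for illustration, and the clamp guards against noise pushing the reading past ±1 g):<br />

```python
import math

def tilt_angle(accel_g):
    """Tilt angle in radians from one axis' acceleration in g.
    A stationary, tilted device measures the projection of gravity,
    so angle = asin(a); clamp first, since noise can exceed 1 g."""
    return math.asin(max(-1.0, min(1.0, accel_g)))

def angle_to_freq(angle, lo=110.0, hi=880.0):
    """Map [-pi/2, pi/2] to [lo, hi] Hz logarithmically, so equal
    tilt steps give equal musical intervals."""
    t = (angle + math.pi / 2) / math.pi     # normalize to [0, 1]
    return lo * (hi / lo) ** t
```

With a logarithmic map, tilting by a fixed amount always transposes by the same interval, which tends to feel far more musical than a linear map in Hz.<br />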
<br />
Some possibilities you may want to explore:<br />
*Invent a specific gesture, and then figure out how to detect it.<br />
*Include in your interaction the use of the sliders, buttons, or multi-touch 2D sliders in TouchOSC. You will need to figure out what OSC messages are being sent.<br />
*Is there a way to get velocity or position from acceleration?<br />
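On that last question: in principle you integrate acceleration once for velocity and twice for position, but in practice any constant sensor offset makes a raw integral drift without bound, so some correction is needed. One common workaround is a "leaky" integrator, sketched here (the leak and dt values are illustrative, not tuned for the iPod):<br />

```python
def integrate(samples, dt=0.01, leak=0.99):
    """Approximate velocity by accumulating acceleration over time.
    The 'leak' factor slowly pulls the estimate back toward zero;
    with leak=1.0 (a true integral), any constant offset in the
    sensor makes the estimate drift forever."""
    v, out = 0.0, []
    for a in samples:
        v = leak * v + a * dt
        out.append(v)
    return out

# A one-second push builds up velocity, which then leaks back to zero:
push = integrate([1.0] * 100 + [0.0] * 500)
```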
<br />
Please demo your result to the instructors, and in your lab writeup describe how you approached this (open-ended) design problem and what techniques you used to implement it.<br />
<br />
<br />
(This lab was written by Luke Dahl on 10/13/09. Huge portions were imported from Michael Gurevich's accelerometer lab from 2007 and before.)</div>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21th at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get Connected and Get Oriented ==<br />
iPod Touches, like many newer portable electronic devices, have a 3-axis accelerometer in them, which allows designers to take into account both orientation of the device with respect to gravity as well as detecting physical gestures that are made with the phone.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to max or pd, where we will process the data and make sound. TouchOSC is installed on the iPods available to use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== Get the iPod talking to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network. <br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc.pd (INSERT LINK AND MAX PATCH HERE), and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* use printing to examine the incoming OSC messages.<br />
<br />
=== Get Oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelermeter data is discrete in time (i.e. we get one value every some number of milliseconds), we can approximate derivation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") You can use the included delta abstraction, which simply returns the difference between subsequent input values. <br />
<br />
Start with accel+osc and connect a delta object to one or more acceleration values, pick a threshold that corresponds to a satisfying level of jerkiness, and use threshold (Pd) or past (max) to make a sound when you exceed the threshold. You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being comprised of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering with Pd's (one-pole) hip~ object<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering with Pd's (one-pole) lop~ object<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering with Pd's bp~ object<br />
* Band pass filtering with a cascade of Pd's bp~ objects<br />
(EDIT THESE FOR MAXMSP!!!)<br />
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is that all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high-frequency sine waves but has less effect on the volume of low-frequency sine waves, etc.<br />
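You can verify this claim numerically. The Python sketch below (illustrative parameters, not part of the patch) runs a 2 kHz sine through a one-pole lowpass with a 1 kHz cutoff; the output settles into a 2 kHz sine again, only quieter:<br />

```python
import math

fs, f, fc = 44100.0, 2000.0, 1000.0
a = math.exp(-2.0 * math.pi * fc / fs)        # one-pole coefficient from the cutoff

# Run the sine through y[n] = (1 - a) * x[n] + a * y[n-1]
y, out = 0.0, []
for n in range(int(fs)):                      # one second of samples
    x = math.sin(2.0 * math.pi * f * n / fs)
    y = (1.0 - a) * x + a * y
    out.append(y)

# After the transient dies out, the output is still a sine -- just attenuated.
peak = max(abs(v) for v in out[len(out) // 2:])
print(f"input peak 1.000, output peak {peak:.3f}")
```

The measured peak matches the filter's magnitude response at 2 kHz: the filter changed the gain, and nothing else.<br />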
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples (via the same QWERTY keyboard triggering mechanism) through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and see how highpass cuts all the sound (as with a low-frequency sine wave), while lowpass emphasizes the "bassiness" of the sound.<br />
<br />
== Filtering Acceleration Data to Distinguish Tilt from Sudden Motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio. Then, holding the iPod in a neutral position, hit the calibrate button and wait a few seconds. You may have to repeat the calibration a few times until the tilt values are in the range of [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answer is filtering. Caveat: although we believe filtering is the best way to solve this gesture discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd's and Max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low-frequency component) from sudden movements (which have lots of high-frequency components).<br />
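The same idea can be sketched at control rate in a few lines of Python. This is an illustration of the crossover concept, not the actual accel-xover subpatch: lowpass the stream to get tilt, and treat whatever the lowpass misses as sudden motion.<br />

```python
class Crossover:
    """Split a control stream into a slow 'tilt' part and a fast 'sudden' part."""
    def __init__(self, smoothing=0.9):
        self.smoothing = smoothing   # closer to 1.0 = lower crossover frequency
        self.tilt = 0.0

    def process(self, accel):
        # One-pole lowpass: tracks slow changes (gravity, i.e. tilt)
        self.tilt = self.smoothing * self.tilt + (1.0 - self.smoothing) * accel
        # Whatever the lowpass missed is the high-frequency (sudden) part
        sudden = accel - self.tilt
        return self.tilt, sudden

xo = Crossover()
# A slow tilt from 0 to 1 g, with a jerk spike added in the middle
stream = [i / 100.0 for i in range(100)]
stream[50] += 2.0
detections = []
for i, a in enumerate(stream):
    tilt, sudden = xo.process(a)
    if abs(sudden) > 0.5:
        detections.append(i)
print("sudden motion at samples:", detections)
```

The slow ramp shows up almost entirely in the tilt output, while only the spike crosses the sudden-motion threshold.<br />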
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books.<br />
<br />
*Experiment with different cutoff frequencies for the crossover<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?<br />
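We haven't described the stillness detector's internals, and the sketch below is only one plausible implementation, not necessarily the patch's: declare stillness when the sample-to-sample differences stay small for a while.<br />

```python
def is_still(samples, threshold=0.05, hold=20):
    """Report stillness when successive differences stay small for `hold` samples."""
    quiet = 0
    for prev, cur in zip(samples, samples[1:]):
        quiet = quiet + 1 if abs(cur - prev) < threshold else 0
        if quiet >= hold:
            return True
    return False

print(is_still([0.0] * 50))                          # device resting on a table
print(is_still([0.1 * (i % 3) for i in range(50)]))  # device being shaken
```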
<br />
== Make Some (musically-expressive, gesture-controlled) Noise! ==<br />
Put it all together. Create an interaction in which sound is controlled by physical gesture in some way that you find interesting. You can begin by conjoining guppy and filter-demo if you like, but you are welcome to use any method for analyzing accelerometer data or creating sound.<br />
<br />
Think about the relationship you want to enable between movement and sound. Are the qualities of movement reflected in the qualities of the sound? Is this important to you?<br />
<br />
Some possibilities you may want to explore:<br />
*Invent a specific gesture, and then figure out how to detect it.<br />
*Include in your interaction the use of the sliders, buttons, or multi-touch 2D sliders in TouchOSC. You will need to figure out what OSC messages are being sent.<br />
<br />
Please demo your result to the instructors, and in your lab writeup describe how you approached this (open-ended) design problem and what techniques you used to implement it.</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Accelerometer_Lab&diff=8971250a Accelerometer Lab2009-10-14T00:33:05Z<p>Lukedahl: /* Filtering acceleration data to distinguish tilt from sudden motion */</p>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21st at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get Connected and Get Oriented ==<br />
iPod Touches, like many newer portable electronic devices, contain a 3-axis accelerometer, which lets designers sense both the orientation of the device with respect to gravity and physical gestures made with the device.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to Max or Pd, where we will process the data and make sound. TouchOSC is installed on the iPods available for use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== Get the iPod talking to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network. <br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc.pd (INSERT LINK AND MAX PATCH HERE), and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* Use print objects to examine the incoming OSC messages.<br />
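For reference, the receiving end amounts to dispatching on the OSC address. Here is a minimal Python sketch of that logic with no network code; the /accxyz accelerometer address and the fader address follow TouchOSC's usual layout but are assumptions and may differ in your version:<br />

```python
def route(address, args, state):
    """Minimal OSC-style dispatcher for TouchOSC-like messages (addresses assumed)."""
    if address == "/accxyz":              # accelerometer: three floats, in g
        state["accel"] = tuple(args)
    elif address.startswith("/1/fader"):  # a fader on TouchOSC layout page 1
        state[address] = args[0]
    else:
        print("unhandled:", address, args)

state = {}
route("/accxyz", [0.02, -0.01, -0.98], state)   # device lying roughly flat
route("/1/fader1", [0.75], state)
print(state["accel"], state["/1/fader1"])
```

In Pd/Max the routeOSC or route objects play the same role as the if/elif chain here.<br />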
<br />
=== Get Oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
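Once you have worked out the axes, a resting reading (in g) can be turned into tilt angles, since at rest the accelerometer measures only gravity. A sketch follows; the sign conventions here are assumptions, so check them against your own axis drawing:<br />

```python
import math

def tilt_angles(x, y, z):
    """Pitch and roll (degrees) from a resting accelerometer reading, in g.
    Sign conventions are an assumption -- verify against your axis drawing."""
    pitch = math.degrees(math.atan2(-x, math.sqrt(y * y + z * z)))
    roll = math.degrees(math.atan2(y, -z))
    return pitch, roll

print(tilt_angles(0.0, 0.0, -1.0))     # flat on a table: no pitch, no roll
print(tilt_angles(-0.71, 0.0, -0.71))  # tipped about 45 degrees along one axis
```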
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelerometer data is discrete in time (i.e., we get one value every few milliseconds), we can approximate differentiation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") You can use the included delta abstraction, which simply returns the difference between subsequent input values. <br />
<br />
Start with accel_osc and connect a delta object to one or more acceleration values, pick a threshold that corresponds to a satisfying level of jerkiness, and use threshold (Pd) or past (Max) to make a sound when you exceed the threshold. You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
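If you want to prototype the logic before patching it, the delta-plus-threshold idea looks like this in Python (the threshold value is illustrative):<br />

```python
def jerk_events(accel, threshold=0.5):
    """Flag samples where the first difference of acceleration exceeds a threshold."""
    events = []
    for i in range(1, len(accel)):
        delta = accel[i] - accel[i - 1]   # discrete approximation of jerk
        if abs(delta) > threshold:
            events.append((i, delta))
    return events

# A slow tilt (small deltas) followed by a sharp flick (big deltas)
readings = [0.0, 0.05, 0.1, 0.15, 0.2, 1.4, 0.3]
print(jerk_events(readings))
```

The slow tilt never trips the detector; the flick produces one event on the way up and one on the way back down, which is why you may want to add a short refractory period in your patch.<br />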
<br />
Congratulations, you have now written a jerk detector.<br />
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as composed of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering with Pd's (one-pole) hip~ object<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering with Pd's (one-pole) lop~ object<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering with Pd's bp~ object<br />
* Band pass filtering with a cascade of Pd's bp~ objects<br />
(EDIT THESE FOR MAXMSP!!!)<br />
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is that all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high-frequency sine waves but has less effect on the volume of low-frequency sine waves, etc.<br />
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples (via the same QWERTY keyboard triggering mechanism) through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and see how highpass cuts all the sound (as with a low-frequency sine wave), while lowpass emphasizes the "bassiness" of the sound.<br />
<br />
== Filtering Acceleration Data to Distinguish Tilt from Sudden Motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio. Then, holding the iPod in a neutral position, hit the calibrate button and wait a few seconds. You may have to repeat the calibration a few times until the tilt values are in the range of [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answer is filtering. Caveat: although we believe filtering is the best way to solve this gesture discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd's and Max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low-frequency component) from sudden movements (which have lots of high-frequency components).<br />
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books.<br />
<br />
*Experiment with different cutoff frequencies for the crossover<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?<br />
<br />
== Make some (musically-expressive, gesture-controlled) Noise! ==</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Accelerometer_Lab&diff=8970250a Accelerometer Lab2009-10-14T00:32:31Z<p>Lukedahl: /* getting oriented */</p>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21st at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get Connected and Get Oriented ==<br />
iPod Touches, like many newer portable electronic devices, contain a 3-axis accelerometer, which lets designers sense both the orientation of the device with respect to gravity and physical gestures made with the device.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to Max or Pd, where we will process the data and make sound. TouchOSC is installed on the iPods available for use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== Get the iPod talking to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network. <br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc.pd (INSERT LINK AND MAX PATCH HERE), and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* Use print objects to examine the incoming OSC messages.<br />
<br />
=== Get Oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelerometer data is discrete in time (i.e., we get one value every few milliseconds), we can approximate differentiation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") You can use the included delta abstraction, which simply returns the difference between subsequent input values. <br />
<br />
Start with accel_osc and connect a delta object to one or more acceleration values, pick a threshold that corresponds to a satisfying level of jerkiness, and use threshold (Pd) or past (Max) to make a sound when you exceed the threshold. You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as composed of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering with Pd's (one-pole) hip~ object<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering with Pd's (one-pole) lop~ object<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering with Pd's bp~ object<br />
* Band pass filtering with a cascade of Pd's bp~ objects<br />
(EDIT THESE FOR MAXMSP!!!)<br />
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is that all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high-frequency sine waves but has less effect on the volume of low-frequency sine waves, etc.<br />
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples (via the same QWERTY keyboard triggering mechanism) through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and see how highpass cuts all the sound (as with a low-frequency sine wave), while lowpass emphasizes the "bassiness" of the sound.<br />
<br />
== Filtering acceleration data to distinguish tilt from sudden motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio. Then, holding the iPod in a neutral position, hit the calibrate button and wait a few seconds. You may have to repeat the calibration a few times until the tilt values are in the range of [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answer is filtering. Caveat: although we believe filtering is the best way to solve this gesture discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd's and Max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low-frequency component) from sudden movements (which have lots of high-frequency components).<br />
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books.<br />
<br />
*Experiment with different cutoff frequencies for the crossover<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?<br />
<br />
== Make some (musically-expressive, gesture-controlled) Noise! ==</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Accelerometer_Lab&diff=8969250a Accelerometer Lab2009-10-14T00:32:17Z<p>Lukedahl: /* getting the iPod to talk to your computer via Open Sound Control */</p>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21st at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get Connected and Get Oriented ==<br />
iPod Touches, like many newer portable electronic devices, contain a 3-axis accelerometer, which lets designers sense both the orientation of the device with respect to gravity and physical gestures made with the device.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to Max or Pd, where we will process the data and make sound. TouchOSC is installed on the iPods available for use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== Get the iPod talking to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network. <br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc.pd (INSERT LINK AND MAX PATCH HERE), and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* Use print objects to examine the incoming OSC messages.<br />
<br />
=== getting oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelerometer data is discrete in time (i.e., we get one value every few milliseconds), we can approximate differentiation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") You can use the included delta abstraction, which simply returns the difference between subsequent input values. <br />
<br />
Start with accel_osc and connect a delta object to one or more acceleration values, pick a threshold that corresponds to a satisfying level of jerkiness, and use threshold (Pd) or past (Max) to make a sound when you exceed the threshold. You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as composed of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering with Pd's (one-pole) hip~ object<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering with Pd's (one-pole) lop~ object<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering with Pd's bp~ object<br />
* Band pass filtering with a cascade of Pd's bp~ objects<br />
(EDIT THESE FOR MAXMSP!!!)<br />
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is that all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high-frequency sine waves but has less effect on the volume of low-frequency sine waves, etc.<br />
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples (via the same QWERTY keyboard triggering mechanism) through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and see how highpass cuts all the sound (as with a low-frequency sine wave), while lowpass emphasizes the "bassiness" of the sound.<br />
<br />
== Filtering acceleration data to distinguish tilt from sudden motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio. Then, holding the iPod in a neutral position, hit the calibrate button and wait a few seconds. You may have to repeat the calibration a few times until the tilt values are in the range of [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answer is filtering. Caveat: although we believe filtering is the best way to solve this gesture discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd's and Max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low-frequency component) from sudden movements (which have lots of high-frequency components).<br />
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books.<br />
<br />
*Experiment with different cutoff frequencies for the crossover<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?<br />
<br />
== Make some (musically-expressive, gesture-controlled) Noise! ==</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Accelerometer_Lab&diff=8968250a Accelerometer Lab2009-10-14T00:31:52Z<p>Lukedahl: /* Get connected and get oriented */</p>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21st at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get Connected and Get Oriented ==<br />
iPod Touches, like many newer portable electronic devices, contain a 3-axis accelerometer, which lets designers sense both the orientation of the device with respect to gravity and physical gestures made with the device.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to Max or Pd, where we will process the data and make sound. TouchOSC is installed on the iPods available for use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== getting the iPod to talk to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network. <br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc.pd (INSERT LINK AND MAX PATCH HERE), and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* Use print objects to examine the incoming OSC messages.<br />
<br />
=== getting oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelerometer data is discrete in time (i.e. we get one value every several milliseconds), we can approximate differentiation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") You can use the included delta abstraction, which simply returns the difference between successive input values. <br />
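In Python terms (a sketch of the idea, not the delta abstraction's actual implementation), the approximation is just a first difference:<br />

```python
def delta(values):
    """First difference of a sequence: a discrete approximation of the derivative.
    Equivalent to the one-zero highpass filter y[n] = x[n] - x[n-1]."""
    return [b - a for a, b in zip(values, values[1:])]

# A sudden jerk in the middle of otherwise gentle motion:
accel_x = [0.0, 0.0, 0.1, 0.9, 0.2, 0.0]
jerk = delta(accel_x)   # large values appear exactly at the sudden jump
```

Slow, smooth motion yields small differences; the sharp jump shows up as a large (positive, then negative) spike in the output.<br />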
<br />
Start with accel_osc and connect a delta object to one or more acceleration values, pick a threshold that corresponds to a satisfying level of jerkiness, and use threshold (Pd) or past (Max) to make a sound when you exceed the threshold. If you like, you can give the user additional control of the sound based on the direction and/or magnitude of the jerk.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
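The patch's logic can be summarized in a few lines (a hypothetical Python rendering with an arbitrary threshold; the real patch works on live OSC data, of course):<br />

```python
def jerk_events(accel, threshold=0.5):
    """Report (index, jerk) wherever |x[n] - x[n-1]| exceeds the threshold."""
    events = []
    prev = accel[0]
    for i, x in enumerate(accel[1:], start=1):
        j = x - prev
        if abs(j) > threshold:
            events.append((i, j))   # sign of j gives direction, |j| gives magnitude
        prev = x
    return events

# One violent shake among gentle wobbles triggers two events (up, then back down):
hits = jerk_events([0.0, 0.05, 0.1, 0.9, 0.15, 0.1])
```

Each event could then trigger a sound, with the sign and magnitude of the jerk mapped to sound parameters.<br />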
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as composed of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the Pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering with Pd's (one-pole) hip~ object<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering with Pd's (one-pole) lop~ object<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering with Pd's bp~ object<br />
* Band pass filtering with a cascade of Pd's bp~ objects<br />
(EDIT THESE FOR MAXMSP!!!)<br />
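To make the "cascade" idea concrete, here is a hypothetical one-pole lowpass and a four-stage cascade in Python (a simplified coefficient formula, not lop~'s exact internals): running a sine far above the cutoff through four stages attenuates it roughly four times as much, in dB, as one stage does.<br />

```python
import math

def lowpass(x, cutoff_hz, sr=44100):
    """One-pole lowpass, a rough stand-in for lop~ (simplified coefficient)."""
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sr)
    y, out = 0.0, []
    for v in x:
        y += a * (v - y)
        out.append(y)
    return out

def cascade(x, cutoff_hz, stages=4, sr=44100):
    """Four one-pole lowpasses in series, like four lop~ objects chained."""
    for _ in range(stages):
        x = lowpass(x, cutoff_hz, sr)
    return x

def gain_at(freq, filt, sr=44100, n=44100):
    """Output amplitude for a unit sine at freq, measured after the transient."""
    sine = [math.sin(2 * math.pi * freq * i / sr) for i in range(n)]
    y = filt(sine)
    return max(abs(v) for v in y[n // 2:])

# A 5 kHz sine through a 500 Hz lowpass: one stage vs. a four-stage cascade.
single = gain_at(5000, lambda s: lowpass(s, 500))
four = gain_at(5000, lambda s: cascade(s, 500))
```

With these numbers the single stage passes roughly a tenth of the amplitude, while the cascade passes roughly a ten-thousandth: the same rolloff, applied four times over.<br />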
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: this is because all of these filters are "linear and time-invariant.") This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high-frequency sine waves but has less effect on the volume of low-frequency sine waves, etc.<br />
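You can check that claim numerically with a toy one-pole lowpass in Python (a hypothetical filter, not lop~ itself): a sine in gives a sine out at the same frequency, only scaled, and a high sine is scaled down much more than a low one.<br />

```python
import math

def lowpass(x, a=0.05):
    """One-pole lowpass y[n] = y[n-1] + a*(x[n] - y[n-1]); a toy stand-in for lop~."""
    y, out = 0.0, []
    for v in x:
        y += a * (v - y)
        out.append(y)
    return out

def steady_amplitude(freq, sr=44100, n=44100):
    """Peak level of the filtered unit sine after the transient has died out."""
    y = lowpass([math.sin(2 * math.pi * freq * i / sr) for i in range(n)])
    return max(abs(v) for v in y[n // 2:])

low_gain = steady_amplitude(100.0)     # a low sine passes nearly untouched
high_gain = steady_amplitude(5000.0)   # a high sine comes out much quieter
```

Nothing new appears in the output in either case; the filter only rebalances what was already there.<br />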
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples (via the same QWERTY keyboard triggering mechanism) through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and see how highpass cuts all the sound (as with a low-frequency sine wave), while lowpass emphasizes the "bassiness" of the sound.<br />
<br />
== Filtering acceleration data to distinguish tilt from sudden motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio; then, holding the iPod in a neutral position, hit the calibrate button and wait a few seconds. You may have to repeat the calibration a few times until the tilt values are in the range [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answer is filtering. Caveat: although we believe filtering is the best way to solve this gesture-discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd/Max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to separate tilt (the low-frequency component) from sudden movements (which contain lots of high-frequency components).<br />
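The same crossover idea in a Python sketch (a crude, hypothetical version: the guppy patch filters audio-rate signals, while this toy works at a made-up 100 Hz control rate and derives the "sudden" branch as the residual of the lowpass rather than with a separate highpass):<br />

```python
import math

def one_pole(x, cutoff_hz, sr):
    """One-pole smoother: the lowpass ('tilt') branch of a crude crossover."""
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sr)
    y, out = 0.0, []
    for v in x:
        y += a * (v - y)
        out.append(y)
    return out

sr = 100.0    # toy control rate: 100 accelerometer values per second
n = 400
# A slow tilt toward 1.0 g, with one sharp "sudden motion" bump in the middle:
signal = [math.tanh(i / 100.0) for i in range(n)]
signal[200] += 1.0

tilt = one_pole(signal, 5.0, sr)                 # lowpass at 5 Hz -> tilt
sudden = [v - t for v, t in zip(signal, tilt)]   # residual -> sudden motion
```

The slow ramp lands almost entirely in the tilt output, while the spike pokes through almost entirely in the sudden-motion output: the two gestures live in different frequency bands.<br />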
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books.<br />
<br />
*Experiment with different cutoff frequencies for the crossover<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?<br />
<br />
== Make some (musically-expressive, gesture-controlled) Noise! ==</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Accelerometer_Lab&diff=8967250a Accelerometer Lab2009-10-14T00:31:20Z<p>Lukedahl: /* Filtering acceleration data to distinguish tilt from sudden motion */</p>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21th at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get connected and get oriented ==<br />
iPod Touches, like many newer portable electronic devices, have a 3-axis accelerometer in them, which allows designers to take into account both orientation of the device with respect to gravity as well as detecting physical gestures that are made with the phone.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to max or pd, where we will process the data and make sound. TouchOSC is installed on the iPods available to use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== getting the iPod to talk to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network. <br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc.pd (INSERT LINK AND MAX PATCH HERE), and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* use printing to examine the incoming OSC messages.<br />
<br />
=== getting oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelermeter data is discrete in time (i.e. we get one value every some number of milliseconds), we can approximate derivation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") You can use the included delta abstraction, which simply returns the difference between subsequent input values. <br />
<br />
Start with accel+osc and connect a delta object to one or more acceleration values, pick a threshold that corresponds to a satisfying level of jerkiness, and use threshold (Pd) or past (max) to make a sound when you exceed the threshold. You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being comprised of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering with Pd's (one-pole) hip~ object<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering with Pd's (one-pole) lop~ object<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering with Pd's bp~ object<br />
* Band pass filtering with a cascade of Pd's bp~ objects<br />
(EDIT THESE FOR MAXMSP!!!)<br />
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is because all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high frequency sine waves but has less effect on the volume of low frequency sine waves, etc.<br />
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples (via the same QWERTY keyboard triggering mechanism) through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and see how highpass cuts all the sound (as with a low-frequency sine wave), while lowpass emphasizes the "bassiness" of the sound.<br />
<br />
== Filtering acceleration data to distinguish tilt from sudden motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio, and then holding the iPod at a neutral position hit the calibrate button and wait a few seconds. You may have to do the callibration a few times until the tilt values are in the range of [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answering is with filtering. Caveat: although we believe filtering is the best way to solve this gesture discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd/max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low frequency component) from sudden movements (which have lots of high frequency components).<br />
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books.<br />
<br />
*Experiment with different cutoff frequencies for the crossover<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?<br />
<br />
== Make some (musically-expressive, gesture-controlled) Noise! ==</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Accelerometer_Lab&diff=8966250a Accelerometer Lab2009-10-14T00:29:32Z<p>Lukedahl: /* Get connected and get oriented */</p>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21th at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get connected and get oriented ==<br />
iPod Touches, like many newer portable electronic devices, have a 3-axis accelerometer in them, which allows designers to take into account both orientation of the device with respect to gravity as well as detecting physical gestures that are made with the phone.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to max or pd, where we will process the data and make sound. TouchOSC is installed on the iPods available to use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
=== getting the iPod to talk to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network. <br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc.pd (INSERT LINK AND MAX PATCH HERE), and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* use printing to examine the incoming OSC messages.<br />
<br />
=== getting oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelermeter data is discrete in time (i.e. we get one value every some number of milliseconds), we can approximate derivation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") You can use the included delta abstraction, which simply returns the difference between subsequent input values. <br />
<br />
Start with accel+osc and connect a delta object to one or more acceleration values, pick a threshold that corresponds to a satisfying level of jerkiness, and use threshold (Pd) or past (max) to make a sound when you exceed the threshold. You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being comprised of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering with Pd's (one-pole) hip~ object<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering with Pd's (one-pole) lop~ object<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering with Pd's bp~ object<br />
* Band pass filtering with a cascade of Pd's bp~ objects<br />
(EDIT THESE FOR MAXMSP!!!)<br />
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is because all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high frequency sine waves but has less effect on the volume of low frequency sine waves, etc.<br />
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples (via the same QWERTY keyboard triggering mechanism) through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and see how highpass cuts all the sound (as with a low-frequency sine wave), while lowpass emphasizes the "bassiness" of the sound.<br />
<br />
== Filtering acceleration data to distinguish tilt from sudden motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio, and then holding the iPod at a neutral position hit the calibrate button and wait a few seconds. You may have to do the callibration a few times until the tilt values are in the range of [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answering is with filtering. Caveat: although we believe filtering is the best way to solve this gesture discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd/max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low frequency component) from sudden movements (which have lots of high frequency components).<br />
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books.<br />
<br />
*Experiment with different cutoff frequencies for the crossover<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Accelerometer_Lab&diff=8965250a Accelerometer Lab2009-10-14T00:29:06Z<p>Lukedahl: /* Get connected and get oriented */</p>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21th at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get connected and get oriented ==<br />
iPod Touches, like many newer portable electronic devices, have a 3-axis accelerometer in them, which allows designers to take into account both orientation of the device with respect to gravity as well as detecting physical gestures that are made with the phone.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to max or pd, where we will process the data and make sound. TouchOSC is installed on the iPods available to use in this lab.<br />
<br />
(If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.)<br />
<br />
<br />
=== getting the iPod to talk to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network. <br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc.pd (INSERT LINK AND MAX PATCH HERE), and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* use printing to examine the incoming OSC messages.<br />
<br />
=== getting oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelermeter data is discrete in time (i.e. we get one value every some number of milliseconds), we can approximate derivation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") You can use the included delta abstraction, which simply returns the difference between subsequent input values. <br />
<br />
Start with accel+osc and connect a delta object to one or more acceleration values, pick a threshold that corresponds to a satisfying level of jerkiness, and use threshold (Pd) or past (max) to make a sound when you exceed the threshold. You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being comprised of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering with Pd's (one-pole) hip~ object<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering with Pd's (one-pole) lop~ object<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering with Pd's bp~ object<br />
* Band pass filtering with a cascade of Pd's bp~ objects<br />
(EDIT THESE FOR MAXMSP!!!)<br />
<br />
Play with this patch to get a feeling of the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is because all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high frequency sine waves but has less effect on the volume of low frequency sine waves, etc.<br />
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples (via the same QWERTY keyboard triggering mechanism) through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and see how highpass cuts all the sound (as with a low-frequency sine wave), while lowpass emphasizes the "bassiness" of the sound.<br />
<br />
== Filtering acceleration data to distinguish tilt from sudden motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio, and then holding the iPod at a neutral position hit the calibrate button and wait a few seconds. You may have to do the callibration a few times until the tilt values are in the range of [-1,1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answering is with filtering. Caveat: although we believe filtering is the best way to solve this gesture discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd/max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low frequency component) from sudden movements (which have lots of high frequency components).<br />
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books.<br />
<br />
*Experiment with different cutoff frequencies for the crossover.<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Accelerometer_Lab&diff=8964250a Accelerometer Lab2009-10-14T00:27:48Z<p>Lukedahl: /* Filtering acceleration data to distinguish tilt from sudden motion */</p>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21st at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get connected and get oriented ==<br />
iPod Touches, like many newer portable electronic devices, contain a 3-axis accelerometer, which lets designers sense both the orientation of the device with respect to gravity and the physical gestures made with it.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to Max or Pd, where we will process the data and make sound. TouchOSC is installed on the iPods available for use in this lab.<br />
<br />
If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.<br />
<br />
<br />
=== getting the iPod to talk to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network. <br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc.pd (INSERT LINK AND MAX PATCH HERE), and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* Use printing to examine the incoming OSC messages.<br />
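For the curious, the OSC messages you print are just small binary packets: a null-padded address string, a null-padded type-tag string, then big-endian arguments. Here is a minimal Python encoder/decoder for float-only messages (the /accxyz address is how TouchOSC typically labels its accelerometer data, but check your print output for the exact address your version sends):

```python
import struct

def pad4(b):
    """Pad bytes to a multiple of 4 with NULs, as OSC requires."""
    return b + b"\x00" * (-len(b) % 4)

def make_osc(address, floats):
    """Build a minimal OSC message: padded address, padded type tags, big-endian float32 args."""
    tags = "," + "f" * len(floats)
    return (pad4(address.encode() + b"\x00")
            + pad4(tags.encode() + b"\x00")
            + b"".join(struct.pack(">f", f) for f in floats))

def parse_osc(packet):
    """Parse the address and float arguments back out of such a message."""
    end = packet.index(b"\x00")
    address = packet[:end].decode()
    rest = packet[(end + 4) & ~3:]      # skip past the NUL-padded address
    tag_end = rest.index(b"\x00")
    ntags = tag_end - 1                 # drop the leading ','
    args = rest[(tag_end + 4) & ~3:]    # skip past the NUL-padded type tags
    return address, list(struct.unpack(">" + "f" * ntags, args[:4 * ntags]))

pkt = make_osc("/accxyz", [0.01, -0.98, 0.12])   # a TouchOSC-style accelerometer message
print(parse_osc(pkt))
```

Libraries (e.g. python-osc, or the OSC objects in Pd/Max) handle all of this for you; the point is just that there is no magic in the packets.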
<br />
=== getting oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelerometer data is discrete in time (i.e. we get one value every so many milliseconds), we can approximate differentiation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") You can use the included delta abstraction, which simply returns the difference between successive input values. <br />
<br />
Start with accel_osc and connect a delta object to one or more acceleration values, pick a threshold that corresponds to a satisfying level of jerkiness, and use threshold (Pd) or past (Max) to make a sound when you exceed the threshold. You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
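In plain Python the whole detector is just a difference plus a threshold (the function names here are made up for illustration; in the patch you wire delta into threshold or past instead):

```python
def delta(values):
    """One-zero highpass: the difference between successive samples (a discrete derivative)."""
    return [b - a for a, b in zip(values, values[1:])]

def jerk_events(accel, threshold):
    """Report the indices where the jump in acceleration exceeds the threshold."""
    return [i + 1 for i, d in enumerate(delta(accel)) if abs(d) > threshold]

# Slowly drifting acceleration with one sudden jerk at index 5:
accel = [0.00, 0.02, 0.03, 0.05, 0.06, 0.90, 0.88, 0.86]
print(jerk_events(accel, 0.5))   # -> [5]
```

Note that the slow drift never trips the detector, no matter how far the values wander; only the sudden jump does.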
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being composed of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering with Pd's (one-pole) hip~ object<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering with Pd's (one-pole) lop~ object<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering with Pd's bp~ object<br />
* Band pass filtering with a cascade of Pd's bp~ objects<br />
(EDIT THESE FOR MAXMSP!!!)<br />
<br />
Play with this patch to get a feel for the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
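You can check the "all frequencies" claim numerically: the average power of white noise comes out about the same at a low and a high frequency. A small pure-Python sanity check:

```python
import math, random

random.seed(1)

def power_at(x, f):
    """Estimate the power of signal x at frequency f (cycles/sample) via correlation."""
    n = len(x)
    c = sum(v * math.cos(2 * math.pi * f * i) for i, v in enumerate(x))
    s = sum(v * math.sin(2 * math.pi * f * i) for i, v in enumerate(x))
    return (c * c + s * s) / n

# Average the power at a low and a high frequency over many white-noise blocks:
low, high = 0.0, 0.0
for _ in range(200):
    noise = [random.gauss(0, 1) for _ in range(256)]
    low += power_at(noise, 0.02)
    high += power_at(noise, 0.40)
print(round(low / high, 2))  # roughly 1: white noise is equally strong everywhere
```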
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
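To see why the cascade has "four times as much" effect, note that gains multiply in series, so attenuations add in dB. Here is a Python check using the textbook one-pole lowpass recurrence (an assumption about lop~'s internals, but the cascade arithmetic holds for any filter):

```python
import math

def lowpass(x, c):
    """One-pole lowpass: y[n] = y[n-1] + c * (x[n] - y[n-1])."""
    y, out = 0.0, []
    for v in x:
        y += c * (v - y)
        out.append(y)
    return out

def amplitude(x):
    """Steady-state sine amplitude from the RMS of the signal's tail."""
    tail = x[len(x) // 2:]
    return math.sqrt(2 * sum(v * v for v in tail) / len(tail))

f, c = 0.05, 0.2
sine = [math.sin(2 * math.pi * f * n) for n in range(40000)]

once = lowpass(sine, c)
four = lowpass(lowpass(lowpass(once, c), c), c)   # a cascade of four identical stages

g1, g4 = amplitude(once), amplitude(four)
db1, db4 = 20 * math.log10(g1), 20 * math.log10(g4)
print(round(db4 / db1, 2))  # -> about 4.0: the cascade cuts four times as many dB
```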
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is that all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high-frequency sine waves but has less effect on the volume of low-frequency sine waves, etc.<br />
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples (via the same QWERTY keyboard triggering mechanism) through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and see how highpass cuts all the sound (as with a low-frequency sine wave), while lowpass emphasizes the "bassiness" of the sound.<br />
<br />
== Filtering acceleration data to distinguish tilt from sudden motion == <br />
*Relaunch TouchOSC and open the guppy patch.<br />
*Turn on OSC in the patch, then turn on audio, and then, holding the iPod in a neutral position, hit the calibrate button and wait a few seconds. You may have to do the calibration a few times until the tilt values are in the range [-1, 1].<br />
*Now move the iPod around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answer is filtering. Caveat: although we believe filtering is the best way to solve this gesture-discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd/Max's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low-frequency component) from sudden movements (which have lots of high-frequency components).<br />
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books.<br />
<br />
*Experiment with different cutoff frequencies for the crossover.<br />
*Examine briefly the stillness detector in the lower right corner. How does this work?</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Accelerometer_Lab&diff=8963250a Accelerometer Lab2009-10-14T00:22:46Z<p>Lukedahl: /* Audio Filtering */</p>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21st at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get connected and get oriented ==<br />
iPod Touches, like many newer portable electronic devices, contain a 3-axis accelerometer, which lets designers sense both the orientation of the device with respect to gravity and the physical gestures made with it.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to Max or Pd, where we will process the data and make sound. TouchOSC is installed on the iPods available for use in this lab.<br />
<br />
If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.<br />
<br />
<br />
=== getting the iPod to talk to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network. <br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc.pd (INSERT LINK AND MAX PATCH HERE), and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* Use printing to examine the incoming OSC messages.<br />
<br />
=== getting oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelerometer data is discrete in time (i.e. we get one value every so many milliseconds), we can approximate differentiation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") You can use the included delta abstraction, which simply returns the difference between successive input values. <br />
<br />
Start with accel_osc and connect a delta object to one or more acceleration values, pick a threshold that corresponds to a satisfying level of jerkiness, and use threshold (Pd) or past (Max) to make a sound when you exceed the threshold. You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being composed of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
This section does not use the iPod. You may want to quit TouchOSC to save battery life.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
* No filtering<br />
* High pass filtering with Pd's (one-pole) hip~ object<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering with Pd's (one-pole) lop~ object<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering with Pd's bp~ object<br />
* Band pass filtering with a cascade of Pd's bp~ objects<br />
(EDIT THESE FOR MAXMSP!!!)<br />
<br />
Play with this patch to get a feel for the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is that all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high-frequency sine waves but has less effect on the volume of low-frequency sine waves, etc.<br />
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples (via the same QWERTY keyboard triggering mechanism) through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and see how highpass cuts all the sound (as with a low-frequency sine wave), while lowpass emphasizes the "bassiness" of the sound.<br />
<br />
== Filtering acceleration data to distinguish tilt from sudden motion == <br />
<br />
Reconnect your accelerometer and open the Pd patch pd/guppy-help.pd. Follow the three numbered steps, including the calibration. To calibrate, return the accelerometer board to its "upright and locked" position, press the "calibrate" button, and leave it alone for about a second. Now move the accelerometer around and note that the tilt appears pretty much exclusively in the "tilt" outputs, and that the sudden motion appears pretty much exclusively in the "sudden motion" outputs. Amazing! How do they do that?<br />
<br />
The answer is filtering. Caveat: although we believe filtering is the best way to solve this gesture-discrimination problem, this particular implementation is somewhat of a hack. The reason is that all of Pd's filtering tools work only on audio signals, so the guppy patch (in particular, the accel-xover subpatch) converts the incoming OSC messages into audio signals, smooths them out, then lowpasses and highpasses them (at 5 and 20 Hertz, respectively) to differentiate tilt (the low-frequency component) from sudden movements (which have lots of high-frequency components).<br />
<br />
The moral of the story is that control signals have frequency components too, just like audio signals, and you've got a lot more power if you can think about them in the "frequency domain", just like it's powerful to think about audio signals in the frequency domain. Great. Now go read Julius Smith's books.</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Accelerometer_Lab&diff=8962250a Accelerometer Lab2009-10-14T00:06:08Z<p>Lukedahl: /* getting the iPod to talk to your computer via Open Sound Control */</p>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21st at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get connected and get oriented ==<br />
iPod Touches, like many newer portable electronic devices, contain a 3-axis accelerometer, which lets designers sense both the orientation of the device with respect to gravity and the physical gestures made with it.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to Max or Pd, where we will process the data and make sound. TouchOSC is installed on the iPods available for use in this lab.<br />
<br />
If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.<br />
<br />
<br />
=== getting the iPod to talk to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network. <br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc.pd (INSERT LINK AND MAX PATCH HERE), and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* Use printing to examine the incoming OSC messages.<br />
<br />
=== getting oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelerometer data is discrete in time (i.e. we get one value every so many milliseconds), we can approximate differentiation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") You can use the included delta abstraction, which simply returns the difference between successive input values. <br />
<br />
Start with accel_osc and connect a delta object to one or more acceleration values, pick a threshold that corresponds to a satisfying level of jerkiness, and use threshold (Pd) or past (Max) to make a sound when you exceed the threshold. You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being composed of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
<br />
* No filtering<br />
* High pass filtering with Pd's (one-pole) hip~ object<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering with Pd's (one-pole) lop~ object<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering with Pd's bp~ object<br />
* Band pass filtering with a cascade of Pd's bp~ objects<br />
(EDIT THESE FOR MAXMSP!!!)<br />
<br />
Play with this patch to get a feel for the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is that all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high-frequency sine waves but has less effect on the volume of low-frequency sine waves, etc.<br />
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples (via the same QWERTY keyboard triggering mechanism) through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and see how highpass cuts all the sound (as with a low-frequency sine wave), while lowpass emphasizes the "bassiness" of the sound.</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Accelerometer_Lab&diff=8961250a Accelerometer Lab2009-10-14T00:05:54Z<p>Lukedahl: /* getting the iPod to talk to your computer via Open Sound Control */</p>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21st at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get connected and get oriented ==<br />
iPod Touches, like many newer portable electronic devices, contain a 3-axis accelerometer, which lets designers sense both the orientation of the device with respect to gravity and the physical gestures made with it.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to Max or Pd, where we will process the data and make sound. TouchOSC is installed on the iPods available for use in this lab.<br />
<br />
If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.<br />
<br />
<br />
=== getting the iPod to talk to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network. <br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local'.)<br />
* Set the outgoing port to 8000.<br />
* Open accel_osc.pd (INSERT LINK AND MAX PATCH HERE), and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* Use printing to examine the incoming OSC messages.<br />
<br />
=== getting oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelerometer data is discrete in time (i.e. we get one value every so many milliseconds), we can approximate differentiation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") You can use the included delta abstraction, which simply returns the difference between successive input values. <br />
<br />
Start with accel_osc and connect a delta object to one or more acceleration values, pick a threshold that corresponds to a satisfying level of jerkiness, and use threshold (Pd) or past (Max) to make a sound when you exceed the threshold. You can give the user additional control of the sound based on the direction and/or the magnitude of the jerk, if you like.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking about (audio) signals as being composed of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
<br />
* No filtering<br />
* High pass filtering with Pd's (one-pole) hip~ object<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering with Pd's (one-pole) lop~ object<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering with Pd's bp~ object<br />
* Band pass filtering with a cascade of Pd's bp~ objects<br />
(EDIT THESE FOR MAXMSP!!!)<br />
<br />
Play with this patch to get a feel for the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: the reason is that all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high-frequency sine waves but has less effect on the volume of low-frequency sine waves, etc.<br />
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples (via the same QWERTY keyboard triggering mechanism) through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and see how highpass cuts all the sound (as with a low-frequency sine wave), while lowpass emphasizes the "bassiness" of the sound.</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Accelerometer_Lab&diff=8960250a Accelerometer Lab2009-10-14T00:03:31Z<p>Lukedahl: /* Audio Filtering */</p>
<hr />
<div><font size=5>Lab 4: Accelerometers, Audio Filters, and (optionally) Multitouch</font><br><br />
Due on Wednesday, October 21st at 5PM<br />
<br />
For this lab you need an iPod Touch (loaners are available) or an iPhone running TouchOSC, and Max/MSP or Pd on a computer.<br />
<br />
== Get connected and get oriented ==<br />
iPod Touches, like many newer portable electronic devices, contain a 3-axis accelerometer, which lets designers sense both the orientation of the device with respect to gravity and the physical gestures made with it.<br />
<br />
For this lab, instead of writing our own iPod applications (the subject of an entire course), we will use an iPod app called TouchOSC to send accelerometer data from the iPod to Max or Pd, where we will process the data and make sound. TouchOSC is installed on the iPods available for use in this lab.<br />
<br />
If you prefer to use your own iPod or iPhone, you are welcome to use one of the other apps which perform similar functions. Here is a review of some options: http://heuristicmusic.com/blog/?p=124.<br />
<br />
<br />
=== getting the iPod to talk to your computer via Open Sound Control ===<br />
* Make sure your computer and iPod are on the same network. <br />
* Find out the name or IP address of your computer. <br />
* On the iPod start TouchOSC and press the small 'i' to get to preferences. Select 'Network' and set Host to the name of your computer (e.g. 'cmn37.stanford.edu' or 'mylaptop.local').<br />
* Open accel_osc.pd (INSERT LINK AND MAX PATCH HERE), and make sure that accelerometer messages from TouchOSC are being received in Pd/Max.<br />
* Use the print object to examine the incoming OSC messages.<br />
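Under the hood, each TouchOSC accelerometer reading arrives as a small binary OSC 1.0 message: a null-padded address string (commonly /accxyz for the accelerometer — verify the actual address with the print step above), a ",fff" type-tag string, and three big-endian 32-bit floats. A minimal decoding sketch (not part of the lab; the lab patches do this for you):

```python
import struct

def parse_osc(packet):
    """Parse one OSC 1.0 message: padded address string, ',...' type tags,
    then big-endian 32-bit float arguments (the only type handled here)."""
    def take_string(buf, pos):
        end = buf.index(b"\x00", pos)
        s = buf[pos:end].decode("ascii")
        return s, (end + 4) & ~3      # strings are null-padded to 4-byte bounds
    addr, pos = take_string(packet, 0)
    tags, pos = take_string(packet, pos)
    args = []
    for t in tags[1:]:                # skip the leading ','
        if t == "f":
            args.append(struct.unpack_from(">f", packet, pos)[0])
            pos += 4
    return addr, args

# Build a message like TouchOSC's accelerometer output, then parse it back:
msg = b"/accxyz\x00" + b",fff\x00\x00\x00\x00" + struct.pack(">fff", 0.0, -1.0, 0.03)
addr, (x, y, z) = parse_osc(msg)
print(addr, x, y, z)
```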
<br />
=== getting oriented ===<br />
* Look at the acceleration values and graphs as you move the iPod around.<br />
* What are the units that acceleration is reported in?<br />
* Figure out the direction and orientation of each (x,y,z) accelerometer axis. How do you do this? <br />
* Draw a picture of the x,y, and z axes and their orientation as they relate to the iPod. (For lab submission you can include this picture or describe verbally what you discover.)<br />
<br />
== Naive Gesture Detection and Thresholding ==<br />
Here's a way to make a simple gesture detector. One obvious difference between fast jerky movements and slow gradual movements is sudden jumps in the acceleration values. As discussed in lecture, jerk is the derivative of acceleration.<br />
<br />
Since our accelerometer data is discrete in time (i.e. we get one value every several milliseconds), we can approximate differentiation by taking the difference between successive values. (Technically, this is a "one-zero highpass filter.") You can use the included delta abstraction, which simply returns the difference between successive input values. <br />
<br />
Start with accel_osc, connect a delta object to one or more acceleration values, pick a threshold that corresponds to a satisfying level of jerkiness, and use threshold (Pd) or past (Max) to make a sound when you exceed the threshold. If you like, you can give the user additional control of the sound based on the direction and/or magnitude of the jerk.<br />
<br />
Congratulations, you have now written a jerk detector.<br />
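The same delta-plus-threshold logic can be sketched in Python (function names are mine, not Pd/Max object names): difference successive acceleration samples to approximate jerk, then report where the magnitude exceeds a threshold.

```python
def delta(samples):
    """One-zero highpass: difference between successive input values."""
    prev = None
    out = []
    for x in samples:
        out.append(0.0 if prev is None else x - prev)
        prev = x
    return out

def detect_jerks(accel, threshold):
    """Return the indices where |jerk| exceeds the threshold."""
    return [i for i, d in enumerate(delta(accel)) if abs(d) > threshold]

# Slow tilting produces small differences; a sudden flick produces a big jump:
accel = [0.0, 0.01, 0.02, 0.03, 0.9, 0.88, 0.87]
print(detect_jerks(accel, threshold=0.5))  # only the sudden jump triggers
```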
<br />
== Audio Filtering ==<br />
The purpose of this part of the lab is to get a sense for the effect of different kinds of filters, and to start thinking of (audio) signals as being composed of frequency components. Don't worry, we'll come back to accelerometers later.<br />
<br />
Open the pd patch <pre>audio-filters/filter-demo</pre><br />
<br />
This patch allows you to select one of four input sources (white noise, a sine wave, a pair of sine waves, or a collection of oud samples) and pass the sound through one of seven possible filters:<br />
<br />
* No filtering<br />
* High pass filtering with Pd's (one-pole) hip~ object<br />
* High pass filtering with a "cascade" of four hip~ objects<br />
* Low pass filtering with Pd's (one-pole) lop~ object<br />
* Low pass filtering with a cascade of four lop~ objects<br />
* Band pass filtering with Pd's bp~ object<br />
* Band pass filtering with a cascade of Pd's bp~ objects<br />
(EDIT THESE FOR MAXMSP!!!)<br />
<br />
Play with this patch to get a feel for the effect of different kinds of filters on different input sounds.<br />
<br />
Start with the white noise source. (Be very careful with the output gain! White noise is extremely loud per unit of amplitude!) This is the best input for hearing the differences between different kinds of filters because it contains all frequencies. (It's called "white" noise by analogy to white light, which contains all frequencies, i.e., all colors of light.) Turn the master volume and/or your headphones way down, then select input source zero (white noise) and filter type zero (unfiltered). Beautiful, huh?<br />
<br />
Now step through the other six filter types, playing with the parameters of each. Sweep the high-pass cutoff frequency. Sweep the cascaded high pass cutoff frequency and note that the four filters have "four times as much" effect on the sound as the single hip~ object. Ditto for the low pass objects. For the band pass, start with the default Q factor of 1 and sweep the center frequency. Then make the Q factor small and sweep the frequency again. Then make the Q factor large and sweep the frequency again. Now you know what these filters do.<br />
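The "four times as much" claim can be checked numerically: cascading four copies of an identical linear filter quadruples the attenuation in dB at every frequency. Below is a rough Python stand-in for lop~ (the one-pole coefficient formula is one common convention, not necessarily Pd's exact one), measuring the gain of one stage versus four cascaded stages on a test sine well above the cutoff.

```python
import math

def one_pole_lowpass(x, cutoff_hz, fs):
    """One-pole lowpass (in the spirit of lop~): y[n] = y[n-1] + k*(x[n] - y[n-1])."""
    k = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / fs)
    y, out = 0.0, []
    for s in x:
        y += k * (s - y)
        out.append(y)
    return out

def gain_db(filt, freq_hz, fs, stages=1, n=8000):
    """Measure the steady-state gain of `stages` cascaded copies of `filt` on a sine."""
    y = [math.sin(2 * math.pi * freq_hz * i / fs) for i in range(n)]
    for _ in range(stages):
        y = filt(y)
    tail = y[n // 2:]                    # skip the start-up transient
    amp = (max(tail) - min(tail)) / 2    # steady-state output amplitude
    return 20 * math.log10(amp)

fs = 44100

def f(sig):
    return one_pole_lowpass(sig, cutoff_hz=500, fs=fs)

db1 = gain_db(f, 4000, fs, stages=1)
db4 = gain_db(f, 4000, fs, stages=4)
print(round(db1, 1), round(db4, 1))  # the cascade attenuates ~4x as many dB
```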
<br />
Repeat all of the above on the single sine wave. Note that no matter what filtering you do, all you change is the gain (and phase) of the sine wave. (Geek moment: this is because all of these filters are "linear and time-invariant".) This is very important: filters don't add anything; they just change the balance of what's already there. Note that lowpass filtering reduces the volume of high-frequency sine waves but has less effect on the volume of low-frequency sine waves, etc.<br />
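The "filters don't add anything" point can be made concrete by probing the spectrum: a linear filter leaves a pure sine's energy at its original frequency, while a nonlinear operation like hard clipping creates energy at new frequencies (harmonics). A sketch, with an arbitrary one-pole coefficient:

```python
import math, cmath

def dft_mag(x, freq_hz, fs):
    """Magnitude of the DFT of x probed at a single frequency."""
    n = len(x)
    return abs(sum(s * cmath.exp(-2j * math.pi * freq_hz * i / fs)
                   for i, s in enumerate(x))) / n

fs, f0, n = 8000, 100, 8000  # one second of a 100 Hz sine
x = [math.sin(2 * math.pi * f0 * i / fs) for i in range(n)]

# A one-pole lowpass (linear): the output's energy stays at 100 Hz.
k, y, lowpassed = 0.1, 0.0, []
for s in x:
    y += k * (s - y)
    lowpassed.append(y)

# Hard clipping (nonlinear): energy appears at odd harmonics like 300 Hz.
clipped = [max(-0.5, min(0.5, s)) for s in x]

print(dft_mag(lowpassed, 300, fs))  # ~0: the filter added nothing at 300 Hz
print(dft_mag(clipped, 300, fs))    # clearly nonzero: clipping adds harmonics
```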
<br />
Now try this on a pair of sine waves spaced pretty widely apart in frequency (for example, 100 Hz and 2000 Hz). Hear how the different filters affect the relative volumes of the two sine waves.<br />
<br />
Finally, play some of the oud samples (via the same QWERTY keyboard triggering mechanism) through various filters. Experiment with transposition and how it interacts with filtering. In particular, transpose the samples down by a large amount and hear how the highpass filter cuts nearly all of the sound (as with a low-frequency sine wave), while the lowpass filter emphasizes the "bassiness" of the sound.</div>
Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=Gesture_Signal_Processing_and_Music&diff=8949Gesture Signal Processing and Music2009-10-12T07:43:06Z<p>Lukedahl: /* Accelerometers: How to analyze data */</p>
<hr />
<div>== Gesture and Measuring Movement ==<br />
* What is a gesture? Physical. Bodily: hands, face, posture. Non-verbal. What do gestures communicate? Is a gesture more like a button or a handle?<br />
* What is a musical gesture?<br />
<br />
===Techniques for measuring human movement===<br />
* Camera-based<br />
** 2D image processing<br />
** 3D Motion Capture using many cameras<br />
* Goniometry: measuring joint angles with physically attached sensors<br />
** Laetitia Sonami's Lady's Glove: http://www.youtube.com/watch?v=HYTrNOmSRSo&feature=player_embedded<br />
* Electromyography: measures muscle activation by measuring electrical activity<br />
** Pamela Z's controller (which I believe uses EMG): http://www.pamelaz.com/VociMov.html<br />
* With accelerometers attached to the body (e.g. wiimote, iphone)<br />
<br />
===Really basic physics===<br />
* Position: x<br />
* Velocity: v = dx/dt<br />
* Acceleration: a = dv/dt<br />
** Proportional to force (Newton's 2nd law of motion: F = ma )<br />
* Jerk: j = da/dt<br />
<br />
== Interlude #1: Open Sound Control ==<br />
* Open Sound Control: http://opensoundcontrol.org/<br />
* iPod Touch has 3 axis accelerometer<br />
* We will use TouchOSC to transmit accelerometer data as OSC messages over wifi to Max/Pd<br />
** http://hexler.net/software/touchosc<br />
<br />
== Interlude #2: Filtering of Audio Signals ==<br />
* Can view any sound as being composed of sinusoidal waves at different frequencies<br />
* A filter removes energy at only certain frequencies:<br />
**A low-pass filter removes high freqs. <br />
**A high-pass filter removes low freqs. <br />
** A band-pass only lets certain middle frequencies through.<br />
* Is differentiation a low-pass or high-pass filter? What about integration?<br />
<br />
== Back to movement and music ==<br />
=== Accelerometers: How to analyze data ===<br />
* How to distinguish orientation vs movement?<br />
* Can get jerk via difference (as approximation of differentiation)<br />
* How to get velocity from acceleration (or position from velocity)?<br />
** True integrator (will eventually overflow): <pre>y = y_prev + x</pre><br />
** Leaky integrator as approximation of integration (with 0 < a < 1): <pre>y = a*y_prev + (1-a)*x</pre><br />
* Thresholding - detecting specific events<br />
** max: past. pd: threshold<br />
* Classification through training of Machine Learning algorithms<br />
**e.g. Rebecca Fiebrink's Wekinator: http://wekinator.cs.princeton.edu/<br />
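The leaky integrator above, applied sample by sample, gives a bounded velocity-like estimate from acceleration data; a minimal sketch (the coefficient a = 0.9 is an arbitrary choice — values closer to 1 leak more slowly):<br />

```python
def leaky_integrate(samples, a=0.9):
    """y = a*y_prev + (1-a)*x -- a one-pole leaky integrator."""
    y, out = 0.0, []
    for x in samples:
        y = a * y + (1 - a) * x
        out.append(y)
    return out

# Unlike the true integrator, a constant input settles toward that constant
# instead of growing without bound -- safe for long-running patches.
```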
<br />
=== Mapping to Sound ===<br />
* How do we create a musically expressive instrument from sensor data?<br />
* We need to map the data from one domain to another appropriately:<br />
** linear: to map an input s from range [a b] to [c d] use <pre>z = (s-a)*(d-c)/(b-a) + c</pre><br />
** exponential: more appropriate in many situations, since perceptual qualities are often logarithmic.<br />
** any arbitrary function<br />
* max: scale, pd: ???<br />
* How do we decide the best mapping? (i.e. How do we select what physical parameter controls which musical parameter?)<br />
**this is an art, not a science!</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Microcontroller_%26_Sensors_Lab&diff=8786250a Microcontroller & Sensors Lab2009-10-02T01:03:10Z<p>Lukedahl: /* Install Firmata onto your Arduino Nano */</p>
<hr />
<div><font size=5>Lab 2: Microcontroller and Sensors</font><br><br />
Due on Wednesday, October 7th at 5PM<br />
<br />
For this lab you need your [[MaxKit]], and Max/MSP or PD on a computer.<br />
<br />
== Download software ==<br />
<br />
* Make a Lab2/ folder for this lab.<br />
<br />
* Download the following:<br />
** [http://arduino.cc/en/Main/Software Arduino Software]<br />
*** If you have never used an Arduino with your computer before, you need to install the FTDI driver that comes with the Arduino software; this enables your computer to recognize the USB serial communication chip on the Arduino.<br />
** Firmata [http://firmata.org/wiki/Download firmware ] (Use Firmata-2.0beta3.zip)<br />
** Either [http://ccrma.stanford.edu/courses/250a/250a_maxuino-005.zip Maxuino] or [http://ccrma.stanford.edu/courses/250a/250a_pduino-0.5.zip Pduino]. (Do this even if you have already downloaded Maxuino or Pduino from the Firmata site, since we have included lab-specific patches.)<br />
<br />
== Mount your Arduino Nano on your breadboard==<br />
<br />
We will be powering the Nano and the breadboard with current from the USB port, which is good for up to 500 mA at 5 V ±5%, probably enough for most input circuits, although not enough if you plan to run a lot of LEDs or motors.<br />
* The Nano should sit at the bottom of the breadboard, so that the pins lie in rows 49-64 on either side.<br />
* Using jumper wires, connect the row 52 pin (GND) on the left side of the Nano to the blue ground (GND) rail.<br />
* Connect the row 52 pin (+5V) on the right side of the Nano to the red Power rail.<br />
* Use jumper wires to connect the power and GND rails on the left side of the breadboard to the right.<br />
<br />
[[Image:Lab2-2.jpg]]<br />
<br />
== Install Firmata onto your Arduino Nano ==<br />
* Install the Arduino software<br />
** Unarchive the file, and move it to your applications folder.<br />
** Install the FTDI driver that comes with the Arduino software so that your computer will recognize the Arduino when it is plugged into the USB port.<br />
* Unarchive the Firmata firmware that you downloaded previously.<br />
* Connect your Arduino Nano to your computer using a USB cable.<br />
* Use the Arduino software program to open ''StandardFirmata.pde'', which lies inside the folder ''Firmata-2.0beta3:Firmata:examples:StandardFirmata''. ('''Do NOT open the version of StandardFirmata that lies directly within the pull-down menus of the Arduino software.''')<br />
* Use Tools->Board and Tools->Serial Port to select the Arduino Nano (Atmega 328) and USBserial tty port, then hit the Play button to verify and compile the program. <br />
* Upload the Firmata firmware to your Arduino Nano using the Upload button, the fourth square button from the left (the one with the sideways arrow).<br />
* Close the Arduino program. (This is important because it frees up the USB serial port so that Max or PD can talk to the Arduino board next.)<br />
<br />
== Buttons, Switches and LEDs ==<br />
<br />
* Build the circuit that is detailed in the following figures. Use components and jumpers to construct your circuits on the solderless bread-board.<br />
<br />
[[Image:Breadboard.png]]<br />
<br />
* Here's how to wire a simple 2-resistor circuit on the solderless bread-board (for example R1 = 10K, R2 = 10K):<br />
<br />
[[Image:Jumper.png]]<br />
<br />
What will A0 read?<br />
<br />
(How do you know what resistor you have? Use the [http://www.dannyg.com/examples/res2/resistor.htm resistor calculator]!)<br />
<br />
<br />
=== Build the Button and LED Circuit ===<br />
We'll start our tutorial with three simple light circuits. <br />
<br />
* In the first one, the LED is permanently on.<br />
* In the second, the LED only lights up when a button is pressed and a circuit is completed. <br />
* In the third example, we'll replace the manual switch with an Arduino pin (set to output mode), so we can control the LED from our program.<br />
<br />
[[Image:3Buttons.gif]]<br />
<br />
==== Power an LED (always on) ====<br />
<br />
[[Image:Lab2-3.jpg]]<br />
<br />
Build the circuit shown above on your breadboard. Use a 220 Ohm resistor (red, red, brown, gold). <br />
<br />
Because the LED is a diode, it has a roughly fixed voltage drop across its leads; driving it beyond that lets current (and heat) rise rapidly, causing the LED to fail prematurely. So! It is always important to have a current-limiting resistor in series with the LED.<br />
<br />
Another consequence of the LED being a diode is that it has directionality: the longer lead, the anode, should be connected towards power; the shorter lead, the cathode, towards ground. (In the photo, the longer lead has a bent "knee.")<br />
<br />
==== Make a light switch ====<br />
<br />
Next, we'll insert a switch into the circuit. The momentary switches in your kit are "normally open", meaning that the circuit is interrupted in the idle state, when the switch is not pressed. Pressing the switch closes the circuit until you let go again.<br />
<br />
[[Image:Lab2-4.jpg]]<br />
<br />
Use a multimeter to see what happens to the voltage on either side of the LED when you press the switch.<br />
<br />
==== Toggling LED with PD or Max ====<br />
<br />
[[Image:Lab2-5.jpg]]<br />
<br />
In the third example, we'll replace the manual switch with an Arduino pin (set to output mode), so we can control the LED from our program. The safe way to do this is to let the Arduino pin sink current - if we toggle the pin low, it acts as ground and current flows through the resistor and the LED as it did in the previous examples. When we take the pin high, to 5V, there is no potential difference and no current flows - the LED stays off.<br />
<br />
* In the ArduinoLab patch, press the "toggling LED with software" button in the upper right to preset the outputs properly. The patch expects you to connect the LED to digital pin 2 (D2).<br />
<br />
<br />
Optional: Try changing your patch so the light stays on when you press the mouse button, and stays off when you press it again. After that, change your patch so the light blinks on/off. Then, have your patch button switch the light between on and blinking.<br />
<br />
=== Sensing buttons in software ===<br />
<br />
We've used code to trigger output - what about the other direction, sensing physical input in code? Just as easy. Here is a simple switch circuit:<br />
<br />
[[Image: switch.png]] [[Image: Lab2-6.jpg]]<br />
<br />
When the switch is open, the Arduino pin (set to input mode) is pulled to 5V - in software, we'll read Arduino.HIGH. When the switch is closed, the voltage at the Arduino pin falls to 0V - in software, we'll read Arduino.LOW. The pull-up resistor holds the pin at 5V while the switch is open and limits the current through the circuit when it closes. In software, we can check the value of the pin and switch between graphics accordingly.<br />
<br />
* In the ArduinoLab patch, press the "sensing buttons in software" button to preset the outputs properly. The patch expects you to connect the switch to digital pin 4 (D4).<br />
<br />
=== Fading LEDs (optional) ===<br />
<br />
What about those "breathing" LEDs on Mac Powerbooks? The fading from bright to dim and back is done using pulse-width modulation (PWM). In essence, the LED is toggled on and off rapidly, say 1000 times a second, faster than your eye can follow. The percentage of time the LED is on (the duty cycle) controls the perceived brightness. To control an LED using PWM, you'll have to connect it to one of the pins that support PWM output - 9, 10 or 11 on the Arduino. Then write a patch that cycles the PWM values.<br />
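The relationship between duty cycle and perceived brightness can be sketched numerically. The 0-255 value range matches Arduino's 8-bit PWM; the code just models one on/off period:<br />

```python
def pwm_wave(duty, period=256):
    """One PWM period: 1 while the counter is below `duty`, else 0 (duty in 0-255)."""
    return [1 if i < duty else 0 for i in range(period)]

# The average value of the wave is duty/256 -- the perceived LED brightness.
wave = pwm_wave(64)
average = sum(wave) / len(wave)
```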
<br />
* In the ArduinoLab patch, press the "Fading LEDs" button to preset the outputs properly. The patch expects you to connect the LED to digital pins 9-11 (D9-11). <br />
* In your Arduino Kit, you have an RGB LED which has four leads (it's white when not lit); it's basically like 3 LEDs sharing the same ground. Use PWM and this [http://blog.ncode.ca/?p=38 pin out information] to make the LED cycle through a rainbow of colors. <br />
<br />
[[Image: Lab2-7.jpg]]<br />
<br />
== Using Analog Sensors ==<br />
<br />
Now we will work with the continuous input values provided by analog sensors - potentiometers, accelerometers, distance rangers, etc.<br />
<br />
=== Make a Light Dimmer ===<br />
<br />
In this example we'll build a light dimmer: a knob connected to a light so that when you turn the knob, the light increases or decreases in brightness. We'll use a potentiometer. The potentiometer has three terminals - the resistance between the first and the third terminal is constant (10k Ohms in our case). The resistance between terminals 1 and 2 (and between 2 and 3) varies as you turn the shaft of the potentiometer. If you apply 5V to terminal 1, connect terminal 3 to Ground, you will get a continuously varying voltage at terminal 2 as you turn the shaft (Why that is the case will become clearer once you've learned about voltage dividers further below).<br />
<br />
* Connect the middle pin of the potentiometer to analog input 0, the other two to +5V and ground.<br />
* Through a 220 Ohm resistor, connect an LED to pin 9 (anode, the long lead, to the resistor; cathode to pin 9) <br />
* In the ArduinoLab patch, press the "Light Dimmer" button to preset the outputs properly. The patch expects you to connect the LED to digital pin 9 (D9) and the potentiometer to analog pin 0 (A0). Use the PWM controls under output controls in the ArduinoLab patch to control the brightness of the LED.<br />
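Either way, the dimmer boils down to rescaling the 10-bit analog reading (0-1023) to the 8-bit PWM range (0-255) - the same linear mapping Max's ''scale'' object performs. A sketch:<br />

```python
def scale(s, in_lo, in_hi, out_lo, out_hi):
    """Linearly map s from [in_lo, in_hi] to [out_lo, out_hi]."""
    return (s - in_lo) * (out_hi - out_lo) / (in_hi - in_lo) + out_lo

# 10-bit analog reading -> 8-bit PWM value
pwm = int(scale(512, 0, 1023, 0, 255))
```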
<br />
[[Image: Lab2-8.jpg]]<br />
<br />
* Optional: hook up two LEDs, one red, one green. As you turn in one direction, red gets brighter; in the other, green gets brighter. In the middle, both are off.<br />
<br />
=== Drawing a graph of analog input ===<br />
Let's understand better what the values are that we are reading from the analog input. To do so, we will use a slider graph to show how the analog values are changing in time.<br />
<br />
Leave the potentiometer part of your circuit in place; you may take off the LED part if you want to. Use A0 or A1 to graph the analog input in the ArduinoLab patch.<br />
<br />
<br />
=== Thresholding with a Range Sensor (optional) ===<br />
<br />
Thresholding is the process of turning continuous data into a discrete yes/no decision.<br />
<br />
To learn about thresholding, we'll connect the IR range sensor. The circuit is trivial: just connect red to 5V, black to ground, and yellow to analog input A0.<br />
<br />
Take a look at the data the sensor returns with sensor_graph_02 - when the field of view of the sensor is clear (no obstacle - point it at the ceiling), it returns a low voltage. Move your hand high over the sensor, then start lowering it - you should see the output voltage rise, until you are about 4" away. The sensor has a range of operation of 4"-30".<br />
<br />
Let's do something useful with that data. Imagine a smart cookie jar that reminds you not to snack in between meals. We could put an IR ranger into the lid. Whenever a hand comes too close, our program could play a warning sound or flash a warning light. Write a patch to manage this!<br />
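A bare threshold will chatter when the reading hovers near the cutoff, so event detectors usually use two thresholds (hysteresis). A sketch with made-up voltage levels - the actual levels depend on your sensor:<br />

```python
def detect_events(samples, on_level=2.0, off_level=1.5):
    """Indices where the signal crosses above on_level, re-arming
    only after it falls back below off_level (hysteresis)."""
    events, armed = [], True
    for i, v in enumerate(samples):
        if armed and v > on_level:
            events.append(i)
            armed = False          # ignore wiggles until the hand withdraws
        elif not armed and v < off_level:
            armed = True
    return events

# Noise around the threshold triggers once per approach, not repeatedly
readings = [0.5, 1.9, 2.1, 1.9, 2.1, 1.0, 2.2]
```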
<br />
=== Tilt control with an Accelerometer ===<br />
In this example, we'll simulate the motion of a ball on a tilting plane in software and control the tilt through a sensor. Think of it as a first step to build your own electronic game of Labyrinth. The right sensor to use is an accelerometer. Accelerometers can report on both static and dynamic acceleration -- think of static acceleration as the angle the accelerometer is held with respect to the ground (the acceleration measured here is due to gravity). Dynamic acceleration occurs when you shake the sensor.<br />
<br />
[[Image:Lab2-9.jpg]]<br />
<br />
The accelerometer in your kit is a 3-axis, ±2g sensor (1g is the acceleration due to gravity). It comes with 0.1" header pins that fit into the breadboard. The connections you need to make are VCC to the 3V3 pin on the Arduino (it will be in row 62), GND to ground, and X, Y, and Z to the first three analog input pins on the Arduino board. You can ignore the ST (self test) pin.<br />
<br />
[[Image:accel_on_board.jpg]] <br />
<br />
Push the accelerometer into the breadboard and make the connections as shown:<br />
<br />
Now, get a feel for the data the accelerometer provides. Use the Light Dimmer presets, which will track analog values on A0 and A1. Then pick up the Arduino+accelerometer board and tilt it in various directions. Start by holding it so that the accelerometer board is parallel to the ground. Find in which direction the X reading increases and decreases; do the same for the Y reading. Are the labels on the accelerometer board correct?<br />
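Static tilt can be recovered from the gravity components with atan2. The ADC midpoint and counts-per-g values below are hypothetical; calibrate them against your own board's flat and vertical readings:<br />

```python
import math

def tilt_degrees(raw_x, raw_z, zero_g=512, counts_per_g=100):
    """Tilt angle (about the Y axis) from raw ADC readings of the X and Z axes."""
    gx = (raw_x - zero_g) / counts_per_g   # acceleration in g
    gz = (raw_z - zero_g) / counts_per_g
    return math.degrees(math.atan2(gx, gz))

# Flat: X reads 0 g, Z reads +1 g -> 0 degrees of tilt
```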
<br />
=== Voltage Dividers ===<br />
<br />
In your kit, the potentiometer, IR distance ranger, and accelerometer are especially easy to work with since they directly output a changing voltage that can be read by one of Arduino's analog input pins.<br />
<br />
<br />
Other sensors don't give you a varying output voltage per se, but instead change their resistance. Examples in your kit are the force sensitive resistor (FSR) and the bend or flex sensor. It is easy to get a changing voltage based on a changing resistance through a voltage divider circuit ([http://en.wikipedia.org/wiki/Voltage_divider Wikipedia page]). The idea is that you put two resistors in series between power and ground: one that changes resistance (your sensor), and one of a known, fixed resistance. At the point in between the two resistors, you can measure how much the voltage has dropped through the first resistor. This value changes as the ratio of resistances between the variable and fixed resistors changes. More formally:<br />
<br />
[[Image:res_divider.png]]<br />
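In code, assuming the usual convention (R1 from power to the measurement point, R2 from there to ground), the divider output is Vin * R2 / (R1 + R2); the component values here are just examples:<br />

```python
def divider_out(v_in, r1, r2):
    """Voltage at the junction of two series resistors (R1 to power, R2 to ground)."""
    return v_in * r2 / (r1 + r2)

# Equal resistors split the supply in half: two 10k resistors from 5 V read 2.5 V.
# With an FSR as R1 against a fixed R2, the output rises as pressing the FSR
# lowers its resistance.
```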
<br />
* potentiometer:<br />
<br />
[[Image:Pot.png]]<br />
<br />
* force-sensitive resistor (FSR):<br />
<br />
[[Image:FSR.png]]<br />
<br />
Try both circuits. Test the resistance range of your sensor. If you want 2.5 volts to be the middle, make the comparison resistor (33k in the diagram) the "average" value of the FSR's resistance. Test this with a multimeter.<br />
<br />
* Bend Sensor<br />
[[Image:Bend_sensor.png]]<br />
<br />
== Putting it all Together ==<br />
<br />
* Create a patch to make sounds based on button and sensor values from the Arduino. You can try to adapt your patches from Lab 1, or come up with a new patch. <br />
<br />
* Try to make a simple musical interaction. Think about music -<br />
** does it have dynamics?<br />
** can you turn the sound off?<br />
** can it be expressive?<br />
<br />
<center>[[250a 2009]]</center><br />
[[Category:250a]][[Category:PID]]</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=250a_Microcontroller_%26_Sensors_Lab&diff=8766250a Microcontroller & Sensors Lab2009-09-30T22:57:26Z<p>Lukedahl: /* Tilt control with an Accelerometer */</p>
<hr />
<div><font size=5>Lab 2: Microcontroller and Sensors</font><br><br />
Due on Wednesday, October 7th at 5PM<br />
<br />
For this lab you need your [[MaxKit]], and Max/MSP or PD on a computer.<br />
<br />
== Download software ==<br />
<br />
* Make a Lab2/ folder for this lab.<br />
<br />
* Download the following:<br />
** [http://arduino.cc/en/Main/Software Arduino Software]<br />
*** If you have never used an Arduino with your computer before, you need to install the FTDI driver that comes with the Arduino software; this enables your computer to recognize the USB serial communication chip on the Arduino.<br />
** Firmata [http://firmata.org/wiki/Download firmware ] (Use Firmata-2.0beta3.zip)<br />
** Firmata [http://firmata.org/wiki/Download host software ], either [http://ccrma.stanford.edu/courses/250a/PID_maxuino-005.zip Maxuino] or [http://at.or.at/hans/pd/Pduino-0.5beta2.zip Pduino].<br />
<br />
== Mount your Arduino Nano on your breadboard==<br />
<br />
We will be powering the Nano and the breadboard with current from the USB port, which is good for up to 500mA of 5 V±5%-- probably enough for most input circuits, although not enough if you plan to run a lot of LEDs or motors.<br />
* The Nano should sit at the bottom of the breadboard, so that the pins lie in rows 49-64 on either side.<br />
* Using jumper wires, connect the row 52 pin (GND) on the left side of the Nano to the blue ground (GND) rail.<br />
* Connect the row 52 pin (+5V) on the right side of the Nano to the red Power rail.<br />
* Use jumper wires to connect the power and GND rails on the left side of the breadboard to the right.<br />
<br />
[[Image:Lab2-2.jpg]]<br />
<br />
== Install Firmata onto your Arduino Nano ==<br />
* Install the Arduino software<br />
** Unarchive the file, and move it to your applications folder.<br />
** Install the FTDI driver that comes with the Arduino software so that your computer will recognize the Arduino when it is plugged into the USB port.<br />
* Unarchive the Firmata firmware that you downloaded previously.<br />
* Use the Arduino software program to open ''StandardFirmata.pde'', which lies inside the folder ''Firmata-2.0beta3:Firmata:examples:StandardFirmata''. ('''Do NOT open the version of StandardFirmata that lies directly within the pull-down menus of the Arduino software.''')<br />
* Connect your Arduino Nano to your computer using a USB cable.<br />
* Use Tools->Board and Tools->Serial Port to select the Arduino Nano (ATmega 328) and the USB serial (tty) port, then hit the Play button to verify and compile the program. <br />
* Upload the Firmata firmware to your Arduino Nano using the Upload button, the fourth square button from the left (the one with the sideways arrow).<br />
* Close the Arduino program. (This is important because it frees up the USB serial port so that Max or PD can talk to the Arduino board next.)<br />
<br />
== Buttons, Switches and LEDs ==<br />
<br />
* Build the circuit that is detailed in the following figures. Use components and jumpers to construct your circuits on the solderless bread-board.<br />
<br />
[[Image:Breadboard.png]]<br />
<br />
* Here's how to wire a simple 2-resistor circuit on the solderless bread-board (for example R1 = 10K, R2 = 10K):<br />
<br />
[[Image:Jumper.png]]<br />
<br />
What will A0 read?<br />
<br />
(How do you know what resistor you have? Use the [http://www.dannyg.com/examples/res2/resistor.htm resistor calculator]!)<br />
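The divider question above can be answered with a quick calculation (a sketch, not part of the lab patches; it assumes the Nano's 10-bit ADC and a 5 V reference):<br />

```python
def divider_vout(vin, r1, r2):
    """Voltage at the junction of two series resistors (taken across R2)."""
    return vin * r2 / (r1 + r2)

def adc_counts(v, vref=5.0, bits=10):
    """Raw value a 10-bit ADC reports for voltage v (0..1023)."""
    return round(v / vref * (2 ** bits - 1))

v = divider_vout(5.0, 10e3, 10e3)  # equal resistors split the supply in half
print(v, adc_counts(v))            # -> 2.5 512
```

With R1 = R2 = 10K, A0 sits at half the supply, so the patch should report a value near the middle of the ADC range.<br />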
<br />
<br />
=== Build the Button and LED Circuit ===<br />
We'll start our tutorial with three simple light circuits. <br />
<br />
* In the first one, the LED is permanently on.<br />
* In the second, the LED only lights up when a button is pressed and a circuit is completed. <br />
* In the third example, we'll replace the manual switch with an Arduino pin (set to output mode), so we can control the LED from our program.<br />
<br />
[[Image:3Buttons.gif]]<br />
<br />
==== Power a LED (always on) ====<br />
<br />
[[Image:Lab2-3.jpg]]<br />
<br />
Build the circuit shown above on your breadboard. Use a 220 Ohm resistor (red red brown gold). <br />
<br />
Because the LED is a diode, it has a fixed voltage drop across its leads; exceeding this causes heat to build up and the LED to fail prematurely. So! It is always important to have a current-limiting resistor in series with the LED.<br />
<br />
Another consequence of the LED being a diode is that it has directionality. The longer lead, the anode, should be connected towards power; the shorter lead, the cathode, towards ground. (In the photo, the longer lead has a bent "knee.")<br />
<br />
==== Make a light switch ====<br />
<br />
Next, we'll insert a switch into the circuit. The momentary switches in your kit are "normally open", meaning that the circuit is interrupted in the idle state, when the switch is not pressed. Pressing the switch closes the circuit until you let go again.<br />
<br />
[[Image:Lab2-4.jpg]]<br />
<br />
Use a multimeter to see what happens to the voltage on either side of the LED when you press the switch.<br />
<br />
==== Toggling LED with PD or Max ====<br />
<br />
[[Image:Lab2-5.jpg]]<br />
<br />
In the third example, we'll replace the manual switch with an Arduino pin (set to output mode), so we can control the LED from our program. The safe way to do this is to let the Arduino pin sink current - if we toggle the pin low, it acts as ground and current flows through the resistor and the LED as it did in the previous examples. When we take the pin high, to 5V, there is no potential difference and no current flows - the LED stays off.<br />
<br />
* In the ArduinoLab patch, press the "toggling LED with software" button to preset the outputs properly. The patch expects you to connect the LED to digital pin 2 (D2).<br />
<br />
<br />
Optional: Try changing your patch so the light stays on after you press the mouse button, and turns off when you press it again. After that, change your patch so the light blinks on and off. Finally, have the button in your patch cycle the light between on and blinking.<br />
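The optional exercise amounts to a small state machine; sketched here in Python (the state names and ordering are illustrative, not from the lab patch):<br />

```python
STATES = ["off", "on", "blinking"]

def next_state(state):
    """Advance to the next lamp mode on each button press."""
    return STATES[(STATES.index(state) + 1) % len(STATES)]

def pin_level(state, tick):
    """Digital level to write on each timer tick. The pin sinks current
    in this wiring, so LOW (0) turns the LED on."""
    lit = state == "on" or (state == "blinking" and tick % 2 == 0)
    return 0 if lit else 1
```

Each mouse click calls next_state; a metro-style clock calls pin_level every tick and writes the result to D2.<br />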
<br />
=== Sensing buttons in software ===<br />
<br />
We've used code to trigger output - what about the other direction, sensing physical input in code? Just as easy. Here is a simple switch circuit:<br />
<br />
[[Image: switch.png]] [[Image: Lab2-6.jpg]]<br />
<br />
When the switch is open, the Arduino pin (set to input mode) is pulled to 5V - in software, we'll read Arduino.HIGH. When the switch is closed, the voltage at the Arduino pin falls to 0V - in software, we'll read Arduino.LOW. The pull-up resistor holds the pin at 5V while the switch is open, and limits the current flowing from 5V to ground once it is closed. In software, we can check the value of the pin and react accordingly.<br />
<br />
* In the ArduinoLab patch, press the "sensing buttons in software" button to preset the outputs properly. The patch expects you to connect the switch to digital pin 3 (D3).<br />
<br />
=== Fading LEDs (optional) ===<br />
<br />
What about those "breathing" LEDs on Mac Powerbooks? The fading from bright to dim and back is done using pulse-width modulation (PWM). In essence, the LED is toggled on and off rapidly, say 1000 times a second, faster than your eye can follow. The percentage of time the LED is on (the duty cycle) controls the perceived brightness. To control an LED using PWM, you'll have to connect it to one of the pins that support PWM output - 9, 10 or 11 on the Arduino. Then write a patch that cycles the PWM values.<br />
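In essence the patch just ramps a number up and down; a sketch of that counter logic (the step size is illustrative, and 0-255 is the duty range Firmata's PWM messages expect):<br />

```python
def breathing_duties(step=5, top=255):
    """One up-and-down cycle of PWM duty values: 0 -> top -> 0."""
    up = list(range(0, top + 1, step))
    return up + up[-2::-1]   # ramp up, then back down without repeating the peak

# Sending each value in turn to the PWM pin (say every 20 ms) fades the
# LED up and back down; looping over the list gives the "breathing" effect.
duties = breathing_duties()
```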
<br />
* In the ArduinoLab patch, press the "Fading LEDs" button to preset the outputs properly. The patch expects you to connect the LED to digital pin 9 (D9). <br />
* In your Arduino Kit, you have an RGB LED which has four leads (it's white when not lit); it's basically like 3 LEDs sharing the same ground. Use PWM and this [http://www.kingbrightusa.com/product.asp?catalog_name=LED&product_id=WP154A4SUREPBGVGAW data sheet] to make the LED cycle through a rainbow of colors. <br />
<br />
[[Image: Lab2-7.jpg]]<br />
<br />
== Using Analog Sensors ==<br />
<br />
Now we will work with the continuous input values provided by analog sensors - potentiometers, accelerometers, distance rangers, etc.<br />
<br />
=== Make a Light Dimmer ===<br />
<br />
In this example we'll build a light dimmer: a knob connected to a light so that when you turn the knob, the light increases or decreases in brightness. We'll use a potentiometer. The potentiometer has three terminals - the resistance between the first and the third terminal is constant (10k Ohms in our case), while the resistance between terminals 1 and 2 (and between 2 and 3) varies as you turn the shaft. If you apply 5V to terminal 1 and connect terminal 3 to ground, you will get a continuously varying voltage at terminal 2 as you turn the shaft. (Why that is the case will become clearer once you've learned about voltage dividers further below.)<br />
<br />
* Connect the middle pin of the potentiometer to analog input 0, the other two to +5V and ground.<br />
* Through a 220 Ohm resistor, connect an LED to pin 9 (anode, the long lead, to the resistor; cathode to pin 9). <br />
* In the ArduinoLab patch, press the "Light Dimmer" button to preset the outputs properly. The patch expects you to connect the LED to digital pin 9 (D9) and the potentiometer to analog pin 0 (A0). <br />
<br />
[[Image: Lab2-8.jpg]]<br />
<br />
* Optional: hook up two LEDs, one red and one green. As you turn the knob in one direction, red gets brighter; in the other, green gets brighter. In the middle, both are off.<br />
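One way to split the pot's travel between the two LEDs (a sketch; 10-bit ADC counts in, 8-bit PWM duties out, and the midpoint breakpoint is illustrative):<br />

```python
def red_green_duties(pot, mid=512, full=1023):
    """Map a pot reading to (red, green) PWM duties.
    Below mid, red brightens toward 0; above mid, green brightens toward full."""
    if pot < mid:
        return round((mid - pot) / mid * 255), 0
    return 0, round((pot - mid) / (full - mid) * 255)
```

At either extreme one LED is at full duty; around the center of travel both duties are near zero, so both LEDs are dark.<br />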
<br />
=== Drawing a graph of analog input ===<br />
Let's get a better understanding of the values we are reading from the analog input. To do so, we will use a slider graph to show how the analog values change over time.<br />
<br />
Leave the potentiometer part of your circuit in place; you may take off the LED part if you want to. Use A0 or A1 to graph the analog input in the ArduinoLab patch.<br />
<br />
<br />
=== Thresholding with a Range Sensor (optional) ===<br />
<br />
Thresholding is the process of turning continuous data into a discrete yes/no decision.<br />
<br />
To learn about thresholding, we'll connect the IR range sensor. The circuit is trivial: just connect red to 5V, black to ground, and yellow to analog input A0.<br />
<br />
Take a look at the data the sensor returns with sensor_graph_02 - when the field of view of the sensor is clear (no obstacle - point it at the ceiling), it returns a low voltage. Move your hand high over the sensor, then start lowering it - you should see the output voltage rise, until you are about 4" away. The sensor has a range of operation of 4"-30".<br />
<br />
Let's do something useful with that data. Imagine a smart cookie jar that reminds you not to snack in between meals. We could put an IR ranger into the lid. Whenever a hand comes too close, our program could play a warning sound or flash a warning light. Write a patch to manage this!<br />
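A sturdy way to threshold noisy sensor data is to use two levels (hysteresis), so the warning doesn't chatter when a hand hovers near the boundary. A sketch of the decision logic (the ADC levels are made-up examples, not measurements):<br />

```python
def make_detector(on_level=600, off_level=500):
    """Return a function mapping raw ADC readings to a yes/no decision:
    switches on above on_level, off below off_level, holds in between."""
    state = {"near": False}
    def detect(reading):
        if reading > on_level:
            state["near"] = True
        elif reading < off_level:
            state["near"] = False
        return state["near"]
    return detect

detect = make_detector()
# a hand approaches, hovers near the boundary, then withdraws
print([detect(r) for r in [300, 650, 550, 450]])  # -> [False, True, True, False]
```

Whenever the detector flips to True, the patch would play the warning sound or flash the light.<br />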
<br />
=== Tilt control with an Accelerometer ===<br />
In this example, we'll simulate the motion of a ball on a tilting plane in software and control the tilt through a sensor. Think of it as a first step to build your own electronic game of Labyrinth. The right sensor to use is an accelerometer. Accelerometers can report on both static and dynamic acceleration -- think of static acceleration as the angle the accelerometer is held with respect to the ground (the acceleration measured here is due to gravity). Dynamic acceleration occurs when you shake the sensor.<br />
<br />
[[Image:Lab2-9.jpg]]<br />
<br />
The accelerometer in your kit is a 3-axis, ±2g sensor (1g is the acceleration due to gravity). It comes with 0.1" header pins that fit into the breadboard. The connections you need to make are VCC to the 3V pin on the Arduino, GND to ground, and X, Y, and Z to the first three analog input pins on the Arduino board. You can ignore the ST (self test) pin.<br />
<br />
[[Image:accel_on_board.jpg]] <br />
<br />
Push the accelerometer into the breadboard and make the connections as shown above.<br />
<br />
Now, get a feel for the data the accelerometer provides. Use the Light Dimmer presets, which will track analog values on A0 and A1. Then pick up the Arduino+accelerometer board and tilt it in various directions. Start by holding it so that the accelerometer board is parallel to the ground. Find in which direction the X reading increases and decreases; do the same for the Y reading. Are the labels on the accelerometer board correct?<br />
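For reference, a static reading maps to a tilt angle via arcsin. A sketch (the zero-g offset and counts-per-g sensitivity are assumptions you would calibrate for your own sensor and supply voltage):<br />

```python
import math

def tilt_degrees(reading, zero_g=512, counts_per_g=205):
    """Tilt of one axis from a raw ADC reading.
    zero_g: the reading when the axis is level; counts_per_g: sensitivity."""
    g = (reading - zero_g) / counts_per_g
    g = max(-1.0, min(1.0, g))            # clamp noise beyond +/-1 g
    return math.degrees(math.asin(g))

print(tilt_degrees(512))   # level: 0 degrees
print(tilt_degrees(717))   # one full g along the axis: ~90 degrees
```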
<br />
=== Voltage Dividers ===<br />
<br />
In your kit, the potentiometer, IR distance ranger, and accelerometer are especially easy to work with since they directly output a changing voltage that can be read by one of Arduino's analog input pins.<br />
<br />
<br />
Other sensors don't give you a varying output voltage per se, but instead change their resistance. Examples in your kit are the force sensitive resistor (FSR) and the bend or flex sensor. It is easy to get a changing voltage from a changing resistance through a voltage divider circuit (see the Wikipedia page on voltage dividers). The idea is that you put two resistors in series between power and ground: one that changes resistance (your sensor), and one of a known, fixed resistance. At the point between the two resistors, you can measure how much the voltage has dropped across the first resistor. This value changes as the ratio of the variable and fixed resistances changes. More formally:<br />
<br />
[[Image:res_divider.png]]<br />
<br />
* potentiometer:<br />
<br />
[[Image:Pot.png]]<br />
<br />
* force-sensitive resistor (FSR):<br />
<br />
[[Image:FSR.png]]<br />
<br />
Try both circuits. Measure the resistance range of your sensor. If you want 2.5 volts to be the middle of the output range, make the comparison resistor (33k in the diagram) equal to the "average" value of the FSR's resistance. Verify with a multimeter.<br />
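You can check that advice numerically: with the fixed resistor equal to the FSR's mid-range resistance, the divider output lands at 2.5 V (the FSR values below are examples, not measurements of your part):<br />

```python
def fsr_vout(vin, r_sensor, r_fixed):
    """Voltage between the sensor and the fixed resistor, with the
    sensor on the 5V side and the fixed resistor to ground."""
    return vin * r_fixed / (r_sensor + r_fixed)

for r in (100e3, 33e3, 10e3):   # light touch, medium press, hard press
    print(fsr_vout(5.0, r, 33e3))
```

Harder presses lower the FSR's resistance, so the output voltage rises; when the sensor's resistance equals the fixed 33k, the output is exactly half the supply.<br />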
<br />
* Bend Sensor<br />
[[Image:Bend_sensor.png]]<br />
<br />
<br />
<br />
== Putting it all Together ==<br />
<br />
* Create a patch to make sounds based on button and sensor values from the Arduino. You can try to adapt your patches from Lab 1, or come up with a new patch. <br />
<br />
* Try to make a simple musical interaction. Think about music -<br />
** does it have dynamics?<br />
** can you turn the sound off?<br />
** can it be expressive?<br />
<br />
<center>[[250a 2009]]</center><br />
[[Category:250a]][[Category:PID]]</div>Lukedahlhttps://ccrma.stanford.edu/mediawiki/index.php?title=SLOrk/Instruments/SoundBounce&diff=7858SLOrk/Instruments/SoundBounce2009-05-19T23:39:45Z<p>Lukedahl: /* How to play (using the ipod touch) */</p>
<hr />
<div>== Description ==<br />
<br />
This instrument uses a virtual bouncing ball and a physical accelerometer-based controller to control sounds.<br />
<br />
== Usage ==<br />
<br />
=== Installation & Set up ===<br />
* Setting up the iPod Touch:<br />
*# in ipod Settings, select the correct network, e.g. SlorkNet <br />
*# launch TouchOSC on the ipod<br />
*# in TouchOSC settings set host to the IP address or name of your laptop<br />
*# note the IP address of the iPod touch.<br />
<br />
* Setting up the chuck patch (for players):<br />
*# open chuckball-catch.ck<br />
*# on line 14 set player_number to which player you are (e.g. 0 for player 1, 1 for player 2, ..., 3 for player 4).<br />
*# on lines 18-22 set the IP addresses or machine names for all four players and the conductor.<br />
*# on line 25 set the IP address of your iPod Touch.<br />
<br />
* Setting up the chuck patch (for the conductor):<br />
*# open conductor-sender.ck<br />
*# on lines 9-12 set the IP addresses or machine names for the four players.<br />
*# on line 21 set the IP address of your iPod Touch.<br />
<br />
=== Running (for players) ===<br />
# Start TouchOSC: on the ipod launch TouchOSC and enter the first screen in the "Simple" layout.<br />
# Start the max patch: launch chuckball-catch-max<br />
# Start the chuck patch: start the shred for chuckball-catch.ck<br />
# Make sure they are all talking: You should see accelerometer data appearing in the max patch, and bounce and throw gestures should be recognized. The four buttons at the bottom should act as a selector, i.e. selecting one should deselect the others.<br />
<br />
=== Running (for the conductor) ===<br />
# Start TouchOSC: on the ipod launch TouchOSC and enter the second screen in the "Simple" layout.<br />
# Start the max patch: launch conductor-sender-max<br />
# Start the chuck patch: start the shred for conductor-sender.ck<br />
# Make sure they are all talking: You should see accelerometer data appearing in the max patch, and throw gestures should be recognized. The four buttons at the bottom should act as a selector, i.e. selecting one should deselect the others.<br />
<br />
=== How to play (using the ipod touch) ===<br />
* How to select and serve a ball (for the conductor): the bottom four buttons of the 4x4 grid select the ball type. The bottom four buttons on the screen select which player to throw to. A flicking gesture with the ipod held sideways throws the ball.<br />
* How to bounce a ball: hold the ipod horizontal with the screen up and flick the ipod up.<br />
* How to pass a ball: select which player to throw the ball to with the bottom four buttons on the screen. Then flick the ipod while holding it sideways.<br />
<br />
* More details about bouncing: the max patch detects when the ipod pitch angle has gone above ~10 degrees. When the pitch has gone past about 30 degrees, the ball is thrown. The time between these two events is used to calculate the velocity of the bounce. You must return the pitch to below 10 degrees to make another bounce. Little yellow buttons on the max patch help you see these events. You can only hit a ball when it is near the apex or falling; you cannot hit it on the way up.<br />
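That detection scheme can be sketched as a three-state machine (the angle thresholds follow the description above; the velocity formula, names, and sample data are illustrative, not the actual max patch):<br />

```python
def make_bounce_detector(arm_angle=10.0, throw_angle=30.0):
    """States: 'low' (below arm_angle), 'armed' (crossed arm_angle),
    'fired' (ball thrown; must return below arm_angle to bounce again)."""
    s = {"state": "low", "t_armed": 0.0}
    def update(pitch, t):
        velocity = None
        if s["state"] == "low" and pitch > arm_angle:
            s["state"], s["t_armed"] = "armed", t
        elif s["state"] == "armed" and pitch > throw_angle:
            s["state"] = "fired"
            dt = t - s["t_armed"]
            velocity = 1.0 / dt if dt > 0 else 0.0  # quicker flick = harder bounce
        elif pitch < arm_angle:
            s["state"] = "low"
        return velocity
    return update

detect = make_bounce_detector()
# pitch samples every 10 ms: a quick upward flick, then settling back down
for i, pitch in enumerate([0, 5, 12, 25, 35, 20, 5]):
    v = detect(pitch, i * 0.01)
    if v is not None:
        print("bounce, velocity", v)
```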
<br />
=== How to play (using the keyboard) ===<br />
* For the conductor: how to select and serve a ball...<br />
* How to bounce a ball...<br />
* How to pass a ball<br />
<br />
=== The game ===<br />
* How to start the game...<br />
* The rules...<br />
* How the game ends...</div>Lukedahl