CCRMA Wiki: 128-spring-2010-Assignment3 (revision by Nkruge, 2010-05-19, section Virtual Handbell Choir)
<div>== Group (example) ==

* '''members''': Jieun Oh
* '''(tentative) title of piece''': Converge
* '''summary of piece concept''': collect data (location, time, audio recordings, text, pictures, tapping gestures) from performers prior to the concert, and combine the elements into a piece during the performance based on audience preference
* '''link to all related files (chuck, audio files, instructions, scores)''': [http://ccrma.stanford.edu/~jieun5 converge files]
* '''documentation of what you tried as of May 11''': blah blah blah

== Virtual Handbell Choir ==

* '''members''': Nick, Daniel, Jay
* '''(tentative) title of piece''': Virtual Handbell Choir (very tentative)
* '''summary of piece concept''': air handbells for 10 SLOrk stations with golf controllers, featuring a Guitar Hero-like GUI that automatically disseminates the parts of a MIDI song to each station, with each player in charge of two bells.
* '''link to all related files (chuck, audio files, instructions, scores)''': [http://ccrma.stanford.edu/~nkruge/slork/HandbellChoir.zip source files]
Milestones:
* check! Instrument - Nick, Daniel - Wednesday 12th
* check! Network - Jay - Monday 10th (something done to pass to Daniel)
* check! MIDI - Jay - Monday 10th (something done to pass to Daniel)
* check! Graphics - Daniel - Monday 17th
* check! INTEGRATION - ALL - Wednesday 19th
* check! Piece - Nick - at least something by Wednesday 19th
* Refine code and graphics, compile into the Audicle - Wednesday 26th
* First version of final piece - Wednesday 26th


ideas for network:
# songs stored on server - client downloads the MIDI file, parses it, determines what it needs, and displays that
# songs stored on server - server parses the song - client asks the server for the specific part it needs, downloads it, and displays that
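Network idea 1 boils down to a filter over the parsed song: each client keeps only the notes belonging to its assigned tracks. A minimal Python sketch (not the group's ChucK code; the tuple layout and track assignments are made up for illustration):

```python
# Sketch of network idea 1: the client parses the whole song and keeps
# only the part assigned to its station. Here the "parsed MIDI" is a
# plain list of (track, pitch, start_beat) tuples -- a stand-in for the
# real parser output; the station/track mapping is hypothetical.

def extract_part(notes, my_tracks):
    """Return only the notes this station must display, sorted by time."""
    part = [n for n in notes if n[0] in my_tracks]
    return sorted(part, key=lambda n: n[2])

song = [
    (0, 60, 0.0),  # track 0: C4 on beat 0
    (1, 64, 0.5),  # track 1: E4 on beat 0.5
    (0, 67, 1.0),  # track 0: G4 on beat 1
]

# A station in charge of track 0 (two bells) sees only its own notes.
print(extract_part(song, {0}))  # [(0, 60, 0.0), (0, 67, 1.0)]
```

Idea 2 moves the same filter to the server side; the client-side display code stays identical either way.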

ideas for graphics:
* notes fly at you at an angle, like Rock Band
* hits are circles; higher velocity = larger circle and a different color
* long rectangle for the fast dinging thing
* lower horizontal bar to show you need to dampen

ideas for instrument:
* dampen against your chest and/or with the pedal - at least have the pedal act as a "kill switch" in case of error
* use the STK shaker code to excite the bell sound

* '''documentation of what you tried as of May 11''':

** '''Nick''': processed the handbell samples to transpose them into three full octaves; basic, functioning control with the controller
** '''Daniel''': brought the instrument up to scratch. It is definitely not the final version, but it is a lot more responsive. I modified the original work done on the instrument; the main problem we were having was that it missed many of the "hits." I fixed this with a small memory structure that makes the instrument a little more intelligent, so it triggers much more reliably.
** '''Jay''':
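Daniel's "memory structure" reads like a debounced peak detector: remember whether the current swing has already fired, so each hit triggers exactly once. A hypothetical Python sketch with made-up thresholds (the real instrument is ChucK reading golf-controller data):

```python
# Debounced hit detection: the "memory" is a single armed/disarmed flag.
# A hit fires when acceleration crosses the threshold while armed; the
# detector re-arms only after the swing settles below the release level.
# Threshold values here are illustrative, not tuned.

class HitDetector:
    def __init__(self, threshold=1.5, release=0.5):
        self.threshold = threshold  # accel magnitude that counts as a hit
        self.release = release      # must fall below this to re-arm
        self.armed = True           # memory: ready to trigger?

    def update(self, accel):
        """Feed one accelerometer sample; return True on a new hit."""
        if self.armed and accel >= self.threshold:
            self.armed = False      # remember we already fired this swing
            return True
        if accel < self.release:
            self.armed = True       # swing is over, re-arm
        return False

d = HitDetector()
samples = [0.1, 0.9, 1.8, 2.0, 1.6, 0.3, 1.7]
print([d.update(a) for a in samples])
# [False, False, True, False, False, False, True] -- two swings, two hits
```

Without the flag, the samples 1.8, 2.0, and 1.6 would each register as separate hits; with it, one swing is one ring.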

* '''documentation of what you tried as of May 17''':

* fully working v1.0 of the Guitar Hero-style score reader
* revised instrument with aftertouch vibrato
* OSC-networked "start" button
* etude.mid for rehearsal
* printed step-by-step instructions for a smoother rehearsal

== Group Awesome ==

* '''members''': Giancarlo Daniele, Ben Holtz, Linden Melvin
* '''(tentative) title of piece''': Sampling Machine
* '''summary of piece concept''': Our goal is to use sampling as a form of expression. Each SLOrk station will have twenty samples at its disposal (mapped to keyboard keys), drawn from iconic songs defining decades of American music. Each machine is an "instrument" capable of taking the samples and passing them through a ChucK effect or filter. Our piece has a few tentative components: 1) a game component, where one station plays one sample at a time and passes it to another station, which has to play that sample, add a new sample, and pass it along; 2) a scored component, where each station reads keypresses, etc. from a score; 3) an improvised component, where a conductor points to stations that are responsible for playing certain parts.
* '''link to all related files (chuck, audio files, instructions, scores)''':
* '''documentation of what you tried as of May 11''': As a group, we've come up with a variety of samples, "instruments" (ChucK code with SndBuf effects), and a very preliminary idea of the various components of our piece.
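The game component is essentially a growing chain that each station replays, extends, and forwards. A Python toy model of one turn (sample names and the `play()` stand-in are hypothetical, not the group's actual files or ChucK code):

```python
# Pass-along game: a station replays the chain it received, appends one
# of its own samples, and hands the grown chain to the next station.

played = []

def play(sample):
    """Stand-in for triggering a ChucK SndBuf on this station."""
    played.append(sample)

def take_turn(chain, my_sample):
    """Replay the received chain, then append one sample and pass it on."""
    for s in chain:
        play(s)
    return chain + [my_sample]

chain = take_turn([], "sample_60s.wav")     # station 1 starts the chain
chain = take_turn(chain, "sample_70s.wav")  # station 2 replays, then adds
print(chain)  # ['sample_60s.wav', 'sample_70s.wav']
```

Each turn the audible phrase gets one sample longer, which is what gives the game its memory-test character.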

== Everybodyeverybody ==

* '''members''': Alan Hshieh, Aaron Zarraga, Isaac Wang
* '''(tentative) title of piece''': Fanfare for the Common Man
* '''summary of piece concept''': We want to combine "live" coding with a pre-written score in order to simulate a full orchestra. Each SlOrk station will be a different orchestral section with an assigned part to play. The piece will be slow and harmonic to allow each person to physically type in a new note's frequency value for their instrument. These coding screens will be projected in order to show off the live-coding aspect; we want to create a piece that is interesting to watch as well. We plan to make the smack sensor swap out the current shred with the new one being typed, so everyone will be striking their instrument at the appropriate times to play each note. This will be awesome to look at, especially since we plan on having some taiko drums in the orchestra. We are still developing our ideas for the final piece, but we know that we want the song to be epic.
* '''link to all related files (chuck, audio files, instructions, scores)''': http://hshieh.com/slork
* '''documentation of what you tried as of May 11''': We worked out our concept, figured out a basic rendition of Twinkle Twinkle Little Star to demo in class, and will meet May 12th to compose and fine-tune.

== noise and headbanging ==

* '''members''': adam somers, uri nieto, charlie forkish
* '''(tentative) title of piece''': Concerto for Touchboard and Headbang Orchestra
* '''summary of piece concept''': the headbang orchestra will be composed of twelve players wearing GameTrak controller gloves around their necks, triggering samples by headbanging. there will be a percussion section composed of two kick drum players, one snare drum player, and four hi-hat players. there will be three guitar players, each on a different power chord, and two more each triggering a different riff or shred. the soloists will be adam playing noisy awesome on the touchboard and uri modulating adam's awesome by headbanging and hair-swirling. charlie will conduct the piece.
* '''link to all related files (chuck, audio files, instructions, scores)''': [https://ccrma.stanford.edu/~cforkish/slork/concerto.zip concerto.zip]
* '''documentation of what you tried as of May 11''': we have written a basic chuck patch for the headbang orchestra to trigger their samples, a basic patch to modulate noise by headbanging, and a rough outline of what the headbang orchestra will play underneath the soloists.

== Spaetial ==

* '''members''': Lekan Wang
* '''(tentative) title of piece''': Spaetial Multithreaded Sound Balls? (this is tentative, right?)
* '''summary of piece concept''': The idea is to have everyone control every speaker by interacting with so-called "Sound Balls" on a common canvas projected onto a large screen. Each ball is individually affected by gravity and viscosity, and makes sounds as it floats, bounces, gets near other balls, collides, and dies. A server runs, and all laptops are clients. When a ball generates a sound, the server uses the ball's location on the canvas to calculate which speakers should play the sound and at what gains. The effect for the audience is that the sounds the balls make physically move across the stage. The piece I'm thinking about composing takes advantage of this by tightly integrating space and visual imagery into the composition: various sounds may start on opposite ends of the stage, but when they combine, they form a whole that has its own character and can be seen on the screen as translucent balls blending with each other.
* '''link to all related files (chuck, audio files, instructions, scores)''': [http://www.stanford.edu/~lekanw/slork/]
* '''documentation of what you tried as of May 11''': Completed the networking protocol and the basic infrastructure for the rest of the code. Currently, you can create a "Basic Ball" and it will make sounds in the appropriate speakers. I purposely made the code extremely extensible, so it should be pretty easy now to add the other features.
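The server's gain calculation might look like the following sketch: each speaker's gain falls off with its distance from the ball, and the gains are then normalized for constant power so the overall level stays steady as the ball moves. This is a Python illustration with invented speaker coordinates, not Lekan's actual code:

```python
# Distance-based amplitude panning across the stage's speakers. Speaker
# positions are hypothetical (x, y) stage coordinates on a unit canvas.
import math

SPEAKERS = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]

def speaker_gains(ball_x, ball_y):
    """Return one gain per speaker, inversely related to distance."""
    raw = []
    for sx, sy in SPEAKERS:
        d = math.hypot(ball_x - sx, ball_y - sy)
        raw.append(1.0 / (1.0 + d))            # nearer speaker, louder
    norm = math.sqrt(sum(g * g for g in raw))  # constant-power normalization
    return [g / norm for g in raw]

gains = speaker_gains(0.0, 0.0)  # ball sitting right on speaker 0
assert gains[0] == max(gains)    # the nearest speaker dominates
```

The constant-power normalization is one way to attack the Tier 1 goal of uniform perceived sound pressure across the stage; the real algorithm would also need the subwoofer routing described below.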

Plan for the Future

+++Core -- Finish by end of 5/14+++
* Debug polyphony/ChucK sporking issue
* Select/delete balls
* Clear all balls
* Variable gravity
* Ball types/note select
* Continuous and proximity sounds

+++Tier 1 -- Finish by 5/16+++
* Make the server work on more than just my laptop...
* Ball transparency
* Automatically kill ChucK when quitting
* Route all low-frequency signals to laptops with subs connected
* Size of ball inversely proportional to frequency
* Color should match the type of sound in some way
* Update the speaker-selection and gain algorithm so audience members perceive the same sound pressure uniformly across the stage

+++Tier 2 -- If time permits+++
* Quantization/metronome
* Cleaner client view of balls (transparency?)
* Multichannel, stereo/rotation effects

== Group twt ==

* '''members''': Jorge, Visda, Stephen, Luke, Carr
* '''(tentative) title of piece''': under construction
* '''summary of piece concept''': audience and performers tweet, and we transform the tweets into an audiovisual piece in real time
* '''link to all related files (chuck, audio files, instructions, scores)''': ccrma-gate.stanford.edu/user/j/jorgeh/svn-reps/projects/twttr/
* '''documentation of what you tried as of May 11''':
** Carr: working on the visualization of the text using Processing
** Jorge and Stephen: working on the server app and on the Twitter web client capable of posting tweets
** Visda and Luke: working on sound design

so far, the connection between the tweets and the ChucK files via OSC is working. we can manipulate simple sound structures using the OSC messages received from the server.
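The tweets-to-ChucK link above rests on OSC messages. As an illustration of what goes over the wire, here is a minimal Python encoder for an OSC message carrying one tweet as a string argument (the /tweet address and port are placeholders, not the group's actual protocol):

```python
# Hand-rolled OSC message: address pattern, type-tag string, arguments,
# each null-terminated and padded to a 4-byte boundary per the OSC spec.

def osc_pad(b):
    """Null-terminate and pad a byte string to a 4-byte boundary."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address, text):
    """Encode an OSC message with a single string argument."""
    return (osc_pad(address.encode()) +
            osc_pad(b",s") +          # type tag: one string argument
            osc_pad(text.encode()))

packet = osc_message("/tweet", "hello slork")
# The server would send this over UDP to each client's OscRecv port, e.g.
#   socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, (host, 6449))
print(len(packet) % 4)  # 0 -- every field lands on a 4-byte boundary
```

On the ChucK side, an OscRecv listening for "/tweet, s" would pull the string out of each event and drive the sound structures from it.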

== Group ==

* '''members''':
* '''(tentative) title of piece''':
* '''summary of piece concept''':
* '''link to all related files (chuck, audio files, instructions, scores)''':
* '''documentation of what you tried as of May 11''':</div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=128-spring-2010-Assignment3&diff=9974128-spring-2010-Assignment32010-05-19T15:26:09Z<p>Nkruge: /* Virtual Handbell Choir */</p>
<hr />
<div>== Group (example) ==<br />
<br />
* '''members''': Jieun Oh<br />
* ('''tentative) title of piece''': Converge<br />
* '''summary of piece concept''': collect data (location, time, audio recording, text, pictures, tapping gestures) from performers prior to the concert, and combine the elements into a piece during performance based on audience preference<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : [http://ccrma.stanford.edu/~jieun5 converge files]<br />
* '''documentation of what you tried as of May 11''': blah blah blah<br />
<br />
== Virtual Handbell Choir ==<br />
<br />
* '''members''': Nick, Daniel, Jay <br />
* ('''tentative) title of piece''': Virtual Handbell Choir (very tentative)<br />
* '''summary of piece concept''': Air handbells for 10 slork stations w/Golf Controlllers, featuring a guitar hero-like GUI which will automatically disseminate the parts of a MIDI song to each Slork station, with each player being in charge of two bells.<br />
* ''link to all related files (chuck, audio files, instructions, scores)''' : [http://ccrma.stanford.edu/~nkruge/slork/HandbellChoir.zip source files]<br />
Milestones - <br />
check! Instrument - Nick, Daniel - Wednesday 12th<br />
check! Network - Jay - Monday 10th (something done to pass to daniel)<br />
check! MIDI - Jay - Monday 10th (something done to pass to daniel)<br />
check! Graphics - Daniel - Monday 17th<br />
check! INTEGRATION - ALL - Wednesday 19th<br />
check! Piece - Nick - At least something by Wednesday 19th<br />
Refine code and graphics, compile into Audicle - Wednesday 26th<br />
First version of final piece - Wednesday 26th<br />
<br />
<br />
ideas for network: <br />
1.) songs stored on server - client downloads midi, parses - determines what it needs and displays that<br />
2.) songs stored on server - server parses song - client looks to server for specific part it needs - downloads and displays that<br />
<br />
ideas for graphics:<br />
- flies at you at an angle like rock band<br />
- hits are circles, higher velocity = larger circle + different color<br />
- long rectangle for the fast dinging thing<br />
- lower horizontal bar to show you need to dampen<br />
<br />
ideas for instrument:<br />
- dampen to your chest and/or with pedal - at least have pedal be a "kill switch" in case of error <br />
- use STK shaker code to excite the bell sound<br />
<br />
* '''documentation of what you tried as of May 11''':<br />
<br />
** '''Nick''': processing of handbell samples to transpose into 3 full octaves and basic, functioning control with controller<br />
** '''Daniel''': Worked the instrument so that it is now up to scratch. Definitely not a final version of the instrument, but its a lot more responsive - I've modified the original work done for the instrument, and the main problem we were having is that it wasn't sensing a lot of the "hits." This is fixed through a sort of memory structure that I created so the instrument is a little more intelligent and can therefore not be so bad at triggering.<br />
** '''Jay''':<br />
<br />
* '''documentation of what you tried as of May 17''':<br />
<br />
- fully working v1.0 of guitar hero style score reader<br />
- revised instrument with aftertouch vibrato<br />
- OSC networked "start" button<br />
- etude.mid for rehearsal<br />
- printed step by step instructions for smoother rehearsal<br />
<br />
== Group Awesome ==<br />
<br />
* '''members''': Giancarlo Daniele, Ben Holtz, Linden Melvin<br />
* ('''tentative) title of piece''': Sampling Machine<br />
* '''summary of piece concept''': Our goal is to use sampling as a form of expression. Each slork station will have twenty samples at their disposal (mapped to keyboard keys) from iconic songs defining decades in American music. Each machine is an "instrument" capable of taking the samples and passing it through a chuck effect or filter. Our piece has a few tentative components 1) A game component, where one station plays one sample at a time and passes it to another station, which has to play that sample, add a new sample, and pass it along. 2) A score-d component, where each station will read keypresses, etc from a score 3) An improvised component, where a conductor points stations that are responsible for playing certain parts.<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : <br />
* '''documentation of what you tried as of May 11''': As a group, we've come up with a variety of different samples, "instruments" (coded chuck w/ sndbuff effects), and a very preliminary idea the various components of our piece.<br />
<br />
== Everybodyeverybody ==<br />
<br />
* '''members''': Alan Hshieh, Aaron Zarraga, Isaac Wang<br />
* ('''tentative) title of piece''': Fanfare for the Common Man<br />
* '''summary of piece concept''': The idea for our piece is that we want to combine "live" coding with a pre-written score in order to simulate a full orchestra. Each SlOrk station will be a different orchestral section and they will have an assigned part to play. The piece will be slow and harmonic to allow each person to physically type in a new note's frequency value for their instrument. These coding screens will be projected in order to show off the live coding aspect. We want to create a piece that is interesting to watch as well. We plan to make the smack sensor swap out the current shread with the new one that is currently being typed. Therefore everyone will be striking their instrument at the appropriate times to play each note. This will be awesome to look at, especially since we plan on having some Taiko drums in the orchestra. We are still trying to develop our ideas as far as what we want for the final piece, but we know that we want the song to be epic.<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : http://hshieh.com/slork<br />
* '''documentation of what you tried as of May 11''': We worked out our concept, figured out a basic rendition of Twinkle Twinkle Little Star to demo in class, and will be meeting May 12th to compose and find tune. <br />
<br />
== noise and headbanging ==<br />
<br />
* '''members''': adam somers, uri nieto, charlie forkish<br />
* ('''tentative) title of piece''': Concerto For Touchboard and Headbang Orchestra<br />
* '''summary of piece concept''': the headbang orchestra will be composed of twelve players wearing GameTrak controller gloves around their necks triggering samples by headbanging. there will be a percussion section composed of two kick drum players, one snare drum player, and four hi-hat players. there will be three guitar players each on a different power chord, and two more each triggering a different riff or shred. the soloists will be adam playing noisy awesome on the touchboard and uri modulating adam's awesome by headbanging and hairswirling. charlie will be conducting the piece.<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : [https://ccrma.stanford.edu/~cforkish/slork/concerto.zip concerto.zip]<br />
* '''documentation of what you tried as of May 11''': we have written a basic chuck patch for the headbang orchestra to trigger their samples, a basic patch to modulate noise by headbanging, and a rough outline of what the headbang orchestra will be playing underneath the soloists.<br />
<br />
== Spaetial ==<br />
<br />
* '''members''': Lekan Wang<br />
* ('''tentative) title of piece''': Spaetial Multithreaded Sound Balls? (this is tentative, right?)<br />
* '''summary of piece concept''': The idea is to have everyone control every speaker by interacting with so-called "Sound Balls" on a common canvas that's projected onto a large screen. Each ball is individually affected by gravity and viscosity, and makes sounds as it floats, bounces, gets near other balls, collides, and dies. There is a server running, and all laptops are clients. When a sound is generated by a ball, the server uses the location of the ball on the canvas to calculate which speakers should play the sound and at what gains. The effect for the audience is that the sounds the balls are making are actually physically moving across the stage. The piece I'm thinking about composing takes advantage of this by tightly integrating space and visual imagery into the composition--various sounds may start on the opposite ends of the stage, but when they combine, they form a whole that has its own character, and can be see on the screen as translucent balls blending with each other.<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : [http://www.stanford.edu/~lekanw/slork/]<br />
* '''documentation of what you tried as of May 11''': Completed the networking protocol, and the basic infrastructure for the rest of the code. Currently, you can create a "Basic Ball" and it will make sounds in the appropriate speakers, but I purposefully made the code extremely extensible, so it should be pretty easy now to add in the other features.<br />
<br />
Plan for the Future<br />
+++Core -- Finish by end of 5/14+++<br />
* Debug polyphony/chuck sporking issue<br />
* Select/delete balls<br />
* Clear all balls<br />
* Variable gravity<br />
* Ball types/Note select<br />
* Continuous and proximity sounds<br />
<br />
+++Tier 1 -- Finish by 5/16+++<br />
* Make the server work on not just my laptop...<br />
* Ball transparency<br />
* Automatically kill chuck when quitting<br />
* Route all low-freq signals to laptops with subs connected<br />
* Size of ball inversely proportional to freq<br />
* Color should match the type of sound in some way<br />
* Update speaker selection and gain algorithm so audience member will perceive same sound pressure uniformly across stage<br />
<br />
+++Tier 2 -- If time permits+++<br />
* Quantization/Metronome<br />
* Cleaner client view of balls (transparency?)<br />
* Multichannel, Stereo/rotation effects<br />
<br />
== Group twt ==<br />
<br />
* '''members''': Jorge, Visda, Stephen, Luke, Carr <br />
* ('''tentative) title of piece''': under construction<br />
* '''summary of piece concept''': audience and performers are tweeting, we transform it into an audio visual piece in real time<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : ccrma-gate.stanford.edu/user/j/jorgeh/svn-reps/projects/twttr/<br />
* '''documentation of what you tried as of May 11''':<br />
* Carr: is working on the visualization of the txt using processing<br />
* Jorge and Stephen: are working on server app and also on the twitter web-client capable of posting tweets <br />
* Visda and Luke: are working on sound design<br />
<br />
so far the connection between the tweets and chuck files via osc is working. we can manipulate simple sound structures using the receiving osc messages from the server.<br />
<br />
== Group ==<br />
<br />
* '''members''': <br />
* ('''tentative) title of piece''': <br />
* '''summary of piece concept''':<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : <br />
* '''documentation of what you tried as of May 11''':</div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=128-spring-2010-Assignment3&diff=9973128-spring-2010-Assignment32010-05-19T15:24:30Z<p>Nkruge: /* Virtual Handbell Choir */</p>
<hr />
<div>== Group (example) ==<br />
<br />
* '''members''': Jieun Oh<br />
* ('''tentative) title of piece''': Converge<br />
* '''summary of piece concept''': collect data (location, time, audio recording, text, pictures, tapping gestures) from performers prior to the concert, and combine the elements into a piece during performance based on audience preference<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : [http://ccrma.stanford.edu/~jieun5 converge files]<br />
* '''documentation of what you tried as of May 11''': blah blah blah<br />
<br />
== Virtual Handbell Choir ==<br />
<br />
* '''members''': Nick, Daniel, Jay <br />
* ('''tentative) title of piece''': Virtual Handbell Choir (very tentative)<br />
* '''summary of piece concept''': Air handbells for 10 slork stations w/Golf Controlllers, featuring a guitar hero-like GUI which will automatically disseminate the parts of a MIDI song to each Slork station, with each player being in charge of two bells.<br />
* '''[[link to all related files (chuck, audio files, instructions, scores)]]''' : <br />
Milestones - <br />
check! Instrument - Nick, Daniel - Wednesday 12th<br />
check! Network - Jay - Monday 10th (something done to pass to daniel)<br />
check! MIDI - Jay - Monday 10th (something done to pass to daniel)<br />
check! Graphics - Daniel - Monday 17th<br />
check! INTEGRATION - ALL - Wednesday 19th<br />
check! Piece - Nick - At least something by Wednesday 19th<br />
Refine code and graphics, compile into Audicle - Wednesday 26th<br />
First version of final piece - Wednesday 26th<br />
<br />
<br />
ideas for network: <br />
1.) songs stored on server - client downloads midi, parses - determines what it needs and displays that<br />
2.) songs stored on server - server parses song - client looks to server for specific part it needs - downloads and displays that<br />
<br />
ideas for graphics:<br />
- flies at you at an angle like rock band<br />
- hits are circles, higher velocity = larger circle + different color<br />
- long rectangle for the fast dinging thing<br />
- lower horizontal bar to show you need to dampen<br />
<br />
ideas for instrument:<br />
- dampen to your chest and/or with pedal - at least have pedal be a "kill switch" in case of error <br />
- use STK shaker code to excite the bell sound<br />
<br />
* '''documentation of what you tried as of May 11''':<br />
<br />
** '''Nick''': processing of handbell samples to transpose into 3 full octaves and basic, functioning control with controller<br />
** '''Daniel''': Worked the instrument so that it is now up to scratch. Definitely not a final version of the instrument, but its a lot more responsive - I've modified the original work done for the instrument, and the main problem we were having is that it wasn't sensing a lot of the "hits." This is fixed through a sort of memory structure that I created so the instrument is a little more intelligent and can therefore not be so bad at triggering.<br />
** '''Jay''':<br />
<br />
* '''documentation of what you tried as of May 17''':<br />
<br />
- fully working v1.0 of guitar hero style score reader<br />
- revised instrument with aftertouch vibrato<br />
- OSC networked "start" button<br />
- etude.mid for rehearsal<br />
- printed step by step instructions for smoother rehearsal<br />
<br />
== Group Awesome ==<br />
<br />
* '''members''': Giancarlo Daniele, Ben Holtz, Linden Melvin<br />
* ('''tentative) title of piece''': Sampling Machine<br />
* '''summary of piece concept''': Our goal is to use sampling as a form of expression. Each slork station will have twenty samples at their disposal (mapped to keyboard keys) from iconic songs defining decades in American music. Each machine is an "instrument" capable of taking the samples and passing it through a chuck effect or filter. Our piece has a few tentative components 1) A game component, where one station plays one sample at a time and passes it to another station, which has to play that sample, add a new sample, and pass it along. 2) A score-d component, where each station will read keypresses, etc from a score 3) An improvised component, where a conductor points stations that are responsible for playing certain parts.<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : <br />
* '''documentation of what you tried as of May 11''': As a group, we've come up with a variety of different samples, "instruments" (coded chuck w/ sndbuff effects), and a very preliminary idea the various components of our piece.<br />
<br />
== Everybodyeverybody ==<br />
<br />
* '''members''': Alan Hshieh, Aaron Zarraga, Isaac Wang<br />
* ('''tentative) title of piece''': Fanfare for the Common Man<br />
* '''summary of piece concept''': The idea for our piece is that we want to combine "live" coding with a pre-written score in order to simulate a full orchestra. Each SlOrk station will be a different orchestral section and they will have an assigned part to play. The piece will be slow and harmonic to allow each person to physically type in a new note's frequency value for their instrument. These coding screens will be projected in order to show off the live coding aspect. We want to create a piece that is interesting to watch as well. We plan to make the smack sensor swap out the current shread with the new one that is currently being typed. Therefore everyone will be striking their instrument at the appropriate times to play each note. This will be awesome to look at, especially since we plan on having some Taiko drums in the orchestra. We are still trying to develop our ideas as far as what we want for the final piece, but we know that we want the song to be epic.<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : http://hshieh.com/slork<br />
* '''documentation of what you tried as of May 11''': We worked out our concept, figured out a basic rendition of Twinkle Twinkle Little Star to demo in class, and will be meeting May 12th to compose and find tune. <br />
<br />
== noise and headbanging ==<br />
<br />
* '''members''': adam somers, uri nieto, charlie forkish<br />
* ('''tentative) title of piece''': Concerto For Touchboard and Headbang Orchestra<br />
* '''summary of piece concept''': the headbang orchestra will be composed of twelve players wearing GameTrak controller gloves around their necks triggering samples by headbanging. there will be a percussion section composed of two kick drum players, one snare drum player, and four hi-hat players. there will be three guitar players each on a different power chord, and two more each triggering a different riff or shred. the soloists will be adam playing noisy awesome on the touchboard and uri modulating adam's awesome by headbanging and hairswirling. charlie will be conducting the piece.<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : [https://ccrma.stanford.edu/~cforkish/slork/concerto.zip concerto.zip]<br />
* '''documentation of what you tried as of May 11''': we have written a basic chuck patch for the headbang orchestra to trigger their samples, a basic patch to modulate noise by headbanging, and a rough outline of what the headbang orchestra will be playing underneath the soloists.<br />
<br />
== Spaetial ==<br />
<br />
* '''members''': Lekan Wang<br />
* ('''tentative) title of piece''': Spaetial Multithreaded Sound Balls? (this is tentative, right?)<br />
* '''summary of piece concept''': The idea is to have everyone control every speaker by interacting with so-called "Sound Balls" on a common canvas that's projected onto a large screen. Each ball is individually affected by gravity and viscosity, and makes sounds as it floats, bounces, gets near other balls, collides, and dies. There is a server running, and all laptops are clients. When a sound is generated by a ball, the server uses the location of the ball on the canvas to calculate which speakers should play the sound and at what gains. The effect for the audience is that the sounds the balls are making are actually physically moving across the stage. The piece I'm thinking about composing takes advantage of this by tightly integrating space and visual imagery into the composition--various sounds may start on the opposite ends of the stage, but when they combine, they form a whole that has its own character, and can be see on the screen as translucent balls blending with each other.<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : [http://www.stanford.edu/~lekanw/slork/]<br />
* '''documentation of what you tried as of May 11''': Completed the networking protocol, and the basic infrastructure for the rest of the code. Currently, you can create a "Basic Ball" and it will make sounds in the appropriate speakers, but I purposefully made the code extremely extensible, so it should be pretty easy now to add in the other features.<br />
<br />
Plan for the Future<br />
+++Core -- Finish by end of 5/14+++<br />
* Debug polyphony/chuck sporking issue<br />
* Select/delete balls<br />
* Clear all balls<br />
* Variable gravity<br />
* Ball types/Note select<br />
* Continuous and proximity sounds<br />
<br />
+++Tier 1 -- Finish by 5/16+++<br />
* Make the server work on not just my laptop...<br />
* Ball transparency<br />
* Automatically kill chuck when quitting<br />
* Route all low-freq signals to laptops with subs connected<br />
* Size of ball inversely proportional to freq<br />
* Color should match the type of sound in some way<br />
* Update speaker selection and gain algorithm so audience member will perceive same sound pressure uniformly across stage<br />
<br />
+++Tier 2 -- If time permits+++<br />
* Quantization/Metronome<br />
* Cleaner client view of balls (transparency?)<br />
* Multichannel, Stereo/rotation effects<br />
<br />
== Group twt ==<br />
<br />
* '''members''': Jorge, Visda, Stephen, Luke, Carr <br />
* ('''tentative) title of piece''': under construction<br />
* '''summary of piece concept''': audience and performers tweet; we transform the tweets into an audiovisual piece in real time<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : ccrma-gate.stanford.edu/user/j/jorgeh/svn-reps/projects/twttr/<br />
* '''documentation of what you tried as of May 11''':<br />
* Carr: working on the visualization of the text using Processing<br />
* Jorge and Stephen: working on the server app and on the Twitter web client capable of posting tweets <br />
* Visda and Luke: working on sound design<br />
<br />
So far, the connection between the tweets and the ChucK files via OSC is working; we can manipulate simple sound structures using the OSC messages received from the server.<br />
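The OSC messages themselves follow the OSC 1.0 wire format: null-terminated strings padded to 4-byte boundaries, a comma-prefixed type-tag string, then big-endian arguments. A minimal encoder sketch (the /twt/tweet address and payload are invented for illustration, not our actual message layout):<br />

```python
import struct

def _osc_string(s):
    """Null-terminate and pad to a 4-byte boundary, per the OSC spec."""
    b = s.encode("utf-8") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address, *args):
    """Encode an OSC message; only int32 and string arguments shown."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)  # int32, big-endian
        elif isinstance(a, str):
            tags += "s"
            payload += _osc_string(a)
        else:
            raise TypeError("only int and str handled in this sketch")
    return _osc_string(address) + _osc_string(tags) + payload

# A datagram like this could be sent with socket.sendto(msg, (host, port)).
msg = osc_message("/twt/tweet", 42, "hello slork")
```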
<br />
== Group ==<br />
<br />
* '''members''': <br />
* ('''tentative) title of piece''': <br />
* '''summary of piece concept''':<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : <br />
* '''documentation of what you tried as of May 11''':</div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=128-spring-2010-Assignment3&diff=9936128-spring-2010-Assignment32010-05-12T04:49:57Z<p>Nkruge: /* Virtual Handbell Choir */</p>
<hr />
<div>== Group (example) ==<br />
<br />
* '''members''': Jieun Oh<br />
* ('''tentative) title of piece''': Converge<br />
* '''summary of piece concept''': collect data (location, time, audio recording, text, pictures, tapping gestures) from performers prior to the concert, and combine the elements into a piece during performance based on audience preference<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : [http://ccrma.stanford.edu/~jieun5 converge files]<br />
* '''documentation of what you tried as of May 11''': blah blah blah<br />
<br />
== Virtual Handbell Choir ==<br />
<br />
* '''members''': Nick, Daniel, Jay <br />
* ('''tentative) title of piece''': Virtual Handbell Choir (very tentative)<br />
* '''summary of piece concept''': Air handbells for 15 slork stations w/ Golf Controllers, featuring a Guitar Hero-like GUI which will automatically disseminate the parts of a MIDI song to each Slork station, with each player in charge of two bells.<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : <br />
Milestones - <br />
Instrument - Nick, Daniel - Wednesday 12th<br />
Network - Jay - Monday 10th (something done to pass to daniel)<br />
MIDI - Jay - Monday 10th (something done to pass to daniel)<br />
Graphics - Daniel - Monday 17th<br />
INTEGRATION - ALL - Wednesday 19th<br />
Piece - Nick - At least something by Wednesday 19th<br />
<br />
ideas for network: <br />
1.) songs stored on server - client downloads midi, parses - determines what it needs and displays that<br />
2.) songs stored on server - server parses song - client looks to server for specific part it needs - downloads and displays that<br />
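Under either option, someone ends up dealing the song's bell pitches out two per station. A toy sketch of that assignment step (the function and station names are hypothetical, not our code):<br />

```python
def assign_bells(pitches, stations):
    """Deal a song's bell pitches out to stations, two bells per player,
    keeping adjacent pitches together as in a real handbell choir.
    `pitches` are MIDI note numbers; `stations` are station names."""
    if len(pitches) > 2 * len(stations):
        raise ValueError("song uses more bells than the choir can cover")
    parts = {name: [] for name in stations}
    for i, pitch in enumerate(sorted(pitches)):
        parts[stations[i // 2]].append(pitch)
    return parts
```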
<br />
ideas for graphics:<br />
- flies at you at an angle like rock band<br />
- hits are circles, higher velocity = larger circle + different color<br />
- long rectangle for the fast dinging thing<br />
- lower horizontal bar to show you need to dampen<br />
<br />
ideas for instrument:<br />
- dampen to your chest and/or with pedal - at least have pedal be a "kill switch" in case of error <br />
- use STK shaker code to excite the bell sound<br />
<br />
* '''documentation of what you tried as of May 11''':<br />
<br />
** '''Nick''': processed handbell samples to transpose them into 3 full octaves; basic, functioning control with the controller<br />
** '''Daniel''':<br />
** '''Jay''':<br />
<br />
**'''plan to integrate further for basic instrument Wednesday'''<br />
<br />
== Group ==<br />
<br />
* '''members''': <br />
* ('''tentative) title of piece''': <br />
* '''summary of piece concept''':<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : <br />
* '''documentation of what you tried as of May 11''': <br />
<br />
<br />
'''Giancarlo Daniele, Ben Holtz, Linden Melvin'''<br />
<br />
For our in-class demo session, we plan to</div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=128-spring-2010-Assignment3&diff=9935128-spring-2010-Assignment32010-05-12T04:46:56Z<p>Nkruge: /* Group */</p>
<hr />
<div>== Group (example) ==<br />
<br />
* '''members''': Jieun Oh<br />
* ('''tentative) title of piece''': Converge<br />
* '''summary of piece concept''': collect data (location, time, audio recording, text, pictures, tapping gestures) from performers prior to the concert, and combine the elements into a piece during performance based on audience preference<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : [http://ccrma.stanford.edu/~jieun5 converge files]<br />
* '''documentation of what you tried as of May 11''': blah blah blah<br />
<br />
== Virtual Handbell Choir ==<br />
<br />
* '''members''': Nick, Daniel, Jay <br />
* ('''tentative) title of piece''': Virtual Handbell Choir (very tentative)<br />
* '''summary of piece concept''': Air handbells for 15 slork stations w/ Golf Controllers, featuring a Guitar Hero-like GUI which will automatically disseminate the parts of a MIDI song to each Slork station, with each player in charge of two bells.<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : <br />
Milestones - <br />
Instrument - Nick, Daniel - Wednesday 12th<br />
Network - Jay - Monday 10th (something done to pass to daniel)<br />
MIDI - Jay - Monday 10th (something done to pass to daniel)<br />
Graphics - Daniel - Monday 17th<br />
INTEGRATION - ALL - Wednesday 19th<br />
Piece - Nick - At least something by Wednesday 19th<br />
<br />
ideas for network: <br />
1.) songs stored on server - client downloads midi, parses - determines what it needs and displays that<br />
2.) songs stored on server - server parses song - client looks to server for specific part it needs - downloads and displays that<br />
<br />
ideas for graphics:<br />
- flies at you at an angle like rock band<br />
- hits are circles, higher velocity = larger circle + different color<br />
- long rectangle for the fast dinging thing<br />
- lower horizontal bar to show you need to dampen<br />
<br />
ideas for instrument:<br />
- dampen to your chest and/or with pedal - at least have pedal be a "kill switch" in case of error <br />
- use STK shaker code to excite the bell sound<br />
<br />
* '''documentation of what you tried as of May 11''':<br />
<br />
** '''Nick''':<br />
** '''Daniel''':<br />
** '''Jay''':<br />
<br />
== Group ==<br />
<br />
* '''members''': <br />
* ('''tentative) title of piece''': <br />
* '''summary of piece concept''':<br />
* '''link to all related files (chuck, audio files, instructions, scores)''' : <br />
* '''documentation of what you tried as of May 11''': <br />
<br />
<br />
'''Giancarlo Daniele, Ben Holtz, Linden Melvin'''<br />
<br />
For our in-class demo session, we plan to</div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=256b-winter-2010/degotchafier&diff=9596256b-winter-2010/degotchafier2010-03-03T09:53:40Z<p>Nkruge: /* student provided solutions */</p>
<hr />
<div>= Music 256b De-Gotcha-Fier =<br />
This page is intended to host solutions to those annoying little things that pop up while we're programming, especially those that take hours to solve and end up being very simple solutions. It is often difficult to find that "silver bullet" solution to a problem, but this wiki is dedicated to just that.<br />
<br />
* [http://ccrma.stanford.edu/courses/256b-winter-2010/ course homepage]<br />
* [[256b-winter-2010/(i)FAQ|(in)frequently asked questions]] (please post, we are watching this page!)<br />
<br />
<br />
== weblink provided solutions ==<br />
*If you are using a UITabBarController and the auto-rotate functions do not work, follow [http://arashpayan.com/blog/index.php/2008/09/04/change-iphoneipod-app-orientation-within-a-uitabbarcontroller/ this simple tutorial].<br />
<br />
== student provided solutions ==<br />
* certain objects, such as UIAlertViews, do not switch to landscape mode even when your plist is set to display landscape.<br />
To make certain that everything starts in landscape, drop in <code>[application setStatusBarOrientation:UIInterfaceOrientationLandscapeRight animated:NO];</code> in your didFinishLaunching. If you would like your home button to the left, use <code>[application setStatusBarOrientation:UIInterfaceOrientationLandscapeLeft animated:NO];</code><br />
<br />
* Using multiple nib files results in a SIGABRT "unrecognized selector" when you run IBActions off of your NavController or TabBar.<br />
You must set the class of BOTH the new nib file's "File's Owner" and of the NavController or TabBar tab that leads to the nib file (accessed by clicking the tab itself, not the tab picture) to the same ViewController class.</div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=256b-winter-2010/degotchafier&diff=9595256b-winter-2010/degotchafier2010-03-03T09:53:04Z<p>Nkruge: /* student provided solutions */</p>
<hr />
<div>= Music 256b De-Gotcha-Fier =<br />
This page is intended to host solutions to those annoying little things that pop up while we're programming, especially those that take hours to solve and end up being very simple solutions. It is often difficult to find that "silver bullet" solution to a problem, but this wiki is dedicated to just that.<br />
<br />
* [http://ccrma.stanford.edu/courses/256b-winter-2010/ course homepage]<br />
* [[256b-winter-2010/(i)FAQ|(in)frequently asked questions]] (please post, we are watching this page!)<br />
<br />
<br />
== weblink provided solutions ==<br />
*If you are using a UITabBarController and the auto-rotate functions do not work, follow [http://arashpayan.com/blog/index.php/2008/09/04/change-iphoneipod-app-orientation-within-a-uitabbarcontroller/ this simple tutorial].<br />
<br />
== student provided solutions ==<br />
* certain objects, such as UIAlertViews, do not switch to landscape mode even when your plist is set.<br />
To make certain that everything starts in landscape, drop in <code>[application setStatusBarOrientation:UIInterfaceOrientationLandscapeRight animated:NO];</code> in your didFinishLaunching. If you would like your home button to the left, use <code>[application setStatusBarOrientation:UIInterfaceOrientationLandscapeLeft animated:NO];</code><br />
<br />
* Using multiple nib files results in a SIGABRT "unrecognized selector" when you run IBActions off of your NavController or TabBar.<br />
You must set the class of BOTH the new nib file's "File's Owner" and of the NavController or TabBar tab that leads to the nib file (accessed by clicking the tab itself, not the tab picture) to the same ViewController class.</div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=256b-winter-2010/degotchafier&diff=9594256b-winter-2010/degotchafier2010-03-03T09:44:53Z<p>Nkruge: /* weblink provided solutions */</p>
<hr />
<div>= Music 256b De-Gotcha-Fier =<br />
This page is intended to host solutions to those annoying little things that pop up while we're programming, especially those that take hours to solve and end up being very simple solutions. It is often difficult to find that "silver bullet" solution to a problem, but this wiki is dedicated to just that.<br />
<br />
* [http://ccrma.stanford.edu/courses/256b-winter-2010/ course homepage]<br />
* [[256b-winter-2010/(i)FAQ|(in)frequently asked questions]] (please post, we are watching this page!)<br />
<br />
<br />
== weblink provided solutions ==<br />
*If you are using a UITabBarController and the auto-rotate functions do not work, follow [http://arashpayan.com/blog/index.php/2008/09/04/change-iphoneipod-app-orientation-within-a-uitabbarcontroller/ this simple tutorial].<br />
<br />
== student provided solutions ==<br />
* certain objects, such as UIAlertViews, do not switch to landscape mode even when your plist is set.<br />
To make certain that everything starts in landscape, drop in <code>[application setStatusBarOrientation:UIInterfaceOrientationLandscapeRight animated:NO];</code> in your didFinishLaunching. If you would like your home button to the left, use <code>[application setStatusBarOrientation:UIInterfaceOrientationLandscapeLeft animated:NO];</code></div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=256b-winter-2010/degotchafier&diff=9593256b-winter-2010/degotchafier2010-03-03T09:44:28Z<p>Nkruge: /* weblink provided solutions */</p>
<hr />
<div>= Music 256b De-Gotcha-Fier =<br />
This page is intended to host solutions to those annoying little things that pop up while we're programming, especially those that take hours to solve and end up being very simple solutions. It is often difficult to find that "silver bullet" solution to a problem, but this wiki is dedicated to just that.<br />
<br />
* [http://ccrma.stanford.edu/courses/256b-winter-2010/ course homepage]<br />
* [[256b-winter-2010/(i)FAQ|(in)frequently asked questions]] (please post, we are watching this page!)<br />
<br />
<br />
== weblink provided solutions ==<br />
*If you are using a UITabBarController and the auto-rotate functions do not work, follow [http://arashpayan.com/blog/index.php/2008/09/04/change-iphoneipod-app-orientation-within-a-uitabbarcontroller/ this simple tutorial].<br />
<br />
== student provided solutions ==<br />
* certain objects, such as UIAlertViews, do not switch to landscape mode even when your plist is set.<br />
To make certain that everything starts in landscape, drop in <code>[application setStatusBarOrientation:UIInterfaceOrientationLandscapeRight animated:NO];</code> in your didFinishLaunching. If you would like your home button to the left, use <code>[application setStatusBarOrientation:UIInterfaceOrientationLandscapeLeft animated:NO];</code></div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=256b-winter-2010/degotchafier&diff=9592256b-winter-2010/degotchafier2010-03-03T09:43:59Z<p>Nkruge: /* weblink provided solutions */</p>
<hr />
<div>= Music 256b De-Gotcha-Fier =<br />
This page is intended to host solutions to those annoying little things that pop up while we're programming, especially those that take hours to solve and end up being very simple solutions. It is often difficult to find that "silver bullet" solution to a problem, but this wiki is dedicated to just that.<br />
<br />
* [http://ccrma.stanford.edu/courses/256b-winter-2010/ course homepage]<br />
* [[256b-winter-2010/(i)FAQ|(in)frequently asked questions]] (please post, we are watching this page!)<br />
<br />
<br />
== weblink provided solutions ==<br />
*If you are using a UITabBarController and the auto-rotate functions do not work, follow [http://arashpayan.com/blog/index.php/2008/09/04/change-iphoneipod-app-orientation-within-a-uitabbarcontroller/ this simple tutorial].<br />
<br />
== student provided solutions ==<br />
* certain objects, such as UIAlertViews, do not switch to landscape mode even when your plist is set.<br />
To make certain that everything starts in landscape, drop in <code>[application setStatusBarOrientation:UIInterfaceOrientationLandscapeRight animated:NO];</code> in your didFinishLaunching. If you would like your home button to the left, use <code>[application setStatusBarOrientation:UIInterfaceOrientationLandscapeLeft animated:NO];</code></div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=256b-winter-2010/degotchafier&diff=9591256b-winter-2010/degotchafier2010-03-03T09:39:51Z<p>Nkruge: /* Music 256b De-Gotcha-Fier */</p>
<hr />
<div>= Music 256b De-Gotcha-Fier =<br />
This page is intended to host solutions to those annoying little things that pop up while we're programming, especially those that take hours to solve and end up being very simple solutions. It is often difficult to find that "silver bullet" solution to a problem, but this wiki is dedicated to just that.<br />
<br />
* [http://ccrma.stanford.edu/courses/256b-winter-2010/ course homepage]<br />
* [[256b-winter-2010/(i)FAQ|(in)frequently asked questions]] (please post, we are watching this page!)<br />
<br />
<br />
== weblink provided solutions ==<br />
<br />
<br />
== student provided solutions ==<br />
* certain objects, such as UIAlertViews, do not switch to landscape mode even when your plist is set.<br />
To make certain that everything starts in landscape, drop in <code>[application setStatusBarOrientation:UIInterfaceOrientationLandscapeRight animated:NO];</code> in your didFinishLaunching. If you would like your home button to the left, use <code>[application setStatusBarOrientation:UIInterfaceOrientationLandscapeLeft animated:NO];</code></div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=256b-winter-2010/degotchafier&diff=9590256b-winter-2010/degotchafier2010-03-03T09:39:31Z<p>Nkruge: /* student provided solutions */</p>
<hr />
<div>= Music 256b De-Gotcha-Fier =<br />
This page is intended to host solutions to those annoying little things that pop up while we're programming, especially those that take hours to solve and end up being very simple solutions. It is often difficult to find that "silver bullet" solution to a problem, but this wiki is dedicated to just that.<br />
<br />
* [http://ccrma.stanford.edu/courses/256b-winter-2010/ course homepage]<br />
* [[256b-winter-2010/(i)FAQ|(in)frequently asked questions]] (please post, we are watching this page!)<br />
<br />
<br />
== weblink provided solutions ==<br />
<br />
<br />
== student provided solutions ==<br />
* certain objects, such as UIAlertViews, do not switch to landscape mode even when your plist is set.<br />
To make certain that everything starts in landscape, drop in <code>[application setStatusBarOrientation:UIInterfaceOrientationLandscapeRight animated:NO];</code> in your didFinishLaunching. If you would like your home button to the left, use <code>[application setStatusBarOrientation:UIInterfaceOrientationLandscapeLeft animated:NO];</code></div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=256b-winter-2010/degotchafier&diff=9589256b-winter-2010/degotchafier2010-03-03T09:26:38Z<p>Nkruge: This page is intended to host solutions to those annoying little things that pop up while we're programming, especially those that take hours to solve and end up being very simple solutions.</p>
<hr />
<div>= Music 256b De-Gotcha-Fier =<br />
This page is intended to host solutions to those annoying little things that pop up while we're programming, especially those that take hours to solve and end up being very simple solutions. It is often difficult to find that "silver bullet" solution to a problem, but this wiki is dedicated to just that.<br />
<br />
* [http://ccrma.stanford.edu/courses/256b-winter-2010/ course homepage]<br />
* [[256b-winter-2010/(i)FAQ|(in)frequently asked questions]] (please post, we are watching this page!)<br />
<br />
<br />
== weblink provided solutions ==<br />
<br />
<br />
== student provided solutions ==</div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=256b-winter-2010&diff=9588256b-winter-20102010-03-03T09:20:09Z<p>Nkruge: /* Music 256b | Mobile Music (Music, Computing, and Design II) */</p>
<hr />
<div>= Music 256b | Mobile Music (Music, Computing, and Design II) =<br />
<br />
* [http://ccrma.stanford.edu/courses/256b-winter-2010/ course homepage]<br />
* [[256b-winter-2010/(i)FAQ|(in)frequently asked questions]] (please post, we are watching this page!)<br />
<br />
<br />
== labs + assignments ==<br />
* homework 1: [[256b-winter-2010/hw1 | iPhone programming lab: audio + interaction]]<br />
* homework 2: [[256b-winter-2010/hw2 | SonicSlingShot]]<br />
* homework 3: [[256b-winter-2010/hw3 | Design Your Own Instrument]]<br />
<br />
<br />
== final projects ==<br />
* coming later!<br />
<br />
<br />
== useful resources ==<br />
* check out notes and code from the [http://ccrma.stanford.edu/courses/256b-fall-2010/lectures/ lectures]<br />
* [[256b-winter-2010/Bib|bibliography and readings]]<br />
* [[256b-winter-2010/degotchafier|MUS256B iPhone Programming De-Gotcha-fier]]<br />
<br />
<br />
== Office Hours ==<br />
* TBD<br />
<br />
<br />
<br />
[[Category: Courses]]</div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=220a-fall-2009/FinalProjects&diff=9323220a-fall-2009/FinalProjects2009-11-17T05:55:46Z<p>Nkruge: /* FINAL PRESENTATIONS SCHEDULE */</p>
<hr />
<div>[http://ccrma.stanford.edu/wiki/220a-fall-2009 BACK TO WIKI MAIN PAGE]<br />
<br />
==Sign up for Meetings with Michelle and Michael==<br />
<br />
* Tuesday, November 17th: <i> ... in 10 minute chunks (from 9am until 12pm) ... </i><br />
<br />
**9:30 ~ Jacqueline Gordon<br />
**9:40 ~ Adam Sheppard<br />
**9:50 ~ Xiang Zhang<br />
**10:00 ~ Hongchan Choi<br />
**10:10 ~ Linden Melvin<br />
**10:20 ~ Stephen Henderson<br />
**10:30 ~ Loren Yu<br />
**10:40 ~ Matt Bush<br />
**10:50 ~ Brian Yoo<br />
**11:00 ~ Andrew Plan<br />
**11:10 ~ Michael Repper<br />
**11:20 ~ Nick Kruge<br />
**11:30 ~ Christopher Fajardo<br />
**11:40 ~ Zach Brand<br />
**11:50 ~ Ben Roth<br />
**12:00 ~ Uri Nieto<br />
**12:10 ~ Jonathan Potter<br />
**12:20 ~ Jason Chen<br />
<br />
<br />
* Thursday, November 19th: <i> ... in 10 minute chunks (from 9am until 12pm) ... </i><br />
<br />
**9:30 ~ David Kettler<br />
**9:40 ~ Marc Evans<br />
**9:50 ~ Max Halvorson<br />
**10:00 ~ dohi moon<br />
**10:10 ~ Alex Kaneko<br />
**10:20 ~ Sewon Jang<br />
**10:30 ~ Daniel Hollingshead<br />
**10:40 ~ Brian Lewis<br />
**10:50 ~ Ben Cunningham<br />
**11:00 ~ Rebecca Hsu<br />
**11:10 ~ Roy Fejgin<br />
**11:20 ~ Alberto Guzman<br />
**11:30 ~ Adam Somers<br />
**11:40 ~ Locky Casey<br />
**11:50 ~ Jacob Wittenberg<br />
**12:00 ~ Colin Raffel<br />
**12:10 ~ Tyler Maue<br />
**12:20 ~ Sarah Masimore<br />
<br />
== FINAL PRESENTATIONS SCHEDULE ==<br />
*Tuesday, December 1, 2009:<br />
**10:00 ~ Zach Brand<br />
**10:10 ~ Xiang Zhang<br />
**10:20 ~<br />
**10:30 ~ dohi moon<br />
**10:40 ~ Matt Bush<br />
**10:50 ~<br />
**11:00 ~ <br />
**11:10 ~ Loren Yu<br />
**11:20 ~<br />
**11:30 ~<br />
**11:40 ~<br />
**11:50 ~<br />
<br />
<br />
*Thursday, December 3, 2009:<br />
**10:00 ~ Uri Nieto<br />
**10:10 ~ Sarah Masimore<br />
**10:20 ~ Ben Roth<br />
**10:30 ~ Jacob Wittenberg<br />
**10:40 ~ Tyler Maue<br />
**10:50 ~ Jonathan Potter<br />
**11:00 ~ Linden Melvin<br />
**11:10 ~ Adam Sheppard<br />
**11:20 ~ Stephen Henderson<br />
**11:30 ~ Alex Kaneko<br />
**11:40 ~ Roy Fejgin<br />
**11:50 ~<br />
<br />
<br />
*Thursday, December 10, 2009:<br />
**3:30pm ~ Jacqueline Gordon<br />
**3:40pm ~ Adam Somers<br />
**3:50pm ~ Marc Evans<br />
**4:00pm ~ Christopher Fajardo<br />
**4:10pm ~ Sewon Jang<br />
**4:20pm ~ Brian Yoo<br />
**4:30pm ~ Daniel Hollingshead<br />
**4:40pm ~ Brian Lewis<br />
**4:50pm ~ Michael Repper<br />
**5:00pm ~ Andrew Plan<br />
**5:10pm ~ David Kettler<br />
**5:20pm ~ Nick Kruge<br />
**DO NOT BOOK AFTER 5:20pm unless ALL OTHER SLOTS ARE FILLED IN<br />
***5:30pm ~<br />
***5:40pm ~<br />
***5:50pm ~<br />
<br />
<br />
...<br />
<br />
----<br />
<br />
<br />
[http://ccrma.stanford.edu/wiki/220a-fall-2009 <b>BACK TO WIKI MAIN PAGE</b>]</div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=256a-fall-2009/final&diff=9228256a-fall-2009/final2009-11-10T23:08:31Z<p>Nkruge: /* Music 256a | Final Projects */</p>
<hr />
<div>= Music 256a | Final Projects =<br />
<br />
[[Growl Hero]]<br />
<br><br />
[http://ccrma.stanford.edu/~adam/256a/project The Insaniac]</div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=256a-fall-2009/final&diff=9227256a-fall-2009/final2009-11-10T23:06:04Z<p>Nkruge: /* Music 256a | Final Projects */</p>
<hr />
<div>= Music 256a | Final Projects =<br />
<br />
[[Growl Hero]]<br />
The Insaniac - [http://ccrma.stanford.edu/~adam/256a/project]</div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=220a-fall-2009/FinalProjects&diff=9226220a-fall-2009/FinalProjects2009-11-10T18:50:17Z<p>Nkruge: /* Sign up for Meetings with Michelle and Michael */</p>
<hr />
<div>[http://ccrma.stanford.edu/wiki/220a-fall-2009 BACK TO WIKI MAIN PAGE]<br />
<br />
==Sign up for Meetings with Michelle and Michael==<br />
<br />
* Tuesday, November 17th: <i> ... in 10 minute chunks (from 9am until 12pm) ... </i><br />
**9:00 ~ Brian Yoo<br />
**9:10 ~ Xiang Zhang<br />
**9:20 ~<br />
**9:30 ~ Jacqueline Gordon<br />
**9:40 ~ Adam Sheppard<br />
**9:50 ~<br />
**10:00 ~ Hongchan Choi<br />
**10:10 ~ Linden Melvin<br />
**10:20 ~<br />
**10:30 ~ Loren Yu<br />
**10:40 ~ Matt Bush<br />
**10:50 ~<br />
**11:00 ~ Andrew Plan<br />
**11:10 ~ Michael Repper<br />
**11:20 ~ Nick Kruge<br />
**11:30 ~ Christopher Fajardo<br />
**11:40 ~ Zach Brand<br />
**11:50 ~ Ben Roth<br />
**12:00 ~ Uri Nieto<br />
**12:10 ~ Jonathan Potter<br />
**12:20 ~ Jason Chen<br />
<br />
<br />
* Thursday, November 19th: <i> ... in 10 minute chunks (from 9am until 12pm) ... </i><br />
**9:00 ~<br />
**9:10 ~<br />
**9:20 ~<br />
**9:30 ~<br />
**9:40 ~<br />
**9:50 ~<br />
**10:00 ~ dohi moon<br />
**10:10 ~<br />
**10:20 ~<br />
**10:30 ~ Daniel Hollingshead<br />
**10:40 ~ Brian Lewis<br />
**10:50 ~<br />
**11:00 ~<br />
**11:10 ~<br />
**11:20 ~<br />
**11:30 ~<br />
**11:40 ~<br />
**11:50 ~ Jacob Wittenberg<br />
**12:00 ~ Colin Raffel<br />
**12:10 ~ Tyler Maue<br />
**12:20 ~ Sarah Masimore<br />
<br />
== FINAL PRESENTATIONS SCHEDULE ==<br />
*Tuesday, December 1, 2009:<br />
**10:00 ~ Zach Brand<br />
**10:10 ~<br />
**10:20 ~<br />
**10:30 ~ dohi moon<br />
**10:40 ~ Matt Bush<br />
**10:50 ~<br />
**11:00 ~ Andrew Plan<br />
**11:10 ~ Loren Yu<br />
**11:20 ~<br />
**11:30 ~<br />
**11:40 ~<br />
**11:50 ~<br />
<br />
<br />
*Thursday, December 3, 2009:<br />
**10:00 ~ Uri Nieto<br />
**10:10 ~ Sarah Masimore<br />
**10:20 ~ Ben Roth<br />
**10:30 ~ Jacob Wittenberg<br />
**10:40 ~ Tyler Maue<br />
**10:50 ~ Jonathan Potter<br />
**11:00 ~ Linden Melvin<br />
**11:10 ~ Adam Sheppard<br />
**11:20 ~<br />
**11:30 ~<br />
**11:40 ~<br />
**11:50 ~<br />
<br />
<br />
*Thursday, December 10, 2009:<br />
**3:30pm ~ Jacqueline Gordon<br />
**3:40pm ~ <br />
**3:50pm ~<br />
**4:00pm ~ Christopher Fajardo<br />
**4:10pm ~ <br />
**4:20pm ~ Brian Yoo<br />
**4:30pm ~ Daniel Hollingshead<br />
**4:40pm ~ Brian Lewis<br />
**4:50pm ~ Michael Repper<br />
**5:00pm ~<br />
**5:10pm ~<br />
**5:20pm ~<br />
**DO NOT BOOK AFTER 5:20pm unless ALL OTHER SLOTS ARE FILLED IN<br />
***5:30pm ~<br />
***5:40pm ~<br />
***5:50pm ~<br />
<br />
<br />
...<br />
<br />
----<br />
<br />
<br />
[http://ccrma.stanford.edu/wiki/220a-fall-2009 <b>BACK TO WIKI MAIN PAGE</b>]</div>Nkrugehttps://ccrma.stanford.edu/mediawiki/index.php?title=220a-fall-2009/studentmusic&diff=8711220a-fall-2009/studentmusic2009-09-29T17:22:08Z<p>Nkruge: /* SIGN-UP for MUSIC Presentations on wiki */</p>
<hr />
<div>== SIGN-UP for MUSIC Presentations on wiki ==<br />
* Thurs. Sept. 24 = Dohi & Adam<br />
* Tues. Sept. 29 = Adam Somers, Uri Nieto, Zach Brand<br />
* Thurs. Oct. 1 = Sarah Masimore, Nick Kruge<br />
* Tues. Oct. 6 =<br />
* Thurs. Oct. 8 = Jacob Wittenberg<br />
* Tues. Oct. 13 = Ben Roth<br />
* Thurs. Oct. 15 = <br />
* Tues. Oct. 20 = Colin Raffel<br />
* Thurs. Oct. 22 =<br />
* Tues. Oct. 27 = Matt Bush<br />
* Thurs. Oct. 29 =<br />
* Tues. Nov. 3 =<br />
* Thurs. Nov. 5 =<br />
* Tues. Nov. 10 =<br />
* Thurs. Nov. 12 = John Bauer<br />
* Tues. Nov. 17 =<br />
<br />
<br />
<br />
--<br />
* [http://cm-wiki.stanford.edu/wiki/220a-fall-2009 Back to 220a wiki Page]</div>Nkruge