Homework: Gestural Synth

  • Out: Tuesday, Nov 15, 2022
  • Due: Tuesday, Nov 29, 2022


Try out the WebChucK "laptop Thereminimum". Your external mouse is mapped to pitch and timbre, and the laptop's trackpad controls volume. Record a tune. Actually, record a few takes and see if you get better at it (you can keep hitting start-record-stop without refreshing the page). This homework is a quick study in combining a laptop's physical controls with WebChucK. Familiarize yourself with handling gestural data, then submit your own laptop instrument plus a short recording of a performance on it.

Key Results

Final project submissions will follow this same format.

  1. Submit your project to the hw5 directory.
  2. In that directory, create an index.html file which includes:
     • Documentation of changes you made on top of the starter code.
     • A link to your instrument (which is the WebChucK page itself).
     • A link that plays a short audio clip of a performance using it.

Starter Code

The gestural synth is defined across both the JavaScript code and the live ChucK code. It reads three gesture signals:

  • external mouse Y position
  • external mouse scroll wheel
  • trackpad scroll "wheel" (two-finger scroll)

All gesture values are scaled to the 0.0 - 1.0 range.
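As a sketch of that scaling (the helper name is an assumption, not taken from the starter code), each raw gesture reading can be mapped onto the 0.0 - 1.0 range and clamped:

```javascript
// Hypothetical helper: map a raw gesture reading (e.g. mouse Y in
// pixels) onto the 0.0 - 1.0 range the synth expects.
function clampScale(value, min, max) {
  const t = (value - min) / (max - min);
  return Math.min(1, Math.max(0, t)); // clamp in case the reading overshoots
}

// Browser wiring (illustrative):
// const pitch = clampScale(event.clientY, 0, window.innerHeight);
console.log(clampScale(300, 0, 600)); // → 0.5
```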

Position and scroll movements produce continuous updates, which are handled in the code. Discrete events can be combined with these, too: in the demo, keyboard key events also set timbre. This feature is redundant with the mouse scroll wheel (but useful if there's no external mouse), and it gives a coarse way of setting the timbral mix.
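One way such a coarse, key-driven control could work (a sketch with assumed names, not the demo's actual code) is to quantize the parameter into a handful of steps:

```javascript
// Sketch: quantize a timbre index chosen by key presses into the
// synth's 0.0 - 1.0 range. Five steps is an arbitrary assumption.
const TIMBRE_STEPS = 5;

function timbreFromStep(step) {
  const clamped = Math.min(TIMBRE_STEPS - 1, Math.max(0, step));
  return clamped / (TIMBRE_STEPS - 1); // 0, 0.25, 0.5, 0.75, 1
}

// Browser wiring (illustrative): keys "1".."5" pick the mix.
// document.addEventListener("keydown", (e) => {
//   if (e.key >= "1" && e.key <= "5") {
//     timbre = timbreFromStep(Number(e.key) - 1);
//   }
// });
```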

Create a new instrument that doesn't sound like the demo instrument.


Other gesture signals can be used besides the ones in the demo:

  • mouse movement in the X dimension
  • triggers from mouse button events
  • triggers from trackpad button events
  • triggers from specific keyboard key events
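Button and key events arrive as down/up pairs, so a trigger usually needs to fire once per press rather than on every repeat. A minimal sketch (names are assumptions):

```javascript
// Sketch: turn discrete down/up events into one-shot triggers,
// ignoring auto-repeat and re-entry while the control is held.
function makeTrigger(onFire) {
  let armed = true;
  return {
    down() { if (armed) { armed = false; onFire(); } },
    up()   { armed = true; },
  };
}

// Browser wiring (illustrative):
// const trig = makeTrigger(() => { /* e.g. start a note */ });
// document.addEventListener("mousedown", trig.down);
// document.addEventListener("mouseup", trig.up);
```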

Depending on the browser, hardware, and operating system, JavaScript can also track signals from a mobile device's gyroscope and accelerometer.
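Accelerometer readings are signed, so they need refolding into the synth's 0.0 - 1.0 convention. A sketch (the ±10 m/s² span is an assumption; check your device's actual range):

```javascript
// Sketch: fold a signed accelerometer reading into 0.0 - 1.0,
// assuming readings of roughly ±span when tilting the device.
function tiltToUnit(accel, span = 10) {
  const t = (accel + span) / (2 * span);
  return Math.min(1, Math.max(0, t));
}

// Browser wiring (illustrative; support varies, and iOS requires a
// user-gesture permission prompt before motion events are delivered):
// window.addEventListener("devicemotion", (e) => {
//   volume = tiltToUnit(e.accelerationIncludingGravity.x);
// });
```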

Other sound sources to consider:

  • physical models (for example, Clarinet)
  • ADC (mic or hydrophone)
  • SndBuf for reading in a .wav file with gestural control of looping
  • gestural control of sound effects (filters, chorus, etc.)
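For effect parameters like a filter cutoff, a linear map from the 0.0 - 1.0 gesture range tends to bunch the audible change at one end; an exponential curve feels more even to the ear. A sketch (the 40 Hz - 8 kHz range and the WebChucK wiring are assumptions; check the starter code for the actual mechanism):

```javascript
// Sketch: map a 0.0 - 1.0 gesture value onto a filter cutoff in Hz
// along an exponential curve, so equal gesture motion gives roughly
// equal perceived change. The 40-8000 Hz range is arbitrary.
function gestureToCutoff(g, lo = 40, hi = 8000) {
  return lo * Math.pow(hi / lo, g);
}

// Illustrative WebChucK wiring (hedged): push the value into a
// global float that the live ChucK code reads, e.g.:
// theChuck.setFloat("cutoff", gestureToCutoff(mouseY01));
```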

Patterns can also be the target of real-time gestural control.

  • isorhythm patterns (instead of 4 against 3, set the cycle lengths dynamically)
  • random walk (excursion, update rate)
  • periodic patterns
  • dynamical systems with gestures for real-time control of parameters
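As one illustration of gesture-controlled pattern parameters, here is a sketch of a bounded random walk whose excursion (step size) comes from a 0.0 - 1.0 gesture value; names and the update loop are assumptions, not part of the starter code:

```javascript
// Sketch: one step of a random walk clamped to 0.0 - 1.0, with the
// maximum step size (excursion) set by a gesture value. The random
// source is injectable so the step is testable.
function walkStep(value, excursion, rand = Math.random) {
  const step = (rand() * 2 - 1) * excursion; // in [-excursion, +excursion]
  return Math.min(1, Math.max(0, value + step));
}

// A performance loop might call this at a gesture-controlled rate:
// setInterval(() => { pitch = walkStep(pitch, scrollY01); }, periodMs);
```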