Homework: Auditory Streaming Illusion

  • Out: Oct 17, 2023 Tuesday
  • Due: Oct 24, 2023 Tuesday

Products

The homework studies are built around one-click files that can run from anywhere. Working on a study involves experimentation with possibilities and likely produces many files with a variety of ideas. Keep them around! Give them descriptive names. Sketches are great to go back to. They can accumulate on your own computer and / or the CCRMA server. The final product will be the one in your CCRMA server hw2 directory named index.html

In-class lab

Try these examples in the IDE.
(learn to start, stop and play multiple shreds at the same time)

This version of the IDE comes preloaded with a custom 220a ChucK class called isoRhythmClass that plays a sequence of tones.
(see the isoRhythmClass.ck code; it only plays when sporked, as in the examples that follow)
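Independent of isoRhythmClass (whose actual API isn't reproduced here), the basic mechanics of sporking can be sketched like this: two shreds launched together, each running its own loop at its own tempo. This is only a minimal sketch; the function and its parameters are made up for illustration.

```chuck
// sketch: sporking two concurrent shreds
// blip() is a hypothetical example function, not part of isoRhythmClass
fun void blip( float freq, dur beat )
{
    SinOsc s => dac;
    freq => s.freq;
    while( true )
    {
        0.2 => s.gain;  beat/2 => now;   // note on
        0.0 => s.gain;  beat/2 => now;   // note off
    }
}

spork ~ blip( 440, 500::ms );   // first shred
spork ~ blip( 660, 333::ms );   // second shred, different rate
5::second => now;               // let them play; child shreds stop when the parent exits
```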

Export an IDE project using the IDE's File : Export WebChucK Page feature. It will produce a full "one-click" webapp, called index.html.
(it is saved locally, can be renamed, can be copied to a server or shared as an attachment, for example)

Overview

Make a short (1~2 min.) version of the auditory streaming illusion explained and demonstrated here. For this study, FM synthesis will be used to demonstrate the perceptual phenomenon of "auditory streaming." Run the starter code examples above, play with them to understand how the illusion works, and then invent replacements and variations that fulfill the assignment. Solutions should depart from the starter code in creative ways.

Objectives

  • Exploring auditory illusions through synthesis
  • Manipulating FM to create distinct timbres
  • Algorithmic control of musical patterns
  • Encapsulating (larger) code structures as reusable classes and preloading them from files
  • Sporking sections and layers of a composition

Background

Listen to Al Bregman's demonstration of the illusion. The phenomenon of auditory streaming is a fascinating one. It is the process by which your mind groups together perceived objects or events and suggests similarities that make the groups cohere. In music, your mind often groups objects for you over time - which is why you can listen to a band and follow the lines of the different instrumentalists somewhat independently!

Here's a visual analogy: the mind groups objects together when they are more closely spaced.

In this assignment, you'll be using the FM instrument from the starter code isoRhythmClass that's built into the IDE to create multiple different sound sources that then perceptually group together when played in closer timing. You can alter aspects of the class itself as demonstrated in the examples: alter class member variables and alter the FM synth instrument. You can also create a child class that inherits the parent's behavior and extends it. See "inheritance" under ChucK's class hierarchy documentation.
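As a generic sketch of that inheritance idea (the class and method names here are hypothetical; isoRhythmClass's real members differ), a child class can override a parent's method to add a variation:

```chuck
// sketch: ChucK class inheritance and method overriding
// Parent and Child are made-up names for illustration
public class Parent
{
    fun void play()
    {
        <<< "parent pattern" >>>;
    }
}

class Child extends Parent
{
    // override the parent's method with a variation
    fun void play()
    {
        <<< "child variation" >>>;
    }
}

Child c;
c.play();   // runs the child's override
```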

Download the starter code and edit it to create an auditory illusion that works convincingly - shoot for one that maximally segregates three or more voices when played at a fast tempo. The effect depends on differentiation of sonic parameters. Choose your own dimension(s) for the effect, e.g., timbral, spatial and/or envelope qualities. Tune the effect by ear and make it different from the starter code example. As in the starter code example, you should hear a slow note sequence blossoming into illusory polyphonic lines as it speeds up.

This block diagram from John Chowning's The Synthesis of Complex Audio Spectra by Means of Frequency Modulation shows a typical simple FM instrument.

The illusion can be created with any sufficiently radical sonic differences. For the assignment, we're restricted to contrasts within the parameters belonging to ChucK's FM patches or your own custom FM synth. For the latter, you might start with the very basic code shown in simple FM with 2 sine oscillators, adding envelopes to shape the evolution of a note. Also see the demos from the 2022 class listed below. They explore distinct timbres and creating musical sections with synthpatches and engines. They run as "monolithic" one-click files rather than in the IDE, but their code can be copied over to the IDE. (Note that some of them use mouse/trackpad control, which we don't yet have in the IDE framework.)
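A minimal sketch of that two-sine-oscillator FM idea, with an ADSR envelope shaping each note (the particular ratio, index and envelope numbers below are just starting points to tune by ear):

```chuck
// sketch: simple FM - one SinOsc modulating another, ADSR on the output
SinOsc mod => SinOsc car => ADSR env => dac;
2 => car.sync;              // sync mode 2: input modulates the carrier's frequency
220.0 => float cf;          // carrier frequency
cf => car.freq;
2.0 * cf => mod.freq;       // 2:1 modulator:carrier ratio picks the timbre family
200.0 => mod.gain;          // modulation depth (~ index * mod freq) sets brightness
env.set( 10::ms, 80::ms, 0.6, 150::ms );   // attack, decay, sustain, release

// play a few notes
for( 0 => int i; i < 4; i++ )
{
    env.keyOn();
    200::ms => now;
    env.keyOff();
    150::ms => now;
}
```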

Ultimately, these are the kinds of choices for voicing the illusion.

  • Choice of synthesizer
  • Loudness (gain)
  • Timbre parameters (e.g., carrier / modulator frequency ratios)
  • Spectral brightness (e.g., modulation index)
  • Envelopes on carrier and modulator (e.g., ADSR settings)
  • Left / right panning (stereo panning with Pan2, or just choice of dac.chan)
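To illustrate that last bullet, here is a small sketch of both panning approaches side by side: one voice placed with Pan2, one wired straight into a dac channel.

```chuck
// sketch: two voices separated in space
SinOsc a => Pan2 p => dac;
-0.8 => p.pan;               // mostly left (-1.0 = hard left, 1.0 = hard right)
440 => a.freq;  0.2 => a.gain;

SinOsc b => dac.chan(1);     // connected directly to the right channel
550 => b.freq;  0.2 => b.gain;

2::second => now;
```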

Nothing happens, what broke? (the debugger is your friend)

It's always a good idea to work incrementally, keeping copies of recent known-to-be-working programs that you can go back to (aka "reality restoration"), and extending ahead one feature at a time. If a change breaks things, it's way easier to debug when the only difference is a small one. Keeping a sequence of working versions is way easier than uncovering a bug lurking in one file with a slew of changes.

All programming environments have ways to poke and prod code, and browsers are no exception. Their debuggers are there to help and can be turned on and off with a keystroke. Inside the debugger, look for the JavaScript debug console; that's where your debugging print statements are displayed. It shows printing from both ChucK and JavaScript. The syntax for print statements in ChucK is <<< "hi from ChucK" >>> and from JavaScript console.log("hi from JavaScript").

Browser Code Parsing (warning)

We're throwing all kinds of code into html files like hw2/index.html. These one-click files are where the action is and contain a mix of html code, ChucK code and JavaScript (languages which each have their own interpreters, syntax and parsers). When the file gets opened in a browser, all of its contents are first parsed as html code to read instructions that build the page that will be displayed. But, as you've seen, we're also including the ChucK code that gets displayed as live code. That creates a complication everyone needs to be aware of.

When the html parser sees the character < directly followed by text, like <xxx, it interprets that as the start of an html tag. Ughh, instead of ChucK code. Inserting a space fixes the problem. This is a very good thing to know, otherwise the ChucK code mysteriously disappears. Until this gets fixed by higher powers, be sure to add that space to guard against the problem: < xxx instead of <xxx. That way, ChucK comments like <<< "hi" >>>, or comparisons like x < y, won't get mangled by the html parser and won't produce a ChucK syntax error.

This applies only to ChucK code listed in the html file itself. Anything typed directly when interacting with live code is safe (it's sent directly to the ChucK interpreter). Same with any ChucK code that's preloaded from other files.

Non-IDE Examples (Webapps demonstrating extra ideas in Chuck)

Manipulating FM to create distinct timbres from 2022 (won't run in the 2023 IDE)

Example            demonstrates           of parameters
FMDemo.html        mouse control          m.freq, index
FMDemo2.html       fixed settings         m.freq, index
FMDemo3.html       envelopes              cADSR, mADSR
FMDemo4.html       algorithmic control    all of the above
FMDemo5.html       algorithmic control    panning
FMpatches.html     built-in               synthpatches

With algorithmic control of pitch, timbre parameters and spatial cues, patterns such as 4 elements of one dimension cycling against 3 elements of another dimension can be generated from processes. Pattern engines can be abstracted and preloaded for use in compositional experimentation.
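The 4-against-3 cycling idea can be sketched with two arrays of different lengths indexed modulo their sizes (the particular pitches and pan positions below are arbitrary choices):

```chuck
// sketch: a 4-element pitch cycle against a 3-element pan cycle
// the combined pattern repeats every 12 notes (lcm of 4 and 3)
[ 60, 64, 67, 71 ] @=> int pitches[];    // 4 MIDI pitches
[ -1.0, 0.0, 1.0 ] @=> float pans[];     // 3 pan positions

SinOsc s => ADSR env => Pan2 p => dac;
env.set( 5::ms, 60::ms, 0.5, 60::ms );

for( 0 => int i; i < 24; i++ )
{
    Std.mtof( pitches[i % 4] ) => s.freq;   // cycle of 4
    pans[i % 3] => p.pan;                   // cycle of 3
    env.keyOn();   100::ms => now;
    env.keyOff();  50::ms => now;
}
```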

Generating musical sections with synthpatches and engines

  • Encapsulating (larger) code structures as reusable classes and preloading them from files makes the code simpler, cleaner and easier to manipulate
  • Sporking sections and layers are the "big handles" for construction of a composition
Example              demonstrates     of algorithm
4against3.html       cycling          pitches, timbres
preloadPattern.html  preloading       pattern engine
crazedCalliope.html  sporking         sections
calliopeChorus.html  sound effects    instruments

Three of these examples preload the file isoRhythm.ck, which must live on the server. Consider writing your own public classes and putting their definitions into similar files, like the example. To preload them, they first need to be copied to your CCRMA server hw2 directory. The very last example above demonstrates how to extend a public class after preloading to give it some tweaks.

Composition

There's music to be found in the juxtaposition of sections both horizontally and vertically.

  • Experiment in the live code with horizontal composition to assemble sections in sequence
  • Experiment in the live code with vertical composition to layer multiple sections (pay attention to overall gain so the audio doesn't clip)
  • (optional) Add sound effects, for example, incorporating chorus and / or reverb as the last stage before dac output
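A minimal sketch of the optional effects bullet, chaining chorus and reverb as the last stage before the dac (the mix settings are just starting points to adjust by ear):

```chuck
// sketch: chorus and reverb as the final stage before dac output
SinOsc s => Chorus ch => NRev rev => dac;
0.3 => ch.mix;        // wet/dry balance of the chorus
0.1 => rev.mix;       // a small amount of reverb
0.2 => s.gain;        // keep overall gain modest to avoid clipping
440 => s.freq;
1::second => now;
```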

Final product

  • Save the final product as a file in your CCRMA server hw2 directory named index.html -- it should play a short (1~2 min.) piece when clicked on from the homework factory