Music 220b: Lecture 1 Slides

Some definitions...

Let's fire up our Digital Webster and take a look at what it says.

Two worlds

The first part of our music creation task is to go from our mental image of the music to a score, meaning here a representation of the list of musical events in a particular syntax. The second step is to render that representation, transforming it into sound. The first step is handled by the Common Music package (although it is not the only alternative). The program that handles the second step depends on the selected score syntax: each syntax corresponds to a different sound rendering program.
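As a rough illustration of the first step, here is a sketch in the style of a later Common Music release (the exact syntax varies between versions, and the note parameters here are invented for the example): a small process generates a list of MIDI note events and writes them to a score file, which a separate program then renders.

;; Hypothetical sketch, Common-Music-style Lisp.
;; "test.mid" and the keynum/rhythm values are made up for illustration.
(events
 (process repeat 8                       ; emit eight events
          output (new midi
                      :time (now)        ; schedule at current process time
                      :keynum (between 60 72)  ; random pitch in an octave
                      :duration .4)
          wait .5)                       ; half a second between events
 "test.mid")

The point is the division of labor: Common Music only produces the event list in some syntax; a rendering program turns that file into sound.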

The rendering packages

Here is a list of the rendering packages currently available that can be controlled through Common Music. We can roughly split them into real-time and non-real-time rendering systems, although the distinction is becoming academic: computers are now so fast that so-called "non-real-time" synthesis packages are already synthesizing sound faster than real time.

How do we run Lisp here at CCRMA?

Two ways: as a subprocess inside the general-purpose Emacs editor, or as a subprocess inside the "Lisp" application (which lives in /LocalApps/). Of course, you can also run the Lisp image from a terminal...

Where are things?

Some rendering packages share the same Lisp image with Common Music. Other rendering packages we'll be using are separate programs, but they can be called from within Lisp (like the MIDI driver or the MusicKit scorefile player).
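As a hedged sketch of what calling an external renderer from Lisp can look like: in CMU CL (one Lisp in use at the time), an external program can be launched with ext:run-program. The program name "playscore" and its argument are hypothetical stand-ins for the actual MusicKit scorefile player invocation.

;; Hypothetical sketch, CMU CL specific: ext:run-program spawns a
;; separate process. "playscore" and "test.score" are placeholder names.
(ext:run-program "playscore"        ; external rendering program
                 '("test.score")    ; score file produced by Common Music
                 :wait t)           ; block until rendering finishes

Other Lisp implementations provide their own equivalents for spawning subprocesses; the mechanism differs, but the idea is the same: Lisp writes the score, then hands it off to the external renderer.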

©1995 Fernando Lopez-Lezcano. All Rights Reserved.