Final Project Proposal:
I'm a fan of the Serum synthesizer, but I've run into limitations using it with my DawDreamer project. With Serum loaded in DawDreamer, I can load presets and change various parameters, such as the amount of modulation from Envelope 1 to Oscillator 2's panning, but I can't change the routing itself; the routing is baked into the preset. In other words, the modulation matrix is fixed, and only the amount of modulation for each existing entry in the table can change. I want some kind of API in which I can decide the routing myself.
The second major issue is that I can't load a wavetable from Python. The wavetable is also baked into the preset in a non-modifiable way.
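To make that goal concrete, here's a minimal sketch (entirely hypothetical, not DawDreamer's or Serum's API) of the kind of data structure a "set the wavetable from Python" call could accept: a 2D wavetable as a list of single-cycle frames, with linear interpolation along the frame axis.

```python
import math

FRAME_SIZE = 2048  # samples per single-cycle frame

def make_wavetable(num_frames=8, frame_size=FRAME_SIZE):
    """Build a 2D wavetable that morphs from a sine to a saw across frames."""
    frames = []
    for f in range(num_frames):
        mix = f / (num_frames - 1)  # 0 = pure sine, 1 = pure saw
        frame = []
        for i in range(frame_size):
            phase = i / frame_size
            sine = math.sin(2 * math.pi * phase)
            saw = 2.0 * phase - 1.0
            frame.append((1 - mix) * sine + mix * saw)
        frames.append(frame)
    return frames

def read_wavetable(frames, phase, position):
    """Look up a sample: `phase` in [0,1) scans within a frame,
    `position` in [0,1] interpolates between adjacent frames."""
    x = position * (len(frames) - 1)
    lo = int(x)
    hi = min(lo + 1, len(frames) - 1)
    frac = x - lo
    idx = int(phase * len(frames[0]))
    return (1 - frac) * frames[lo][idx] + frac * frames[hi][idx]
```

The synth would only need to accept `frames` (a list of lists of floats) from Python; everything else stays on the C++ side.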
The third issue is not related to sound design but to using the sound signals for visual design. I'd like access to all of the intermediate signals in a modular synthesis setup: if there are envelopes and LFOs, I want an audio-rate stream of each of them in addition to the final stereo signal. I intend to use these signals later for real-time audio-reactive visual design.
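As a sketch of what "accessible intermediate signals" could mean in practice (a hypothetical design of mine, not an existing API): each modulator renders into its own audio-rate buffer, and the engine returns those buffers alongside the main stereo output.

```python
import math

SAMPLE_RATE = 44100

class LFO:
    """Sine LFO rendered at audio rate."""
    def __init__(self, freq_hz):
        self.freq = freq_hz
        self.phase = 0.0

    def process(self, num_samples):
        out = []
        for _ in range(num_samples):
            out.append(math.sin(2 * math.pi * self.phase))
            self.phase = (self.phase + self.freq / SAMPLE_RATE) % 1.0
        return out

def render(num_samples):
    """Render a toy stereo signal and also expose every modulator's buffer."""
    lfo = LFO(2.0)
    lfo_buf = lfo.process(num_samples)   # audio-rate modulator tap
    left = [0.5 * v for v in lfo_buf]    # stand-in for the real voice output
    right = [-0.5 * v for v in lfo_buf]
    taps = {"lfo1": lfo_buf}             # handed to the visuals side later
    return (left, right), taps
```

The key design point is that the taps dictionary is part of the render result, so the visual layer gets the same sample-accurate streams the audio engine used.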
To summarize, I plan on making a wavetable synthesizer with many features:
- 2D Wavetable oscillators
- The 2D wavetables can be set via Python or some C++ API.
- Various filters and FX
- API for modular routing (modular synthesis)
- LFOs and Envelopes whose outputs are accessible
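The modular-routing item is the heart of the proposal, so here's a minimal sketch of the kind of API I have in mind (all names hypothetical): a modulation matrix whose entries can be added and removed at runtime, not merely scaled as in Serum's preset-baked matrix.

```python
class ModMatrix:
    """Runtime-editable modulation matrix: (source, destination, amount)."""
    def __init__(self):
        self.routes = []  # list of (source_name, dest_name, amount)

    def connect(self, source, dest, amount):
        self.routes.append((source, dest, amount))

    def disconnect(self, source, dest):
        self.routes = [r for r in self.routes if (r[0], r[1]) != (source, dest)]

    def apply(self, source_values, base_params):
        """Offset each destination's base value by its scaled sources."""
        params = dict(base_params)
        for source, dest, amount in self.routes:
            params[dest] = params.get(dest, 0.0) + amount * source_values[source]
        return params

# Routing that Serum would bake into the preset, now decided in code:
matrix = ModMatrix()
matrix.connect("env1", "osc2_pan", 0.8)
matrix.connect("lfo1", "filter_cutoff", 0.3)
```

In the real synth, `apply` would run per block (or per sample) inside the audio callback, with `connect`/`disconnect` exposed through the Python/C++ API.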
As a spinoff project, once I figure out some of the wavetable work, it would also be great to have a basic "sampler" instrument such as the one in Ableton Live or Kontakt. I want to be able to provide samples via Python or C++. I'd then want some of the basic sampler features, such as ADSR envelopes to control volume, panning, filter cutoff, etc.
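The first of those basic sampler features is an amplitude ADSR. A minimal linear ADSR evaluated per sample, sketched here as my own toy version (not Ableton's or Kontakt's implementation):

```python
SAMPLE_RATE = 44100

class ADSR:
    """Linear attack/decay/sustain/release envelope, evaluated per sample."""
    def __init__(self, attack, decay, sustain, release):
        self.a = max(1, int(attack * SAMPLE_RATE))   # attack length, samples
        self.d = max(1, int(decay * SAMPLE_RATE))    # decay length, samples
        self.s = sustain                              # sustain level, 0..1
        self.r = max(1, int(release * SAMPLE_RATE))  # release length, samples

    def value(self, n, note_off_at=None):
        """Envelope level at sample index n; note_off_at starts the release."""
        if note_off_at is not None and n >= note_off_at:
            start = self.value(note_off_at)  # level when the key was released
            t = n - note_off_at
            return max(0.0, start * (1.0 - t / self.r))
        if n < self.a:
            return n / self.a
        if n < self.a + self.d:
            t = (n - self.a) / self.d
            return 1.0 + t * (self.s - 1.0)
        return self.s
```

Multiplying each output sample by `env.value(n)` gives the volume control; routing the same values through a mod matrix would cover panning and cutoff.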
I need to look at the inner workings of surge-python because it might have a good example of setting modular routing via Python.
Progress so far:

- I learned Faust.
- I got [FaucK](https://ccrma.stanford.edu/~rmichon/fauck/) working on Windows. A pull request for the chugins repository is [here](https://github.com/ccrma/chugins/pull/49).
- I made a basic IDE for Faust inside TouchDesigner: [TD-Faust](https://github.com/DBraun/TD-Faust/). The basic workflow is the same as FaucK. An additional cool feature is that it can generate a UI of TouchDesigner widgets based on the Faust code you write. This uses the same [APIUI](https://github.com/grame-cncm/faust/blob/master-dev/architecture/faust/gui/APIUI.h) as FaucK. An alternative workflow uses the polyphonic DSP factory classes and the [MidiUI](https://github.com/grame-cncm/faust/blob/master-dev/architecture/faust/gui/MidiUI.h). This doesn't have the same UI generator feature, but at least it's polyphonic with hardware MIDI.
- I got somewhat sidetracked and put a lot of time into fixing faust2juce for Windows. Here's the [PR that got merged](https://github.com/grame-cncm/faust/pull/576).
- I looked at the source code of [Vital](https://github.com/mtytel/vital/), which does spectral deformations of wavetables. I'm very interested in understanding how its modular aspects work, for example, how an ADSR can be routed to affect a filter cutoff.
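My reading of Vital is still ongoing, but the mechanism I'm after (a modulator driving a filter cutoff at audio rate) can be sketched generically with a one-pole lowpass whose coefficient is recomputed each sample from an envelope value. This is my own toy version, not Vital's code:

```python
import math

SAMPLE_RATE = 44100

def one_pole_lowpass(signal, cutoff_per_sample):
    """One-pole lowpass whose cutoff (Hz) is modulated per sample."""
    y = 0.0
    out = []
    for x, fc in zip(signal, cutoff_per_sample):
        # Standard one-pole smoothing coefficient for the current cutoff.
        a = 1.0 - math.exp(-2.0 * math.pi * fc / SAMPLE_RATE)
        y += a * (x - y)
        out.append(y)
    return out

# A linear ramp stands in for an ADSR and sweeps the cutoff
# from 200 Hz up to 8200 Hz over the block.
n = 4410
env = [i / n for i in range(n)]
cutoffs = [200.0 + 8000.0 * e for e in env]
harsh = [1.0 if i % 2 == 0 else -1.0 for i in range(n)]  # Nyquist-rate input
filtered = one_pole_lowpass(harsh, cutoffs)
```

As the envelope opens the cutoff, more of the harsh input passes through, which is exactly the "ADSR routed to filter cutoff" behavior I want to reproduce in a real voice architecture.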
- I browsed some repositories of Samplers such as [CTAG-JUCE-Sampler](https://github.com/NiklasWan/CTAG-JUCE-Sampler) and [JUCE_simple_sampler](https://github.com/vincentchoqueuse/JUCE_simple_sampler/blob/master/Source/CustomSampler.cpp#L271).
- Ultimately I decided to clone JUCE's official [SamplerPluginDemo](https://github.com/juce-framework/JUCE/blob/master/examples/Plugins/SamplerPluginDemo.h). I encountered [an issue with not hearing audio](https://github.com/juce-framework/JUCE/issues/893) but partially resolved it. The demo plays back a sample at different speeds based on the MIDI note, using MPE (MIDI Polyphonic Expression), and linearly interpolates the sample for different pitches. There are no amplitude envelopes or filters. Now I'm studying the code to see if I can add more modular features, like routing an ADSR to a filter cutoff.
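The playback behavior described above (reading a sample at a rate derived from the MIDI note, with linear interpolation between neighboring samples) reduces to a short loop. Here's the same idea sketched in Python, assuming the root sample is mapped to MIDI note 60:

```python
import math

def pitch_ratio(midi_note, root_note=60):
    """Playback-speed ratio for a note, relative to the root sample's pitch."""
    return 2.0 ** ((midi_note - root_note) / 12.0)

def resample_linear(sample, ratio):
    """Read `sample` at `ratio` speed with linear interpolation."""
    out = []
    pos = 0.0
    while pos < len(sample) - 1:
        i = int(pos)
        frac = pos - i
        out.append((1.0 - frac) * sample[i] + frac * sample[i + 1])
        pos += ratio
    return out

# One octave up (note 72) plays at 2x speed, yielding about half the samples.
source = [math.sin(2 * math.pi * 440.0 * n / 44100.0) for n in range(44100)]
octave_up = resample_linear(source, pitch_ratio(72))
```

The JUCE demo does the equivalent inside its voice's render callback; adding the modular features means inserting envelope and filter stages into that same per-sample loop.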