Programming Etude #3: "Wekinate Your World"

Wekinator 1 - GOOD MORNING

Have you ever fallen asleep while working? Here's an AI to reduce your suffering.

Sadly, this experiment/tool is based on real-life experience. It was developed during my documentary shoot week this quarter, when a lot of traveling, producing, and filming made sleep a rare commodity. (Cue: 'All Work and No Play Makes Jack a Dull Boy.') I used FaceOSC to track my head position and the openness of my eyes and lips, and trained Wekinator to map those features to a single value indicating whether I'm about to fall asleep. When that value says I'm drifting off, ChucK plays that ANNOYING iPhone alarm sound to wake me up. :)
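
For the curious, here is roughly what the ChucK side looks like. This is a minimal sketch, not my exact patch: it assumes Wekinator's defaults (one continuous output sent to /wek/outputs on port 12000), and the alarm.wav filename and 0.8 threshold are placeholders I made up for illustration.

```chuck
// Minimal sketch of the alarm patch (not the exact code).
// Assumes Wekinator's defaults: one continuous output sent as a
// float to /wek/outputs on port 12000. "alarm.wav" and the 0.8
// threshold are placeholders.

SndBuf alarm => dac;
"alarm.wav" => alarm.read;      // placeholder alarm sample
alarm.samples() => alarm.pos;   // park playhead at the end: silent at start

OscIn oin;
OscMsg msg;
12000 => oin.port;                  // Wekinator's default send port
oin.addAddress("/wek/outputs, f");  // one float: "sleepiness"

0.8 => float threshold;             // assumed cutoff for "falling asleep"

while (true)
{
    oin => now;                     // wait for the next OSC message
    while (oin.recv(msg))
    {
        msg.getFloat(0) => float sleepy;
        // retrigger only once the previous alarm has finished
        if (sleepy > threshold && alarm.pos() >= alarm.samples())
        {
            0 => alarm.pos;         // rewind: the alarm rings
        }
    }
}
```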

Video Demo:

Wekinator 2 - Celeste

A playful attempt to automate game music based on hand gestures. The original ambition was to recognize the hand movements you make while holding a game controller, but those changes were too subtle for Wekinator to register.

Celeste is one of my favorite video games of all time, and its soundtrack is golden. I have drawn so much inspiration from it, especially in designing electronic music. Since my last project, I have been eager to integrate gameplay with real-time music creation. My initial idea was to use VisionOSC to track my hand positions while playing on a controller and to train gestures for music selection. That turned out to be much harder than I had imagined, so plan B was to simplify things to basic, exaggerated hand gestures.
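
A sketch of what the simplified gesture-to-music routing could look like in ChucK. This assumes Wekinator is configured as a classifier with one output on its default /wek/outputs address and port 12000 (Wekinator's class labels start at 1), and the track filenames are placeholders, not the actual soundtrack files.

```chuck
// Minimal sketch of a gesture-to-track switcher (assumptions, not
// my exact patch): Wekinator runs as a classifier with one output
// on /wek/outputs, port 12000; class labels in Wekinator start at 1.
// Track filenames are placeholders for the actual soundtrack stems.

3 => int NUM_TRACKS;
SndBuf tracks[NUM_TRACKS];
["calm.wav", "tense.wav", "victory.wav"] @=> string files[];

for (0 => int i; i < NUM_TRACKS; i++)
{
    tracks[i] => dac;
    files[i] => tracks[i].read;
    1 => tracks[i].loop;    // each layer loops continuously
    0 => tracks[i].gain;    // silent until a gesture selects it
}

OscIn oin;
OscMsg msg;
12000 => oin.port;
oin.addAddress("/wek/outputs, f");  // one float carrying the class label

while (true)
{
    oin => now;
    while (oin.recv(msg))
    {
        // Wekinator sends the class as a float, 1-indexed
        (msg.getFloat(0) $ int) - 1 => int idx;
        for (0 => int i; i < NUM_TRACKS; i++)
        {
            if (i == idx) 1.0 => tracks[i].gain;
            else          0.0 => tracks[i].gain;
        }
    }
}
```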

Video Demo:

Wekinator 3 - Fire, Walk with Me

Kids, don't play with fire using your hands.

A 'musical instrument' inspired by the countless YouTube fireplace ambience videos: an attempt to control FIRE with my HAND!!!! I used VisionOSC to track my hand position and, via Wekinator, mapped it to parameters of a LiSa granular synthesis patch built on a YouTube fireplace recording. To be honest, Wekinator trained surprisingly well for hand-gesture control!
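
Here is a minimal sketch of the ChucK side, simplified from the real patch: it scrubs a LiSa loop (rate plus loop start) rather than doing full grain scheduling, assumes two continuous Wekinator outputs (hand x/y, each 0..1) on the default /wek/outputs address and port 12000, and uses fireplace.wav as a placeholder filename.

```chuck
// Minimal sketch, simplified to loop-scrubbing rather than the full
// granular patch. Assumes two continuous Wekinator outputs (hand x/y,
// each 0..1) on /wek/outputs, port 12000; "fireplace.wav" stands in
// for the YouTube fireplace recording.

SndBuf buf => blackhole;    // used only to load the file
"fireplace.wav" => buf.read;

LiSa lisa => dac;
lisa.duration(buf.samples()::samp);

// copy the recording into LiSa's live sample buffer
for (0 => int i; i < buf.samples(); i++)
{
    lisa.valueAt(buf.valueAt(i), i::samp);
}

lisa.loopEnd(buf.samples()::samp);
lisa.loop(1);
lisa.play(1);

OscIn oin;
OscMsg msg;
12000 => oin.port;
oin.addAddress("/wek/outputs, ff");     // two floats: hand x and y

while (true)
{
    oin => now;
    while (oin.recv(msg))
    {
        msg.getFloat(0) => float x;
        msg.getFloat(1) => float y;

        // hand x bends playback speed around normal (0.25x..1.75x)
        lisa.rate(0.25 + 1.5 * x);

        // hand y scrubs the loop start point through the recording
        // (scaled by 0.9 to keep the start below the loop end)
        lisa.loopStart((y * 0.9 * buf.samples())::samp);
    }
}
```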

Video Demo:

Checkpoint Reflection

To be frank, I have had very limited time to learn and experiment with Wekinator so far. The first of my three tools comes from real-life experience - desperate times call for desperate measures. To briefly summarize my findings so far: the input-to-output routing, where arbitrary OSC features get mapped to continuous or classified outputs, is very interesting. I do wish there were a way to inspect the individual examples in the training dataset and selectively delete some of them.

Final Reflection

Making interactive tools with Wekinator has been another journey of struggle and challenge. One issue is deciding what kind of activity I really want to automate with AI. I understand that part of this assignment is to have fun, be creative, and keep the scope of your Wekinator creations small. I started off with a few really ambitious ideas but failed miserably.

Wekinator is such an interesting tool. Unlike the other AI models we used in the past two projects, it offers more of a black-box experience: you train the model by example rather than programming its behavior directly. Initially, I thought it would be a very simple project: I just tell the model what to do and adjust the details accordingly. That was so wrong on so many levels. LOL. It ended up demanding more and more manual labeling, plus knowledge of ChucK instrument design that I wasn't really familiar with and hadn't needed in the past two projects. Whereas the first two projects could be programmed much more manually, typing out and laying out a musical melody, instrument design requires understanding how each Wekinator output should be interpreted and mapped so that it translates into a controllable sound with meaningful variation. It is very hard.

Not to make any excuses, but the project also caught me at a very busy time, when I didn't have the time and resources, unlike during the other projects, to go in-depth into learning how to manipulate sound in ChucK through parameters like frequency and pitch. If I had more time, I would love to delve deeper and perhaps use FM synthesis and LiSa granular control. Maybe this is something I can explore in my final project.