final project

jack xiao

musical fruit ninja!

description

i really loved fruit ninja when i was younger, and when i got an xbox kinect for my 10th birthday, i spent so many hours playing the gesture-controlled, motion-capture version of fruit ninja. so, my aim was to recreate this experience with a twist: incorporating new sounds and an option for more user control. the core of the game is the same as the original fruit ninja: slice as many fruit as you can. each fruit, when sliced, emits a unique sound, and this is where things get interesting.

there are 2 sound modes: default and pitched. the default sounds are simply a unique alternative to the standard fruit-slicing sounds, with some interesting textures and randomization. the pitched sound mode, on the other hand, gives the user full control over the pitch based on where on the screen the fruit is sliced: the higher the slice, the higher the pitch. in either mode, the volume/gain of the sound is controlled by the speed/force of the slice. within the pitched sounds, there are also two sub-modes: one maps the screen to a diatonic scale, the other to a chromatic scale, and the user can toggle between them as desired. there is also a toggleable line guide that illustrates where each pitch sits on the screen (the sketch below illustrates the rough mapping).

finally, there is the choice to automate the fruit tossing. the default mode is user-controlled, where the user chooses which fruit gets tossed using the number keys on the keyboard. toggling the auto setting turns the game into what the original fruit ninja feels like, with fruit flying in at different paces and locations. there are also bonus fruits, just like the real game (of course, paired with new unique sounds): a blue freeze fruit slows the movement by half, also turning the screen blue to match the vibe, and the striped frenzy fruit triggers 10 seconds of craziness where tons of fruit start flying in, building some cool sound textures.
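
to make the pitched mode concrete, here is a minimal sketch of the mapping idea: the normalized slice height picks a scale degree, and the slice speed sets the gain. the actual implementation lives in the unity project (with the sounds in ChucK); the python below is just an illustration, and the function names, base pitch, and ranges are hypothetical.

```python
# illustrative sketch (not the actual unity/chuck code): map a slice's
# normalized screen height to a pitch and its speed to a gain value.
# the scale layout, base pitch, and ranges here are hypothetical.

DIATONIC = [0, 2, 4, 5, 7, 9, 11]   # major-scale semitone offsets
CHROMATIC = list(range(12))

def slice_to_pitch(y_norm, base_midi=60, octaves=2, chromatic=False):
    """y_norm is 0.0 at the bottom of the screen, 1.0 at the top."""
    scale = CHROMATIC if chromatic else DIATONIC
    steps = len(scale) * octaves
    idx = min(int(y_norm * steps), steps - 1)
    octave, degree = divmod(idx, len(scale))
    return base_midi + 12 * octave + scale[degree]

def slice_to_gain(speed, max_speed=3000.0):
    """faster slices play louder; clamp to [0, 1]."""
    return min(speed / max_speed, 1.0)
```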

then, in addition to the unity portion, there is a python script that can be downloaded that allows the user to control the mouse using hand motions. this runs on the computer itself, so the mouse control extends to the whole computer, and since the blade is controlled by the mouse, this allows for hand-gesture control of the blade.
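
the core of that script looks roughly like the sketch below: OpenCV reads webcam frames, mediapipe finds the index fingertip, and autopy moves the system cursor. this is a simplified version (smoothing, click gestures, and frame-rate handling are omitted), and the constants are placeholders rather than the values in the downloadable script.

```python
# simplified sketch of the hand-to-mouse script (smoothing and click
# gestures omitted; constants are placeholders).
import cv2
import mediapipe as mp
import autopy

screen_w, screen_h = autopy.screen.size()
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)  # mirror so the motion feels natural
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        # index fingertip is landmark 8; coordinates are normalized [0, 1]
        tip = results.multi_hand_landmarks[0].landmark[8]
        x = min(max(tip.x, 0.0), 1.0) * (screen_w - 1)
        y = min(max(tip.y, 0.0), 1.0) * (screen_h - 1)
        autopy.mouse.move(x, y)
    cv2.imshow("hand tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```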

 

video (for some reason youtube refuses to process the video in HD even though the file i uploaded is definitely 1080p, so maybe a live demo would be better?)

 

build

build (for macOS, Intel 64-bit)

 

unity project

project

 

screenshots

frenzy mode

audioviz

 

guide lines in pitched mode

audioviz

 

instructions for running hand control:

game controls:

 

things that could still be improved

 

 

 

 

milestone 3: minimal system

progress update

the primary focus for this milestone was to get the hand-tracking/computer vision component working. so, surprisingly, unity was not the sole source of my suffering for the past week. instead, while i expected the python to come more easily (and the coding itself in python was certainly much more straightforward), setting up the python environment with the necessary computer vision and computer control/GUI packages took many hours of trial and error. i needed OpenCV for the computer vision/hand tracking itself, autopy to add mouse control to the python program, and mediapipe to provide a nice display of how the hand was being tracked. i initially tried setting up a conda environment, but for some reason mediapipe and autopy didn't work very well with conda, so i resorted to pip instead. then, certain packages didn't work with certain versions of python (most notably autopy, which doesn't work/build correctly with Python 3.9, but did work when i switched back to Python 3.7).

i was able to get the hand-tracking/mouse control program working in PyCharm, and at that point i could run it concurrently with the unity project (admittedly not in the most convenient way: you have to run the python program and start the unity game separately, by hand). i spent a day or two trying to integrate the two, to no avail. ideally i was trying to find a way to just kick off a concurrent python program from a Unity C# script, but so far everything i've tried has not worked. i also added some basic control over the fruit, where the user can choose which fruit gets thrown using the keyboard. see the questions/next steps sections for more info on what is to come.
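
since the autopy build only worked for me under Python 3.7, one small thing the script could do is fail fast with a clear message before importing it. this is just a sketch of that guard, assuming the script is run directly; adjust or remove it if autopy builds fine on your interpreter.

```python
# sketch of a guard at the top of the hand-tracking script: autopy failed to
# build/import under Python 3.9 in my setup, so fail early with a clear message.
import sys

if sys.version_info[:2] != (3, 7):
    raise RuntimeError(
        "hand-tracking script expects Python 3.7; "
        f"running {sys.version.split()[0]} may break the autopy import"
    )

import autopy  # imported only after the version check
```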

 

video

 

some questions/blockers

next steps

 

 

 

milestone 2: early prototype

progress update

i decided to pursue the musical fruit ninja idea, since it was the most "different" from what i've done before and i thought it would be the most visually interesting option. also, the addition of computer vision and/or motion capture as a medium of interaction sounded both challenging and engaging. before starting the unity project, i spent a lot of time searching for the best way to incorporate camera input into the game. i found a "markerless motion capture" application called Captury that had a free unity plugin, but as i was trying to work with the plugin, i eventually realized that it depended on applying for and purchasing the full Captury Live software bundle, which was unfortunate. thus, i had to find a different way to incorporate camera input.

since i was really short on time until this milestone was due, i decided to just get the "fruit ninja" part of the game finished with some audio incorporated, and to add camera input later on. i looked around and found a solid unity fruit ninja tutorial on youtube that i followed closely for this milestone, with a few small design/implementation changes to better suit where i think my project is headed. so, all the core functionality is from this tutorial: https://www.youtube.com/watch?v=xTT1Ae_ifhM and is fully working with mouse input.

at the moment, each fruit is mapped to a different sound in ChucK, and when the fruit is sliced, the corresponding sound is played. the blade/slicing is controlled by the mouse. currently, fruit flies into the screen at randomized angles and frequencies within defined ranges (a rough sketch of that launch logic is below); i plan for this to become more defined (sequencer-like) so i can create more intentional musical statements and more defined "choreography".
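
the real spawner is a Unity C# script, but conceptually the randomized launching amounts to something like this python sketch; the fruit list matches the prefabs, while the numeric ranges here are placeholders, not the values actually used in the project.

```python
import random

# conceptual sketch of the spawner (the real version is a Unity C# script);
# the ranges below are placeholders, not the project's actual tuning.
FRUITS = ["apple", "watermelon", "mango", "kiwi", "lemon"]

def next_launch():
    return {
        "fruit": random.choice(FRUITS),        # which prefab to toss
        "angle_deg": random.uniform(60, 120),  # mostly upward, some sideways drift
        "force": random.uniform(12, 18),       # initial impulse magnitude
        "delay_s": random.uniform(0.5, 2.0),   # wait before the next toss
    }
```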

it took a surprisingly long time to follow the tutorial and get the game working, since there were a lot of details to incorporate and aspects of Unity that i had not worked with much before (colliders and rigidbodies). however, now that the core game is functionally there, it will be much easier to incorporate the additional elements that i think will make this final project unique, namely the sounds and the interaction (see next steps). it is already pretty fun to play around with; i find the act of slicing fruit to be very therapeutic and soothing, and it really brings me back to the days of playing fruit ninja on my phone (and the Xbox Kinect version, which i'd say is the main inspiration behind this project).

 

video

 

next steps

first, i will work on incorporating camera input, allowing the player to control the blade and the slicing directly using their hand and a camera. right now, the idea with the most potential is to use OpenCV and Python (or maybe TensorFlow Object Detection/TensorMouse) to write a program that lets a user control the mouse using hand gestures and camera streaming, and then translate that mouse control into Unity. since the current iteration already depends directly on mouse input, once i finish writing the python program to move the mouse using my hand, it should be relatively straightforward to add that to the Unity project.

next, i will work on making the fruit launching more defined and intentional. i will add some interface with which a user can construct a piece of music by controlling the pattern/rhythm/timing at which certain fruits are launched (a rough sketch of what that data might look like is below). this shares some similarities with the sequencer project, but this version will look very different from the sequencer i ended up creating for hw3. i might also allow the user to control the direction in which certain fruits are launched (to create a more defined choreography), or perhaps it would be better if the direction of launch were left out of the user's hands and instead fully controlled by the sequencer itself. i also plan to incorporate some pitched sounds, so the interface would need some way for the user to control the pitches as well.
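
purely to make the idea concrete, here is a hypothetical sketch of the kind of pattern data such a sequencer might consume; nothing like this exists in the project yet, and all names and values are made up.

```python
# hypothetical sketch of sequencer-style launch data: each step says when a
# fruit is tossed, which fruit, and (for pitched sounds) what pitch to aim for.
PATTERN = [
    {"beat": 0.0, "fruit": "apple",      "midi": 60},
    {"beat": 1.0, "fruit": "kiwi",       "midi": 64},
    {"beat": 2.0, "fruit": "watermelon", "midi": 67},
    {"beat": 3.5, "fruit": "lemon",      "midi": 72},
]

def schedule(pattern, bpm=100):
    """convert beats to seconds so a spawner can fire each step at the right time."""
    sec_per_beat = 60.0 / bpm
    return [(step["beat"] * sec_per_beat, step) for step in pattern]
```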

finally, the last big next step is to add additional "game features" from fruit ninja and map those game events to different musical events. for example, slicing multiple fruit in a single directional slice (from what i remember of the game) yields some combo reward or score multiplier, so this could trigger a certain musical event. other aspects from fruit ninja that i want to include are the "bonus bananas", where slicing a specially colored bonus banana applies a different effect depending on the type of banana (the "freeze banana" slows the game down by half, the "multiplier banana" doubles the score for each fruit, the "frenzy banana" triggers 10 or so seconds of rapid fruit launching from all sides of the screen, etc). the "freeze banana" especially, i think, would be very effective in a musically oriented version of the game. i have yet to decide exactly how i'll incorporate these elements, but i will decide as i implement other features and get a better feel for what is practical and feasible while still being artistically viable.

miscellaneous addition(s): making the fruits look better with additional elements in the prefabs (e.g. give the apple a stem and leaf instead of just being a red sphere, give the watermelon some stripes, give the mango a gradient, give the kiwi seeds, make the lemon actually the shape of a lemon, etc), but that might be challenging given the way the fruit slicing is implemented. this feature is not strictly necessary, so i will only do it if there is time.

 

 

 

milestone 1: ideas

idea 1: musical fruit ninja + computer vision

use computer vision (or another method) to identify silhouette and movement, controlling an on-screen display/visuals where notes fly into the screen and you try to slice them.

image1

 

idea 2: pitch game

a game where the player needs to hum/sing various pitches to navigate a spaceship and avoid asteroids, with music in the background that harmonizes with the correct note.

image2

 

idea 3: navigating time

use components of my audio-visualized clock to design a more complex world based around time, where the player can move around freely and either control or lose control of time. this idea would probably be more like a pure story/narrative.

image3