Welcome to the GuitarFace Wiki! Check out the GitHub repository for GuitarFace here: https://github.com/ginacollecchia/GuitarFace.

Why is this project called '''guitar face'''? It started with a feeling. Which felt a lot like this.

[[File:Brianmay_guitar_face.jpg]]

Oh hey there Brian May! I bet you're playing some sweet jams. Here are some other amazing guitar faces. Try to notice similarities between them.

[[File:Gary_Moore_guitar.jpg]]

Gary Moore, amazing

[[File:Ritchie_guitar.jpg]]

Richie Sambora, squeezing one out

[[File:Metheny_guitar.jpg]]

Pat Metheny is trying, people.

[[File:Malmsteen_guitar.jpg]]

Yngwie! So many nooootes.

[[File:Joe_Satriani_guitar.jpg]]

Joe Satriani, stunned

''(source for most of these images: http://www.guitarburn.com/2009/09/the-10-most-disturbing-guitar-faces/)''

'''Similarities''':

* open mouth, maybe even some tongue action, OR pursed lips
* closed eyes OR buggy eyes
* scrunched face OR long face
* raised head?

With these observations, we believe there might be extractable features of a '''guitar face'''. Using face tracking algorithms, we were able to reliably detect whether the mouth is open or closed and whether the eyelids are open or closed. When one of these is detected, a guitar face graphical event fires.
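To make that concrete, here is a minimal sketch of the kind of test involved, assuming FaceTracker-style 2D landmark points. The index constants and thresholds are made up for illustration; they are not the values from our code.

<pre>
#include <opencv2/core/core.hpp>
#include <cmath>
#include <vector>

// Hypothetical indices into the 66-point FaceTracker mask; the real
// indices depend on the mask layout. One eye is shown; a real test
// would average both.
const int MOUTH_TOP = 51, MOUTH_BOTTOM = 57;
const int EYE_TOP = 37, EYE_BOTTOM = 41;
const int BROW = 27, CHIN = 8; // used to normalize for face size

struct GuitarFaceState { bool mouthOpen; bool eyesClosed; };

static float dist(const cv::Point2f& a, const cv::Point2f& b) {
    return std::hypot(a.x - b.x, a.y - b.y);
}

// Decide mouth-open / eyes-closed from vertical landmark gaps,
// normalized by face height so the test doesn't depend on how far
// the player sits from the camera.
GuitarFaceState detectGuitarFace(const std::vector<cv::Point2f>& pts) {
    float faceHeight = dist(pts[BROW], pts[CHIN]);
    GuitarFaceState s;
    s.mouthOpen  = dist(pts[MOUTH_TOP], pts[MOUTH_BOTTOM]) / faceHeight > 0.15f;
    s.eyesClosed = dist(pts[EYE_TOP], pts[EYE_BOTTOM]) / faceHeight < 0.03f;
    return s; // either field being true triggers a guitar face event
}
</pre>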

MIDI guitar input is paired with the computer vision to trigger events in a visual environment, so that guitarists (and other musicians) can get real-time feedback on their playing, plus fun rewards for their practice sessions.

== Usage ==

Open GuitarFace.xcodeproj with Xcode. Once the app is running, pressing '''s''' will start the GuitarFace game. Press '''q''' at any time to quit.
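For the curious, the key handling boils down to something like the following GLUT-style callback. This is a hypothetical sketch; the real handler lives in the Xcode project and may be structured differently.

<pre>
#include <cstdlib>

// Hypothetical hook into the app; named for illustration only.
void startGame() { /* begin tracking MIDI + face events */ }

// GLUT-style keyboard callback: 's' starts the game, 'q' quits.
void keyboardFunc(unsigned char key, int /*x*/, int /*y*/) {
    switch (key) {
        case 's': startGame();   break;
        case 'q': std::exit(0);  break;
    }
}
// registered once at startup, e.g.: glutKeyboardFunc(keyboardFunc);
</pre>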

== Goals ==

* To detect the facial features of the '''guitar face''' using OpenCV (SUCCESS!)
* To make a visualizer of musical MIDI data that records and rewards user input in real time (SURE!)
* To be able to compare sessions against one another and track individual progress (NOT SO MUCH!)

Networking was a little too last-minute to be implemented. With networking, we were hoping to send guitar face events to a second player, so they would pop up on that player's screen.
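Had we gotten there, sending a guitar face event could have been as small as one UDP datagram per event. A hypothetical sketch (plain BSD sockets; the message format, peer address, and port are made up):

<pre>
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstring>

// Send a one-byte "guitar face!" event to the other player over UDP.
bool sendGuitarFaceEvent(const char* peerIp, int port) {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) return false;
    sockaddr_in addr;
    std::memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(port);
    inet_pton(AF_INET, peerIp, &addr.sin_addr);
    const char msg = 'G'; // invented wire format: 'G' = guitar face event
    bool ok = sendto(sock, &msg, 1, 0, (sockaddr*)&addr, sizeof(addr)) == 1;
    close(sock);
    return ok;
}
// e.g.: sendGuitarFaceEvent("192.168.1.42", 9000);
</pre>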

== Assumptions ==

In designing the musical data that we provide to players, we make a few assumptions about what musicians want. Our model is similar to an exercise routine, where the user can set goals / thresholds for their practice sessions. We are interested in tracking the following variables, particularly in the context of solo guitar playing and improvisation (a sketch after the list shows how a few of them can be computed):

* duration of the session / number of notes played
* count of pitches in the user-defined key
* count of notes on or off the beat, relative to a user-defined tempo and meter (with an optional metronome feature)
* count of vibratos*
* count of pitch bends*
* count of slides*
* count of power chords
* count of musical intervals (m2, M2, P4, etc.)
* count of "big jumps" between pitches (>P8)
* chord labeling
* dynamic range (moving average / smart)
* pitch range
* stage presence, '''guitar face'''
* fretboard heat-map: where on the guitar neck are you playing most frequently?*
* repetition of pitch sequences
* clarity and consistency of dynamics and intonation (streaks)
* mixture of divisions of the beat (i.e., all quarter notes = bad)
* pace of playing
* extra-musical moments, such as palm mutes and artificial harmonics, if detectable
* the distribution of these features over the specific beats of a measure

(*these data were not possible to gather from the MIDI guitar input)
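Here is a simplified sketch of how a few of these counters can fall out of a stream of MIDI note-ons. The struct and function names are illustrative, and the power-chord test is a crude stand-in (a real one would look for simultaneous notes a fifth apart):

<pre>
#include <cstdlib>

// Illustrative per-session counters for a few of the tracked variables.
struct SessionStats {
    int notes = 0, inKey = 0, bigJumps = 0, powerChords = 0;
    int minPitch = 127, maxPitch = 0;
};

// Is this MIDI pitch in a user-defined major key with the given tonic (0-11)?
bool inMajorKey(int pitch, int tonic) {
    static const bool major[12] = {1,0,1,0,1,1,0,1,0,1,0,1}; // W-W-H-W-W-W-H
    return major[((pitch - tonic) % 12 + 12) % 12];
}

// Update counters for each incoming note-on (lastPitch = -1 for the first note).
void onNoteOn(SessionStats& s, int pitch, int lastPitch, int tonic) {
    s.notes++;
    if (inMajorKey(pitch, tonic)) s.inKey++;
    if (lastPitch >= 0) {
        int interval = std::abs(pitch - lastPitch);
        if (interval > 12) s.bigJumps++;     // "big jump": more than a P8
        if (interval == 7) s.powerChords++;  // crude stand-in for a P5 dyad
    }
    if (pitch < s.minPitch) s.minPitch = pitch;
    if (pitch > s.maxPitch) s.maxPitch = pitch;
}
</pre>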

These variables comprise the analytic structure of our software, and we can abstract musical functions from them, such as measure and chord naming. Each variable triggers an event, whether it's a histogram/graph, a numerical count, a graphical object (say, a squiggly for vibrato, changing with time and fading away completely after some time lapse), or a change to the graphical environment. Major changes happen when certain (built-in or user-set) goals are reached: for example, hitting "100 notes!" turns the tunnel blue (thought bubble: whoaaa!).
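A sketch of how such a goal might fire a one-shot graphical event once its threshold is crossed; the goal names and thresholds here are invented:

<pre>
#include <cstdio>

// One-shot practice goal: fires the first time the counter crosses it.
struct Goal {
    const char* name;
    int threshold;
    bool fired;
};

// Check every goal against the current note count; in the real app a
// fired goal would recolor the tunnel, spawn a thought bubble, etc.
void checkGoals(Goal* goals, int nGoals, int noteCount) {
    for (int i = 0; i < nGoals; i++) {
        if (!goals[i].fired && noteCount >= goals[i].threshold) {
            goals[i].fired = true;
            std::printf("Goal reached: %s\n", goals[i].name);
        }
    }
}

// e.g.: Goal goals[] = { {"100 notes!", 100, false}, {"500 notes!", 500, false} };
//       checkGoals(goals, 2, stats.notes);
</pre>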

== Storyboard ==

[[File:Environment.jpg]]

''Image: one idea for the graphical environment of GuitarFace. Facial tracking is shown in the top left. A moving tunnel where the present is closest to the eye and the past recedes to the center. MIDI events are shown within the tunnel: two squiggly lines for vibrato, a star for a recent something or other. The tunnel texture here looks a lot like an FFT waterfall, but should instead represent pitches, or just be pot-smokin'. Data and records are shown along the bottom (first-iteration ideas). Notifications of rewards appear in the top right.''

MIDI data is output in real time from a MIDI guitar pedal unit, the Roland GR55. The data isn't perfect, but hopefully we won't have to spend much time or energy scrubbing it in order to get decent, guitar-like musical data into the system; we don't need to play it back, anyway. We could also augment the USB MIDI data from the GR55 with three other forms of data: the audio signal from the guitar, the MIDI out from the GR55, and the audio signal from the GR55, which sounds significantly better than the USB out.
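One concrete way to get the GR55's USB MIDI stream into the program in real time is an RtMidi input callback. RtMidi is an assumption here, not necessarily the MIDI layer the project actually uses:

<pre>
#include "RtMidi.h"
#include <iostream>
#include <vector>

// Called by RtMidi on every incoming MIDI message.
void midiCallback(double /*deltaTime*/, std::vector<unsigned char>* msg,
                  void* /*userData*/) {
    if (msg->size() < 3) return;
    unsigned char status = msg->at(0) & 0xF0;
    unsigned char pitch  = msg->at(1);
    unsigned char vel    = msg->at(2);
    if (status == 0x90 && vel > 0) { // note-on with nonzero velocity
        // hand off to the analysis layer, e.g. onNoteOn(stats, pitch, ...)
        std::cout << "note-on: " << (int)pitch << " vel " << (int)vel << "\n";
    }
}

int main() {
    RtMidiIn midiIn;
    if (midiIn.getPortCount() == 0) { std::cerr << "no MIDI input\n"; return 1; }
    midiIn.openPort(0); // assumed: the GR55 shows up as the first port
    midiIn.setCallback(&midiCallback);
    std::cout << "reading MIDI... press enter to quit\n";
    std::cin.get();
    return 0;
}
</pre>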

The MIDI data are passed through functions that compute the aforementioned measures, such as the number of notes played. These functions are wired up to the graphical display and its events.

Finally, a summary of the session is output once the session is terminated, either at the end of the backing track or when the user presses 'd'.
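The summary itself might be as simple as dumping the session counters. A hypothetical sketch, reusing the SessionStats struct from the sketch in the Assumptions section:

<pre>
#include <cstdio>

// Hypothetical end-of-session report (SessionStats as defined above).
void printSummary(const SessionStats& s, double seconds) {
    std::printf("session length: %.1f s\n", seconds);
    std::printf("notes played:   %d (%d in key)\n", s.notes, s.inKey);
    std::printf("pitch range:    %d to %d\n", s.minPitch, s.maxPitch);
    std::printf("big jumps: %d, power chords: %d\n", s.bigJumps, s.powerChords);
}
</pre>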

== User ==

Musicians, specifically guitar players, interested in knowing more about their practice sessions. However, this software can be used with any instrument capable of producing MIDI output. The output can serve as a score, though we don't really think of it primarily that way; we're more interested in the statistics and the progress one makes toward set goals in practice time, such as the change in duration from session to session, or in pitch and volume content.

== Libraries and previous work ==

* OpenCV (computer vision): http://opencv.org/
* LibBass (simple audio player): http://www.un4seen.com/
* Roland FriendJam, for use with their MIDI sensors / pedals / interfaces: http://www.roland.com/FriendJam/Guitar/
* FaceTracker (66-point facial "mask"): https://github.com/kylemcdonald/FaceTracker

'''Sensors / Accessories'''

* Godin MIDI guitar
* Roland GR55 guitar MIDI interface / pedals
* Computer camera to detect guitar face
* Lighting for the face
* FUTURE: Accelerometer / Game-Track to track hip gyration

== Team ==

* Roshan Vidyashankar
* Gina Collecchia

== Milestones ==

* Week 1: OpenCV compilation, research; most of MIDI code; graphics setup
* Week 2: MIDI + graphics integration and design; make the thing work, basically; OpenCV progress
* Week 3: Guitar face feature detection
* Week 4: Sex it up

== Scratchwork ==

Ultimately, we hope to implement our model in the form of a game. It makes sense that actually playing the game should inform us further, since our players will be providing quality data. The game would provide feedback on your soloing by rewarding good solos and punishing bad ones. If played by two people, a partner's superior solo could send messages to the opponent in the form of damage or further musical obstacles. The opponents could be dueling asynchronously, or essentially jamming together.