LiveLyric combines audio visualization with song lyrics, letting users create real-time motion graphics.

  • Audio-analysis-guided animations
  • MIDI-controllable effects
  • Ability to record and save performances

The idea for LiveLyric came about one day while surfing YouTube. I noticed that for almost any popular song I searched for, a "lyrics" video came up showing the words on screen in time with the song. People want to see the words on screen with the music.

Unfortunately, these videos lacked the energy and spontaneity of the songs they represented, and the process of creating them was far less engaging than it could be. This problem is what gave birth to LiveLyric.


Popular music has two major components: the music itself and its meaning. I wanted both of these components to be reflected in the visualization of the lyrics. Second, popular music is often best enjoyed through active motion and dance, so I wanted the creation process to be fun, active, and simple -- sort of like tapping your finger to the beat. The final result would be a visualization that reflects both the music and the meaning of the song, with a quality similar to motion graphics created by professional designers.

Motion graphics often require careful alignment and syncing to the beat because they are not created in real time. By letting users simply tap to the beat, this alignment process becomes much simpler. The second idea was "audio-guided" animation: user controls drive the high-level animation, while the "texture" of the animation is shaped by audio analysis. The audio signal determines certain constants within the program, such as the speed of the animations and the scaling of the text.
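
As a rough illustration of the audio-guided idea, here is a minimal sketch (not LiveLyric's actual analysis code) that reads one block of audio with PyAudio, computes its RMS level, and maps that level onto a text scale and a rotation speed; the mapping constants are invented for the example.

import struct
import math
import pyaudio

CHUNK = 1024   # samples per analysis block
RATE = 44100   # sample rate in Hz

pa = pyaudio.PyAudio()
stream = pa.open(format=pyaudio.paInt16, channels=1, rate=RATE,
                 input=True, frames_per_buffer=CHUNK)

def audio_level():
    # Return the RMS level of one block of input, normalized to 0..1.
    data = stream.read(CHUNK)
    samples = struct.unpack("<%dh" % CHUNK, data)
    rms = math.sqrt(sum(s * s for s in samples) / float(CHUNK))
    return min(rms / 32768.0, 1.0)

# Hypothetical mapping: louder audio -> bigger text and a faster spin.
level = audio_level()
text_scale = 1.0 + 2.0 * level          # scale factor applied to the lyric text
rotation_speed = 30.0 + 180.0 * level   # degrees per second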

The entire project is built on an open-source game engine called Panda3D. Panda3D handles many common but complex tasks, such as building and managing a scene graph and applying advanced texturing and lighting effects, in a cross-platform manner. I chose Panda because I wanted to focus as much as possible on the artistic side of the implementation; much of my time was spent tweaking animation parameters to get aesthetically pleasing results.
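
For readers unfamiliar with Panda3D, the sketch below (standalone, not taken from LiveLyric) shows the kind of structure it provides: a TextNode holding a lyric segment is attached to the scene graph, and a task spins it every frame.

from direct.showbase.ShowBase import ShowBase
from panda3d.core import TextNode

class LyricDemo(ShowBase):
    def __init__(self):
        ShowBase.__init__(self)
        # Put a lyric segment into the 2D scene graph as a text node.
        text = TextNode("lyric")
        text.setText("Hello, world")
        text.setAlign(TextNode.ACenter)
        self.lyric_np = self.aspect2d.attachNewNode(text)
        self.lyric_np.setScale(0.15)
        # Re-run spin() every frame to animate the text.
        self.taskMgr.add(self.spin, "spin-lyric")

    def spin(self, task):
        # Roll the text at 60 degrees per second.
        self.lyric_np.setR(task.time * 60.0)
        return task.cont

LyricDemo().run()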

Another reason I chose Panda3D was its support for the Python programming language. Python made it much simpler to record and play back user actions via the pickle module: each user action is stored in a list, which is serialized with pickle and can be deserialized and replayed directly on future runs.
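
A minimal sketch of that record-and-playback approach (the actual event format in LiveLyric may differ): each action is appended to a list as a (timestamp, action, value) tuple, and the whole list is pickled to disk so a later run can replay it.

import pickle
import time

actions = []            # recorded (timestamp, action, value) tuples
start = time.time()

def record(action, value=None):
    # Stamp each user action with the time since the run started.
    actions.append((time.time() - start, action, value))

record("next_lyric")
record("tilt", "left")

# Save the performance...
with open("performance.pkl", "wb") as f:
    pickle.dump(actions, f)

# ...and load it back for playback on a future run.
with open("performance.pkl", "rb") as f:
    replay = pickle.load(f)
for timestamp, action, value in replay:
    print("%.2f %s %s" % (timestamp, action, value))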

The RtMidi library was used to read incoming MIDI notes. MIDI can be used to control several parameters, including the following (a minimal note-reading sketch follows the list):

  • Which lyric segment is displayed
  • The rotation speed and direction of the text
  • The scale of the text
  • Quick rotate left/right controls
  • Text effects
  • Particle effects
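
A rough sketch of how incoming notes can be read with pyrtmidi and mapped onto such parameters (the exact pyrtmidi API can differ between versions, and the note ranges and mappings below are arbitrary assumptions, not LiveLyric's real ones):

import rtmidi

midi_in = rtmidi.RtMidiIn()
if midi_in.getPortCount() == 0:
    raise SystemExit("No MIDI input ports found")
midi_in.openPort(0)

while True:
    msg = midi_in.getMessage(250)   # poll with a timeout in milliseconds
    if msg is None:
        continue
    if msg.isNoteOn():
        note = msg.getNoteNumber()
        velocity = msg.getVelocity()
        # Hypothetical mapping: low notes step through lyric segments,
        # higher notes trigger text and particle effects.
        if note < 48:
            print("next lyric segment")
        elif note < 60:
            print("text effect, strength %d" % velocity)
        else:
            print("particle effect, strength %d" % velocity)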

To run the code you will need to do the following. Note that these downloads will not affect your native Python installation (which is why you must use the ppython command, Panda3D's bundled Python).

1) Download Panda3D

http://www.panda3d.org/download/panda3d-1.7.0/Panda3D-1.7.0.dmg

2) Install PyAudio

http://people.csail.mit.edu/hubert/pyaudio/packages/pyaudio-0.2.4.dmg

3) Install PyRtMidi

svn co http://svn2.assembla.com/svn/pkaudio/pyrtmidi

cd pyrtmidi

ppython setup.py build

ppython setup.py install

4) Download LiveLyric:

Click here to download LiveLyric

To run, just connect a MIDI controller and then enter:

ppython main.py

If you don't have a MIDI controller, you can use the Z and X keys to control the current lyric segment and Q and W to control rotation. More effects are available with a MIDI controller.

You can use the following arguments (an example invocation follows the list):

-record <output file name>    Records all actions (save at any point using SHIFT+S)
-play <input file name>       Plays back recorded actions from a file
-lyric <input file name>      Sets the lyrics file
-sound <input wav file name>  Sets the sound file
-midi                         Enables MIDI control
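
For example, a run that takes MIDI input, loads a lyrics file and a song, and records the performance might look like this (the file names are placeholders):

ppython main.py -midi -lyric mysong_lyrics.txt -sound mysong.wav -record performance.dat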

The keyboard shortcuts (a key-binding sketch follows the list):

q     Tilt text left
w     Tilt text right
a, s  Control the background texture
d, f  Control the particle texture
z, x  Previous/next lyric segment
c     Text emphasis effect
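
In Panda3D, bindings like these are typically registered with the accept() method; here is a minimal sketch (the handler names are made up for illustration and are not LiveLyric's actual methods):

from direct.showbase.ShowBase import ShowBase

class KeyDemo(ShowBase):
    def __init__(self):
        ShowBase.__init__(self)
        # Map the same keys the program uses to hypothetical handlers.
        self.accept("q", self.tilt, ["left"])
        self.accept("w", self.tilt, ["right"])
        self.accept("z", self.step_lyric, [-1])
        self.accept("x", self.step_lyric, [1])
        self.accept("c", self.emphasize)

    def tilt(self, direction):
        print("tilt text %s" % direction)

    def step_lyric(self, delta):
        print("move lyric segment by %d" % delta)

    def emphasize(self):
        print("text emphasis effect")

KeyDemo().run()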