Research

My research interests are broad, ranging from Digital Signal Processing to Machine Listening and Human-Computer Interaction. My dissertation topic is computational beat tracking using Gradient Frequency Neural Networks.
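For readers unfamiliar with the model: a Gradient Frequency Neural Network (GrFNN) is, at its core, a bank of nonlinear oscillators tuned along a gradient of natural frequencies; when driven by a rhythmic input, the oscillators near the beat frequency entrain and grow in amplitude, which is what makes beat tracking possible. The following is only a rough, simplified sketch of that idea (forward-Euler integration of the canonical oscillator model, with illustrative parameter values; it is not my dissertation code, and real GrFNNs include additional nonlinear input and coupling terms):

```python
import numpy as np

fs = 2000.0                                    # integration rate (Hz)
freqs = 2.0 ** np.linspace(-1.0, 1.5, 48)      # oscillators from 0.5 to ~2.8 Hz
alpha, beta = -0.1, -1.0                       # damping and amplitude saturation
z = 1e-3 * np.ones(len(freqs), dtype=complex)  # small initial oscillator states

def grfnn_step(z, x, dt=1.0 / fs):
    """One forward-Euler step of the canonical oscillator model with input x(t)."""
    dz = z * (alpha + 1j * 2.0 * np.pi * freqs + beta * np.abs(z) ** 2) + x
    return z + dt * dz

# Drive the bank with an impulse train at 2 Hz (120 BPM): the oscillators
# tuned near 2 Hz phase-lock to the pulses and end up with the largest amplitude.
n = int(10 * fs)                               # 10 seconds of input
x = np.zeros(n)
x[::int(fs / 2)] = 1.0                         # one impulse every 0.5 s
for xi in x:
    z = grfnn_step(z, xi)
print("strongest oscillator: %.2f Hz" % freqs[np.argmax(np.abs(z))])
```

The printed frequency should land near 2 Hz: oscillators far from the pulse rate receive kicks at incoherent phases and stay small, while the resonant one accumulates them.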

One key aspect I'm particularly interested in is how musical expression is mediated, or perhaps informed, by perception. In general, people have focused either on musical perception (cognitive science, auditory models, etc.) or on musical expression. I personally think the two should be tightly coupled, simply because that's the way we humans learn, play, and enjoy music.

Projects I've worked on

2013-2014 Ping-Pong: sonic location system. In collaboration with Hyung-Suk Kim, we developed a system that uses off-the-shelf devices to measure inter-device ranges and estimate location in 3D space. Because the ranging signals are pitched, the system can also be used as a musical location system (a toy sketch of the underlying ranging idea appears after the project list). The system and results were presented at ASA 2013 and NIME 2014.
2013 Query by Tapping Extended. In collaboration with Blair Kaneshiro, Hyung-Suk Kim, and Jieun Oh, we collected a dataset for query-by-tapping that also includes tapped melodic contour and subject-specific data. The dataset and preliminary analyses were presented at ISMIR 2013.
2012 Unsupervised feature learning for automatic music annotation. In collaboration with Juhan Nam, we worked on a pipeline for unsupervised feature extraction (a minimal sketch of this style of pipeline also appears after the project list). The pipeline and results were presented at ISMIR 2012.
2012 A new paradigm for user-guided audio stretching. In collaboration with Nick Bryan, we proposed a spring-chain metaphor that gives the user quicker and more flexible control over sound stretching. The proposed method was presented at DAFx 2012.
2011 - ongoing How to become a successful dad (a.k.a. "I can't sleep as I used to ..."). Ongoing research, with a nine-month gestation period and an indefinite end date.
2010 - 2011 TweetDreams: a multimedia musical performance made from live Twitter data. Conceptualized and developed in collaboration with Luke Dahl and Carr Wilkerson.
2010 - 2011 This Is Not a Comb Filter: an image-based, real-time audio synthesizer/filter application. More details in the final paper.
2010 interV: my first iPhone instrument, built around a simple accelerometer-based interaction.
2009 - 2010 LAMB: a location-aware social music toy. Users can create music boxes and leave them anywhere on the planet for other people to find (currently not working because of server problems).
2009 - 2010 MoMu: a mobile music toolkit. I was part of the development team.
2009 - 2010 Horgie: an app developed as my 220c final project and the culmination of the ASS project. More details in the final paper.
2009 3d visual sound: a simple system for placing animated visual/sonic objects in 3D space (designed specifically for the 2009 setup of CCRMA's Listening Room).
2009 ASS-Metronome: a simple experiment to test the stability of an ActionScript clock.
2009 ASStk: a partial port of the Synthesis ToolKit (STK) to ActionScript.
2009 ASS FM: being at CCRMA, I had the moral duty of starting with an FM synthesizer, honoring the great John Chowning.

Note: the ASS (ActionScript Synthesis) project is a collection of experiments in web-based sound synthesis and collaborative interfaces.
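Ping-Pong sketch: to give a flavor of how audio ranging works in general, here is a toy simulation of matched-filter delay estimation. This is not the published Ping-Pong system; it uses a chirp probe (which has a sharp correlation peak) rather than the pitched signals the real system relies on, and everything is simulated with synthetic data:

```python
import numpy as np

fs = 44100                          # sample rate (Hz)
c = 343.0                           # speed of sound (m/s)
t = np.arange(0, 0.05, 1.0 / fs)    # 50 ms probe signal

# Linear chirp probe, 2 kHz to 6 kHz, with a Hann envelope.
f0, f1, T = 2000.0, 6000.0, 0.05
probe = np.sin(2 * np.pi * (f0 * t + (f1 - f0) / (2 * T) * t ** 2)) * np.hanning(len(t))

# Simulate the probe arriving at a microphone 3.43 m away (~441 samples late).
true_delay = int(round(3.43 / c * fs))
mic = np.zeros(2 * len(probe) + true_delay)
mic[true_delay:true_delay + len(probe)] += probe
mic += 0.05 * np.random.randn(len(mic))        # measurement noise

# Matched filter: the lag of the cross-correlation peak estimates time of flight.
corr = np.correlate(mic, probe, mode="valid")
est_delay = np.argmax(corr)
print("estimated range: %.2f m" % (est_delay / fs * c))
```

In practice, off-the-shelf devices have unsynchronized clocks, so real acoustic ranging systems typically rely on round-trip (two-way) signal exchanges to cancel the unknown clock offset.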
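Feature-learning sketch: to make the unsupervised pipeline idea concrete, here is a minimal sketch of one common recipe from that era: whitened spectrogram patches, a k-means codebook, triangle encoding, and max-pooling over time. Random arrays stand in for real log-mel patches, and this illustrates the general approach rather than the exact ISMIR 2012 system:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import MiniBatchKMeans

rng = np.random.default_rng(0)
patches = rng.standard_normal((5000, 128))   # stand-in for log-mel spectrogram patches

# 1) Decorrelate and scale the patches (PCA whitening).
whitener = PCA(n_components=40, whiten=True).fit(patches)
white = whitener.transform(patches)

# 2) Learn a codebook of prototype patches with k-means.
codebook = MiniBatchKMeans(n_clusters=256, n_init=3, random_state=0).fit(white)

# 3) Encode a "track" (a sequence of patches) and max-pool over time, yielding
#    a fixed-length feature vector for a downstream annotation classifier.
track = whitener.transform(rng.standard_normal((300, 128)))
dist = codebook.transform(track)             # distance from each frame to each codeword
activations = np.maximum(0.0, dist.mean(axis=1, keepdims=True) - dist)  # triangle encoding
features = activations.max(axis=0)           # (256,) pooled track-level features
print(features.shape)
```

The triangle encoding keeps only codewords that are closer than average to a given frame, giving a sparse, nonnegative representation that pools well over time.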