My research interests are wide, ranging from Digital Signal Processing to Machine Listening and Human-Computer Interaction. I'm currently in the process of defining my dissertation topic. It is still too early to state it in one sentence, but I think it will be related to the idea of developing expressive machines. This is not a new topic, but it is still far from being solved, and my aim is to advance the state of the art in this field.

One key aspect I'm particularly interested in is how musical expression is mediated, or perhaps informed, by perception. In general, people have focused either on musical perception (cognitive science, auditory models, etc.) or on musical expression. I personally think the two should be tightly coupled, simply because that's the way we humans learn, play, and enjoy music.

Projects I've worked on

2013. Ping-Pong: a sonic location system. In collaboration with Hyung-Suk Kim, we developed a system that uses off-the-shelf devices to measure inter-device ranges and locations in 3D space. Because we use pitched signals, the system can also be made into a musical location system (a simplified ranging sketch appears at the end of this section). The system and results will be presented at ASA 2013.
2013. Query by Tapping Extended. In collaboration with Blair Bohannan, Hyung-Suk Kim, and Jieun Oh, we collected a dataset for query-by-tapping that also includes tapped melodic contours and subject-specific data. The dataset and preliminary analyses will be presented at ISMIR 2013.
2012. Unsupervised feature learning for automatic music annotation. In collaboration with Juhan Nam, we worked on a pipeline for unsupervised feature extraction (a sketch of this style of pipeline appears at the end of this section). The pipeline and results were presented at ISMIR 2012.
2012. A new paradigm for user-guided audio stretching. In collaboration with Nick Bryan, we proposed a spring-chain metaphor that lets users stretch sound more quickly and flexibly (a sketch of the spring-chain idea appears at the end of this section). The proposed method will be presented at DAFx 2012.
2011 - ongoing. How to become a successful dad (a.k.a. "I haven't slept in a long time"). Ongoing research, with a nine-month-long gestation.
2010 - 2011. TweetDreams: a multimedia musical performance made from live Twitter data. Conceptualized and developed in collaboration with Luke Dahl and Carr Wilkerson.
2010 - 2011. This Is Not a Comb Filter: an image-based, real-time audio synthesis/filtering application. More details in the final paper.
2010. interV: my first iPhone instrument, built around a simple accelerometer-based interaction.
2009 - 2010. LAMB: a location-aware social music toy. Users can create music boxes and leave them somewhere on the planet for other people to find (currently not working because of server problems).
2009 - 2010. MoMu: a mobile music toolkit. I was part of the development team.
2009 - 2010. Horgie: (TODO: move the app to a working server). The work leading to this app was done as part of my 220c final project, and it was the culmination of the ASS project. More details in the final paper.
2009. 3D visual sound: a simple system for placing animated visual/sonic objects in 3D space (designed specifically for CCRMA's Listening Room, as it was in 2009).
2009. ASS-Metronome: a simple experiment to test the stability of the clock (a jitter-test sketch appears at the end of this section).
2009. ASStk: a partial port of the STK to ActionScript.
2009. ASS FM: being at CCRMA, I had the moral duty of starting with an FM synthesizer, in honor of the great John Chowning (a minimal FM sketch appears at the end of this section).

Note: the ASS (ActionScript Synthesis) project is a collection of experiments on web-based sound synthesis and collaborative interfaces.
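
To give a flavor of the acoustic ranging behind Ping-Pong, here is a minimal sketch of one-way time-of-flight estimation by cross-correlation. It is not the actual system (which has its own protocol, signal design, and calibration); it assumes a known emission time and a shared clock, and all names and parameters are illustrative.

import numpy as np

FS = 48000               # sample rate (Hz), assumed
SPEED_OF_SOUND = 343.0   # m/s at roughly room temperature

def pitched_chirp(f0=2000.0, f1=6000.0, duration=0.05):
    # A short pitched chirp used as the ranging signal.
    t = np.arange(int(FS * duration)) / FS
    phase = 2 * np.pi * (f0 * t + 0.5 * (f1 - f0) / duration * t ** 2)
    return np.sin(phase)

def estimate_range(recording, reference, emit_sample):
    # Cross-correlate; the peak marks the chirp's arrival sample.
    # `emit_sample` is when the chirp left the speaker; in practice
    # that offset has to be calibrated per device.
    corr = np.correlate(recording, reference, mode="valid")
    arrival = int(np.argmax(np.abs(corr)))
    return (arrival - emit_sample) / FS * SPEED_OF_SOUND

# Toy usage: simulate a 1.5 m path (~210 samples of delay at 48 kHz).
ref = pitched_chirp()
delay = int(1.5 / SPEED_OF_SOUND * FS)
rec = np.concatenate([np.zeros(delay), ref, np.zeros(100)])
rec += 0.01 * np.random.randn(len(rec))   # microphone noise
print(f"estimated range: {estimate_range(rec, ref, 0):.2f} m")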
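
For the unsupervised feature learning project, here is a sketch of one common recipe of that era: PCA-whiten small spectrogram patches, learn a k-means codebook, and encode a clip as a histogram of codeword assignments. It illustrates the general approach, not the actual ISMIR 2012 pipeline, and every parameter is an illustrative assumption.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

def extract_patches(spectrogram, patch_frames=8, hop=4):
    # Slice a (freq x time) spectrogram into flattened patches.
    n_time = spectrogram.shape[1]
    return np.stack([
        spectrogram[:, t:t + patch_frames].ravel()
        for t in range(0, n_time - patch_frames + 1, hop)
    ])

def learn_codebook(patches, n_codewords=32):
    whitener = PCA(n_components=40, whiten=True).fit(patches)
    codebook = KMeans(n_clusters=n_codewords, n_init=10)
    codebook.fit(whitener.transform(patches))
    return whitener, codebook

def encode(spectrogram, whitener, codebook):
    # Clip-level feature: histogram of nearest-codeword assignments.
    z = whitener.transform(extract_patches(spectrogram))
    ids = codebook.predict(z)
    return np.bincount(ids, minlength=codebook.n_clusters) / len(ids)

# Toy usage with random stand-in "spectrograms".
train = np.abs(np.random.randn(40, 800))
whitener, codebook = learn_codebook(extract_patches(train))
feature = encode(np.abs(np.random.randn(40, 400)), whitener, codebook)
print(feature.shape)   # (32,): one histogram per clip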
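
For the audio-stretching project, here is how a spring-chain metaphor can distribute a stretch: treat the segments as springs in series, so stiff segments (say, transients) resist change and compliant ones absorb most of it. The details below are my own simplified assumptions, not the DAFx 2012 method.

import numpy as np

def distribute_stretch(seg_lengths, stiffness, total_new_length):
    # Return per-segment lengths after a spring-chain stretch.
    # seg_lengths: original segment lengths (samples)
    # stiffness:   spring constant per segment (higher = stretches less)
    seg_lengths = np.asarray(seg_lengths, dtype=float)
    compliance = 1.0 / np.asarray(stiffness, dtype=float)
    extra = total_new_length - seg_lengths.sum()
    # Springs in series share the same force, so each segment's
    # extension is proportional to its compliance (1/k).
    extension = extra * compliance / compliance.sum()
    return seg_lengths + extension

# Toy usage: stretch three segments to 1.5x total length; the stiff
# (transient-like) middle segment barely changes.
lengths = [1000, 200, 1000]
ks = [1.0, 50.0, 1.0]
print(distribute_stretch(lengths, ks, 1.5 * sum(lengths)))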
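
ASS-Metronome's question, how stable is the scheduling clock, can be posed in a few lines. A sketch in Python rather than ActionScript, with illustrative parameters:

import time
import statistics

def measure_jitter(period_s=0.05, n_ticks=100):
    # Sleep-based metronome; returns per-tick timing error in ms.
    start = time.perf_counter()
    errors_ms = []
    for i in range(1, n_ticks + 1):
        target = start + i * period_s
        time.sleep(max(0.0, target - time.perf_counter()))
        errors_ms.append((time.perf_counter() - target) * 1000.0)
    return errors_ms

errs = measure_jitter()
print(f"mean error {statistics.mean(errs):.3f} ms, "
      f"stdev {statistics.stdev(errs):.3f} ms, max {max(errs):.3f} ms")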
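
And for ASS FM, the textbook Chowning formulation: a sine carrier whose phase is modulated by a second sine, y(t) = A(t) sin(2*pi*fc*t + I*sin(2*pi*fm*t)). A minimal sketch in Python rather than ActionScript; the envelope and parameter values are illustrative assumptions.

import numpy as np

def fm_tone(fc=440.0, fm=220.0, index=5.0, duration=1.0, fs=44100):
    # `index` (I) controls how rich the sideband spectrum is.
    t = np.arange(int(fs * duration)) / fs
    envelope = np.exp(-3.0 * t)   # simple percussive decay
    return envelope * np.sin(2 * np.pi * fc * t
                             + index * np.sin(2 * np.pi * fm * t))

tone = fm_tone()   # one second of a bell-ish FM tone
print(tone.shape, tone.min(), tone.max())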