UDKOSC is a visually and aurally immersive multi-user musical performance environment built in the Unreal Development Kit (UDK), a freely available commercial game engine. Control data generated by avatar motion, gesture, and location is routed through a bi-directional Open Sound Control implementation and drives virtual instruments within a multi-channel ambisonic sound server.
With the Open Sound Control (OSC) protocol integrated into the UDK codebase, user-controlled avatars drive software-based sound servers in real time through their motion within, and interaction with, the rendered environment itself. Artifacts in the environment, including projectiles and the static-mesh building blocks of walls, floors, and objects, have been repurposed as reactive, control-data-generating entities that performers can interact with and manipulate. In this manner, the virtual environment and the interactions that occur within UDKOSC become intuitive tools for musical composition, performance, and network-aided collaboration and transmission.
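To make the data flow concrete, the sketch below encodes an avatar position update as a binary OSC message per the OSC 1.0 wire format (null-padded strings, big-endian floats). It uses only the Python standard library; the "/pawn/0/loc" address pattern is illustrative, not UDKOSC's actual namespace.

```python
import struct

def _osc_string(s: str) -> bytes:
    """Encode an OSC string: UTF-8, null-terminated, padded to a 4-byte boundary."""
    b = s.encode("utf-8") + b"\x00"
    return b + b"\x00" * ((4 - len(b) % 4) % 4)

def build_osc_message(address: str, *args: float) -> bytes:
    """Build a binary OSC message carrying 32-bit float arguments."""
    type_tags = "," + "f" * len(args)          # e.g. ",fff" for three floats
    payload = b"".join(struct.pack(">f", a) for a in args)
    return _osc_string(address) + _osc_string(type_tags) + payload

# An avatar location update (hypothetical address pattern and coordinates).
msg = build_osc_message("/pawn/0/loc", 120.5, -34.0, 88.25)
```

Such a packet could then be sent to a sound server over UDP with `socket.sendto`; UDKOSC's bi-directional implementation likewise receives OSC back to manipulate the rendered scene.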
The UDKOSC engine powers the interactive pieces ECHO::Canyon, Tele-Harmonium, and ECHO::Improvisation, and was used to generate test examples and data sets for my doctoral thesis, Perceptually Coherent Mapping Schemata for Virtual Space and Musical Method (2014).
Credits:
Programming: Rob Hamilton
Links:
github: https://github.com/robertkhamilton/udkosc
development wiki: https://ccrma.stanford.edu/wiki/UDKOSC