Reality by Example is a tool and meta-environment that allows users to create and populate a virtual world in VR, from within VR. It uses interactive machine learning to enable users to shape terrain, music, and creature sounds and animations by providing examples. The mapping from examples to reality is learned instantaneously from the user's provided examples and updates whenever they change the examples or provide more. The overarching goal is to improve creation methods for amateur creators and enable them to understand VR as a medium for creation, not just content consumption. The work is still in progress.
Here are a few work-in-progress demo videos:
A demo of many (but not all) of the features of in situ creation of VR worlds by example using interactive machine learning. The user is placing examples of how the world should look and how its creatures should behave, and then the world decides how to fill in the gaps.
Shown: terrain height, "bumpiness", texture/color, music features for background music, and creatures.
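To give a feel for how "fill in the gaps" can work, here is a minimal sketch of one way to map sparse user-placed examples to a continuous terrain parameter. This is an illustration, not the project's actual learning method: it uses simple inverse-distance-weighted interpolation, and the function and parameter names are hypothetical.

```python
def predict_height(examples, x, z, power=2.0, eps=1e-9):
    """Interpolate a terrain value at (x, z) from user-placed examples.

    examples: list of ((ex, ez), value) pairs, e.g. height examples
    the user dropped into the world. Inverse-distance weighting makes
    nearby examples dominate, so the surface passes through each
    example and blends smoothly between them.
    (Hypothetical sketch; not the project's real learning algorithm.)
    """
    num, den = 0.0, 0.0
    for (ex, ez), value in examples:
        d2 = (x - ex) ** 2 + (z - ez) ** 2
        if d2 < eps:  # querying exactly at an example: return it as-is
            return value
        w = 1.0 / d2 ** (power / 2.0)
        num += w * value
        den += w
    return num / den
```

Re-running this whenever an example is moved, added, or deleted is what makes the mapping feel like it "updates instantaneously": there is no offline training step, just re-interpolation.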
A closer look at high level musical parameters, and a demonstration of recording animations that respond to the current tempo of the music, rather than playing back at a constant rate.
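One simple way to make a recorded animation follow the current tempo, rather than play at a constant rate, is to scale the playback clock by the ratio of the current tempo to the tempo at recording time. The sketch below is an assumption about the mechanism, with hypothetical names, not the project's actual implementation.

```python
def advance_animation(clock, dt, current_bpm, recorded_bpm=120.0):
    """Advance an animation playback clock in time with the music.

    clock: current playback time of the recorded animation (seconds).
    dt: real time elapsed this frame (seconds).
    A clip recorded at recorded_bpm plays proportionally faster or
    slower as current_bpm changes, staying locked to the beat.
    (Hypothetical sketch; recorded_bpm and the time-based clock
    representation are assumptions.)
    """
    return clock + dt * (current_bpm / recorded_bpm)
```

Doubling the tempo then advances the clip twice as fast, and halving it slows the clip to half speed, so gestures recorded on the beat stay on the beat.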
A demonstration of the "fine-tune" terrain methods that involve placing and modifying individual examples.
A quick demonstration of three classes of creatures: air, water, and land. Also shows naming creatures, and an "audience view" camera that slews smoothly to the VR camera's position instead of teleporting instantly.
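A slewing camera like this is commonly done with frame-rate-independent exponential smoothing: each frame the camera covers a fixed fraction of the remaining distance per unit time, so it glides toward the headset rather than snapping. A minimal sketch, with hypothetical names and a smoothing constant that is an assumption:

```python
import math

def slew(camera_pos, target_pos, dt, smoothing=4.0):
    """Move the audience camera toward the VR camera's position.

    camera_pos, target_pos: (x, y, z) tuples.
    dt: frame time in seconds. The exp() form makes the motion
    independent of frame rate: higher `smoothing` means a snappier
    chase, and the camera never overshoots the target.
    (Hypothetical sketch of the slewing behavior.)
    """
    t = 1.0 - math.exp(-smoothing * dt)
    return tuple(c + (g - c) * t for c, g in zip(camera_pos, target_pos))
```

Called once per frame with the headset position as the target, this converges on the VR camera without the jarring cut of an instant teleport.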