TerraSona



Idea / Premise


Dec 8

TerraSona is a world created by music and by the user's experience of that music in the world space. An independent program creates a series of events from a list of songs that, when applied to a terrain patch specified by a height field, result in interesting terrain. The user then runs a visualization and explores the world. Their actions in the world change the music played and alter how it influences the terrain.


Dec 1

From the original idea, based on current progress and feedback, I'm thinking of visualizing an album (a seemingly convenient size for keeping memory usage and the world footprint manageable) as terrain, and letting the user explore the album by moving around the world created by its different songs.

Note this means networking is basically being abandoned for this project, so some design considerations made with networking in mind may not be quite as useful for the current design.


Original

My goal is to create an environment that will encourage emergent gameplay amongst the users in a world where the terrain is created by music.

Users appear in a flat, featureless world, and a song starts to play (possibly suggested by a user, possibly taken at random from a selection of songs the users have made available). As the song progresses, features of the song (beat, pitch, etc.) cause changes in the world around the users. For example, fractal clouds might form in response to the frequency domain representation of the music. Beats might cause mountains to grow, or valleys to form. Changes in volume or tempo might deform existing features.

Motivation


Dec 1

There's a shift in motivation, too. Originally, the goal was to encourage emergent gameplay. Instead, I am trying to give the user a new way of interacting with the music they already have.


Original

I had several existing games in mind when I conceived this idea; not necessarily as motivations for doing this, but rather as sounding boards that informed the conception of this world.

First, Eternal Sonata, a role-playing game by Tri-Crescendo. I've played the early parts of this game (have not completed it), but the overall premise is that this is a world envisioned by Chopin in the last days of his life. The world around the player and the characters that inhabit it are named after musical forms, features and instruments. I feel creating a world that is literally created by music takes this concept one step farther.

What shape user interaction takes may be beyond the scope of this course. Creating a coordinated, shared and dynamic world will probably be challenging enough. However, in order to make this a compelling multiuser experience, something must draw users into the world. The idea of allowing users to bring arbitrary objects into the world has been realized in the game Scribblenauts (which I have only read about). Giving users a large number of objects that they could fit into the game world would provide building blocks for emergent behavior.

The Thing (What is it)


Dec 8

The final product is a coordinated dynamic world where a user can watch terrain being deformed by a song, and via their movement, change the mix of music and the effect of each song on the terrain.

The user creates the terrain deformation script from several songs they wish to view. The viewer application then uses the script and the music to visualize the terrain.


Original

The final product should be a shared, coordinated dynamic world where users can view and explore the terrain created by their music.

The interaction will be divided into three parts: adding music from which the world can be created, connecting with a group of people, and exploring the changing world.

Ideally, adding music will be kept as simple as possible. Probably all that is necessary here is an in-application file browser.

Connecting with a group of people should happen through a central server, although for now it might have to be done by specifying an IP and port.

Interacting with the environment would be divided into a few categories: navigation, generation and manipulation. Navigation should be handled through a standard FPS-like interface, either with a keyboard interface or a mouselook + keyboard control scheme. Generation would ultimately be menu-based, but could involve search and selection. Manipulation would be an extension of the keyboard interface.

If songs are to be chosen at random, no further interaction is necessary. However, giving users control over the next song played would give them another mechanism for directing their shared exploration of the world.

Design


Dec 8

Users select the music they want to visualize, prepare it, and run the viewer. In the viewer, the user moves with the standard "wasd" keys and mouselook. The shift key allows the user to move quickly.

Each of the songs included in the script is given a center position in the terrain and a radius of influence. As the user moves toward a song, its volume gets louder, and the events associated with that song influence the terrain more. When the user is outside a song's sphere of influence, that song is not played and does not affect the terrain.
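As a concrete sketch of the mixing rule in C++ (the linear falloff and all names here are my own illustration, not necessarily the actual implementation):

    #include <algorithm>
    #include <cmath>

    struct Song {
        float cx, cz;    // center of the song's region on the terrain
        float radius;    // radius of influence
    };

    // Weight in [0, 1]: 1 at the song's center, 0 at or beyond its
    // radius. The same weight scales playback volume and how strongly
    // the song's events deform the terrain. Linear falloff is an
    // illustrative choice.
    float songWeight(const Song& s, float userX, float userZ) {
        float dx = userX - s.cx;
        float dz = userZ - s.cz;
        float dist = std::sqrt(dx * dx + dz * dz);
        return std::max(0.0f, 1.0f - dist / s.radius);
    }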


Original

User

Users should start the program, which will connect to a server and either get the in-process song/terrain, or the server will prompt them to pick a song in order to start the process. Users will then watch the world transform around them as the song plays, and be able to explore it accordingly.

One neat user-experience idea that came out of brainstorming was to define the world transformations in terms of the "Four Elements": Earth, Air, Water and Fire.

Technical

For graphics rendering, I am planning to use OGRE, an open-source graphics rendering engine that runs on top of OpenGL. OGRE will allow me to quickly generate aesthetically pleasing terrain that can change in real time.

For sound processing and playback, I need to do some further investigation since I would like to use mp3 files from users' collections. Preliminary investigation suggests MAD could be appropriate, and I am also investigating mpg123.

For networking, I will probably use OSC as implemented by liblo.

I plan to have the server pre-process a selected mp3, generating a text file containing time-indexed terrain deformation commands. All connected clients will then get a copy of this text file, and the server will then start sending timing cues while the music is played for each user. I'm not sure if I will then have the server stream the music, or send the song out as part of the "map" download.
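The on-disk format is still open, but a time-indexed command file might look something like this (the column layout and command names are hypothetical):

    # seconds  command  parameter
    0.48       fault    12.0
    0.95       fault     9.5
    2.10       smooth    1

Each client would replay the file locally, applying a command once the server's timing cues indicate its timestamp has passed.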

As multiple clients will be exploring the world, they should be visible to each other. The server will trust clients to report their positions as they change, and will then report the presence of other clients to each connected client. If this were a serious MMORPG, that trust would be vulnerable to hacking. However, since I don't anticipate creating an environment where users could gain an advantage by moving outside of the pre-defined areas, this should not be a concern.
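For the position reports, a minimal liblo sketch might look like the following; the hostname, port, and OSC path are placeholders I made up:

    #include <lo/lo.h>

    int main() {
        // The server's address and port are illustrative.
        lo_address server = lo_address_new("localhost", "7770");

        float x = 10.0f, y = 2.0f, z = -4.5f;

        // Fire-and-forget UDP message; the server would relay these
        // positions to the other connected clients.
        lo_send(server, "/terrasona/position", "fff", x, y, z);

        lo_address_free(server);
        return 0;
    }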

Testing


Testing will be done primarily by me, with volunteers from among roommates and friends if/when networking gets added.

I want users to get a sense of the type of environment created by certain songs. The greatest measure of success for me would be for users to hear one song while viewing the generated environment, then pick the next song by anticipating the environment it will generate.

Team


Probably just me.

Milestones


The early milestones are archived below under "Original". What follows instead is a more realistic breakdown of how the idea morphed and how functionality was added over time.

17 Nov

Goal: Determine underlying technologies
Result:
  • OGRE used for graphics
  • libmad used for mp3 decoding

Goal: Demonstrate they can be used successfully for the project
Result:
  • Starting from an OGRE terrain demo, demonstrated that terrain can be manipulated in real-time (or close enough to get it to look okay).
  • Starting from the libmad minimad.c example, successfully decoded and played back an mp3. Using OpenGL, demonstrated subband-based energy peak detection (sketched just after this table).
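The peak detection boils down to comparing each subband's instantaneous energy against its own recent average. A minimal sketch of that idea in C++ (the history length and threshold are illustrative, not the values actually tuned for the demo):

    #include <cstddef>
    #include <deque>
    #include <numeric>

    // One history per subband. Flags a beat when the current frame's
    // energy jumps well above the subband's recent average energy.
    bool isBeat(std::deque<double>& history, double energy,
                std::size_t span = 43,    // ~1 second of frames
                double threshold = 1.5)   // spike factor
    {
        bool beat = false;
        if (history.size() >= span) {
            double avg = std::accumulate(history.begin(), history.end(), 0.0)
                         / history.size();
            beat = energy > threshold * avg;
            history.pop_front();
        }
        history.push_back(energy);
        return beat;
    }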

1 Dec

Goal: Define a terrain deformation event structure that results in realistic terrain with some recognizable mapping to a song
Result:
  • Decided on a fractal-style algorithm using "faults" that randomly deform the terrain. Over several hundred iterations the terrain looks interesting, but blocky. To counter the blockiness, added smoothing events that consider neighboring terrain and, over multiple passes, produce more realistic-looking terrain (both are sketched just after this table).
  • These are expressed as TerrainEvent objects, which morph the terrain at a given simulation time.

Goal: Map energy spikes to terrain deformation events
Result:
  • Close to this goal: on detection of a beat in a particular subband, the sound playback component generates a terrain-independent "ProtoEvent" that will eventually be mapped to an actual TerrainEvent. These ProtoEvents are organized into a TerraSonaScript object, which can serialize itself to a text script and be initialized from one.
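A sketch of the fault and smoothing events described above (the grid size, random-line construction, and displacement are illustrative):

    #include <cstdlib>

    const int N = 257;       // heightfield is N x N samples
    float height[N][N];

    // One fault event: pick a random line through the patch, raise one
    // side and lower the other. Several hundred of these roughen the
    // terrain into mountains and valleys.
    void applyFault(float displacement) {
        float a = rand() / (float)RAND_MAX - 0.5f;
        float b = rand() / (float)RAND_MAX - 0.5f;
        float c = (rand() / (float)RAND_MAX - 0.5f) * N;
        for (int x = 0; x < N; ++x)
            for (int z = 0; z < N; ++z)
                height[x][z] += (a * x + b * z > c) ? displacement
                                                    : -displacement;
    }

    // One smoothing event: pull each sample toward the average of its
    // four neighbors, softening the blocky fault edges pass by pass.
    void smoothPass() {
        for (int x = 1; x < N - 1; ++x)
            for (int z = 1; z < N - 1; ++z)
                height[x][z] = (height[x - 1][z] + height[x + 1][z] +
                                height[x][z - 1] + height[x][z + 1] +
                                height[x][z]) / 5.0f;
    }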

8 Dec

Goal: Allow the user to map and load multiple songs
Result:
  • The scripting tool takes four songs, and the scripting system handles events per song.

Goal: Integrate the beatvis playback with the graphics viewer and map each song to a point on the terrain
Result:
  • Audio playback now drives simulation time, which determines when events are applied to the terrain. The strength of an event's application depends on the user's distance from its song (sketched just after this table).
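Put together, the viewer's per-frame update might look like this sketch (the names and callback plumbing are illustrative; the weight function is the distance falloff sketched under Design):

    #include <cstddef>
    #include <vector>

    struct TerrainEvent {
        double time;    // seconds into the song
        int    songId;  // which song generated this event
        // ... deformation parameters ...
    };

    // Apply every event whose timestamp the audio clock has passed,
    // scaled by the user's proximity weight for that event's song.
    // Assumes the events are sorted by time.
    void pumpEvents(const std::vector<TerrainEvent>& events,
                    std::size_t& next,      // first unapplied event
                    double simTime,         // seconds of audio played
                    float (*weightFor)(int songId),
                    void (*applyEvent)(const TerrainEvent&, float weight))
    {
        while (next < events.size() && events[next].time <= simTime) {
            const TerrainEvent& e = events[next++];
            float w = weightFor(e.songId);
            if (w > 0.0f)
                applyEvent(e, w);
        }
    }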

Original


The early milestones are currently set to be tech-demo style. The interactivity is currently pushed back to later milestones. However, clearly this needs to be designed with network interaction in mind from the beginning. Defining the networking might be moved to an earlier milestone.

Funny


Google Translator: Terra Sona


Andrew Vogel, avogel at ccrma, Fall 2010