The primary goal of the project is to sonify data from six tree species
from a study on Alaskan forests; over the course of the piece, we move
through time and space, tracking populations of trees - living and dead
- in sound.
Sonification
Data from this study was sonified (handwavey explanation pending
further developments) and presented in two movements: first, a solo
from a single, initially dominant tree species, then a pass with all
tree species voices singing.
Visualization
Another goal of the project has been to create a "multi-pass"
interpretation of a single data set: that is, to present simultaneous,
parallel representations of the data in multiple modalities and
interpretation schemes. Along with the musical representation, tree
data also feeds into a visualizer Max patch which generates blooming
colors. The colors "reseed" to represent a change of sample plot (i.e.
where the tree data came from), and their relative weightings are
modulated by amplitude of the music playing and by motion as observed
by the laptop's webcam.
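The weighting behavior described above can be sketched in code. This is a rough illustration of the idea, not the actual Max patch: the class name, the 0–1 ranges for amplitude and motion, and the update rule are all assumptions.

```python
# Rough sketch of the color-weighting idea (not the actual Max patch).
# Each tree species gets a color whose relative weight is nudged by the
# music's amplitude and by webcam motion; a plot change reseeds the palette.
import random

class BloomVisualizer:
    def __init__(self, species):
        self.weights = {s: 1.0 for s in species}  # relative color weights

    def reseed(self):
        """Called when the sample plot changes: pick new base weights."""
        for s in self.weights:
            self.weights[s] = random.uniform(0.5, 1.5)

    def update(self, amplitude, motion):
        """Modulate weights by audio amplitude and webcam motion (both 0-1)."""
        gain = 1.0 + 0.5 * amplitude + 0.5 * motion
        total = sum(self.weights.values())
        for s in self.weights:
            # normalize so weights stay relative, then apply the gain
            self.weights[s] = (self.weights[s] / total) * gain
```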
Screenshots
Movement
The movement accompanying the piece was choreographed around the forest
music, and thus its dynamics represent a third interpretation of the
data set. Originally, the intention was to perform this piece of the
project aerially, ideally while suspended from a tree. Although the
idea has some compelling symbolism, it quickly became apparent that for
both pragmatic and aesthetic reasons, it was a better idea to keep it
simple.
"With a series of artistic research phases known as
Bêta Tests and an academic line of enquiry lasting several years, the
Gynoïdes Project is a wide-ranging operation aimed at creating
alternative, feminist strategies for circus creation. Here the artistic
director, Marie-Andrée Robitaille, talks about the project's
objectives, about the expectations and pressures placed on women in
circus, and about the sonification of circus
equipment – a strategy developed during Bêta Test V."
"In this paper, a series of exploratory design
processes resulting in proofs of concepts are presented, showing
strategies for effective use of three different modes of sonic
interaction in contemporary circus. Each design process is based on
participatory studio work, involving professional circus artists. All
of the proofs of concepts have been evaluated, both with studio studies
and public circus performances, taking the work beyond theoretical
laboratory projects and properly engaging the practice and culture of
contemporary circus.
The first exploration uses a contortionist’s
extreme bodily manipulation as inspiration for sonic
manipulations in an accompanying piece of music.
The second exploration uses electric amplification
of acoustic sounds as a transformative enhancement of existing
elements of circus performance.
Finally, a sensor-based system of
real-time sonification of body gestures is explored, and ideas
from the sonification of dance are translated into the realm of
circus."
Stelarc is a performance artist who has visually probed
and acoustically amplified his body... He has used medical instruments,
prosthetics, robotics, Virtual Reality systems, the Internet and
biotechnology to engineer intimate and involuntary
interfaces with the body. He explores Alternate Anatomical
Architectures with augmented and extended body constructs. He
has performed with a THIRD HAND, an EXTENDED ARM, a VIRTUAL ARM, a
STOMACH SCULPTURE and EXOSKELETON, a 6-legged walking robot. His
FRACTAL FLESH, PING BODY and PARASITE performances explored involuntary, remote and
internet choreography of the body with electrical stimulation of the
muscles.
"The Amplified Body performances begin in 1971 and
continue till the late 1980s. EEG (brainwaves), EMG (muscles), ECG
(heartbeat) and Ultrasound (blood-flow) were used. I also amplified
stomach sounds. The performances began when the body was switched on
and were completed when the body was switched off. The cacophony of
sound was altered both by physiological control and also through the
fatigue of the body over the duration of the performance. For example,
adjusting attention altered brainwave frequency and slowing down
breathing affected heartbeat. Constricting the radial artery of the
wrist changed the ultrasound measure of blood-flow. Several kinds of
ultrasound sensors were used. The flat type sensor was fixed to the
wrist for shallow monitoring of blood-flow. The handheld pencil-type
sensor was used for amplifying deeper heart activity, like the heart
valves opening and closing and the sloshing of the blood through them."
"Split physicality and reduced bodily autonomy are
explored as performers’ movements directly
trigger and influence the
bodies of others through the use of electrical stimulation.
Performers
Meghan Anderson and Jacob Regan perform short movement cycles (roughly
2 minutes) while interactions between all three performers change. Each
performer is wearing the following equipment:
- Wrist pack with accelerometer and Particle Photon (a WiFi-connected microprocessor)
- Body pack with accelerometer, Particle Photon, relay unit, and 2 TENS units
Accelerometer data is sent to a laptop, where I can create
interactions by assigning accelerometer data, through thresholds, to
trigger specific electrodes on each performer’s body. Additionally, any
time Anderson or Regan receives an impulse from a TENS unit, I receive a
similar impulse, stretched longer in duration."
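The threshold-triggering described above can be sketched minimally as follows. The threshold values, the use of total acceleration magnitude, and the performer-to-electrode mapping are illustrative assumptions, not the actual show configuration.

```python
# Minimal sketch of threshold-based triggering: when a performer's
# accelerometer magnitude crosses an assigned threshold, fire the
# electrode mapped to that performer. Thresholds and mappings here
# are illustrative, not the actual performance setup.
import math

def magnitude(ax, ay, az):
    return math.sqrt(ax * ax + ay * ay + az * az)

def check_triggers(readings, assignments):
    """readings: {performer: (ax, ay, az)};
    assignments: {performer: (threshold, electrode_id)}.
    Returns the list of electrodes to pulse this frame."""
    fired = []
    for performer, accel in readings.items():
        threshold, electrode = assignments[performer]
        if magnitude(*accel) > threshold:
            fired.append(electrode)
    return fired
```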
The Heart Chamber Orchestra (HCO) is an audiovisual
performance. The orchestra consists of 12 classical musicians and the
artist duo Peter Votava and Erich Berger (aka Terminalbeach). The
heartbeats of the musicians control a computer composition and
visualisation environment. The musical score is generated in real time
by the heartbeats of the musicians. They read and play this score from
a computer screen placed in front of them. HCO forms a structure where
music literally “comes from the heart.”
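One way heartbeats could drive a score in real time is by estimating each musician's tempo from recent beat timestamps. This is a guess at the general principle only, not HCO's actual system; the function and its averaging scheme are assumptions.

```python
# Illustrative only: estimate beats per minute from heartbeat timestamps,
# which a score generator could then map to tempo or note values.
# This is a sketch of the principle, not HCO's actual software.
def bpm_from_intervals(beat_times):
    """Estimate BPM from a list of beat timestamps in seconds."""
    if len(beat_times) < 2:
        return None  # need at least one inter-beat interval
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    return 60.0 / (sum(intervals) / len(intervals))
```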
Created for the "Sites of Memory" show in the basement
of Stephan Stoyanov Gallery, this piece was inspired in part by
passages from Gaston Bachelard's The Poetics of Space: "it is possible,
almost without commentary, to oppose the rationality
of the roof to the irrationality of the cellar". The cellar is
"first and foremost the dark entity of the house, the one that partakes
of subterranean forces. When we dream there, we are in harmony with the
irrationality of the depths". The piece presents the visitor with a
circular portal "drawn" on a wall, using 12 small proximity
sensors to sketch this wheel-like portal's outlines. This
portal is an interactive sound interface - when visitors place any part
of their body in front of the sensors, they trigger sound. Moving
towards or away from a sensor varies qualities of the sound,
increasing or decreasing its speed, pitch, volume, and so on. The
psychological and referential qualities of the sonic space (coloring
the built space that houses the piece) thus shift with a visitor’s
conscious and unconscious movements.
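A distance-to-sound mapping of the kind described might look like the sketch below. The sensor range, the linear curve, and the choice of volume and pitch as targets are assumptions for illustration, not the piece's actual mapping.

```python
# Sketch of mapping a proximity reading to sound parameters: closer to a
# sensor means louder and higher-pitched. Ranges and the linear curve
# are assumptions, not the installation's actual mapping.
def map_distance(distance_cm, max_cm=100.0):
    """Map a sensor distance (0..max_cm) to a volume in 0-1 and a
    pitch ratio relative to the base pitch."""
    d = min(max(distance_cm, 0.0), max_cm)  # clamp to sensor range
    closeness = 1.0 - d / max_cm            # 1.0 right at the sensor
    volume = closeness
    pitch_ratio = 1.0 + closeness           # up to an octave above base
    return volume, pitch_ratio
```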
ATLAS in silico is the result of a dynamic art + science
collaboration. It presents the entire first release of 17.4 million
metagenomics sequences from the Global Ocean Sampling Expedition (GOS)
in an installation blending virtual reality, spatialized audio,
interactive computer graphics, and full-body interaction. The
dream-like, luminous interactive virtual environment enables
participants to see, hear, and playfully interact with the metagenomics
data. As they interact with dynamic patterns of light and sound,
participants explore a Scalable Metadata Environment (MDE) made from
GOS data that spans from the scale of molecules to the scale of
socio-economic trends. MDEs are one of the new technologies developed
and prototyped in the creation of ATLAS in silico. They are 4D virtual
environments structured by quantitative and qualitative metadata
describing multidimensional data collections.
Prof. Ruth West (University of North Texas), is engaged
in collaborative research with Prof. Cindy Grimm (Oregon State
University) and Prof. Tao Ju (Washington University St. Louis) to study
the cognitive and perceptual basis of how experts extract 3D shapes
from volumetric data, such as electron tomography, MRI or CT imaging.
This process, known as “segmentation,” plays an essential role in the
interpretation and analysis of volume data in a variety of application
domains. Understanding what a segmenter sees, thinks, and does while
interacting with a data set will help to make future tools more
efficient, alleviating the major scientific bottleneck posed by the
time-intensive nature of segmentation. It will also help in developing
better tools to improve the accuracy and repeatability of the
segmentation process, positively enhancing the quality of the resulting
data for use in a variety of applications, including biomedicine,
clinical practice and environmental engineering.
Dreamspace Fragments is an exploration into generating
representations of the narrative content of dreams as virtual
environments, or “dreamspaces.” Using a derivative of the
classification system for dream content analyses developed by Calvin
Hall, I encoded the narrative content of each of 1’ dreams recorded from
1989 to 2000 into numeric values, which were then used to define the
parameters for generating 3-dimensional objects that are rendered as
VRML worlds.
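The encoding step could be sketched as below. The category names, the counts-to-parameters mapping, and the parameter names are all invented for illustration; the actual Hall-derived coding scheme is not specified here.

```python
# Toy sketch of the encoding idea: counts of dream-content categories
# (in the spirit of Hall's content-analysis system) become parameters
# for a generated 3D object. Category names and the mapping are
# assumptions, not the actual Dreamspace Fragments scheme.
def to_parameters(counts):
    """counts: {category: occurrences} for one dream report.
    Returns simple geometry parameters for a generated object."""
    total = sum(counts.values()) or 1  # avoid division by zero
    return {
        "sides": 3 + counts.get("characters", 0),          # more characters, more faces
        "scale": 1.0 + counts.get("emotions", 0) / total,  # emotional dreams grow larger
        "hue": (counts.get("settings", 0) * 60) % 360,     # settings rotate the color
    }
```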