‘By transforming the metaphor of Sisyphus with synthetic soles, one does not transform the secret of the stone. The pedestrian’s movements build a physics of momentum, art, love and technology; the stone never falls from the moebius strip’.
Kinetics are the signs of a living system in action. The composer wishes to extend VR technology to nourish the human listening capacity so as to guide our actions in an environment. The computational space is configured for an intelligible representation of a system of references in which the living system performs its movements with an ecological orientation. The sense of ecology intersects both internal and external orientations, towards one’s own body and the environment. This empowers an observer as a performer.
Beyond the notions of space in terms of moving objects bound in physical space, a complex space can be functionalized in computational space such that an observer/performer can access and interact with the computational processes, and generate traces as well. In this case the traces are the record of kinetic movements associated with the computational processes. The kinetic movements are inseparable from the processes, as the movements are guided and motivated by reports and evaluations of the total action, based upon visual and auditory feedback.
Rolling Stone is created and performed using ScoreGraph, an object-oriented software environment. ScoreGraph was developed for configuring virtual reality applications by specifying the connectivity and synchronization of parallel processes computing numerical models of time and space. These include graphical objects and scenes, physically-based models of dynamic systems, control signals from hardware interface devices and models for sound synthesis. ScoreGraph allows these models to be computed asynchronously and to exchange control signals for synchronous display in real-time. An interface protocol enables the run-time specification of aspects of the spatial layout, visual display, numerical simulations and control signal synchronizations. The temporal nature of events is sometimes non-linear and sometimes linear, requiring a protocol with attributes of both a directed graph and a musical score.
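To make the protocol idea concrete, here is a minimal sketch of how a system with attributes of both a directed graph and a musical score might be organized: processing nodes (e.g. interface, physics, sound) are connected by directed edges along which control signals propagate, while a score of timed events drives the linear dimension. All class and method names here are illustrative assumptions, not the actual ScoreGraph API.

```python
# Hypothetical sketch of a ScoreGraph-like protocol. Nodes stand for
# parallel processes (interface, physics, sound synthesis); directed
# edges carry control signals between them; a score of (time, node,
# value) events supplies the linear, musical-score dimension.

from dataclasses import dataclass

@dataclass
class Node:
    name: str
    signal: float = 0.0          # latest control value received

class ScoreGraphSketch:
    def __init__(self):
        self.nodes = {}          # name -> Node
        self.edges = {}          # directed edges: source -> [targets]
        self.score = []          # linear events: (time, node, value)

    def add_node(self, name):
        self.nodes[name] = Node(name)

    def connect(self, src, dst):
        self.edges.setdefault(src, []).append(dst)

    def schedule(self, time, node, value):
        self.score.append((time, node, value))

    def _propagate(self, name, value):
        # non-linear, graph-like dimension: a signal arriving at one
        # node flows downstream along the directed edges
        for dst in self.edges.get(name, []):
            self.nodes[dst].signal = value
            self._propagate(dst, value)

    def run_until(self, t):
        # linear, score-like dimension: apply timed events in order
        for time, name, value in sorted(self.score):
            if time > t:
                break
            self.nodes[name].signal = value
            self._propagate(name, value)

g = ScoreGraphSketch()
for n in ("pressure", "physics", "sound"):
    g.add_node(n)
g.connect("pressure", "physics")   # sensor drives the dynamic model
g.connect("physics", "sound")      # dynamic model drives synthesis
g.schedule(0.0, "pressure", 0.5)   # a scored control event
g.run_until(1.0)
```

The design point is that a pure score (a sorted event list) cannot express event-driven branching, and a pure graph cannot express fixed musical timing; combining the two, as described above, accommodates both linear and non-linear temporal behavior.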
Rolling Stone is performed using CyberBoots, foot-mounted pressure sensors attached to a fuzzy logic inference process. The fuzzy logic process observes continuous pressure changes and recognizes discrete temporal and positional patterns such as walking and leaning. Sound is produced by real-time software synthesis on multiple computers running in parallel, controlled by virtual events. The CyberBoots, ScoreGraph and Sound Server environments were created by the Audio Development Group at NCSA. The graphical display is supported by the CAVE libraries developed at the Electronic Visualization Laboratory, University of Illinois at Chicago. Rolling Stone is produced by Robin Bargar. Video: Rolling Stone, Shepard’s Tones Torus
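The mapping from continuous pressure to discrete patterns can be sketched with a toy fuzzy inference step: heel and toe readings are fuzzified through membership functions, a small rule base combines them with fuzzy AND/NOT, and the strongest rule yields a discrete posture. The membership functions, rule set and gesture names below are illustrative assumptions, not the actual CyberBoots implementation.

```python
# Hypothetical sketch of fuzzy classification of foot-pressure patterns,
# in the spirit of the CyberBoots description: continuous heel/toe
# pressures (normalized 0..1) become discrete positional patterns.

def ramp(x, lo, hi):
    """Membership rising linearly from 0 at lo to 1 at hi."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def classify(heel, toe):
    # fuzzify: degree to which each region of the sole is loaded
    heel_on = ramp(heel, 0.2, 0.8)
    toe_on = ramp(toe, 0.2, 0.8)
    # rule base: min = fuzzy AND, (1 - x) = fuzzy NOT
    rules = {
        "standing":        min(heel_on, toe_on),
        "leaning forward": min(toe_on, 1.0 - heel_on),
        "leaning back":    min(heel_on, 1.0 - toe_on),
        "foot raised":     min(1.0 - heel_on, 1.0 - toe_on),
    }
    # defuzzify to a discrete pattern: strongest rule wins
    return max(rules, key=rules.get)

print(classify(0.9, 0.1))  # heel loaded, toe light
```

Recognizing a temporal pattern such as walking would further require watching a sequence of these postures over time (e.g. alternating "foot raised" and "standing" across the two boots), which the continuous-observation design described above supports.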
- Insook Choi (USA), composer-in-residence at the National Center for Supercomputing Applications and Researcher in Human-Computer Intelligent Interaction for the Beckman Institute, University of Illinois, has created numerous projects integrating computing environments into performances for artistic venues. Her research directions include sound synthesis with nonlinear dynamical systems, real-time control strategies for high-dimensional models, and auditory display in virtual environments. Choi’s research papers are often coupled with compositions and presented across the fields of engineering, art and music.