XXX Gallery, May 20, 2016
“Hong Kong’s first algorave! Free for ISEA delegates; 100 HKD for non-ISEA delegates.
An algorave is an event where people dance to music generated from algorithms, often using live coding techniques. Algoraves can include a range of styles… and have been described as a meeting point of hacker philosophy, geek culture, and clubbing.
Although live coding is commonplace, any algorithmic music is welcome, provided it is “wholly or predominantly characterised by the emission of a succession of repetitive conditionals”… although algorave musicians have been compared to DJs, they are in fact live musicians or improvisers, creating music live… (from Wikipedia)
Line-up: DJ Sniff, Renick Bell, Yen Tzu Chang, Calum Gunn, Shelly Knotts, + Ryan Jordan”
About the Renick Bell performance:
In “Hong Kong Algorave Performance, 160520,” improvised programming generates danceable percussive music emphasizing generative rhythms and their variations. All of my interaction with the system is projected for the audience to see. The custom live coding system is a Haskell library called Conductive. It triggers a software sampler built with the Haskell bindings to the SuperCollider synthesizer and loaded with thousands of audio samples (as many as 18,000). Through live coding, I manipulate multiple concurrent processes that spawn events, controlling the number of processes, the type of events spawned, and other parameters.
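The core idea of a process spawning sound events from a rhythm pattern can be sketched in plain Haskell. This is a generic illustration only, not the Conductive or hsc3 API; the Event type, the schedule function, and the 'x'/'.' pattern notation are hypothetical names chosen for the sketch.

```haskell
module Main where

-- Hypothetical event type: which sample to trigger and when, in seconds.
data Event = Event { time :: Double, sample :: String }
  deriving (Eq, Show)

-- Map a textual rhythm pattern ('x' = hit, '.' = rest) onto a timing
-- grid, yielding the events a player process would send to the sampler.
schedule :: Double -> String -> String -> [Event]
schedule stepDur smp pat =
  [ Event (fromIntegral i * stepDur) smp
  | (i, c) <- zip [(0 :: Int) ..] pat, c == 'x' ]

main :: IO ()
main = mapM_ print (schedule 0.25 "kick" "x.x.x..x")
```

In a live setting, a concurrent process would walk such an event list in real time, sending each hit to the sampler as its onset arrives.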
At least two methods are used to generate base rhythms: stochastic methods and L-systems (rewriting grammars originally devised to describe plant growth). In the former, sets of rhythmic figures are generated stochastically. From them, figures are selected at random and joined to form larger patterns. In the latter, L-systems are coded live and used to generate patterns. These patterns are then processed into a stack of variations with higher and lower event density. That stack is traversed according to a time-varying value to create dynamically changing rhythms. Simultaneously, patterns in which audio samples and other parameters are assigned to sequences of time intervals are generated through similar methods. The concurrent processes read the generated data and use it to synthesize sound events according to the rhythm patterns described above.
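The two generation methods and the density-stack idea can be sketched as follows. This is a toy illustration, not the code used in the performance: the rewrite rules, the figure set, the thinning rule, and the small deterministic pseudo-random generator are all assumptions made so the sketch runs with no packages outside base.

```haskell
module Main where

-- Tiny deterministic pseudo-random generator (linear congruential),
-- standing in for a proper random source.
lcg :: Int -> Int
lcg s = (s * 1103515245 + 12345) `mod` 2147483648

-- L-system sketch: 'x' = hit, '.' = rest; each symbol rewrites into a
-- short figure, and n rewriting passes grow a seed into a pattern.
rewrite :: Char -> String
rewrite 'x' = "x.x"
rewrite '.' = "x."
rewrite c   = [c]

lsystem :: Int -> String -> String
lsystem n seed = iterate (concatMap rewrite) seed !! n

-- Stochastic sketch: select figures at random from a small set and
-- join them to form a larger pattern.
stochasticPattern :: Int -> Int -> String
stochasticPattern seed n =
  let figures = ["x...", "x.x.", "xx..", "x..x"]
      picks   = take n (map (`mod` length figures) (tail (iterate lcg seed)))
  in concatMap (figures !!) picks

-- One lower-density variation for a stack: hits at odd step positions
-- become rests. Repeated thinning yields progressively sparser layers.
thin :: String -> String
thin p = [ if c == 'x' && odd i then '.' else c
         | (i, c) <- zip [(0 :: Int) ..] p ]

main :: IO ()
main = do
  putStrLn (lsystem 3 "x")
  putStrLn (stochasticPattern 2016 4)
  putStrLn (thin (lsystem 3 "x"))
```

A time-varying index into such a stack of thinned and densified variants then selects which layer sounds at any given moment.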
This performance also uses additional agent processes that change system parameters (conductor agents), running alongside the sample-playing agent processes (instrumentalist agents). These conductor agents are the result of my recent research into how autonomous processes can complement live coding activity. The conductors stop and start instrumentalists, and also change the other parameters the instrumentalists use for sample triggering, such as which sample to play and which rhythm pattern to follow. The live coding involves not only the patterns for rhythms and samples but also the algorithms which the conductors use during the performance.
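The shape of this conductor/instrumentalist arrangement can be sketched with ordinary Haskell concurrency primitives. The Params record and the agent functions below are hypothetical names invented for the sketch, not Conductive's API; printing stands in for triggering a sample.

```haskell
module Main where

import Control.Concurrent (forkIO, threadDelay)
import Control.Concurrent.MVar (MVar, newMVar, readMVar, modifyMVar_)

-- Shared parameters an instrumentalist reads on every beat.
data Params = Params { sampleName :: String, pattern :: String }

-- Instrumentalist agent: reads the current parameters each beat and
-- "plays" them (here, by printing).
instrumentalist :: MVar Params -> Int -> IO ()
instrumentalist box beats = mapM_ step [1 .. beats]
  where
    step i = do
      Params s p <- readMVar box
      putStrLn ("beat " ++ show i ++ ": " ++ s ++ " " ++ p)
      threadDelay 100000            -- 0.1 s per beat

-- Conductor agent: mutates the shared parameters on its own schedule,
-- steering the instrumentalist without the performer typing each change.
conductor :: MVar Params -> IO ()
conductor box = do
  threadDelay 250000
  modifyMVar_ box (\p -> return p { pattern = "xx.x" })
  threadDelay 250000
  modifyMVar_ box (\p -> return p { sampleName = "snare" })

main :: IO ()
main = do
  box <- newMVar (Params "kick" "x...")
  _ <- forkIO (conductor box)
  instrumentalist box 8
```

Live coding then operates one level up: the performer rewrites the conductor's algorithm while both kinds of agent keep running.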
My interaction involves activities such as generating data, continuously reselecting which data to use throughout the performance, changing the number of running concurrent processes, and determining when changes occur. By manipulating both instrumentalist and conductor agents and the data they read, a rapidly changing stream of rhythmically complex bass, percussion, noise, and tones is improvised according to a rough sketch of the overall performance structure. This improvisation crosses the genre boundaries of bass music, noise music, and free improvisation. The projection shows all of my activities, including code editing and execution of code in the interpreter. When I press “Enter” on the keyboard, the line under the cursor is sent to the interpreter and immediately executed. Pressing F11 causes the code block under the cursor to be sent and executed. Text output of functions is printed in the interpreter.
The primary technologies used include:
– Conductive, a library for live coding in Haskell
– the Haskell programming language, through the Glasgow Haskell Compiler (GHC) interpreter
– the SuperCollider synthesis engine (but not its programming language)
– hsc3, the Haskell bindings to SuperCollider
– the xmonad window manager
– the vim text editor
– the tmux terminal multiplexer
– the tslime plugin for vim
Other open-source tools are essential for the performance, including:
– an Arch Linux computer
– the Calf Jack Host
– Renick Bell, improvising algorithmic sound and images, coding, teaching, based in Tokyo, Japan.

I improvise music performances through live coding using a software library that I have written called Conductive. Live coding, or performance through programming, enables a performer to manipulate symbols rather than use physical gestures to carry out a performance. For me, physical gesture is more limited in expressivity than the manipulation of symbols representing abstractions; while humans have learned to use a complex vocabulary of gestures to produce art, the real-time manipulation of text-based symbols may increase the range of what is expressible. I also find it more convenient than typical graphical software manipulated with a mouse. Through live coding, I gain a text-based control center: I can specify complex parameter changes to be executed simultaneously while using a variety of existing programming tools to increase efficiency. From another perspective, live coding extends algorithmic composition, turning it into a live performance rather than the write/compile/run loop of traditional software development or electronic music composition. Because the music is generated algorithmically, I am often surprised and challenged by the resulting output. With these tools and methods, I can explore combinations of sounds and rhythm patterns, most of which I could not achieve without my software. I seek such experiences for myself in performances. At the same time, I want to continually make the code that I use more expressive. Exploration of these areas fascinates me. renickbell.net
Full text and photo (PDF) p. 32-34