In the late 1990s a new performance practice appeared in the more experimental venues of the musical world: performers would step onto the stage with a rather strange musical instrument, the laptop. These performance contexts, in pubs and clubs, were primarily designed for pop or rock bands. Instead of locating themselves behind the mixing desk, where the best sound is normally to be heard, these musicians placed their equipment on the stage, typically on a table, and presented some refreshingly novel musical worlds. Whilst audiences appreciated the texturally sophisticated sound worlds these instruments were capable of, the performance aspect of the music suffered. What were these musicians actually doing behind their screens on the stage?
A decade later some solutions had evolved, addressing this lack of coupling between the performer’s gestures and the sound emitted by the speakers. One of them is VJing. By analysing the sound signal – typically through fast Fourier transform (FFT) analysis, or via OSC messages sent from the sound-generating software – the VJ is able to generate visuals that connect to and represent the sound in endlessly interesting, yet arbitrary, ways. Another solution is represented by a field often called NIME (New Interfaces for Musical Expression), with university courses and conferences dedicated to its investigation (see www.nime.org). Here various interfaces have been designed that allow the performer to use her body, in a manner inspired by acoustic instruments, to control a digital sound engine. The third response to the problem of the opacity of computer music performance is live coding.
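The audio analysis mentioned above can be sketched in a few lines. The following is a minimal illustration, not any particular VJ tool’s implementation: it windows a buffer of audio samples, takes the magnitude spectrum via an FFT, and groups it into a handful of frequency bands whose energies could then drive visual parameters (colour, size, motion). The function name `band_energies` and the band count are assumptions for the sketch.

```python
import numpy as np

def band_energies(samples, n_bands=8):
    """Return the mean spectral magnitude of n_bands frequency bands
    for one buffer of audio, suitable for driving visuals."""
    # Hann window reduces spectral leakage at the buffer edges.
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    # Split the magnitude spectrum into roughly equal-width bands.
    bands = np.array_split(spectrum, n_bands)
    return [float(b.mean()) for b in bands]

# A hypothetical frame of audio: a 440 Hz sine tone at 44.1 kHz.
sr = 44100
t = np.arange(1024) / sr
frame = np.sin(2 * np.pi * 440.0 * t)

energies = band_energies(frame)
# 440 Hz lies in the lowest of the eight bands, so that band's
# energy dominates the list.
```

In practice such per-band energies would be recomputed every video frame and smoothed over time, which is what gives the visuals their characteristic pulse with the music.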
- Dr. Thor Magnusson (UK), a senior lecturer at the University of Brighton’s Faculty of Arts, works in the fields of music and generative art. His recent research and publications concern improvisation in electronic music, the philosophy of technology, instrument design, and music programming language design. Thor is a co-founder and member of the ixi audio collective.