The proliferation of touchscreen smartphones and tablets has created a surge of new interest in incorporating personal mobile devices into visual and sonic art. These devices give millions of users access to an array of sophisticated sensors that artists can exploit in interactive installations and performances. Unfortunately, software enabling the creation of custom interfaces for such works lags far behind the hardware advances of the last three years. Apple’s prohibition of most user scripting has prevented developers from letting artists define interfaces more complex than simple banks of virtual sliders and knobs; such scripting is a necessity for dynamic interfaces that change their behavior in response to hardware sensor readings, user input and network requests. A further problem with current artistic interface applications is that they expose only two of a device’s many sensors (the touchscreen and the three-axis accelerometer), ignoring other sensors valuable for interactivity such as gyroscopes, microphones and video cameras.
This paper contextualizes Control by looking at the history of earlier touchscreen devices such as the JazzMutant Lemur, by examining other iOS and Android solutions for artistic interface development, and by considering the influence on the project of two movements in the field of Human-Computer Interaction: End-User Programming and Meta-Design. The paper concludes by discussing the author’s personal use of Control in his artistic and musical practice and by offering propositions about the future role of mobile devices in audience interaction pieces and participatory artistic installations.
charlie-roberts.com/Control
- Charles Roberts, University of California, Santa Barbara, USA — charlie-roberts.com