If you could share and exchange your embodied dream experience, imagery, emotions and sensations with your friends and loved ones, how would you do it? If you could not only share and exchange, but remix and collage them, what would they look like or feel like? How would this work?
The aim of my PhD art-research is, at a meta-conceptual level, to uncover new understandings of the sensations of ‘liveness’ (Auslander, 1999) and ‘presence’ that may emerge from the use of mobile technologies and wearable devices within performance contexts. To explore these concepts, I created a practical project to investigate them through and within several participatory performances, including live visual explorations intended to simulate dreaming and embodied VJ-ing (video jockeying). The project MindTouch, discussed here, is a mobile performance project that uses biofeedback sensors and mobile phones in live, staged, streaming video performance events to simulate dream embodiment and telepathic exchange. The aim of this paper is to discuss the project and the research conducted at the SMARTlab Digital Media Institute at the University of East London, under the direction of Professor Lizbeth Goodman, in terms of technical and aesthetic developments from 2008 to the present, as well as the final phase of staging the performance events, beginning in July 2009.
- Camille Baker, Interactive Arts, SMARTlab, University of East London, UK
pp. 521–529