[ISEA2006] Paper: Jill Scott – e-skin: wearable interfaces for the visually impaired

Abstract

For the visually impaired, navigating the city is a truly invisible experience, one that relies on cross-modal interaction between audible and tactile stimuli. E-skin uses new technologies of mobile and pervasive computing, together with new touch- and sound-based metaphors, which may help the visually impaired learn to experience the city as a shared cultural feedback loop.

Through workshops with congenitally blind users, e-skin is evolving into an experimental project to develop a wearable electronic skin for spatial awareness and interactive learning. The interface is based on the biological sensory capabilities of the human skin. Clinical studies in neuropsychology clearly demonstrate that the information-acquisition and processing capabilities of human skin rival those of our visual and auditory systems (Kaczmarek and Bach-y-Rita, 1995). Tactility can be combined with sound feedback and transferred via cross-modal interaction into neural patterns, which are optimal for learning (Te Boekhorst et al., 2003). The different receptors participating in our bodily haptic system contribute to a neural synthesis that interprets position, movement, and mechanical skin inputs. Druyan (1997) argues that this combination of kinaesthetic and sensory perception creates particularly strong neural pathways in the brain. It has also been shown that our haptic system is superior to vision in its capability to discriminate different textures (Sathian et al., 1997; Verry, 1998) as well as micro-spatial properties such as compliance, elasticity, viscosity and temperature (Lederman, 1983; Zangaladze et al., 1999). Vision and haptics (tactility) complement each other in that vision aids the perception of macro-geometry while haptics excels in the detection of micro-geometry. Research on blind people's mobility (Colledge et al., 1996; Ungar et al., 1996) indicates that the deficiency in the visual channel is compensated for by touch and hearing (Lahav and Mioduser, 2003), whereas in the case of non-impaired people most of the information required for mentally mapping space is gathered through the visual channel (Lynch, 1960).

From our related studies and workshops we have found that while there have been many studies on the role of skin modalities in object manipulation and recognition, there is relatively little information available on how pressure, temperature and vibration participate in our perception of space. In our e-skin prototypes, temperature plays an important role in orientation and navigation: the skin can detect a warming of only 0.4 degrees centigrade and a cooling of 0.15 degrees centigrade (Kenshalo et al., 1961), providing us with important information about our current environment. With e-skin we are attempting to fill this gap in our understanding of the relationship between tactility and sound, with the aim of improving access to information and navigation.

The aim of e-skin is to develop two wearable and usable accessories, with slightly different configurations, for the visually impaired. The first version of e-skin will support the visually impaired on the market street or in supermarkets; the second version will enable the wearer to attend and actively participate in cultural events (art galleries, theatre events or dancing on a dance floor). Existing navigation aids for blind people are either obtrusive, not wearable and/or rely solely on acoustic feedback. Many of these devices are difficult for blind people to use; for example, acoustic feedback interferes with and blocks normal hearing, one of the most important senses for gathering information about the environment. Our technical goal is to substitute vision with a combination of tactile and acoustic feedback and actuation, provided by accessories whose sensors and actuators are linked to a wearable computer (QBIC). These devices would be connected over a WiFi network.
The e-skin accessories will consist of several interactive components:
– An armband with pressure-sensitive and electro-tactile fabric, an acoustic bone transducer, accelerometers, a temperature sensor, a directional RFID antenna and vibration motors
– An ankle band containing accelerometers and vibration motors
– A shoulder strap with an electronic compass and an ultrasonic range sensor
– A computer belt containing a QBIC computer (incl. WiFi), RFID readers, antennas and vibration motors
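As a concrete illustration of how these components could work together, the following minimal sketch (in Python) models a single sensing-to-actuation cycle: range, temperature and RFID readings are fused and mapped to vibration cues at different body sites. Every name, threshold and interface here is our illustrative assumption; the paper specifies only the hardware components and the QBIC/WiFi link, not any software API.

```python
# Minimal sketch of one sensing-to-actuation cycle, under assumed interfaces:
# the paper names the hardware (ultrasonic range sensor, temperature sensor,
# RFID antenna, vibration motors) but does not define a software API.
import random
import time
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorFrame:
    range_cm: float          # shoulder-strap ultrasonic range sensor
    temperature_c: float     # armband temperature sensor
    rfid_tag: Optional[str]  # directional RFID antenna on the armband

def read_sensors() -> SensorFrame:
    """Stub: a real prototype would poll the accessories over WiFi via QBIC."""
    return SensorFrame(
        range_cm=random.uniform(20.0, 400.0),
        temperature_c=random.uniform(18.0, 26.0),
        rfid_tag=random.choice([None, "shelf-0421"]),
    )

def vibrate(site: str, intensity: float) -> None:
    """Stub actuator call; site is 'armband', 'ankle' or 'belt'."""
    print(f"vibrate {site} at intensity {intensity:.2f}")

def feedback_cycle(prev_temp_c: float) -> float:
    frame = read_sensors()

    # Obstacle cue: the nearer the obstacle, the stronger the vibration
    # (150 cm is an assumed warning distance).
    if frame.range_cm < 150.0:
        vibrate("armband", 1.0 - frame.range_cm / 150.0)

    # Temperature cue: the skin itself resolves a warming of ~0.4 C and a
    # cooling of ~0.15 C (Kenshalo et al., 1961), so changes of that order
    # are treated as environmentally meaningful.
    delta = frame.temperature_c - prev_temp_c
    if delta >= 0.4 or delta <= -0.15:
        vibrate("belt", 0.3)

    # RFID cue: a tagged product or landmark is in range of the antenna.
    if frame.rfid_tag is not None:
        vibrate("ankle", 0.5)

    return frame.temperature_c

if __name__ == "__main__":
    temp_c = read_sensors().temperature_c
    for _ in range(5):
        temp_c = feedback_cycle(temp_c)
        time.sleep(0.2)
```

Distributing the cues across body sites (armband, belt, ankle) mirrors the paper's decision to place actuators on several accessories, presumably so that distinct sensations do not mask one another.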

E-skin is being developed for and tested in two different city environments. The first is a real shopping market environment or street. This setting has been chosen because shopping is one of the most inaccessible and frustrating experiences for any visually impaired person: there are too many similar-looking products, too little variation in light, and too many obstacles in the way. From a research point of view, the supermarket is interesting because it is a highly dynamic and complex environment; also, many retailers are investigating the potential of RFID technology for their environments. The second environment is a dance floor or disco, in which visually impaired people can participate in a dance event supported and enhanced by the e-skin interface. We have been experimenting with our eight blind collaborators, who have expressed great interest in exploring the potential of gesture and communication in cultural environments alongside non-impaired people. From a research perspective, the workshops test the technical usability and ergonomics of e-skin, as well as extend the robustness of the interface in a fast and dynamic multi-user setting.

As an artist who works with Artificial Intelligence, I believe embodied interaction should take into account the physical and social phenomena which unfold in real time and real space as part of the world and the know-how in which we are situated. Many artists use 3D virtual environments because they are interested in improving the level of audience immersion, but acoustic feedback alone is not a strong enough form of interaction to achieve truly immersive results. E-skin is a shared research and development attempt to combine know-how in sensory perception, cultural studies, engineering and software design in order to humanize technology for the visually impaired. The tactile and sound perception of the visually impaired can be utilized to increase their awareness of space, and the combination of tactile sensors with movement interaction can generate valuable feedback for orientation and navigation. These potentials might also be helpful for people who have only slight vision deficiencies (like the elderly). E-skin allows for very intuitive interaction due to its sensory modalities, feedback methods and ergonomic potential, all of which are based on the human skin. Unlike most interface developments, this project integrates visually impaired users into the entire process of HCI interface development. The project will increase the accessibility of two environments (the shopping mall and the dance stage) to visually impaired people. We consider this an important step towards overcoming the social separation in the city street between handicapped and non-handicapped people.
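To ground the supermarket scenario described above, here is a brief, hypothetical sketch of how a single RFID read might be rendered non-visually: the tag ID is looked up in a product table and spoken through the acoustic bone transducer, which leaves normal hearing unobstructed. The product table, the announce function and the callback name are all assumptions made for illustration; the paper states only that RFID readers and a bone transducer belong to the kit.

```python
# Hypothetical supermarket lookup: tag IDs, product names and function names
# are invented for illustration.
PRODUCTS = {
    "04:A3:1F": "wholegrain bread",
    "04:7B:9C": "orange juice, 1 litre",
}

def announce(text: str) -> None:
    """Stub for speech output via the acoustic bone transducer."""
    print(f"[bone transducer] {text}")

def on_rfid_read(tag_id: str) -> None:
    """Called whenever the belt's RFID reader detects a tag."""
    announce(PRODUCTS.get(tag_id, "unknown item"))

on_rfid_read("04:A3:1F")  # -> [bone transducer] wholegrain bread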

  • Jill Scott, Australia/Zurich, Switzerland. Writer, artist, professor and lecturer. She holds a Masters Degree in Communications from San Francisco State University and a Doctorate in Media Philosophy from the University of Wales, Great Britain. She is a Research Professor at the University of Applied Science (FHA) and the Academy of Art and Design (HGKZ) in Zurich, Switzerland; a former Professor of Installation Design at the Bauhaus University in Weimar, Germany; and a lecturer at the University of New South Wales, College of Fine Arts, Sydney. Her awards include an Award of Distinction at Ars Electronica. She was Artist in Residence and project coordinator for the Medienmuseum at the Zentrum für Kunst und Medientechnologie (ZKM) in Karlsruhe, and a Research Fellow at the Centre for Advanced Inquiry in the Interactive Arts.