Sensory Substitution - Vision Substitution



Blind User of The vOICe
This blind lady wears The vOICe daily, here "seeing" with a covert "spy camera" hidden inside her special video sunglasses. The notebook PC running The vOICe software is inside her backpack.
Here she is locating her trash container, which a city worker had carelessly tossed aside after emptying it.
Photography: courtesy Barbara Schweizer
Sensory substitution means replacement of one sensory input (vision, hearing, touch, taste or smell) by another, while preserving some of the key functions of the original sense. In particular, research in this area aims at providing some equivalent of vision via hearing or touch, or some equivalent of hearing via vision or touch. Blindness and deafness are generally considered to be among the sensory disabilities that have the greatest impact on everyday life, and the quest for good sensory substitution devices or prostheses for blindness and/or deafness therefore presents a great challenge. In the case of partial blindness and/or deafness, sensory substitution may also augment the impaired sense rather than fully replace it.

Let's see... about sensory substitution

                        BrainPort vision device   AuxDeco FSRS         The vOICe
                        (tongue display)          (forehead display)   (auditory display)
  Cost estimate         ~$10,000                  ~$15,000             < $500 *
  Pixels (electrodes)   ~400                      512                  > 10,000
  Shades of gray        2-4 ?                     2-4 ?                > 16
  Availability          several countries         Japan                global

  * cost indication for a setup based on a Windows netbook PC with camera glasses
YouTube video clips of training during the SBIR phase I evaluation study on The vOICe by MetaModal LLC:
  - Blind man visually picks up objects using The vOICe (PIP version)
  - Blind man finds his cane using The vOICe
  - Blind man performs street crossing using The vOICe
  - Blind man scores a goal using The vOICe
  - Blind man analyzes abstract house shape using The vOICe (PIP version)
  - Blind man plays visual tic-tac-toe using The vOICe

In the remainder of this page, we will discuss options for vision substitution (also called visual substitution or sight substitution): sensory substitution targeting (total) blindness. Deafness lies outside the scope of this site, although much progress has been made in recent years with the development of cochlear implants. The cochlear implant is a prosthetic sensory bypass rather than a sensory substitution device, but the fact that it maps acoustical inputs (one physical modality) directly onto neural stimuli (another physical modality) raises issues and problems similar to those in sensory substitution devices mapping one sensory modality to another. Sensory substitution approaches may also play a role in other sensory prosthetics, for instance in sensory augmentation or cognitive augmentation through neuroelectronics. The website of Dr. Timothy Hain (dizziness-and-hearing.com) gives a number of literature references on the use of auditory and other sensory feedback for treating bilateral vestibular loss.

Vision Substitution

Apart from significant progress in treating several causes of blindness through medicine and surgery, much of the pioneering work on blindness has focussed on two main issues:
  1. Reading and writing

    Louis Braille (1809-1852) developed the well-known raised-dot code now named after him. This tactile representation was later supplemented by blindness aids such as the mechanical Braille printer and typewriter (Brailler), followed in turn by more versatile electronic implementations.

    In the early 1960s, John Linvill and James Bliss developed the Optacon, involving a tactile 6 × 24 matrix of vibrating pins covering an area of 12.7 × 29.2 mm, with active pins vibrating at about 230 Hz. The Optacon directly maps the brightness input from a camera to a corresponding vibrotactile pattern representing the printed (non-Braille) characters in the field of view of the camera. Production of the Optacon by TeleSensory was discontinued at the end of 1996. A direct auditory representation of print was provided by the camera-based 10-tone Stereotoner, which sounded one column of 10 pixels at a time while the user manually scanned the print by moving the camera over it.
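    The Stereotoner's column-at-a-time principle can be illustrated with a short sketch: one fixed-frequency sine tone per bright pixel in a 10-pixel column, mixed into a brief audio buffer. The frequency range, duration and sample rate below are arbitrary illustration values, not the actual Stereotoner design.

```python
import numpy as np

def column_to_tones(column, duration=0.05, sample_rate=8000,
                    f_low=200.0, f_high=2000.0):
    """Mix one sine tone per bright pixel in a vertical 10-pixel column.

    column: brightness values in [0, 1], top pixel first; the top pixel
    is assumed (for illustration) to map to the highest frequency.
    Returns a mono audio buffer of `duration` seconds.
    """
    n = len(column)
    t = np.arange(int(duration * sample_rate)) / sample_rate
    # Tone frequencies spaced geometrically between f_low and f_high.
    freqs = f_low * (f_high / f_low) ** (np.arange(n) / (n - 1))
    signal = np.zeros_like(t)
    for brightness, f in zip(column[::-1], freqs):  # bottom pixel -> lowest tone
        signal += brightness * np.sin(2 * np.pi * f * t)
    return signal / n  # normalize so the mix stays within [-1, 1]

# A column with only the top pixel lit yields a single high-pitched tone.
buf = column_to_tones([1] + [0] * 9)
```

    Scanning the camera across a printed character then amounts to playing such buffers back to back, one per column.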

    In another attempt to circumvent the difficulty of reading Braille, the Kurzweil reading machine (KRM) was developed around 1975 by Raymond Kurzweil. This device was an early dedicated text-to-speech engine, and newer versions have since been made. As with the Optacon, increased competition may come from Soundblaster-type speech synthesizers that are now commonly and cheaply available in multimedia PCs, combined with optical character recognition (OCR) software and a camera for acquiring (printed) textual material that is not already available electronically as ASCII characters.

    Screen magnifiers (CCTV's), including contrast enhancement, were developed as visual aids for the partially sighted.

  2. Obstacle detection and avoidance

    Obstacle detectors are intended to improve orientation and mobility by giving spatial information about the immediate neighbourhood relevant to navigation while walking. Electronic devices for this purpose are therefore also termed electronic travel aids (ETAs) or blind mobility aids.

    The long cane is a very simple but effective mechanical device for probing the environment for obstacles through touch, but also to some degree through acoustical reflections resulting from tapping. Its range for providing touch feedback is only a few feet, and it does not normally detect overhanging obstacles. The guide dog has helped to alleviate some of the limitations of the long cane for (some) blind travellers, but with respect to general orientation and navigation it still provides little information.

    Sonic Pathfinder
    Peter Meijer (left) wearing Tony Heyes' (right) Sonic Pathfinder. May 31, 1997.
    The early ETAs in particular tried to extend the small range and angular coverage of the long cane either acoustically (ultrasonically) or optically (laser light). A German laser cane (German: Laser-Langstock) is under development at VISTAC GmbH. The Mowat sensor and the Nottingham obstacle detector (NOD) are both hand-held ultrasonic torches, developed around 1980, that map ultrasonic reflections into vibrotactile and auditory signals, respectively. The Mowat and NOD devices are no longer in production. Leslie Kay, Tony Heyes and Allan Dodds played a central role in the development of sonar-based aids for the blind. Kay developed an environmental imaging sensor called the Sonicguide that provides more spatial information than the early obstacle detectors.

     Tony Heyes has improved upon the Nottingham obstacle detector (NOD), and his device is known as the Sonic Pathfinder (1984). This is a head-mounted pulse-echo sonar system comprising three receivers and two transmitters, controlled by a microcomputer.
    Sonic Pathfinder
    Tony Heyes (left) explaining the Pathfinder to Hester. May 31, 1997.
    It does not give information about surface texture, and normally the auditory display indicates only the nearest object, which is why it should be classified as an obstacle detector rather than as an imaging device. Heyes' approach differs markedly from Kay's in that the Sonic Pathfinder deliberately supplies only the minimum, most travel-relevant information needed by the user, whereas Kay strives for more information-rich sonar-based displays. Heyes argues, for instance, that an object moving away from the user is irrelevant to safe and efficient travel, and that omitting it causes less confusion while minimizing interference with hearing normal environmental sounds (e.g., traffic). The Sonic Pathfinder uses brief tones on a musical scale to denote distance, with pitch descending as an object is approached. Perception of objects on the left and on the right is further supported by tones for the corresponding ear, and two additional, off-center sonar beams are used to detect objects on the left or right. The Sonic Pathfinder is used in conjunction with the long cane.
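    The distance-to-pitch principle just described can be sketched in a few lines. The choice of scale, maximum range and base frequency below are assumptions for illustration only, not the Sonic Pathfinder's actual parameters.

```python
# Illustrative sketch (not the actual Pathfinder firmware): map an
# object's distance to a note on a major scale, with pitch falling
# as the object gets closer, as described in the text.

MAJOR_SCALE_SEMITONES = [0, 2, 4, 5, 7, 9, 11, 12]  # one octave of scale steps

def distance_to_note(distance_m, max_range_m=4.0, base_freq=262.0):
    """Return the frequency (Hz) of the scale note for a given distance.

    Nearer objects get lower notes; distance is clipped to the sonar's
    assumed maximum range. base_freq is roughly middle C.
    """
    d = max(0.0, min(distance_m, max_range_m))
    step = int(round(d / max_range_m * (len(MAJOR_SCALE_SEMITONES) - 1)))
    return base_freq * 2 ** (MAJOR_SCALE_SEMITONES[step] / 12)

# An object at maximum range sounds one octave above an object at 0 m.
assert distance_to_note(4.0) == 2 * distance_to_note(0.0)
```

    An approaching object thus steps down the scale note by note, which is easy to track by ear without attending to a continuous pitch sweep.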

    Other sonar devices for the blind include the  Miniguide,  UltraCane, BAT 'K' Sonar-Cane (ksonar.com no longer online) and an optional sonar extension for The vOICe.

During the last few decades, research has also begun to explore options for "true" vision substitution through low to medium resolution camera or sonar image maps. This has happened in spite of claims (e.g., by Shingledecker) that sensory overload would result from these richer information sources. No decisive evidence for this overload argument exists, however, and it is easy to come up with counterexamples that weaken it.

There are several distinct sensory substitution approaches heading towards medium resolution environmental imaging:

 

Literature and conference presentations:

The New York Times on sensory substitution:

"New Tools to Help Patients Reclaim Damaged Senses," by Sandra Blakeslee for The New York Times, November 23, 2004.
``In another approach, Dr. Peter Meijer, a Dutch scientist working independently, has developed a system for blind people to see with their ears. A small device converts signals from a video camera into sound patterns delivered by stereo headset to the ears. Changes in frequency connote up or down. Changes in pixel brightness are sensed as louder or softer sounds.''
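The mapping quoted above (columns scanned over time, vertical position rendered as pitch, brightness as loudness) can be sketched as follows. The scan time, sample rate and frequency range here are illustrative assumptions, not The vOICe's actual settings.

```python
import numpy as np

def image_to_soundscape(image, scan_time=1.0, sample_rate=11025,
                        f_low=500.0, f_high=5000.0):
    """Sketch of a vOICe-style image-to-sound mapping.

    image: 2-D array with image[0] as the top row, values in [0, 1].
    Columns are played left to right over `scan_time` seconds; each
    row contributes a sine tone whose frequency rises toward the top
    of the image and whose amplitude follows pixel brightness.
    """
    rows, cols = image.shape
    samples_per_col = int(scan_time * sample_rate / cols)
    t = np.arange(samples_per_col) / sample_rate
    # Row 0 (top of the image) gets the highest frequency: "up = high pitch".
    freqs = f_high * (f_low / f_high) ** (np.arange(rows) / (rows - 1))
    out = []
    for c in range(cols):
        col = sum(image[r, c] * np.sin(2 * np.pi * freqs[r] * t)
                  for r in range(rows))
        out.append(col / rows)  # normalize the per-column tone mix
    return np.concatenate(out)

# Example: a bright pixel in the top-left corner produces a high tone
# at the very start of the one-second soundscape.
frame = np.zeros((4, 4))
frame[0, 0] = 1.0
audio = image_to_soundscape(frame)
```

A real implementation would additionally pan each column between the stereo channels and smooth the transitions between columns; this sketch only shows the core time/pitch/loudness mapping.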

The New Scientist on sensory substitution:

"Senses special: The feeling of colour," by Helen Phillips for the New Scientist, January 29, 2005, pp. 40-43 (subscription required).
``In preliminary tests with a similar device designed by engineer Peter Meijer from Eindhoven in the Netherlands, the signals took a little getting used to. But after a couple of hours of feedback either from touching or being told what they were viewing, people were able to recognise objects by their sound. They could tell plants from statues and crosses from circles. But they weren't fooled into thinking they were seeing. O'Regan's prediction is that the more they can make the sound information follow the rules of visual images, the more like seeing it will feel.''

The New York Times on The vOICe:

"Seeing With Your Ears," by Alison Motluk for The New York Times, December 11, 2005.

On The vOICe approach (camera input auditory display research):

Meijer, P.B.L., "An Experimental System for Auditory Image Representations," IEEE Transactions on Biomedical Engineering, Vol. 39, No. 2, pp. 112-121, Feb. 1992. Reprinted in the 1993 IMIA Yearbook of Medical Informatics, pp. 291-300.

Meijer, P.B.L., "Cross-Modal Sensory Streams," invited presentation and demonstration at SIGGRAPH 98 in Orlando, Florida, USA, July 19-24, 1998. Conference Abstracts and Applications, ACM SIGGRAPH 98, 1998, p. 184.

Meijer, P.B.L., "Seeing with Sound for the Blind: Is it Vision?," invited presentation at the Tucson 2002 conference on Consciousness in Tucson, Arizona, USA, Monday April 8, 2002.

Fletcher, P.D., "Seeing with Sound: A Journey into Sight," invited presentation at the Tucson 2002 conference on Consciousness in Tucson, Arizona, USA, Monday April 8, 2002.

Amedi, A., Camprodon, J., Merabet, L., Meijer, P. and Pascual-Leone, A., "Towards closing the gap between visual neuroprostheses and sight restoration: Insights from studying vision, cross-modal plasticity and sensory substitution," [Abstract]. Journal of Vision, Vol. 6, No. 13, 12a, 2006. Abstract available online.

Amedi, A., Stern, W., Camprodon, J. A., Bermpohl, F., Merabet, L., Rotman, S., Hemond, C., Meijer, P. and Pascual-Leone, A., "Shape conveyed by visual-to-auditory sensory substitution activates the lateral occipital complex," Nature Neuroscience, Vol. 10, No. 6, pp. 687-689, June 2007.

Proulx, M. J., Stoerig, P., Ludowig, E. and Knoll, I., "Seeing 'Where' through the Ears: Effects of Learning-by-Doing and Long-Term Sensory Deprivation on Localization Based on Image-to-Sound Substitution," PLoS ONE, Vol. 3, No. 3, March 2008, e1840.

Kim, J.-K. and Zatorre, R. J., "Generalized learning of visual-to-auditory substitution in sighted individuals," Brain Research, Vol. 1242, pp. 263-275, 2008. Available online (PDF file).

Merabet, L. B., Battelli, L., Obretenova, S., Maguire, S., Meijer, P. and Pascual-Leone, A., "Functional recruitment of visual cortex for sound encoded object identification in the blind," Neuroreport, Vol. 20, No. 2, pp. 132-138, January 2009.

Striem-Amit, E., "Neuroplasticity in the blind and sensory substitution for vision," PhD thesis, 2013 (PDF file).

For other useful literature check out recent publications about sensory substitution. In relation to audio biofeedback (ABF), balance problems and visual impairment, see

Dozza, M., Chiari, L., Chan, B., Rocchi, L., Horak, F. B. and Cappello, A., "Influence of a Portable Audio-Biofeedback Device on Structural Properties of Postural Sway," Journal of Neuroengineering and Rehabilitation, Vol. 2, No. 13, May 2005. Available online (PDF file).

Growing military interest in sensory substitution is implied in the 2008 National Research Council (NRC) report for the US Defense Intelligence Agency, titled  "Emerging Cognitive Neuroscience and Related Technologies", which refers to both The vOICe (Motluk, 2005) and tactile displays (Bach-y-Rita and others). This report was authored by the "Committee on Military and Intelligence Methodology for Emergent Neurophysiological and Cognitive/Neural Research in the Next Two Decades".

Copyright © 1996 - 2024 Peter B.L. Meijer