                    | BrainPort vision device | AuxDeco FSRS       | The vOICe
                    | (tongue display)        | (forehead display) | (auditory display)
Cost estimate       | ~$10,000                | ~$15,000           | < $500 *
Pixels (electrodes) | ~400                    | 512                | > 10,000
Shades of gray      | 2-4 ?                   | 2-4 ?              | > 16
Availability        | several countries       | Japan              | global

* cost indication for a setup based on a Windows netbook PC with camera glasses
YouTube video clips of training during SBIR phase I evaluation study on The vOICe by MetaModal LLC
In the remainder of this page, we will discuss options for vision substitution (also called visual substitution or sight substitution): sensory substitution targeting (total) blindness. Deafness lies outside the scope of this site, although much progress has been made in recent years with the development of cochlear implants. The cochlear implant is a prosthetic sensory bypass rather than a sensory substitution device, but the fact that it maps acoustical inputs (one physical modality) directly onto neural stimuli (another physical modality) raises issues and problems similar to those in sensory substitution devices that map one sensory modality to another. Sensory substitution approaches may also play a role in other sensory prosthetics, for instance for sensory augmentation or cognitive augmentation through neuroelectronics. The website of Dr. Timothy Hain (dizziness-and-hearing.com) gives a number of literature references on the use of auditory and other sensory feedback for treating bilateral vestibular loss.
In the 19th century, Louis Braille (1809-1852) developed
the well-known raised dot code now named after him. This tactile representation was later supplemented
by blindness aids like the mechanical Braille printer and typewriter (Brailler), again followed
by more versatile electronic implementations.
In the early 1960s, the Optacon (OPtical to TActile CONverter) was developed by
John Linvill as a reading aid presenting printed characters as vibrotactile
patterns to a fingertip.
In another attempt to circumvent the difficulty of reading Braille, the Kurzweil
reading machine (KRM), which scans printed text and reads it aloud via synthetic
speech, was developed around 1975 by Ray Kurzweil.
Screen magnifiers (CCTVs), often including contrast enhancement, were developed as
visual aids for the partially sighted.
Obstacle detectors are intended to improve orientation and mobility
by giving spatial information about the immediate neighbourhood relevant
to navigation while walking. Electronic devices for that purpose are
therefore also termed electronic travel aids (ETAs) or blind mobility
aids.
The long cane is a very simple but effective mechanical device for
probing the environment for obstacles through touch, but also to some
degree through acoustical reflections resulting from tapping.
Its range for providing touch feedback is only a few feet,
and it does not normally detect overhanging obstacles.
The guide dog has helped to alleviate some of the limitations
of the long cane for (some) blind travellers, but w.r.t.
general orientation and navigation it still provides little
information.
Tony Heyes
has improved upon the Nottingham obstacle detector (NOD), and his device is known
as the Sonic Pathfinder (1984). This is a head-mounted pulse-echo sonar system
comprising three receivers and two transmitters, controlled by a microcomputer.
Other sonar devices for the blind include the Miniguide and the UltraCane.
The early ETAs in particular tried to extend the limited range and angular coverage
of the long cane either acoustically (ultrasonically) or optically (with laser light).
A German laser cane (German: Laser-Langstock) is under development at
VISTAC GmbH.
The Mowat sensor and Nottingham
obstacle detector (NOD) are both hand-held ultrasonic torches,
developed around 1980, that map ultrasonic reflections into
vibrotactile or auditory signals, respectively. The Mowat and NOD
devices are no longer in production.

Leslie Kay, Tony Heyes and Peter Meijer (left) wearing Tony Heyes' (right) Sonic Pathfinder. May 31, 1997.
The Sonic Pathfinder does not give information about surface texture, and normally the auditory
display indicates only the nearest object, which is why it should be classified
as an obstacle detector rather than as an imaging device. Heyes' approach is
rather different from Kay's in that the Sonic Pathfinder deliberately supplies
only the minimum but most relevant information for travel needed by the user,
whereas Kay strives for more information-rich sonar-based displays. Heyes
argues, for instance, that if some object moves away from the user, the user
does not need to know this in terms of safe and efficient travel, and
there may also be less confusion while minimizing interference with hearing
normal environmental sounds (e.g., traffic). The Sonic Pathfinder uses brief
tones on a musical scale to denote distance, with pitch descending when
approaching an object. Perception of objects on the left and on the right is
further supported by tones for the corresponding ear. Two additional,
off-center, sonar beams are used to detect objects on the left or right.
The Sonic Pathfinder is used in conjunction with the long cane.
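The distance and direction coding just described can be sketched in a few lines. This is a hedged illustration, not the Pathfinder's actual firmware: the scale, note frequencies and maximum range are assumptions.

```python
# Illustrative sketch of a Sonic Pathfinder-style display (assumed
# parameters, not the actual device): distance maps onto notes of a
# musical scale, with lower notes for nearer objects so that pitch
# descends as an obstacle is approached; bearing selects the ear.

C_MAJOR_HZ = [262, 294, 330, 349, 392, 440, 494, 523]  # C4..C5, assumed scale
MAX_RANGE_M = 4.0                                      # assumed sonar range

def distance_to_tone(distance_m, bearing):
    """Return (frequency_hz, ear) for one sonar echo.

    bearing is 'left', 'right' or 'center', matching the central beam
    plus the two off-center beams described in the text.
    """
    d = max(0.0, min(distance_m, MAX_RANGE_M))
    step = int(d / MAX_RANGE_M * (len(C_MAJOR_HZ) - 1))
    ear = bearing if bearing in ('left', 'right') else 'both'
    return C_MAJOR_HZ[step], ear
```

For example, an object drifting from 3 m to 1 m ahead of the user would step down the scale while staying in both ears.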
Tony Heyes (left) explaining the Pathfinder to Hester. May 31, 1997.
There are several distinct sensory substitution approaches heading towards medium resolution environmental imaging:
In the experimental cortical implant approach, an array of electrodes is placed
in direct contact with the visual cortex. This is also called a neuroprosthesis
or brain implant. The approach was pioneered by Giles Brindley in the 1960s and by
William Dobelle (1941-2004) et al. in the early 1970s. Research into cortical
implants is also pursued by several other groups.
Tongue display
Forehead Retina System
Whereas the former two approaches were (implicitly) using visual input, research on auditory displays currently considers two options for environmental input to create a visual prosthesis (through an Auditory Vision Substitution System, or AVSS):
Leslie Kay
(1922-2020) has done much research on sonar devices, leading to the ``sonic glasses,''
or Sonicguide, around 1974. More recently, he has improved the angular resolution of
his Binaural Sensory Aid (BSA) system using a third narrow beam transmitter for
creating an additional monaural signal. This newer system was first called the
Trisensor, but is now known as the KASPA system (Kay's Advanced Spatial Perception Aid).
KASPA represents object distance by pitch, but also represents surface texture through
timbre. It makes use of echolocation through frequency-modulated (FM) signals. The
improved, but still modest, resolution probably positions Kay's work somewhere between
obstacle detection and environmental imaging.
The best angular resolution is about one degree in the
horizontal plane (azimuth detection) for the central beam, which is quite good,
but vertical resolution (elevation) is poor, making the ``view'' somewhat similar
to constrained vision with binocular viewing through a narrow horizontal slit.
Note: It remains difficult to settle on a good measure for equivalent image
resolution as offered by sonar approaches, because sonar is not based on mapping
an image matrix to another sensory modality, but on the implicit preservation
of some degree of detailed environmental information in complex ultrasonic
interference and acoustic delay patterns, thus supporting spatial processing.
Nevertheless, Kay's work does show some features (resolution, texture, parallax)
that support its classification as an imaging device, or vision substitute.
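Kay's devices are continuous-transmission frequency-modulated (CTFM) sonars: the transmitted frequency sweeps continuously, and mixing each echo with the ongoing sweep yields a beat (difference) frequency proportional to range, which the listener hears as pitch. The arithmetic can be sketched as follows, with assumed sweep parameters rather than KASPA's actual ones:

```python
# CTFM sonar range-to-pitch arithmetic (illustrative parameters):
# an echo from range d arrives after t = 2*d/c, so mixing it with the
# still-sweeping transmit signal gives a beat frequency
#     f_beat = sweep_rate * 2*d/c
# i.e. farther objects are heard as higher pitches.

SPEED_OF_SOUND_M_S = 343.0     # speed of sound in air
SWEEP_RATE_HZ_PER_S = 50_000   # assumed: 50 kHz of sweep covered per second

def beat_frequency_hz(range_m):
    """Audible beat frequency for an object at the given range."""
    round_trip_s = 2.0 * range_m / SPEED_OF_SOUND_M_S
    return SWEEP_RATE_HZ_PER_S * round_trip_s
```

With these assumed numbers, an object at 1 m maps to roughly 292 Hz and one at 2 m to roughly 583 Hz; a textured surface returns a cluster of slightly different beat frequencies, heard as timbre rather than a single tone.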
Peter Meijer developed The vOICe, a device that maps live camera images into corresponding sounds.
Pending psychophysical and neuroscientific experiments on the role of mental imagery
and visualization in auditory synthetic vision, images have been reconstructed from The vOICe
sounds through computer analyses in order to prove the preservation of a significant amount
of image information in the sounds. The vOICe mapping thus creates a near-isomorphism between
vision and hearing for a limited resolution and frame rate. An example reconstruction is shown below:
The vOICe in IEEE Spectrum
Eduverse 2008 presentation by Peter Meijer
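The vOICe mapping itself (described in the Meijer 1992 paper listed under the literature below) scans each image left to right, one column per time slice; within a column, every pixel contributes a sinusoid whose frequency encodes its height (the top of the image sounds highest) and whose amplitude encodes its brightness. A minimal, slow reference sketch, with assumed sample rate, scan timing and frequency range:

```python
import math

# Minimal vOICe-style image-to-sound sketch (assumed parameters, not
# the actual product's): columns become time slices; row -> frequency
# (top = high), brightness -> amplitude.

SAMPLE_RATE = 8000                 # audio samples per second (assumed)
COLUMN_DURATION_S = 0.02           # seconds of sound per image column (assumed)
F_LO_HZ, F_HI_HZ = 500.0, 5000.0   # pitch range, bottom to top row (assumed)

def image_to_sound(image):
    """image: list of rows (top row first), brightness values in 0..1,
    at least two rows. Returns audio samples scanning left to right."""
    rows, cols = len(image), len(image[0])
    # Exponentially spaced row frequencies, highest for the top row.
    freqs = [F_LO_HZ * (F_HI_HZ / F_LO_HZ) ** ((rows - 1 - r) / (rows - 1))
             for r in range(rows)]
    n = int(SAMPLE_RATE * COLUMN_DURATION_S)  # samples per column
    samples = []
    for c in range(cols):
        for i in range(n):
            t = (c * n + i) / SAMPLE_RATE
            s = sum(image[r][c] * math.sin(2 * math.pi * freqs[r] * t)
                    for r in range(rows))
            samples.append(s / rows)  # normalize to keep |s| <= 1
    return samples
```

Because every bright pixel leaves an identifiable sinusoid in the output, the mapping is largely invertible, which is what makes the image reconstructions mentioned above possible.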
So far, the psychophysical and psychological research communities have not found conclusive answers w.r.t. the potential of this approach for the blind. The relevant limitations in human auditory perception and learning abilities, for comprehension and for the development of visual qualia (sensations), therefore remain largely unknown or anecdotal, although the training effort is expected to be significant and to involve perceptual recalibration for accurate sensorimotor feedback. One of the key research questions is to what extent the use of a sensory substitution system can not only provide synthetic vision in a functional sense, for extended situational awareness through active sensing, but also lead to visual sensations through forms of induced artificial synesthesia. Also see the auditory model page for computer simulations of human auditory processing of The vOICe sounds, and the related projects page for other projects in which image-to-sound mappings similar or related to The vOICe mapping are employed.
Literature and conference presentations:
The New York Times on sensory substitution:
The New Scientist on sensory substitution:
The New York Times on The vOICe:
On The vOICe approach (camera input auditory display research):
Meijer, P.B.L., ``Cross-Modal Sensory Streams,'' invited presentation and demonstration
at SIGGRAPH 98 in Orlando, Florida, USA, July 19-24, 1998.
Conference Abstracts and Applications, ACM SIGGRAPH 98, 1998, p. 184.
Meijer, P.B.L., ``Seeing with Sound for the Blind: Is it Vision?,''
invited presentation at the Tucson 2002 conference on Consciousness in Tucson,
Arizona, USA, Monday April 8, 2002.
Fletcher, P.D., ``Seeing with Sound: A Journey into Sight,''
invited presentation at the Tucson 2002 conference on Consciousness in Tucson,
Arizona, USA, Monday April 8, 2002.
Amedi, A., Camprodon, J., Merabet, L., Meijer, P. and Pascual-Leone, A.,
``Towards closing the gap between visual neuroprostheses and sight restoration:
Insights from studying vision, cross-modal plasticity and sensory substitution,''
[Abstract]. Journal of Vision, Vol. 6, No. 13, 12a, 2006. Abstract available
online.
Amedi, A., Stern, W., Camprodon, J. A., Bermpohl, F., Merabet, L., Rotman, S., Hemond, C., Meijer, P. and Pascual-Leone, A.,
``Shape conveyed by visual-to-auditory sensory substitution activates the lateral occipital complex,''
Nature Neuroscience, Vol. 10, No. 6, pp. 687 - 689, June 2007.
Proulx, M. J., Stoerig, P., Ludowig, E. and Knoll, I.,
``Seeing 'Where' through the Ears: Effects of Learning-by-Doing and Long-Term Sensory Deprivation on Localization Based on Image-to-Sound Substitution,''
PLoS ONE, Vol. 3, No. 3, March 2008, e1840.
Kim, J.-K. and Zatorre, R. J., ``Generalized learning of visual-to-auditory substitution in sighted individuals,''
Brain Research, Vol. 1242, pp. 263-275, 2008. Available
online (PDF file).
Merabet, L. B., Battelli, L., Obretenova, S., Maguire, S., Meijer P. and Pascual-Leone A.,
``Functional recruitment of visual cortex for sound encoded object identification in the blind,''
Neuroreport, Vol. 20, No. 2, pp. 132-138, January 2009.
Striem-Amit, E., ``Neuroplasticity in the blind and sensory substitution for vision,'' PhD thesis, 2013 (PDF file).
``New Tools to Help Patients Reclaim Damaged Senses,''
by Sandra Blakeslee for The New York Times, November 23, 2004.
``In another approach, Dr. Peter Meijer, a Dutch scientist
working independently, has developed a system for blind
people to see with their ears. A small device converts
signals from a video camera into sound patterns delivered by
stereo headset to the ears. Changes in frequency connote up
or down. Changes in pixel brightness are sensed as louder or
softer sounds.''
``Senses special: The feeling of colour (subscription required),''
by Helen Phillips for the New Scientist, January 29, 2005, pp. 40-43.
``In preliminary tests with a similar device designed by
engineer Peter Meijer from Eindhoven in the Netherlands, the
signals took a little getting used to. But after a couple of
hours of feedback either from touching or being told what
they were viewing, people were able to recognise objects by
their sound. They could tell plants from statues and crosses
from circles. But they weren't fooled into thinking they
were seeing. O'Regan's prediction is that the more they can
make the sound information follow the rules of visual
images, the more like seeing it will feel.''
``Seeing With Your Ears,''
by Alison Motluk for The New York Times, December 11, 2005.
Meijer, P.B.L., ``An Experimental System for Auditory Image Representations,''
IEEE Transactions on Biomedical Engineering, Vol. 39, No. 2, pp. 112-121, Feb 1992. Reprinted
in the 1993 IMIA Yearbook of Medical Informatics, pp. 291-300.
For other useful literature check out recent publications about
sensory substitution.
In relation to audio biofeedback (ABF), balance problems and visual impairment, see
Dozza, M., Chiari, L., Chan, B., Rocchi, L., Horak, F. B. and Cappello, A.,
``Influence of a Portable Audio-Biofeedback Device on Structural Properties of Postural Sway,''
Journal of Neuroengineering and Rehabilitation, Vol. 2, No. 13, May 2005. Available
online (PDF file).
Growing military interest in sensory substitution is implied in the 2008 National Research
Council (NRC) report for the US Defense Intelligence Agency, titled
"Emerging Cognitive Neuroscience and Related Technologies",
which refers to both The vOICe (Motluk, 2005) and tactile displays (Bach-y-Rita and others).
This report was authored by the "Committee on Military and Intelligence Methodology for Emergent
Neurophysiological and Cognitive/Neural Research in the Next Two Decades".