LO is a meta-modal operator for shape

An fMRI study using auditory-to-visual sensory substitution


A poster based on use of The vOICe was presented at

HBM 2006, the 12th Annual Meeting of the Organization for Human Brain Mapping (OHBM)

June 11-15, 2006, Florence, Italy.

First author: Amir Amedi.
Poster #415 TH-AM - Thursday, June 15, 2006, 11:30 AM, poster session 26.

Abstract
Authors

Amir Amedi 1, Felix Bermpohl 1, Joan Camprodon 1, Lotfi Merabet 1, Peter Meijer 2 and Alvaro Pascual-Leone 1.

1 Center for Non-Invasive Magnetic Brain Stimulation, Dept. of Neurology, BIDMC, Harvard Medical School, MA 02215, USA.
2 Philips Research, Eindhoven, The Netherlands. [Update: no longer working at Philips.]

We have previously hypothesized that part of the lateral occipital complex (LOC / LOtv) processes and represents the geometrical shape of objects. This hypothesis was based on the finding that LO is robustly activated by visual and tactile object recognition, sensory modalities in which shape information is rich (Amedi et al. 2002). Furthermore, LO is only negligibly activated by "auditory objects" (i.e. by the typical sounds made by objects), for which recognition is generally based on associations rather than on shape information. If LOC / LOtv is indeed an operator for shape, we hypothesized that it would be recruited for processing auditory information whenever that information is (somehow) rich in shape content. Visual-to-auditory sensory substitution devices (SSDs) provide a unique opportunity to study such a situation systematically: visual images are captured by a camera and then transformed, according to an arbitrary algorithm, into soundscapes in which shape information is conveyed by sound. We report here fMRI results in two blind expert users (one congenitally blind and one late blind) of a visual-to-auditory SSD called "The vOICe". This system encodes a visual scene following three simple rules: 1) the vertical axis is represented by sound frequency; 2) the horizontal axis is represented by time and stereo panning; and 3) brightness is encoded by loudness (Meijer, 1992). (A simple code sketch of this mapping is given after the abstract.) In the current experiment we compared object recognition based on information arriving from different sources. The design included object recognition by touch (object palpation), by audition (typical sounds made by objects), and by visual-to-auditory transformation (visual objects encoded into soundscapes), together with corresponding low-level sensory controls.

Consistent with prior results, LOC / LOtv is robustly activated by object palpation, but shows significantly less, indeed only minimal, activation in response to the typical sounds made by objects (though this activation in the blind subjects is greater than that seen in sighted subjects). Critically, this area shows robust activation to the soundscapes of visual-to-auditory transformed objects (the 'The vOICe' condition), activation that is as prominent as that seen for object palpation and comparable to what would be expected for visual object processing in sighted subjects.

These results support the meta-modal theory of the brain (Hamilton and Pascual-Leone 2000), in which cortical regions are defined by the computation they perform rather than by their dominant sensory input modality. In this case we show that LO is recruited for shape processing by stimuli that encode shape through an artificial mapping.
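
For illustration, the three encoding rules described in the abstract can be sketched in a few lines of Python. This is a minimal, hypothetical sketch and not the actual vOICe implementation; parameters such as the one-second scan duration and the 500-5000 Hz frequency range are assumptions chosen only for this example.

    # Minimal sketch of a vOICe-style image-to-soundscape mapping (illustrative only;
    # scan duration, frequency range and panning law are assumptions, not taken
    # from the actual vOICe implementation).
    import numpy as np

    def image_to_soundscape(image, scan_duration=1.0, sample_rate=44100,
                            f_low=500.0, f_high=5000.0):
        """Convert a 2-D grayscale image (rows x cols, values in 0..1) into a stereo
        soundscape: vertical position -> pitch, horizontal position -> time and
        stereo panning, brightness -> loudness."""
        rows, cols = image.shape
        samples_per_col = int(scan_duration * sample_rate / cols)
        t = np.arange(samples_per_col) / sample_rate

        # Top image row gets the highest frequency, bottom row the lowest.
        freqs = np.linspace(f_high, f_low, rows)

        left, right = [], []
        for c in range(cols):
            # Sum one sinusoid per row, weighted by pixel brightness (loudness).
            column = sum(image[r, c] * np.sin(2 * np.pi * freqs[r] * t)
                         for r in range(rows))
            # Pan from left (start of scan) to right (end of scan).
            pan = c / max(cols - 1, 1)
            left.append((1.0 - pan) * column)
            right.append(pan * column)

        stereo = np.stack([np.concatenate(left), np.concatenate(right)], axis=1)
        peak = np.abs(stereo).max()
        return stereo / peak if peak > 0 else stereo   # normalize to [-1, 1]

    # Example: a bright diagonal line on a dark background yields a one-second
    # stereo sweep whose pitch falls as the scan moves from left to right.
    img = np.zeros((16, 16))
    np.fill_diagonal(img, 1.0)
    soundscape = image_to_soundscape(img)   # shape: (samples, 2)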


Poster references

A. Amedi, G. Jacobson, T. Hendler, R. Malach and E. Zohary, "Convergence of visual and tactile shape processing in the human lateral occipital complex," Cerebral Cortex, Vol. 12, No. 11 (November 2002), pp. 1202-1212. Available online (PDF file).

A. Amedi, R. Malach, T. Hendler, S. Peled and E. Zohary, "Visuo-haptic object-related activation in the ventral visual pathway," Nature Neuroscience, Vol. 4, No. 3 (March 2001), pp. 324-330. Available online (PDF file).

P. B. L. Meijer, "An Experimental System for Auditory Image Representations," IEEE Transactions on Biomedical Engineering, Vol. 39, No. 2 (February 1992), pp. 112-121. Reprinted in the 1993 IMIA Yearbook of Medical Informatics, pp. 291-300.

Related is the conference presentation at the 11th Annual Meeting of the Organization for Human Brain Mapping (HBM 2005) in Toronto, Canada, June 12-16, 2005, titled "Neural correlates of visual-to-auditory sensory substitution in proficient blind users".

Other presentations related to sensory substitution at HBM 2006 include:

Tactile motion discrimination through the tongue in blindness: An fMRI study
Isabelle Matteau, Ron Kupers, Christian Casanova and Maurice Ptito

Visual-like perception through an auditory-for-visual substitution prosthesis
Colline C. Poirier, Anne G. De Volder and Christian Scheiber

Note: The vOICe technology is being explored and developed under the Open Innovation paradigm together with R&D partners around the world.

Copyright © 1996 - 2024 Peter B.L. Meijer