September 1-5, 2008, Bodrum, Turkey.
Amir Amedi 1, Ella Striem 1, Uri Hertz 1, William Stern 2, Peter Meijer 3, Lotfi Merabet 2, and Alvaro Pascual-Leone 2.
1 The Hebrew University of Jerusalem, Jerusalem, Israel.
The talk will present fMRI and behavioral experiments on auditory-visual integration in humans. It will focus on integration in sighted individuals but also in a sight-restoration setting, examining the effects of learning and brain plasticity. New findings regarding the nature of sensory representations for dynamic stimuli, ranging from pure tones to complex, natural object sounds, will be presented.

First, I will highlight the use of sensory substitution devices (SSDs) in the context of blindness. In SSDs, visual information captured by an artificial receptor is delivered to the brain as non-visual sensory information. Using a visual-to-auditory SSD called "The vOICe", we find that blind users achieve successful performance on object recognition and localization tasks, with specific recruitment of ventral and dorsal 'visual' structures. Comparable recruitment was also observed in sighted participants learning to use this device, but not in sighted participants learning arbitrary associations between sounds and object identity. These results suggest that "The vOICe" can be useful for blind individuals' daily activities, and that it also has the potential to 'guide' the visual cortex in interpreting visual information arriving from a prosthesis.

The three topographical senses (vision, audition and touch) are characterized by a topographical mapping of the sensory world onto primary and secondary brain areas. In such topographical maps, adjacent neurons represent adjacent sensory building blocks (e.g. visual field position, tone frequency and body parts) in each sense. Using fMRI, we applied continuous, periodic sensory stimulation to detect further topographically sensitive areas in the auditory modality, combining phase-locking Fourier techniques with a spherical cortex-based alignment approach. Using these methods, I will report in the second part of the talk the preliminary finding of multiple novel tonotopic maps in humans, organized in mirror-symmetric representations stretching from primary auditory cortex to occipito-temporal and parietal-temporal cortex. These maps are organized in parallel, mirror-symmetric bands, as found in the visual cortex. Our results suggest that mirror-symmetric topographical mapping may be a fundamental principle of mapping in both vision and audition, and might be a more common characteristic of associative cortex than previously suspected. Such maps, some of them located at the border of temporal, occipital and parietal cortex, can serve as a basis for the audio-visual transformations required by SSDs and reported in the first part of the talk.

This work has been supported by an R21-EY0116168 grant (to APL), an International Human Frontiers Science Program Organization LTF grant, and by an EU FP7 reintegration grant (to AA).
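The phase-locking Fourier analysis mentioned above is, in essence, the standard "traveling-wave" phase-encoding approach used in retinotopic and tonotopic mapping: the stimulus sweeps periodically through the mapped dimension, and each voxel's response phase at the sweep frequency indexes its preferred value along that dimension. The Python sketch below illustrates the idea on a simulated voxel time course; the TR, cycle length, and number of cycles are illustrative assumptions, not the parameters of the study described in the abstract.

import numpy as np

# Illustrative acquisition/stimulation parameters (assumptions, not the study's values)
TR = 2.0          # repetition time in seconds
CYCLE_S = 64.0    # duration of one sweep through the tone range, in seconds
N_CYCLES = 8      # number of stimulation cycles in the run
n_vols = int(N_CYCLES * CYCLE_S / TR)
t = np.arange(n_vols) * TR

def preferred_phase(ts, n_cycles):
    """Amplitude, response phase, and SNR at the stimulation frequency
    for one voxel time course (phase-encoded / traveling-wave analysis)."""
    ts = ts - ts.mean()
    spectrum = np.fft.rfft(ts)
    amp = np.abs(spectrum)
    stim_bin = n_cycles  # the stimulation frequency falls in this FFT bin
    # SNR: amplitude at the stimulation frequency vs. mean amplitude elsewhere
    noise_bins = np.delete(np.arange(1, len(amp)), stim_bin - 1)
    snr = amp[stim_bin] / amp[noise_bins].mean()
    # Negative of the FFT angle gives the response delay phase for a cosine-like response
    phase = (-np.angle(spectrum[stim_bin])) % (2 * np.pi)
    return amp[stim_bin], phase, snr

# Toy voxel: responds when the sweep passes "its" tone, 1/4 of the way through each cycle
rng = np.random.default_rng(0)
true_phase = 0.25 * 2 * np.pi
signal = np.cos(2 * np.pi * t / CYCLE_S - true_phase)
voxel = signal + 0.5 * rng.standard_normal(n_vols)

amp, phase, snr = preferred_phase(voxel, N_CYCLES)
# The recovered phase indexes the voxel's preferred position along the swept
# dimension (tone frequency here; eccentricity or polar angle in retinotopy).
print(f"phase = {phase:.2f} rad (true {true_phase:.2f}), SNR = {snr:.1f}")

In the actual mapping analyses, this per-voxel phase is computed on the cortical surface and voxels are colour-coded by phase, so that reversals in the phase gradient reveal the mirror-symmetric map borders referred to in the abstract.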
See also the ICON X presentation abstract of an oral presentation by Lotfi Merabet et al. titled "Retinotopic visual cortex mapping using a visual-to-auditory sensory-substitution device".
Note: The vOICe technology is being explored and developed under the Open Innovation paradigm together with