Peter Meijer
Thu Jun 11 13:41:46 EST 1998

Gernot S Doetsch wrote:
> ...
> Well, then the question would be whether these higher
> cortical areas also have a plastic capability, and how to tell these
> neurones to re-learn their input. So far as I know, there are no studies
> on plasticity in higher cortical areas, and it could be the big challenge
> for the next decade.
> ...
> Another example:
> Braille readers and string instrument players have enlarged finger
> representations. This means that neurons not responding to finger
> stimulation at birth do so after some months or years of training. So I
> would say they have moved their function from non-finger to finger
> representation and processing.

I have followed the recent thread about neural plasticity
with increasing interest, particularly where it touches upon 
questions of cross-modal plasticity. My own major reason 
for being interested is that it is nowadays technically 
feasible to design sensory substitution devices. My vision 
substitution approach is a specific example:

   The vOICe Learning Edition

which proposes an experimental auditory display for the blind.
[Of course there are alternative approaches from other 
researchers, for instance based on tactile rather than 
auditory displays. I just happen to focus my interest on 
auditory displays for a variety of reasons.]

However, the fact that something is technically feasible 
need not imply that humans can indeed learn to fully exploit 
the cross-modal mapping technology. Even after taking known
additional psychophysical limitations into account, many 
questions remain about where the major bottlenecks are in 
the information stream from camera to sounds to human 
perception and mental interpretation of these sounds. 
Preliminary experiments with computer models to analyze 
peripheral auditory processing seem to indicate that much 
image information can indeed be conveyed via an auditory 
display, as illustrated at my web page.
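To make the camera-to-sound idea concrete, here is a minimal sketch of one possible image-to-soundscape mapping, in the general spirit of such auditory displays but not necessarily the exact scheme used by The vOICe: image columns are scanned left to right over a fixed scan time, each row is assigned a sine frequency (top rows high-pitched), and pixel brightness sets that sine's loudness. All parameters below (sample rate, frequency range, scan duration) are illustrative assumptions.

```python
import math

def image_to_soundscape(image, duration=1.0, rate=8000,
                        f_lo=400.0, f_hi=3200.0):
    """Map a grayscale image (list of rows, pixel values 0..1) to
    a list of audio samples.

    Columns are scanned left to right over `duration` seconds; each
    row maps to a sine frequency (top row = highest pitch) and the
    pixel brightness sets that sine's amplitude. The frequency range
    and scan time here are illustrative, not The vOICe's actual values.
    """
    n_rows = len(image)
    n_cols = len(image[0])
    samples_per_col = int(duration * rate / n_cols)
    # One frequency per row, spaced exponentially, top row highest.
    freqs = [f_lo * (f_hi / f_lo) ** (1 - r / max(n_rows - 1, 1))
             for r in range(n_rows)]
    out = []
    for c in range(n_cols):
        for s in range(samples_per_col):
            t = (c * samples_per_col + s) / rate
            v = sum(image[r][c] * math.sin(2 * math.pi * freqs[r] * t)
                    for r in range(n_rows))
            out.append(v / n_rows)  # keep samples roughly in [-1, 1]
    return out

# A 4x4 image with a bright diagonal: a tone stepping down in pitch
# as the scan moves left to right.
img = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
audio = image_to_soundscape(img)
```

Even this toy version shows where bandwidth questions arise: the number of distinguishable rows and columns per scan is bounded by frequency and temporal resolution of hearing, well before any cortical processing limits come into play.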

Yet neural processing beyond the peripheral stages appears
to be largely unknown territory. I know of some plasticity-related
publications, as referenced near the bottom of my
abstract.htm page (e.g., Langner, Paus, Cohen, Rauschecker).
Nevertheless, I wondered if neuroscience could say more about
(the limitations of) cross-modal perception, plasticity, the
bandwidth of relevant neural pathways, etc. Could one define 
sensible fMRI experiments to investigate this in a systematic 
manner in order to obtain more objective information? Using
TMS perhaps? Or measuring brain activity in the visual cortex
induced by auditory input, both before and after training for 
understanding the visual "soundscapes"? I'd welcome further 
input and insights from neuroscientists! 

Peter Meijer

More information about the Neur-sci mailing list