Multisensory Processing


Multimodal Auditory Cortical Areas

Examples in Chapter 2, Brain Plasticity and Perceptual Learning, show that perceptual learning is more efficient when the training is multisensory. Here we review how combining information from different senses is essential for successful interaction with real-life situations. It is often believed that this integration occurs at later processing stages and mostly in higher association cortices ( ), whereas other studies suggest that sensory convergence may occur in primary sensory cortex ( ). However, the point of convergence may even be subcortical ( ). noted that: “Converging anatomical and physiological evidence indicates that cells within the inferior colliculus (IC) are sensitive to visual, oculomotor, eye position, and somatosensory information as well as to signals relating to behavioral context and reward. … The presence of non-auditory signals throughout all subdivisions of the IC—including both ascending and descending regions—provides a point of entry for these signals to reach auditory processing at all stages from brainstem to cortex.”

In this chapter, we initially limit ourselves to the neocortex, but pick up the subcortical areas in Section 3.3 by reviewing somato-auditory interactions along the entire auditory pathway.

Animal Data

The auditory cortex of nonhuman primates consists of 13 interconnected areas distributed across three major regions on the superior temporal gyrus: core, belt, and parabelt. traced both corticocortical and thalamocortical connections in marmosets and macaques. In addition to those with core auditory cortex, the cortical connections of the belt areas CM and CL included somatosensory (retroinsular, Ri) and multisensory (temporal parietal, Tpt) areas. Thalamic inputs included the medial geniculate complex and several multisensory nuclei (suprageniculate, posterior, limitans, medial pulvinar), but not the ventroposterior complex. The core (A1, R) and rostromedial areas of auditory cortex have only sparse multisensory connections ( Fig. 3.1 ). found that the Ri area is one of the principal sources of somatosensory input to the caudal belt, while multisensory regions of cortex and thalamus may also contribute. Recent studies suggest that sensory convergence can already occur in primary sensory cortices. A good example of early convergence appears to be the auditory cortex, in which auditory-evoked activity can be modulated by visual and tactile stimulation. found that both primary (core) and nonprimary (belt) auditory fields in monkeys can be activated by the mere presentation of visual scenes. Audiovisual (AV) convergence was restricted to caudal fields, most prominently the primary auditory cortex and caudal belt fields, and continued into the auditory parabelt and the superior temporal sulcus (STS) ( Fig. 3.1 ). The same fields exhibited enhancement of auditory activation by visual stimulation and showed stronger enhancement for less effective stimuli, two characteristics of sensory integration. found essentially the same results in awake and anesthetized monkeys, indicating that attention does not play a role. More extensive information can be found in the work of .

Figure 3.1, Summary of thalamocortical inputs to A1, CM, CL, Tpt, and Ri in this study. Heavier arrows indicate denser projections. Auditory areas receive the densest inputs from the MGC and variable projections from the multisensory nuclei. Tpt receives the densest multisensory inputs and modest auditory projections. Ri has uncertain auditory inputs and stronger inputs from multisensory and somatosensory nuclei. A1, primary auditory cortex; CL, caudolateral belt area; CM, caudomedial belt area; MGC, medial geniculate complex; Lim, limitans nucleus; PM, medial pulvinar; Po, posterior nucleus; Ri, retroinsular area; Sg, suprageniculate nucleus; Tpt, temporoparietal area; VPI, ventroposterior nucleus, inferior division.

Human Findings

noted that perceptual objects often have both a visual and an auditory component, i.e., they can be seen and heard, and this information arrives in cortex simultaneously through distinct sensory channels. The cross-modal object features are linked by reference to the primary sensory cortices. The binding of auditory and visual components that have become familiar through continued exposure is referred to as semantic multisensory integration. recorded the spatiotemporal patterns underlying multisensory processing at multiple cortical stages, using magnetoencephalography recordings of meaningfully related cross-modal and unimodal stimuli. Already at latencies of 100 ms after stimulus onset, posterior parietal brain regions responded preferentially to cross-modal stimuli, irrespective of task instructions or of the relatedness between the auditory and visual components. recalled that: “In audiovisual speech studies, neural activity is consistently reported in posterior superior temporal sulcus (pSTS) and this site has been implicated in multimodal integration.” They then used fMRI to investigate how visual speech influences activity in auditory cortex above and beyond its response to auditory speech. Subjects were presented with auditory speech with and without congruent visual input. Congruent visual speech increased the BOLD activity in auditory cortex, indicating early multisensory processing.

Hearing Loss Affects Multisensory Representation in Animals

Cross-modal reorganization may occur following damage to mature sensory systems ( ) and may be the neural substrate that allows compensatory visual function. tested this hypothesis using a battery of visual psychophysical tasks and found that congenitally deaf cats, compared with hearing cats, have superior localization in the peripheral visual field and lower visual movement detection thresholds. In the deaf cats, selective reversible deactivation of the posterior auditory field (PAF) by cooling eliminated the superior visual localization abilities, whereas deactivation of the dorsal zone (DZ) of auditory cortex eliminated the superior visual motion detection. Thus, the different perceptual visual improvements depended on specific and different subregions of auditory cortex. Their data suggested that: “The improved localization of visual stimuli in deaf cats was eliminated by deactivating area PAF, whereas the enhanced sensitivity to visual motion was blocked by disabling area DZ. Because neither cortical area influenced visual processing in hearing cats, these data indicate both that cross-modal reorganization occurred in the PAF and DZ and that the reorganization was functional and highly specific” ( Fig. 3.2 ).

Figure 3.2, Summary diagram illustrating the double-dissociation of visual functions in auditory cortex of the deaf cat. Bilateral deactivation of PAF, but not DZ, resulted in the loss of enhanced visual localization in the far periphery. On the other hand, bilateral deactivation of DZ, but not PAF, resulted in higher movement detection thresholds. The lower panel shows a lateral view of the cat cerebrum highlighting the locations of PAF and DZ. PAF, posterior auditory field; DZ, dorsal zone.

Extending the results on field DZ, found that: “Overall, the pattern of cortical projections to DZ was similar in both hearing and deafened animals. However, there was a progressive increase in projection strength among hearing and late- and early-deafened cats from an extrastriate visual cortical region known to be involved in the processing of visual motion, the posterolateral lateral suprasylvian area (PLLS).” This suggested that the increase in projection strength from PLLS is larger for early-deafened than for late-deafened animals.

For more anterior auditory cortical areas, found that in hearing cats the cortical auditory field of the anterior ectosylvian sulcus (FAES; Fig. 3.2 ) is largely responsive to acoustic stimulation, and its unilateral deactivation results in profound contralateral acoustic orienting deficits. They also found that recordings in the FAES of early-deafened adult cats revealed robust responses to visual stimulation in the contralateral visual field. A second group of early-deafened cats was trained to localize visual targets in a perimetry array. In these animals, cooling loops were surgically placed on the FAES to reversibly deactivate the region, which resulted in substantial contralateral visual orienting deficits. found that “crossmodal plasticity can substitute one sensory modality for another while maintaining the functional repertoire of the reorganized region.” then looked at the effects of deafening on the anterior auditory field (AAF) and observed that neurons in early-deafened AAF could not be activated by auditory stimulation. Instead, the majority (78%) were activated by somatosensory cues, while fewer (44%) were driven by visual stimulation. These results indicated to them that, “following postnatal deafness, both somatosensory and visual modalities participate in crossmodal reinnervation of the AAF.” In a study investigating multisensory projections to AAF following early- and late-onset deafness, injected a retrograde tracer into AAF. In early-deaf cats, ipsilateral neuronal labeling in visual and somatosensory cortices increased by 329% and 101%, respectively, whereas labeling in auditory areas was reduced by 36%. Less marked differences were observed in late-deaf cats. Conserved thalamocortical connectivity following early- and late-onset deafness suggested that thalamic inputs to AAF do not depend on acoustic experience. However, corticocortical connectivity changed considerably following early-onset deafness, demonstrating the importance of early acoustic experience for cortical development.

Human Findings Following Sensory Deprivation

Demonstration of experience-dependent plasticity has been provided by studies of sensory-deprived individuals (e.g., blind or deaf), showing that brain regions deprived of their natural inputs change their sensory tuning to support the processing of inputs coming from the spared senses ( Fig. 3.3 ).

Figure 3.3, Example of the massive activation elicited by sounds in the occipital cortex of blind adults, based on data from Collignon et al. (2011): It depicts the activation obtained when contrasting early-blind individuals (EB) versus sighted controls (SC) when both groups of participants were exposed to auditory stimuli only.

Despite the massively different cortical activation in early-blind subjects, had stated early on in a review paper: “Reports that visual areas V1 and V2 (the primary and secondary visual cortices, respectively) are recruited during auditory language processing in post-lingually deaf individuals after they receive cochlear implants (CIs) also indicate a link between plastic changes in the spared modality and deprivation of the auditory system.” According to , this finding seems to reflect the greater reliance of CI users on visual cues during the processing of oral language, rather than plasticity caused by deafness per se. noted that: “Psychophysical thresholds for visual contrast sensitivity, visual flicker, brightness discrimination, direction of motion and motion velocity are similar in deaf and hearing individuals.” This agrees with an early study ( ) based on standard audiometry and tactile thresholds that did not show any differences between blind and sighted individuals.

found that: “Individuals who became blind early in life, but not those who lost their sight later, can process sounds faster, localize sounds more accurately and have sharper auditory spatial tuning—as measured both behaviourally and using event-related potentials (ERPs)—than sighted individuals ( ).” This is supported by the finding that auditory and somatosensory ERPs over posterior cortical areas are larger and the processing is faster in blind than in sighted subjects, indicating that these areas are recruited by the remaining modalities ( ). This type of compensation might be mediated by enhanced recruitment of multimodal areas of cortex by the remaining modalities ( Fig. 3.3 ). Not surprisingly, the areas that show reorganization after sensory deprivation seem to be part of the cortical network that mediates cross-modal processing in normally sighted, hearing individuals ( ).

reported that in deaf individuals the fMRI signal change in Heschl’s gyrus, the site of human primary auditory cortex, was greater for somatosensory and bimodal stimuli than that found in hearing participants. Notably, visual responses in Heschl’s gyrus, although larger in deaf than in hearing persons, were smaller than those elicited by somatosensory stimulation. In the superior temporal cortex, however, the visual response was comparable to the somatosensory one. The same research group ( ), using individually defined primary auditory cortex areas, found that in deaf participants the fMRI signal change was larger for stimuli in the peripheral visual field than for perifoveal stimuli; in hearing people there was no such difference. In addition, auditory cortex in the deaf contains significantly less white matter and larger gray matter–white matter ratios than in hearing participants ( ).

In a recent review paper, reported that: “ recorded electrophysiological responses of early-deaf cochlear implant (CI) recipients elicited by visual motion. … The authors observed a negative correlation between the strength of crossmodal recruitment and scores to linguistic tests, thus ultimately suggesting that crossmodal takeover interferes with proper language recovery.” For an excellent illustration of this, see Fig. 3.4 . found smaller visual-evoked P100 amplitudes and reduced visual cortex activation in CI users compared with normal-hearing listeners. This suggests a visual takeover of the auditory cortex, and that such cross-modal plasticity may be one of the main sources of the high variability observed in CI outcomes. suggested that: “Crossmodal plasticity is ultimately and unavoidably maladaptive for optimal auditory recovery and that its presence should be considered as a negative predictor of successful auditory restoration through cochlear implantation.” This somewhat pessimistic statement has been weakened by recent animal studies using CIs in congenitally deaf white cats, where concluded that: “Cross-modal reorganization was less detrimental for neurosensory restoration than previously thought.”

Figure 3.4, (A) Visual cross-modal reorganization in children with CIs. Visual gradient stimulation was presented to a child with normal hearing and two children with CIs. Current density reconstructions of the cortical visual P2 component computed via sLORETA show activated regions as illustrated on sagittal MRI slices. Yellow regions reflect maximal cortical activation, while brown/black regions reflect the areas of least activation. Left panel: A 10-year-old child with normal hearing shows activation of higher-order occipital cortices in response to visual stimuli. Middle panel: An 8-year-old cochlear implanted child with a speech perception score of 96% on the Lexical Neighborhood Test shows similar activation of higher-order visual areas, such as middle occipital gyrus, fusiform gyrus, and lingual gyrus. Right panel: In contrast, a 7-year-old cochlear implanted child with a speech perception score of 67% on the Multisyllabic Lexical Neighborhood Test shows activation of occipital areas and superior temporal gyrus and medial temporal gyrus. (B) Somatosensory cross-modal reorganization in children with CIs. Vibrotactile stimulation of the right index finger was presented to a child with normal hearing and two children with CIs. Current density reconstructions of the cortical somatosensory N70 component computed via sLORETA show activated regions as illustrated in coronal MRI slices. Left panel: A normal hearing 7-year-old child shows activation of somatosensory cortex in the postcentral gyrus. Middle panel: A 13-year-old cochlear implanted child with a speech perception score of 94% on the Consonant Nucleus Consonant (CNC) test shows similar activation of somatosensory cortex in postcentral gyrus. Right panel: In contrast, a 15-year-old cochlear implanted child who showed average performance on the CNC speech perception test (76%) exhibited activation of the somatosensory cortex, superior and transverse temporal gyri, and parietal cortex.
