Abstract

We report evidence for a context- and not stimulus-dependent functional asymmetry in the left and right human auditory midbrain, thalamus, and cortex in response to monaural sounds. Neural activity elicited by left- and right-ear stimulation was measured simultaneously in the cochlear nuclei, inferior colliculi (ICs), medial geniculate bodies (MGBs), and auditory cortices (ACs) in 2 functional magnetic resonance imaging experiments. In experiment 1, pulsed noise was presented monaurally to either ear, or binaurally, simulating a moving sound source. In experiment 2, only monaural sounds were presented. The results show a modulation of the neural responses to monaural sounds by the presence of binaural sounds at a time scale of tens of seconds: In the absence of binaural stimulation, the left and right ICs, MGBs, and ACs responded more strongly to stimulation of the contralateral ear. When blocks of binaural stimuli were interspersed in the sound sequence, the contralateral preference vanished in those structures in the right hemisphere. The resulting hemispheric asymmetry was similar to the asymmetry demonstrated for spatial sound processing. Taken together, the data demonstrate that functional asymmetries in auditory processing are modulated by context. The observed long time constant suggests that this effect results from a “top–down” mechanism.

Introduction

Complex natural stimuli appear to be processed preferentially in either the left or the right human auditory cortex (AC). For instance, fast temporal modulations, as found in speech, seem to be preferentially processed in the left AC, whereas fine-grained frequency analysis has been found to engage the right AC more strongly than the left (Zatorre and Belin 2001; Zatorre and others 2002; Tervaniemi and Hugdahl 2003). Converging evidence from lesion and functional magnetic resonance imaging (FMRI) studies suggests a hemispheric asymmetry in auditory spatial processing as well: Zatorre and Penhune (2001) demonstrated that damage to the right AC differentially impairs sound localization performance. Consistent with that suggestion, a recent FMRI study showed that the right AC responds to perceived sound movement in both acoustic hemifields, whereas the left AC responds predominantly to movement in the contralateral hemifield (Krumbholz, Schonwiesner, von Cramon, and others 2005). This asymmetry in human auditory spatial information processing mirrors the known asymmetry in human visual spatial information processing (for a review, see Marshall and Fink 2001; Halligan and others 2003).

The role of the subcortical auditory pathway in the formation of lateralized processing is unknown. Previous studies suggest that response asymmetries in subcortical auditory structures exist and may contribute to cerebral lateralization. Studies of brain stem auditory-evoked potentials (Levine and McGaffigan 1983; Levine and others 1988) reported a rightward asymmetry during monaural click-train stimulation: Stimulation of the right ear elicited larger brain stem responses than stimulation of the left ear, suggesting an increased number of active neurons or increased firing synchrony in the brain stem structures along the afferent auditory path from the right ear to the left AC. The authors of those studies related the rightward asymmetry in the brain stem responses to the left hemisphere dominance for speech processing. Other studies reported asymmetries in peripheral auditory structures. In particular, the magnitude of active cochlear amplification, assessed by click-train–evoked otoacoustic emissions, appears to be greater in the right than in the left cochlea (Khalfa and Collet 1996; Khalfa and others 1998). Again, the authors linked this peripheral asymmetry to speech-related asymmetries in the cerebral hemispheres. However, to support this hypothesis, it would have been necessary to demonstrate asymmetrical cortical responses to the employed, acoustically simple, stimuli. Indeed, asymmetrical responses of the AC to simple sounds (amplitude-modulated sinusoids) were found with FMRI (Devlin and others 2003). In that study, sinusoids presented monaurally to the left or right ear yielded a greater blood oxygen level–dependent (BOLD) response in the left than right primary AC, irrespective of which ear was stimulated. The authors suggested that this asymmetrical response may indicate a generic left hemisphere dominance for auditory processing.

Whether auditory processing in humans is by default lateralized to the speech-dominant side of the brain is a crucial question in understanding auditory function. An immediately related question is whether functional lateralization is confined to the cerebral cortex or can be detected in subcortical auditory structures as well. The present study addresses both questions by simultaneously measuring cortical and subcortical responses to simple stimuli in the context of auditory spatial processing, which has been shown to produce a cortical activation asymmetry that is independent of and different from the lateralization of speech processing.

Materials and Methods

Subjects

A total of 20 subjects (8 males) between 23 and 32 years of age, with normal, age-appropriate audiograms and no history of hearing disorder or neurological disease, participated in the 2 experiments after giving informed consent. All subjects were right-handed and attained a score of 100% on the Edinburgh Handedness Questionnaire (Oldfield 1971). The experimental procedures were approved by the local ethics committee. Four subjects participated in both experiments; thus, 12 subjects participated in each experiment.

Stimuli and Experimental Protocol

The acoustic stimuli were trains of noise bursts; the same stimuli elicited a reliable activation asymmetry in a previous experiment on auditory spatial processing (Krumbholz, Schonwiesner, von Cramon, and others 2005). The noise bursts had a duration of 50 ms each and were presented at a rate of 10/s. The noise was generated afresh throughout the experiment and band-pass filtered between 0.2 and 3.2 kHz (Tucker Davis Technologies [Alachua, FL], System 3). The first experiment comprised 2 monaural conditions, 2 binaural conditions, and a silent condition (Fig. 1).
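
For illustration, such a stimulus can be regenerated with the short Python sketch below. It is not part of the original study; the sample rate, filter order, and normalization are assumptions beyond the parameters stated above (50-ms bursts, 10/s, 0.2–3.2 kHz passband).

import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 44100  # assumed sample rate in Hz (not specified in the text)

def noise_burst_train(duration_s=5.0, burst_dur=0.050, rate_hz=10,
                      band=(200.0, 3200.0), fs=FS, rng=None):
    """Train of 50-ms noise bursts at 10/s, band-pass filtered 0.2-3.2 kHz."""
    rng = np.random.default_rng() if rng is None else rng
    n_total = int(duration_s * fs)
    n_burst = int(burst_dur * fs)
    period = int(fs / rate_hz)
    x = np.zeros(n_total)
    for onset in range(0, n_total - n_burst, period):
        x[onset:onset + n_burst] = rng.standard_normal(n_burst)  # fresh noise for every burst
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")  # 0.2-3.2 kHz passband
    x = sosfiltfilt(sos, x)
    return x / np.max(np.abs(x))  # normalize to +/-1 before scaling to presentation level

stimulus = noise_burst_train()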

Figure 1

Sequence of acoustic stimulation and image acquisition. The pulsed noise stimuli are drawn as black traces, the upper for the left and the lower for the right channel. The image acquisitions are drawn as gray boxes. In experiment 1 (upper panel), 4 sound conditions were alternated with a silent condition: monaural right stimulation (mon right), monaural left stimulation (mon left), binaural stimulation with identical signal at both channels (perceived as sound coming from the center point of the interaural axis, bin central), and binaural stimulation with dynamically varying phase angle between the channels (perceived as sound movement along the interaural axis, bin move). Experiment 2 contained only the monaural conditions. The TR was about 10.5 s (the exact duration depends on the subject's cardiac cycle). A schema of the hemodynamic responses to the image acquisition noise (gray curve) and experimental stimulation (black curve) is shown in the lower rightmost TR to illustrate that sparse imaging minimizes the effect of scanner noise on recorded brain activation. Note that each condition was presented in blocks of 5 TR, only one of which is depicted here.

In the monaural conditions, trains of noise bursts were played separately to the left or right ear, yielding the conditions monaural right stimulation (mon right) and monaural left stimulation (mon left). These conditions were used to assess functional asymmetries in auditory structures. The binaural conditions were included to generate the context of auditory spatial processing. In one binaural condition (bin central), identical noise bursts were played to both ears simultaneously, generating the perception of a stationary sound source in the center of the head. In the other binaural condition (bin move), the signals to the left and right ear were dynamically time shifted relative to each other (in the range of ±1.2 ms), a manipulation that is perceived as a sound moving back and forth between the ears. This manipulation differs from a classical binaural beat stimulus in that we used pulsed noise rather than sine tones as the carrier. The AC is known to respond asymmetrically to sound source motion in the left and right acoustic hemifield (Baumgart and others 1999; Warren and others 2002; Krumbholz, Schonwiesner, von Cramon, and others 2005). In the present study, the binaural sound conditions merely provided the context of auditory spatial processing for the listeners; the brain responses during the binaural conditions were not analyzed here but were used to assess binaural interaction in the auditory system, and those results are presented elsewhere (Krumbholz, Schonwiesner, Rubsamen, and others 2005). The second experiment was identical to the first, except that all binaural conditions were replaced by silent baseline conditions (where this would have produced two or more consecutive silent blocks, only one was presented). To minimize eye movements in the direction of the sounds and to ensure a constant state of alertness throughout the experiment, subjects performed a visual task: they fixated a cross at the center of the visual display and were instructed to press a button upon each occurrence of the letter “Z” in 2 random sequences of one-digit numbers presented to the left and right of the fixation cross. The timing of the visual task was unrelated to the sound stimulation. The numbers were presented once every 2 s for 50 ms. This is a relatively easy task (all subjects attained >90% correct responses), but it requires constant attention and monitoring of the fixation point.
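
The bin move manipulation can be sketched as follows (a minimal Python illustration, not the original stimulus code; the sinusoidal 0.5-Hz delay trajectory is an assumption, as the text specifies only the ±1.2-ms range): the same noise train is delivered to both ears, and one channel is delayed by a slowly varying amount, which is perceived as movement along the interaural axis.

import numpy as np

FS = 44100  # assumed sample rate in Hz

def moving_itd(mono, fs, max_itd=1.2e-3, mod_hz=0.5):
    """Return (left, right) channels in which the right channel carries a slowly
    varying interaural time difference of at most +/-max_itd seconds."""
    n = len(mono)
    t = np.arange(n) / fs
    itd = max_itd * np.sin(2 * np.pi * mod_hz * t)  # delay trajectory in seconds
    # Time-varying fractional delay implemented by linear interpolation.
    right = np.interp(t - itd, t, mono, left=0.0, right=0.0)
    return mono.copy(), right

# Demo with a placeholder mono signal (in practice, the noise-burst train described above).
mono = np.random.default_rng(1).standard_normal(5 * FS)
left_ch, right_ch = moving_itd(mono, FS)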

The auditory stimuli were presented through magnetic resonance–compatible electrostatic headphones (Sennheiser model HE 60), which were fitted into industrial ear protectors (Bilsom model 2452) that passively shielded the subjects from the scanner noise. The system was built according to the guidelines of the MRC Institute of Hearing Research sound system (Palmer and others 1998). The stimuli were presented at 85-dB sound pressure level (relative to 20 μPa); in 4 subjects, the intensity was reduced to 75-dB sound pressure level because of discomfort caused by the high subjective loudness. “Sparse imaging” (Edmister and others 1999; Hall and others 1999) was used to minimize the effects of the scanner noise on the measured brain responses to the experimental stimuli. This technique takes advantage of the lag of the hemodynamic response by introducing a gap between consecutive image acquisitions that allows the response to the scanner noise to decay to almost resting level before the next image is acquired. The stimuli are presented during the gap between image acquisitions, so that each scan records primarily stimulus-related hemodynamic responses (indicated in Fig. 1, lower panel right; diagrams of measured hemodynamic responses can be found in Hall and others 1999). We used a gap of about 8.4 s. Including the 2.1-s image acquisition time at the beginning of every repetition, the average repetition time (TR) was 10.5 s. Cardiac gating (Guimaraes and others 1998) was used to minimize motion artifacts in the brain stem signal resulting from pulsation of the basilar artery. The functional images were triggered 300 ms after the R-wave in the electrocardiogram, when the cardiac cycle is in its diastolic phase. The experimental conditions were presented in blocks of 5 trials. In experiment 1, 4 sound blocks containing the 4 sound conditions in pseudorandom order were alternated with a single silence block. In experiment 2, sound and silence blocks were presented in alternation. A total of 250 images (corresponding to 50 epochs) were acquired per subject in each experiment.
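
The resulting block and timing structure of experiment 1 can be illustrated with the following Python sketch (an illustration under stated assumptions; the actual randomization procedure was not specified beyond “pseudorandom order”). It reproduces the 250 images per subject quoted above.

import random

TR_S = 10.5                      # average repetition time: 2.1-s acquisition + ~8.4-s gap
TRIALS_PER_BLOCK = 5
SOUND_CONDITIONS = ["mon_left", "mon_right", "bin_central", "bin_move"]

def experiment1_schedule(n_cycles=10, seed=0):
    """One condition label per TR: 4 shuffled sound blocks followed by 1 silent block."""
    rng = random.Random(seed)
    trials = []
    for _ in range(n_cycles):
        order = SOUND_CONDITIONS[:]
        rng.shuffle(order)                       # pseudorandom order of the 4 sound blocks
        for condition in order + ["silence"]:
            trials.extend([condition] * TRIALS_PER_BLOCK)
    return trials

schedule = experiment1_schedule()
print(len(schedule), "TRs, about", round(len(schedule) * TR_S / 60), "min")
# 250 TRs, matching the 250 images acquired per subject.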

FMRI Data Acquisition

BOLD contrast images were acquired with a 3-T Bruker Medspec whole-body scanner using gradient echo planar imaging (average TR = 10.5 s, echo time [TE] = 30 ms, flip angle = 90°, acquisition bandwidth = 100 kHz). The functional images consisted of 28 ascending slices with an in-plane resolution of 3 × 3 mm, a slice thickness of 3 mm, and an interslice gap of 1 mm. Slices were oriented along the line connecting the anterior and posterior commissures and positioned so that the lowest slices covered the cochlear nucleus (CN) just below the pons. The slices were acquired in direct temporal succession within 2.1 s. A high-resolution structural image was acquired from each subject using a 3-dimensional modified driven equilibrium Fourier transform sequence (Ugurbil and others 1993) with 128 slices of 1.5 mm thickness each (field of view = 25 × 25 × 19.2 cm, data matrix = 256 × 256, TR = 1.3 s, TE = 10 ms). To assist registration, a set of T1-weighted echo planar imaging slices was acquired using the same parameters as for the functional images (inversion time = 1200 ms, TR = 45 s, 4 averages).

Data Analysis

The data were analyzed with the software package Leipzig Image Processing and Statistical Inference Algorithms (Lohmann and others 2001). The raw functional images of each listener were carefully checked for baseline artifacts and excess head motion. The following preprocessing steps were performed: baseline correction (0.0019-Hz high-pass filter), correction for remaining head motion (<2 mm), and transformation into Talairach space. The normalized functional images were then spatially smoothed with Gaussian kernels of 3- and 10-mm full-width at half-maximum to optimize for the signals from the brain stem and the cortex, respectively. The extent of auditory structures in the brain stem is only a few millimeters, and their location with respect to macroanatomical landmarks varies little across individuals. The detection of auditory activity in the brain stem can thus be optimized with a small smoothing kernel. In contrast, auditory cortical regions are comparatively large, and their boundaries exhibit considerable interindividual variability with respect to macroanatomy (Rademacher and others 2001), warranting a larger smoothing kernel. Importantly, if the smoothing kernel matches the size of the activation, then an activity measure at a local maximum contains activity information from the surrounding activated area and is, theoretically, the best characterization of activity in that area. The image time series of the group of 12 subjects in each experiment, comprising a total of 3000 volumes, was analyzed with a fixed-effects general linear model. Experimental conditions were modeled as boxcar functions convolved with a generic hemodynamic response function including a response delay of 6 s. The height threshold for activation on statistical parameter maps (SPMs) was z = 3.1 (P ≤ 0.001 uncorrected). Second-level random-effects analyses were implemented by extracting the BOLD signal change in response to monaural sound stimulation from individual regions of interest (ROIs). The magnitude of lateralization in the cortical and subcortical ROIs was assessed by computing lateralization indices (LIs) of brain activation, that is, by dividing the difference of the BOLD responses to the right and left monaural sounds by their sum and expressing the result in percent: LI = 100 × [BOLD (right) − BOLD (left)]/[BOLD (right) + BOLD (left)]. Thus, positive LIs signify a stronger response to right than to left monaural sounds, whereas negative LIs signify a stronger response to left than to right monaural sounds. Taking the individual LIs as random effects, the significance of lateralization across the group was assessed with Student's t-tests (Gosset 1908). We did not apply the random-effects model directly to the SPMs because, although established for cortical brain responses, it is not well suited for the analysis of small subcortical activations: the limited extent of these activations converts small location differences into a large interindividual variance. Subcortical ROIs were defined in 2 steps: First, a sphere of 8-mm radius was drawn around the location of the respective structure on the individual anatomical scans with reference to standard human brain atlases. Second, the local z-score maximum in the individual SPMs that fell into the sphere, together with its 8-voxel neighborhood, formed the final ROI. This combined anatomy- and function-based definition is necessary for structures, like the medial geniculate bodies (MGBs), that are not clearly discernible on the anatomical images.
It also compensates for the nonlinear displacement of anatomical loci between structural and functional scans.
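
As an illustration of this analysis step, the LI computation and the random-effects test can be sketched in Python as follows. The response values below are hypothetical placeholders, not data from the study.

import numpy as np
from scipy import stats

def lateralization_index(bold_right, bold_left):
    """LI = 100 * (R - L) / (R + L); positive values indicate a stronger response
    to right-ear than to left-ear stimulation."""
    return 100.0 * (bold_right - bold_left) / (bold_right + bold_left)

# Hypothetical per-subject ROI responses (placeholders, not data from the study).
bold_right = np.array([1.2, 0.9, 1.4, 1.1, 1.3, 1.0, 1.2, 1.5, 0.8, 1.1, 1.3, 1.2])
bold_left = np.array([0.7, 0.8, 0.9, 0.6, 1.0, 0.7, 0.8, 1.1, 0.7, 0.9, 0.8, 0.9])

li = lateralization_index(bold_right, bold_left)          # one LI per subject
t_stat, p_val = stats.ttest_1samp(li, popmean=0.0)        # random-effects test across subjects
print(f"mean LI = {li.mean():.1f}%, t = {t_stat:.2f}, p = {p_val:.4f}")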

Results

The activation produced by the sum of both monaural sound conditions was compared with the silent baseline condition to highlight cortical and subcortical regions sensitive to noise stimuli. The activated regions included the CNs, the inferior colliculi (ICs), the MGBs, and the ACs on both sides of the brain. The Talairach coordinates of the most significant voxel in each structure in this contrast were as follows (x, y, z)—left CN: −14, −42, −30; right CN: 10, −42, −30; left IC: −8, −36, −3; right IC: 4, −36, −3; left MGB: −17, −30, 0; right MGB: 13, −30, −3; left AC: −47, −27, 12; and right AC: 40, −35, 25. Subsequently, the monaural conditions were individually compared with the silence condition as well as with each other in order to reveal asymmetries in the cortical and subcortical activation during monaural stimulation.

Activation during Monaural Left- and Right-Ear Stimulation

The activation during monaural stimulation of the left and right ear was compared with the activation during the silent baseline condition in 2 separate contrasts. For experiment 1, SPMs of these 2 contrasts are shown in Figure 2.

Figure 2

Activation maps for experiment 1. From left to right, ascending the auditory pathway, axial and coronal anatomical slices through the CNs, the ICs, the MGBs, and the ACs are shown. (White arrowheads denote the respective structures of interest in slices with several activated areas.) Superimposed are color-coded SPMs that show significant activation of auditory structures following monaural stimulation of the left and right ear compared with the silent baseline condition. Monaural left- and right-ear stimulation resulted in a strong activation of the ipsilateral CN. The IC, MGB, and AC of the left hemisphere responded more strongly to the (contralateral) right-ear stimulation and less strongly to the (ipsilateral) left-ear stimulation than the respective structures of the right hemisphere. In the right hemisphere, no such activation difference was observed. (SPMs are z-maps thresholded at 3.1 [P < 0.001 uncorrected]. For the axial display of the AC, the slicing plane has been rotated by 30°, as indicated by the dashed line in the schematic inset, to show the full length of Heschl's gyrus.)

Left-ear stimulation caused a significant activation of the left CN (see first and third rows in Fig. 2). Activation of the right CN just reached the significance criterion (P < 0.001). In subsequent auditory structures, activation shifted to the contralateral side. The right IC and MGB responded more strongly than their left-side counterparts, which barely reached the significance criterion. The AC was activated bilaterally, with the activation being more pronounced in the right hemisphere. At the level of the CN, the response to right-ear stimulation was essentially mirror symmetric to the response to left-ear stimulation, that is, the right monaural sounds elicited a strong activation in the right CN and a weak activation in the left CN. Beyond the CN, however, the activation pattern during right-ear stimulation was markedly different from a mirror image of the pattern during left-ear stimulation (second and fourth rows in Fig. 2): the ICs, MGBs, and ACs on the left and right sides responded with approximately equal strength to the right-ear stimulation.

This difference in the responses to left- and right-ear stimulation was further analyzed by extracting a measure of the activation strength from the most significant voxel in each auditory structure (Fig. 3). Activation strength refers to contrast-weighted parameter estimates from the general linear model (effect sizes). In experiment 1, we found a hemispheric asymmetry in the IC, MGB, and AC responses: The right-side structures responded about equally strongly to stimulation of the left and right ear (no significant differences in effect sizes), whereas the left-side structures responded predominantly to the right-ear stimulation (t-test—IC: P < 0.0001, t = 7.35; MGB: P < 0.001, t = 3.84; and AC: P < 0.0001, t = 9.8; Fig. 3A). Experiment 2 tested whether this functional asymmetry was related to the presence of binaural sounds in experiment 1 by replacing them with silent intervals. In experiment 2, the ICs, MGBs, and ACs on both sides responded more strongly to contralateral stimulation (Fig. 3B). No hemispheric asymmetry was detected. In both experiments, monaural stimulation elicited stronger activation of the ipsilateral than of the contralateral CN.

Figure 3

Activation of left- and right-side auditory structures with (A: experiment 1) and without (B: experiment 2) interspersed binaural stimulation. Each data point gives the activation strength (mean effect size with standard error of the mean) of the most significant voxel in an auditory structure. Responses are sorted by cerebral hemisphere and side of stimulation. The right AC, MGB, and IC respond notably differently to the identical stimulation in experiments 1 and 2. In experiment 1, these structures respond about equally strongly to left and right monaural stimulation, whereas in experiment 2, they respond more strongly to contralateral stimulation. The AC, MGB, and IC on the left side show stronger responses to contralateral stimulation in both experiments. The left and right CNs responded more strongly to ipsilateral stimulation in both experiments. Note that absolute effect sizes should not be compared across experiments. The higher number of silent baseline trials in experiment 2 (replacing the binaural stimulation of experiment 1) leads to a lower level of adaptation and therefore greater responses in some structures.

Responses across Subjects

To test whether the results were consistent across individuals, effect sizes were extracted from ROIs for all auditory structures in individual subjects, and LIs were computed as a measure of the magnitude of lateralization of the brain responses. From the results of the group analysis, we expected ipsilateral LIs in the left and right CNs in both experiments, contralateral LIs in all auditory structures above the CN in experiment 2 and in the left-side structures above the CN in experiment 1, but LIs close to zero in the respective right-side structures in experiment 1.

Figure 4 shows the LIs of all subjects in the left-side (left column) and right-side ROIs (right column). Most subjects had clearly identifiable activation maxima in all structures. In both experiments, the majority of CN responses were stronger to ipsilateral than to contralateral stimulation (31 out of 48); only 7 out of 48 showed lateralization to the contralateral side (remaining responses: 1 nonlateralized, 9 not detected). The left IC, MGB, and AC exhibited clear contralateralization in both experiments (significant across subjects in almost all cases, with a trend in the left MGB in experiment 1; of 72 measurements, 59 [82%] were contralateral, 6 [8%] ipsilateral, 4 [6%] nonlateralized, and 4 not detected). In the right IC, MGB, and AC, the pattern of lateralization differed between the 2 experiments: lateralization in experiment 1 was weaker and less consistent in direction (15 out of 36 [42%] contralateral, 10 [28%] ipsilateral, 5 [14%] nonlateralized, 6 [17%] not detected) than in experiment 2 (29 out of 36 [81%] contralateral, 4 [11%] ipsilateral, 1 [3%] nonlateralized, 2 [6%] not detected). The individual data thus confirmed the results of the group analysis. The weak group-level lateralization in the right-side structures in experiment 1 appeared to result from 2 factors: a greater proportion of the right-side than of the left-side structures showed an LI close to zero, and among those right-side structures that exhibited a sizable LI, there was no consistency in the direction of the lateralization.

Figure 4

Lateralization of auditory activation in all individual subjects in response to monaural sound stimulation with (A: experiment 1) and without (B: experiment 2) interspersed binaural stimulation. Each black bar indicates one subject's LI (in %) and direction of lateralization (ipsi- or contralateral) in an auditory structure in the left or right hemisphere. The same order of subjects is kept in all plots of each experiment to allow within-subject comparisons. Four of the 12 subjects in experiment 1 also participated in experiment 2; their data are shown in the upper 4 bars of all plots. The P values and stars indicate the significance level of t-tests for the lateralization within an auditory structure across subjects.

Discussion

Our results demonstrate a context- and not stimulus-dependent functional asymmetry in the human IC, MGB, and AC in response to left and right monaural noise-burst stimulation. The lateralization pattern observed is similar to the one reported for auditory spatial processing and different from the left hemisphere dominance for speech processing. Importantly, this asymmetry is susceptible to experimental manipulation and thus does not solely reflect fixed anatomical asymmetries in the auditory pathway.

Responses to Left- and Right-Ear Stimulation

Our experimental paradigm was sensitive enough to assess asymmetrical responses in subcortical structures, as indicated by the interindividual consistency of the results and the observed activation shift from ipsi- to contralateral between the CN and the subsequent auditory pathway. The contralateral CN responses might have been caused by bone conduction of the sound presented to the ipsilateral ear. The effect of bone conduction is considerable at high sound intensities; for example, cortical activation was detected when the deaf ear of unilaterally deaf subjects was stimulated monaurally at 95 dB (Scheffler and others 1998). The observed activation shift is in agreement with the contralaterality established in the brain stem between CN and IC (“acoustic chiasm,” Glendenning and Masterton 1983). In humans, this is supported by neuroimaging studies demonstrating that monaural sound stimulation elicits stronger responses in the hemisphere contralateral to the stimulated ear (Loveless and others 1994; Hirano and others 1997; Scheffler and others 1998; Jäncke and others 2002; Suzuki and others 2002). In experiments 1 and 2, we found stronger responses to contralateral than to ipsilateral stimulation in the left IC, MGB, and AC. The ratio of contra- to ipsilateral response magnitude is consistent with previous studies (Scheffler and others 1998; Jäncke and others 2002).

In contrast to previous studies, the right IC, MGB, and AC responded about equally strongly to sound stimulation from the left and right ear in experiment 1. The asymmetry is clearest in AC and IC. On the left side, both structures exhibited a clear contralateral stimulus preference: responses to right monaural stimulation were significantly larger than responses to left monaural stimulation. In contrast, no consistent side preference was detected in the right-side IC or AC in experiment 1; these structures responded on average about equally well to stimuli from both ears. The MGB showed a trend for the same lateralization pattern, although less consistent across individuals. The variability of the MGB responses may have partly been caused by the ascending slice acquisition, which reduces the benefit of cardiac triggering in superior slices.

It is unlikely that the results of experiment 1 can be explained by a spillover of the BOLD responses from the binaural conditions. Auditory BOLD responses decay to baseline within about 8 s after stimulus offset (Belin and others 1999; Backes and van Dijk 2002), which is short in comparison with the block length of >50 s in our experiments.
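
This timing argument can be illustrated with a minimal Python sketch, assuming a generic double-gamma hemodynamic response function in place of the measured auditory responses cited above (the HRF shape and time constants are assumptions, not values from the study):

import numpy as np
from scipy.stats import gamma

dt = 0.1                                               # time resolution in seconds
t_hrf = np.arange(0, 32, dt)
hrf = gamma.pdf(t_hrf, 6) - gamma.pdf(t_hrf, 16) / 6   # generic double-gamma shape (assumed)
hrf /= hrf.sum()

block_dur = 5 * 10.5                                   # one stimulation block of 5 TRs (~52.5 s)
t = np.arange(0, block_dur + 60.0, dt)
predicted = np.convolve((t < block_dur).astype(float), hrf)[:len(t)]

peak = predicted.max()
for lag in (8.0, 26.0):                                # seconds after block offset
    value = predicted[int(round((block_dur + lag) / dt))]
    print(f"{lag:4.0f} s after block offset: {100 * value / peak:5.1f}% of peak")
# Only a few percent of the peak response remains 8 s after block offset, and
# essentially none by the middle of the following >50-s block, so images acquired
# during one block should carry negligible residue from the preceding block.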

There is a subthreshold trend in our participant sample for the left AC and MGB to respond more strongly to monaural sounds than the same structures on the right side. This tendency in the AC, but not in the MGB, is in accord with the observation by Devlin and others (2003) of stronger responses to monaural tones in the left than right AC. That study used a frequency discrimination task, which might have contributed to the stronger left-side activations. Brechmann and Scheich (2005) demonstrated that discriminating pitch direction may induce such a response laterality. We used a non–sound-related distraction task in the present experiment.

Conditions Under Which Functional Laterality May Occur

There is an ongoing debate about whether physical or semantic aspects of sounds are the main determinant of hemispheric lateralization. Some studies suggest that the presence of certain acoustic features (such as complex spectrotemporal modulations [Zatorre and Belin 2001; Schönwiesner and others 2005] or sound movement [Krumbholz, Schonwiesner, von Cramon, and others 2005]) may be sufficient to induce lateralized activation. Others, however, find changes in lateralization when the same stimuli are presented in different listening contexts (such as a pitch classification task vs. no task [Brechmann and Scheich 2005] or a speech vs. nonspeech context [Shtyrov and others 2005]). Such “bottom–up” versus “top–down” accounts of differential lateralization of neural activity are not mutually exclusive; the same neuronal populations may be activated by top–down and bottom–up projections, such that their activity depends on both physical and semantic stimulus aspects. Here, we demonstrate that the long-term (>25 s) stimulus context of auditory spatial processing modulates the laterality of responses to identical stimuli even when attention is directed toward the visual modality. This result goes beyond previous reports of contextual effects. The changes in lateralization observed by Brechmann and Scheich (2005) were caused by changes in attention with different task instructions. Shtyrov and others (2005) found changes in the lateralization of responses to a short noise burst when it was embedded in a word, a pseudoword, or noise. They did not modify task instructions (as in our study, the subjects' attention was directed toward the visual modality), but their effects operate on a much shorter time scale (milliseconds) than the tens of seconds in our experiment.

Devlin and others (2003) reported that both left and right monaural stimulation with amplitude-modulated pure tones activated the left primary AC more strongly than the right. The authors suggested that this result points to a general left hemisphere dominance for auditory processing that might have contributed to the evolution of the speech-dominant left hemisphere. Although our demonstration of a different lateralization pattern does not exclude the possibility that such a left hemisphere bias exists in many listening situations, it shows that the bias does not account for all situations. The asymmetry observed here (experiment 1) relates to auditory spatial rather than speech processing. We explicitly included an auditory motion condition, and the resulting lateralization pattern resembles the hemispheric specialization for the processing of acoustic spatial information found in a lesion study (Zatorre and Penhune 2001) and an FMRI study (Krumbholz, Schonwiesner, von Cramon, and others 2005). If the binaural motion stimuli are replaced by silent intervals, the hemispheric asymmetry vanishes and a contralateral activation predominance emerges. This finding suggests that the hemispheric distribution of auditory information throughout the auditory pathway is not fixed but rather depends on the acoustic context.

Top–Down or Bottom–Up?

The response asymmetry observed along the auditory pathway may either originate in the brain stem and then be reflected at the level of higher processing structures (bottom–up mechanism) or, alternatively, be generated in the cortex and projected down to subcortical structures (top–down mechanism); it may also result from a combination of both. The bottom–up hypothesis would imply a modulation of neuronal responses in subcortical structures by the binaural motion stimuli. This modulation would have to extend over at least half a block length of the current experiment (>25 s). A stimulus-specific adaptation over tens of seconds has recently been found in AC neurons but, importantly, not in MGB neurons (Ulanovsky and others 2004). Furthermore, the IC and MGB do not respond specifically to binaural motion stimuli; such responses are mainly found in the planum temporale (Krumbholz, Schonwiesner, Rubsamen, and others 2005). These findings argue strongly for a top–down mechanism evoking the observed functional asymmetry.

A lateralized response of the AC could be projected down to subcortical levels via efferent auditory projections—the corticofugal system. Monosynaptic projections descend from AC to MGB (FitzPatrick and Imig 1978; Winer and others 2001), IC (FitzPatrick and Imig 1978), and CN (Jacomme and others 2003). The main recipient is the MGB. Remarkably, efferent tracts from AC to MGB are considerably larger than afferent projections from MGB to AC (Winer and others 2001). Winer and others (1998, 2001) also demonstrated a significant projection from every AC field to the IC in various mammals. These projections are mainly ipsilateral (Saldana and others 1996; Druga and others 1997). Focal electrical activation of the AC elicits changes in frequency tuning, sensitivity, and temporal response pattern in IC neurons (Yan and Suga 1996, 1998; Zhang and others 1997; Zhang and Suga 2000; Yan and Ehret 2001, 2002). Hence, the corticofugal system seems capable of modulating the activation balance between left and right ICs and MGBs. There are direct demonstrations of strong efferent modulation of activity even in the lower brain stem down to the auditory periphery in cats (Hernandez-Peon and others 1961) and humans (Perrot and others 2005).

Top–down projections can denote either recurrent corticocortical connections (which might convey cognitive effects from, for instance, prefrontal cortex to sensory cortices) or projections from sensory cortices to subcortical structures, as outlined above. In the present experiments, we recorded activity from the AC and subcortical structures. There was no significant differential activation outside the superior temporal plane in any of the experimental conditions. We can therefore only draw conclusions about the latter kind of efferent projection. In addition, most of our participants reported that, while concentrating on the visual task, they were not aware of the sound stimulation for most of the experimental time, suggesting that the asymmetry observed in experiment 1 relies on interactions between low-level AC and subcortical structures.

In conclusion, the present study reports evidence for a functional hemispheric asymmetry in the left- and right-side AC and subcortical structures in response to monaural noise-burst stimulation. Importantly, this asymmetry depends on the acoustic context in which the stimuli are presented. Infrequent binaural stimulation shifts the monaural responses in the AC and in subcortical structures to resemble the hemispheric asymmetry for auditory spatial processing. At present, the dependence on acoustic context and its long time constant argue for a top–down mechanism underlying the observed laterality effect. Such a mechanism would be consistent with previous studies of the functional anatomy underlying cerebral specialization in the visual system, which have demonstrated in the context of lexical versus visuospatial decision making that top–down mechanisms are important determinants of hemispheric specialization (Stephan and others 2003). This suggests that such a context- rather than stimulus-dependent functional lateralization of neural activations constitutes a more general mechanism of cerebral specialization. The current study extends prior studies by showing that these mechanisms also apply to auditory processing and, furthermore, can also be observed at the level of subcortical processing.

This study was supported by the Max Planck Society, Helmholtz Association, German National Academic Foundation (MS), German Research Foundation (DFG-KFO-112, GRF), and Bundesministerium für Bildung und Forschung (BMBF, BICW). We thank one anonymous referee and Lutz Jancke for helpful comments. Conflict of Interest: None declared.

References

Backes WH, van Dijk P. 2002. Simultaneous sampling of event-related BOLD responses in auditory cortex and brainstem. Magn Reson Med 47(1):90–96.
Baumgart F, Gaschler-Markefski B, Woldorff MG, Heinze HJ, Scheich H. 1999. A movement-sensitive area in auditory cortex. Nature 400(6746):724–726.
Belin P, Zatorre RJ, Hoge R, Evans AC, Pike B. 1999. Event-related fMRI of the auditory cortex. Neuroimage 10(4):417–429.
Brechmann A, Scheich H. 2005. Hemispheric shifts of sound representation in auditory cortex with conceptual listening. Cereb Cortex 15(5):578–587.
Devlin JT, Raley J, Tunbridge E, Lanary K, Floyer-Lea A, Narain C, Cohen I, Behrens T, Jezzard P, Matthews PM, Moore DR. 2003. Functional asymmetry for auditory processing in human primary auditory cortex. J Neurosci 23(37):11516–11522.
Druga R, Syka J, Rajkowska G. 1997. Projections of auditory cortex onto the inferior colliculus in the rat. Physiol Res 46(3):215–222.
Edmister WB, Talavage TM, Ledden PJ, Weisskoff RM. 1999. Improved auditory cortex imaging using clustered volume acquisitions. Hum Brain Mapp 7(2):89–97.
FitzPatrick KA, Imig TJ. 1978. Projections of auditory cortex upon the thalamus and midbrain in the owl monkey. J Comp Neurol 177(4):537–555.
Glendenning K, Masterton R. 1983. Acoustic chiasm: efferent projections of the lateral superior olive. J Neurosci 3(8):1521–1537.
Gosset WS. 1908. The probable error of a mean. Biometrika 6(1):1–25.
Guimaraes AR, Melcher JR, Talavage TM, Baker JR, Ledden P, Rosen BR, Kiang NY, Fullerton BC, Weisskoff RM. 1998. Imaging subcortical auditory activity in humans. Hum Brain Mapp 6(1):33–41.
Hall DA, Haggard MP, Akeroyd MA, Palmer AR, Summerfield AQ, Elliott MR, Gurney EM, Bowtell RW. 1999. “Sparse” temporal sampling in auditory fMRI. Hum Brain Mapp 7(3):213–223.
Halligan PW, Fink GR, Marshall JC, Vallar G. 2003. Spatial cognition: evidence from visual neglect. Trends Cogn Sci 7(3):125–133.
Hernandez-Peon R, Brust-Carmona H, Penaloza-Rojas J, Bach-Y-Rita G. 1961. The efferent control of afferent signals entering the central nervous system. Ann N Y Acad Sci 89:866–882.
Hirano S, Naito Y, Okazawa H, Kojima H, Honjo I, Ishizu K, Yenokura Y, Nagahama Y, Fukuyama H, Konishi J. 1997. Cortical activation by monaural speech sound stimulation demonstrated by positron emission tomography. Exp Brain Res 113(1):75–80.
Jacomme AV, Nodal FR, Bajo VM, Manunta Y, Edeline JM, Babalian A, Rouiller EM. 2003. The projection from auditory cortex to cochlear nucleus in guinea pigs: an in vivo anatomical and in vitro electrophysiological study. Exp Brain Res 153(4):467–476.
Jäncke L, Wüstenberg T, Schulze K, Heinze HJ. 2002. Asymmetric hemodynamic responses of the human auditory cortex to monaural and binaural stimulation. Hear Res 170(1–2):166–178.
Khalfa S, Collet L. 1996. Functional asymmetry of medial olivocochlear system in humans. Towards a peripheral auditory lateralization. Neuroreport 7(5):993–996.
Khalfa S, Micheyl C, Veuillet E, Collet L. 1998. Peripheral auditory lateralization assessment using TEOAEs. Hear Res 121(1–2):29–34.
Krumbholz K, Schonwiesner M, Rubsamen R, Zilles K, Fink GR, von Cramon DY. 2005. Hierarchical processing of sound location and motion in the human brainstem and planum temporale. Eur J Neurosci 21(1):230–238.
Krumbholz K, Schonwiesner M, von Cramon DY, Rubsamen R, Shah NJ, Zilles K, Fink GR. 2005. Representation of interaural temporal information from left and right auditory space in the human planum temporale and inferior parietal lobe. Cereb Cortex 15(3):317–324.
Levine RA, Liederman J, Riley P. 1988. The brainstem auditory evoked potential asymmetry is replicable and reliable. Neuropsychologia 26(4):603–614.
Levine RA, McGaffigan PM. 1983. Right-left asymmetries in the human brain stem: auditory evoked potentials. Electroencephalogr Clin Neurophysiol 55(5):532–537.
Lohmann G, Muller K, Bosch V, Mentzel H, Hessler S, Chen L, Zysset S, von Cramon DY. 2001. LIPSIA—a new software system for the evaluation of functional magnetic resonance images of the human brain. Comput Med Imaging Graph 25(6):449–457.
Loveless N, Vasama JP, Makela J, Hari R. 1994. Human auditory cortical mechanisms of sound lateralisation: III. Monaural and binaural shift responses. Hear Res 81(1–2):91–99.
Marshall JC, Fink GR. 2001. Spatial cognition: where we were and where we are. Neuroimage 14(1 Pt 2):S2–S7.
Oldfield RC. 1971. The assessment and analysis of handedness: the Edinburgh inventory. Neuropsychologia 9:97–113.
Palmer AR, Bullock DC, Chambers JD. 1998. A high-output, high quality sound system for use in auditory fMRI. Neuroimage 7:S359.
Perrot X, Ryvlin P, Isnard J, Guenot M, Catenoix H, Fischer C, Mauguiere F, Collet L. 2005. Evidence for corticofugal modulation of peripheral auditory activity in humans. Cereb Cortex. Advance Access published September 8, 2005. doi:10.1093/cercor/bhj035.
Rademacher J, Morosan P, Schormann T, Schleicher A, Werner C, Freund HJ, Zilles K. 2001. Probabilistic mapping and volume measurement of human primary auditory cortex. Neuroimage 13(4):669–683.
Saldana E, Feliciano M, Mugnaini E. 1996. Distribution of descending projections from primary auditory neocortex to inferior colliculus mimics the topography of intracollicular projections. J Comp Neurol 371(1):15–40.
Scheffler K, Bilecen D, Schmid N, Tschopp K, Seelig J. 1998. Auditory cortical responses in hearing subjects and unilateral deaf patients as detected by functional magnetic resonance imaging. Cereb Cortex 8(2):156–163.
Schönwiesner M, Rübsamen R, von Cramon DY. 2005. Hemispheric asymmetry for spectral and temporal processing in the human antero-lateral auditory belt cortex. Eur J Neurosci 22(6):1521–1528.
Shtyrov Y, Pihko E, Pulvermuller F. 2005. Determinants of dominance: is language laterality explained by physical or linguistic features of speech? Neuroimage 27(1):37–47.
Stephan KE, Marshall JC, Friston KJ, Rowe JB, Ritzl A, Zilles K, Fink GR. 2003. Lateralized cognitive processes and lateralized task control in the human brain. Science 301(5631):384–386.
Suzuki M, Kitano H, Kitanishi T, Itou R, Shiino A, Nishida Y, Yazawa Y, Ogawa F, Kitajima K. 2002. Cortical and subcortical activation with monaural monosyllabic stimulation by functional MRI. Hear Res 163(1–2):37–45.
Tervaniemi M, Hugdahl K. 2003. Lateralization of auditory-cortex functions. Brain Res Brain Res Rev 43(3):231–246.
Ugurbil K, Garwood M, Ellermann J, Hendrich K, Hinke R, Hu X, Kim SG, Menon R, Merkle H, Ogawa S. 1993. Imaging at high magnetic fields: initial experiences at 4 T. Magn Reson Q 9(4):259–277.
Ulanovsky N, Las L, Farkas D, Nelken I. 2004. Multiple time scales of adaptation in auditory cortex neurons. J Neurosci 24(46):10440–10453.
Warren JD, Zielinski BA, Green GG, Rauschecker JP, Griffiths TD. 2002. Perception of sound-source motion by the human brain. Neuron 34(1):139–148.
Winer JA, Diehl JJ, Larue DT. 2001. Projections of auditory cortex to the medial geniculate body of the cat. J Comp Neurol 430(1):27–55.
Winer JA, Larue DT, Diehl JJ, Hefti BJ. 1998. Auditory cortical projections to the cat inferior colliculus. J Comp Neurol 400(2):147–174.
Yan J, Ehret G. 2001. Corticofugal reorganization of the midbrain tonotopic map in mice. Neuroreport 12(15):3313–3316.
Yan J, Ehret G. 2002. Corticofugal modulation of midbrain sound processing in the house mouse. Eur J Neurosci 16(1):119–128.
Yan J, Suga N. 1996. Corticofugal modulation of time-domain processing of biosonar information in bats. Science 273(5278):1100–1103.
Yan W, Suga N. 1998. Corticofugal modulation of the midbrain frequency map in the bat auditory system. Nat Neurosci 1(1):54–58.
Zatorre RJ, Belin P. 2001. Spectral and temporal processing in human auditory cortex. Cereb Cortex 11(10):946–953.
Zatorre RJ, Belin P, Penhune VB. 2002. Structure and function of auditory cortex: music and speech. Trends Cogn Sci 6(1):37–46.
Zatorre RJ, Penhune VB. 2001. Spatial localization after excision of human auditory cortex. J Neurosci 21(16):6321–6328.
Zhang Y, Suga N. 2000. Modulation of responses and frequency tuning of thalamic and collicular neurons by cortical activation in mustached bats. J Neurophysiol 84(1):325–333.
Zhang Y, Suga N, Yan J. 1997. Corticofugal modulation of frequency processing in bat auditory system. Nature 387(6636):900–903.

Author notes

Marc Schönwiesner and Katrin Krumbholz contributed equally to this work.