Abstract

The conflict between vision and proprioception has been proposed to explain why healthy subjects perform worse than proprioceptively deafferented patients in conditions with optical displacement, e.g. novel mirror drawing. It is not known which brain processes depend upon the successful integration of visual and proprioceptive information and are therefore impaired when these modalities disagree. Using fMRI in healthy subjects, we compared brain activity across two conditions with similar visual and proprioceptive stimulation and similar task demands that differed only in the congruence of the movement shown by the two modalities. Subjects felt the passive movement of the right index finger on a rectangular field and watched a cursor moving on a computer screen. Cursor and finger locations either mapped onto each other (congruent condition) or did not (incongruent condition). Monitoring incongruent compared with congruent movement activated the premotor area bilaterally and the right temporoparietal junction. These brain areas have previously been associated with shifts of the attended location in visual space. These findings suggest an interaction between vision and proprioception in orienting to spatial locations.

Introduction

When movements are monitored in a mirror or a video display, the usual congruence between the visual and the proprioceptive hand spaces is disrupted. In such situations movement accuracy initially decreases, improving as a result of practice. It has been repeatedly observed that patients who lack proprioception outperform healthy controls when performance is tested during early stages of adaptation to a new visuomotor transformation (Lajoie et al., 1992; Vercher et al., 1996; Guedon et al., 1998; Fourneret et al., 2002). This intriguing finding has been explained by assuming that the visuo-proprioceptive conflict experienced by healthy people impairs performance and delays adaptation (Lajoie et al., 1992).

Although many studies have investigated the brain's response to visual and proprioceptive stimulation separately, little is known about the brain areas that integrate spatial information across modalities. Knowing which brain areas react to the spatial disagreement between visual and proprioceptive feedback may shed light on the brain processes that are affected by the visuo-proprioceptive conflict. The aim of this fMRI study was to identify brain areas that respond to conflicting as opposed to coordinated visual and proprioceptive movement. To this end we compared brain activity across two conditions that differed in the congruence of the movement trajectory conveyed by the visual and proprioceptive sensory channels. To dissociate hand movement from its visual image we used a computer with a finger-operated mouse and a visual display. Proprioceptive stimuli were delivered by passive movement, which stimulates receptors in the muscles, skin and joints and provides directional cues in the absence of a motor command (Jones et al., 2001).

Materials and Methods

Subjects

Eleven right-handed healthy volunteers (eight female, median age 27 years, range 22–31 years) gave written consent to participate in this study. The study was performed according to the Helsinki II Declaration and was approved by the local Ethics Committee (KF01-028/02).

Task

Throughout the experiment, the subjects watched the movement of a white square (visual angle = 3.5°) on a black LCD display (visual angle = 32 × 23°) and felt the passive movement of their right index finger on the rectangular field (size 30 × 24 mm) of a mouse-like device (Felix, Altra). The mouse was fixed in the horizontal plane on a support which rested on the subject's thigh. The subject's wrist and the other fingers were immobilized with Velcro straps on the mouse support, and the index finger was fixed with tape onto the mobile piece of the mouse. An experimenter (DB) stood beside the scanner and moved the mobile piece of the mouse, and hence also the subject's finger (Fig. 1A). The trajectory of finger movement was arbitrarily chosen and covered the entire rectangular field. In the congruent condition, the screen cursor displayed the actual movement of the finger. In the incongruent condition, the subjects saw a playback of cursor movement recorded during the congruent condition, so that finger and cursor locations were randomly associated. The playback solution was preferred to computer-generated motion in order to fully match motion speed and trajectory across conditions. To reduce the possibility that the subjects remembered the patterns of cursor movement across conditions, this playback was constructed for each presentation of the incongruent condition by choosing a cursor movement sequence at random from previous same-session congruent condition recordings and playing this recording in reverse (Fig. 1B). For instance, if the presentations of the congruent condition within one session are numbered from 1 to 8, then an example of the order of presentation in the incongruent condition could be 1r 2r 1r 3r 3r 5r 4r (‘r’ meaning ‘reversed’). The recorded sequence was padded with up to 1 s of computer-generated cursor movement that brought the cursor from the actual mouse position to the first cursor position in the playback, and from the last cursor position in the playback back to the actual mouse position. This was done in order to avoid a sudden cursor jump at the transition between conditions. The computer recorded finger and cursor location every 0.05 s.
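
As an illustration, the playback construction described above can be sketched as follows. This is a minimal Python sketch assuming that each congruent block's cursor trace is stored as an (N, 2) array of x, y samples taken every 0.05 s; the function and variable names are hypothetical and not taken from the original experimental software.

    # Hypothetical sketch of the incongruent playback construction.
    import random
    import numpy as np

    SAMPLE_DT = 0.05          # cursor/finger positions were logged every 0.05 s
    PAD_DURATION = 1.0        # up to 1 s of bridging cursor movement

    def bridge(start, end, duration=PAD_DURATION, dt=SAMPLE_DT):
        """Linearly interpolate cursor positions from `start` to `end`."""
        n = int(duration / dt)
        t = np.linspace(0.0, 1.0, n)[:, None]
        return (1.0 - t) * np.asarray(start) + t * np.asarray(end)

    def make_incongruent_playback(congruent_recordings, current_pos):
        """Pick a random earlier congruent trace, reverse it, and pad both ends
        so the cursor does not jump at the transition between conditions."""
        trace = random.choice(congruent_recordings)[::-1]      # play in reverse
        lead_in = bridge(current_pos, trace[0])                # glide into playback
        lead_out = bridge(trace[-1], current_pos)              # glide back out
        return np.vstack([lead_in, trace, lead_out])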

Figure 1.

Diagram of the experimental set-up and behavioral conditions. (A) The experimenter (Exp) stood beside the scanner moving the subject's right index finger on a mouse field. The subject (Subj) lay in the scanner facing an LCD screen and responded with the left hand by pressing the response box key. (B) In the congruent condition finger and cursor movement followed the same trajectory. In the incongruent condition, the movement of the cursor was played from a memory buffer recorded during a previous trial of the congruent condition. The arrowhead indicates the direction of movement.

To ensure that the subjects attended to both sensory modalities, they were instructed to press a key with the left thumb whenever they detected a transition between conditions.

Before the scanning session, the subjects were familiarized with both experimental conditions and practiced the detection task.

Data Acquisition

Gradient echo echoplanar images with a resolution of 3.75 × 3.75 × 3.75 mm were acquired on a 1.5 T Siemens Vision scanner (TR = 4.1 s, TE = 66 ms, flip angle = 90°). A total of 24 slices, covering a field of view of 89 mm, were positioned obliquely between the coronal and axial planes so as to optimally cover the parietal lobe, leaving out the anterior temporal and anteroinferior frontal lobes. High-resolution T1-weighted anatomical scans covering the entire brain were acquired at 1 × 1 × 1 mm resolution. Each subject completed two sessions of functional data acquisition, giving a total of 2 × 73 volumes. The first three images in each session were discarded to allow the magnetization to reach equilibrium. The two conditions were presented alternately, in blocks of 20 s. Each session consisted of 15 blocks, starting with the congruent condition. Because the sequences of cursor movement in the incongruent condition were constructed from recordings made during the congruent condition, it was not possible to randomize the order of condition presentation.
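
For illustration only, the session structure implied by these parameters can be written out as follows (a Python sketch; the timing constants are taken from the text above, everything else is an assumption).

    # Illustrative listing of the session structure: 73 volumes acquired per
    # session at TR = 4.1 s (first 3 discarded), with 15 alternating 20 s
    # condition blocks starting with the congruent condition.
    TR = 4.1                # seconds per volume
    N_VOLUMES = 73          # acquired per session
    N_DISCARD = 3           # dummy volumes dropped for magnetization equilibrium
    BLOCK_DURATION = 20.0   # seconds per condition block
    N_BLOCKS = 15

    kept_volumes = N_VOLUMES - N_DISCARD
    print(f"{kept_volumes} volumes retained per session")

    blocks = [("congruent" if i % 2 == 0 else "incongruent", i * BLOCK_DURATION)
              for i in range(N_BLOCKS)]
    for condition, onset in blocks:
        print(f"block onset {onset:6.1f} s  {condition}")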

Data Analysis

Using SPM99 (http://www.fil.ion.ucl.ac.uk/spm/spm99.html), data were realigned, co-registered to the subject's structural image, normalized to the space of the Montreal Neurological Institute (MNI) brain template and smoothed with a Gaussian filter of 8 mm FWHM (full-width at half-maximum). The difference between conditions was estimated on group data using the general linear model as implemented in SPM99 (random-effects model). The design matrix for each individual subject included one covariate of interest for the difference between conditions, modelled as a boxcar convolved with a standard haemodynamic response function, six nuisance covariates containing the realignment parameters and one nuisance covariate containing the left-hand key-press events modelled with the standard haemodynamic response function. Additionally, a high-pass filter with a cut-off period of 80 s, as implemented in SPM99, was applied to the BOLD signal in order to remove low-frequency noise. To better estimate the temporal autocorrelation in the fMRI data, the signal was temporally smoothed with a Gaussian filter of 4 s FWHM. The difference between conditions was estimated voxelwise for each individual subject and then tested across subjects using a one-sample t-test. Clusters of contiguous voxels (extent threshold = 0 voxels, height threshold P < 0.001 uncorrected) are reported at a significance threshold of P < 0.001 at the cluster level, corrected for multiple comparisons (Friston et al., 1996).
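
A schematic version of the single-subject model is sketched below. It assumes a standard difference-of-gammas haemodynamic response function and placeholder nuisance regressors, and is a simplified stand-in for the SPM99 machinery rather than the actual analysis code; all variable names are assumptions.

    # Simplified sketch of the single-subject design matrix described above.
    import numpy as np
    from scipy.stats import gamma

    def canonical_hrf(dt, duration=32.0):
        """Difference-of-gammas HRF sampled every `dt` seconds (SPM-like shape)."""
        t = np.arange(0.0, duration, dt)
        return gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0

    TR = 4.1
    n_scans = 70                                   # 73 acquired minus 3 discarded
    frame_times = np.arange(n_scans) * TR

    # Boxcar covariate of interest: 1 during incongruent blocks, 0 during congruent
    boxcar = ((frame_times // 20.0) % 2 == 1).astype(float)
    task_regressor = np.convolve(boxcar, canonical_hrf(TR))[:n_scans]

    motion_params = np.zeros((n_scans, 6))         # placeholder realignment parameters
    key_presses = np.zeros(n_scans)                # placeholder key-press regressor

    design_matrix = np.column_stack(
        [task_regressor, motion_params, key_presses, np.ones(n_scans)])

    def condition_effect(y, X=design_matrix):
        """Ordinary least-squares estimate of the condition effect for one subject;
        the group analysis is a one-sample t-test on this value across subjects."""
        return np.linalg.lstsq(X, y, rcond=None)[0][0]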

Results

Behavioral Data

The average finger speed, calculated post hoc, was 29.33 ± 3.57 mm/s (mean ± standard deviation) in the congruent condition and 28.51 ± 2.59 mm/s in the incongruent condition. There was no significant difference between conditions in average finger speed (paired t-test, P = 0.242) or cursor speed (P = 0.395). This confirms that finger and cursor movement were well matched across conditions.

Transitions to the incongruent condition were detected correctly in 79.31 ± 9.16% of cases, with a reaction time of 2749.73 ± 1214.38 ms. Transitions to the congruent condition were detected correctly in 81.16 ± 9.72% of cases, with a reaction time of 2250.36 ± 957.73 ms. The difference in reaction times between conditions was not significant (paired t-test, P = 0.123). The long reaction times and the high error rate support previous results showing limited awareness of the correspondence between visual and motor spaces (Fourneret and Jeannerod, 1998).
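
For reference, the paired comparisons above amount to the following test (a Python sketch with randomly generated placeholder values for the 11 subjects; the real per-subject data are not reproduced here).

    # Sketch of the paired behavioural comparison (illustrated for finger speed).
    import numpy as np
    from scipy.stats import ttest_rel

    rng = np.random.default_rng(0)
    speed_congruent = rng.normal(29.3, 3.6, size=11)    # mm/s, placeholder values
    speed_incongruent = rng.normal(28.5, 2.6, size=11)  # mm/s, placeholder values

    t_stat, p_value = ttest_rel(speed_congruent, speed_incongruent)
    print(f"paired t-test on finger speed: t = {t_stat:.2f}, P = {p_value:.3f}")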

fMRI Data

Compared with the congruent condition, the incongruent condition was associated with higher neural activity bilaterally in the premotor cortices/supplementary motor area and in the right temporoparietal junction (Fig. 2 and Table 1).

Figure 2.

Brain areas with higher activity in the incongruent compared with the congruent condition. The congruent condition was subtracted from the incongruent condition and the result is shown as a parametric map of the t-statistic. The map is thresholded at a corrected P-value < 0.001 at the cluster level (random-effects analysis). For the purpose of anatomical localization, the map is superimposed on a single-subject T1-weighted image co-registered with the stereotactic space of the MNI template. The z-value for each transverse slice is the stereotactic coordinate in the vertical direction in MNI space. The left side of the brain is shown to the left.

Table 1

Clusters of voxels showing significant activity increase in the incongruent compared with the congruent condition

Anatomical location of the cluster      No. of voxels   Coordinates at peak activation (x y z)   z-Value at peak activation   Cluster-level P-value (corrected)
L precentral gyrus/frontal lobe         304             −38 0 40                                  4.91                         <0.001
R temporoparietal junction              369             48 −42 8                                  4.66                         <0.001
R precentral gyrus/frontal lobe + SMA   1170            52 14 38                                  4.07                         <0.001

L, left; R, right; SMA, supplementary motor area; (x, y, z), lateral, sagittal and vertical coordinates in the stereotactic space of the MNI brain template.

Discussion

This experiment identified brain areas that responded to the spatial disagreement between visual and proprioceptive feedback by comparing the brain's haemodynamic response across two conditions. These conditions differed by the congruence of the movement trajectory conveyed by the visual and proprioceptive sensory channels. The subjects were exposed to similar sensory stimuli and performed the same task across conditions. The main finding of this experiment was the higher level of activity in the right temporoparietal junction and the premotor cortices in the incongruent condition compared with the congruent condition. Previous studies have demonstrated increases in brain activity in these areas when subjects covertly shift their attention or overtly move their eyes from one location to another (Corbetta et al., 1998). Right temporoparietal and sometimes premotor lesions are associated with visuospatial neglect (Vallar, 2001), a disorder in which deficits in orienting attention in space are common (Bartolomeo and Chokron, 2002). The right temporoparietal junction is also active when subjects detect visual targets at unexpected locations (Corbetta et al., 2000). The increased activity in the right temporoparietal junction in the incongruent condition in the present study may thus reflect spatial disorientation when the visual and proprioceptive feedback point towards incompatible locations.

Besides responding during shifts in spatial attention, the right temporoparietal junction also reacts to stimulus novelty, regardless of location (Downar et al., 2000; Corbetta and Shulman, 2002). It is, however, unlikely that the differential activation in the right temporoparietal junction found in the present study is related to the novelty of the incongruent condition. First, the subjects were exposed to both conditions during the practice trials and were thus familiar with the incongruent visuo-proprioceptive stimulation at the beginning of the scanning session. Second, if novelty were driving the effect, the difference between conditions would have been expected to decrease over time as the subjects became accustomed to the incongruent condition. To identify brain areas where the difference between the incongruent and the congruent condition decreased over time, we repeated the statistical analysis after inserting a covariate that modelled a linear time-by-condition interaction. This comparison did not yield any statistically significant voxels, even when the threshold was lowered to P = 0.01 (uncorrected) and the search volume was limited to the right temporoparietal cluster.
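
The interaction covariate can be illustrated schematically as follows (a Python sketch reusing the block timing from the analysis sketch above; the names are assumptions, not the actual SPM99 covariate).

    # Sketch of a linear time-by-condition interaction regressor: a condition
    # effect whose amplitude changes linearly over the session.
    import numpy as np

    TR = 4.1
    n_scans = 70
    frame_times = np.arange(n_scans) * TR

    boxcar = ((frame_times // 20.0) % 2 == 1).astype(float)   # 1 = incongruent block
    linear_time = (frame_times - frame_times.mean()) / frame_times.std()

    interaction = boxcar * linear_time
    # In the actual model this regressor would be convolved with the HRF and
    # entered into the design matrix alongside the main condition covariate.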

No temporoparietal junction activation was identified in a previous PET experiment where a condition with visuo-proprioceptive mismatch induced by wearing prism goggles was compared with a control condition with no sensory-sensory conflict (Clower et al., 1996). The disagreement between the results of these apparently very similar studies could be explained by a putative activation of the right temporoparietal junction during the control condition of the previous experiment. In this control condition a visual target jumped to an unexpected spatial location while the subjects were performing target-directed movements. Because the right temporoparietal junction is activated when subjects detect stimuli at unpredictable locations in the visual field (Corbetta et al., 2000), the control condition may have recruited this brain area.

In the present study, we used passive rather than active movement in order to separate peripheral proprioceptive inflow from central signals related to action planning and initiation (Sperry, 1950; Von Holst and Mittelstaedt, 1950). Thus, the difference in brain activity between the incongruent and the congruent condition reflects specifically the sensory–sensory mismatch. During active movements performed with incongruent visual feedback the inferior parietal lobe of the right hemisphere and the left premotor area are activated (Farrer and Frith, 2002). Our findings suggest that the presence of a visuo-proprioceptive conflict is sufficient for activating these brain regions.

In this study a qualitative paradigm was used in order to optimize the sensitivity for detecting a change in the fMRI signal across conditions. A further experiment with a parametric paradigm that systematically varies the angle between the direction of finger movement and the direction of cursor movement could be used to test whether the activity of the right temporoparietal junction correlates with the degree of spatial incongruence between the visual and proprioceptive input.
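
Such a parametric manipulation could, for example, rotate the cursor's motion vector relative to the finger's motion by a controlled angle (a hypothetical Python sketch, not part of the original set-up).

    # Sketch of a parametric visuo-proprioceptive incongruence manipulation:
    # the cursor velocity is the finger velocity rotated by a chosen angle,
    # so 0 degrees reproduces the congruent mapping.
    import numpy as np

    def rotate_velocity(finger_velocity, angle_deg):
        """Return the cursor velocity obtained by rotating a 2D finger velocity."""
        theta = np.deg2rad(angle_deg)
        rotation = np.array([[np.cos(theta), -np.sin(theta)],
                             [np.sin(theta),  np.cos(theta)]])
        return rotation @ np.asarray(finger_velocity)

    # Example: a 45 degree rotation of a rightward finger movement
    print(rotate_velocity([1.0, 0.0], 45.0))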

The limited size of the visual field and the limited neural capacity for information processing restrict the number of spatial locations from which sensory stimuli can be sampled and analyzed. There is now a large body of evidence for perceptual connections between the visual space and other unimodal sensory spaces such as the auditory (Spence and Driver, 1997; McDonald et al., 2000) or the tactile (di Pellegrino et al., 1997; Mattingley et al., 1997; Spence et al., 2000) space. This interplay between modalities is believed to facilitate an orienting response that increases the efficiency with which incoming sensory stimuli are processed (Driver and Spence, 1998a,b). The question whether a similar attentional link exists between the proprioceptive hand space and the visual space where hand movements are monitored has, to our knowledge, not yet been addressed. The present results suggest that such a connection may exist.

We thank Sussi Larsen and Torben Lund for assistance with the fMRI set-up, Ivan Toni and Christian Gerlach for comments on study design and results, and Matthew G. Liptrot for proof-reading. Funded by Rigshospitalet and The Foundation for Research in Neurology (D.B.) and the EU project MAPAWAMO (F.Å.N.).

References

Bartolomeo P, Chokron S (2002) Orienting of attention in left unilateral neglect. Neurosci Biobehav Rev 26:217–234.
Clower DM, Hoffman JM, Votaw JR, Faber TL, Woods RP, Alexander GE (1996) Role of posterior parietal cortex in the recalibration of visually guided reaching. Nature 383:618–621.
Corbetta M, Shulman GL (2002) Control of goal-directed and stimulus-driven attention in the brain. Nat Rev Neurosci 3:201–215.
Corbetta M, Akbudak E, Conturo TE, Snyder AZ, Ollinger JM, Drury HA, Linenweber MR, Petersen SE, Raichle ME, Van Essen DC, Shulman GL (1998) A common network of functional areas for attention and eye movements. Neuron 21:761–773.
Corbetta M, Kincade JM, Ollinger JM, McAvoy MP, Shulman GL (2000) Voluntary orienting is dissociated from target detection in human posterior parietal cortex. Nat Neurosci 3:292–297.
di Pellegrino G, Ladavas E, Farne A (1997) Seeing where your hands are. Nature 388:730.
Downar J, Crawley AP, Mikulis DJ, Davis KD (2000) A multimodal cortical network for the detection of changes in the sensory environment. Nat Neurosci 3:277–283.
Driver J, Spence C (1998a) Crossmodal attention. Curr Opin Neurobiol 8:245–253.
Driver J, Spence C (1998b) Cross-modal links in spatial attention. Philos Trans R Soc Lond B Biol Sci 353:1319–1331.
Farrer C, Frith CD (2002) Experiencing oneself vs another person as being the cause of an action: the neural correlates of the experience of agency. Neuroimage 15:596–603.
Fourneret P, Jeannerod M (1998) Limited conscious monitoring of motor performance in normal subjects. Neuropsychologia 36:1133–1140.
Fourneret P, Paillard J, Lamarre Y, Cole J, Jeannerod M (2002) Lack of conscious recognition of one's own actions in a haptically deafferented patient. Neuroreport 13:541–547.
Friston KJ, Holmes A, Poline JB, Price CJ, Frith CD (1996) Detecting activations in PET and fMRI: levels of inference and power. Neuroimage 4:223–235.
Guedon O, Gauthier G, Cole J, Vercher J-L, Blouin J (1998) Adaptation in visuomanual tracking depends on intact proprioception. J Motor Behav 30:234–248.
Jones KE, Wessberg J, Vallbo AB (2001) Directional tuning of human forearm muscle afferents during voluntary wrist movements. J Physiol 536:635–647.
Lajoie Y, Paillard J, Teasdale N, Bard C, Fleury M, Forget R, Lamarre Y (1992) Mirror drawing in a deafferented patient and normal subjects: visuoproprioceptive conflict. Neurology 42:1104–1106.
McDonald JJ, Teder-Salejarvi WA, Hillyard SA (2000) Involuntary orienting to sound improves visual perception. Nature 407:906–908.
Mattingley JB, Driver J, Beschin N, Robertson IH (1997) Attentional competition between modalities: extinction between touch and vision after right hemisphere damage. Neuropsychologia 35:867–880.
Spence C, Driver J (1997) Audiovisual links in exogenous covert spatial orienting. Percept Psychophys 59:1–22.
Spence C, Pavani F, Driver J (2000) Crossmodal links between vision and touch in covert endogenous spatial attention. J Exp Psychol Hum Percept Perform 26:1298–1319.
Sperry RW (1950) Neural basis of spontaneous optokinetic responses produced by visual inversion. J Comp Physiol Psychol 43:482–489.
Vallar G (2001) Extrapersonal visual unilateral spatial neglect and its neuroanatomy. Neuroimage 14:S52–S58.
Vercher JL, Gauthier GM, Guedon O, Blouin J, Cole J, Lamarre Y (1996) Self-moved target eye tracking in control and deafferented subjects: roles of arm motor command and proprioception in arm–eye coordination. J Neurophysiol 76:1133–1144.
Von Holst E, Mittelstaedt H (1950) Das Reafferenzprinzip. Naturwissenschaften 37:464–476.

Author notes

1Neurobiology Research Unit, N9201, Copenhagen University Hospital, Rigshospitalet, Copenhagen, Denmark, 2Danish Research Centre for Magnetic Resonance, Hvidovre Hospital, Hvidovre, Denmark, 3Informatics and Mathematical Modelling, Technical University of Denmark, Lyngby, Denmark and 4Department of Clinical Physiology and Nuclear Medicine, Copenhagen University Hospital, Rigshospitalet, Copenhagen, Denmark