Abstract

Electrophysiological and brain imaging studies have shown that different populations of neurons contribute to perceptual decision making. Perceptual judgment is a complicated process comprising several subprocesses, including the final step of a discrete choice among available possibilities. Using the psychophysical paradigm of difference scaling combined with functional magnetic resonance imaging, we identify an area within a distributed representation that is consistently invoked in perceptual decisions. Difference judgments based on visual (color, form, and motion) cues and auditory cues show that a population of neurons in the posterior banks of the intraparietal sulcus (PIPS) is consistently activated for perceptual judgment across visual attributes and sensory modalities, suggesting that neurons in PIPS are associated with perceptual judgment.

Introduction

Decisions are made all the time in daily life, and how perceptual decisions are correlated with neuronal activity is of central interest to neuroscientists. Single-cell studies have shown that perceptual judgment involves neurons in different areas of the brain, including the middle temporal (MT), lateral intraparietal (LIP), amygdala, and prefrontal areas (Newsome et al. 1989; Kim and Shadlen 1999; Platt and Glimcher 1999; Shadlen and Newsome 2001; Bechara et al. 2003). In most single-unit studies, neurons in a particular area of the brain are examined for firing rates that correlate with the animal's choice behavior in decision making. Functional magnetic resonance imaging (fMRI) studies have likewise shown that particular areas are involved in particular perceptual judgment tasks (Heekeren et al. 2004; Walton et al. 2004; Hsu et al. 2005). For example, Heekeren et al. (2004) showed that perceptual judgment in a recognition task, in which subjects were required to identify a face or a house, was correlated with changes in activation in the prefrontal cortex. Thus, it seems that different areas of the brain are involved in different perceptual judgment tasks.

It must be noted, however, that there has not been much focus on the task itself in investigations of perceptual judgment. In most studies, one particular perceptual judgment task was used, but given that decision making is a complicated, multifaceted process, it is important to consider the different processes involved in perceptual judgment. For example, suppose that we make judgments on a red moving bar and a green moving bar. Depending on which attribute of the bars is compared, perceptual judgment involves different processes. If we compare which bar moves faster, we use a motion mechanism, whereas if we compare which is redder, we use a color mechanism, in addition to the mechanisms associated with perceptual judgment itself. In this study, we devised an experimental design that allowed us to remove possible subprocesses one by one through a process of elimination.

The rationale behind such a study is that there is a common area of the brain that is involved in all perceptual judgment tasks, because various decision tasks share the same subprocess of perceptual decision, regardless of the kinds of tasks and stimuli. What is common to all decisions is their final step, a simple discrete choice between available possibilities. For example, when we buy a car, we consider various things, including price, durability, etc. (subprocesses), but whatever the subprocesses are, the final step is the choice of one car over the others. There has rarely been an attempt to consider these subprocesses in the study of perceptual judgment. Thus, we made an effort to minimize possible subprocesses other than the final subprocess of choice. There has also been little effort, in fMRI studies of perceptual judgment, to employ the same stimuli in tasks involving different visual attributes, for example, color and motion. Testing across different visual attributes is important because neuronal resources common to those tasks may stand out across attributes.

In this study, we used the psychophysical task of difference judgment (Schneider 1980; Maloney and Yang 2003) for perceptual decision. As shown in the Methods, because 2 differences are compared in this task, the task is more demanding than simple discrimination. The task is also useful for attributes of vision for which simple discrimination is difficult in psychophysical experiments, for example, comparing green and red. In such a case, one solution is to introduce a third stimulus, such as a gray, to serve as a criterion and to ask subjects to decide which stimulus is closer to gray (Larimer et al. 1974). Difference judgment allowed us to bypass such a hurdle. Moreover, we compared the results obtained with this task with those obtained with a simple discrimination task. With these efforts, we found an area of the brain that is consistently activated in perceptual judgment tasks across sensory modalities.

Methods

Subjects

Six healthy volunteer subjects (3 females and 3 males), ages 27–51 years (5 of them under 30), participated in the experiments. Three were familiar with fMRI experiments, but none of them was familiar with the experimental hypothesis. Subjects had no past history of brain abnormality, either functional or anatomical. Informed consent was obtained according to the procedures approved by SUNY Upstate Medical University Institutional Review Board.

Stimuli

Example stimuli are shown in Figures 2, 4, and 5. For all the experiments, 9 values of a stimulus were chosen for each of the attributes (color, form, orientation, motion, and auditory duration). For color, 9 isoluminant and equidistant colors were chosen from each of the 2 axes of color space, L–M and S (Derrington et al. 1984). Although we used 2 different axes of color space, the activation results for the 2 axes did not differ, as shown in Figure 2 (2 columns); hence, only data on the L–M axis are shown in the subsequent figures. For all the experiments, the stimuli were colored. For form, 9 colored rectangles varied in width from 0.7 to 1.5 degrees of visual angle. For orientation, 9 colored bars had orientations between 0 and 180 degrees. For apparent motion, a rectangle was presented for 200 ms, and as soon as it disappeared, another was presented on the opposite side of the fixation point and lasted for 200 ms. This created apparent motion, which was confirmed by subjects' verbal reports. The temporal interval between the first and second rectangles, on which the speed of the apparent motion depended, varied from trial to trial, ranging from 100 to 500 ms. For tones, square-wave tones at 1.2 kHz were generated with 9 durations ranging from 100 to 500 ms. The 9 values of each attribute were equally spaced from the initial to the final value. For example, 9 orientation angles were equally spaced between 0 and 180 degrees, and 9 tone durations were equally spaced between 100 and 500 ms. Out of the 9 values chosen for each attribute, 120 sets of 4 values (quadruples) were selected with replacement. For example, with 9 values of color (e.g., numbers 1 through 9 representing each color between red and green), 4 values (e.g., 7, 9, 2, 6) were picked, forming the first quadruple. Each quadruple was used in one trial, and during each trial, the quadruple's 2 pairs were presented.
For example, a color pair (7, 9) was presented for 800 ms, followed by the other pair of the same quadruple (2, 6). The same process of selecting a quadruple from the 9 values was repeated until 120 quadruples were obtained. A quadruple can be a nested case; in the quadruple (4, 9, 3, 5), for example, the second pair (3, 5) overlaps the range of the first pair (4, 9). Such nested quadruples were discarded because previous experiments showed that such cases were harder to judge (Maloney and Yang 2003). From the remaining set, 36 quadruples were randomly selected for the experiments. The order within each quadruple was pseudorandomized. Each quadruple was used in one trial, one pair for the first interval and the other pair for the second interval.
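
The stimulus-selection procedure above can be sketched in code (a minimal illustration, not the authors' actual scripts; the function names are ours, "nested" is interpreted as overlapping pair ranges per the (4, 9, 3, 5) example, and the 4 values within a quadruple are assumed distinct):

```python
import random

def make_levels(lo, hi, n=9):
    """n equally spaced stimulus values from lo to hi
    (e.g., tone durations from 100 to 500 ms)."""
    step = (hi - lo) / (n - 1)
    return [lo + i * step for i in range(n)]

def is_nested(quad):
    """True if the ranges of the 2 pairs overlap, as in (4, 9, 3, 5);
    such quadruples were discarded as harder to judge."""
    (a, b), (c, d) = sorted(quad[:2]), sorted(quad[2:])
    return not (b < c or d < a)

def draw_quadruples(levels, n_draws=120, n_keep=36, seed=0):
    """Draw n_draws random quadruples, discard nested ones, and
    randomly keep up to n_keep for the experiment."""
    rng = random.Random(seed)
    quads = [tuple(rng.sample(levels, 4)) for _ in range(n_draws)]
    usable = [q for q in quads if not is_nested(q)]
    rng.shuffle(usable)
    return usable[:n_keep]
```

With the 9 tone durations, for example, `draw_quadruples(make_levels(100, 500))` yields non-nested quadruples, each supplying one pair per interval of a trial.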

Subjects lay inside the scanner and viewed the stimuli presented on a screen installed near the bore entrance. Only 12 × 40 degrees of visual angle were available to the subject's visual field. This was sufficient for the main experiments because the stimuli occupied only a small part of the visual field. For the retinotopy experiments, the stimuli were much larger (35 × 35 degrees of visual angle) and thus were presented using a back-projection system in which a 40 × 40 degree visual field was available. The projection screen, illuminated by a NEC MT 1020 projector, was calibrated for color display, and the mean luminance of the colored rectangles was set to 18 cd/m². All the stimuli were presented against a black background whose look-up table values were set to zero. The measured luminance of the background was 0.023 cd/m².

Experimental Task

A trial of the stimulus configuration is shown in the second row of Figure 1. A set of 2 values of a visual attribute (A, B; e.g., 2 patches with different colors) was presented for 800 ms, and 200 ms later, another set (C, D) was presented, followed by a 700-ms window in which subjects either made a decision (judgment block) or kept fixating (passive viewing block). The subject's task was to press a button to indicate in which interval the difference was greater in color, form, orientation, motion, or tone duration. For form, the difference judgment was made on the area of the rectangles, and because the top and bottom of the rectangles were fixed, subjects were able to do the task simply by comparing the widths of the rectangles. For motion, subjects judged the speed differences of the apparent motion of the rectangles, and it is possible that subjects simply compared the temporal differences between the left and right rectangles. However, this seemed unlikely, given subjects' verbal reports and the data showing activation in the motion area MT/MST (medial superior temporal). In the 3 control experiments (motor, working memory, and task specificity), subjects were asked not to press a button but to subvocalize their decisions (first or second) on which interval contained the greater difference. Note that the 4 values (e.g., A, B, C, and D in Fig. 1) in each quadruple were randomly ordered. Thus, whether the difference comparison was easy or not was randomized throughout the stimulus presentations. This manipulation was designed to isolate the effects of the experimental task independent of whether the task was easy or difficult.

Figure 1.

Experimental design. Two different sessions for judgment (top) and passive viewing (bottom) are shown. In each session, one block consists of judgment or passive viewing lasting for 6 trials (one trial = 1 TR), followed by a blank presentation, and the block was repeated 6 times in each session. The stimulus configuration in each trial is shown in the middle row. Two values (A, B; e.g., different colored patches) of a visual attribute (e.g., color) were presented for 800 ms while subjects were fixating, and 200 ms later, another set of 2 values (C, D) was presented, and subjects made a judgment or just fixated (passive viewing) during the last 700-ms window. The 2 sessions were randomly presented in all the experiments.

Experimental Design

The fMRI experimental design is shown in Figure 1. There were 2 sessions for each experiment. In the judgment session (top row), subjects did 6 trials of the judgment task, followed by a blank presentation for 10 s. This judgment–blank block was repeated 6 times in the session. The procedure was the same in the passive viewing session (bottom row), except that subjects did not do the task but kept fixating in each trial. Each trial lasted for 2.5 s (1 time of repetition [TR], the time between successive scans), except in the motion and audition experiments, in which each trial lasted for 2 TRs (Fig. 5B,C, top). The 2 sessions were presented randomly for all 5 experiments, which were also randomly interleaved. Note that in the auditory task, a fixation point was also presented on the monitor while subjects performed the task.

In the above experimental design, task and blank conditions were compared in one session and passive viewing and blank conditions were compared in another session. Percent changes in blood oxygenation level–dependent (BOLD) response were calculated within each session and compared across sessions. Such a design was an attempt to prevent the performance in the task condition from influencing the performance in the passive viewing condition. Because this design might lead to statistical artifacts, we used another experimental design in which task, passive viewing, and blank conditions were run in the same session for 2 visual attributes (color and form) with 2 subjects. When task and passive viewing conditions were compared against the blank condition, the activations for the 2 conditions were similar to those shown in Figure 2. Thus, all the experiments were carried out with the initial experimental design shown in Figure 1.
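
The within-session comparison described above amounts to computing percent signal change against the blank baseline of the same session. A minimal sketch, assuming the standard definition of percent BOLD change (the text does not spell out the formula, and the function name is ours):

```python
def percent_bold_change(condition_ts, blank_ts):
    """Percent BOLD change of a condition (task or passive viewing)
    relative to the blank baseline of the same session."""
    baseline = sum(blank_ts) / len(blank_ts)
    condition_mean = sum(condition_ts) / len(condition_ts)
    return 100.0 * (condition_mean - baseline) / baseline
```

Computed this way within each session, the task and passive viewing values can then be compared across sessions, as in the design above.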

Figure 2.

Perceptual judgment for color and form for a subject. (A) Experimental task for the color and form experiments. A pair of colored rectangles in Interval 1 was followed 200 ms later by another pair in Interval 2. Subjects pressed a button to indicate their decisions (which difference is greater in color [or in form]?) during the last 700 ms of each trial. (B) Retinotopic responses during passive viewing of the stimuli. (C) Activations for the color judgment task. (D) Activations for the form judgment task, in which subjects made judgments on the area of the rectangles. The 2 columns in (B, C, and D) indicate the 2 axes (L–M and S) of the DKL color space (Derrington et al. 1984). Plot C is expanded in (C′), in which cortical areas are labeled and activated areas are identified. Relevant sulci are shown as solid lines: the parietal/occipital sulcus (POS) and the transverse occipital sulcus (TOS). The solid circle is the putative LIP area that Sereno et al. (2001) found using a saccadic eye movement task. The center of this circle is set to their locus of "saccadotopic" LIP, x = −24, y = −65, and z = 53 in Talairach coordinates, for reference. The color bar shown in the bottom right of (C′) indicates the probability levels at which the activations are due to chance, and it applies to all figures.

Behavioral Data Collection and Analysis

Each fMRI experiment had 36 trials. The same experiments were repeated for 126 trials of behavioral data collection while the subject was still inside the scanner, as soon as fMRI data collection was over. We calculated how accurate subjects' responses were with respect to the physical values of the stimuli. Note, however, that the purpose of the difference scaling task was not to measure how accurate subjects' responses were, or to correlate accuracy with fMRI response, but to engage the subjects in the given tasks. Moreover, for one visual attribute (color), differences in physical values are not well defined and vary from one color space to another, not to mention their perceptual counterparts. Thus, instead of correlating the accuracy of the perceptual response with the fMRI response, these additional behavioral data were subjected to maximum likelihood estimation (Maloney and Yang 2003) to find a perceptual scale corresponding to the physical scale of each attribute.
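
The maximum likelihood procedure of Maloney and Yang (2003) fits perceptual scale values ψ by assuming that the observer compares the 2 perceptual differences in a trial with additive Gaussian noise. A sketch of the likelihood computation (our simplified formulation; the actual fitting also anchors the scale endpoints and optimizes ψ and σ numerically):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def neg_log_likelihood(psi, sigma, trials):
    """psi: candidate perceptual values for the stimulus levels.
    trials: list of ((a, b, c, d), resp), where a..d index stimulus
    levels and resp = 1 if the observer judged the second pair's
    difference greater."""
    nll = 0.0
    for (a, b, c, d), resp in trials:
        delta = abs(psi[d] - psi[c]) - abs(psi[b] - psi[a])
        p = norm_cdf(delta / sigma)        # P(respond "second")
        p = min(max(p, 1e-12), 1 - 1e-12)  # guard against log(0)
        nll -= math.log(p if resp == 1 else 1.0 - p)
    return nll
```

Minimizing this quantity over ψ (with, e.g., the first and last values fixed at 0 and 1) yields the estimated perceptual scale plotted against the physical scale in Figure 3.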

fMRI Data Collection and Analysis

MRI data were collected on a 1.5-T Philips Intera System, running release 11 software. A parallel-element head coil (using 6 of 8 channels) was used with SENSE reconstruction and an acceleration factor of 2. Functional data were obtained at 4-mm isotropic resolution using a gradient echo planar imaging sequence with a TR of 2500 ms and an echo time (TE, the time for signal decay) of 60 ms. High-resolution anatomical images were obtained using a T1-weighted 3D fast field echo sequence. Anatomical brains were reconstructed and flattened using Freesurfer software (http://surfer.nmr.mgh.harvard.edu). The whole brain was scanned during all the experiments, but the occipito–parietal–temporal regions of interest (ROIs) were flattened and the functional data were overlaid. The data analysis was carried out for the entire inflated brain, but the left and right hemispheres showed similar activation patterns for the ROIs; thus, only data from the left hemisphere are overlaid onto the flattened brain.

In the retinotopy experiments, rings (expanding or contracting) and wedges (rotating clockwise or counterclockwise) were presented while subjects were fixating. One stimulus cycle had a ring or a wedge presented in 4 different phases. Eight stimulus cycles were run in each session, and 4 sessions were conducted to obtain reliable retinotopy data. The phase-locked data were analyzed and averaged using a retinotopy-enabled version of Freesurfer to identify different visual areas. Because individual brains differ, ROIs were explicitly defined by the following criteria. V1, V2, V3, VP, and V4/V8 were identified using phase-encoded retinal stimulation (Sereno et al. 1995). V3A/V7 was identified for the areas dorsal and anterior to V3. PIPS was defined as the posterior banks of the intraparietal sulcus, close to V7. Those areas were delineated and shown with dotted lines on the flattened cortex throughout the figures (Figs 2, 4, and 5). Activations were first identified in the functional brain using FSL (FMRIB Software Library, Smith et al. 2004), and corresponding Talairach coordinates were identified in the same subjects' anatomical brains that were flattened via Freesurfer (http://surfer.nmr.mgh.harvard.edu) to find ROIs.

For BOLD measurement, we used a TR of 2.5 s and a TE of 60 ms. Each functional scan had 23 slices, each covering a 220 × 220 mm² field of view, with voxels of 4 × 4 × 4 mm³ and a 1-mm gap. Each subject had 5 experimental conditions (5 attributes) and 2 sessions for each condition, in addition to 3 control conditions. Thus, each subject had 13 data sets. These functional data were analyzed using FSL (Smith et al. 2004). For initial data processing, the data were corrected for head motion. The mean intensity of the voxels was globally normalized, and the data were subjected to temporal and spatial smoothing via Gaussian filters with space constants of 100 and 5, respectively. For all subjects, functional brains were aligned to anatomical brains, which were registered to the Montreal Neurological Institute (MNI) average brain (Brett et al. 2002).
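
As an illustration of the smoothing step, a 1-D Gaussian filter applied to a single voxel time series might look as follows (a generic sketch; FSL's actual preprocessing and its parameterization of the temporal and spatial filters differ in detail, and the function names are ours):

```python
import math

def gaussian_kernel(sigma_s, tr=2.5, half_width=3):
    """Normalized Gaussian kernel sampled every TR seconds;
    sigma_s is the filter width in seconds."""
    taps = range(-half_width, half_width + 1)
    w = [math.exp(-0.5 * (k * tr / sigma_s) ** 2) for k in taps]
    total = sum(w)
    return [wi / total for wi in w]

def smooth_timeseries(ts, kernel):
    """Convolve a time series with a normalized kernel, clamping
    indices at the edges (one simple boundary choice)."""
    half = len(kernel) // 2
    out = []
    for i in range(len(ts)):
        acc = 0.0
        for j, kj in enumerate(kernel):
            idx = min(max(i + j - half, 0), len(ts) - 1)
            acc += kj * ts[idx]
        out.append(acc)
    return out
```

Because the kernel is normalized, a constant baseline passes through unchanged while trial-to-trial noise is attenuated.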

Each subject's set of functional data was fitted with a univariate general linear model as implemented in FSL. The model has 2 terms: one is a function that mimicked the design of the experiment, in which each subject either made a decision or just fixated, and the other is an error term. To enhance the fitting, smoothing was carried out using a gamma function with a standard deviation of 3. Cluster-based thresholding was done at P = 0.01. The rendered data were transformed into a Freesurfer-readable format. Then, the functional data were registered to the anatomical data to produce a transformation matrix, which was used to overlay the functional data onto the anatomical data. For the overlaid images, ROIs (PIPS, V3A/V7, V4/V8, MT, and Brodmann area 41 [BA41]) were drawn using a Freesurfer tool, and their amplitudes were obtained for each subject and experiment.
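
The model described above can be sketched as an ordinary least-squares fit of a voxel's BOLD time series to a task regressor, built by convolving the block design with a gamma-shaped hemodynamic response. This is a generic illustration, not FSL's implementation; the HRF shape and scale parameters below are illustrative assumptions, not the paper's values:

```python
import math

def gamma_hrf(n, tr=2.5, shape=6.0, scale=0.9):
    """Gamma-function HRF sampled every TR seconds (illustrative
    shape/scale values, assumed for this sketch)."""
    return [((i * tr) ** (shape - 1)) * math.exp(-(i * tr) / scale)
            / (math.gamma(shape) * scale ** shape) for i in range(n)]

def convolve(signal, kernel):
    """Causal discrete convolution, truncated to len(signal)."""
    out = [0.0] * len(signal)
    for i in range(len(signal)):
        for j in range(min(len(kernel), i + 1)):
            out[i] += kernel[j] * signal[i - j]
    return out

def fit_beta(y, x):
    """Least-squares slope of y = beta * x + intercept + error."""
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx
```

For a design like that of Figure 1 (6 task TRs followed by 4 blank TRs, repeated 6 times), the regressor is `convolve(([1.0] * 6 + [0.0] * 4) * 6, gamma_hrf(12))`, and `fit_beta` recovers the activation amplitude for that voxel.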

After the initial analysis of individual subjects, the data were averaged across subjects, using Freesurfer for anatomical data and FSL for functional data, for each experiment (color, form, orientation, motion, and audition). For the group analysis of functional data, mixed-effects analysis was used to account for both fixed effects due to within-session variance and random effects due to across-subject variance. To average the anatomical data, we used the average of the subjects' brains rather than the MNI average brain. The occipital/parietal/temporal part of this averaged brain was flattened using Freesurfer. The average functional activation across subjects was overlaid onto the flattened image using Freesurfer.

Results

For perceptual decision making, we employed the task of difference judgment, a method that is well established in psychophysics (Schneider 1980; Maloney and Yang 2003). In this task, 2 (suprathreshold) values of an attribute are presented in Interval 1, followed by another pair in Interval 2. An example of the stimuli for the difference judgment task is shown in Figure 2A. The task is to press a button to indicate in which interval the magnitude of the difference in the attribute (e.g., color) was greater.

The behavioral responses were veridical in 66–83% of the trials across subjects in the experiments reported here. Note that those rates were not correlated with BOLD activation, for reasons elaborated in the Methods. Instead, the behavioral data were analyzed using maximum likelihood estimation and are shown in Figure 3. The horizontal axis indicates normalized physical values, and the vertical axis shows the corresponding perceptual values. Nine physical values were used in difference scaling, and subjects' responses were subjected to maximum likelihood estimation (see Maloney and Yang 2003) to obtain estimated perceptual values. Rows indicate 2 subjects, and columns show the visual attributes, color and area. If the dots fall on the 45-degree (dotted) line, subjects' perceptual values are veridical. The behavioral data of the 2 subjects suggest that there is more variability in the difference scaling for area than for color and that subjects' perceptual scales for area are closer to veridical than those for color.

Figure 3.

Behavioral data. Rows show 2 representative subjects, and columns indicate 2 visual attributes, color and area. The horizontal axis is physical scale, and the vertical axis is perceptual scale. Subjects did the psychophysical task of difference scaling with one set given in each trial, chosen randomly out of 120 quadruples. Subjects' responses were subjected to maximum likelihood estimation, which led to estimated perceptual values. The dotted line shows hypothetical veridical responses. Error bars indicate 1 standard error of the means.

The stimuli were relatively small (subtending 2 degrees of visual angle), and passive viewing led to a focal, retinotopically appropriate activation in the parafoveal regions (Fig. 2B), shown on a flattened left occipital/parietal cortex (Figs 2, 4, and 5 show data for one subject; Fig. 6 reports group data averaged across 6 subjects). However, when color difference judgment was required with the same stimuli, 3 distinct ROIs were additionally activated (Fig. 2C,C′): the V4/V8 complex along the ventral pathway and the V3A/V7 complex and the PIPS along the dorsal pathway. These results demonstrate that the V4/V8 areas were involved in processing color, consistent with previous studies (Zeki 1973; Hadjikhani et al. 1998). Note also that V4/V8 neurons actively participate in color processing: they showed little response when the subjects passively viewed the color stimuli but responded strongly when the subjects were engaged in a task involving the color stimuli. This is consistent with the idea that V4/V8 neurons show attention- or task-related responses (Motter 1994; Connor et al. 1997).

Figure 4.

Control experiments for motor, memory, and simple discrimination. The top row illustrates the control tasks in each trial, whereas the bottom row shows the corresponding data. (A, D) Task and data when subjects did not press the button but subvocalized their color difference judgments. Most of the activations along the IPS disappeared (pink circle), compared with Figure 2C, but the posterior banks (PIPS) remained unchanged. (B) Working memory control task in which a quadruple of colors was shown simultaneously. (C) Simple discrimination task in which only one colored rectangle was shown in each interval. (E, F) Data for the working memory and simple discrimination experiments, which show activation patterns similar to those in Figure 2C,D.

Figure 5.

Perceptual judgment for orientation, motion, and audition. (A, B, and C) illustrate the tasks in each trial, and (D, E, and F) show the corresponding data. (D) Responses during passive viewing and activations for judgments of orientation differences. Note that in addition to the 3 areas V4/V8, V3A/V7, and PIPS, 2 distinct areas along the ventral pathway, including the lateral occipital (LO) areas, were also activated for orientation judgment. (E) Responses during passive viewing of the motion stimuli in the MT/MST areas (top) and additional activation for motion difference judgment in the V3A/V7 and PIPS areas (bottom). (F) Tonotopic responses to tones (top) in BA41 (the primary auditory cortex) and additional activation for auditory difference judgments in the PIPS areas (bottom).

Figure 6.

BOLD changes across sensory modalities in the group analysis. The amplitude changes were averaged across 6 subjects for the judgment tasks in each experiment. Functional and anatomical data were averaged separately across subjects, and the averaged brains were overlaid onto each other (see Methods). The vertical axis shows percent changes in BOLD response, and the horizontal axis indicates the 4 visual attributes and the auditory task condition. Within each experimental condition, bars represent ROIs. For color, form, and orientation, 3 ROIs (V4/V8, V3A/V7, and PIPS) are shown, and for the other 2 conditions, additional ROIs are shown: MT for motion and BA41 for audition. Note that in the auditory condition, the second bar (V4/V7) has zero height. Error bars indicate 1 standard error of the means.

In the second experiment, the same stimuli were used for difference judgment on form. The task was to press the button to indicate in which interval the difference in the area of the rectangles was greater. The activations (Fig. 2D) are similar to those found for color difference judgment, that is, in V4/V8, V3A/V7, and PIPS. This suggests, first, that form and color share the same visual pathway, consistent with electrophysiological findings (Friedman et al. 2003). Second, perceptual judgment involves a distributed neuronal representation (Claeys et al. 2004), one part in the ventral pathway and the other in the dorsal pathway, which is inconsistent with the traditional dichotomy of the 2 pathways (Ungerleider and Haxby 1994). We next conducted 3 control experiments to examine whether these 3 areas are associated with subprocesses other than the final choice behavior in perceptual decisions.

First, the activation in the PIPS areas might reflect the motor response of the button press (Calton et al. 2002). Subjects were asked not to press a button but still to make a difference judgment on the same stimuli, indicating silently to themselves whether the difference was greater in the "first" or "second" interval (Fig. 4A). When the button press was eliminated, the activations in V4/V8, V3A/V7, and PIPS (Fig. 4D) were similar to those found in the color and form experiments. Moreover, it is unlikely that those activations were due to subvocalization (Bullmore et al. 1996; MacSweeney 2000; Huang et al. 2001) because most subvocalization studies have shown activations associated with silent speech in areas such as premotor, temporal, or language areas. Note also that the areas anterior to PIPS (pink circle, Fig. 4D) along the intraparietal sulcus (IPS), which were prominently activated in Figure 2C,D, were no longer activated without the motor response. This shows that although extensive areas along the IPS were activated in the color and form tasks, much of that activation can be explained by the motor response.

We conducted 2 additional control experiments, again without the motor requirement but with the silent (subvocal) response, to eliminate activations due to factors other than the motor response. In the second control experiment, the 2 pairs of stimuli were shown at the same time (Fig. 4B), so that the working memory (Walsh 1999) associated with the comparison in difference judgment was eliminated. The task was to indicate whether the difference in color was greater in the “top” or “bottom” pair. Passive viewing led to greater areas of activation in the parafoveal regions (the activated areas between the dotted lines in Fig. 4E) because the 2 pairs of stimuli shown at the same time occupied larger areas of the visual field. The results for judgment without working memory (Fig. 4E) were similar in the pattern of activation to those for color and form judgment. This is consistent with the finding that judgment can be dissociated from working memory (Bechara et al. 1998). In the third control, simple discrimination (Fig. 4C), instead of difference judgment, was used to test task specificity. Simple discrimination showed activations (Fig. 4F) similar to those for difference judgment, suggesting that the activations shown in Figure 2C,D are not task specific.

We further tested whether the 3 areas are specific to the choice behavior in perceptual judgment by employing 2 other visual attributes: orientation, an aspect of form vision, and motion, an attribute commonly associated with the dorsal pathway. Subjects were asked to press a button to make a difference judgment on the orientation of colored bars (Fig. 5A). Activation during passive viewing of the bars was limited to the parafoveal representation (Fig. 5D, top). When orientation judgment was required, we observed activations (Fig. 5D, bottom) similar to those for color or form judgment. Note also that areas along the ventral pathway, including the lateral occipital (LO) areas, were additionally activated during orientation judgment.

In the motion experiment (Fig. 5B), a rectangle in Interval 1 moved from the left of the fixation point to the right (apparent motion), followed by another rectangle moving at a different speed. Another pair of rectangles, again moving at different speeds, was shown in Interval 2. The task was to press a button to indicate the interval in which the difference in speed (Liu and Newsome 2005) was greater. Passive viewing of the motion stimulus produced retinotopic activation in MT/MST (Culham et al. 1998; Tootell et al. 1995) (Fig. 5E, top). When speed judgment was required, however, 2 distinct areas, the V3A/V7 complex and PIPS, were additionally activated (Fig. 5E, bottom). The fact that additional activation was found only in V3A/V7 and PIPS suggests that the activations in the V4/V8 complex were specific to particular visual attributes (color or form).

For the 4 visual attributes we tested, the results demonstrate that the neuronal representation of perceptual decision necessarily includes neurons in the V3A/V7 and PIPS areas, regardless of the visual attribute. These 2 areas are thus candidate sites for the final choice subprocess in perceptual judgment. However, if either or both of these areas are necessary for perceptual decision making in general, they must subserve decisions for other sensory modalities as well. We therefore extended the investigation to auditory perception.

A set of 2 tones with different durations was delivered to the subjects' ears in Interval 1 and another set in Interval 2 while the subjects fixated on a cross shown on the monitor (Fig. 5C). As expected, passive listening to the tones produced activation in the primary auditory cortex, BA41 (Rademacher et al. 2001) (Fig. 5F, top). When a difference judgment on tone duration was required, both the auditory cortex and PIPS showed strong activation (Fig. 5F, bottom). Note, however, that only PIPS was additionally activated, suggesting that the activation of the V3A/V7 complex was specific to visual processing. Thus, if we assume that brain areas are functionally specialized, the data suggest that PIPS is the area associated with the choice subprocess in perceptual judgment, at least as tested here.

The size of activations in PIPS, the region close to V7 and around the posterior banks of the IPS, showed some variation across experiments and subjects. The oval limits (caudal/rostral and ventral/dorsal) of the ellipsoidal activation in the PIPS were chosen for each subject in Talairach coordinates and averaged across experiments within each subject, as shown in Table 1. Within-subject variability (standard deviation) of the PIPS activations ranged from 2 to 9 mm in Talairach coordinates. Between-subject variability is shown in Figure 6, in which the changes in BOLD response (vertical axis) are plotted against the experimental attributes of vision and audition; the horizontal axis shows the ROIs within each experimental condition: V4/V8, V3A/V7, PIPS, MT, and BA41. For color, form, and orientation, 3 ROIs (V4/V8, V3A/V7, and PIPS) were similarly activated. V4/V8 activation was minimal in the motion experiment and absent in the auditory experiment, whereas V3A/V7 activation for motion was as strong as that for color, form, and orientation but much weaker for audition. The strong activation of the PIPS area, however, was consistent across experimental conditions. Tukey's multiple comparison test was performed for each experiment, controlling the overall Type I error rate at P = 0.05 (degrees of freedom = 4,20). For the color, form, and orientation experiments, the activations of the 3 ROIs (V4/V8, V3A/V7, and PIPS) were not statistically different. For the motion experiment, there were statistically significant differences in activation between V4/V8 and each of V3A/V7, PIPS, and MT; confidence intervals were (1.19, 4.38), (1.10, 4.30), and (1.62, 4.82), respectively. For the tone duration experiment, there were also significant differences between V4/V8 and PIPS or the primary auditory cortex and between V3A/V7 and PIPS or the primary auditory cortex; confidence intervals were (0.78, 5.61), (1.34, 6.16), and (0.58, 5.40), respectively.
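The Tukey procedure used above can be sketched with SciPy's `tukey_hsd`; note that the per-subject BOLD responses below are hypothetical placeholders for illustration, not the study's measurements.

```python
# Sketch of Tukey's multiple comparison across ROIs, controlling the
# family-wise Type I error rate, as in the analysis above.
# The per-subject BOLD responses are hypothetical placeholders.
from scipy.stats import tukey_hsd

# Hypothetical mean % BOLD change per subject for each ROI (motion-like case:
# weak V4/V8, strong V3A/V7, PIPS, and MT)
v4_v8 = [0.2, 0.3, 0.1, 0.25, 0.15]
v3a_v7 = [2.9, 3.1, 2.7, 3.0, 2.8]
pips = [2.8, 3.2, 2.9, 3.1, 2.6]
mt = [3.3, 3.5, 3.1, 3.4, 3.2]

res = tukey_hsd(v4_v8, v3a_v7, pips, mt)
ci = res.confidence_interval(confidence_level=0.95)

# Pairwise p-values and simultaneous 95% CIs on the mean differences
print(res.pvalue[0, 1])             # V4/V8 vs. V3A/V7: significant
print(ci.low[0, 1], ci.high[0, 1])  # CI excludes 0 for that pair
print(res.pvalue[1, 2])             # V3A/V7 vs. PIPS: not significant
```

Groups whose simultaneous confidence interval excludes zero differ at the chosen family-wise error rate, which mirrors how the intervals are reported in the text.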

Table 1

Talairach coordinates of the PIPS

Subjects PIPS limits in Talairach coordinates 
S1 (−43, −51, 35) (−46, −48, 50) (−31, −47, 38) (−39, −61, 50) 
S2 (−41, −57, 44) (−31, −47, 35) (−30, −62, 50) (−24, −59, 35) 
S3 (−35, −45, 44) (−37, −51, 34) (−36, −60, 49) (−26, −55, 57) 
S4 (−18, −59, 16) (−21, −73, 18) (−15, −69, 21) (−15, −73, 29) 
S5 (−20, −53, 41) (−26, −71, 51) (−20, −60, 40) (−11, −68, 42) 
S6 (−34, −37, 25) (−53, −51, 38) (−40, −61, 40) (−30, −47, 38) 

Note: The locations of the PIPS are shown in Talairach coordinates for each subject. The four oval limits of the PIPS area, shown in pink in Figure 2, are averaged across experiments for each subject.
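The within-subject variability quoted above can be reproduced directly from Table 1; a minimal NumPy sketch for subject S1:

```python
# Per-axis spread of the four PIPS oval limits for subject S1,
# taken from Table 1 (Talairach coordinates, mm).
import numpy as np

s1_limits = np.array([
    (-43, -51, 35),
    (-46, -48, 50),
    (-31, -47, 38),
    (-39, -61, 50),
])

center = s1_limits.mean(axis=0)     # averaged PIPS location for S1
sd = s1_limits.std(axis=0, ddof=1)  # per-axis standard deviation (mm)
print(center)  # -> [-39.75 -51.75  43.25]
print(sd)      # each axis within the 2-9 mm range reported above
```

The same computation on each subject's rows yields the 2 to 9 mm within-subject range stated in the text.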

Discussion

The goal of the experiment was to test the hypothesis that there is a population of neurons that is associated with perceptual judgment. We tested the hypothesis using the psychophysical task of difference judgment. One critical element in testing the hypothesis is the rationale that if an area of the brain is associated with perceptual judgment, it should show up across tasks and across sensory modalities. We used color, form, orientation, and apparent motion tasks for visual judgment and a tone duration task for auditory judgment. We found that the PIPS is consistently activated in various tasks across 2 sensory modalities.

Although the present findings suggest that the neurons in the PIPS area are involved in the final binary choice behavior in perceptual judgment, such a conclusion is premature because we have not ruled out all possible subprocesses other than the final choice behavior. We made an effort to eliminate some subprocesses leading to the final binary choice (e.g., motor response or working memory) and also identified modality-specific responses (e.g., visual or auditory). The difference judgment task used here for perceptual judgment, however, still involves 2 distinct subprocesses: calculation (Simon et al. 2002) (e.g., subtraction of speeds) and binary choice (first or second). The data suggest that the neurons in PIPS are involved in both processes; whether the 2 processes can be meaningfully dissociated requires further experimental investigation.
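The two subprocesses of the difference judgment, calculation of the within-interval difference and the binary choice, can be made explicit in a short sketch (the function name and the use of speeds as the attribute are illustrative choices, not the study's implementation):

```python
# Difference judgment decomposed into its two subprocesses:
# (1) calculation, e.g., subtraction of the 2 speeds within an interval;
# (2) binary choice, picking the interval with the greater difference.
def difference_judgment(interval1, interval2):
    # Subprocess 1: calculation - one unsigned difference per interval
    d1 = abs(interval1[0] - interval1[1])
    d2 = abs(interval2[0] - interval2[1])
    # Subprocess 2: binary choice between "first" and "second"
    return "first" if d1 > d2 else "second"

# Illustrative speeds: Interval 1 differs by 3, Interval 2 by 1
print(difference_judgment((2.0, 5.0), (4.0, 5.0)))  # -> first
```

Because both steps happen between stimulus and response, activation attributable to either would appear in the same trial epoch, which is why the two cannot be separated without a dedicated experimental manipulation.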

Another factor that could have contributed to the effects we found is attention, which has been one of the major factors used to account for fMRI activation (Yantis et al. 2002; Bisley and Goldberg 2003). Most psychophysical tasks involve attention, and it could account for part of the effects we found because the task used in the present study, difference scaling, requires active attention. Thus, attention is part of the decision-making process in our study. Is there a judgment task that does not involve attention? The answer depends on how many attentional resources are allocated to a given task. Although the experimental task used in the present study requires active attention, it is conceivable that, in some situations, people can make judgments with very little effort. For example, when driving a car, a person can make an “effortless” judgment on whether or not to proceed through a yellow light. Although it is moot how many neural resources are involved in such a case, experimenters can at least design an experiment to study the degree to which attention contributes to the decision-making process.

Studies of perceptual decision making have suggested that the degree of activation in human decision-making tasks is correlated with task difficulty (Heekeren et al. 2004). Note, however, that the difficulty of the difference scaling task, as implemented in the present study, was pseudorandomized across trials. Thus, although it is possible, as Heekeren et al. (2004) found, that more difficult trials produced stronger activations, the effects we found go beyond any such effect of task difficulty. This is confirmed by the 2 control experiments. The task in the second control experiment was easier than the original experimental task because the subjects did not need to hold the difference information in memory. Similarly, the task in the third control experiment was also easier because it required a simple comparison rather than a difference comparison. In both control experiments, however, the activations in the PIPS area were similar to those found in the original task.

LIP has also been associated with perceptual judgment in electrophysiological as well as fMRI studies, but in those studies, saccades and a motor response were required (Kim and Shadlen 1999; Liu and Newsome 2005; Schluppeck et al. 2005). The “saccadotopic” LIP (solid circle, shown in Fig. 1C′), identified at the same Talairach coordinates found by Sereno et al. (2001), is not an area consistently activated throughout our experiments. Subjects were required to fixate in our experiments, and although we cannot rule out the possibility that they sometimes made saccades, it seems unlikely that they initiated systematic saccades, as in saccade-initiation tasks. Such evidence comes from the auditory task, in which there was no visual display (only the fixation cross) toward which subjects could make saccades.

Conflict of Interest: None declared.

References

Bechara A, Damasio H, Damasio AR. 2003. Role of the amygdala in decision-making. Ann N Y Acad Sci. 985:356-369.
Bechara A, Damasio H, Tranel D, Anderson SW. 1998. Dissociation of working memory from decision making within the human prefrontal cortex. J Neurosci. 18:428-437.
Bisley JW, Goldberg ME. 2003. Neuronal activity in the lateral intraparietal area and spatial attention. Science. 299:81-86.
Brett M, Johnsrude IS, Owen AM. 2002. The problem of functional localization in the human brain. Nat Rev Neurosci. 3:243-249.
Bullmore ET, Rabe-Hesketh S, Morris RG, Williams SC, Gregory L, Gray JA, Brammer MJ. 1996. Functional magnetic resonance image analysis of a large-scale neuro-cognitive network. Neuroimage. 4:16-33.
Calton JL, Dickinson AR, Snyder LH. 2002. Non-spatial, motor-specific activation in posterior parietal cortex. Nat Neurosci. 5:580-588.
Claeys KG, Dupont P, Cornette L, Sunaert S, Van Hecke P, De Schutter E, Orban GA. 2004. Color discrimination involves ventral and dorsal stream visual areas. Cereb Cortex. 14:803-822.
Connor CE, Preddie DC, Gallant JL, Van Essen DC. 1997. Spatial attention effects in macaque area V4. J Neurosci. 17:3201-3214.
Culham JC, Brandt SA, Cavanagh P, Kanwisher NG, Dale AM, Tootell RB. 1998. Cortical fMRI activation produced by attentive tracking of moving targets. J Neurophysiol. 80:2657-2670.
Derrington AM, Krauskopf J, Lennie P. 1984. Chromatic mechanisms in lateral geniculate nucleus of macaque. J Physiol (Lond). 357:241-265.
Friedman HS, Zhou H, von der Heydt R. 2003. The coding of uniform colour figures in monkey visual cortex. J Physiol. 548:593-613.
Hadjikhani N, Liu AK, Dale AM, Cavanagh P, Tootell RB. 1998. Retinotopy and color sensitivity in human visual cortical area V8. Nat Neurosci. 1:235-241.
Heekeren HR, Marrett S, Bandettini PA, Ungerleider LG. 2004. A general mechanism for perceptual decision-making in the human brain. Nature. 431:859-862.
Hsu M, Bhatt M, Adolphs R, Tranel D, Camerer CF. 2005. Neural systems responding to degrees of uncertainty in human decision-making. Science. 310:1680-1683.
Huang J, Carr TH, Cao Y. 2001. Comparing cortical activations for silent and overt speech using event-related fMRI. Hum Brain Mapp. 15:39-53.
Kim JN, Shadlen MN. 1999. Neural correlates of a decision in the dorsolateral prefrontal cortex of the macaque. Nat Neurosci. 2:176-185.
Larimer J, Krantz DH, Cicerone CM. 1974. Opponent-process additivity: I. Red/green equilibria. Vision Res. 14:1127-1140.
Liu J, Newsome WT. 2005. Correlation between speed perception and neural activity in the middle temporal visual area. J Neurosci. 25:711-722.
MacSweeney M, Amaro E, Calvert GA, Campbell R, David AS, McGuire P, Williams SCR, Woll B, Brammer M. 2000. Silent speechreading in the absence of scanner noise: an event-related fMRI study. Neuroreport. 11(8):1729-1733.
Maloney LT, Yang JN. 2003. Maximum likelihood difference scaling. J Vis. 3:573-585.
Motter BC. 1994. Neural correlates of attentive selection for color or luminance in extrastriate area V4. J Neurosci. 14:2178-2189.
Newsome WT, Britten KH, Movshon JA. 1989. Neuronal correlates of a perceptual decision. Nature. 341:52-54.
Platt ML, Glimcher PW. 1999. Neural correlates of decision variables in parietal cortex. Nature. 400:233-238.
Rademacher J, Morosan P, Schormann T, Schleicher A, Werner C, Freund H-J, Zilles K. 2001. Probabilistic mapping and volume measurement of human auditory cortex. Neuroimage. 13:669-683.
Schluppeck D, Glimcher P, Heeger DJ. 2005. Topographic organization for delayed saccades in human posterior parietal cortex. J Neurophysiol. 94:1372-1384.
Schneider B. 1980. A technique for the nonmetric analysis of paired comparisons of psychological intervals. Psychometrika. 45:357-372.
Sereno MI, Dale AM, Reppas JB, Kwong KK, Belliveau JW, Brady TJ, Rosen BR, Tootell RB. 1995. Borders of multiple visual areas in humans revealed by functional magnetic resonance imaging. Science. 268:889-893.
Sereno MI, Pitzalis S, Martinez A. 2001. Mapping of contralateral space in retinotopic coordinates by a parietal cortical area in humans. Science. 294:1350-1354.
Shadlen MN, Newsome WT. 2001. Neural basis of a perceptual decision in the parietal cortex (area LIP) of the rhesus monkey. J Neurophysiol. 86:1916-1936.
Simon O, Mangin JF, Cohen L, Le Bihan D, Dehaene S. 2002. Topographical layout of hand, eye, calculation, and language-related areas in the human parietal lobe. Neuron. 33:475-487.
Smith SM, Woolrich MW, Jenkinson M, Beckmann CF, Behrens TE, Johansen-Berg H, Bannister PR, De Luca M, Drobnjak I, Flitney DE, et al. 2004. Advances in functional and structural MR image analysis and implementation as FSL. Neuroimage. 23:S208-S219.
Tootell RB, Reppas JB, Kwong KK, Malach R, Born RT, Brady TJ, Rosen BR, Belliveau JW. 1995. Functional analysis of human MT and related visual cortical areas using magnetic resonance imaging. J Neurosci. 15:3215-3230.
Ungerleider LG, Haxby JV. 1994. ‘What’ and ‘where’ in the human brain. Curr Opin Neurobiol. 4:157-165.
Walsh V. 1999. How does the cortex construct color? Proc Natl Acad Sci USA. 96:13594-13596.
Walton ME, Devlin JT, Rushworth MFS. 2004. Interactions between decision making and performance monitoring within prefrontal cortex. Nat Neurosci. 7:1259-1265.
Yantis S, Schwarzbach J, Serences JT, Carlson RL, Steinmetz MA, Pekar JJ, Courtney SM. 2002. Transient neural activity in human parietal cortex during spatial attention shifts. Nat Neurosci. 5:995-1002.
Zeki SM. 1973. Colour coding in rhesus monkey prestriate cortex. Brain Res. 53:422-427.