Abstract

Functional magnetic resonance imaging (fMRI) was used to compare brain activation to static facial displays versus dynamic changes in facial identity or emotional expression. Static images depicted prototypical fearful, angry and neutral expressions. Identity morphs depicted identity changes from one person to another, always with neutral expressions. Emotion morphs depicted expression changes from neutral to fear or anger, creating the illusion that the actor was ‘getting scared’ or ‘getting angry’ in real time. Brain regions implicated in processing facial affect, including the amygdala and fusiform gyrus, showed greater responses to dynamic versus static emotional expressions, especially for fear. Identity morphs activated a dorsal fronto-cingulo-parietal circuit and additional ventral areas, including the amygdala, that also responded to the emotion morphs. Activity in the superior temporal sulcus discriminated emotion morphs from identity morphs, extending its known role in processing biologically relevant motion. The results highlight the importance of temporal cues in the neural coding of facial displays.

Introduction

An important function of facial expression is the social communication of changes in affective states. The dynamic perception of expressive features likely recruits specialized processing resources to direct appropriate actions in response to observed sequences in facial motion. Such dynamic information may be integral to the mental representation of faces (Freyd, 1987).

Behavioral studies have shown that humans are sensitive to temporal cues in facial displays. For example, subjects can temporally order scrambled sequences of videotaped emotional reactions, even when consecutive frames contain subtle transitions in facial expression (Edwards, 1998). In fact, performance improves under time constraints, suggesting that the extraction of dynamic features in facial expression occurs relatively automatically. In other circumstances, dynamic information contributes to face recognition abilities (Christie and Bruce, 1998; Lander et al., 1999) and judgements of facial affect (Bassili, 1978, 1979; Kamachi et al., 2001) and identity (Seamon, 1982; Hill and Johnston, 2001; Thornton and Kourtzi, 2002). As with other aspects of emotional perception, identification of expression changes may exhibit mood-congruent biases. Niedenthal and colleagues (Niedenthal et al., 2000) showed that participants induced into a sad or happy mood take longer than controls to detect a change in morphed expressions that slowly decrease in intensity of displayed sadness or happiness, respectively.

Motion cues may also dissociate perceptual abilities in patients with neurologic and developmental disorders. Humphreys et al. (Humphreys et al., 1993) reported a double dissociation in two patients relative to performance on facial affect and identity tasks. Prosopagnosic patient H.J.A., who sustained ventral occipitotemporal damage, had difficulties with both facial identity and expression judgements using static photographs. However, his performance improved when asked to categorize facial expressions using moving point-light displays. On the other hand, patient G.K., who sustained bilateral parietal lobe damage, had relatively good performance on facial identity tasks but was impaired at facial affect recognition using either static or dynamic cues. Children with psychopathic tendencies also present with selective impairments in identifying emotion from cinematic displays of slowly morphing expressions (Blair et al., 2001). In contrast, autistic children may benefit from slow dynamic information when categorizing emotional expressions (Gepner et al., 2001). This latter finding differs from other autistic deficits on motion-processing tasks that require faster temporal integration, including lip reading (de Gelder et al., 1991; Spencer et al., 2000). Finally, the perception of biological motion in Williams syndrome is spared relative to other aspects of motion perception and visuomotor integration (Jordan et al., 2002).

The neural substrates that mediate dynamic perception of emotional facial expressions are unknown. Previous neuroimaging studies have been limited to posed snapshots that lack temporal cues inherent in everyday socioemotional interactions. These studies have emphasized category-specific representations in the amygdala and associated frontolimbic structures by comparing responses to faces portraying basic emotions [e.g. (Breiter et al., 1996; Morris et al., 1996; Phillips et al., 1997, 1998; Sprengelmeyer et al., 1997; Whalen et al., 1998, 2001; Blair et al., 1999; Kesler-West et al., 2001)]. A separate line of research has revealed brain regions responsive to biological motion, including movement of face parts, but the role of emotion has been largely untested. Shifts of eye gaze, mouth movements and ambulation videos using animation or point-light displays elicit activity in the superior temporal sulcus and anatomically related areas [reviewed in Allison et al. (Allison et al., 2000)]. Some of these regions, such as the amygdala (Bonda et al., 1996; Kawashima et al., 1999), also participate in facial affect recognition, suggesting a potential link between dorsal stream processing of biological motion and ventral stream processing of emotional salience.

The present study was designed to integrate these literatures by investigating perception of negative affect using facial stimuli that varied in their dynamic properties. Prototypical expressions of fear and anger were morphed with neutral expressions of the same actors to form the impression that the actors were becoming scared or angry in real time (Fig. 1). fMRI activation to the emotion morphs was contrasted with the static expressions. In addition, identity morphs were created that blended facial identities across pairs of actors with neutral expressions. This condition was included to evaluate the specificity of the results with respect to changes in facial affect versus identity, and to dissociate the signaling of biologically plausible from implausible motion. Hypotheses about four brain regions were made a priori:

  1. The amygdala would preferentially respond to fear stimuli relative to the other categories and would show greater activation to fear morphs than static fear images.

  2. The fusiform gyrus would respond to all stimuli but would show a similar bias as the amygdala, given the neuromodulatory influence of the amygdala on face processing along the ventral visual stream (Morris et al., 1998; Pessoa et al., 2002).

  3. The superior temporal sulcus would discriminate emotion from identity morphs, since only the emotion morphs are biologically plausible.

  4. Visual area V5/MT+ would respond to all dynamic stimuli indiscriminately, given its sensitivity to visual motion cues (Tootell et al., 1995; Huk et al., 2002) and its sparser connectivity with the amygdala relative to rostroventral sectors of the visual stream (Amaral et al., 1992).

Materials and Methods

Subjects

Twelve healthy adults provided written informed consent to participate in the study. Two of these subjects were dropped due to excessive head movement (center-of-mass motion estimates >3.75 mm in the x, y or z directions). The remaining 10 participants (five male, five female; age range = 21–30 years) were included in the statistical analysis. All participants were right-handed and were screened for history of neurologic and psychiatric illness and substance abuse. Procedures for human subjects were approved by the Institutional Review Board at Duke University.

Stimulus Development

Facial affect stimuli that are panculturally representative of basic emotions were taken from the Ekman series (Ekman and Friesen, 1976; Matsumoto and Ekman, 1989). Prototypical expressions of fear and anger were morphed with neutral expressions of the same actor to create the dynamic emotional stimuli. The expression change depicted in the morph always portrayed increasing emotional intensity (i.e. from neutral to 100% fear, or neutral to 100% anger). In addition, pairs of actors with neutral expressions were morphed to create dynamic changes in identity. Identity morphs always combined pairs of actors of the same gender and ethnicity. All actors in the emotion morphs were included in the identity morphs, and all actors in the static images were included in the dynamic stimuli. A subset of actors portrayed both fear and anger.

Emotion morphs were used instead of videotaped expressions to allow experimental control over the rate and duration of the changes, as in previous studies [e.g. (Niedenthal et al., 2000)]. Morphs were created using MorphMan 2000 software (STOIK, Moscow, Russia). All faces were initially cropped with an ovoid mask to exclude extraneous cues (hair, ears, neckline, etc.). The images were then normalized for luminance and contrast and presented against a mid-gray background. Approximately 150 fiducial markers were placed on each digital source image in the morph pair and individually matched by computer mouse to corresponding points on the target image. Areas of the face relevant for perceiving changes in identity and expression, such as the eyes, mouth, and corrugator and orbicularis oculi muscles, were densely sampled (Ekman and Friesen, 1978; Bassili, 1979). All expressions were posed with full frontal orientations (i.e. there were no changes in viewpoint either across or within morphs). Morphs were presented at a rate of 30 frames/s, consistent with previous studies (Thornton and Kourtzi, 2002). Forty-three frames were interpolated between the morph end points to provide smooth transitions across a 1500 ms duration. The final morph frame was presented for 200 ms for a total stimulus duration of 1700 ms. This duration approximates real-time changes of facial affect using videotaped expressions (Gepner et al., 2001). Morphs were saved in .avi format and displayed as movie clips. Static displays of 100% fear, 100% anger, and neutral expressions were taken from the first and last frames of the emotion and identity morph movies and were presented for the same total duration as the morphs. Figure 1 illustrates four frames of a neutral-to-fear morph. Complete examples of neutral-to-anger and identity morph movies can be found at http://www.mind.duke.edu/level2/faculty/labar/face_morphs.htm.
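The timing scheme above follows from simple arithmetic on the frame counts given in the text; a minimal sketch for checking it (the frame rate, frame counts and hold duration are taken from the paragraph, the variable names are illustrative):

```python
# Morph timing: two end-point frames plus 43 interpolated frames,
# played at 30 frames/s, with the final frame then held for 200 ms.
FRAME_RATE_HZ = 30
INTERPOLATED_FRAMES = 43
ENDPOINT_FRAMES = 2          # source and target images
HOLD_MS = 200                # final frame held after the morph ends

total_frames = INTERPOLATED_FRAMES + ENDPOINT_FRAMES        # 45 frames
morph_ms = total_frames / FRAME_RATE_HZ * 1000              # 1500 ms
stimulus_ms = morph_ms + HOLD_MS                             # 1700 ms
```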

Experimental Design

Participants viewed 36 unique exemplars of each of four stimulus categories — static neutral, static emotional, dynamic neutral (identity morph), dynamic emotional (emotion morph). Half of the emotional stimuli represented fear and half represented anger. Each exemplar was presented twice during the course of the experiment (total 72 stimuli of each category). Stimuli were presented in a pseudorandom event-related design, subject to the constraint that no more than two exemplars of each category were presented in a row to avoid mood induction effects. Faces were separated by a central fixation cross. The intertrial interval varied between 12 and 15 s (mean 13.5 s) to allow hemodynamic and psychophysiological responses to return to baseline levels between stimulus presentations (Fig. 1). The testing session was divided into eight runs of 8 min 24 s duration. Run order was counterbalanced across participants, and no stimuli were repeated within each half-session of four runs. Stimulus presentation was controlled by CIGAL software (Voyvodic, 1999) modified in-house to present video animations. Participants performed a three-alternative forced-choice categorical judgement in response to each face. Specifically, they used a three-button response box to indicate whether each face depicted an emotion morph (change in emotional expression), identity morph (change from one person to another), or static picture (no changes). Participants were told to respond whenever they could identify the category; speed of response was not emphasized. One example of each category was shown to the participants prior to entering the magnet to familiarize them with the stimuli.
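The sequencing constraint described above (no more than two exemplars of the same category in a row) can be sketched with a simple rejection-sampling procedure. This is an assumption about one way to satisfy the constraint, not the authors' actual randomization code; category labels and trial counts mirror the design (72 trials per category):

```python
import random

CATEGORIES = ["static_neutral", "static_emotional",
              "identity_morph", "emotion_morph"]

def make_sequence(n_per_category=72, max_run=2, seed=0):
    """Shuffle trials until no category repeats more than max_run times."""
    rng = random.Random(seed)
    trials = [c for c in CATEGORIES for _ in range(n_per_category)]
    while True:
        rng.shuffle(trials)
        # A window of max_run + 1 trials must contain >1 distinct category.
        if all(len(set(trials[i:i + max_run + 1])) > 1
               for i in range(len(trials) - max_run)):
            return trials

seq = make_sequence()
```

Rejection sampling is wasteful but transparent; a constructive scheduler would also work and guarantees termination in one pass.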

Imaging Parameters and Data Analysis

MR images were acquired on a 1.5 T General Electric Signa NVi scanner (Milwaukee, WI) equipped with 41 mT/m gradients. The subject’s head was immobilized using a vacuum cushion and tape. The anterior (AC) and posterior commissures (PC) were identified in the mid-sagittal slice of a localizer series. Thirty-four contiguous slices were prescribed parallel to the AC–PC plane for high-resolution T1-weighted structural images [repetition time (TR) = 450 ms, echo time (TE) = 20 ms, field-of-view (FOV) = 24 cm, matrix = 256², slice thickness = 3.75 mm]. An additional series of T1-weighted structural images oriented perpendicular to the AC–PC were also acquired using the parameters specified above. Gradient echo echoplanar images sensitive to blood-oxygenation-level-dependent (BOLD) contrast were subsequently collected in the same transaxial plane as the initial set of T1-weighted structural images (TR = 3 s, TE = 40 ms, FOV = 24 cm, matrix = 64², flip angle = 90°, slice thickness = 3.75 mm, resulting in 3.75 mm isotropic voxels).
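The isotropic voxel size of the functional images follows directly from the acquisition parameters; a quick check (values taken from the echoplanar parameters above):

```python
# In-plane resolution of the echoplanar images: a 24 cm field of view
# sampled on a 64 x 64 matrix gives 3.75 mm pixels, which, combined
# with the 3.75 mm slice thickness, yields isotropic voxels.
FOV_MM = 240.0
MATRIX = 64
SLICE_MM = 3.75

in_plane_mm = FOV_MM / MATRIX                  # 3.75 mm
voxel_dims = (in_plane_mm, in_plane_mm, SLICE_MM)
```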

The fMRI data analysis utilized a voxel-based approach implemented in SPM99 (Wellcome Department of Cognitive Neurology, London, UK). Functional images were temporally adjusted for interleaved slice acquisition and realigned to the image taken proximate to the anatomic study using affine transformation routines. The realigned scans were coregistered to the anatomic scan obtained within each session and normalized to SPM’s template image, which conforms to the Montreal Neurological Institute’s standardized brain space and closely approximates Talairach and Tournoux’s (Talairach and Tournoux, 1988) stereotaxic atlas. The functional data were high-pass filtered and spatially smoothed with an 8 mm isotropic Gaussian kernel prior to statistical analysis. The regressors for the time-series data were convolved with a canonical hemodynamic response profile and its temporal derivative as implemented in SPM99. Statistical contrasts were set up using a random-effects model to calculate signal differences between the conditions of interest. Statistical parametric maps were derived by applying linear contrasts to the parameter estimates for the events of interest, resulting in a t-statistic for every voxel. Then, group averages were calculated by employing pairwise t-tests on the resulting contrast images. This sequential approach accounts for intersubject variability and permits generalization to the population at large. Interaction terms were analyzed in subsequent pairwise t-tests after the main effect maps were calculated to avoid false positive activations from the baseline period in the control conditions. The resultant statistical parametric maps were thresholded at a voxelwise uncorrected P < 0.001 and a spatial extent of five contiguous voxels.
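The two-stage random-effects logic can be illustrated on synthetic data: the first level yields one contrast value per voxel per subject, and the second level tests those values across subjects so that the subject, not the scan, is the unit of observation. The NumPy/SciPy sketch below is an assumption about the general procedure, not the SPM99 implementation, and it omits the five-voxel extent criterion:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic first-level contrast images: 10 subjects x 1000 voxels.
# Each value is a per-subject contrast estimate (e.g. dynamic - static).
n_subjects, n_voxels = 10, 1000
contrast = rng.normal(0.0, 1.0, size=(n_subjects, n_voxels))
contrast[:, :50] += 2.0      # plant a true effect in the first 50 voxels

# Second level (random effects): one-sample t-test across subjects
# at every voxel, so inferences generalize beyond the sampled group.
t, p = stats.ttest_1samp(contrast, popmean=0.0, axis=0)

# Voxelwise threshold analogous to the uncorrected P < 0.001 used here.
active = (p < 0.001) & (t > 0)
```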

Results

The experimental hypotheses were primarily targeted at simple effects, which are emphasized below. Additionally, a general emotion × motion interaction analysis was conducted as well as a main effects analysis for the motion variable to examine the role of visual area MT+.

fMRI Activation to Emotion Morphs

Compared to static emotional expressions, emotion morphs elicited responses along a bilateral frontotemporal circuit, including ventrolateral prefrontal cortex, substantia innominata, amygdala, parahippocampal gyrus, and fusiform gyrus (Fig. 2 and Table 1). The activations in this contrast were predominantly restricted to ventral brain regions, with some additional dorsal activity in the dorsomedial prefrontal cortex, left precentral sulcus, right intraparietal sulcus, and putative visual motion area MT+. This contrast reveals brain regions whose responses were enhanced by dynamic changes in negative facial affect over and above their responses to the same expressions presented statically.

The above analysis combined fear and anger expressions. Nearly identical patterns were found when the fear morphs were selectively compared against static fear images (Table 2). The brain regions whose responses differentiated anger morphs from static anger expressions were somewhat more restricted to a subset of those in the combined analysis (Table 2). When the fear and anger morphs were directly contrasted against each other, category-specific activations were revealed. Relative to anger morphs, fear morphs preferentially engaged the lateral and dorsomedial prefrontal cortex, amygdala, midbrain, parahippocampal gyrus, fusiform gyrus, and posterior cingulate gyrus. An additional site of activation within the posterior superior temporal sulcus was found at a less stringent statistical threshold (Fig. 2 and Table 3). Many of these activations were lateralized to the right hemisphere. Relative to fear morphs, anger morphs preferentially engaged a region of ventrolateral prefrontal cortex and supramarginal gyrus (Fig. 2 and Table 3).

fMRI Activation to Identity Morphs

Relative to static neutral expressions, identity morphs elicited responses along both dorsal and ventral processing streams (Fig. 2 and Table 4). The dorsal circuit included the dorsomedial prefrontal cortex, precentral sulcus, intraparietal sulcus, caudate nucleus, thalamus and visual area MT+. The ventral regions included the inferior frontal gyrus, amygdala, and fusiform gyrus. Most of the activations were bilateral.

Direct comparisons revealed brain regions that selectively coded changes in facial affect versus changes in facial identity (Fig. 2 and Table 5). For these comparisons, fear and anger morphs were combined. Relative to the emotion morphs, identity morphs elicited greater responses in predominantly dorsal regions (dorsal anterior cingulate gyrus, dorsal inferior frontal gyrus, intraparietal sulcus, caudate nucleus), and a large, ventrolateral region of the temporal lobe (inferior temporal/posterior fusiform gyri). Relative to the identity morphs, emotion morphs elicited greater responses in ventral anterior cingulate gyrus and ventromedial prefrontal cortex, middle frontal gyrus (rostral area 8), medial fusiform gyrus, and both anterior and posterior segments of the superior temporal sulcus. These latter brain regions may form a circuit that distinguishes biologically relevant and plausible motion from other types of motion.

Emotion × Motion Interaction

A formal analysis was conducted to determine which brain regions showed an interaction between the emotion and motion factors in the experimental design (Table 6). A double-subtraction procedure was employed to compare the magnitude of the motion effect (dynamic versus static) across the facial expression categories (emotional versus neutral). For this analysis, fear and anger expressions were combined. Brain regions that were more sensitive to motion within the emotional expressions (compared to the neutral expressions) included the fusiform gyrus, anterior cingulate gyrus/ventromedial prefrontal cortex, the superior temporal gyrus and middle frontal gyrus. Brain regions that were more sensitive to motion within the neutral expressions (compared to the emotional expressions) included the inferior temporal gyrus/posterior fusiform gyrus, intraparietal sulcus, basal ganglia, dorsal anterior cingulate gyrus and lateral inferior frontal gyrus. These results are nearly identical to the activations shown in the simple effects analysis where the emotion morphs and identity morphs were directly contrasted against each other (Fig. 2 and Table 5).
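Per voxel, the double-subtraction reduces to comparing the motion effect across expression categories; a toy NumPy sketch of the contrast (condition means are invented for illustration):

```python
import numpy as np

# Mean response per condition for one voxel (arbitrary units).
# The interaction asks whether the motion effect (dynamic - static)
# differs between emotional and neutral faces:
# (dyn_emotional - stat_emotional) - (dyn_neutral - stat_neutral).
means = {
    "stat_neutral": 1.0, "dyn_neutral": 1.4,      # motion effect = 0.4
    "stat_emotional": 1.2, "dyn_emotional": 2.2,  # motion effect = 1.0
}

motion_effect_emotional = means["dyn_emotional"] - means["stat_emotional"]
motion_effect_neutral = means["dyn_neutral"] - means["stat_neutral"]
interaction = motion_effect_emotional - motion_effect_neutral

# Equivalently, a single contrast vector over the four conditions,
# ordered (stat_neutral, dyn_neutral, stat_emotional, dyn_emotional):
weights = np.array([1.0, -1.0, -1.0, 1.0])
values = np.array([means["stat_neutral"], means["dyn_neutral"],
                   means["stat_emotional"], means["dyn_emotional"]])
assert np.isclose(weights @ values, interaction)
```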

Main Effect of Motion

Brain regions sensitive to facial motion cues across emotional expression and identity changes were identified by a main effects analysis (Table 7). For this analysis, fear and anger expressions were combined. The results showed significant motion-related activity in six brain regions: visual area MT+, amygdala, inferior frontal gyrus, dorsomedial prefrontal cortex, intraparietal sulcus and caudate nucleus. All activations were bilateral except the caudate nucleus, which was left-sided.

Behavioral Results

A within-subjects ANOVA computed on behavioral accuracy data revealed a significant interaction between factors of emotion (fear, anger, neutral) and motion (dynamic, static), F(2,18) = 14.68, P < 0.001. Main effects of emotion [F(2,18) = 11.65, P < 0.001] and motion [F(1,9) = 48.69, P < 0.0001] were also found. Overall, participants were more accurate in identifying static images than dynamic images. Post hoc t-tests showed that across the static images, accuracy for anger (96 ± 1%) was worse than that for fear (98 ± 1%) or neutral (98 ± 1%) expressions, which did not significantly differ from each other. However, these data are potentially confounded by ceiling effects. Across the dynamic images, accuracy was worse for identity morphs (38 ± 4%) than either anger (64 ± 8%) or fear (63 ± 8%) morphs, which did not significantly differ from each other.

Behavioral Accuracy and Brain Activation

Inspection of individual subject data revealed that 6 of the 10 participants were less accurate in their recognition judgements of the identity morphs relative to the emotion morphs. We conducted a post hoc analysis of the identity morph versus emotion morph contrast (Table 5 and Fig. 2) to determine any potential influence of behavioral accuracy on the statistical parametric maps. Participants were subdivided into two groups based on behavioral performance — those for whom accuracy was equated across the morph categories (n = 4) and those for whom accuracy was worse for the identity morphs (n = 6). Because of the small sample sizes, we used both conservative (P < 0.001 uncorrected) and liberal (P < 0.05 uncorrected) threshold criteria for determining statistical significance in the identity morph versus emotion morph contrasts from each subgroup. A t-test was then computed across groups and thresholded at P < 0.001 uncorrected. The only brain region that showed differential activation in the identity morph versus emotion morph contrast as a function of behavioral accuracy was the right inferior frontal gyrus (BA 44). This area was more engaged to the identity morphs by the group with poorer performance, perhaps reflecting cognitive effort. However, this brain region emerged only at the more liberal statistical cutoff.

Discussion

Generating emotional expressions requires sequenced movements of facial muscles, which have long been identified psychophysiologically (Ekman and Friesen, 1978; Bassili, 1978, 1979). Voluntary (or unintended) perturbations of characteristic movements or timing in facial expression can significantly alter their meaning from the perspective of an observer (Hess and Kleck, 1997). This has been documented in the distinction between genuine and false smiles (Ekman and Friesen, 1982). The present study shows that specific regions of the brain are sensitive to the perception of fluid changes in physiognomy relevant for the social signaling of changes in affective states. These brain regions preferentially signal real-time increases in facial affect intensity over static, canonical displays of the same emotions. Some of the brain regions selectively responded to perceived changes in fear or anger, whereas others more generally distinguished facial expression changes from facial identity changes induced by stimulus morphing. The functional implications of the findings are discussed relative to the known anatomy of facial affect and biological motion perception.

Role of the Amygdala

The amygdala has been a focus of investigation in facial affect perception, yet its exact function remains debated. Several neuroimaging studies have reported amygdala activation to fearful faces (Breiter et al., 1996; Phillips et al., 1997, 1998; Whalen et al., 1998, 2001), but others have failed to replicate these results (Sprengelmeyer et al., 1997; Kesler-West et al., 2001; Pine et al., 2001). The amygdala’s response to fearful faces may be enhanced when the expression is caricatured (Morris et al., 1996, 1998), under conditions of full attention [(Pessoa et al., 2002); but see (Vuilleumier et al., 2001)], or when judgements involve the simultaneous presentation of multiple face exemplars (Hariri et al., 2000; Vuilleumier et al., 2001). Whereas some studies show fear specificity (Morris et al., 1996, 1998; Phillips et al., 1997, 1998; Whalen et al., 1998, 2001), others report generalization to other emotion categories, including happiness (Breiter et al., 1996; Kesler-West et al., 2001; Canli et al., 2002; Pessoa et al., 2002). Amygdala activation to faces further varies across subjects according to social (Hart et al., 2000; Phelps et al., 2000), personality (Canli et al., 2002) and genetic (Hariri et al., 2002) factors.

All of these studies presented posed facial displays devoid of temporal cues integral to real-life social exchanges. In the present study, we have shown that the amygdala’s activity is enhanced by faces containing dynamic information relative to static snapshots of the same faces. Consistent with our hypotheses, the amygdala responded more to emotion morphs than to static emotional faces, especially for fearful expressions. Direct comparisons showed a specificity for fear over anger in the dynamic morphs but not in the static images. The category specificity in amygdala processing of dynamic facial displays cannot be attributed to other potential confounding features across the faces. Facial identity was the same across the dynamic and static images, and the morphing procedure allowed experimental control over the rate, intensity, duration and spatial orientation of the expression changes. Dynamic stimuli may be more effective in engaging the amygdala during face perception tasks and can potentially clarify the extent to which its responses exhibit category specificity.

Surprisingly, the amygdala also responded to dynamic changes in facial identity that were emotionally neutral. The intensity of amygdala activation to identity morphs was indistinguishable from that to the emotion morphs, even when the analyses were restricted to fear. We speculate that morphed stimuli containing rapid, artificial changes in facial identity elicit amygdala activity due to their threat or novelty value. In evolution, camouflage and other means of disguising identity have been effectively employed as deception devices in predator–prey interactions (Mitchell, 1993). It is possible that rapid changes in identity are interpreted by the amygdala as potentially threatening and consequently engage the body’s natural defense reactions. Alternatively, the amygdala may play a broader role in the perception of facial motion beyond that involved in emotional expression. The amygdala is known to participate in eye gaze and body movement perception (Brothers et al., 1990), even when the stimuli have no apparent emotional content (Young et al., 1995; Bonda et al., 1996; Kawashima et al., 1999). This account, though, does not explain why the amygdala activation was stronger for fear and identity morphs relative to anger morphs.

Parametric studies have indicated that the amygdala codes the intensity of fear on an expressor’s face [(Morris et al., 1996, 1998); but see (Phillips et al., 1997)]. However, it is not clear whether intense emotional expressions recruit amygdala processing because of perceptual, experiential or cognitive factors. Most imaging studies on facial affect perception have used blocked-design protocols that further complicate interpretation of results in this regard. The application of dynamic stimuli may help distinguish between these potential underlying mechanisms. In the present study, the emotion on the actor’s face did not reach full intensity until the last frame of each morph movie, whereas full intensity was continuously expressed in the static images. Thus, the intensity of expressed emotion in the morphs was, on average, only half of that portrayed in the static displays. If the amygdala simply codes the perceived intensity of fear in the expressor’s face, one would expect more activation for the static than dynamic stimuli. The statistical parametric maps do not support this interpretation. Alternatively, the amygdala’s response may shift in latency to the time point at which full intensity was expressed. The temporal resolution of fMRI may be insufficient to reveal this possibility.

Neural processing in the amygdala may relate to abstract or cognitive representations of fear (Phelps et al., 2001). Previous studies have demonstrated that the amygdala’s response to sensory stimuli depends upon contextual cues and evaluative processes. For instance, amygdala activity to facial expressions increases in a mood-congruent fashion (Schneider et al., 1997), and amygdala activation to neutral faces is greater in social phobics, who may interpret these stimuli as threatening (Birbaumer et al., 1998). The amygdala is also engaged by neutral stimuli that signal possible or impending threatening events, as in fear conditioning (Büchel et al., 1998; LaBar et al., 1998) and anticipatory anxiety (Phelps et al., 2001). Rapid changes in fear intensity (and perhaps facial identity) may indicate an imminent source of threat in the environment, which recruits amygdala activity as a trigger for the defense/vigilance system (Whalen, 1998; Fendt and Fanselow, 1999; LaBar and LeDoux, 2001). Further work is needed to elucidate the contributions of the amygdala to perceptual, experiential and cognitive aspects of fear.

The present results may partly explain why patients with amygdala lesions often exhibit deficits in judging the intensity of fear in facial displays. These patients sometimes have sufficient semantic knowledge to label fear, but they typically underestimate the degree of fear expressed in posed displays [(Adolphs et al., 1994; Young et al., 1995; Calder et al., 1996; Broks et al., 1998; Anderson et al., 2000; Anderson and Phelps, 2000); but see (Hamann et al., 1996)]. Fear recognition tasks require perceptual judgements of the extent of physiognomic change in the face relative to less fearful states and/or a canonical template of fear. If the amygdala codes dynamic cues in facial displays, these patients may have difficulty using kinetic information in face snapshots to make intensity judgements without additional contextual cues. Cognitive or experiential aspects of fear may also contribute to performance on recognition tasks. Testing amygdala-lesioned patients with dynamic stimuli may help determine the specific mechanism underlying their deficit and could potentially reduce the variability in performance across individual patients (Adolphs et al., 1999).

Role of the Superior Temporal Sulcus (STS) and Associated Regions

Electrophysiological and brain imaging studies in humans and monkeys have implicated the cortex in and surrounding the banks of the STS in the social perception of biological motion [reviewed in Allison et al. (Allison et al., 2000)]. Biological motion associated with eye gaze direction (Puce et al., 1998; Wicker et al., 1998; Hoffman and Haxby, 2000), mouth movements (Calvert et al., 1997; Puce et al., 1998; Campbell et al., 2001), and hand and body action sequences (Bonda et al., 1996; Howard et al., 1996; Grèzes et al., 1998; Neville et al., 1998; Grossman et al., 2000) engage the STS, particularly in its mid-to-posterior aspect. The present study extends the known role of the STS to the perception of dynamic changes in facial expression. The posterior aspect of the STS region responsive to the emotion morphs overlaps with the areas implicated in these previous studies. Activity of STS neurons in monkeys is evoked by static pictures of grimaces, yawns, threat displays and other expressions relevant for socioemotional interactions with conspecifics (Perrett et al., 1985, 1992; Hasselmo et al., 1989; Brothers and Ring, 1993). These images potentially recruit neural processing in the monkey STS because of implied motion in the expressions (Freyd, 1987; Allison et al., 2000; Kourtzi and Kanwisher, 2000; Senior et al., 2000). This may also explain why the STS was not activated in the emotion morph versus static emotion contrasts.

Interestingly, the STS preferentially signaled dynamic changes in facial expression relative to dynamic changes in facial identity. Given that rapid changes in facial identity do not exist in the physical world, this result supports the hypothesis that the STS distinguishes biologically plausible from biologically implausible or non-biological motion (Allison et al., 2000). In this regard, the STS is dissociated from visual motion area V5/MT+, which is situated just posterior and ventral to the banks of the STS (Zeki et al., 1991; McCarthy et al., 1995; Tootell et al., 1995; Dumoulin et al., 2000; Kourtzi and Kanwisher, 2000; Huk et al., 2002). As predicted, area MT+ was activated by all dynamic stimuli and did not prefer emotion over identity morphs. To our knowledge, this is the first demonstration of responses in area MT+ to dynamic facial expressions in humans. Differential STS processing of emotion morphs may alternatively relate to specific aspects of facial motion present in these stimuli. At a more relaxed statistical criterion, the posterior STS did discriminate fear from anger morphs, which involve distinct facial actions (Ekman and Friesen, 1978). Further research is needed to evaluate whether this STS activity is related to specific facial actions or reflects modulation by the amygdala or other limbic structures.

Multimodal portions of the STS integrate form and motion through anatomic links with both ventral and dorsal visual areas (Rockland and Pandya, 1981; Desimone and Ungerleider, 1986; Oram and Perrett, 1996). In turn, the STS is interconnected with the prefrontal cortex in a gradient from ventral to dorsal regions as one proceeds along its rostrocaudal extent (Petrides and Pandya, 1988). The STS is connected to limbic and paralimbic regions, such as the amygdala and cingulate gyrus, via direct projections (Herzog and Van Hoesen, 1976; Pandya et al., 1981; Amaral et al., 1992) and through dorsal frontoparietal and temporopolar interfaces (Barbas and Mesulam, 1981; Cavada and Goldman-Rakic, 1989; Petrides and Pandya, 1988, 1999; Morecraft et al., 1993). Components of this frontotemporolimbic circuit, including the medial fusiform gyrus, rostral area 8, medial prefrontal cortex/ventral anterior cingulate gyrus and temporopolar cortex/anterior STS, also distinguished emotion morphs from identity morphs in concert with the posterior STS.

Role of the Inferotemporal Cortex

Dissociable regions within the inferior temporal and fusiform gyri signaled dynamic changes in facial identity versus facial expression — anteromedial fusiform gyrus for expression changes and posterolateral inferotemporal cortex (inferior temporal gyrus and posterior fusiform gyrus) for identity changes. Anatomically segregated processing was also found across the superior and inferior temporal neocortex for facial affect and identity, respectively. Such regional specificity may account for the variability in performance across these two domains of face recognition in prosopagnosics with varying locations and extents of brain damage (Hasselmo et al., 1989; Humphreys et al., 1993; Haxby et al., 2000). Portions of the fusiform gyrus exhibited category specificity for fear over anger morphs, perhaps due to modulatory feedback from limbic structures such as the amygdala (Amaral et al., 1992). Previous imaging studies have shown enhanced fusiform gyrus activity for fearful expressions (Breiter et al., 1996; Sprengelmeyer et al., 1997; Pessoa et al., 2002). As revealed by connectivity modeling, the amygdala interacts with various sectors along the ventral visual stream during facial affect perception tasks (Morris et al., 1998; Pessoa et al., 2002).

Computational models and single-cell recordings in monkeys support a role for the inferior temporal gyrus in neural coding of facial identity independent of facial affect (Hasselmo et al., 1989; Haxby et al., 2000). Inferotemporal activity in the present study may reflect dual coding of the identities present within the morph, since this area is hypothesized to participate in the structural encoding of faces (Kanwisher et al., 1997; Allison et al., 1999). This possibility could be confirmed in electrophysiological experiments with high temporal resolution. Importantly, the morph stimuli were created with smooth transitions between frames and presented at a rate that avoided ‘strobing’ effects, which potentially engender recoding of each face in successive frames. Alternatively, face processing along the inferotemporal cortex may be subject to attentional modulation. Campbell et al. (Campbell et al., 2001) found greater activity in inferotemporal cortex during viewing of ‘meaningless’ facial actions (gurning) relative to ‘meaningful’ facial actions of speech-reading. These authors also postulated an attentional account for their results.
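The smooth-transition property of the morph stimuli noted above can be illustrated in a minimal sketch. The actual stimuli were built with landmark-based morphing software, so the per-pixel linear cross-dissolve below is a simplified assumption, and the image sizes and frame count are hypothetical; the point is only that dividing the change into many small, equal steps keeps consecutive frames nearly identical and avoids 'strobing':

```python
import numpy as np

def morph_sequence(start, end, n_frames=20):
    """Linearly cross-dissolve between two aligned grayscale images.

    Returns n_frames images stepping from `start` to `end` in equal
    increments, so consecutive frames differ only slightly.
    """
    start = start.astype(float)
    end = end.astype(float)
    weights = np.linspace(0.0, 1.0, n_frames)
    return [(1 - w) * start + w * end for w in weights]

# Hypothetical 64x64 'neutral' and 'fearful' images (random stand-ins
# for real face photographs).
rng = np.random.default_rng(0)
neutral = rng.uniform(0, 255, (64, 64))
fearful = rng.uniform(0, 255, (64, 64))

frames = morph_sequence(neutral, fearful, n_frames=20)

# The largest per-pixel jump between consecutive frames is only
# 1/(n_frames - 1) of the total change -- the smoothness constraint.
max_step = max(np.abs(b - a).max() for a, b in zip(frames, frames[1:]))
```

With 20 frames the maximum frame-to-frame change is roughly 5% of the total neutral-to-fearful difference, so no single transition is abrupt enough to read as a new face.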

Limitations and Future Directions

The present study was limited in three primary ways. First, only morphed images of fear and anger were presented. It is unknown whether the results extend to other emotional expression categories. The creation of morphed stimuli is time-consuming and inclusion of all categories is difficult to accommodate within a single event-related fMRI paradigm. Future studies should compare morphed images of fear and anger to other expressions to determine the specificity of the present results. Second, only incremental emotional expression changes were presented. Future studies should compare incremental versus decremental changes in fear and anger to determine the sensitivity of the brain regions to directional aspects of morphed expressions. Finally, the individuals in the present study were less accurate in their categorical recognition of dynamic relative to static images. Although the statistical analysis of accuracy revealed only one brain area that may reflect cognitive effort on the task (BA 44), this analysis may have been underpowered due to sample size constraints. Future studies should compare activation to dynamic and static facial expressions under experimental conditions in which task performance is equated and/or unrelated to the primary experimental manipulation (e.g. during gender judgements).

Conclusion

Temporal cues play an important role in the neural coding of facial affect that requires integration between ventrolimbic brain regions sensitive to emotional salience and frontotemporal regions triggered by biologically relevant motion. The role of dynamic information in face perception has been largely ignored by neuroscientists. As noted by Gloor [(Gloor, 1997), p. 220], ‘One only has to think of how much more information the sight of a moving as opposed to an immobile face conveys to realize the importance of such a mechanism’. Many structures implicated in facial affect perception preferred dynamic over static displays, and these regions were dissociated from other brain areas responsive to rapid changes in facial identity. Anatomically segregated patterns of brain activation across temporal, frontal and cingulate cortices provide support for computational models and single-cell recordings in monkeys regarding the partial independence of facial affect and identity processing. The use of an event-related design further ensured that the signal changes were stimulus-driven and not related to potential mood induction across blocks of emotionally valent trials. Adoption of this experimental approach may invigorate new directions of cognitive neuroscience research using interactive stimuli that more closely approximate real-time changes in facial expression relevant for the social communication of emotions.

Notes

This work was supported by National Institutes of Health grants R01 DA14094 and U54 MH06418, a NARSAD Young Investigator Award, and a Ralph E. Powe Junior Faculty Enhancement Award from Oak Ridge Associated Universities. The authors wish to thank Paul Kartheiser and Harlan Fichtenholtz for assistance with data collection and analysis.

Address correspondence to Kevin S. LaBar, Center for Cognitive Neuroscience, Box 90999, Duke University, Durham, NC 27708, USA. Email: kevin.labar@duke.edu.

Table 1

fMRI activation to emotion morphs (fear/anger combined)

Emotion morph > static emotion BA Side T value x y z 
Middle temporal gyrus 21 left 7.67 −53 −60 
  right 5.90 49 −68 −4 
Intraparietal sulcus 7/40 left 6.70 −34 −64 26 
  right 4.53 34 −83 30 
Substantia innominata  left 6.39 −11 −4 −15 
  right 3.99 15 −15 
Amygdala  left 6.02 −19 −8 −23 
  right 5.23 26 −4 −23 
Parahippocampal gyrus 28/36 right 5.89 19 −26 −11 
  left 5.32 −11 38 −15 
Fusiform gyrus 19/37 left 5.83 −49 −60 −19 
  right 5.30 53 −49 −19 
Inferior frontal gyrus 45 left 5.42 −53 26 11 
 47 left 5.28 −34 19 −19 
  right 5.00 34 23 −11 
Dorsomedial prefrontal cortex 32 midline 5.17 −4 26 41 
Precentral sulcus left 4.78 −41 −4 38 
Table 2

fMRI activation to emotion morphs, broken down by category

 BA Side T value x y z 
Fear morph > fear static 
    Substantia innominata  right 9.02 11 −4 −15 
  left 8.33 −15 −15 
    Dorsomedial prefrontal cortex 32 midline 7.46 26 38 
    Middle temporal gyrus 21 right 6.64 53 −68 −4 
  left 5.57 −49 −71 
    Ventral striatum  left 6.59 −15 11 −4 
    Midbrain  left 5.63 −11 −41 −11 
  right 4.48 −30 −23 
    Fusiform/inferior temporal gyri 19/37 left 5.43 −53 −64 −15 
  right 5.2 50 −56 −11 
    Inferior frontal gyrus 47 left 5.16 −30 23 −23 
 44 right 4.19 53 19 
    Precentral sulcus left 4.82 −54 38 
    Intraparietal sulcus right 4.77 38 −79 23 
    Amygdala  right 4.75 26 −4 −26 
  left 4.31 −15 −15 −11 
Anger morph > anger static 
    Inferior frontal gyrus 47 right 6.77 38 26 −11 
  left 4.89 −53 26 
    Middle temporal gyrus 21 right 5.91 49 −68 
  left 5.31 −53 −64 
    Precentral sulcus left 5.79 −41 −4 34 
    Fusiform gyrus 19/37 left 5.27 −49 −68 −19 
Table 3

Comparison of fMRI activation to fear and anger morphs

 BA Side T value x y z 
Fear morph > anger morph 
    Fusiform gyrus 37 right 6.63 38 −68 −15 
    Middle frontal gyrus 9/46 right 6.50 38 34 23 
  left 6.36 −34 41 23 
    Midbrain  right 6.00 11 −23 −11 
    Amygdala  right 5.43 11 −11 −19 
    Posterior cingulate gyrus 31 midline 5.38 −38 45 
    Parahippocampal gyrus 28/36 left 5.33 −23 −19 −26 
    Inferior frontal gyrus 44 right 4.96 60 23 
    Dorsomedial prefrontal cortex 32 right 4.30 11 49 
    Superior temporal sulcus* 22 right 4.06 53 −38 
Anger morph > fear morph 
    Supramarginal gyrus 40 right 6.21 49 −53 34 
    Inferior frontal gyrus 47 right 5.47 19 34 −4 
*Cluster threshold reduced to three voxels. 
Table 4

fMRI activation to identity morphs

Identity morph > neutral static BA Side T value x y z 
Fusiform gyrus 19/37 right 12.67 49 −56 −19 
  left 7.28 −49 −53 −26 
Intraparietal sulcus 7/40 right 8.59 34 −56 45 
  left 5.72 −41 −64 30 
Inferior frontal gyrus 47 right 7.92 45 26 −8 
  left 6.88 −30 19 −8 
 45 right 6.82 53 23 
  left 5.85 −49 41 −4 
Middle temporal gyrus 21 right 7.37 64 −45 −11 
  left 5.41 −56 −53 
Thalamus  right 6.85 11 −15 
  left 6.58 −8 −30 −19 
Caudate nucleus  right 5.99 11 11 
Precentral sulcus left 6.46 −49 45 
Dorsomedial prefrontal cortex 32 midline 6.44 −8 23 49 
Amygdala  left 4.09 −15 −8 −19 
Table 5

Comparison of fMRI activation to emotion morphs versus identity morphs

 BA Side T value x y z 
Emotion morph > identity morph 
    Fusiform gyrus 37 right 8.26 30 −49 −11 
    Middle frontal gyrus right 6.85 34 30 45 
  left 6.20 −34 34 38 
    Anterior cingulate gyrus 24 midline 5.75 38 −4 
    Superior temporal sulcus (anterior) 21/38 right 4.84 53 −4 −25 
    Ventromedial prefrontal cortex 10/32 midline 4.75 53 
    Temporal pole 38 right 4.02 38 −26 
    Superior temporal sulcus (posterior) 21 right 4.49 68 −30 
 39 left 3.86 −60 −41 26 
Identity morph > emotion morph 
    Inferior temporal gyrus 37 right 6.32 64 −41 −11 
  left 4.13 −60 −41 −11 
    Fusiform gyrus 19 right 5.99 45 −60 −19 
    Inferior frontal gyrus 44 left 4.96 −53 11 38 
  right 3.57 49 11 23 
 right 4.15 56 23 23 
    Anterior cingulate gyrus 24 midline 4.73 26 41 
    Caudate nucleus  right 4.58 11 
    Intraparietal sulcus 7/40 left 4.20 −34 −64 53 
  right 4.08 34 −75 45 
Table 6

fMRI results: emotion by motion interaction

 BA Side T value x y z 
Motion effect for emotional expression > Motion effect for neutral expression 
    Fusiform gyrus 37 right 6.95 30 −49 −11 
    Middle frontal gyrus left 6.89 −30 38 38 
  right 4.64 38 30 41 
    Anterior cingulate gyrus 24 midline 4.64 38 −4 
    Ventromedial prefrontal cortex 10/32 midline 4.51 56 
    Superior temporal sulcus (posterior) 39 left 3.99 −60 −41 26 
Motion effect for neutral expression > Motion effect for emotional expression 
    Inferior temporal gyrus 37 right 6.73 64 −41 −15 
    Fusiform gyrus 19 right 6.05 41 −60 −19 
    Inferior frontal gyrus 44 left 5.66 −53 11 34 
 right 5.06 56 23 23 
    Intraparietal sulcus 7/40 left 4.83 −34 −68 49 
    Caudate nucleus  right 4.64 11 
    Ventral striatum  right 4.30 −4 
    Anterior cingulate gyrus 24 midline 4.32 26 45 
Table 7

fMRI results: main effect of motion

Morphed images > static images BA Side T value x y z 
Middle temporal gyrus 21 right 10.71 49 −56 −15 
  left 7.61 −53 −60 −4 
Amygdala  left 10.30 −15 −4 −19 
  right 8.24 15 −15 −19 
Inferior frontal gyrus 47 left 7.57 −38 15 −15 
  right 7.43 34 23 −8 
Intraparietal sulcus 7/40 right 6.56 34 −75 30 
  left 6.13 −26 −64 49 
Dorsomedial prefrontal cortex 32 midline 6.44 −4 30 38 
Caudate nucleus  left 4.60 −8 11 
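The tables above report peak T statistics from the random-effects analysis. As a rough guide to the nominal significance such peaks carry, a peak T converts to an uncorrected two-tailed p-value via the Student t survival function. The degrees of freedom below are a hypothetical illustration (e.g. df = 9 for a 10-subject random-effects test); the actual df depends on the sample size, which is not stated in this section:

```python
from scipy import stats

def peak_p_value(t_value, df):
    """Uncorrected two-tailed p-value for a peak T statistic."""
    return 2 * stats.t.sf(abs(t_value), df)

# Illustrative peaks taken from Tables 1 and 3; df=9 is an assumed
# degrees-of-freedom value, not one reported by the study.
p_mtg = peak_p_value(7.67, df=9)  # middle temporal gyrus, Table 1
p_sts = peak_p_value(4.06, df=9)  # superior temporal sulcus, Table 3
```

Note that these are voxel-wise uncorrected values; the study's own thresholds additionally used cluster-extent criteria (cf. the Table 3 footnote), which such a point conversion does not capture.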
Figure 1.

Experimental paradigm. Examples of static angry and neutral expressions and four frames of a fear morph are depicted. ISI = interstimulus interval.

Figure 2.

Random-effects analysis of the group-averaged fMRI results. (a) Brain regions showing greater activity to dynamic emotional expression changes than to static faces portraying a fixed emotion (fear and anger combined). Color bar indicates T values. (b) Brain regions showing greater activity to dynamic changes in facial identity than to static faces with neutral expression. Color bar indicates T values. (c) Dissociable brain regions for the perception of fearful (red) versus angry (green) emotional expression changes. T values are thresholded and truncated at T = 2.5 for illustration purposes. (d) Dissociable brain regions for the perception of changes in facial affect (red; fear and anger combined) versus changes in facial identity (green). T values are thresholded and truncated at T = 2.5 for illustration purposes. Note that emotion morphs and identity morphs engage different sectors within frontal, temporal and cingulate cortices. Abbreviations: ACG = anterior cingulate gyrus, AMY = amygdala, aSTS = anterior superior temporal sulcus, dmPFC = dorsomedial prefrontal cortex, FFG = fusiform gyrus, IFG = inferior frontal gyrus, IPS = intraparietal sulcus, ITG = inferior temporal gyrus, MFG = middle frontal gyrus, MT = motion-sensitive area of middle temporal gyrus, PCG = posterior cingulate gyrus, pSTS = posterior superior temporal sulcus, SI = substantia innominata, SMG = supramarginal gyrus, Th = thalamus.

References

Adolphs R, Tranel D, Damasio H, Damasio A (
1994
) Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala.
Nature
 
372
:
669
–672.
Adolphs R, Tranel D, Hamann SB, Young A, Calder A, Anderson AK et al. (
1999
) Recognition of facial emotion in nine subjects with bilateral amygdala damage.
Neuropsychologia
 
37
:
1111
–1117.
Allison T, Puce A, Spencer DD, McCarthy G (
1999
) Electrophysiological studies of human face perception. I: Potentials generated in occipitotemporal cortex by face and non-face stimuli.
Cereb Cortex
 
9
:
415
–430.
Allison T, Puce A, McCarthy G (
2000
) Social perception from visual cues: role of the STS region.
Trends Cogn Sci
 
4
:
267
–278.
Amaral DG, Price JL, Pitkänen A, Carmichael ST (
1992
) Anatomical organization of the primate amygdaloid complex. In: The amygdala: neurobiological aspects of emotion, memory, and mental dysfunction (Aggleton JP, ed.), pp. 1–66. New York: Wiley-Liss.
Anderson AK, Phelps EA (
2000
) Expression without recognition: contributions of the human amygdala to emotional communication.
Psychol Sci
 
11
:
106
–111.
Anderson AK, Spencer DD, Fulbright RK, Phelps EA (
2000
) Contribution of the anteromedial temporal lobes to the evaluation of facial emotion.
Neuropsychology
 
14
:
526
–536.
Barbas H, Mesulam M-M (
1981
) Organization of afferent input to subdivisions of area 8 in the rhesus monkey.
J Comp Neurol
 
200
:
407
–431.
Bassili JN (
1978
) Facial motion in the perception of faces and of emotional expression.
J Exp Psychol Hum Percept Perform
 
4
:
373
–379.
Bassili JN (
1979
) Emotion recognition: the role of facial movement and the relative importance of upper and lower areas of the face.
J Pers Soc Psychol
 
37
:
2049
–2058.
Birbaumer N, Grodd W, Diedrich O, Klose U, Erb M, Lotze M et al. (
1998
) FMRI reveals amygdala activation to human faces in social phobics.
Neuroreport
 
9
:
1223
–1226.
Blair RJR, Morris JS, Frith CD, Perrett DI, Dolan RJ (
1999
) Dissociable neural responses to facial expressions of sadness and anger.
Brain
 
122
:
883
–893.
Blair RJR, Colledge E, Murray L, Mitchell DGV (
2001
) A selective impairment in the processing of sad and fearful expressions in children with psychopathic tendencies.
J Abnorm Child Psychol
 
29
:
491
–498.
Bonda E, Petrides M, Ostry D, Evans A (
1996
) Specific involvement of human parietal systems and the amygdala in the perception of biological motion.
J Neurosci
 
16
:
3737
–3744.
Breiter HC, Etcoff NL, Whalen PJ, Kennedy WA, Rauch SL, Buckner RL et al. (
1996
) Response and habituation of the human amygdala during visual processing of facial expression.
Neuron
 
17
:
875
–887.
Broks P, Young AW, Maratos EJ, Coffey PJ, Calder AJ, Isaac C et al. (
1998
) Face processing impairments after encephalitis: amygdala damage and recognition of fear.
Neuropsychologia
 
36
:
59
–70.
Brothers L, Ring B (
1993
) Mesial temporal neurons in the macaque monkey with responses selective for aspects of social stimuli.
Behav Brain Res
 
57
:
53
–61.
Brothers L, Ring B, Kling A (
1990
) Response of neurons in the macaque amygdala to complex social stimuli.
Behav Brain Res
 
41
:
199
–213.
Büchel C, Morris JS, Dolan RJ, Friston KJ (
1998
) Brain systems mediating aversive conditioning: an event-related fMRI study.
Neuron
 
20
:
947
–957.
Calder AJ, Young AW, Rowland D, Perrett DI, Hodges JR, Etcoff NL (
1996
) Facial emotion recognition after bilateral amygdala damage: differentially severe impairment of fear.
Cogn Neuropsychol
 
13
:
699
–745.
Calvert GA, Bullmore ET, Brammer MJ, Campbell R, Williams SCR, McGuire PW et al. (
1997
) Activation of auditory cortex during silent lipreading.
Science
 
276
:
593
–596.
Campbell R, MacSweeney M, Surguladze S, Calvert G, McGuire P, Suckling J et al. (
2001
) Cortical substrates for the perception of face actions: an fMRI study of the specificity of activation for seen speech and for meaningless lower-face acts (gurning).
Cogn Brain Res
 
12
:
233
–243.
Canli T, Sivers H, Whitfield SL, Gotlib IH, Gabrieli JDE (
2002
) Amygdala response to happy faces as a function of extraversion.
Science
 
296
:
2191
.
Cavada C, Goldman-Rakic PS (
1989
) Posterior parietal cortex in rhesus monkey: II. Evidence for segregated corticocortical networks linking sensory and limbic areas with the frontal lobe.
J Comp Neurol
 
287
:
422
–445.
Christie F, Bruce V (
1998
) The role of dynamic information in the recognition of unfamiliar faces.
Mem Cognit
 
26
:
780
–790.
de Gelder B, Vroomen J, van der Heide L (
1991
) Face recognition and lip-reading in autism.
Eur J Cogn Psychol
 
3
:
69
–86.
Desimone R, Ungerleider LG (
1986
) Multiple visual areas in the caudal superior temporal sulcus of the macaque.
J Comp Neurol
 
248
:
164
–189.
Dumoulin SO, Bittar RG, Kabani NJ, Baker CL Jr, LeGoualher G, Pike BG et al. (
2000
) A new anatomical landmark for reliable identification of human area V5/MT: a quantitative analysis of sulcal patterning.
Cereb Cortex
 
10
:
454
–463.
Edwards K (
1998
) The face of time: temporal cues in facial expressions of emotion.
Psychol Sci
 
9
:
270
–276.
Ekman P, Friesen WV (
1976
) Measuring facial movement.
Environ Psychol Nonverb Behav
 
1
:
56
–75.
Ekman P, Friesen WV (
1978
) The facial action coding system. Palo Alto, CA: Consulting Psychologists Press.
Ekman P, Friesen W V (
1982
) Felt, false, and miserable smiles.
J Nonverb Behav
 
6
:
238
–252.
Fendt M, Fanselow MS (
1999
) The neuroanatomical and neurochemical basis of conditioned fear.
Neurosci Biobehav Rev
 
23
:
743
–760.
Freyd JJ (
1987
) Dynamic mental representations.
Psychol Rev
 
94
:
427
–238.
Gepner B, Deruelle C, Grynfeltt S (
2001
) Motion and emotion: a novel approach to the study of face processing by young autistic children.
J Autism Dev Disord
 
31
:
37
–45.
Gloor P (
1997
) The temporal lobe and limbic system. New York: Oxford University Press.
Grèzes J, Costes N, Decety J (
1998
) Top-down effect of strategy on the perception of human biological motion: a PET investigation.
Cogn Neuropsychol
 
15
:
553
–582.
Grossman E, Donnelly M, Price R, Pickens D, MorganV, Neighbor G et al. (
2000
) Brain areas involved in perception of biological motion.
J Cogn Neurosci
 
12
:
711
–720.
Hamann SB, Stefanacci L, Squire LR, Adolphs R, Tranel D, Damasio H, Damasio A (
1996
) Recognizing facial emotion.
Nature
 
379
:
497
.
Hariri AR, Bookheimer SY, Mazziotta JC (
2000
) Modulating emotional responses: effects of a neocortical network on the limbic system.
Neuroreport
 
11
:
43
–48.
Hariri AR, Mattay VS, Tessitore A, Kolachana B, Fera F, Goldman D et al. (
2002
) Serotonin transporter genetic variation and the response of the human amygdala.
Science
 
297
:
400
–403.
Hart AJ, Whalen PJ, Shin LM, McInerney SC, Fischer H, Rauch SL (
2000
) Differential response in the human amygdala to racial outgroup vs. ingroup face stimuli.
Neuroreport
 
11
:
2351
–2355.
Hasselmo ME, Rolls ET, Bayli, GC (
1989
) The role of expression and identity in the face-selective responses of neurons in the temporal visual cortex of the monkey.
Behav Brain Res
 
32
:
203
–218.
Haxby JV, Hoffman EA, Gobbini MI (
2000
) The distributed human neural system for face perception.
Trends Cogn Sci
 
4
:
223
–233.
Herzog AW, Van Hoesen GW (
1976
) Temporal neocortical afferent connections to the amygdala in the rhesus monkey.
Brain Res
 
115
:
57
–69.
Hess U, Kleck RE (
1997
) Differentiating emotion elicited and deliberate emotional facial expressions. In: What the face reveals: basic and applied studies of spontaneous expression using the facial action coding system (FACS) (Ekman P, Rosenberg EL, eds), pp. 271–286. New York: Oxford University Press.
Hill H, Johnston A (
2001
) Categorizing sex and identity from the biological motion of faces.
Curr Biol
 
11
:
880
–885.
Hoffman EA, Haxby JV (
2000
) Distinct representations of eye gaze and identity in the distributed human neural system for face perception.
Nat Neurosci
 
3
:
80
–84.
Howard RJ, Brammer M, Wright I, Woodruff PW, Bullmore ET, Zeki S (
1996
) A direct demonstration of functional specialization within motion-related visual and auditory cortex of the human brain.
Curr Biol
 
6
:
1015
–1019.
Huk AC, Dougherty RF, Heeger DJ (
2002
) Retintotopy and functional subdivision of human areas MT and MST.
J Neurosci
 
22
:
7195
–7205.
Humphreys GW, Donnelly N, Riddoch MJ (
1993
) Expression is computed separately from facial identity, and it is computed separately for moving and static faces: neuropsychological evidence.
Neuropsychologia
 
31
:
173
–181.
Jordan H, Reiss JE, Hoffman JE, Landau B (
2002
) Intact perception of biological motion in the face of profound spatial deficits: Williams syndrome.
Psychol Sci
 
13
:
162
–167.
Kamachi M, Bruce V, Mukaida S, Gyoba J, Yoshikawa S, Akamatsu S (
2001
) Dynamic properties influence the perception of facial expressions.
Perception
 
30
:
875
–887.
Kanwisher N, McDermott J, Chun MM (
1997
) The fusiform face area: a module in human extrastriate cortex specialized for face perception.
J Neurosci
 
17
:
4302
–4311.
Kawashima R, Sugiura M, Kato T, Nakamura A, Natano K, Ito K et al. (
1999
) The human amygdala plays an important role in gaze monitoring.
Brain
 
122
:
779
–783.
Kesler-West ML, Andersen AH, Smith CD, Avison MJ, Davis CE, Kryscio RJ et al. (2001) Neural substrates of facial emotion processing using fMRI. Cogn Brain Res 11:213–226.
Kourtzi Z, Kanwisher N (2000) Representation of perceived object shape by the human lateral occipital cortex. Science 283:1506–1509.
LaBar KS, LeDoux JE (2001) Coping with danger: the neural basis of defensive behaviors and fearful feelings. In: Handbook of physiology, section 7: the endocrine system, Vol. IV: coping with the environment: neural and endocrine mechanisms (McEwen BS, ed.), pp. 139–154. New York: Oxford University Press.
LaBar KS, Gatenby JC, Gore JC, LeDoux JE, Phelps EA (1998) Human amygdala activation during conditioned fear acquisition and extinction: a mixed-trial fMRI study. Neuron 20:937–945.
Lander K, Christie F, Bruce V (1999) The role of movement in the recognition of famous faces. Mem Cognit 27:974–985.
Matsumoto D, Ekman P (1989) American–Japanese cultural differences in intensity ratings of facial expressions of emotion. Motiv Emot 13:143–157.
McCarthy G, Spicer M, Adrignolo A, Luby M, Gore J, Allison T (1995) Brain activation associated with visual motion studied by functional magnetic resonance imaging in humans. Hum Brain Mapp 2:234–243.
Mitchell RW (1993) Animals as liars: the human face of nonhuman duplicity. In: Lying and deception in everyday life (Lewis M, Saarni C, eds), pp. 59–89. New York: Guilford Press.
Morecraft RJ, Geula C, Mesulam M-M (1993) Architecture of connectivity within a cingulofrontoparietal neurocognitive network. Arch Neurol 50:279–284.
Morris JS, Frith CD, Perrett DI, Rowland D, Young AW, Calder AJ et al. (1996) A differential neural response in the human amygdala to fearful and happy facial expressions. Nature 383:812–815.
Morris JS, Friston KJ, Büchel C, Frith CD, Young AW, Calder AJ et al. (1998) A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain 121:47–57.
Neville HJ, Bavelier D, Corina D, Rauschecker J, Karni A, Lalwani A et al. (1998) Cerebral organization for language in deaf and hearing subjects: biological constraints and effects of experience. Proc Natl Acad Sci USA 95:922–929.
Niedenthal PM, Halberstadt JB, Margolin J, Innes-Ker Å-H (2000) Emotional state and the detection of change in facial expression of emotion. Eur J Soc Psychol 30:211–222.
Oram MW, Perrett DI (1996) Integration of form and motion in the anterior superior temporal polysensory area (STPa) of the macaque monkey. J Neurophysiol 76:109–129.
Pandya DN, Van Hoesen GW, Mesulam M-M (1981) Efferent connections of the cingulate gyrus in the rhesus monkey. Exp Brain Res 42:319–330.
Perrett DI, Smith PAJ, Potter DD, Mistlin AJ, Head AS, Milner AD et al. (1985) Visual cells in the temporal cortex sensitive to face view and gaze direction. Proc R Soc Lond B 223:293–317.
Perrett DI, Hietanen JK, Oram MW, Benson PJ (1992) Organization and functions of cells responsive to faces in the temporal cortex. Phil Trans R Soc Lond B 335:23–30.
Pessoa L, McKenna M, Gutierrez E, Ungerleider LG (2002) Neural processing of emotional faces requires attention. Proc Natl Acad Sci USA 99:11458–11463.
Petrides M, Pandya DN (1988) Association fiber pathways to the frontal cortex from the superior temporal region in the rhesus monkey. J Comp Neurol 273:52–66.
Petrides M, Pandya DN (1999) Dorsolateral prefrontal cortex: comparative cytoarchitectonic analysis in the human and the macaque brain and corticocortical connection patterns. Eur J Neurosci 11:1011–1036.
Phelps EA, O’Connor KJ, Gatenby JC, Gore JC, Grillon C, Davis M (2000) Activation of the left amygdala to a cognitive representation of fear. Nat Neurosci 4:437–441.
Phelps EA, O’Connor KJ, Cunningham WA, Funayama ES, Gatenby JC, Gore JC et al. (2001) Performance on indirect measures of race evaluation predicts amygdala activation. J Cogn Neurosci 12:729–738.
Phillips ML, Young AW, Senior C, Brammer M, Andrew C, Calder AJ et al. (1997) A specific neural substrate for perceiving facial expressions of disgust. Nature 389:495–498.
Phillips ML, Young AW, Scott SK, Calder AJ, Andrew C, Giampietro V et al. (1998) Neural responses to facial and vocal expressions of fear and disgust. Proc R Soc Lond B 265:1809–1817.
Pine DS, Szeszko PR, Bilder RM, Ardekani B, Grun J, Zarahn E et al. (2001) Cortical brain regions engaged by masked emotional faces in adolescents and adults: an fMRI study. Emotion 1:137–147.
Puce A, Allison T, Bentin S, Gore JC, McCarthy G (1998) Temporal cortex activation in humans viewing eye and mouth movements. J Neurosci 18:2188–2199.
Rockland KS, Pandya DN (1981) Cortical connections of the occipital lobe in the rhesus monkey: interconnections between areas 17, 18, 19 and the superior temporal sulcus. Brain Res 212:249–270.
Schneider F, Grodd W, Weiss U, Klose U, Mayer KR, Nagele T et al. (1997) Functional MRI reveals left amygdala activation during emotion. Psychiatr Res 76:75–82.
Seamon JG (1982) Dynamic facial recognition: examination of a natural phenomenon. Am J Psychol 85:363–381.
Senior C, Barnes J, Giampietro V, Simmons A, Bullmore ET, Brammer M et al. (2000) The functional neuroanatomy of implicit-motion perception or ‘representational momentum’. Curr Biol 10:16–22.
Spencer J, O’Brien J, Riggs K, Braddick O, Atkinson J, Wattam-Bell J (2000) Motion processing in autism: evidence for a dorsal stream deficiency. Neuroreport 11:2765–2767.
Sprengelmeyer R, Rausch M, Eysel UT, Przuntek H (1997) Neural structures associated with recognition of facial expressions of basic emotions. Proc R Soc Lond B 265:1927–1931.
Talairach J, Tournoux P (1988) Co-planar stereotaxic atlas of the human brain. New York: Thieme.
Thornton IM, Kourtzi Z (2002) A matching advantage for dynamic human faces. Perception 31:113–132.
Tootell RBH, Reppas JB, Kwong KK, Malach R, Born RT, Brady TJ et al. (1995) Functional analysis of human MT and related visual cortical areas using magnetic resonance imaging. J Neurosci 15:3215–3230.
Voyvodic JT (1999) Real-time fMRI paradigm control, physiology, and behavior combined with near real-time statistical analysis. Neuroimage 10:91–106.
Vuilleumier P, Armony JL, Driver J, Dolan RJ (2001) Effects of attention and emotion on face processing in the human brain: an event-related fMRI study. Neuron 30:829–841.
Whalen PJ (1998) Fear, vigilance, and ambiguity: initial neuroimaging studies of the human amygdala. Curr Direct Psychol Sci 7:177–188.
Whalen PJ, Rauch SL, Etcoff NL, McInerney SC, Lee MB, Jenike MA (1998) Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. J Neurosci 18:411–418.
Whalen PJ, Shin LM, McInerney SC, Fischer H, Wright CI, Rauch SL (2001) A functional MRI study of human amygdala responses to facial expressions of fear versus anger. Emotion 1:70–83.
Wicker B, Michel F, Henaff MA, Decety J (1998) Brain regions involved in the perception of gaze: a PET study. Neuroimage 8:221–227.
Young AW, Aggleton JP, Hellawell DJ, Johnson M, Broks P, Hanley JR (1995) Face processing impairments after amygdalotomy. Brain 118:15–24.
Zeki SM, Watson JD, Lueck CJ, Friston KJ, Kennard C, Frackowiak RS (1991) A direct demonstration of functional specialization in human visual cortex. J Neurosci 11:641–649.