Abstract

The attentional effects triggered by emotional stimuli in humans have been substantially investigated, but little is known about the impact of affective valence on the processing of meaning. Here, we used a cross-modal priming paradigm involving visually presented adjective–noun dyads and environmental sounds of controlled affective valence to test the contributions of conceptual relatedness and emotional congruence to priming. Participants undergoing event-related potential recording indicated whether target environmental sounds were related in meaning to adjective–noun dyads presented as primes. We tested spontaneous emotional priming by manipulating the congruence between the affective valence of the adjective in the prime and that of the sound. While the N400 was significantly reduced in amplitude by both conceptual relatedness and emotional congruence, there was no interaction between the 2 factors. The same pattern of results was found when participants judged the emotional congruence between environmental sounds and adjective–noun dyads. These results support the hypothesis that conceptual and emotional processes are functionally independent regardless of the specific cognitive focus of the comprehender.

Introduction

Intuitively, there seems to be a dissociation between subjective emotions and objective knowledge (as French philosopher Blaise Pascal put it: “The heart has its reasons, of which reason knows nothing.”). Indeed, one can consciously analyze a situation or a meaningful event as rationally acceptable while it is affectively shocking, and vice versa. However, the hypothesis of a relative independence of affective vis-à-vis other conceptual processing lacks scientific evidence, and the question of interactions between our emotional state and our knowledge of the world has seldom been addressed. How is the meaning of external sensory events (e.g., environmental sounds) extracted by the brain in relation to their affective value as compared with nonaffective conceptual content?

This study focuses on a peak of event-related brain potentials (ERPs), the N400, to examine differences between conceptual and affective processing in a priming paradigm. ERPs are averaged brain signals recorded from the surface of the scalp and time locked to the onset of a stimulus of interest. The N400 is a negative-going waveform that peaks around 400 ms after stimulus onset. It is classically viewed as an index of semantic integration of meaningful stimuli within the context of presentation (Kutas and Hillyard 1980, 1984). For example, N400 amplitude is reduced when the target word is preceded by a semantically related prime word (e.g., doctor—nurse) as compared with an unrelated prime word (e.g., doctor—table). This priming effect has been found across sensory modalities (Kutas and Federmeier 2000) and across verbal/nonverbal coding domains (e.g., Connolly et al. 1995; Grigor et al. 1999; Cummings et al. 2006; Orgs et al. 2006). In order to study affective versus nonaffective conceptual priming in the present experiment, we used environmental sounds as targets in a priming paradigm. N400 modulations by semantic relatedness in word-environmental sound pairs have been reported previously (Van Petten and Rheinfelder 1995; Cummings et al. 2006; Orgs et al. 2008; but see also Chiu and Schacter 1995; Stuart and Jones 1995).

Previous studies using emotionally valenced pictures and words have mainly focused on early posterior negativity (EPN; Kissler et al. 2007; Scott et al. 2009) and late emotion-related P3 effects (Fischler and Bradley 2006; Herbert et al. 2008). Only a few studies have examined the impact of affective valence on semantic processing using the N400. Research with emotionally valenced spoken words has indicated greater N400 negativity for emotionally incongruent as compared with congruent stimuli. For instance, Schirmer and Kotz (2003) independently manipulated affective prosody (happy, neutral, angry) and affective valence (positive, neutral, negative) of spoken words and asked participants to rate the stimuli in 1 of the 2 dimensions. In the affective valence rating condition, they found an interference effect for incongruent stimuli, which translated into greater N400 amplitudes. Furthermore, they found this effect only in female listeners. However, Schirmer and Kotz (2003) and Schirmer et al. (2002) did not compare the effects of emotional congruence and semantic relatedness. It is therefore unknown whether the evaluation of emotional prosody and that of factual content show relative independence or interact.

Other studies have reported that mismatch between affective valence of sentence prosody and affective valence of target words increases N400 amplitudes (Schirmer et al. 2002, 2005). Furthermore, within the visual modality, affective words embedded in sentences have been found to enhance N400 amplitude (Holt et al. 2008). But, to our knowledge, none of the previous studies attempted to distinguish semantic effects from affective valence effects. In addition, effects of emotionally valenced environmental sounds have seldom been investigated so far.

Here, we used a cross-modal priming paradigm to characterize the interplay of conceptual and emotional processing during the comprehension of written words and environmental sounds. Sounds rated on average as pleasant or as unpleasant in a preliminary evaluation session were presented in a cross-modal priming context. In each trial, a word pair consisting of an adjective (of controlled affective valence) and an emotionally neutral noun was displayed in standard left/right orientation at the center of a screen before an environmental sound was presented through headphones. The sound was either conceptually related or unrelated to the noun and either emotionally congruent or incongruent with the adjective (see Table 1). We elected to use adjective–noun dyads as primes because manipulating conceptual relatedness and emotional congruence using only nouns in a fully counterbalanced design proved impossible.

Table 1

Experimental design and stimulus examples

                                                    Conceptual relatedness between noun and sound
Emotional congruence                                Conceptually related (CR)    Conceptually unrelated (CU)
Sound emotionally congruent with adjective (EC)     Terrible car—car alarm       Depressing music—car alarm
Sound emotionally incongruent with adjective (EI)   Adorable car—car alarm       Inspiring music—car alarm

Note: The same adjective–noun pairs and the same environmental sounds were presented in all conditions, albeit paired differently. This prevented spurious ERP differences due to perceptual variability between experimental conditions.

Two groups of participants were asked to judge whether or not prime words and target sounds were related in meaning (Experiment 1) or congruent with regard to affective valence (Experiment 2), respectively. This design enabled us to test the independent effect of conceptual relatedness and emotional congruence on the amplitude of the N400. Critically, no behavioral or ERP differences could be attributed to spurious perceptual differences between conditions because the same sounds were used in each of the 4 experimental conditions. We predicted that conceptual relatedness would significantly reduce N400 mean amplitude and hypothesized the same effect for emotional congruence. Furthermore, based on the classical philosophical viewpoint regarding the relative independence of conceptual and emotional content evaluation, we hypothesized that the 2 factors would not interact at any point in time whether participants were focusing on the conceptual or emotional dimension of the stimuli.

Materials and Methods

Participants

Two groups of 14 participants, matched for gender, mean age, and level of education, gave informed consent to take part in the experiment, which was approved by the ethics committee of Bangor University. All participants were right handed based on the Edinburgh Handedness Inventory (Oldfield 1971) and had self-reported normal hearing. Participants were paid with course and print credits as part fulfillment of their degree.

Stimuli

Word stimuli consisted of 40 adjectives of positive valence (mean = 7.5 ± 0.6 on a scale of 1–9, 1 being very negative and 9 being very positive; Bradley and Lang 1999), 40 adjectives of negative valence (mean = 2.9 ± 1), and 40 highly imageable concrete neutral nouns (mean concreteness = 568 ± 67 and mean imageability = 594 ± 44, on a scale from 100 to 700, Coltheart 1981; mean valence = 5.2 ± 1.2). Auditory stimuli were 40 digitized sounds (44.1 kHz sampling rate, 16-bit encoding, mono, DC offset corrected, normalized to the same maximal peak amplitude, mean duration = 1115 ± 386 ms) of positive or negative valence based on a previous rating procedure involving 97 participants (scale of 1–5, 1 being unpleasant and 5 being pleasant; Thierry and Roberts 2007). Twenty sounds were positive (mean valence = 3.39 ± 0.32) and 20 were negative (mean valence = 1.97 ± 0.35). Both positive and negative sounds were collected from royalty-free internet sound libraries or digitally recorded in our laboratory and comprised human (e.g., scream, laughter), animal (e.g., meowing, growling), and mechanical (e.g., sleigh bells, machine gun) sounds. All 40 sounds were also rated for recognizability by 33 participants who did not take part in the ERP experiment; the mean rating was 4.3 ± 0.52 on a scale from 1 to 5, 1 being unrecognizable and 5 being very recognizable. Another group of 13 participants rated the conceptual relatedness 1) between adjective–noun dyads and sounds and 2) between single nouns and sounds on a scale from 1 to 5, 1 being unrelated and 5 being strongly related. These ratings showed a main effect of conceptual relatedness on overt matching judgments, no effect of emotional congruency, and no interaction; this pattern did not differ between adjective–noun dyads and nouns presented in isolation.
Adjective–noun dyads and sounds were paired so as to construct 80 trials in each of the 4 experimental conditions (Table 1, Appendix). Note that it was not possible to control for emotional category (e.g., anger, sadness, or disgust) while generating stimulus pairs in sufficient numbers; therefore, this study aimed only at testing affective valence congruency effects. Target sounds were presented twice in each of the experimental conditions, that is, 8 times across the whole experiment, in order to cancel out spurious perceptual differences between conditions. Sounds were either conceptually related or unrelated to the noun and either congruent or incongruent with the adjective in terms of affective valence. The valence of adjectives and sounds was highly positively correlated for congruent pairings (r = 0.88, P < 0.0001) and highly negatively correlated for incongruent pairings (r = −0.91, P < 0.0001). The high positive correlation in the congruent condition shows that the dyads and sounds were strongly emotionally related (i.e., both positive or both negative), and the negative correlation for incongruent pairings validated those pairs as genuinely incongruent (i.e., of opposite affective polarity).
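The logic of this pairing validation can be illustrated with a short sketch: when adjective and sound valences share polarity, their correlation is strongly positive, and crossing the polarities flips its sign. The valence values below are synthetic placeholders, not the actual stimulus norms.

```python
import numpy as np

rng = np.random.default_rng(2)

# Placeholder valences: 20 positive and 20 negative adjectives (1-9 scale)
# and matched sounds (1-5 scale), standing in for the rated stimuli.
adj_pos = rng.uniform(6.5, 8.5, 20)
adj_neg = rng.uniform(1.5, 3.5, 20)
snd_pos = rng.uniform(3.0, 4.0, 20)
snd_neg = rng.uniform(1.5, 2.5, 20)

adjectives = np.concatenate([adj_pos, adj_neg])
# Congruent pairings match polarity; incongruent pairings cross it.
congruent_snd = np.concatenate([snd_pos, snd_neg])
incongruent_snd = np.concatenate([snd_neg, snd_pos])

r_congruent = np.corrcoef(adjectives, congruent_snd)[0, 1]
r_incongruent = np.corrcoef(adjectives, incongruent_snd)[0, 1]
# r_congruent comes out strongly positive, r_incongruent strongly negative,
# mirroring the r = 0.88 and r = -0.91 reported for the actual stimuli.
```

The bimodal valence distributions make the between-polarity separation dominate the correlation, which is why the coefficients are so close to ±1 for the real stimulus set.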

Design and Procedure

Volume levels were adjusted to a comfortable level for each participant individually prior to the electroencephalography (EEG) recording session. Adjective–noun dyads were presented on a single line at the center of a computer monitor for 500 ms, followed by a fixation cross, which remained on the screen until the presentation of the next visual prime. Sounds were presented with a variable interstimulus interval of 400, 500, or 600 ms after the words within a window of 3500 ms so as to reduce the impact of the ERP elicited by the preceding prime stimulus. In the first experiment, participants were instructed to press one button when the object described in the dyad was related in meaning to the environmental sound presented as a target (e.g., ANGRY DOG—[dog growling]) and another button when they could not see a relationship (e.g., ANGRY DOG—[kiss]). Note that the adjective never provided a conceptual cue regarding the source of the environmental sound. Participants' responses automatically triggered the next trial after a minimal duration of 400 ms. In case of no response, the next trial was initiated 3500 ms after sound onset. The same procedure was followed in the second experiment, except that participants were instructed to make a congruency judgment regarding the affective valence of dyads and sounds (e.g., congruent: HAPPY CAT—[sleigh bells] and incongruent: HURT CAT—[sleigh bells]). Note that the noun by itself provided no cue regarding the affective valence of the environmental sound. Each of the 4 experimental blocks consisted of 80 trials and contained 2 sound repetitions, which never pertained to the same experimental condition. Conditions were randomized within each block. Block order and response side were fully counterbalanced between participants.
Half of the participants pressed a right-hand button for conceptually related/emotionally congruent pairs and a left-hand button for conceptually unrelated/emotionally incongruent pairs, and the other half received the reverse instruction. This conformed to the structure of a 2 × 2 design with conceptual relatedness and emotional congruence as factors.

ERP Acquisition and Processing

Electrophysiological data were recorded in reference to Cz at a rate of 1 kHz from 64 Ag/AgCl electrodes embedded in an elastic cap (Easycap, Herrsching, Germany) placed according to the 10–20 convention using 64-channel SynAmp2 amplifiers (Compumedics, Charlotte, NC). Impedances were kept below 7 kΩ. EEG activity was filtered online between 0.01 and 200 Hz and refiltered off-line using a 40 Hz low-pass zero phase shift digital filter. Eyeblinks were mathematically corrected after modeling (Gratton et al. 1983) and remaining artifacts manually dismissed. Epochs ranged from −100 to 1000 ms after the onset of the sound stimulus. Error trials were dismissed from further analysis. There were more than 30 trials in each of the 4 conditions for each of the participants included in the final analysis (mean number of accepted trials per condition = 56 ± 13). Baseline correction was performed in reference to prestimulus activity, and individual averages were digitally rereferenced to the arithmetic mean of the left and right mastoid channels.
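The baseline-correction and re-referencing steps above can be sketched with NumPy; the array shapes, random data, and mastoid channel indices below are illustrative assumptions, not the study's actual recordings.

```python
import numpy as np

# Hypothetical epoched data: (n_trials, n_channels, n_samples) at 1 kHz,
# spanning -100 to 1000 ms around sound onset (so samples 0-99 = prestimulus).
n_trials, n_channels, n_samples = 80, 64, 1100
rng = np.random.default_rng(0)
epochs = rng.normal(0.0, 5.0, size=(n_trials, n_channels, n_samples))

# Baseline correction: subtract each trial/channel's mean over the
# 100 ms prestimulus interval.
baseline = epochs[:, :, :100].mean(axis=2, keepdims=True)
epochs -= baseline

# Re-reference to the arithmetic mean of the left and right mastoids
# (indices 60 and 61 are placeholders for the M1/M2 channels).
mastoid_mean = epochs[:, [60, 61], :].mean(axis=1, keepdims=True)
epochs -= mastoid_mean

# Average accepted trials to obtain the per-condition ERP.
erp = epochs.mean(axis=0)  # shape: (n_channels, n_samples)
```

Because both steps are linear, re-referencing the single trials and then averaging is equivalent to re-referencing the individual averages, as described in the text.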

Statistical Analysis

ERP components were determined based on the mean global field power measured across the scalp, which summarizes the contribution of all electrodes in the form of a single vector norm (Picton et al. 2000). This allowed automatic peak detection time locked to electrodes of maximum amplitude in the following intervals: 80–120 ms (N1), 150–220 ms (P2a), 260–320 ms (N2), 320–370 ms (P2b), and 370–500 ms (N400). Peak amplitudes and latencies were analyzed for each component using a 2 (conceptually related/unrelated) × 2 (emotionally congruent/incongruent) × 62 (levels of electrode) repeated measures analysis of variance (ANOVA). The results of ANOVAs conducted with 62 levels of electrodes were then replicated in a subset of 9 frontocentral electrodes (F3, Fz, F4, FC1, FCz, FC2, C1, Cz, C2). All the main effects and interaction between factors other than electrodes found in the 62-electrode analysis were replicated in the 9-electrode analysis. Lateralization effects were tested with an ANOVA involving hemispheres (2 levels) and all electrodes except midline ones (27 levels).
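The global field power computation and windowed peak detection described above can be sketched as follows. GFP is taken here as the root-mean-square of the average-referenced voltages across electrodes at each time point (one common definition of the field-power vector norm); the data and exact implementation are illustrative assumptions.

```python
import numpy as np

# Hypothetical grand-average ERP: (n_electrodes, n_samples) at 1 kHz,
# epoch spanning -100 to 1000 ms (sample 0 corresponds to -100 ms).
rng = np.random.default_rng(1)
erp = rng.normal(0.0, 2.0, size=(62, 1100))

def gfp(data):
    # Global field power: spatial standard deviation across electrodes
    # at each time point.
    centered = data - data.mean(axis=0, keepdims=True)
    return np.sqrt((centered ** 2).mean(axis=0))

def peak_latency(g, window_ms, offset_ms=-100):
    # Latency (ms after stimulus onset) of the GFP maximum inside a window.
    lo, hi = (int(t - offset_ms) for t in window_ms)
    return window_ms[0] + int(np.argmax(g[lo:hi]))

g = gfp(erp)
# Search windows from the text, e.g. 370-500 ms for the N400.
n400_latency = peak_latency(g, (370, 500))
```

Each component's peak can then be located by applying `peak_latency` with the corresponding interval (80–120 ms for N1, 150–220 ms for P2a, and so on).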

Results

Behavioral Results

In the first experiment, no main effect of conceptual relatedness or emotional congruence was found on reaction times and no interaction was found between the 2 factors. Participants made significantly more errors for conceptually related dyad–sound pairs (mean = 24.8 ± 2.7%) than unrelated pairs (mean = 2.1 ± 0.4%; F1,13 = 74.4, P < 0.0001). There was no main effect of emotional congruence and no interaction between the 2 factors on error rates (Fig. 1). In the second experiment, manipulations of conceptual relatedness and emotional congruency had no main effect and there was no interaction between these 2 factors on reaction times. The task-relevant factor of emotional congruence increased the mean error rate for congruent pairs (mean = 27.1 ± 4.9%) as compared with incongruent pairs (mean = 5.3 ± 1.9%; F1,13 = 80.6, P < 0.0001). There was no main effect of conceptual relatedness and no interaction between the 2 factors on error rates (Fig. 2). Importantly, the pattern of reaction times did not change over the course of either experiment.

Figure 1.

Behavioral results in the conceptual relatedness judgment task (Experiment 1). Mean reaction times (bars, left-hand scale) and error rates (circle, right-hand scale) in the 4 experimental conditions: CR/U conceptually related/unrelated, EC/I: emotionally congruent/incongruent. Error bars depict standard error of the mean in all cases. Note that the error bars are too small to show in the case of mean error rates in conditions CUEC and CUEI.

Figure 2.

Behavioral results in the emotional congruence judgment task (Experiment 2). Mean reaction times (bars, left-hand scale) and error rates (circle, right-hand scale) in the 4 experimental conditions: CR/U conceptually related/unrelated, EC/I: emotionally congruent/incongruent. Error bars depict standard error of the mean in all cases.

ERP Results

In Experiment 1, ERPs elicited by environmental sounds displayed a sequence of peaks typically associated with the processing of meaningful auditory stimuli: N1, P2a, N2, P2b, and N400. None of the component latencies were affected by experimental conditions. The N1 peaked at 104 ± 8 ms on average and was maximal at Cz. N1 amplitude was not significantly affected by either experimental factor. The P2a peaked at 189 ± 18 ms on average and was maximal at Cz. Unexpectedly, P2a mean amplitude was significantly reduced for conceptually unrelated as compared with related sounds (F1,13 = 8.31, P < 0.05), but there was no main effect of emotional congruence and no interaction. The N2 and P2b peaked on average at 290 ± 16 ms and 347 ± 13 ms, respectively, and were maximal at Fz. Their mean amplitudes were significantly affected by conceptual relatedness (N2: F1,13 = 31.7, P < 0.0001; P2b: F1,13 = 25, P < 0.0001) in the same direction as in the P2a range (Fig. 3a), but there was no main effect of emotional congruence (Fig. 3b) and no interaction between the 2 factors.

Figure 3.

Event-related potentials elicited at 9 electrode sites in Experiment 1. (a) Main effect of conceptual relatedness. (b) Main effect of emotional congruence.

The N400 peaked at 462 ± 49 ms on average and was maximal over frontocentral regions. N400 mean amplitudes were significantly affected by conceptual relatedness (F1,13 = 20.9, P < 0.001, η2 = 0.77), such that unrelated targets elicited greater N400 amplitudes than related targets. There was also a significant electrode main effect (F61,793 = 9.66, P < 0.0001, η2 = 0.08), relating to maximal overall N400 amplitudes over frontocentral electrodes (Fig. 3a). In addition, there was a main effect of emotional congruence (F1,13 = 8.6, P < 0.02, η2 = 0.14): emotionally incongruent sounds elicited greater N400 amplitudes than the same sounds presented as emotionally congruent targets (Fig. 3b). However, emotional congruence did not interact with the electrode factor (F1,13 = 2.1, P > 0.1). Critically, there was no interaction between conceptual relatedness and emotional congruence in the N400 range (F1,13 = 0.62, P > 0.1).

Consistent with the pattern of results in Experiment 1, early ERP components (i.e., N1 and P1) were not affected by either conceptual relatedness or emotional congruency in the second experiment. Significant priming effects were found in the N400 range (i.e., around 200–500 ms on average), where both conceptual relatedness (F1,13 = 16.1, P < 0.001, η2 = 0.35) and emotional congruency (F1,13 = 19.6, P < 0.001, η2 = 0.53) reduced mean ERP amplitudes (Fig. 4a,b). Again, both effects showed a frontocentral distribution that was maximal at Cz. No interaction between the 2 factors was found within this temporal window (F1,13 = 0.77, P > 0.1).

Figure 4.

Event-related potentials elicited at 9 electrode sites in Experiment 2. (a) Main effect of conceptual relatedness. (b) Main effect of emotional congruence.

Discussion

The aim of this study was to determine whether the emotional processing of visually presented words and environmental sounds is independent from the processing of conceptual information as indexed by classical priming effects. This was achieved by observing the priming effects of the 2 factors on the amplitude of the N400. We found significant modulations of the N400 by conceptual relatedness and emotional congruence, respectively, with no interaction between the 2 factors whether participants focused on the conceptual or the emotional dimension of the stimuli.

In both Experiments 1 and 2, we found no main effect of conceptual relatedness or emotional congruency on reaction times. Since the behavioral data were derived from correct trials only, the lack of priming cannot simply be attributed to a disagreement between participants and experimenters on what does or does not constitute a related pair. Moreover, error rates were relatively high for conceptually related or emotionally congruent pairs and low for unrelated or incongruent pairs. Together with the absence of a reaction time difference between conditions, this pattern suggests that both the conceptual and the emotional tasks required considerable effort and that we succeeded in directing participants' attention to task-relevant properties of the stimuli, since participants were biased toward judging the dyad–sound pairs as unrelated or incongruent. These findings do not affect our interpretation of the ERP data because only trials in which participants made a decision in agreement with experimenters regarding the relationship between words and sounds were included. However, in both experiments, the task-irrelevant factor (emotional congruency in Experiment 1 and conceptual relatedness in Experiment 2) affected neither reaction times nor error rates, making it impossible to evaluate the implicit processing of emotional congruence and conceptual relatedness from behavioral performance in these experiments.

In Experiment 1, the ERP modulation elicited by conceptual relatedness appeared earlier than is traditionally reported in priming studies involving spoken words (e.g., Van Petten and Rheinfelder 1995; Radeau et al. 1998; Thierry et al. 1998; Thierry, Cardebat, et al. 2003). However, our result is consistent with previously reported differences in N400 peak latencies elicited by environmental sounds and words (Cummings et al. 2006). It may be due to a greater immediacy of access to meaning for environmental sounds than for words and is compatible with a recent demonstration of semantic effects before the N400 time window (Hauk et al. 2006). In addition, the topography of the N400 elicited by conceptually unrelated sounds tended to be right lateralized, consistent with previously reported N400 topographies elicited by spoken words presented outside a sentence context (Kutas and Iragui 1998; Cummings et al. 2006; but see Van Petten and Rheinfelder 1995; Thierry et al. 1998; Thierry, Cardebat, et al. 2003). This result is reminiscent of classical dichotic listening studies showing a left-ear (right hemisphere) advantage for the interpretation of environmental sounds (Kimura 1967; King and Kimura 1972) and, more recently, of neuroimaging data suggesting right-hemispheric involvement in accessing the meaning of environmental sounds/nonverbal information (Humphries et al. 2001; Thierry, Giraud, et al. 2003; Thierry and Price 2006). Even though there is evidence that “semantic access” is lateralized (Thierry and Price 2006), it must be kept in mind that there is large anatomical overlap in the structures ultimately activated by words and sounds, that is, the “semantic system” (Saygin et al. 2003; Thierry, Giraud, et al. 2003; Cummings et al. 2006).

The main effect of emotional congruence on mean amplitudes in the N400 range is consistent with the hypothesis that emotionally incongruent information increases the difficulty of contextual integration (Bentin et al. 1993; Chwilla et al. 1995). Schirmer et al. (2002) reported similar priming effects of emotional congruence between speech prosody and word valence. However, the sex effects reported by Schirmer and Kotz (2003) and Schirmer et al. (2002) suggest that our results may only apply to female listeners, since 12 of our 14 participants were women (but see Schirmer et al. 2005). Critically, the absence of an interaction between conceptual relatedness and emotional congruence supports our hypothesis that the emotional valence congruency effect reported here is independent of the analysis of other conceptual properties.

The ERP results of Experiment 2, in which participants made emotional congruence rather than conceptual relatedness judgments, essentially replicated those of Experiment 1: Priming effects were observed for both conceptual relatedness and emotional congruency, and no interaction was found between the 2 factors, indicating that the dissociation between conceptual and emotional processing observed in Experiment 1 is not task specific. However, the priming effects of the conceptual and emotional manipulations in Experiment 2 were of similar magnitude and latency, contrasting with the asymmetry of the priming effects in Experiment 1. This suggests that the relatively smaller effect of emotional priming in the first experiment is due to its implicit nature rather than to the insignificance of the emotional congruence manipulation (Interestingly, we did not observe the reverse pattern of ERP modulations [i.e., a larger and earlier effect of emotional priming relative to conceptual priming] in Experiment 2. One explanation is that the cross-modal priming paradigm adopted in the present study was more sensitive to conceptual relatedness than to emotional congruence. Another possible interpretation is that the N400 itself is less sensitive to emotional priming than to conceptual priming. It has been shown elsewhere that the N400 for words and sounds is modulated by automatic processing of conceptual features, even in the absence of behavioral priming effects [e.g., see Heil et al. 2004; Orgs et al. 2007, 2008]). Our results therefore show that N400 modulations by conceptual relatedness and emotional congruence are to some extent functionally independent and, given that a number of previous studies have reported atypical N400 topographies in verbal (e.g., Deacon et al. 1995; Rugg et al. 1998; De Diego Balaguer et al. 2005) and nonverbal (e.g., Huddy et al. 2003; Zhou et al. 2004; Kelly et al. 2006) contexts, these 2 processes may involve partially distinct neural substrata.

Conclusion

The present study provides scientific evidence addressing a long-debated philosophical question of the relative independence between conceptual and affective evaluation of meaningful information. In a cross-modal verbal–nonverbal context (word–sound priming), independent modulations of ERPs induced by the processing of emotional congruence and conceptual relatedness suggest that the 2 forms of information processing can be functionally dissociated regardless of task demands. Future studies will determine whether such relative independence can be generalized to the processing of other categories of meaningful stimuli.

Funding

Biotechnology and Biological Sciences Research Council UK (grant # 5/S18007); European Research Council (grant ERC-SG-209704 to D.D., Y.J.W., and G.T.).

The authors wish to thank Bastien Boutonnet for assistance with data collection and Dr Clara Martin for useful discussions. Conflict of Interest: None declared.

Appendix

List of stimuli used in the experiments

Adjective–noun pair Negative sound Adjective–noun Pair Positive sound 
Adorable Car Car alarm Inspiring Music Drums 
Terrible Car  Depressing Music  
Good Plane Airplane attack Terrific Lake Duck quack 
Dreadful Plane  Cold Lake  
Joyful Alert Alert alarm Outstanding Trunk Elephant 
Tense Alert  Hostile Trunk  
Romantic Storm Wind storm Nice Food Microwave 
Rough Storm  Stiff Food  
Safe Tree Chainsaw Famous Horse Horse gallop 
Toxic Tree  Nervous Horse  
Beautiful Warmth Explosion Friendly Cow Cow 
Overwhelming Warmth  Hungry Cow  
Happy Cat Cat scream Powerful Engine Helicopter 
Hurt Cat  Delayed Engine  
Lucky Snake Snake die Tidy Locker Keys 
Timid Snake  Odd Locker  
Relaxing Dentist Dentist drill Natural Milk Sheep 
Fearful Dentist  Narcotic Milk  
Impressive Pig Pig scream Lively People Crowd 
Upset Pig  Scared People  
Proud Shotgun Machine gun Smooth Storm Thunder storm 
Nasty Shotgun  Severe Storm  
Thoughtful Door Door slam Wise Owl Barn owl 
Noisy Door  Violent Owl  
Devoted Patient Ambulance Loved Kitten Kitten 
Morbid Patient  Sick Kitten  
Luscious Honey Bees in heat Merry Christmas Sleigh bells 
Rotten Honey  Sad Christmas  
Satisfied Dog Dog growl Erotic Kiss Kiss 
Angry Dog  Regretful Kiss  
Gentle Hit Punch series Useful Insect Cricket 
Bloody Hit  Distressed Insect  
Masterful Hairdryer Hair dryer Jolly Heart Heart beat 
Useless Hairdryer  Rejected Heart  
Sexy Foam Shave Silly Bird Bird song 
Boring Foam  Terrified Bird  
Pleasurable Clothing Cloth ripping Grateful Puppy Puppy 
Embarrassing Clothing  Messy Puppy  
Astonished Rattle Snake rattle Untroubled Laughter Laugh 
Obnoxious Rattle  Evil Laughter  
