Abstract

Mixed feelings, the simultaneous presence of feelings with positive and negative valence, remain an understudied topic. They pose a specific set of challenges due to individual variation, and their investigation requires analytic approaches focusing on individually self-reported states. We used functional magnetic resonance imaging (fMRI) to scan 27 subjects watching an animated short film chosen to induce bittersweet mixed feelings. The same subjects labeled when they had experienced positive, negative, and mixed feelings. Using hidden-Markov models, we found that various brain regions could predict the onsets of new feeling states as determined by self-report. The ability of the models to identify these transitions suggests that these states may exhibit unique and consistent neural signatures. We next used the subjects’ self-reports to evaluate the spatiotemporal consistency of neural patterns for positive, negative, and mixed states. The insula had unique and consistent neural signatures for univalent states, but not for mixed valence states. The anterior cingulate and ventral medial prefrontal cortex had consistent neural signatures for both univalent and mixed states. This study is the first to demonstrate that subjectively reported changes in feelings induced by naturalistic stimuli can be predicted from fMRI and the first to show direct evidence for a neurally consistent representation of mixed feelings.

Introduction

Mixed valence feelings (commonly referred to as “mixed feelings” or “mixed emotions”) consist of simultaneously experiencing positive, negative, and generally conflicted feelings. Mixed feelings are usually thought of as a rarity but are actually common and occur across cultures (Miyamoto et al. 2010; Larsen and McGraw 2014; Lomas 2017; Moeller et al. 2018). Events that trigger mixed feelings are often ascribed a significant personal meaning (Oliver and Woolley 2010; Abeyta and Routledge 2017; Shirai and Kimura 2022). Despite their ubiquity, however, the topic of mixed feelings is largely absent from affective neuroscience research. One reason why mixed feelings are rarely studied is their lack of fit with the dominant ways of thinking about emotion in neuroscience. In the constructionist approach, bipolar valence is an irreducible dimension of affect and consequently, positivity and negativity cannot simultaneously occur (Barrett and Bliss-Moreau 2009; Larsen 2017; Russell 2017). In frameworks that view emotions as stemming from discrete functional biological states, the core affective systems and the traditionally defined emotion categories are generally conceived as solely positive or negative (Kragel and LaBar 2015; Adolphs 2016; Nummenmaa and Saarimäki 2017). When affective neuroscience conducts experiments in service of finding evidence for or against these two theories, important topics such as mixed feelings tend to fall between the gaps (Lench et al. 2013; Vaccaro et al. 2020). Critically, the lack of research on mixed feelings limits our knowledge of how the neurobiology of valence works—an essential topic for understanding affect.

In a previous theoretical paper, we advanced several hypotheses concerning the neurobiological foundations of mixed feelings (Vaccaro et al. 2020). The model posits that even when the neural correlates of emotions vacillate rapidly in the brainstem and subcortical limbic structures, subsequent integrative processes in the insula, anterior cingulate, and vmPFC result in a unified feeling state. With this model in mind, we may expect differences in the regions where the onset of mixed feelings can be predicted. Additionally, if some functional circuitries encode positivity and negativity as mutually exclusive states, the neural patterns when subjects are experiencing mixed feelings should be in flux. This would result in the regional patterns at different time points of experiencing mixed feelings being no more correlated with each other than they are with the regional patterns during the positive and negative states that they are switching between. On the other hand, when regions represent the feeling of mixed valence as an integrated and unique state, the regional pattern during mixed feelings should be more strongly correlated with other timepoints of itself than with these positive and negative states. Traditional emotion research in fMRI relies on the ability to have subjects feel similarly when experiencing the same stimuli. This increases our analytic power by allowing us to control for differences between stimuli to isolate the quality we are interested in and by combining our data across subjects. However, mixed feelings are not easy to induce consistently across subjects, especially in a laboratory setting (Berrios et al. 2015a; Kreibig and Gross 2017), as they generally require naturalistic contexts that allow for the contrast between the present view and another point in time (Grossmann and Ellsworth 2017).
Mixed feelings research likely requires using approaches and stimuli that allow the consideration of individualized experiences rather than consistent responses across participants (Rafaeli et al. 2007; Moore and Martin 2022; Oh and Tong 2022). However, to analyze the more dynamic, naturalistic stimuli that capture these complex emotions, we cannot rely on traditional univariate approaches.

Hidden-Markov models (HMMs) are particularly suited to analyzing brain activity during film stimuli. With HMMs, we assume that regional activity in the brain shifts through several consistent and unique neural signatures (Baldassano et al. 2017). With this type of analysis, we can test hypotheses about when state transitions occur in relation to the continuous stimuli, as well as region-of-interest hypotheses as to the features of the stimuli relevant to shifts in states (Adolphs 2016; Saarimäki 2021). HMM approaches have revealed that state changes in regions such as the posterior medial cortex and prefrontal cortex align well with meaningful events in films, whereas auditory and visual regions align instead with lower-level sensory features (Baldassano et al. 2017; Lee et al. 2021). Recent studies have begun to apply HMM analyses to fMRI studies of affective experience. In one recent study, states of the vmPFC were found to associate with affective states while watching a TV drama, though the timing of state onsets varied across subjects (Chang et al. 2021). Another recent study was able to predict consensus emotional shifts in musical pieces in temporoparietal regions (Sachs et al. 2023). However, to date, no studies have used HMMs to predict individual affective state transitions while subjects watch film stimuli (Morgenroth et al. 2023).

The manner in which HMMs calculate state transitions provides a strong and crucial test of whether individually reported mixed emotions, especially in the context of positive and negative emotions, are associated with enough neural consistency to feasibly be studied with fMRI. Vacillation hypotheses suggest that the subjective report of mixed emotions is related to ongoing mental shifts, suggesting that second-to-second neural activity would not necessarily have a predictable consistent pattern (Larsen 2017; Russell 2017; Vaccaro et al. 2020). A lack of neural consistency, characterized by an inability to predict onsets and transitions between self-reported feeling states from neural activity, would undermine neuroimaging as a feasible method of studying mixed emotions.

In the present study, we had two main goals: (i) to determine if individualized feeling transitions during a dynamic, naturalistic experience can be predicted with fMRI and (ii) to determine if various brain regions differ in their representation of mixed feelings as a state with unique and consistent neural patterns. For our first goal, we used HMM as a data-driven approach to test if state transitions in a given neural region of interest would occur at similar time points to an individual’s emotion annotations. For the second goal, we used subjects’ individual emotion annotations as a ground truth to analyze whether mixed, positive, and negative states were neurally consistent and unique from each other.

Methods

Participants

Twenty-eight right-handed, English-speaking subjects in the Los Angeles area were recruited for the study. After one subject’s data were removed due to a scanning parameters error, the final sample included 27 right-handed subjects (mean age = 25.3, SD = 6.3; 15 female, 12 male). None had any history of neurological trauma. This sample size was chosen to be comparable to recent naturalistic fMRI studies. The manageable sample size allowed us to fit a separate HMM to every subject’s data, associate each participant’s data with their own unique emotion timecourses, compare each subject’s data to their own individually generated null distribution for each analysis, and then aggregate these null distributions into a null distribution for effect sizes across all subjects. We were thereby able to analyze subjects in as much detail as possible across the entire stimulus length, while retaining a large enough sample to test for significant effect sizes. All subjects provided informed consent as approved by the University of Southern California Institutional Review Board.

Scanning parameters

All fMRI scanning was completed on a 3T Siemens Prisma scanner at the USC Dornsife Cognitive Neuroimaging Center using a 32-channel head coil. Anatomical images were acquired with a T1-weighted magnetization-prepared rapid gradient-echo (MPRAGE) sequence (repetition time [TR]/echo time [TE] = 2,300/2.26 ms, 2 mm isotropic voxels, flip angle 9°). Functional images were acquired with a T2*-weighted gradient-echo sequence (TR/TE = 1,000/35 ms, 41 transverse 3 mm slices, flip angle 52°, multiband factor = 8). A T2-weighted volume was acquired for blind review by an independent neuroradiologist, in compliance with the scanning center’s policy and local IRB guidelines.

Procedure

While in the scanner, subjects watched the Oscar-nominated animated short One Small Step (Chesworth 2018). The film tells the story of a young girl who dreams of being an astronaut and her shoe-making father who supports her dreams. The father encourages her dream as a child by making her astronaut boots, and when she gets older, she begins to study astrophysics in college. She struggles in her courses and is initially rejected from the astronaut program. Throughout this struggle, every time she comes home her father is sitting at the kitchen table with food ready for her. One day, she returns home and he is not there—we then see her at his grave, crying. Later, when sorting through his belongings, she finds the old astronaut boot he had made for her. This rekindles her motivation; she begins to excel at school and gets accepted to the astronaut program. We finally see her launch in a rocket ship to the moon, and when she takes the first step onto the surface, the scene pans to her as a child wearing the astronaut boots playing with her father on her bed.

After scanning, subjects rewatched the video and performed a feeling annotation task using a custom JavaScript application (https://github.com/jtkaplan/cinemotion). Subjects were instructed to reflect back on when they watched the video in the scanner and to press buttons to indicate how they were feeling during that initial watching. Subjects were able to turn feeling labels on and off to indicate stretches of time where they felt “Positive,” “Negative,” and “Mixed” and also had an additional button to indicate any period of time they had cried.

fMRI preprocessing

Data were preprocessed using fMRIPrep’s standard parameters (Esteban et al. 2019). fMRIPrep implements brain extraction, slice-time correction, standard motion correction, spatial smoothing, and high-pass temporal filtering. Additionally, ICA-AROMA was run for the removal of motion-related components. For each preprocessed subject, we regressed out the effect of white matter, gray matter, and cerebrospinal fluid from the time course data. Finally, all subjects’ data were trimmed to be 454 TRs long, removing the opening 12 s of scanning and final 6 s. This helped align the neural data while accounting for both the initial 6 s delay before the video began and 6 s of hemodynamic response function delay.
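
The confound regression step can be illustrated with ordinary least squares; `regress_out` below is a hypothetical helper for exposition, not the fMRIPrep implementation:

```python
import numpy as np

def regress_out(ts, confounds):
    """Residualize voxel timeseries against confound regressors (e.g. the
    mean white matter, gray matter, and CSF signals) via ordinary least
    squares. ts: (TRs x voxels); confounds: (TRs x n_confounds)."""
    X = np.column_stack([np.ones(len(confounds)), confounds])  # add intercept
    beta, *_ = np.linalg.lstsq(X, ts, rcond=None)
    return ts - X @ beta  # residual timeseries with confounds removed
```

Any variance linearly explained by the confound timecourses (plus a constant) is removed, leaving the residuals for subsequent analysis.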

Feeling labels

Each subject’s feeling labeling data were annotated to create individualized transition points. At each second of the clip, we determined which feeling state the subject was in based on the labels that were turned on. Positive and negative feelings were defined as moments where either label was on exclusively, while mixed feelings were defined as any moment the mixed feeling button was on, as subjects differed in whether they used the mixed button as mutually exclusive with the other two or turned all three on together when experiencing mixed feelings. If no buttons were on, this was considered a neutral feeling period. The feeling labels were smoothed with a 5 s sliding window to account for mistaken button presses, small gaps of time between new labels, and delays between actual feeling onset and response time. Timepoints (to the nearest second) where feeling labels changed were then used as that subject’s specific transition points.
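
A minimal sketch of this labeling scheme, assuming per-second button states arrive as boolean arrays; the function names are illustrative, and the majority-vote smoothing is one plausible reading of the 5 s window:

```python
import numpy as np

def second_states(pos, neg, mixed):
    """One feeling label per second from the three button states.
    Mixed overrides the others; pos+neg both on without mixed is treated
    as neutral here (an assumption, since only exclusive presses define
    the univalent states)."""
    pos, neg, mixed = (np.asarray(a, dtype=bool) for a in (pos, neg, mixed))
    labels = np.full(len(pos), "neutral", dtype=object)
    labels[pos & ~neg] = "positive"
    labels[neg & ~pos] = "negative"
    labels[mixed] = "mixed"  # any moment the mixed button is on
    return labels

def smooth_labels(labels, window=5):
    """Majority vote over a sliding window to absorb mistaken presses
    and small gaps between labels."""
    labels = np.asarray(labels, dtype=object)
    half = window // 2
    out = labels.copy()
    for t in range(len(labels)):
        lo, hi = max(0, t - half), min(len(labels), t + half + 1)
        vals, counts = np.unique(labels[lo:hi], return_counts=True)
        out[t] = vals[np.argmax(counts)]
    return out

def transition_points(labels):
    """Seconds at which the (smoothed) label changes."""
    labels = np.asarray(labels, dtype=object)
    return np.where(labels[1:] != labels[:-1])[0] + 1
```

The resulting change points serve as that subject's individualized transition timepoints.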

Regions of interest

In each of our analyses, we extracted voxel-wise timeseries from eight regions of interest (ROIs). The insula ROI was obtained from merging the three functional connectivity–based subdivisions in Deen et al. (2010). The anterior cingulate, amygdala, and nucleus accumbens regions were obtained from the Harvard cortical and subcortical atlases. The posterior cingulate region was taken from Shirer et al. (2012). Finally, we used the planum temporale from the Harvard-Oxford cortical atlas as auditory cortex and Brodmann area 17 thresholded at 50% probability as early visual cortex.

Matching individualized feeling boundaries to predicted neural state changes

All HMMs in this study were fitted using the BrainIAK Python package (Kumar et al. 2021). The analysis fits an HMM to the data by iteratively estimating the neural event signatures for a specific number of events (from here on referred to as k) and the temporal event structure, until the model converges on a high-likelihood solution.

For each subject, in each region of interest, we fit a separate HMM. The k-value for each subject was set equal to the number of feeling transitions plus 1, so the model would fit the same number of state transitions as were reported subjectively while labeling the video.
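
In BrainIAK the fit itself is handled by `brainiak.eventseg.event.EventSegment`; the helpers below sketch how a fitted model's state probabilities (such as its `segments_[0]` attribute) could be turned into boundary timepoints, and how k is chosen. Both function names are illustrative:

```python
import numpy as np

def k_for_subject(n_reported_transitions):
    """k = number of reported feeling transitions + 1, so the HMM predicts
    as many state changes as the subject reported."""
    return n_reported_transitions + 1

def boundaries_from_state_probs(seg_probs):
    """Turn a (timepoints x states) probability matrix, such as the one
    produced by an event-segmentation HMM fit, into the TRs at which the
    most probable state changes."""
    states = np.argmax(np.asarray(seg_probs), axis=1)  # hard state per TR
    return np.where(np.diff(states) != 0)[0] + 1       # TRs where state flips
```
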

We then compared the HMM-identified neural transition time points with that subject’s manually reported feeling transitions using a moving-window algorithm. The first neural boundary found within plus or minus 5 TRs (TR = 1 s) of the reported feeling boundary was considered a match, so a manually reported boundary was not counted more than once when multiple HMM-identified boundaries fell within its window. This resulted in an overall fractional match rate, representing the proportion of reported feeling boundaries that matched neurally predicted boundaries for each region within each participant.
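
A minimal sketch of the moving-window matcher; whether a single predicted boundary may serve several reported boundaries is not specified in the text, so this version simply asks whether any predicted boundary falls within each reported boundary's window:

```python
def fractional_match_rate(reported, predicted, window=5):
    """Proportion of reported feeling boundaries that have at least one
    HMM-predicted boundary within +/- `window` TRs. Each reported
    boundary counts at most once, however many predicted boundaries fall
    in its window."""
    reported = list(reported)
    if not reported:
        return 0.0
    hits = sum(
        any(abs(pb - rb) <= window for pb in predicted) for rb in reported
    )
    return hits / len(reported)
```
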

In testing for statistical significance, we aimed to account for the variability in the number of feelings subjects reported and subjects’ respective neural differences. We first generated null distributions for every subject in every region by randomly shuffling the TRs independently and randomly generating transition timepoints. We fit an HMM to these randomly shuffled data and tested whether the randomly generated timepoints matched within 5 TRs of the HMM-predicted ones, repeating this entire process 10,000 times. After the 10,000 permutations, we compared the subject’s true match rate to the mean of their null distribution to get a metric of how far their match rate was from chance. All these differences were then averaged across subjects to get one metric of the average difference between subjects’ match rates and their null match rates in a specific region. We repeated this process using all 10,000 permutations of each subject’s null distribution, resulting in an aggregated null distribution for the average difference metrics in each region. By comparing the actual average difference metrics to these aggregated null distributions, we were able to determine whether subjects’ neurally predicted boundaries were consistently matching with their reported boundaries across regions at a level that was significantly greater than chance.
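
The permutation scheme might be sketched as follows, with `fit_boundaries` standing in for the per-subject HMM fit and `match_fn` for the moving-window matcher (both hypothetical callables):

```python
import numpy as np

def null_match_rates(data, n_transitions, fit_boundaries, match_fn,
                     n_perm=1000, seed=0):
    """Null distribution for one subject and region: shuffle the TRs,
    refit boundaries on the shuffled data, draw random 'reported'
    transition timepoints, and score the match rate each time."""
    rng = np.random.default_rng(seed)
    n_tr = len(data)
    rates = np.empty(n_perm)
    for i in range(n_perm):
        shuffled = data[rng.permutation(n_tr)]            # shuffle TRs
        predicted = fit_boundaries(shuffled, n_transitions + 1)
        fake = np.sort(rng.choice(n_tr, size=n_transitions, replace=False))
        rates[i] = match_fn(fake, predicted)
    return rates

def group_difference(true_rates, null_rates_per_subject):
    """Average across subjects of (true match rate - mean of that
    subject's null distribution); the paper compares this statistic to an
    aggregated null built the same way."""
    return float(np.mean([t - np.mean(n)
                          for t, n in zip(true_rates, null_rates_per_subject)]))
```
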

Matching consensus feeling boundaries to predicted neural state changes

We additionally tested the matching rate of each subject’s data with “consensus” emotion boundaries—an approach more similar to previous HMM fMRI studies. We defined consensus emotion boundaries as timepoints where 50% of subjects agreed on a new emotional state for at least 5 s. This led to a total of seven boundaries across the length of the video. The HMM analysis and null distributions were then fit in the same manner as the analysis of individualized feeling boundaries. Using Wilcoxon signed-rank tests, we then compared the regional matching accuracy rates for individualized feelings vs. consensus boundaries.
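
One plausible implementation of the consensus definition; the exact agreement and persistence rules below are assumptions made for illustration:

```python
import numpy as np

def consensus_boundaries(label_matrix, min_agree=0.5, min_dur=5):
    """label_matrix: (subjects x seconds) array of state labels. Returns
    the seconds at which a new state reaches agreement from at least
    `min_agree` of subjects and holds it for at least `min_dur` seconds."""
    label_matrix = np.asarray(label_matrix)
    n_subj, n_sec = label_matrix.shape
    # state agreed on by >= min_agree of subjects at each second (or None)
    agreed = []
    for t in range(n_sec):
        vals, counts = np.unique(label_matrix[:, t], return_counts=True)
        top = int(np.argmax(counts))
        agreed.append(vals[top] if counts[top] / n_subj >= min_agree else None)
    boundaries, prev = [], agreed[0]
    for t in range(1, n_sec):
        if agreed[t] is None or agreed[t] == prev:
            continue
        # the new consensus state must persist for min_dur seconds
        run = agreed[t:t + min_dur]
        if len(run) == min_dur and all(s == agreed[t] for s in run):
            boundaries.append(t)
            prev = agreed[t]
    return boundaries
```
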

Consistent optimal boundaries in each region

We explored the optimal number of states in various regions, agnostic to any consideration of emotion. With the mean time-course data of all subjects, in each of our regions of interest, we ran HMMs with different k-values ranging from 2 to 45. For each k-value (the number of states), we calculated the correlations of spatial signal patterns within predicted boundaries vs. across boundaries using the method introduced in Baldassano et al. (2017). Specifically, we calculated the correlation of every timepoint with the timepoint occurring 5 s later. The “within” correlation comprised all timepoint pairs where both timepoints fell within the boundaries of a single predicted state. The “across” correlation comprised pairs where one timepoint fell outside a state and the other inside it. Whichever k-value led to the greatest positive difference of the within correlation minus the across correlation was considered the optimal model for that region.
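
The within-minus-across statistic and the k search might look like this sketch, with `fit_boundaries` again a hypothetical stand-in for the HMM fit:

```python
import numpy as np

def within_minus_across(data, boundaries, lag=5):
    """Pattern similarity within vs. across predicted state boundaries
    (after Baldassano et al. 2017): correlate each TR's spatial pattern
    with the pattern `lag` TRs later, then compare pairs inside one state
    to pairs straddling a boundary."""
    data = np.asarray(data, dtype=float)
    n_tr = len(data)
    # state id of each TR given the boundary TRs
    states = np.searchsorted(np.asarray(boundaries), np.arange(n_tr), side="right")
    within, across = [], []
    for t in range(n_tr - lag):
        r = np.corrcoef(data[t], data[t + lag])[0, 1]
        (within if states[t] == states[t + lag] else across).append(r)
    return np.mean(within) - np.mean(across)

def optimal_k(data, fit_boundaries, ks):
    """Return the k whose predicted boundaries maximize the
    within-minus-across correlation difference."""
    return max(ks, key=lambda k: within_minus_across(data, fit_boundaries(data, k)))
```
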

Using this optimal k-value from the mean data, we then tested whether this same value, when applied to individual subjects’ data, would find similar boundary locations. In each region, for each subject, we used the best k-value from the mean data in that region. We then compared the subject’s HMM-determined transition points with the transition points from the mean data using the same moving-window algorithm as the previous analysis and tested for significance using the same approach to generating null distributions. The number of states that led to the highest within vs. across correlation difference differed across regions. As expected, the optimal fit for sensory cortices involved a higher number of states than those for other regions, likely due to a greater number of changes in low-level sensory features in the video format. The k-values displayed in Table 1 were used in the next analysis.

Table 1

Optimal number of states in each region based on within vs. across correlations in the mean data.

Region of interest              k
Ventromedial prefrontal cortex  3
Anterior cingulate              23
Insula                          5
Nucleus accumbens               13
Amygdala                        5
Auditory cortex                 28
Posterior cingulate             39
Visual cortex (V1)              27


Neural consistency of feelings in different regions

We aimed to test whether specific regions differed in the consistency of their neural states corresponding to positive, negative, and mixed feelings. To test this, we used each subject’s feeling labels to determine periods of time where they reported feeling positive, negative, and mixed. To quantify the consistency of neural states within subject-reported feeling state boundaries, we used a similar logic to the within vs. across correlation analysis for each feeling type. Specifically, we computed the average correlation of each timepoint of a feeling type with all other timepoints of that feeling type, and subtracted the average correlation of those timepoints with all timepoints of different feeling types (e.g. the correlation of all timepoints where a subject indicated mixed feelings with all other mixed feeling timepoints vs. the correlation of all mixed feeling timepoints with all positive and negative timepoints). We will refer to this metric as neural consistency. We intentionally excluded timepoints where subjects did not report any feeling, to avoid our results being driven largely by how correlated affect timepoints are with neutral timepoints; our interest is in the distinguishability of different types of affective states. To create null distributions for each subject in each region, we shuffled the order of every TR 1,000 times and calculated this correlation difference each time with the randomized TRs. We then calculated a test statistic by subtracting the mean of the null distribution from the subject’s true correlation difference. Finally, to create overall null distributions for each region, we used the same method as in the previous analyses.
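
A minimal numpy sketch of the neural consistency metric, assuming `data` is a (timepoints x voxels) array and `labels` gives one feeling label per TR (function name illustrative):

```python
import numpy as np

def neural_consistency(data, labels, state):
    """Average pairwise correlation among timepoints of `state` minus the
    average correlation of those timepoints with timepoints of the other
    (non-neutral) feeling states."""
    labels = np.asarray(labels, dtype=object)
    same = np.where(labels == state)[0]
    other = np.where((labels != state) & (labels != "neutral"))[0]
    cm = np.corrcoef(np.asarray(data, dtype=float))  # TR-by-TR pattern correlations
    within = cm[np.ix_(same, same)]
    n = len(same)
    within_mean = (within.sum() - n) / (n * n - n)   # drop self-correlations
    across_mean = cm[np.ix_(same, other)].mean()
    return within_mean - across_mean
```

A value consistently above the permutation null implies the state's spatial pattern recurs reliably and is distinguishable from the other states.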

Valence similarity bias of mixed states

Finally, we explored whether mixed states in different regions consistently showed more similarity toward patterns associated with positive states or negative states. We employed the same overall approach as our previous neural consistency analysis, but modified our metric of interest to be the difference between the average correlation between mixed TRs and positive TRs and the average correlation between mixed TRs and negative TRs. With this method, a positive metric would indicate mixed TRs being more similar to positive TRs than negative TRs, and vice versa. We calculated null distributions and group-level metrics using the same methods as the previous analyses, and additionally calculated Cohen’s d as a comparable effect size measure across regions.
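
The valence similarity bias can be sketched the same way; a positive value indicates that mixed timepoints resemble positive ones more than negative ones (function name illustrative):

```python
import numpy as np

def valence_bias(data, labels):
    """Mean correlation of mixed TRs with positive TRs minus their mean
    correlation with negative TRs. Positive values mean the mixed-state
    pattern leans toward the positive-state pattern; negative values,
    toward the negative-state pattern."""
    labels = np.asarray(labels, dtype=object)
    cm = np.corrcoef(np.asarray(data, dtype=float))
    idx = {s: np.where(labels == s)[0] for s in ("mixed", "positive", "negative")}
    return (cm[np.ix_(idx["mixed"], idx["positive"])].mean()
            - cm[np.ix_(idx["mixed"], idx["negative"])].mean())
```
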

Results

Subjects varied strongly in what they felt, when specific feelings occurred, and in the number of times their feelings changed (mean = 12.5, standard deviation = 7.1, range = 3 to 36).

Matching individuals’ feeling transitions to predicted neural state changes

HMM-predicted boundaries matched subjects’ individual boundaries in the insula (33.2% match, P = 0.007), amygdala (32.1% match, P = 0.024), anterior cingulate (32.1% match, P = 0.023), and nucleus accumbens (33.7% match, P = 0.034) at rates significantly better than would be expected by chance (Fig. 1). The vmPFC (28.3% match, P = 0.383), posterior cingulate (27.0% match, P = 0.357), visual cortex (25.2% match, P = 0.354), and auditory cortex (31.2% match, P = 0.056) did not significantly match with individuals’ self-report (Fig. 2).

Fig. 1

Percentage of subjects (n = 27) reporting each state throughout the film. Subjects were allowed to report more than one state at a time, or to report no state at all, so percentages do not necessarily add up to 100%.

Fig. 2

Significance was determined by comparison with 10,000 permutations of shuffling TRs and boundaries. Each plot shows a distribution, across all subjects, of the average deviation from the mean of the null distribution for each individual subject. The dot indicates the true average difference between a subject’s matching rate and the mean of their null distribution. Asterisks indicate a matching rate significantly greater (P < 0.05) than the null mean.

Matching rates to individual vs. consensus emotional report

We next tested the matching rate of each subject’s data with the average “consensus” emotion boundaries, an approach more similar to previous fMRI studies that employed HMMs. HMM-predicted boundaries did not significantly match consensus feeling transitions in any tested region.

Matching rates were significantly greater for individually reported feeling transitions vs. consensus feeling boundaries in all regions of interest: insula (match 33.2% vs. 17.5%, P = 0.002), amygdala (32.1% vs. 13.8%, P = 0.002), anterior cingulate (32.1% vs. 14.8%, P = 0.001), nucleus accumbens (33.7% vs. 17.5%, P = 0.002), posterior cingulate (27.0% vs. 14.5%, P = 0.005), vmPFC (28.3% vs. 15.3%, P = 0.003), auditory cortex (31.2% vs. 17.5%, P = 0.001), and visual cortex (25.2% vs. 13.8%, P = 0.013) (Fig. 2).

Matching rates to mean optimal regional boundaries

We additionally tested how boundaries derived from the mean neural data, across subjects and within each region, would fit with individual subjects’ data. This allowed us to explore how consistent predicted boundaries are in each of these regions when analysis decisions are agnostic to considerations of emotion. In sum: Were the successful matching rates to individuals’ emotion report just reflective of regional consistencies in neural states overall? Based on the pattern of significant regions, this does not appear to be the case.

These mean-data-derived boundaries significantly matched with boundaries predicted from individual subjects’ data in the insula (13.4% accuracy, P = 0.046), vmPFC (42.1% match, P = 0.01), anterior cingulate (40.7% match, P < 0.0001), posterior cingulate (57.3% match, P < 0.0001), visual cortex (74.4% match, P < 0.0001), and auditory cortex (48.9% match, P < 0.0001). Boundaries predicted from the mean of the group’s data notably did not match individually predicted boundaries in the amygdala (10.7% match, P = 0.233) or nucleus accumbens (10.0% match, P = 0.438).

Neural consistency of positive, negative, and mixed valence feelings in different regions

Neural consistency for a specific state that is consistently above chance across subjects in a region demonstrates that (i) the spatial pattern of activity in that region has relative consistency whenever it occurs and (ii) that spatial pattern is significantly different from the other states it is being compared to. A mixed state characterized by fluctuation, and/or similarity to the positive and negative states, would not pass this test.

The vmPFC was significant for all three feeling states (P < 0.001 for all; average same-state minus different-state correlation differences r_difference = 0.0155, 0.0278, and 0.0249 for positive, negative, and mixed, respectively), as was the ACC (P = 0.001, r_difference = 0.0049 positive; P = 0.007, r_difference = 0.0098 negative; P < 0.001, r_difference = 0.0092 mixed). Analysis of the insular cortex found significant neural consistency for positive (P = 0.002, r_difference = 0.0017) and negative states (P = 0.007, r_difference = 0.0048), but not mixed (P = 0.354, r_difference = 0.0056). In the amygdala, only negative states (P = 0.009, r_difference = 0.0026) were significantly more neurally consistent than would be expected by chance (P = 0.394, r_difference = −0.0006 for positive; P = 0.842, r_difference = 0.0004 for mixed). No feeling states were significantly consistent in the nucleus accumbens (P = 0.524, r_difference = −1.8 × 10^−6 for positive; P = 0.107, r_difference = 0.002 for negative; P = 0.472, r_difference = 9.02 × 10^−5 for mixed). In the auditory cortex, all three reached significance (P < 0.001, r_difference = 0.008 for positive; P = 0.001, r_difference = 0.009 for negative; P < 0.001, r_difference = 0.019 for mixed). The PCC reached significance for negative (P < 0.001, r_difference = 0.0102) and mixed (P = 0.004, r_difference = 0.0109) but not for positive (P = 0.767, r_difference = 0.0031). In the visual cortex, all three reached significance (P < 0.001; r_difference = 0.013, 0.023, and 0.017 for positive, negative, and mixed, respectively) (see Fig. 3 for example regional null distributions vs. true values).

Fig. 3

Average difference between neural consistency of positive, negative, and mixed timepoints across subjects and the mean of individual subjects’ null distributions. Neural consistency is calculated as the average correlation between timepoints of the same feeling type subtracted by the average correlation of timepoints of that feeling type with other feeling types. Null distributions are made by 1,000 permutations of shuffling TRs in each subject.

Valence similarity bias of mixed states

In our final analysis, we explored whether mixed valence states exhibited biases as to whether their patterns across subjects were more similar to negative or positive states in different regions (Fig. 4). In both vmPFC (P < 0.001, d = −0.262) and ACC (P < 0.001, d = −0.304), mixed states were significantly more similar to negative states than positive states. Mixed states were significantly more similar to positive states than negative ones in the insula (P < 0.001, d = 0.145), amygdala (P < 0.001, d = 0.167), nucleus accumbens (P < 0.001, d = 0.114), auditory cortex (P < 0.001, d = 0.227), and visual cortex (P < 0.001, d = 0.922). In the PCC, mixed states were not significantly biased towards either end of the valence spectrum (P = 0.546, d = −0.002).

Fig. 4

Effect size (Cohen’s d) of valence similarity bias for mixed states in each region. Effect sizes are calculated from the t-statistics across subjects and the mean of the null distributions.

Discussion

We used individualized feeling dynamics to understand the neural basis of mixed feelings. Mixed feelings have long been avoided in neuroscience research and theories of emotion, owing to questions about the structure of valence, individual variation, and temporal resolution (Wohlgemuth 1911; Allport 1922; Barrett and Bliss-Moreau 2009; Larsen 2017; Walle and Dukes 2023). The two presented studies aim both to show the importance of mixed feelings for moving affective neuroscience toward methods more closely tied to self-report and to characterize the neural features and temporal patterns associated with mixed feelings overall. In response to a dynamic affective stimulus, our subjects frequently reported mixed feelings, and the transitions into mixed feelings were predictable from patterns of neural activity. Further analyses then confirmed that in many regions, these patterns were consistent and unique compared to positive and negative states, partially explaining their predictability.

Using HMMs, we found that brain state changes in the insular cortex, anterior cingulate, amygdala, and nucleus accumbens matched significantly with feeling transitions defined by an individual’s self-report. All of these regions have been classically associated with interoception and salience processing (Damasio and Carvalho 2013; Saarimäki et al. 2016; Critchley and Garfinkel 2017; Seeley 2019). These results provide further evidence that the onsets of affective features can be predicted by applying HMMs to fMRI data (Chang et al. 2021; Sachs et al. 2023) and extend this approach to include individualized self-report. Individually predicted boundaries matched the subjects’ self-reported boundaries significantly better than did emotion boundaries defined by consensus, the method used in most published analyses. Researchers often assume consistency of emotional features across subjects, both because of the time constraints of collecting self-report data and in hopes of increasing statistical power; however, defining stimuli in this manner likely shifts what our measures capture away from the neural correlates of conscious emotional experience and toward other cognitive and stimulus-based features.
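The scoring of HMM-derived boundaries against individual self-report can be illustrated with a small sketch. The HMM fitting itself is omitted; this assumes hypothetical lists of boundary TRs, a tolerance window (e.g. ±5 s expressed in TRs, per the margin of error described below), and a simple random-placement null, which may differ from the authors' actual permutation scheme.

```python
import numpy as np

def boundary_match_rate(predicted, reported, tol_trs=5):
    """Fraction of self-reported feeling transitions matched by an
    HMM-derived brain state boundary within +/- tol_trs TRs."""
    predicted = np.asarray(predicted)
    hits = [np.any(np.abs(predicted - b) <= tol_trs) for b in reported]
    return float(np.mean(hits))

def match_null(predicted, reported, n_trs, n_perm=1000, tol_trs=5, seed=0):
    """Null distribution of match rates from randomly placed boundaries,
    keeping the number of predicted boundaries fixed."""
    rng = np.random.default_rng(seed)
    k = len(predicted)
    return np.array([
        boundary_match_rate(rng.choice(n_trs, size=k, replace=False),
                            reported, tol_trs)
        for _ in range(n_perm)
    ])
```

A region whose observed match rate exceeds most of this null would, in this scheme, be said to predict self-reported transitions above chance.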

Importantly, these findings cannot be explained by a general lack of consistency in any aspect of neurocognitive processing, nor by the self-reported emotion transitions simply reflecting low-level stimulus features: our analysis of the optimal boundaries (derived from the mean data) revealed significant consistency in the timing of brain state changes across subjects in many regions. These boundaries, however, (i) were not the same as the self-reported affect-related ones; (ii) did not match at similar rates; and (iii) matched significantly in many regions that did not match significantly with self-report, such as the vmPFC, visual cortex, and auditory cortex. Meanwhile, the amygdala and nucleus accumbens, which matched significantly with subject self-report, did not have optimal boundaries that fit consistently across all subjects.

The HMM results demonstrate that even when subjects are frequently transitioning between different feeling states, and importantly, reporting mixed feeling states, the neural patterns are unique and consistent enough to be significantly predicted. Given the predictability of the transition points between self-reported states, we used further analyses to investigate how mixed, positive, and negative states compare to each other in temporal stability and uniqueness. Our analyses of the neural consistency of positive, negative, and mixed feelings fittingly had mixed results relative to our initial hypotheses. The key deviation was the finding that the insular cortex had consistent neural signatures for positive and negative states, but not for mixed ones. The insular cortex has a role in integrating affective information from multiple sources and helps establish our subjective sense of the present moment (Craig 2009a, 2009b; Kent and Wittmann 2021). We hypothesized that the temporal dynamics in this region—on the scale of seconds in blood-oxygen-level-dependent (BOLD) activity—would be consistent and unique. Our reasoning was that integrating mixed valence would produce a pattern of activity distinct from those of positive and negative valence, reflecting the integration of the two types of valence information. While significant vacillation might occur across time at the neuronal level, we did not expect this pattern inconsistency to be reflected on the temporal scale of the BOLD signal, as our data may suggest it is.

The anterior cingulate and vmPFC, two regions we proposed as being crucial for elaborating on the experience of mixed feelings, had significant neural consistency during mixed feelings, as well as during positive and negative feelings. The juxtaposition of positive and negative feelings arising in the same situation often leads to a sense of conflict (Berrios et al. 2015b; Mejia and Hooker 2017). Activity in the anterior cingulate has long been linked to the experience of conflict and ambivalence, likely related to its proposed role in error monitoring (Cunningham et al. 2004; Luttrell et al. 2016). In the context of mixed feelings, the anterior cingulate may similarly allow information about conflicting goals to be used in complex decision-making and self-regulation (Nohlen et al. 2014; Kruschwitz et al. 2018; Yang et al. 2022). Conflict appears to be an important aspect of mixed feelings, and it is possible that the relevance of the ACC to mixed feelings is closely related to this potentially separate dimension of feeling. Interestingly, previous research has found that feelings of certainty and feelings of ambivalence (the co-occurrence of positive and negative valence) track to separate regions of the ACC (Luttrell et al. 2016). Furthermore, recent work has shown that subjective uncertainty, while strongly associated with mixed feelings, varies independently in the strength of that relationship across individuals and instances (Vaccaro et al. 2023). Cautiously, these findings may suggest cognitive overlap in the processes that lead to subjective conflict and to mixed feelings. Subjective conflict may also relate to the valence similarity bias of mixed states being more similar to negative than to positive in these regions, despite otherwise being significantly unique.

The vmPFC may play an important role in the representation of mixed feelings. Previous studies have found the orbitofrontal portion of the vmPFC to show unique activations in response to conflicting affective information (Simmons et al. 2006; Rolls and Grabenhorst 2008). More broadly, the vmPFC has been proposed to integrate affective information from the body with other sources of information, such as memories and conceptual knowledge (Barbas 2000; Rolls and Grabenhorst 2008; Roy et al. 2012). The integration of these various modalities may play a pivotal role in the generation of complex feelings. The high distinctiveness of the three types of feelings in the vmPFC also aligns with the somatic marker hypothesis (Damasio 1996). Even if the various physiological patterns in the insular cortex and subcortical regions are distinctively positive or negative, the vmPFC may facilitate the learning and representation of this fluctuating somatic pattern, leading to a stable emotional marker representing mixed valence (Panksepp 2005; Dunning et al. 2017).

Surprisingly, while all three feeling states were highly neurally consistent in the vmPFC, the HMMs could not predict the timings of emotion transitions at rates better than chance in this region. Conversely, the HMMs significantly predicted emotion transitions in the nucleus accumbens despite the differently valenced states not being neurally consistent there. These findings may hint at the temporal dynamics of these regions as well as the complexity of the information they represent. Previous work has found that the rate of change in the vmPFC during naturalistic viewing is relatively slow, on the order of tens of seconds, but that the states themselves are quite stable (Chang et al. 2021; Lee et al. 2021). This slow and cumulative information processing would make changes between distinct states more gradual, making it unlikely that boundaries could be predicted within the 5 s margin of error used in our analyses. Meanwhile, subcortical regions such as the nucleus accumbens and amygdala, which are involved in quick saliency-related orienting, have dynamics characterized by briefer transitions in activity—new activity patterns may be more like “spikes” than stable states (Puccetti et al. 2022).

Sensory cortices were also strongly neurally consistent for affective states: the auditory cortex was significant for positive and mixed states, while the visual cortex was significant for positive, negative, and mixed states. Previous studies have found that the valence and emotion category of visuo-auditory stimuli can be predicted remarkably well from patterns of activity in lower-level sensory cortices (Ethofer et al. 2009; Kragel et al. 2019; Bo et al. 2021). Specific visual and auditory cues may have associations with specific types of affect.

The facts that mixed, positive, and negative feelings are unique and distinct from each other, and that they are consistent in the anterior cingulate and vmPFC, suggest that their neurobiological features are not the mere consequence of switching back and forth between a positive and a negative state. Furthermore, our analysis of regional differences in the valence pattern similarity of mixed states suggests that they cannot simply be attributed to a relabeling of either a positive or negative state, and that they are not consistently more similar to either positive or negative states throughout the brain (Russell 1980; Murray et al. 2023). Interestingly, the posterior cingulate also had a unique and consistent neural signature for mixed feelings, and this state had no bias toward positive-like or negative-like patterns. The posterior cingulate has been implicated in many studies of nostalgia as important for the contextual and autobiographical processes that set the stage for the bittersweetness of nostalgic remembrance (Yang et al. 2022). It is possible that these processes, facilitated by the posterior cingulate, are particularly unique to mixed feelings in a way that shapes the neural patterns as resembling neither positivity nor negativity. One previous study using univariate methods found that the posterior cingulate was associated with the mixed reaction of amusement with disgust, but did not find that these patterns were significantly different from the disgust-only patterns (Murray et al. 2023). Our study further confirms the region’s role in mixed reactions and also demonstrates that multivoxel pattern analyses may shed light on state differences in ways that univariate approaches cannot. Overall, the posterior cingulate may be a key region to explore in future studies of mixed feelings, incorporating further multivariate analyses, functional connectivity, and individual-level differences to further parse its role.

These results underscore the importance of considering individualized experiences when investigating the neural correlates of affect, because they may capture different and crucial elements of affective processing. On this note, it is important to consider the extent to which the personal autobiographical relevance of the film, and the fact that all subjects lived in the United States, may have affected emotional responses. Future work should investigate how neural responses and emotion dynamics vary for subjects living in different cultural contexts and whether the relatability of the story affects emotional and neural responses. This being said, we still expect the regional patterns in neural consistency of univalent vs. mixed feelings to be generalizable. Furthermore, our work used the voxel patterns of entire regions to find broad regional patterns of consistency that were significantly above chance. Future work can expand on these methods to incorporate feature selection for the most informative voxels, which would also facilitate investigating whole-brain patterns. Our finding that the valence similarity bias of mixed states differs across regions adds further credence to the utility of a whole-brain approach, suggesting that regional combinations of positive-like and negative-like patterns may underlie a whole-brain unique pattern for mixed states. Research on mixed feelings poses major difficulties, but the obstacles should not stand in the way of creating a comprehensive picture of affect. Our findings suggest that we should regard mixed valence feelings as a valid and important concept in neuroscience and psychological science.

Author contributions

Anthony Vaccaro (Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Resources, Software, Supervision, Visualization, Writing—original draft, Writing—review & editing), Helen Wu (Investigation, Methodology, Software, Visualization, Writing—review & editing), Rishab Iyer (Data curation, Formal analysis, Investigation, Software, Writing—review & editing), Shruti Shakthivel (Investigation, Resources, Writing—review & editing), Nina Christie (Investigation, Resources, Writing—review & editing), Antonio Damasio (Conceptualization, Resources, Writing—review & editing), and Jonas Kaplan (Conceptualization, Methodology, Resources, Supervision, Writing—review & editing).

Funding

This work was supported by the Brain and Creativity Institute and Dornsife Cognitive Neuroimaging Center.

Conflict of interest statement: None declared.

References

Abeyta AA, Routledge C. Nostalgia as a psychological resource for a meaningful life. In: The happy mind: cognitive contributions to well-being. Springer International Publishing/Springer Nature; 2017. pp. 427–442.

Adolphs R. How should neuroscience study emotions? By distinguishing emotion states, concepts, and experiences. Soc Cogn Affect Neurosci. 2016:12(1):24–31.

Allport FH. A physiological-genetic theory of feeling and emotion. Psychol Rev. 1922:29(2):132–139.

Baldassano C, Chen J, Zadbood A, Pillow JW, Hasson U, Norman KA. Discovering event structure in continuous narrative perception and memory. Neuron. 2017:95(3):709–721.e705.

Barbas H. Connections underlying the synthesis of cognition, memory, and emotion in primate prefrontal cortices. Brain Res Bull. 2000:52(5):319–330. https://www.ncbi.nlm.nih.gov/pubmed/10922509.

Barrett LF, Bliss-Moreau E. Affect as a psychological primitive. Adv Exp Soc Psychol. 2009:41:167–218.

Berrios R, Totterdell P, Kellett S. Eliciting mixed emotions: a meta-analysis comparing models, types, and measures. Front Psychol. 2015a:6:428.

Berrios R, Totterdell P, Kellett S. Investigating goal conflict as a source of mixed emotions. Cogn Emot. 2015b:29(4):755–763.

Bo K, Yin S, Liu Y, Hu Z, Meyyappan S, Kim S, Keil A, Ding M. Decoding neural representations of affective scenes in retinotopic visual cortex. Cereb Cortex. 2021:31(6):3047–3063.

Chang LJ, Jolly E, Cheong JH, Rapuano KM, Greenstein N, Chen P-HA, Manning JR. Endogenous variation in ventromedial prefrontal cortex state dynamics during naturalistic viewing reflects affective experience. Sci Adv. 2021:7(17):eabf7129.

Chesworth AP, Pontillas B. In: Zhang SATS, editors. One small step [animated short film]. Taiko Studios; Los Angeles, CA, United States & Wuhan, China; 2018.

Craig AD. Emotional moments across time: a possible neural basis for time perception in the anterior insula. Philos Trans R Soc Lond B Biol Sci. 2009a:364(1525):1933–1942.

Craig AD. How do you feel--now? The anterior insula and human awareness. Nat Rev Neurosci. 2009b:10(1):59–70.

Critchley HD, Garfinkel SN. Interoception and emotion. Curr Opin Psychol. 2017:17:7–14.

Cunningham WA, Raye CL, Johnson MK. Implicit and explicit evaluation: fMRI correlates of valence, emotional intensity, and control in the processing of attitudes. J Cogn Neurosci. 2004:16(10):1717–1729.

Damasio AR. The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philos Trans R Soc Lond Ser B Biol Sci. 1996:351(1346):1413–1420.

Damasio A, Carvalho GB. The nature of feelings: evolutionary and neurobiological origins. Nat Rev Neurosci. 2013:14(2):143–152.

Deen B, Pitskel NB, Pelphrey KA. Three systems of insular functional connectivity identified with cluster analysis. Cereb Cortex. 2010:21(7):1498–1506.

Dunning D, Fetchenhauer D, Schlösser T. The varying roles played by emotion in economic decision making. Curr Opin Behav Sci. 2017:15:33–38.

Esteban O, Markiewicz CJ, Blair RW, Moodie CA, Isik AI, Erramuzpe A, Kent JD, Goncalves M, DuPre E, Snyder M. fMRIPrep: a robust preprocessing pipeline for functional MRI. Nat Methods. 2019:16(1):111–116.

Ethofer T, Van De Ville D, Scherer K, Vuilleumier P. Decoding of emotional information in voice-sensitive cortices. Curr Biol. 2009:19(12):1028–1033.

Grossmann I, Ellsworth PC. What are mixed emotions and what conditions foster them? Life-span experiences, culture and social awareness. Curr Opin Behav Sci. 2017:15:1–5.

Kent L, Wittmann M. Time consciousness: the missing link in theories of consciousness. Neurosci Conscious. 2021:2021(2):niab011.

Kragel PA, LaBar KS. Multivariate neural biomarkers of emotional states are categorically distinct. Soc Cogn Affect Neurosci. 2015:10(11):1437–1448.

Kragel PA, Reddan MC, LaBar KS, Wager TD. Emotion schemas are embedded in the human visual system. Sci Adv. 2019:5(7):eaaw4358.

Kreibig SD, Gross JJ. Understanding mixed emotions: paradigms and measures. Curr Opin Behav Sci. 2017:15:62–71.

Kruschwitz JD, Waller L, List D, Wisniewski D, Ludwig VU, Korb F, Wolfensteller U, Goschke T, Walter H. Anticipating the good and the bad: a study on the neural correlates of bivalent emotion anticipation and their malleability via attentional deployment. NeuroImage. 2018:183:553–564.

Kumar M, Anderson MJ, Antony JW, Baldassano C, Brooks PP, Cai MB, Chen P-HC, Ellis CT, Henselman-Petrusek G, Huberdeau D. BrainIAK: the brain imaging analysis kit. Apert Neuro. 2021:1(4):1–19.

Larsen JT. Holes in the case for mixed emotions. Emot Rev. 2017:9(2):118–123.

Larsen JT, McGraw AP. The case for mixed emotions. Soc Personal Psychol Compass. 2014:8(6):263–274.

Lee CS, Aly M, Baldassano C. Anticipation of temporally structured events in the brain. elife. 2021:10:e64972.

Lench HC, Bench SW, Flores SA. Searching for evidence, not a war: reply to Lindquist, Siegel, Quigley, and Barrett. Psychol Bull. 2013:139(1):264–268.

Lomas T. The value of ambivalent emotions: a cross-cultural lexical analysis. Qual Res Psychol. 2023:20(2):1–25.

Luttrell A, Stillman PE, Hasinski AE, Cunningham WA. Neural dissociations in attitude strength: distinct regions of cingulate cortex track ambivalence and certainty. J Exp Psychol Gen. 2016:145(4):419–433.

Mejia ST, Hooker K. Mixed emotions within the context of goal pursuit. Curr Opin Behav Sci. 2017:15:46–50.

Miyamoto Y, Uchida Y, Ellsworth PC. Culture and mixed emotions: co-occurrence of positive and negative emotions in Japan and the United States. Emotion. 2010:10(3):404–415.

Moeller J, Ivcevic Z, Brackett MA, White AE. Mixed emotions: network analyses of intra-individual co-occurrences within and across situations. Emotion. 2018:18(8):1106–1121.

Moore MM, Martin EA. Taking stock and moving forward: a personalized perspective on mixed emotions. Perspect Psychol Sci. 2022:17(5):1258–1275.

Morgenroth E, Vilaclara L, Muszynski M, Gaviria J, Vuilleumier P, Van De Ville D. Probing neurodynamics of experienced emotions—a Hitchhiker’s guide to film fMRI. Soc Cogn Affect Neurosci. 2023:18(1):nsad063.

Murray RJ, Kreibig SD, Pehrs C, Vuilleumier P, Gross JJ, Samson AC. Mixed emotions to social situations: an fMRI investigation. NeuroImage. 2023:271:119973:1–14.

Nohlen HU, van Harreveld F, Rotteveel M, Lelieveld GJ, Crone EA. Evaluating ambivalence: social-cognitive and affective brain regions associated with ambivalent decision-making. Soc Cogn Affect Neurosci. 2014:9(7):924–931.

Nummenmaa L, Saarimäki H. Emotions as discrete patterns of systemic activity. Neurosci Lett. 2017:693:3–8.

Oh VY, Tong EM. Specificity in the study of mixed emotions: a theoretical framework. Personal Soc Psychol Rev. 2022:26(4):283–314.

Oliver MB, Woolley JK. Tragic and poignant entertainment: the gratifications of meaningfulness as emotional response. In: Döveling K, Konijn EA, editors. The Routledge handbook of emotions and mass media. London, UK: Routledge; 2010. pp. 148–161.

Panksepp J. On the embodied neural nature of core emotional affects. J Conscious Stud. 2005:12(8–9):158–184.

Puccetti NA, Villano WJ, Fadok JP, Heller AS. Temporal dynamics of affect in the brain: evidence from human imaging and animal models. Neurosci Biobehav Rev. 2022:133:104491.

Rafaeli E, Rogers GM, Revelle W. Affective synchrony: individual differences in mixed emotions. Personal Soc Psychol Bull. 2007:33(7):915–932.

Rolls ET, Grabenhorst F. The orbitofrontal cortex and beyond: from affect to decision-making. Prog Neurobiol. 2008:86(3):216–244.

Roy M, Shohamy D, Wager TD. Ventromedial prefrontal-subcortical systems and the generation of affective meaning. Trends Cogn Sci. 2012:16(3):147–156.

Russell JA. A circumplex model of affect. J Pers Soc Psychol. 1980:39(6):1161–1178.

Russell JA. Mixed emotions viewed from the psychological constructionist perspective. Emot Rev. 2017:9(2):111–117.

Saarimäki H. Naturalistic stimuli in affective neuroimaging: a review. Front Hum Neurosci. 2021:15:675068:1–25.

Saarimäki H, Gotsopoulos A, Jaaskelainen IP, Lampinen J, Vuilleumier P, Hari R, Sams M, Nummenmaa L. Discrete neural signatures of basic emotions. Cereb Cortex. 2016:26(6):2563–2573.

Sachs ME, Ochsner K, Baldassano C. Brain state dynamics reflect emotion transitions induced by music. bioRxiv. 2023.

Seeley WW. The salience network: a neural system for perceiving and responding to homeostatic demands. J Neurosci. 2019:39(50):9878–9882.

Shirai M, Kimura T. Degree of meaningfulness of an event’s ending can modulate mixed emotional experiences among Japanese undergraduates. Percept Mot Skills. 2022:129(4):1137–1150.

Shirer WR, Ryali S, Rykhlevskaia E, Menon V, Greicius MD. Decoding subject-driven cognitive states with whole-brain connectivity patterns. Cereb Cortex. 2012:22(1):158–165.

Simmons A, Stein MB, Matthews SC, Feinstein JS, Paulus MP. Affective ambiguity for a group recruits ventromedial prefrontal cortex. NeuroImage. 2006:29(2):655–661.

Vaccaro AG, Kaplan JT, Damasio A. Bittersweet: the neuroscience of ambivalent affect. Perspect Psychol Sci. 2020:15(5):1187–1199.

Vaccaro AG, Shakthivel S, Wu H, Iyer R, Kaplan J. Individual differences in feelings of certainty surrounding mixed emotions. PsyArXiv. 2023.

Walle EA, Dukes D. We (still!) need to talk about valence: contemporary issues and recommendations for affective science. Affect Sci. 2023:4(3):463–469.

Wohlgemuth A. Pleasure-unpleasure: an experimental investigation on the feeling-elements. Vol. No. 1-6. Cambridge, UK: University Press; 1911.

Yang Z, Wildschut T, Izuma K, Gu R, Luo YL, Cai H, Sedikides C. Patterns of brain activity associated with nostalgia: a social-cognitive neuroscience perspective. Soc Cogn Affect Neurosci. 2022:17(12):1131–1144.

This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://academic.oup.com/pages/standard-publication-reuse-rights)