
Brain Research

Volume 1284, 11 August 2009, Pages 100-115

Research Report
Emotions in motion: Dynamic compared to static facial expressions of disgust and happiness reveal more widespread emotion-specific activations

https://doi.org/10.1016/j.brainres.2009.05.075

Abstract

In social contexts, facial expressions are dynamic in nature and vary rapidly in relation to situational requirements. However, there are very few fMRI studies using dynamic emotional stimuli. The aims of this study were (1) to introduce and evaluate a new stimulus database of static and dynamic emotional facial expressions, rated for arousal and recognizability both by the participants of the present fMRI study and by an external sample of 30 healthy women, (2) to examine the neural networks involved in emotion perception of static and dynamic facial stimuli separately, and (3) to examine the impact of motion on the emotional processing of dynamic compared to static face stimuli. A total of 16 females participated in the present fMRI study, performing a passive emotion perception task including static and dynamic faces with neutral, happy and disgusted expressions. Comparing dynamic to static faces revealed enhanced emotion-specific brain activation patterns in the parahippocampal gyrus (PHG) including the amygdala (AMG), fusiform gyrus (FG), superior temporal gyrus (STG), inferior frontal gyrus (IFG), and occipital and orbitofrontal cortex (OFC). These regions have been associated with emotional memory encoding, the perception of threat, facial identity, biological motion, the mirror neuron system, increased emotional arousal, and reward processing, respectively. Post hoc ratings revealed better recognizability for the dynamic than for the static stimuli. In conclusion, dynamic facial expressions may provide a more appropriate approach to examining emotional face perception than static stimuli.

Introduction

Emotions play an important role in regulating social behavior (see Adolphs, 2002 for review). The impact of emotions on our thoughts, memory, attention, decisions and behavior is part of everyday life experience. Everyday social life requires correct perception and interpretation of different emotional facial expressions for adequate behavior in social contexts.

Pioneering work in the study of face and emotional face perception was provided by Bruce and Young (1986). Building on this face perception model, Haxby et al. (2000) proposed a modified version based on a “core” and an “extended system” of face perception. The “core system” represents the visual analysis of faces and comprises the inferior occipital gyri for early visual analysis, the fusiform face area (FFA) for the processing of facial features and identity, and the superior temporal sulcus (STS) area for the processing of changeable facial features (i.e., eyes, mouth, expressions; see also Allison et al. (2000), for review). The “extended system” is reciprocally linked to the core system and comprises the auditory cortex (e.g., for lip reading), parietal areas (e.g., for spatial attention to changeable features), the temporal pole (TP; associated with autobiographical information, names, and personal identity), and the amygdala (AMG; emotion, facial expressions). Haxby et al. (2000) argued that the temporal sequence of the model's processing stages should be addressed in detail in future studies. Adolphs (2002) focused on this aspect and extended the face perception model of Haxby et al. (2000) by adding the temporal dimension of face and emotion perception and recognition. Within the first 120 ms after emotion onset, the amygdala (AMG, already activated at this early processing stage), the striate cortex and the thalamus are proposed to automatically process and encode very early perceptual structures of salient emotional stimuli. This stage is considered equivalent to the “core system” suggested by Haxby et al. (2000).

The equivalent of the “extended system” of emotion recognition modules according to Adolphs (2002) comprises the striate cortex, the fusiform face area (FFA; early), the superior temporal gyrus (STG; early), a late reactivation of the AMG, the orbitofrontal cortex (OFC) and the basal ganglia (BG). The latter regions are proposed to play a role in processing the motion of emotional expressions, even when motion is merely implied by static stimuli (superior temporal sulcus [STS] area, especially for changes in the mouth and eye regions; for review, see Allison et al., 2000, Haxby et al., 2000, Hoffman and Haxby, 2000).

Numerous fMRI studies have provided evidence for an involvement of emotion-related neural networks in the perception of static emotional face expressions (for review, see Adolphs, 2002, Davidson and Irwin, 1999, LeDoux, 1996, Murphy et al., 2003, Phan et al., 2002).

The majority of emotion perception studies (for reviews, see Adolphs, 2002, Davidson and Irwin, 1999, Murphy et al., 2003, Phan et al., 2002) have used stimuli displaying static emotional facial expressions (see also Ekman and Friesen, 1976). Natural dynamic features, however, should be considered for stimulus construction because they convey a greater richness of temporal and structural facial object properties (Harwood et al., 1999, Sato et al., 2004), which can improve the three-dimensional perception of faces (Knight and Johnston, 1997). These features, in turn, potentially facilitate emotion recognition and might even support the processing of social interactions in a more natural way (Bassili, 1979, Berry, 1990, LaBar et al., 2003, Sato et al., 2004).

Behavioral studies on moving facial expressions corroborate this line of argument, showing higher arousal ratings (Simons et al., 1999, Weyers et al., 2006) as well as better recognition accuracy (Ambadar et al., 2005, Bassili, 1979, Harwood et al., 1999) in rating tasks with dynamic compared to static emotional facial expressions.

To date, there are only a few neuroimaging studies addressing the perception of dynamic emotional stimuli, and, to our knowledge, there is no fMRI study examining the neural processing of dynamic facial expressions of disgust.

In a PET study, Kilts et al. (2003) contrasted dynamic and static emotional face stimuli showing happy and angry facial expressions. They reported increased activity in V5, the STS area and the periamygdaloid area for dynamic versus static angry faces, and in the cuneus, V5, and the lingual, middle temporal and medial frontal gyri for dynamic versus static happy faces. In an fMRI study, LaBar et al. (2003) presented photographs and morphed videos of emotional expressions of anger and fear and reported enhanced activation for emotional compared to neutral expressions in, among other regions, the fusiform gyrus (FG), the ventromedial prefrontal (orbitofrontal) cortex and the STS area, with stronger effects for dynamic than for static faces. Sato et al. (2004) applied a passive viewing task including happy and fearful dynamic faces and described more widespread activations for dynamic compared to static happy and fearful expressions, each contrasted against neutral faces or mosaics of scrambled faces. In line with Kilts et al. (2003) and LaBar et al. (2003), Sato et al. (2004) concluded that dynamic stimuli convey more lively and realistic aspects of faces occurring in social interactions, leading to more widespread activation patterns.

The above-mentioned studies demonstrated that the processing of dynamic in contrast to static facial expressions appears to recruit the neural networks of emotion processing more reliably, including the amygdala (Kilts et al., 2003, LaBar et al., 2003), the fusiform gyrus, inferior occipital, middle and superior temporal regions (STS area; for reviews, see Allison et al., 2000, Haxby et al., 2000), motion-sensitive areas (MT+/V5), and the lateral inferior frontal cortex (mirror neuron system; Buccino et al., 2001, Kilts et al., 2003, LaBar et al., 2003, Leslie et al., 2004, Sato et al., 2004).

We presume that naturally moving faces might provide a more valid stimulus basis for examining the neuronal correlates of facial expression perception. Only a few studies have applied “natural” dynamic facial expressions (Gepner et al., 2001, Kilts et al., 2003). Kilts et al. (2003) applied dynamic emotional expressions of neutrality, happiness, and anger from a face database consisting of two female and two male professional actors. Gepner et al. (2001) recorded videos of a single actress displaying natural dynamic facial expressions of joy, surprise, sadness, and disgust. A downside of these studies is that participants may have habituated because the same actors were presented multiple times. For this reason, we developed a new stimulus data set based on 80 different amateur actors and actresses.

The present study pursued three central aims: First, since we used a new stimulus database, the stimuli needed to be evaluated (i) in an extensive psychometric evaluation study (see Experimental procedures section), and (ii) by the participants of the fMRI study. Second, BOLD activation patterns of emotional (happiness/disgust) compared to neutral faces were examined. The network of emotion-specific areas discussed above was expected to respond to emotional (happiness/disgust) compared to neutral faces for both static and dynamic stimuli. Third, we examined BOLD activation patterns of dynamic compared to static stimuli of both emotional valences, each contrasted against neutrality. We expected an “emotion by motion effect”: an enhancement of emotional face perception, and of activation in the above-mentioned emotion-specific network, driven by the movement of emotional facial expressions. Thus, we expected consistent (i.e., reliable) and more widespread (i.e., topographically more broadly distributed) activation patterns for dynamic compared to static faces in this network. On the behavioral level, this enhanced “emotion by motion” effect should also result in better recognition rates for the different facial expressions in dynamic compared to static stimuli.
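The "emotion by motion" comparison described above corresponds, in a typical fMRI analysis, to an interaction contrast on condition regressors in a general linear model. The sketch below is purely illustrative and is not the study's actual analysis code: the regressors are random stand-ins, and the real design would involve hemodynamic convolution and confound modelling not specified in this text.

```python
import numpy as np

rng = np.random.default_rng(0)
n_scans = 200

# Hypothetical design matrix with one regressor per condition plus an
# intercept (random stand-ins; a real design uses convolved event regressors).
conditions = ["static_neutral", "static_emotional",
              "dynamic_neutral", "dynamic_emotional"]
X = rng.random((n_scans, len(conditions)))
X = np.column_stack([X, np.ones(n_scans)])

# Simulated single-voxel time series.
y = rng.standard_normal(n_scans)

# Ordinary least-squares fit of the condition betas.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# "Emotion by motion" interaction contrast:
# (dynamic_emotional - dynamic_neutral) - (static_emotional - static_neutral)
c = np.array([1.0, -1.0, -1.0, 1.0, 0.0])
effect = float(c @ beta)
```

A positive `effect` at a voxel would indicate that the emotional-versus-neutral difference is larger for dynamic than for static faces, which is the pattern the study predicted for the emotion-specific network.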


Behavioral data (evaluation study)

Data of the behavioral evaluation study of 30 healthy female participants (mean age 22.7 ± 2.9 years; see Experimental procedures for further information) showed recognition accuracy rates of 94.1% (± 9.8) for neutral, 98.1% (± 5.2) for happy, 94.2% (± 4.5) for disgusted, 95.3% (± 6.4) for fearful, and 88.8% (± 5.1) for angry expressions. Repeated-measures ANOVAs with the factor EMOTION (5 levels: neutrality, happiness, disgust, fear, anger), calculated separately for arousal and category, revealed
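The one-way repeated-measures ANOVA used here can be sketched as follows. This is an illustrative implementation with toy numbers, not the study's actual data or analysis code; it shows how the within-subject F statistic partitions variance into condition, subject and error components.

```python
import numpy as np

def rm_anova_oneway(data):
    """One-way repeated-measures ANOVA for a (n_subjects, n_conditions) array.

    Returns the F statistic for the within-subject factor together with its
    degrees of freedom, after removing between-subject variance from the
    error term."""
    n, k = data.shape
    grand = data.mean()
    ss_cond = n * ((data.mean(axis=0) - grand) ** 2).sum()   # condition effect
    ss_subj = k * ((data.mean(axis=1) - grand) ** 2).sum()   # subject effect
    ss_total = ((data - grand) ** 2).sum()
    ss_error = ss_total - ss_cond - ss_subj                  # residual
    df_cond, df_error = k - 1, (n - 1) * (k - 1)
    F = (ss_cond / df_cond) / (ss_error / df_error)
    return F, df_cond, df_error

# Toy data (2 subjects x 3 conditions) purely for illustration; the study's
# evaluation used 30 participants rating 5 emotion categories.
toy = np.array([[1.0, 2.0, 3.0],
                [2.0, 2.0, 4.0]])
F, df_cond, df_error = rm_anova_oneway(toy)
```

For the study's design the within-subject factor EMOTION would yield F(4, 116) tests, one ANOVA each for the arousal and category ratings.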

Discussion

In the present study, we tested two central hypotheses. First, we predicted that the processing of the emotional facial expressions of a newly introduced stimulus database would engage emotion-specific networks. Second, we expected more widespread BOLD activation patterns for dynamic compared to static facial expressions in those emotion-specific regions and a higher recognition accuracy of emotional stimuli for the dynamic modality.

Confirming the above-mentioned hypotheses for the largest

Conclusion

The results of the present study indicate that dynamic face stimuli result in more pronounced and distributed activation patterns when compared to static faces. This finding is interpreted in terms of higher ecological validity of the dynamic face stimuli facilitating the perception of emotional facial expressions, and recruiting more widespread emotion-specific neuronal networks possibly due to a higher complexity of stimuli and parallel processing of information. Except for two studies

Participants

The fMRI study group consisted of 16 female adults between 19 and 27 years of age (21.6 ± 2.3 years) from the Bremen University campus. Only female participants were included in order to avoid gender effects on neural activation patterns and emotion perception, which have previously been reported in neuroimaging studies of emotion (for review, see Wager et al., 2003). Wager et al. (2003) reported that women show stronger activation during emotional processing tasks than men.

Acknowledgments

The authors wish to thank Peter Erhard and Melanie Loebe for technical support, Christina Regenbogen for assistance with data analysis, Sascha Clamer for providing software support, all participants in this study, and all actresses for participation in the development of the new stimulus database.

References (81)

  • Garrett, A.S., et al. Separating subjective emotion from the perception of emotion-inducing stimuli: an fMRI study. Neuroimage (2006)
  • Gorno-Tempini, M.L., et al. Explicit and incidental facial expression processing: an fMRI study. Neuroimage (2001)
  • Grossman, E.D., et al. Brain areas active during visual perception of biological motion. Neuron (2002)
  • Haxby, J.V., et al. The distributed human neural system for face perception. Trends Cogn. Sci. (2000)
  • Hennenlotter, A., et al. A common neural basis for receptive and expressive communication of pleasant facial affect. Neuroimage (2005)
  • Humphreys, G.W., et al. Expression is computed separately from facial identity, and it is computed separately for moving and static faces: neuropsychological evidence. Neuropsychologia (1993)
  • Kilts, C.D., et al. Dissociable neural pathways are involved in the recognition of emotion in static and dynamic facial expressions. Neuroimage (2003)
  • Lange, K., et al. Task instructions modulate neural responses to fearful facial expressions. Biol. Psychiatry (2003)
  • Leslie, K.R., et al. Functional imaging of face and hand imitation: towards a motor theory of empathy. Neuroimage (2004)
  • Narumoto, J., et al. Attention to emotion modulates fMRI activity in human right superior temporal sulcus. Brain Res. Cogn. Brain Res. (2001)
  • O'Doherty, J., et al. Beauty in a smile: the role of medial orbitofrontal cortex in facial attractiveness. Neuropsychologia (2003)
  • Oldfield, R.C. The assessment and analysis of handedness: the Edinburgh inventory. Neuropsychologia (1971)
  • Orozco, S., et al. Gender differences in electrophysiological responses to facial stimuli. Biol. Psychiatry (1998)
  • Pelphrey, K.A., et al. Brain activation evoked by perception of gaze shifts: the influence of context. Neuropsychologia (2003)
  • Phan, K.L., et al. Functional neuroanatomy of emotion: a meta-analysis of emotion activation studies in PET and fMRI. Neuroimage (2002)
  • Rizzolatti, G., et al. Motor and cognitive functions of the ventral premotor cortex. Curr. Opin. Neurobiol. (2002)
  • Sato, W., et al. Enhanced neural activity in response to dynamic facial expressions of emotion: an fMRI study. Brain Res. Cogn. Brain Res. (2004)
  • Stark, R., et al. Hemodynamic responses to fear and disgust-inducing pictures: an fMRI study. Int. J. Psychophysiol. (2003)
  • Stark, R., et al. Erotic and disgust-inducing pictures—differences in the hemodynamic responses of the brain. Biol. Psychol. (2005)
  • Vuilleumier, P., et al. Effects of attention and emotion on face processing in the human brain: an event-related fMRI study. Neuron (2001)
  • Wager, T.D., et al. Valence, gender, and lateralization of functional brain anatomy in emotion: a meta-analysis of findings from neuroimaging. Neuroimage (2003)
  • Wicker, B., et al. Both of us disgusted in My insula: the common neural basis of seeing and feeling disgust. Neuron (2003)
  • Adolphs, R. Recognizing emotion from facial expressions: psychological and neurological mechanisms. Behav. Cogn. Neurosci. Rev. (2002)
  • Ambadar, Z., et al. Deciphering the enigmatic face: the importance of facial dynamics in interpreting subtle facial expressions. Psychol. Sci. (2005)
  • Anderson, A.K., et al. Neural correlates of the automatic processing of threat facial signals. J. Neurosci. (2003)
  • Bassili, J.N. Emotion recognition: the role of facial movement and the relative importance of upper and lower areas of the face. J. Pers. Soc. Psychol. (1979)
  • Berry, D.S. What can a moving face tell us? J. Pers. Soc. Psychol. (1990)
  • Biele, C., et al. Sex differences in perception of emotion intensity in dynamic and static facial expressions. Exp. Brain Res. (2006)
  • Bruce, V., et al. Understanding face recognition. Br. J. Psychol. (1986)
  • Buccino, G., et al. Action observation activates premotor and parietal areas in a somatotopic manner: an fMRI study. Eur. J. Neurosci. (2001)

    The study was carried out at the Center for Advanced Imaging (CAI), University of Bremen, Germany.
