Research Report

Emotions in motion: Dynamic compared to static facial expressions of disgust and happiness reveal more widespread emotion-specific activations☆
Introduction
Emotions play an important role in regulating social behavior (see Adolphs, 2002 for review). The impact of emotions on our thoughts, memory, attention, decisions and behavior is part of everyday life experience. Everyday social life requires correct perception and interpretation of different emotional facial expressions for adequate behavior in social contexts.
Pioneering work in the study of face and emotional face perception was provided by Bruce and Young (1986). Building on this face perception model, Haxby et al. (2000) proposed a modified account based on a “core” and an “extended system” of face perception. The “core system” performs the visual analysis of faces and comprises the inferior occipital gyri for early visual analysis, the fusiform face area (FFA) for the processing of facial features and identity, and the superior temporal sulcus (STS) area for the processing of changeable facial features (i.e., eyes, mouth, expressions; see also Allison et al., 2000, for review). The “extended system” is reciprocally linked to the core system and comprises the auditory cortex (e.g., for lip reading), parietal areas (e.g., for spatial attention to changeable features), the temporal pole (TP; associated with autobiographical information, names, and personal identity), and the amygdala (AMG; emotion, facial expressions). Haxby et al. (2000) argued that the temporal sequence of the model's processing stages should be addressed in detail in future studies. Adolphs (2002) focused on this aspect and extended the face perception model of Haxby et al. (2000) by adding the temporal dimension of face and emotion perception and recognition. Within the first 120 ms after emotion onset, the AMG (thus already activated at this early processing stage), the striate cortex and the thalamus are proposed to automatically process and encode very early perceptual structures of salient emotional stimuli. This is considered equivalent to the “core system” suggested by Haxby et al. (2000).
The equivalent of the “extended system” of emotion recognition modules according to Adolphs (2002) comprises the striate cortex, the fusiform face area (FFA; early), the superior temporal gyrus (STG; early), the AMG at a later stage (i.e., its reactivation), the orbitofrontal cortex (OFC) and the basal ganglia (BG). These regions are proposed to play a role in processing the motion of emotional expressions, even when motion is merely implied by static stimuli (superior temporal sulcus [STS] area, especially for changes in the mouth and eye region; for review, see Allison et al., 2000, Haxby et al., 2000, Hoffman and Haxby, 2000).
Numerous fMRI studies have provided evidence for an involvement of emotion-related neural networks in the perception of static emotional face expressions (for review, see Adolphs, 2002, Davidson and Irwin, 1999, LeDoux, 1996, Murphy et al., 2003, Phan et al., 2002).
The majority of emotion perception studies (for reviews, see Adolphs, 2002, Davidson and Irwin, 1999, Murphy et al., 2003, Phan et al., 2002) have used stimuli displaying static emotional facial expressions (see also Ekman and Friesen, 1976). Natural dynamic features, however, should be given greater weight in stimulus construction, because they convey a greater richness of temporal and structural facial object properties (Harwood et al., 1999, Sato et al., 2004), which can improve the three-dimensional perception of faces (Knight and Johnston, 1997). These features, in turn, potentially facilitate emotion recognition and might even support the processing of social interactions in a more natural way (Bassili, 1979, Berry, 1990, LaBar et al., 2003, Sato et al., 2004).
Behavioral studies on moving facial expressions corroborate this line of argument, showing higher arousal ratings (Simons et al., 1999, Weyers et al., 2006) as well as better recognition accuracy (Ambadar et al., 2005, Bassili, 1979, Harwood et al., 1999) when participants rated dynamic compared to static emotional facial expressions.
To date, there are only a few neuroimaging studies addressing the perception of dynamic emotional stimuli, and, to our knowledge, there is no fMRI study examining the neural processing of dynamic facial expressions of disgust.
In a PET study, Kilts et al. (2003) contrasted dynamic and static emotional face stimuli showing happy and angry facial expressions. They reported increased activity in V5, the STS area and the periamygdaloid area for dynamic versus static angry faces, and in the cuneus, V5, and the lingual, middle temporal and medial frontal gyri for dynamic versus static happy faces. In an fMRI study, LaBar et al. (2003) presented photographs and morphed videos of emotional expressions of anger and fear and reported enhanced activation for emotional compared to neutral expressions in, among other regions, the fusiform gyrus (FG), the ventromedial prefrontal (also orbitofrontal) cortex and the STS area, with stronger effects for dynamic than for static faces. Sato et al. (2004) applied a passive viewing task including happy and fearful dynamic faces and described more widespread activations for dynamic compared to static happy and fearful expressions when contrasted with neutral faces or mosaics of scrambled faces. In line with Kilts et al. (2003) and LaBar et al. (2003), Sato et al. (2004) concluded that dynamic stimuli convey more lively and realistic aspects of faces as they occur in social interactions, and related this to the more widespread activation patterns.
The above-mentioned studies demonstrated that the processing of dynamic in contrast to static facial expressions appears to more reliably recruit neural networks of emotion processing such as the amygdala (Kilts et al., 2003, LaBar et al., 2003), the fusiform gyrus, inferior occipital, middle and superior temporal regions (STS area; for reviews, see Allison et al., 2000, Haxby et al., 2000), motion-sensitive areas (MT+/V5), and the lateral inferior frontal cortex (mirror neuron system; Buccino et al., 2001, Kilts et al., 2003, LaBar et al., 2003, Leslie et al., 2004, Sato et al., 2004).
We presume that naturally moving faces might provide a more valid stimulus basis for the examination of neuronal correlates of facial expression perception. Only a few studies have applied “natural” dynamic facial expressions (Gepner et al., 2001, Kilts et al., 2003). Kilts et al. (2003) used dynamic emotional expressions of neutrality, happiness, and anger from a face database consisting of two female and two male professional actors. Gepner et al. (2001) recorded videos of a single actress displaying natural dynamic facial expressions of joy, surprise, sadness, and disgust. A downside of these studies is that participants may have habituated because the same actors were presented repeatedly. For this reason, we developed a new stimulus data set based on 80 different amateur actors and actresses.
The present study aimed at three central points. First, since we used a new stimulus database, the stimuli needed to be evaluated (i) in an extensive psychometric evaluation study (see Experimental procedures section) and (ii) by the participants of the fMRI study. Second, BOLD activation patterns for emotional (happiness/disgust) compared to neutral faces were examined; the above-discussed network of emotion-specific areas was expected for both static and dynamic stimuli. Third, we examined BOLD activation patterns for dynamic compared to static stimuli of both emotional valences relative to neutrality. We expected an “emotion by motion” effect, that is, an enhancement of emotional face perception and of activation in the above-mentioned emotion-specific network through the movement of emotional facial expressions. Thus, we expected consistent (i.e., reliable) and more widespread (i.e., topographically more broadly distributed) activation patterns for dynamic compared to static faces within this network. On the behavioral level, this “emotion by motion” effect should also result in better recognition rates for dynamic compared to static facial expressions.
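The contrast logic just described can be illustrated with a small sketch. This is an illustrative reconstruction, not the authors' actual design matrix: the regressor names and the `contrast` helper are hypothetical, assuming one regressor per motion-by-emotion condition cell.

```python
import numpy as np

# Hypothetical regressors of a 2 (motion) x 3 (emotion) design,
# one per condition cell; names are illustrative only.
conditions = ["static_neutral", "static_happy", "static_disgust",
              "dynamic_neutral", "dynamic_happy", "dynamic_disgust"]

def contrast(pos, neg):
    """Zero-sum contrast vector: weights sum to +1 over pos, -1 over neg."""
    c = np.zeros(len(conditions))
    for name in pos:
        c[conditions.index(name)] = 1.0 / len(pos)
    for name in neg:
        c[conditions.index(name)] = -1.0 / len(neg)
    return c

# Emotional vs. neutral faces, collapsed over static/dynamic
emotion_vs_neutral = contrast(
    ["static_happy", "static_disgust", "dynamic_happy", "dynamic_disgust"],
    ["static_neutral", "dynamic_neutral"])

# "Emotion by motion": dynamic emotional vs. static emotional faces
dynamic_vs_static_emotion = contrast(
    ["dynamic_happy", "dynamic_disgust"],
    ["static_happy", "static_disgust"])

print(emotion_vs_neutral)
print(dynamic_vs_static_emotion)
```

Both vectors sum to zero, so each contrast tests a pure difference between condition means rather than overall signal level.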
Section snippets
Behavioral data (evaluation study)
Data of the behavioral evaluation study of 30 healthy female participants (mean age 22.7 ± 2.9 years; see Experimental procedures for further information) showed recognition accuracy rates of 94.1% (± 9.8) for neutral, 98.1% (± 5.2) for happy, 94.2% (± 4.5) for disgusted, 95.3% (± 6.4) for fearful, and 88.8% (± 5.1) for angry expressions. Repeated-measures ANOVAs with the factor EMOTION (5 levels: neutrality, happiness, disgust, fear, anger), calculated separately for arousal and category, revealed
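As a hedged illustration of the analysis above, the following sketch runs a one-way repeated-measures ANOVA with the within-subject factor EMOTION. Only the group means come from the text; the subject-level accuracy values are simulated, so the resulting F statistic is illustrative and not the study's result.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Reported mean recognition accuracies (%) per expression; the
# per-subject values below are simulated for illustration only.
means = {"neutral": 94.1, "happy": 98.1, "disgust": 94.2,
         "fearful": 95.3, "angry": 88.8}

rng = np.random.default_rng(seed=0)
rows = [{"subject": s, "emotion": emo,
         "accuracy": rng.normal(mu, 5.0)}
        for s in range(30)             # 30 participants, as in the study
        for emo, mu in means.items()]
df = pd.DataFrame(rows)

# Repeated-measures ANOVA, within-subject factor EMOTION (5 levels)
res = AnovaRM(df, depvar="accuracy", subject="subject",
              within=["emotion"]).fit()
print(res.anova_table)
```

With 30 subjects and 5 levels, the factor has 4 numerator and 116 denominator degrees of freedom.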
Discussion
In the present study, we aimed at two central hypotheses. First, we predicted that the processing of the emotional facial expressions of a newly introduced stimulus database would evoke emotion-specific networks. Second, we expected more widespread BOLD activation patterns for dynamic compared to static facial expressions in those emotion-specific regions and a higher recognition accuracy of emotional stimuli for the dynamic modality.
Confirming the above-mentioned hypotheses for the largest
Conclusion
The results of the present study indicate that dynamic face stimuli result in more pronounced and distributed activation patterns when compared to static faces. This finding is interpreted in terms of higher ecological validity of the dynamic face stimuli facilitating the perception of emotional facial expressions, and recruiting more widespread emotion-specific neuronal networks possibly due to a higher complexity of stimuli and parallel processing of information. Except for two studies
Participants
The fMRI study group consisted of 16 female adults between 19 and 27 years of age (21.6 ± 2.3 years) from the University of Bremen campus. We included only female participants in order to avoid gender effects in neural activation patterns and emotion perception, which have previously been reported in neuroimaging studies of emotion (for review, see Wager et al., 2003). Women have been reported to show stronger activation than men during emotional processing tasks (Wager et al., 2003).
Acknowledgments
The authors wish to thank Peter Erhard and Melanie Loebe for technical support, Christina Regenbogen for assistance with data analysis, Sascha Clamer for providing software support, all participants in this study, and all actresses for participation in the development of the new stimulus database.
References (81)

- Social perception from visual cues: role of the STS region. Trends Cogn. Sci. (2000)
- Response and habituation of the human amygdala during visual processing of facial expression. Neuron (1996)
- Predictors of amygdala activation during the processing of emotional stimuli: a meta-analysis of 385 PET and fMRI studies. Brain Res. Rev. (2008)
- The functional neuroanatomy of emotion and affective style. Trends Cogn. Sci. (1999)
- An empirical comparison of SPM preprocessing parameters to the analysis of fMRI data. Neuroimage (2002)
- Neural correlates of internally-generated disgust via autobiographical recall: a functional magnetic resonance imaging investigation. Neurosci. Lett. (2004)
- Multisubject fMRI studies and conjunction analyses. Neuroimage (1999)
- How many subjects constitute a study? Neuroimage (1999)
- Stochastic designs in event-related fMRI. Neuroimage (1999)
- Time course of the subjective emotional response to aversive pictures: relevance to fMRI studies. Psychiatry Res. (2001)
- Separating subjective emotion from the perception of emotion-inducing stimuli: an fMRI study. Neuroimage
- Explicit and incidental facial expression processing: an fMRI study. Neuroimage
- Brain areas active during visual perception of biological motion. Neuron
- The distributed human neural system for face perception. Trends Cogn. Sci.
- A common neural basis for receptive and expressive communication of pleasant facial affect. Neuroimage
- Expression is computed separately from facial identity, and it is computed separately for moving and static faces: neuropsychological evidence. Neuropsychologia
- Dissociable neural pathways are involved in the recognition of emotion in static and dynamic facial expressions. Neuroimage
- Task instructions modulate neural responses to fearful facial expressions. Biol. Psychiatry
- Functional imaging of face and hand imitation: towards a motor theory of empathy. Neuroimage
- Attention to emotion modulates fMRI activity in human right superior temporal sulcus. Brain Res. Cogn. Brain Res.
- Beauty in a smile: the role of medial orbitofrontal cortex in facial attractiveness. Neuropsychologia
- The assessment and analysis of handedness: the Edinburgh inventory. Neuropsychologia
- Gender differences in electrophysiological responses to facial stimuli. Biol. Psychiatry
- Brain activation evoked by perception of gaze shifts: the influence of context. Neuropsychologia
- Functional neuroanatomy of emotion: a meta-analysis of emotion activation studies in PET and fMRI. Neuroimage
- Motor and cognitive functions of the ventral premotor cortex. Curr. Opin. Neurobiol.
- Enhanced neural activity in response to dynamic facial expressions of emotion: an fMRI study. Brain Res. Cogn. Brain Res.
- Hemodynamic responses to fear and disgust-inducing pictures: an fMRI study. Int. J. Psychophysiol.
- Erotic and disgust-inducing pictures—differences in the hemodynamic responses of the brain. Biol. Psychol.
- Effects of attention and emotion on face processing in the human brain: an event-related fMRI study. Neuron
- Valence, gender, and lateralization of functional brain anatomy in emotion: a meta-analysis of findings from neuroimaging. Neuroimage
- Both of us disgusted in My insula: the common neural basis of seeing and feeling disgust. Neuron
- Recognizing emotion from facial expressions: psychological and neurological mechanisms. Behav. Cogn. Neurosci. Rev.
- Deciphering the enigmatic face: the importance of facial dynamics in interpreting subtle facial expressions. Psychol. Sci.
- Neural correlates of the automatic processing of threat facial signals. J. Neurosci.
- Emotion recognition: the role of facial movement and the relative importance of upper and lower areas of the face. J. Pers. Soc. Psychol.
- What can a moving face tell us? J. Pers. Soc. Psychol.
- Sex differences in perception of emotion intensity in dynamic and static facial expressions. Exp. Brain Res.
- Understanding face recognition. Br. J. Psychol.
- Action observation activates premotor and parietal areas in a somatotopic manner: an fMRI study. Eur. J. Neurosci.
☆ The study was carried out at the Center for Advanced Imaging (CAI), University of Bremen, Germany.