NeuroImage

Volume 122, 15 November 2015, Pages 306-317

Brain activity correlates with emotional perception induced by dynamic avatars

https://doi.org/10.1016/j.neuroimage.2015.07.056

Highlights

  • Dynamic computer-generated avatars were used to study emotion recognition from gait.

  • Emotional gait activated emotion- and visual-related brain systems.

  • Ambiguous avatars revealed activations associated with the perceived emotion.

  • Amygdala activity was parametrically linked to the emotional load of the stimuli.

  • Dynamic movement patterns, more than static postures or velocity, influenced the results.

Abstract

An accurate judgment of the emotional state of others is a prerequisite for successful social interaction and hence survival. Thus, it is not surprising that we are highly skilled at recognizing the emotions of others. Here we aimed to examine the neuronal correlates of emotion recognition from gait. To this end we created highly controlled dynamic body-movement stimuli based on real human motion-capture data (Roether et al., 2009). These animated avatars displayed gait in four emotional (happy, angry, fearful, and sad) and speed-matched neutral styles. For each emotional gait and its equivalent neutral gait, avatars were displayed at five morphing levels between the two. Subjects underwent fMRI scanning while classifying the emotions and the emotional intensity levels expressed by the avatars.
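For intuition, here is a minimal sketch of how such intermediate morphing levels could be produced, assuming each gait cycle is stored as a time-aligned joint-angle trajectory. The linear blend below is an illustrative simplification, not the actual generation pipeline (the avatars were created with the motion-morphing technique of Roether et al., 2009), and all function and variable names are hypothetical.

```python
import numpy as np

def morph_gait(neutral: np.ndarray, emotional: np.ndarray, w: float) -> np.ndarray:
    """Linearly blend two time-aligned joint-angle trajectories.

    neutral, emotional : arrays of shape (n_frames, n_joints), assumed to be
        time-warped to a common gait-cycle length beforehand.
    w : morphing weight, 0.0 (fully neutral) to 1.0 (fully emotional).
    """
    assert neutral.shape == emotional.shape
    return (1.0 - w) * neutral + w * emotional

# Five morphing levels spanning the neutral-to-emotional axis, mirroring the
# five-level design described above: [0.0, 0.25, 0.5, 0.75, 1.0].
levels = np.linspace(0.0, 1.0, 5)
# morphed = [morph_gait(neutral_traj, angry_traj, w) for w in levels]
```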

Our results revealed robust brain selectivity to emotional compared to neutral gait stimuli in brain regions involved in emotion and biological motion processing, such as the extrastriate body area (EBA), fusiform body area (FBA), superior temporal sulcus (STS), and the amygdala (AMG). Brain activity in the amygdala reflected emotional awareness: for visually identical stimuli, it showed an amplified response when the stimulus was perceived as emotional. Notably, in avatars gradually morphed along an emotional-expression axis, there was a parametric correlation between amygdala activity and emotional intensity.
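A parametric correlation of this kind is conventionally tested by entering trial-wise emotional intensity as a weighted, HRF-convolved regressor. The sketch below shows one standard way to build such a regressor and relate it to an amygdala ROI time course; it is not the authors' analysis pipeline, and the double-gamma HRF shape, the sampling, and all names are assumptions for illustration.

```python
import numpy as np
from scipy.stats import gamma, pearsonr

def canonical_hrf(tr: float, duration: float = 32.0) -> np.ndarray:
    """Double-gamma hemodynamic response function sampled at the scanner TR."""
    t = np.arange(0.0, duration, tr)
    hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0  # SPM-style shape
    return hrf / hrf.sum()

def intensity_regressor(onsets, intensities, n_scans, tr):
    """Stick function weighted by per-trial emotional intensity (e.g., the
    morphing level), convolved with the HRF. In a full GLM this modulator
    would also be mean-centered against the main stimulus regressor."""
    sticks = np.zeros(n_scans)
    for onset, intensity in zip(onsets, intensities):
        sticks[int(round(onset / tr))] += intensity
    return np.convolve(sticks, canonical_hrf(tr))[:n_scans]

# Hypothetical usage with a 2 s TR, trial onsets in seconds, and morphing
# levels as intensities; roi_ts would be the mean amygdala time course:
# reg = intensity_regressor(onsets, morph_levels, len(roi_ts), tr=2.0)
# r, p = pearsonr(reg, roi_ts)
```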

This study extends the mapping of emotional decoding in the human brain to the domain of highly controlled dynamic biological motion. Our results highlight an extensive level of brain processing of emotional information related to body language, which relies mostly on body kinematics.

Introduction

There is a large body of literature concerning the neuronal correlates of both emotional processing and biological motion. However, little is known about the neuronal correlates of perception of bodily emotional expression.

So far, a major part of research exploring expression and recognition of emotions has focused on facial expressions (Allison et al., 2000, Ekman et al., 1969, Gobbini and Haxby, 2007, Haxby et al., 2000, Izard, 1971, Puce et al., 2007, Rossion et al., 2003, Vuilleumier and Pourtois, 2007, Winston et al., 2007). However, body language also carries important information about emotional state and inter-individual signaling, which is sometimes more reliable, accessible, and action-oriented than facial expressions (Aviezer et al., 2012, De Gelder, 2006). A number of studies have begun mapping brain regions showing selective responses to emotional body expressions, employing either static images or dynamic videos. Studies using static images found that emotional, compared to neutral, body expressions elicit enhanced activity in the fusiform face area (FFA), the amygdala, and the temporal pole (Grezes et al., 2007, Hadjikhani and de Gelder, 2003, Pichon et al., 2008). Studies using naturalistic dynamic videos showed increased activity in visual body-processing areas, such as the extrastriate body area (EBA) and fusiform body area (FBA), as well as in the FFA, the temporo-parietal junction (TPJ), and the superior temporal sulcus (STS), which is frequently highlighted in research on biological motion, goal-directed action, and emotion (Atkinson et al., 2012, Grezes et al., 2007, Kret et al., 2011, Pichon et al., 2008, Schneider et al., 2014, Sinke et al., 2010). They also showed increased brain activity in regions linked to autonomic regulation, such as the hypothalamus, the ventro-medial prefrontal cortex, and the premotor cortex (Grezes et al., 2007, Pichon et al., 2008).

Previous studies have uncovered differential brain processing for static and dynamic bodily stimuli. Thus, fMRI studies that directly compared static and dynamic emotional body gestures showed an exclusive response to dynamic emotional (angry) body gestures in the STS and premotor cortex (Grezes et al., 2007, Pichon et al., 2008), the hypothalamus, the ventro-medial prefrontal cortex, and the temporal pole (Pichon et al., 2008). A behavioral study directly comparing dynamic and static body expressions (Atkinson et al., 2004) revealed that exaggerating body movements enhanced recognition accuracy and that emotional-intensity ratings were higher for movies than for still images. These findings suggest that evaluation of emotional body gestures relies more on movement than on static form information. In addition, the dynamic aspect may hold important information regarding the emotional intensity of the stimulus. However, a major problem associated with the use of dynamic naturalistic stimuli is the many dimensions of image kinetics that may vary between stimuli. One option for dealing with this complexity is to use more minimalistic and controlled stimuli depicting biological motion. Indeed, previous studies have shown that human observers can successfully recognize expressed emotions even in extremely impoverished stimuli such as point-light displays (Alaerts et al., 2011, Atkinson et al., 2004, Clarke et al., 2005, Dittrich et al., 1996, Nackaerts et al., 2012). These findings suggest that the essential information for emotional signaling can be derived from minimalistic dynamic displays of emotional gestures. However, point-light displays constitute an extreme abstraction that could affect the emotional experience.

A promising intermediate category, which allows examining dynamic biological motion in a controlled yet more naturalistic manner, is computer-generated avatars. These stimuli offer the advantage of high stimulus control while possessing a much more naturalistic appearance than point-light displays. Indeed, Schneider et al. (2014) used a new set of dynamic avatar stimuli (provided by Roether et al., 2009, Roether et al., 2010) for the first time in a brain imaging study. These faceless avatars were both detailed and highly controlled, expressing a variety of emotional gaits derived from real human movements. In their functional near-infrared spectroscopy (fNIRS) study, Schneider et al. (2014) used an emotion-detection task (identifying the emotion expressed by the avatar) and, as a control, a speed-evaluation task (rating the moving speed of each avatar on a 5-point scale). They found enhanced activity to emotional compared to neutral stimuli in both motion- and emotion-processing brain regions (in the parietal and temporal lobes). Moreover, these regions were less responsive when the emotion was not intentionally processed (i.e., during the speed task).

Here we opted to use similar highly controlled stimuli during functional magnetic resonance imaging (fMRI). Using such dynamic avatars enabled us to examine biological motion across several emotions, intensities, and speeds, and therefore to test a number of hypotheses. First, we hypothesized that body motion plays a significant role in emotion recognition and should engage a wide range of cortical networks, such as visual and emotional brain networks. To this end, we measured brain responses to emotional vs. neutral dynamic gaits (experiment 1). Second, we hypothesized that dynamic motion, more than static postures, enhances emotional selectivity. We examined brain selectivity to static emotional body expressions by measuring brain responses to emotional vs. neutral static avatars (experiment 2). We further hypothesized that body motion is a reliable indicator not only for identifying distinct emotions but also for evaluating emotional intensity. Thus, we aimed to examine emotional perception from gait (both behaviorally and neuronally) along the emotional-intensity axis, using avatars at several levels of emotional intensity (experiment 3). Finally, we hypothesized that emotion-related brain activations reflect the subjective perception of the observers rather than the objective, physical attributes of the stimuli. In most previous studies these two aspects were closely linked, rendering them difficult to disentangle. In the present study, some of the avatar stimuli were emotionally ambiguous, in the sense that the same physical stimulus produced different subjective emotional reports, allowing us to dissociate the physical from the subjective aspects.
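The logic of that dissociation can be made concrete with a toy example: hold the physical stimulus fixed and split trials by the subject's report. The data and names below are hypothetical illustrations, not the study's data or analysis code.

```python
import pandas as pd

# Hypothetical per-trial records for one ambiguous (mid-morph) avatar: the
# physical stimulus is identical on every trial; only the subjective report
# differs, so any response difference must reflect perception, not the input.
trials = pd.DataFrame({
    "stimulus_id": ["morph_50"] * 6,
    "report":      ["emotional", "neutral", "emotional",
                    "neutral", "emotional", "neutral"],
    "amg_beta":    [0.82, 0.31, 0.76, 0.40, 0.91, 0.28],  # illustrative values
})

# Mean amygdala response split by subjective report, per identical stimulus.
by_report = trials.groupby(["stimulus_id", "report"])["amg_beta"].mean().unstack()
print(by_report["emotional"] - by_report["neutral"])  # positive -> perception-driven
```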

Section snippets

Subjects

Nineteen healthy right-handed subjects (ages 28 ± 4.5 years; 11 females) participated in the experiments. Most of the subjects participated in more than one experiment (see Table 1). Seventeen subjects (10 females) participated in experiment 1, 10 subjects (5 females) in experiment 2, and 14 subjects (9 females) in experiment 3. In addition, 13 subjects (6 females) participated in an external visual-localizer task. Subjects gave written informed consent and were paid for their participation. All…

Results

In the present study we conducted three experiments examining various parameters of neuronal emotional processing of dynamic and static avatar images. Fig. 1 illustrates the different experiments and the type of images used. Experiment 1 (dynamic gait) aimed to distinguish between the neuronal correlates of emotional and neutral body expressions (gait). Experiment 2 (static posture) aimed to distinguish between the neuronal correlates of static and dynamic body expressions (gait). Experiment 3 (…

Neuronal networks showing selective gait-driven emotional responses

We present here a novel paradigm for assessing human brain processing during observation of emotional body movements performed by avatars. Preferential emotion-related responses were expressed mainly in visual, emotion-, and motion-related brain networks: enhanced activity in response to emotional gait was shown in the extrastriate body area (EBA) and fusiform body area (FBA), high-order visual areas related to body representation and the decoding of emotion from body movements (Downing et al., 2007, …

Conclusion

Using dynamic computer-generated avatars we were able to show that emotion perception from body movements engages a wide range of brain networks: visual, social, and emotion-related. The highly controlled avatar stimuli allowed us to demonstrate a parametric modulation of amygdala activity with emotional intensity, and to disentangle effects of speed and motion from emotional processing of body motion. Notably, we were able to uncover a subjective component to these amygdala responses by…

Acknowledgments

We thank Nahum Stern, Fanny Atar and Dr. Edna Haran-Furman for their assistance in the imaging setup and fMRI data collection. We thank Dr. Aya Ben Yakov and Dr. Yuval Hart for fruitful discussions.

This study was funded by the following grants: EU FP7 TANGO ICT-2007.8.0/249858 (H.G., A.C., T.F., M.A.G., R.M.), EU FP7 VERE FET/257695 (T.F., R.M.), EU FP7 Koroibot FP7-ICT-2013-10/611909 (T.F., M.A.G.), EU FP7 HBP Flagship FP7-ICT-604102 (M.A.G., R.M.), ICORE ISF (T.F., R.M.), the Boehringer…

References (74)

  • N. Hadjikhani et al.

    Seeing fearful body expressions activates the fusiform cortex and amygdala

    Curr. Biol.

    (2003)
  • J.V. Haxby et al.

    The distributed human neural system for face perception

    Trends Cogn. Sci.

    (2000)
  • M. Kret et al.

    Similarities and differences in perceiving threat from dynamic faces and bodies. An fMRI study

    Neuroimage

    (2011)
  • M. Mishkin et al.

    Object vision and spatial vision: two cortical pathways

    Trends Neurosci.

    (1983)
  • L. Pessoa et al.

    Fate of unattended fearful faces in the amygdala is determined by both attentional resources and cognitive modulation

    Neuroimage

    (2005)
  • A. Puce et al.

    Neural responses elicited to face motion and vocalization pairings

    Neuropsychologia

    (2007)
  • B. Rossion et al.

    Early lateralization and orientation tuning for face, word, and object processing in the visual cortex

    Neuroimage

    (2003)
  • S. Schneider et al.

    Show me how you walk and I tell you how you feel—a functional near-infrared spectroscopy study on emotion perception based on human gait

    Neuroimage

    (2014)
  • L. Shmuelof et al.

    Dissociation between ventral and dorsal fMRI activation during object and action recognition

    Neuron

    (2005)
  • C.B. Sinke et al.

    Tease or threat? Judging social interactions from bodily expressions

    Neuroimage

    (2010)
  • L.G. Ungerleider et al.

    'What' and 'where' in the human brain

    Curr. Opin. Neurobiol.

    (1994)
  • P. Vuilleumier et al.

    Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging

    Neuropsychologia

    (2007)
  • P. Vuilleumier et al.

    Effects of attention and emotion on face processing in the human brain: an event-related fMRI study

    Neuron

    (2001)
  • K.S. Weiner et al.

    Not one extrastriate body area: using anatomical landmarks, hMT+, and visual field maps to parcellate limb-selective activations in human lateral occipitotemporal cortex

    Neuroimage

    (2011)
  • J.S. Winston et al.

    Brain systems for assessing facial attractiveness

    Neuropsychologia

    (2007)
  • D. Yellin et al.

    Coupling between pupil fluctuations and resting-state fMRI uncovers a slow build-up of antagonistic responses in the human cortex

    Neuroimage

    (2015)
  • R. Adolphs et al.

    Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala

    Nature

    (1994)
  • K. Alaerts et al.

    Action and emotion recognition from point light displays: an investigation of gender differences

    PLoS One

    (2011)
  • A.P. Atkinson et al.

    Emotion perception from dynamic and static body expressions in point-light and full-light displays

    Perception

    (2004)
  • H. Aviezer et al.

    Body cues, not facial expressions, discriminate between intense positive and negative emotions

    Science

    (2012)
  • A. Barliya et al.

    Expression of emotion in the kinematics of locomotion

    Exp. Brain Res.

    (2013)
  • M. Candidi et al.

    Event-related repetitive transcranial magnetic stimulation of posterior superior temporal sulcus improves the detection of threatening postural changes in human bodies

    J. Neurosci.

    (2011)
  • T.J. Clarke et al.

    The perception of emotion from body movement in point-light displays of interpersonal dialogue

    Perception

    (2005)
  • M. Corbetta et al.

    Control of goal-directed and stimulus-driven attention in the brain

    Nat. Rev. Neurosci.

    (2002)
  • I. Davidesco et al.

    Exemplar selectivity reflects perceptual similarities in the human fusiform cortex

    Cereb. Cortex

    (2013)
  • M. Davis et al.

    The amygdala: vigilance and emotion

    Mol. Psychiatry

    (2001)
  • B. De Gelder

    Towards the neurobiology of emotional body language

    Nat. Rev. Neurosci.

    (2006)
1 These authors contributed equally to this work.
