Research Article: New Research, Cognition and Behavior

Electrophysiological Responses to Rapidly-Presented Affective Stimuli Predict Individual Differences in Subsequent Attention

Ha Neul Song, Sewon Oh and Sang Ah Lee
eNeuro 3 December 2021, 9 (1) ENEURO.0285-21.2021; DOI: https://doi.org/10.1523/ENEURO.0285-21.2021
Ha Neul Song (1,2), Sewon Oh (3), and Sang Ah Lee (1,2)

1. Department of Brain and Cognitive Sciences, Seoul National University, Seoul 08826, Republic of Korea
2. Department of Bio and Brain Engineering, Korea Advanced Institute of Science and Technology, Daejeon 34141, Republic of Korea
3. Department of Psychology, University of South Carolina, Columbia, SC 29208

Abstract

We are constantly surrounded by a dynamically changing perceptual landscape that can influence our behavior even without our full conscious awareness. Emotional processing can have effects on subsequent attention, but there are mixed findings on whether it induces attentional enhancement or interference. The present study used a new multimodal approach to explain and predict such attentional effects based on individual differences in response to emotional stimuli. We briefly presented affective pictures (neutral, positive, erotic, mutilation, and horror categories) for 80 ms, immediately followed by a cued flanker task that was unrelated to the pictures. Event-related potentials (ERPs), skin conductance response (SCR), and reaction time (RT) were measured for each participant. We found that, in general, affective pictures induced higher electrophysiological responses compared with neutral pictures [P300 and late positive potential (LPP) in the erotic condition; P300, LPP, and SCR in the horror condition]. In particular, individuals who showed a strong ERP response to the pictures were impeded in the erotic condition (only P300) and facilitated in the horror condition (both P300 and LPP). Those who did not show a significant ERP or SCR response to the pictures were facilitated in the erotic condition and impeded in the horror condition. Furthermore, it was possible to classify the direction of the attentional effect from the participants’ P300, LPP, and SCR responses. These results demonstrate that underlying individual differences in emotional processing must be considered in understanding and predicting the effects of emotions on attention and cognition.

  • attention
  • EEG
  • emotion
  • individual differences
  • skin conductance response

Significance Statement

Automatic influence of emotions on subsequent attention may be adaptive for fast behavioral response to environmental stimuli. The majority of past studies have claimed that pleasant emotions facilitate subsequent attention and that unpleasant emotions impede it. However, several studies directly contradicted such findings by reporting opposite effects, with pleasant pictures impeding attention and unpleasant pictures facilitating it. Our results resolve this discrepancy in the existing literature by showing that depending on how weakly or strongly someone responds to emotional stimuli (erotic and horror categories), they may be either facilitated or distracted in their subsequent attention. Furthermore, we were able to accurately classify the direction of this attentional effect using their event-related potential (ERP) and skin conductance response (SCR) to the pictures.

Introduction

Recently, research in brain and cognitive sciences has started to interface closely with applications for improving cognition and mental health. One way in which such tools are used is for personal emotion monitoring and regulation. Most current technology, however, requires that users explicitly recognize their internal states (e.g., through self-report). Yet, in everyday life, people are constantly bombarded with rapidly changing perceptual stimuli that may trigger brain processes which can influence them even while they are engaged in other tasks (Halgren, 1992; Compton, 2003; Vuilleumier, 2005; Vuilleumier and Driver, 2007; Bradley, 2009; Moser et al., 2010; Pourtois et al., 2013; LeBlanc et al., 2015). For example, after passing an animal on the side of the road, a driver may become momentarily susceptible to missing a turn or getting into an accident without being fully aware of what they saw. Subsequent attentional effects induced by emotional stimuli can vary depending on the individual; in the situation described above, some people may become more alert while others get distracted after passing the scene. Although individual differences in emotional processing have been studied extensively, there is a lack of understanding of how such differences influence attention (Gohm and Clore, 2000; Hamann and Canli, 2004; Mardaga et al., 2006; Zhang and Zhou, 2014; Matusz et al., 2015).

Emotional processing consists of detecting and responding to (e.g., via arousal and regulation) emotionally significant perceptual stimuli and can have multiple pathways by which it affects subsequent attention (Pourtois et al., 2013). Because such processes can happen quickly, their quantitative measurement requires high temporal resolution. Event-related potentials (ERPs) in response to emotional stimuli provide simple and fast markers of cortical activity (Hajcak et al., 2013b) that can be easily acquired using a variety of EEG systems. According to previous studies, P300 (a positive potential occurring approximately 300 ms after stimulus onset) amplitude correlates with perceived emotional significance and late positive potential (LPP) amplitude with emotion regulation (Johnston et al., 1986; Cuthbert et al., 2000; Foti and Hajcak, 2008; Hajcak et al., 2010; Hajcak and Foti, 2020). In addition, skin conductance response (SCR), which indicates activity of the sympathetic nervous system and is associated with hypothalamic arousal, has a slower progression and is longer-lasting compared with ERPs (Critchley et al., 2000; Cuthbert et al., 2000). As different aspects of emotional processing are reflected in each physiological marker, a multimodal approach using ERP and SCR may enhance our ability to explain and predict the cognitive effects of emotional processing at the individual level.

Given that fast processing of emotions is an adaptive mechanism for subsequent behavioral responses, it seems reasonable for even quickly presented emotional stimuli to modulate attention; however, there have been mixed findings on the direction of such effects (Schmeichel, 2007; Bocanegra and Zeelenberg, 2009, 2011; Ortner et al., 2013). Furthermore, while electrophysiological correlates of attention to emotional stimuli themselves have been well-documented (e.g., N2, EPN, LPP), their relevance to attention on an unrelated task has not yet been characterized extensively (Krolak-Salmon et al., 2001; Pourtois et al., 2004; Sabatinelli et al., 2007; Olofsson et al., 2008; Wiens et al., 2011; Hajcak et al., 2013a). Some studies reported that pleasant emotional stimuli facilitate subsequent attention and that unpleasant stimuli impede it (Eastwood et al., 2003; Wadlinger and Isaacowitz, 2006; Friedman and Förster, 2010; LeBlanc et al., 2015). However, others have yielded contrary results. In one study, images of fearful faces enhanced, rather than decreased, performance in a perceptual attention task (Phelps et al., 2006). Another study reported that briefly-presented sexual stimuli decreased performance in a dot detection task; interestingly, the magnitude of this effect was correlated with self-reports of eroticism (Prause et al., 2008).

One overlooked factor is that individual differences may explain not only the magnitude of such effects but also their direction (facilitation vs impediment). Because the same emotional stimulus can elicit varied responses across individuals according to their personal characteristics or experiences, the current study investigated individual differences in the interaction between emotional processing and attention and hypothesized that people whose attention is facilitated by affective pictures would show physiological responses dissociable from those of people who are impeded by them. Through this investigation, we aimed not only to provide insight into the mechanisms underlying the interaction between emotion and cognition but also to improve the personalization of neurotechnology and its real-world applicability.

To simulate situations in which attention is automatically influenced by rapid emotional processing, we briefly presented participants with affective pictures before each trial of a cued flanker task (Fan et al., 2002, 2005). Neutral, positive, erotic, mutilation, and horror picture stimuli were used to elicit a variety of potentially emotion-dependent effects. To explore individual differences, we divided people into two groups based on whether they were facilitated or impeded by certain picture categories and compared their ERP and SCR measures. Finally, we tested whether these physiological markers can accurately classify and predict attentional effects at the individual level.

Materials and Methods

Participants

Participants were thirty-one university students (19 males, mean age 24.77, SD = 3.74) recruited from the Daejeon area. All participants were right-handed and had normal or corrected vision. Data from all participants were included in the group analysis involving SCR and reaction time (RT). Data from five participants were excluded from the analysis of EEG data due to a failure to acquire usable data (disrupted connection or interrupted testing session), resulting in a final sample size of 26 (14 males). All participants’ anxiety and depression scores were measured via Beck Anxiety Inventory (BAI) and Beck Depression Inventory-II (BDI-II); no participants were found to have severe anxiety or depression (Beck et al., 1988, 1996).

Supporting data were collected from three separate independent samples: picture stimuli valence/arousal rating (n = 10, mean age = 23.20, SD = 2.44), picture awareness and memory test (n = 17, mean age = 22.35, SD = 4.27), and a partial replication of the findings using a 32-channel wired EEG system (eight males, mean age = 28.35, SD = 4.19).

Materials and procedures

We aimed to induce rapid emotional processing via brief presentations of visual scenes immediately followed by a trial of a cued flanker task [attention network task (ANT); Fan et al., 2002, 2005]. On each trial, the affective picture was presented for 80 ms (see Fig. 1A), followed by a fixation period of random duration between 900 and 1300 ms. For cued trials, an asterisk appeared for 100 ms (either above, center, or below the fixation point) and, after 400 ms of fixation, the target was presented. The ANT, designed to engage multiple attentional mechanisms, employed a center asterisk (center cue) to give participants temporal information about the target presentation; the placement of the asterisk above or below the center fixation point (spatial cue) additionally provided information about where the target would appear. The target was the center arrow of a row of five arrows; on congruent trials, the flanker arrows were consistent with the direction of the target arrow, and on incongruent trials, they pointed in the opposite direction. Participants were asked to indicate the direction of the target arrow as quickly as possible; if they did not respond within 1700 ms, the fixation period for the next trial started automatically. RT on each trial was recorded and log transformed to reduce skew in each participant's RT distribution. After the participants made a response, the arrows disappeared and a fixation period followed.
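The log transform of RT compresses the long right tail typical of RT distributions. A minimal sketch in Python (illustrative only; the function name and data are hypothetical, not the authors' analysis pipeline, which used MATLAB):

```python
import math

def log_transform_rts(rts_ms):
    """Natural-log transform RTs (in ms) to reduce right skew."""
    return [math.log(rt) for rt in rts_ms]

# Hypothetical single-block RTs (ms); one slow outlier dominates the raw scale
rts = [450.0, 480.0, 500.0, 520.0, 1400.0]
logged = log_transform_rts(rts)
```

On the log scale, the outlier trial contributes far less to the block mean than it would on the raw millisecond scale.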

Figure 1.

A, Task sequence. A picture (neutral, positive, erotic, mutilation, or horror) was presented for 80 ms to induce emotional processing before each trial of the cued ANT. When a row of five arrows appeared, participants were asked to indicate the direction of the center arrow (target) as quickly as possible. RT was measured. B, Examples of horror pictures. A total of 48 horror pictures from a commercially usable free web source were used (other picture categories were taken from the IAPS database). C, D, Validation of valence (C) and arousal (D) ratings for the non-IAPS horror pictures. Ten subjects were separately recruited to rate the stimulus set on their valence and arousal. Valence ratings for pictures in the horror category were lower than the ratings for neutral, positive, and erotic pictures but not different from mutilation pictures (C). Arousal ratings for horror pictures were higher than the ratings for neutral pictures but not different from mutilation pictures (D). Black asterisks indicate corrected ps < 0.05 for nonparametric paired tests.

Before the start of the test session, participants were given 20 practice trials to familiarize themselves with the task flow. The main task consisted of 10 blocks of 24 trials each. Two-minute breaks were given between the blocks. Each emotion condition (neutral, positive, erotic, mutilation, and horror) was tested across two blocks, once in the first half of the session and once in the second half. The order of the blocks within each half was randomized, with the restriction that the same emotion condition block did not appear in succession (i.e., the fifth and sixth blocks were not the same). During the task, EEG, SCR (index and middle fingers of the right hand in the sensor hardware), and RT (button press with the left hand) were recorded simultaneously.

With the exception of the pictures in the horror condition, all pictures were selected from the International Affective Picture System (IAPS), a commonly used image database for emotion research containing the standardized valence score and situational category of each picture (Lang et al., 2008). For the neutral condition, pictures in the median 20% of IAPS valence scores were selected, excluding those containing images of people, weapons, cigarettes, and food, to avoid socially biased effects. From pictures in the top 20% of IAPS valence scores, those of intimately engaged heterosexual couples were selected for the erotic condition, and those excluding sexual content were selected for the positive condition. The mutilation condition consisted of images of bodily damage/harm selected based on the IAPS picture descriptions. Since the fear-inducing pictures included in the IAPS database were inadequate to be categorized as "horror," the horror condition pictures were selected from a commercially usable free web source. The horror pictures' comparability to other conditions was confirmed before the main task (Fig. 1B). A separate group of 10 participants rated the valence and arousal of all of the pictures in our stimulus set, after each picture was presented for 3 s on a computer monitor in front of them. Altogether, 240 pictures were used, 48 from each emotion condition.

SCR data acquisition

To detect the release of sweat due to a change in the arousal state (Montagu and Coles, 1966), SCR (galvanic skin response) was measured using the Gazepoint biometrics package and software, with a constant voltage coupler (5 V) and a 60-Hz sampling rate. Participants put their right index and middle fingers into the biometric hardware and were instructed to pull out their fingers between task blocks to prevent the physiological response from saturating. To calculate SCR for each picture, a high-pass FIR filter of 0.05 Hz was applied (MATLAB) to the entire time series; then, the maximum change in SCR was extracted from the baseline (average over the 500-ms fixation period preceding picture onset) to the test trial (from picture onset to 500 ms before the next trial). The data were log transformed to minimize skewness and averaged for each block (for multimodal classification) and each emotion condition (for the remainder of the analysis).
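The baseline-to-peak extraction described above can be sketched as follows. This is an illustrative Python reading of the procedure, not the authors' MATLAB code: the 0.05-Hz FIR filtering step is omitted for brevity, and "maximum change" is taken here as the largest absolute deviation from baseline (an assumption; the text does not specify signed vs absolute change).

```python
import math

def scr_amplitude(signal, onset_idx, fs=60, baseline_ms=500, window_end=None):
    """Maximum SCR change from the pre-picture baseline.

    Baseline: mean over the 500-ms fixation preceding picture onset.
    Window: picture onset to 500 ms before the next trial (window_end).
    "Change" is the largest absolute deviation (an assumption, see lead-in).
    """
    n_base = int(fs * baseline_ms / 1000)      # 500 ms at 60 Hz -> 30 samples
    base = sum(signal[onset_idx - n_base:onset_idx]) / n_base
    window = signal[onset_idx:window_end]
    return max(abs(s - base) for s in window)

# Toy trace: flat baseline at 1.0, then a rise of 1.0 after picture onset
trace = [1.0] * 30 + [1.0, 1.3, 1.8, 2.0, 1.6]
amp = scr_amplitude(trace, onset_idx=30)
log_amp = math.log(amp)                         # log transform, as in the text
```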

EEG data acquisition and processing

Participants' EEG signal was recorded using the gel-type 32-channel wireless Emotiv EPOC Flex, which adheres to the 10–20 system, a standard method for electrode placement. The data were preprocessed through average re-referencing and bandpass filtering between 0.1 and 30 Hz using EEGLAB on MATLAB (Delorme and Makeig, 2004). Based on the picture presentation at 0 ms, ERP epochs were selected from –100 to 1000 ms. Baseline (from –100 to 0 ms) correction was applied in each epoch. Epochs containing ocular artifacts (identified through Infomax ICA) or signals with an absolute value higher than 100 μV were omitted from the analysis (Delorme et al., 2007). Three channels (Fz, Cz, and Pz) were selected for ERP component analysis (Stormark et al., 1995; Cuthbert et al., 1998; Codispoti et al., 2006; Yen et al., 2010). P300 and LPP amplitudes were calculated using the mean voltage between 250 and 350 ms and between 500 and 800 ms, respectively; these time points were chosen based on previous literature (Lu et al., 2011; Zhang and Zhou, 2014; Zhao et al., 2018; Maffei et al., 2021) and our study design, in which the ANT began at least 900 ms after picture onset. To test for the effects of emotional stimuli, the three types of responses (RT, SCR, ERP) in the four emotion conditions (positive, erotic, mutilation, and horror) were compared with those in the neutral condition (Schupp et al., 2003, 2006a,b). Bonferroni correction was applied to the p value of E_modality,emotion based on the number of multiple comparisons following the repeated-measures ANOVA. The Greenhouse-Geisser correction was applied for violations of sphericity (adjusted degrees of freedom provided). The difference score was defined as

E_modality,emotion = R_modality,emotion − R_modality,neutral,

where:

E: Effect of affective stimuli compared with the neutral condition;

R: response value in each modality and condition;

modality: RT, SCR, or ERP;

emotion: positive, erotic, mutilation, or horror.
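With these definitions, the difference score is a single subtraction per modality and condition. A minimal sketch (the values are hypothetical; a positive E_RT,emotion means slower RT, i.e., impeded attention, matching the grouping used below):

```python
import math

def emotion_effect(responses, modality, emotion):
    """E_modality,emotion = R_modality,emotion - R_modality,neutral."""
    return responses[modality][emotion] - responses[modality]["neutral"]

# Hypothetical log-RT means for one participant
responses = {"RT": {"neutral": 6.20, "erotic": 6.25, "horror": 6.15}}
e_erotic = emotion_effect(responses, "RT", "erotic")   # > 0: slower, impeded
e_horror = emotion_effect(responses, "RT", "horror")   # < 0: faster, facilitated
```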

Prediction of facilitation versus impediment of attention

In the conditions that resulted in significant emotional effects on SCR and ERP, participants were divided into two groups based on whether they were facilitated (E_RT,emotion < 0) or impeded (E_RT,emotion > 0). A support vector machine (SVM) was used to classify the subjects, based on their ERP (unimodal) or both ERP and SCR (multimodal), to predict whether RT in the emotion condition was faster or slower than that in the neutral condition (Noble, 2006). Block-averaged values were used for each variable. In each condition, prediction accuracy and the area under the receiver operating characteristic (ROC) curve (AUC) were calculated using an SVM with 10-fold cross-validation.
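The AUC used to evaluate the classifier can be computed without tracing the ROC curve, via the equivalent Mann-Whitney formulation: AUC equals the probability that a randomly chosen positive-class decision score exceeds a randomly chosen negative-class score, with ties counted as half. A self-contained sketch (the scores are hypothetical, not the study's SVM outputs):

```python
def roc_auc(pos_scores, neg_scores):
    """AUC as P(pos > neg) + 0.5 * P(pos == neg) over all pairs."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical decision scores: facilitated (positive class) vs impeded
auc_perfect = roc_auc([0.9, 0.8, 0.7], [0.1, 0.2])  # full separation
auc_chance = roc_auc([0.5, 0.5], [0.5, 0.5])        # indistinguishable classes
```

An AUC of 0.5 corresponds to chance-level discrimination, which is why the 0.7–0.8 values reported below indicate above-chance prediction.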

Results

Behavioral performance

Before the main experiment, we compared the valence and arousal ratings of the picture stimuli across all conditions; this was particularly relevant with respect to the horror condition, which was not a part of the IAPS. Nonparametric comparisons (Friedman's test) revealed a main effect of the emotion condition for both valence ratings (χ² = 38.000, p < 0.001; Fig. 1C) and arousal ratings (χ² = 29.760, p < 0.001; Fig. 1D). As expected, post hoc Wilcoxon signed-rank tests showed that horror pictures were rated significantly lower in valence than neutral, positive, and erotic pictures but not differently from mutilation pictures. For arousal ratings, the horror condition was significantly higher than the neutral condition but not different from other emotion conditions (Table 1).

Table 1

Statistical table 1

A three-way repeated measures ANOVA including the emotion condition (neutral, positive, erotic, mutilation, and horror), cue condition (spatial, center, and no cue), and target condition (congruent and incongruent) was conducted to analyze their effects on RT and to confirm that the cue and target in our modified ANT worked properly. There were main effects of the cue and target conditions on RT [F(1.601,48.025) = 49.605, p < 0.001, ηp² = 0.213 (Fig. 2A); F(1,30) = 8.109, p = 0.008, ηp² = 0.623 (Fig. 2B)]. Post hoc pairwise t tests showed that RT after the spatial cue was faster than that after both the center cue (t(30) = −5.871, pcorrected < 0.001, d = −1.054) and no cue (t(30) = −8.659, pcorrected < 0.001, d = −1.555). RT following the center cue was faster than that with no cue (t(30) = −4.517, pcorrected < 0.001, d = 0.811). RT for the congruent target was also faster than that for the incongruent target (t(30) = −2.826, pcorrected = 0.008, d = −0.508). There was a main effect of emotion but no significant results in the post hoc pairwise comparisons (F(4,120) = 2.839, p = 0.027, ηp² = 0.086). These results showed that participants correctly performed the ANT using the cue information and the congruency of the arrows.

Figure 2.

A, Behavioral performance across cue types. RT after the spatial cue was faster than that after the center cue; both were faster than having no cue at all. B, Behavioral performance across target types. RT for the congruent target was faster than RT for the incongruent target. C, SCR difference scores E_SCR,emotion across emotion conditions. The dotted line indicates SCR in the neutral condition. SCR in the horror condition was higher than in the neutral condition. D, ERP across emotion conditions after picture presentation in channels Fz, Cz, and Pz. Dotted and colored lines indicate ERPs, with the picture presented at time = 0 ms. P300 and LPP amplitudes were averaged between 250 and 350 ms and between 500 and 800 ms, respectively. E, P300 difference scores E_P300,emotion across emotion conditions. From top to bottom, graphs show E_P300,emotion in channels Fz, Cz, and Pz. Dotted lines indicate P300 in the neutral condition. P300 amplitudes in the horror condition (channel Fz) and erotic condition (channel Cz) were higher than in the neutral condition. F, LPP difference scores E_LPP,emotion across emotion conditions. From top to bottom, graphs show E_LPP,emotion in channels Fz, Cz, and Pz. Dotted lines indicate LPP in the neutral condition. LPP amplitudes in the erotic condition (channels Cz and Pz) and horror condition (channel Pz) were higher than in the neutral condition. Black asterisks indicate corrected ps < 0.05 for paired t tests or one-sample t tests.

To test whether RT in each of the positive, erotic, mutilation, and horror conditions was different from that in the neutral condition, a one-sample t test against 0 was conducted on the RT difference score E_RT,emotion in each emotion category. There were no significant differences (Table 2).
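The one-sample t tests against 0 used in this and the following sections follow the standard formula t = (mean − μ0) / (SD / √n) with df = n − 1. A minimal sketch (illustrative; not the authors' statistics software, and the input values are hypothetical):

```python
import math
import statistics

def one_sample_t(xs, mu0=0.0):
    """t = (mean - mu0) / (sd / sqrt(n)), with df = n - 1."""
    n = len(xs)
    sd = statistics.stdev(xs)            # sample SD (n - 1 in the denominator)
    return (statistics.mean(xs) - mu0) / (sd / math.sqrt(n))

# Hypothetical E_RT difference scores for three participants
t_stat = one_sample_t([1.0, 2.0, 3.0])   # mean 2, SD 1 -> t = 2 * sqrt(3)
```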

Table 2

Statistical table 2

SCR

To test whether SCR in each of the positive, erotic, mutilation, and horror conditions was different from that in the neutral condition, a one-sample t test against 0 was conducted on the SCR difference score E_SCR,emotion in each emotion category. SCR in the horror condition was higher than that in the neutral condition (t(30) = 2.675, pcorrected = 0.048, d = 0.481; Fig. 2C). Furthermore, a one-way repeated measures ANOVA revealed a significant effect of the emotion condition on SCR (F(3,90) = 4.955, p = 0.003, ηp² = 0.142), with post hoc pairwise t tests showing that E_SCR,horror was higher than E_SCR,positive and E_SCR,erotic (Table 2). These results indicate that physiological arousal was significantly elicited in the horror condition.

ERP response

To test whether P300 amplitude in each of the positive, erotic, mutilation, and horror conditions was different from that in the neutral condition, a one-sample t test against 0 was conducted on the P300 difference score E_P300,emotion in each emotion condition and in each EEG channel (Fig. 2E). In channel Fz, P300 amplitude in the horror condition was higher than that in the neutral condition (t(25) = 3.387, pcorrected = 0.009, d = 0.664). In channel Cz, P300 amplitude in the erotic condition was higher than that in the neutral condition (t(25) = 2.923, pcorrected = 0.029, d = 0.573). To differentiate emotional effects on P300, a two-way repeated measures ANOVA including the emotion condition (positive, erotic, mutilation, and horror) and channel (Fz, Cz, and Pz) was performed. There was a main effect of emotion condition (F(3,75) = 9.065, p < 0.001, ηp² = 0.266). Since there was no main effect of channel, the P300 amplitudes in channels Fz, Cz, and Pz were averaged for a post hoc pairwise t test. E_P300,erotic was higher than E_P300,positive, and both E_P300,erotic and E_P300,horror were higher than E_P300,mutilation (Table 2). Increased P300 amplitudes for only the erotic and horror pictures may indicate that participants processed the emotional significance of these particular visual stimuli even after a brief presentation.

For LPP amplitudes, a one-sample t test against 0 was conducted on the LPP difference score E_LPP,emotion in each emotion condition and channel (Fig. 2F). LPP in both the erotic and horror conditions was higher than that in the neutral condition in channel Pz (t(25) = 6.708, pcorrected < 0.001, d = 1.316; t(25) = 3.258, pcorrected = 0.010, d = 0.639), while only the erotic condition was higher than the neutral condition in channel Cz (t(25) = 5.924, pcorrected < 0.001, d = 1.162). To differentiate emotional effects on LPP amplitude, a two-way repeated measures ANOVA including the emotion condition (positive, erotic, mutilation, and horror) and channel (Fz, Cz, and Pz) was performed. There were main effects of both channel and emotion condition, with an interaction between them (F(1.164,27.937) = 7.019, p = 0.010, ηp² = 0.226; F(3,72) = 5.598, p = 0.002, ηp² = 0.189; F(2.753,63.070) = 3.998, p = 0.013, ηp² = 0.143). In post hoc pairwise t tests, in both channels Cz and Pz, E_LPP,erotic and E_LPP,horror were higher than E_LPP,positive (Table 2). Increased LPP amplitude after seeing the erotic and horror pictures may reflect emotional arousal and regulation in these conditions.

Facilitated or impeded attention and its prediction

Based on RT relative to the neutral condition (E_RT,emotion = 0), participants were divided into two groups (facilitated vs impeded) in the erotic and horror conditions (Fig. 3A). Out of 26 participants, 13 were facilitated and the remaining 13 were disrupted in each condition (Fig. 3B). A nonparametric one-sample test (Wilcoxon signed-rank test) against 0 was conducted on E_P300,emotion and E_LPP,emotion. In the erotic condition (Fig. 3C), compared with the neutral condition, only participants showing impeded attention had higher P300 amplitude in response to the affective pictures in channels Fz and Cz (Z = 2.551, pcorrected = 0.022, r = 0.708; Z = 3.180, p = 0.003, r = 0.882). On the other hand, LPP amplitude increased in participants showing both facilitated and impeded attention in channels Cz and Pz (Z = 2.551, pcorrected = 0.011, r = 0.708; Z = 3.110, pcorrected = 0.003, r = 0.863; Z = 3.110, pcorrected = 0.003, r = 0.863; Z = 3.040, pcorrected = 0.003, r = 0.843). In the horror condition (Fig. 3D), compared with the neutral condition, only participants whose attention was facilitated showed higher P300 amplitude in channel Fz (Z = 2.341, pcorrected = 0.038, r = 0.649) and higher LPP amplitude in channel Pz (Z = 2.271, pcorrected = 0.046, r = 0.630). There were no differences in SCR, anxiety and depression scores, or gender between the facilitated and impeded groups in either the erotic or horror condition. To sum up, distraction in the erotic condition and facilitation in the horror condition showed distinct ERP profiles.

Figure 3.

A, Individual differences in the direction of emotional effects on RT in the erotic and horror conditions. The dotted line indicates RT in the neutral condition. Each circle signifies an individual RT difference score E_RT,emotion in the erotic and horror conditions. B, Facilitated or impeded group placement. Based on individual emotional effects on RT, the 26 participants were divided into facilitated (E_RT,emotion < 0) and impeded (E_RT,emotion > 0) groups in each of the erotic and horror conditions. Individual group distribution is visualized using different shading. C, D, Topographical maps of ERP difference scores in the erotic condition (C) and horror condition (D). Black asterisks indicate that ERP amplitude in the erotic or horror condition was higher than in the neutral condition. The top two topographic maps show E_P300,erotic (C) and E_P300,horror (D), and the bottom maps show E_LPP,erotic (C) and E_LPP,horror (D). Topographic maps on the left side of each pair are from people whose attention was facilitated, and maps on the right side of each pair are from those whose attention was impeded.

An identical experiment was conducted with an independent sample of 15 participants using a 32-channel EEG system (Neuroscan Grael and Curry 8 EEG software) for a partial replication of the original results. In the erotic condition, nine subjects were facilitated and six were disrupted (compared with the neutral condition); in the horror condition, eight were facilitated and seven were disrupted. Again, in the erotic condition, compared with the neutral condition, only participants showing impeded attention had higher P300 and LPP amplitude in channel Cz (Z = 1.992, p = 0.046, r = 0.813; Z = 2.201, p = 0.028, r = 0.899), while in the horror condition, participants whose attention was facilitated showed a tendency toward higher P300 and LPP amplitude (Z = 1.820, p = 0.069, r = 0.644; Z = 1.820, p = 0.069, r = 0.644). Although slightly underpowered, these results suggest that our finding of “cognotypes” in the emotion-attention interaction is generalizable and replicable.

Given the lack of a difference across channels in the repeated measures ANOVA and post hoc pairwise t tests, E_P300,emotion in channels Fz, Cz, and Pz and E_LPP,emotion in channels Cz and Pz were averaged. P300 and LPP were used as features for unimodal classification, and P300, LPP, and SCR were used as features for multimodal classification. In the unimodal classification based only on ERP measures, SVM accuracies for predicting whether attention would be facilitated or impeded were 53.0% (positive), 66.0% (erotic), 51.0% (mutilation), and 65.5% (horror). Mean 10-fold AUC values were 0.63 (positive), 0.73 (erotic), 0.48 (mutilation), and 0.76 (horror). In the multimodal classification based on ERP and SCR, SVM accuracies for classifying whether attention would be facilitated or impeded were 56.17% (positive), 70.50% (erotic), 51.00% (mutilation), and 73.50% (horror). Mean 10-fold AUC values were 0.68 (positive), 0.74 (erotic), 0.46 (mutilation), and 0.81 (horror). The prediction accuracies for the direction of attentional effects in the erotic and horror conditions were significantly above the 50% chance level (t(9) = 3.706, p = 0.005; t(9) = 4.045, p = 0.003). Moreover, in a comparison of the unimodal and multimodal classifications, overall accuracy and AUC increased when the SVM was performed with both ERP and SCR (Table 3).

Table 3

Unimodal and multimodal classification accuracy and AUC for each emotional condition

Picture awareness ratings and recognition test

A supplementary experiment was conducted to examine how participants processed the affective pictures in the present study. As in the original experiment, an affective picture was presented for 80 ms and followed immediately by a trial of the attention task (Fig. 4A). However, after each trial, subjects answered several questions on the level of detail with which they perceived the picture and how much emotion it elicited. Five categories of pictures (neutral, positive, erotic, mutilation, and horror) were used, each consisting of 12 pictures from our original stimulus set. We found that participants were able to report the general gist of the pictures and their subjective feeling of arousal, but were not able to recall them in detail, regardless of picture type (Fig. 4B). After performing all 60 trials, they were also asked how they thought their RT was influenced by the affective pictures (facilitated vs impeded); only four participants in the erotic condition and eight in the horror condition answered correctly (Fig. 4C). In the second part of the experiment, half of the previously presented pictures and novel lure pictures from the same categories were presented one by one as a recognition test in which subjects answered whether or not they had seen each picture in the first part of the experiment. Recognition accuracy was not significantly higher than the 50% chance level for any picture category (t(16) = −0.436, pcorrected = 1, d = −0.106; t(16) = 1.022, pcorrected = 0.966, d = 0.248; t(16) = −10.661, pcorrected < 0.001, d = −2.586; t(16) = −0.623, pcorrected = 1, d = −0.151; t(16) = −1.578, pcorrected = 0.537, d = −0.383; Fig. 4D).
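The chance-level comparisons above are one-sample t tests of mean accuracy against 50%, with corrected p values (presumably Bonferroni across the five picture categories, given the values capped at 1). A minimal pure-Python sketch of the t statistic and Cohen's d, using hypothetical per-subject hit rates:

```python
# One-sample t test of recognition accuracy against the 50% chance level;
# the per-subject accuracy values used in testing are hypothetical.
import math
import statistics

def accuracy_vs_chance(scores, chance=0.5):
    """One-sample t statistic and Cohen's d against chance.
    (A Bonferroni-corrected p would be the raw p multiplied by the
    number of categories tested, capped at 1.)"""
    n = len(scores)
    mean = statistics.mean(scores)
    sd = statistics.stdev(scores)     # sample SD (n - 1 denominator)
    d = (mean - chance) / sd
    return d * math.sqrt(n), d        # t = (mean - chance) / (sd / sqrt(n))
```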

Figure 4.

A, Task sequence. In the first part, an independent sample of 17 participants was asked to report their experience and awareness of the briefly-presented pictures after each trial of the task (60 trials total, 12 per picture category). In the second part, half of the previously presented pictures and novel lure pictures from the same categories were presented one by one in a recognition test in which subjects answered whether they had seen each picture. B, Subjective awareness. 0: not aware; 25: color only; 50: emotional feeling; 75: partially-detailed recognition; and 100: perfect recognition. C, Comparison of reported and actual effects of the pictures on attention. In the erotic and horror conditions, participants were also asked how they felt their RT was influenced by the pictures (facilitated vs impeded). Their responses are shown alongside the actual attentional effects using different-colored shading; there was no significant correspondence between the two, meaning that participants were not aware of the effect the pictures had on their subsequent attention. D, Recognition memory accuracy. Participants were not able to distinguish the pictures they saw from lures in the same category, suggesting that while they were aware of a picture being flashed, they failed to process it in detail.

Discussion

Through an investigation of individual differences, the present study clarifies previous mixed findings on whether emotional processing induces attentional assistance or interference. We found that neurophysiological responses to emotional processing in the erotic and horror conditions were reflected in ERP and SCR measures. Furthermore, it was possible to predict individual differences in the direction of subsequent attentional effects from the ERP and SCR measures, particularly in response to erotic and horror pictures.

Salient emotion conditions

Significant increases in P300 and LPP were elicited, and were reliably predictive of subsequent attention, only in the erotic and horror conditions. Hajcak et al. (2010) pointed out that P300 amplitude is related to perceiving emotional significance and that LPP amplitude indicates emotional arousal and regulation. Moreover, the timing of the LPP is purported to reflect cognitive load. The higher LPP amplitude that we observed at 500–800 ms after picture presentation could therefore be interpreted as the processing of emotional information under an added cognitive demand, such as memory. Although this is a possibility, our supporting experiment, in which participants failed a subsequent recognition memory test, suggests that higher memory processes were not involved in the timeframe our task provided (see Fig. 4). Therefore, rather than explicit emotional reappraisal or contextual memory processing, the LPP may reflect rapid regulation of emotional arousal. According to this interpretation, the effects found in the erotic and horror conditions could reflect the perceptual significance of the stimuli (as indicated by the P300 response), followed by an automatic, rapid regulatory process before the attention task began (as indicated by the LPP response).

In past studies, explicit affective stimuli have been reported to elicit higher P300 or LPP amplitudes than neutral stimuli, regardless of the specific emotional category (Naumann et al., 1992; Lang et al., 1993, 1997, 1998; Bradley and Lang, 2000; Cuthbert et al., 2000; Bradley et al., 2001; Hajcak et al., 2010). Furthermore, the increase in P300 and LPP amplitudes after rapidly presented unpleasant stimuli was found to be weaker than the response to explicit stimuli (Ito and Cacioppo, 2000; Van Strien et al., 2010; Lin et al., 2018). Thus, our results suggest that only stimuli that are salient or potentially important may overcome the threshold for eliciting P300 or LPP. For instance, it is plausible that environmental stimuli related to mating opportunities or potentially harmful situations (as in the erotic and horror conditions) would be processed more quickly and effectively than others, albeit through different pathways of attentional modification.

In this sense, horror stimuli signifying a threat-related situational context might strongly elicit both cortico-limbic and sympathetic nervous system responses, reflected in the ERPs and SCR (Northoff et al., 2000; Carretié et al., 2004; Baumgartner et al., 2006). According to the Multiple Attention Gain Control (MAGiC) model (Pourtois et al., 2013), attentional processes can be enhanced indirectly by a mechanism of visual perception amplification triggered by emotion signals from the amygdala (which consequently enhances performance in tasks requiring visual attention). The erotic condition, on the other hand, may modulate attention through a slightly different pathway that directly activates cortical attentional processes; one study, for instance, found that visual erotic stimuli activated the dorsolateral prefrontal cortex, which is known to play a crucial role in selective attention, and that this activation was sustained even after the stimulus disappeared (Leon-Carrion et al., 2007). This interpretation may explain why only people who responded strongly to the erotic stimuli showed a reduction in performance in the subsequent attention task.

Individual differences in the direction of attentional effects

Past research has reported mixed findings on the direction of the effect of emotion on attention (Schmeichel, 2007; Prause et al., 2008; Bocanegra and Zeelenberg, 2009, 2011; Rossignol et al., 2012; Brosch et al., 2013; Domínguez-Borràs and Vuilleumier, 2013; Ortner et al., 2013; Pourtois et al., 2013). In our findings, attentional performance was facilitated in some participants and impeded in others in the erotic and horror conditions. This variation might be explained by differences in emotional response based on personal experiences and inclination toward the emotional stimuli (Zhang and Zhou, 2014; Matusz et al., 2015). For example, past studies reported that people who are afraid of snakes or spiders show a selectively higher LPP response to pictures of those particular threatening objects than people who are not (Kolassa et al., 2005; Miltner et al., 2005).

In our study, the LPP response to erotic pictures increased regardless of whether attention was facilitated or impeded, but P300 amplitude increased only for subjects who were impeded in the attention task. We interpret these results to mean that erotic pictures required emotion regulation in general, while they impeded subsequent attention only when people responded more strongly to them. In contrast, in the horror condition, both P300 and LPP amplitudes increased only for participants whose attention was facilitated. It is possible that, although unpleasant stimuli are distracting in general, for individuals who are particularly responsive to horror stimuli they can assist subsequent attention (i.e., getting scared may enhance visual perception; Phelps et al., 2006; Bocanegra and Zeelenberg, 2009; Mobbs et al., 2009; Pourtois et al., 2013).

Limitations

Our findings imply that there are distinct types of people for whom the attentional effects of emotional processing dissociate depending on the emotion category. In particular, their initial responses (P300) were highly indicative of the direction of attentional effects. However, we did not find a significant effect of participants' anxiety and depression scores, suggesting that more complex factors may contribute to individual sensitivity to specific types of emotional stimuli. Further investigations will delve into identifying these key factors to optimize our ability to predict and enhance cognitive performance at the individual level.

Furthermore, although we have speculated above on the distinct neural mechanisms in response to the horror and erotic stimuli, EEG cannot directly access signals from deep brain regions such as the amygdala, so we were unable to fully characterize the purported neural pathways underlying emotional processing and attention. A follow-up study using functional magnetic resonance imaging (fMRI) will make it possible to observe activity in deep brain structures in our task.

In conclusion, attentional effects of emotional processing may be unavoidable, as the fast and automatic processing of stimuli may have evolved as an adaptive mechanism for subsequent behaviors. The present study provided a potential explanation for the directional effects of emotion on attention from the perspective of individual differences in emotional processing itself. Remarkably, these individual trends differed according to the category of emotion and were classifiable based on electrophysiological responses preceding the attention task. These findings may contribute to the development of personalized alerting or cognitive enhancement systems that can not only optimize our performance in everyday life but also help prevent accidents and losses due to inattention.

Acknowledgments

Acknowledgements: We thank the members of the Developmental Cognitive Neuroscience Laboratory at Seoul National University, including Y. J. Rah, S. Park, and J. Lee, for their assistance in data analysis and Professors J. Jeong and S. H. Lee for their comments on an earlier version of this work.

Footnotes

  • The authors declare no competing financial interests.

  • This work was supported by the National Research Foundation of Korea Grant 2021M3A9E408078011 (to S.A.L.), Hyundai NGV Project: Neuro-Cognitive/Affective Modulation in Automobile Environments (S.A.L.), and The KAIST Center for Contemplative Science (S.A.L.).

This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International license, which permits unrestricted use, distribution and reproduction in any medium provided that the original work is properly attributed.

References

  1. Baumgartner T, Esslen M, Jäncke L (2006) From emotion perception to emotion experience: emotions evoked by pictures and classical music. Int J Psychophysiol 60:34–43. doi:10.1016/j.ijpsycho.2005.04.007 pmid:15993964
  2. Beck AT, Brown G, Epstein N, Steer RA (1988) An inventory for measuring clinical anxiety: psychometric properties. J Consult Clin Psychol 56:893–897. doi:10.1037/0022-006X.56.6.893
  3. Beck AT, Steer RA, Brown GK (1996) Manual for the Beck depression inventory-II. San Antonio: Psychological Corporation.
  4. Bocanegra BR, Zeelenberg R (2009) Emotion improves and impairs early vision. Psychol Sci 20:707–713. doi:10.1111/j.1467-9280.2009.02354.x pmid:19422624
  5. Bocanegra BR, Zeelenberg R (2011) Emotion-induced trade-offs in spatiotemporal vision. J Exp Psychol Gen 140:272–282. doi:10.1037/a0023188 pmid:21443382
  6. Bradley MM (2009) Natural selective attention: orienting and emotion. Psychophysiology 46:1–11. doi:10.1111/j.1469-8986.2008.00702.x pmid:18778317
  7. Bradley MM, Lang PJ (2000) Measuring emotion: behavior, feeling, and physiology. In: Cognitive neuroscience of emotion. Oxford: Oxford University Press.
  8. Bradley MM, Codispoti M, Cuthbert BN, Lang PJ (2001) Emotion and motivation I: defensive and appetitive reactions in picture processing. Emotion 1:276–298. doi:10.1037//1528-3542.1.3.276
  9. Brosch T, Scherer KR, Grandjean D, Sander D (2013) The impact of emotion on perception, attention, memory, and decision-making. Swiss Med Wkly 143:w13786. doi:10.4414/smw.2013.13786
  10. Carretié L, Hinojosa JA, Martín-Loeches M, Mercado F, Tapia M (2004) Automatic attention to emotional stimuli: neural correlates. Hum Brain Mapp 22:290–299. doi:10.1002/hbm.20037 pmid:15202107
  11. Codispoti M, Ferrari V, Bradley MM (2006) Repetitive picture processing: autonomic and cortical correlates. Brain Res 1068:213–220. doi:10.1016/j.brainres.2005.11.009 pmid:16403475
  12. Compton RJ (2003) The interface between emotion and attention: a review of evidence from psychology and neuroscience. Behav Cogn Neurosci Rev 2:115–129. doi:10.1177/1534582303255278 pmid:13678519
  13. Critchley HD, Elliott R, Mathias CJ, Dolan RJ (2000) Neural activity relating to generation and representation of galvanic skin conductance responses: a functional magnetic resonance imaging study. J Neurosci 20:3033–3040. doi:10.1523/JNEUROSCI.20-08-03033.2000
  14. Cuthbert BN, Schupp HT, Bradley M, McManis M, Lang PJ (1998) Probing affective pictures: attended startle and tone probes. Psychophysiology 35:344–347. doi:10.1017/s0048577298970536 pmid:9564755
  15. Cuthbert BN, Schupp HT, Bradley MM, Birbaumer N, Lang PJ (2000) Brain potentials in affective picture processing: covariation with autonomic arousal and affective report. Biol Psychol 52:95–111. doi:10.1016/S0301-0511(99)00044-7 pmid:10699350
  16. Delorme A, Makeig S (2004) EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J Neurosci Methods 134:9–21. doi:10.1016/j.jneumeth.2003.10.009 pmid:15102499
  17. Delorme A, Sejnowski T, Makeig S (2007) Enhanced detection of artifacts in EEG data using higher-order statistics and independent component analysis. Neuroimage 34:1443–1449. doi:10.1016/j.neuroimage.2006.11.004 pmid:17188898
  18. Domínguez-Borràs J, Vuilleumier P (2013) Affective biases in attention and perception. In: The Cambridge handbook of human affective neuroscience. Cambridge: Cambridge University Press.
  19. Eastwood JD, Smilek D, Merikle PM (2003) Negative facial expression captures attention and disrupts performance. Percept Psychophys 65:352–358. doi:10.3758/bf03194566 pmid:12785065
  20. Fan J, McCandliss BD, Sommer T, Raz A, Posner MI (2002) Testing the efficiency and independence of attentional networks. J Cogn Neurosci 14:340–347. doi:10.1162/089892902317361886 pmid:11970796
  21. Fan J, McCandliss BD, Fossella J, Flombaum JI, Posner MI (2005) The activation of attentional networks. Neuroimage 26:471–479. doi:10.1016/j.neuroimage.2005.02.004 pmid:15907304
  22. Foti D, Hajcak G (2008) Deconstructing reappraisal: descriptions preceding arousing pictures modulate the subsequent neural response. J Cogn Neurosci 20:977–988. doi:10.1162/jocn.2008.20066 pmid:18211235
  23. Friedman RS, Förster J (2010) Implicit affective cues and attentional tuning: an integrative review. Psychol Bull 136:875–893. doi:10.1037/a0020495 pmid:20804240
  24. Gohm CL, Clore GL (2000) Individual differences in emotional experience: mapping available scales to processes. Pers Soc Psychol Bull 26:679–686. doi:10.1177/0146167200268004
  25. Hajcak G, Foti D (2020) Significance?... Significance! Empirical, methodological, and theoretical connections between the late positive potential and P300 as neural responses to stimulus significance: an integrative review. Psychophysiology 57:e13570. doi:10.1111/psyp.13570
  26. Hajcak G, MacNamara A, Olvet DM (2010) Event-related potentials, emotion, and emotion regulation: an integrative review. Dev Neuropsychol 35:129–155. doi:10.1080/87565640903526504 pmid:20390599
  27. Hajcak G, MacNamara A, Foti D, Ferri J, Keil A (2013a) The dynamic allocation of attention to emotion: simultaneous and independent evidence from the late positive potential and steady state visual evoked potentials. Biol Psychol 92:447–455. doi:10.1016/j.biopsycho.2011.11.012 pmid:22155660
  28. Hajcak G, Weinberg A, MacNamara A, Foti D (2013b) ERPs and the study of emotion. In: Handbook of event-related potential components. New York: Oxford University Press.
  29. Halgren E (1992) Emotional neurophysiology of the amygdala within the context of human cognition. In: The amygdala: neurobiological aspects of emotion, memory, and mental dysfunction. New York: Wiley-Liss.
  30. Hamann S, Canli T (2004) Individual differences in emotion processing. Curr Opin Neurobiol 14:233–238. doi:10.1016/j.conb.2004.03.010 pmid:15082330
  31. Ito TA, Cacioppo JT (2000) Electrophysiological evidence of implicit and explicit categorization processes. J Exp Soc Psychol 36:660–676. doi:10.1006/jesp.2000.1430
  32. Johnston VS, Miller DR, Burleson MH (1986) Multiple P3s to emotional stimuli and their theoretical significance. Psychophysiology 23:684–694. doi:10.1111/j.1469-8986.1986.tb00694.x pmid:3823344
  33. Kolassa IT, Musial F, Mohr A, Trippe RH, Miltner WHR (2005) Electrophysiological correlates of threat processing in spider phobics. Psychophysiology 42:520–530. doi:10.1111/j.1469-8986.2005.00315.x pmid:16176374
  34. Krolak-Salmon P, Fischer C, Vighetto A, Mauguière F (2001) Processing of facial emotional expression: spatio-temporal data as assessed by scalp event-related potentials. Eur J Neurosci 13:987–994. doi:10.1046/j.0953-816x.2001.01454.x pmid:11264671
  35. Lang PJ, Greenwald MK, Bradley MM, Hamm AO (1993) Looking at pictures: affective, facial, visceral, and behavioral reactions. Psychophysiology 30:261–273. doi:10.1111/j.1469-8986.1993.tb03352.x pmid:8497555
  36. Lang PJ, Bradley MM, Cuthbert BN (1997) Motivated attention: affect, activation, and action. In: Attention and orienting: sensory and motivational processes. Mahwah: Lawrence Erlbaum Associates Publishers.
  37. Lang PJ, Bradley MM, Cuthbert BN (1998) Emotion, motivation, and anxiety: brain mechanisms and psychophysiology. Biol Psychiatry 44:1248–1263. doi:10.1016/S0006-3223(98)00275-3 pmid:9861468
  38. Lang PJ, Bradley MM, Cuthbert BN (2008) International affective picture system (IAPS): affective ratings of pictures and instruction manual. In: Technical report A-8. Gainesville: University of Florida.
  39. LeBlanc VR, McConnell MM, Monteiro SD (2015) Predictable chaos: a review of the effects of emotions on attention, memory and decision making. Adv Health Sci Educ Theory Pract 20:265–282. doi:10.1007/s10459-014-9516-6 pmid:24903583
  40. Leon-Carrion J, Martín-Rodríguez JF, Damas-López J, Pourrezai K, Izzetoglu K, Barroso y Martin JM, Dominguez-Morales MR (2007) Does dorsolateral prefrontal cortex (DLPFC) activation return to baseline when sexual stimuli cease? The role of DLPFC in visual sexual stimulation. Neurosci Lett 416:55–60. doi:10.1016/j.neulet.2007.01.058 pmid:17316990
  41. Lin HY, Liang JF, Jin H, Zhao DM (2018) Differential effects of uncertainty on LPP responses to emotional events during explicit and implicit anticipation. Int J Psychophysiol 129:41–51. doi:10.1016/j.ijpsycho.2018.04.012 pmid:29704580
  42. Lu BL, Zhang L, Kwok J (2011) Neural information processing: 18th International Conference, ICONIP 2011, Shanghai, China, November 13-17, 2011, proceedings, part III, Vol 7064. Berlin: Springer.
  43. Maffei A, Goertzen J, Jaspers-Fayer F, Kleffner K, Sessa P, Liotti M (2021) Spatiotemporal dynamics of covert versus overt processing of happy, fearful and sad facial expressions. Brain Sci 11:942. doi:10.3390/brainsci11070942
  44. Mardaga S, Laloyaux O, Hansenne M (2006) Personality traits modulate skin conductance response to emotional pictures: an investigation with Cloninger's model of personality. Pers Indiv Diff 40:1603–1614. doi:10.1016/j.paid.2005.12.006
  45. Matusz PJ, Traczyk J, Sobkow A, Strelau J (2015) Individual differences in emotional reactivity moderate the strength of the relationship between attentional and implicit-memory biases towards threat-related stimuli. J Cogn Psychol 27:715–724. doi:10.1080/20445911.2015.1027210
  46. Miltner WHR, Trippe RH, Krieschel S, Gutberlet I, Hecht H, Weiss T (2005) Event-related brain potentials and affective responses to threat in spider/snake-phobic and non-phobic subjects. Int J Psychophysiol 57:43–52. doi:10.1016/j.ijpsycho.2005.01.012 pmid:15896860
  47. Mobbs D, Marchant JL, Hassabis D, Seymour B, Tan G, Gray M, Petrovic P, Dolan RJ, Frith CD (2009) From threat to fear: the neural organization of defensive fear systems in humans. J Neurosci 29:12236–12243. doi:10.1523/JNEUROSCI.2378-09.2009 pmid:19793982
  48. Montagu JD, Coles EM (1966) Mechanism and measurement of the galvanic skin response. Psychol Bull 65:261. doi:10.1037/h0023204 pmid:5325891
  49. Moser JS, Most SB, Simons RF (2010) Increasing negative emotions by reappraisal enhances subsequent cognitive control: a combined behavioral and electrophysiological study. Cogn Affect Behav Neurosci 10:195–207. doi:10.3758/CABN.10.2.195 pmid:20498344
  50. Naumann E, Bartussek D, Diedrich O, Laufer ME (1992) Assessing cognitive and affective information-processing functions of the brain by means of the late positive complex of the event-related potential. J Psychophysiol 6:285–298.
  51. Noble WS (2006) What is a support vector machine? Nat Biotechnol 24:1565–1567. doi:10.1038/nbt1206-1565 pmid:17160063
  52. Northoff G, Richter A, Gessner M, Schlagenhauf F, Fell J, Baumgart F, Kaulisch T, Kötter R, Stephan KE, Leschinger A, Hagner T, Bargel B, Witzel T, Hinrichs H, Bogerts B, Scheich H, Heinze HJ (2000) Functional dissociation between medial and lateral prefrontal cortical spatiotemporal activation in negative and positive emotions: a combined fMRI/MEG study. Cereb Cortex 10:93–107. doi:10.1093/cercor/10.1.93 pmid:10639399
  53. Olofsson JK, Nordin S, Sequeira H, Polich J (2008) Affective picture processing: an integrative review of ERP findings. Biol Psychol 77:247–265. doi:10.1016/j.biopsycho.2007.11.006
  54. Ortner CNM, Zelazo PD, Anderson AK (2013) Effects of emotion regulation on concurrent attentional performance. Motiv Emot 37:346–354. doi:10.1007/s11031-012-9310-9
  55. Phelps EA, Ling S, Carrasco M (2006) Emotion facilitates perception and potentiates the perceptual benefits of attention. Psychol Sci 17:292–299. doi:10.1111/j.1467-9280.2006.01701.x pmid:16623685
  56. Pourtois G, Grandjean D, Sander D, Vuilleumier P (2004) Electrophysiological correlates of rapid spatial orienting towards fearful faces. Cereb Cortex 14:619–633. doi:10.1093/cercor/bhh023 pmid:15054077
  57. Pourtois G, Schettino A, Vuilleumier P (2013) Brain mechanisms for emotional influences on perception and attention: what is magic and what is not. Biol Psychol 92:492–512. doi:10.1016/j.biopsycho.2012.02.007 pmid:22373657
  58. Prause N, Janssen E, Hetrick WP (2008) Attention and emotional responses to sexual stimuli and their relationship to sexual desire. Arch Sex Behav 37:934–949. doi:10.1007/s10508-007-9236-6 pmid:17943435
  59. Rossignol M, Philippot P, Bissot C, Rigoulot S, Campanella S (2012) Electrophysiological correlates of enhanced perceptual processes and attentional capture by emotional faces in social anxiety. Brain Res 1460:50–62. doi:10.1016/j.brainres.2012.04.034 pmid:22592075
  60. Sabatinelli D, Lang PJ, Keil A, Bradley MM (2007) Emotional perception: correlation of functional MRI and event-related potentials. Cereb Cortex 17:1085–1091. doi:10.1093/cercor/bhl017 pmid:16769742
  61. Schmeichel BJ (2007) Attention control, memory updating, and emotion regulation temporarily reduce the capacity for executive control. J Exp Psychol Gen 136:241–255. doi:10.1037/0096-3445.136.2.241 pmid:17500649
  62. Schupp HT, Junghöfer M, Weike AI, Hamm AO (2003) Emotional facilitation of sensory processing in the visual cortex. Psychol Sci 14:7–13. doi:10.1111/1467-9280.01411 pmid:12564747
  63. Schupp HT, Flaisch T, Stockburger J, Junghöfer M (2006a) Emotion and attention: event-related brain potential studies. Prog Brain Res 156:31–51. doi:10.1016/S0079-6123(06)56002-9 pmid:17015073
  64. Schupp HT, Stockburger J, Codispoti M, Junghöfer M, Weike AI, Hamm AO (2006b) Stimulus novelty and emotion perception: the near absence of habituation in the visual cortex. Neuroreport 17:365–369. doi:10.1097/01.wnr.0000203355.88061.c6 pmid:16514360
  65. Stormark KM, Nordby H, Hugdahl K (1995) Attentional shifts to emotionally charged cues: behavioral and ERP data. Cogn Emot 9:507–523. doi:10.1080/02699939508408978
  66. Van Strien JW, De Sonneville LMJ, Franken IHA (2010) The late positive potential and explicit versus implicit processing of facial valence. Neuroreport 21:656–661. doi:10.1097/WNR.0b013e32833ab89e pmid:20453693
  67. Vuilleumier P (2005) How brains beware: neural mechanisms of emotional attention. Trends Cogn Sci 9:585–594. doi:10.1016/j.tics.2005.10.011 pmid:16289871
  68. Vuilleumier P, Driver J (2007) Modulation of visual processing by attention and emotion: windows on causal interactions between human brain regions. Philos Trans R Soc Lond B Biol Sci 362:837–855. doi:10.1098/rstb.2007.2092 pmid:17395574
  69. Wadlinger HA, Isaacowitz DM (2006) Positive mood broadens visual attention to positive stimuli. Motiv Emot 30:87–99. doi:10.1007/s11031-006-9021-1
  70. Wiens S, Sand A, Olofsson JK (2011) Nonemotional features suppress early and enhance late emotional electrocortical responses to negative pictures. Biol Psychol 86:83–89. doi:10.1016/j.biopsycho.2010.11.001 pmid:21093530
  71. Yen NS, Chen KH, Liu EH (2010) Emotional modulation of the late positive potential (LPP) generalizes to Chinese individuals. Int J Psychophysiol 75:319–325. doi:10.1016/j.ijpsycho.2009.12.014 pmid:20079772
  72. Zhang J, Zhou RL (2014) Individual differences in automatic emotion regulation affect the asymmetry of the LPP component. PLoS One 9:e88261. doi:10.1371/journal.pone.0088261
  73. Zhao L, Shi ZL, Zheng Q, Chu HD, Xu L, Hu FP (2018) Use of electroencephalography for the study of gain-loss asymmetry in intertemporal decision-making. Front Neurosci 12:984. doi:10.3389/fnins.2018.00984 pmid:30622455

Synthesis

Reviewing Editor: Alexander Soutschek, Ludwig-Maximilians-Universität München

Decisions are customarily a result of the Reviewing Editor and the peer reviewers coming together and discussing their recommendations until a consensus is reached. When revisions are invited, a fact-based synthesis statement explaining their decision and outlining what is needed to prepare a revision will be listed below. The following reviewer(s) agreed to reveal their identity: Ivan Grahek, Pawel Strozak.

The reviewers appreciate all the changes that were made by the authors and are convinced that the paper in its current form is significantly improved. It is worth mentioning that the authors have conducted additional experiments to support their initial findings and to deal with some of the concerns that were raised in the first round of reviews. Nonetheless, the reviewers point to some minor issues that still need to be addressed:

1. Abstract, lines 14-16: this is not very precise. The lack of large physiological response to erotic pictures in participants that were facilitated in this condition is only true for P300, but not for LPP (see also p. 17, lines 322-323). Please be more precise.

2. Still very little information is provided with regard to the measurement of skin conductance response (SCR). In the previous reviews, the reviewers had already asked for clarifying what exact measure of SCR activity was chosen, what was the time window for the measurement, how these data were analyzed and was there any decomposition of the SCR signal. In the revised version of the manuscript, only time window for SCR measurement is specified.

3. p. 9, line 170 - “(...) p-value of Emodality, emotion based on (...)”. It seems like there is a typographical error here.

4. Please explain supporting experiments in more detail. According to the extended data guidelines for eNeuro, the goal of including them “is not to report supplemental experiments or analyses that the authors are using to support the argument”. As these data are clearly aimed at supporting the results of the main experiment, they should be included in the main body of the manuscript, possibly in the “Results” section.

5. When reporting the results of recognition test of the additional experiment, please calculate and report the measure of subjects’ sensitivity (d prime). This is a better option than just the mere hit rate and can give us more insight into whether participants were really capable of discriminating old pictures from novel pictures.
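For reference, the sensitivity measure the reviewers request is typically computed as d′ = z(hit rate) − z(false-alarm rate) over the old/new responses. A minimal Python sketch; the log-linear (+0.5) correction for extreme proportions is one common choice assumed here for illustration, not necessarily the one the authors would adopt:

```python
# Sketch of the d' (d-prime) sensitivity measure from old/new recognition
# counts; the +0.5 log-linear correction avoids infinite z-scores at
# hit or false-alarm rates of exactly 0 or 1.
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return z(hit_rate) - z(fa_rate)
```

A d′ near 0 (equal hit and false-alarm rates) indicates no ability to discriminate old pictures from lures, which is what the chance-level accuracies in Figure 4D suggest.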

Author Response

We have clarified all points raised to the best of our ability. We hope you agree that these changes have made the manuscript suitable for publication. Thanks again for your consideration.

1. Abstract, lines 14-16: this is not very precise. The lack of a large physiological response to erotic pictures in participants who were facilitated in this condition is only true for the P300, but not for the LPP (see also p. 17, lines 322-323). Please be more precise.

Thank you for pointing this out. We have rewritten those lines (lines 12-18 in the revision) to be more precise:

We found that, in general, affective pictures induced higher electrophysiological responses than neutral pictures (P300 and late positive potential (LPP) in the erotic condition; P300, LPP, and skin conductance response (SCR) in the horror condition). In particular, individuals who showed a strong ERP response to the pictures were impeded (slower) in the erotic condition (P300 only) and facilitated (faster) in the horror condition (both P300 and LPP). Those who did not show a significant ERP or SCR response to the pictures were facilitated in the erotic condition and impeded in the horror condition.

2. Very little information is still provided with regard to the measurement of skin conductance response (SCR). In the previous round, the reviewers had already asked the authors to clarify what exact measure of SCR activity was chosen, what the time window for the measurement was, how these data were analyzed, and whether there was any decomposition of the SCR signal. In the revised version of the manuscript, only the time window for SCR measurement is specified.

Thank you for these constructive comments. Because each trial of our task was quite short (about 4 s) relative to the temporal dynamics of the SCR, we implemented a block design in which subjects performed 24 trials of the same emotion condition in each block. We have clarified the measurement and analysis of SCR in the revised Methods section (lines 146-155):

SCR was measured as galvanic skin response to detect the release of sweat due to a change in arousal state (Montagu & Coles, 1966), using the Gazepoint biometrics package and software, with a constant-voltage coupler (5 V) at a 60 Hz sampling rate. Participants placed their right index and middle fingers in the biometric hardware and were instructed to remove their fingers between task blocks to prevent the physiological response from saturating. To calculate the SCR for each picture, a high-pass FIR filter at 0.05 Hz was applied (MATLAB) to the entire dataset; then, the maximum change from the baseline (the average over the 500 ms fixation period before picture onset) to the test trial (from picture onset to 500 ms before the next trial) was extracted. The data were log-transformed to reduce skew and averaged for each block (for the multimodal classification) and for each emotional condition (for the remainder of the analyses).
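For concreteness, the pipeline described above can be sketched as follows. This is a minimal Python sketch (the original analysis was performed in MATLAB); the trial duration, filter length, and the exact log transform (log(1+x)) are assumptions, as the quoted text specifies only the 0.05 Hz high-pass cutoff, the 500 ms baseline, and the maximum-change measure:

```python
import numpy as np
from scipy import signal

FS = 60  # Hz, Gazepoint sampling rate

def scr_per_trial(raw, onsets, trial_len_s=3.5, baseline_s=0.5,
                  cutoff_hz=0.05, numtaps=2001):
    """Per-trial SCR amplitudes, following the quoted Methods paragraph.

    raw    : 1-D skin-conductance trace for a whole session
    onsets : sample indices of picture onsets
    trial_len_s, numtaps, and the log1p transform are illustrative
    assumptions, not values taken from the manuscript.
    """
    # High-pass FIR filter at 0.05 Hz, applied to the entire dataset
    taps = signal.firwin(numtaps, cutoff_hz, fs=FS, pass_zero=False)
    filtered = signal.filtfilt(taps, [1.0], raw)

    base_n = int(baseline_s * FS)
    trial_n = int(trial_len_s * FS)
    amps = []
    for t in onsets:
        baseline = filtered[t - base_n:t].mean()   # 500 ms pre-onset fixation
        window = filtered[t:t + trial_n]           # onset to end of trial window
        amps.append(np.max(window - baseline))     # maximum change from baseline
    # Log-transform to reduce the skew of SCR amplitudes
    return np.log1p(np.clip(amps, 0.0, None))
```

Block- and condition-level averages would then simply be means of these per-trial values over the corresponding trial groups.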

3. p. 9, line 170 - “(...) p-value of Emodality, emotion based on (...)”. It seems like there is a typographical error here.

We have corrected this typographical error in the revision.

4. Please explain supporting experiments in more detail. According to the extended data guidelines for eNeuro, the goal of including them “is not to report supplemental experiments or analyses that the authors are using to support the argument”. As these data are clearly aimed at supporting the results of the main experiment, they should be included in the main body of the manuscript, possibly in the “Results” section.

We have now moved Extended Data 1 to the Results section (new Figure 4 and lines 292-307). The replication results in Extended Data 2 have been moved, as text, to the Results section (lines 267-275).

5. When reporting the results of recognition test of the additional experiment, please calculate and report the measure of subjects’ sensitivity (d prime). This is a better option than just the mere hit rate and can give us more insight into whether participants were really capable of discriminating old pictures from novel pictures.

Thank you. We have included effect size values in the revision.

Electrophysiological Responses to Rapidly-Presented Affective Stimuli Predict Individual Differences in Subsequent Attention
Ha Neul Song, Sewon Oh, Sang Ah Lee
eNeuro 3 December 2021, 9 (1) ENEURO.0285-21.2021; DOI: 10.1523/ENEURO.0285-21.2021
