Speakers' emotional facial expressions modulate subsequent multi-modal language processing: ERP evidence
Sprach- und literaturwissenschaftliche Fakultät (Faculty of Language, Literature and Humanities)
We investigated the brain responses associated with integrating a speaker's facial emotion into situations in which the speaker verbally describes an emotional event. In two EEG experiments, young adult participants were primed with a happy or sad speaker face. The target consisted of an emotionally positive or negative IAPS photo accompanied by a spoken emotional sentence describing that photo. The speaker's face either matched or mismatched the valence of the event-sentence. ERPs elicited by the adverb conveying sentence valence showed significantly larger negative mean amplitudes in the EPN time window, and descriptively in the N400 time window, for positive speaker faces paired with negative event-sentences (vs. negatively matching prime-target trials). Our results suggest that young adults may allocate more processing resources to attending to and processing negative (vs. positive) emotional situations when primed with a positive (vs. negative) speaker face, but not vice versa. A post-hoc analysis indicated that this interaction was driven by female participants. We extend previous eye-tracking findings with insights into the timing of the functional brain correlates implicated in integrating the valence of a speaker's face into a multi-modal emotional situation.