Emotion-modulated recall: Congruency effects of nonverbal facial and vocal cues on semantic recall

2021
Author(s):
Arianne Constance Herrera-Bennett
Shermain Puah
Lisa Hasenbein
Dirk Wildgruber

The current study investigated whether automatic integration of crossmodal stimuli (i.e. facial emotions and emotional prosody) facilitated or impaired the intake and retention of unattended verbal content. The study borrowed from previous bimodal integration designs and included a two-alternative forced-choice (2AFC) task, where subjects were instructed to identify the emotion of a face (as either ‘angry’ or ‘happy’) while ignoring a concurrently presented sentence (spoken in an angry, happy, or neutral prosody), after which a surprise recall was administered to investigate effects on semantic content retention. While bimodal integration effects were replicated (i.e. faster and more accurate emotion identification under congruent conditions), congruency effects were not found for semantic recall. Overall, semantic recall was better for trials with emotional (vs. neutral) faces, and worse in trials with happy (vs. angry or neutral) prosody. Taken together, our findings suggest that when individuals focus their attention on evaluation of facial expressions, they implicitly integrate nonverbal emotional vocal cues (i.e. hedonic valence or emotional tone of accompanying sentences), and devote less attention to their semantic content. While the impairing effect of happy prosody on recall may indicate an emotional interference effect, more research is required to uncover potential prosody-specific effects. All supplemental online materials can be found on OSF (https://osf.io/am9p2/).
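The congruency analysis described above can be sketched as a simple aggregation over trial records. The data, field names, and values below are illustrative assumptions, not the study's actual materials:

```python
from statistics import mean

# Hypothetical trial records from a 2AFC emotion-identification task:
# the face emotion, the prosody of the ignored sentence, reaction time
# in seconds, and whether the emotion response was correct.
trials = [
    {"face": "happy", "prosody": "happy",   "rt": 0.62, "correct": True},
    {"face": "happy", "prosody": "angry",   "rt": 0.71, "correct": True},
    {"face": "angry", "prosody": "angry",   "rt": 0.60, "correct": True},
    {"face": "angry", "prosody": "happy",   "rt": 0.74, "correct": False},
    {"face": "happy", "prosody": "neutral", "rt": 0.68, "correct": True},
    {"face": "angry", "prosody": "neutral", "rt": 0.69, "correct": True},
]

def congruency(trial):
    """Label a trial by whether face and prosody emotions match."""
    if trial["prosody"] == "neutral":
        return "neutral"
    return "congruent" if trial["face"] == trial["prosody"] else "incongruent"

def summarize(trials):
    """Mean RT and accuracy per congruency condition."""
    out = {}
    for cond in ("congruent", "incongruent", "neutral"):
        subset = [t for t in trials if congruency(t) == cond]
        out[cond] = {
            "mean_rt": mean(t["rt"] for t in subset),
            "accuracy": mean(1.0 if t["correct"] else 0.0 for t in subset),
        }
    return out

stats = summarize(trials)
# A bimodal-integration effect appears as faster, more accurate responses
# under congruent than incongruent conditions (positive effect_rt).
effect_rt = stats["incongruent"]["mean_rt"] - stats["congruent"]["mean_rt"]
```

In the real study this per-condition summary would feed a repeated-measures test across subjects rather than being read off raw trial means.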

Neurology
2019
Vol 94 (10)
pp. e1013-e1020
Author(s):
Shannon M. Sheppard
Lynsey M. Keator
Bonnie L. Breining
Amy E. Wright
Sadhvi Saxena
et al.

Objective: To determine whether right ventral stream and limbic structures (including the posterior superior temporal gyrus [STG], temporal pole, inferior frontal gyrus pars orbitalis, orbitofrontal cortex, amygdala, anterior cingulate gyrus, and sagittal stratum) are implicated in emotional prosody identification. Methods: Patients with MRI scans within 48 hours of unilateral right hemisphere ischemic stroke were enrolled. Participants were presented with 24 sentences with neutral semantic content spoken with happy, sad, angry, afraid, surprised, or bored prosody, and chose which emotion the speaker was feeling based on tone of voice. Multivariable linear regression was used to identify individual predictors of emotional prosody identification accuracy from a model including percent damage to the proposed right hemisphere structures, age, education, and lesion volume, across all emotions (overall emotion identification) and for the 6 individual emotions. Patterns of recovery were also examined at the chronic stage. Results: The overall emotion identification model was significant (adjusted r2 = 0.52; p = 0.043); greater damage to right posterior STG (p = 0.038) and older age (p = 0.009) were individual predictors of impairment. The model for recognition of fear was also significant (adjusted r2 = 0.77; p = 0.002), with greater damage to right amygdala (p = 0.047), older age (p < 0.001), and less education (p = 0.005) as individual predictors. Over half of patients with chronic stroke had residual impairments. Conclusions: The right posterior STG, within the right hemisphere ventral stream, is critical for emotion identification in speech. Patients with stroke affecting this area should be assessed for emotion identification impairment.
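A minimal sketch of the kind of multivariable linear regression reported above, fit by ordinary least squares on simulated patient data. All predictor names, ranges, and coefficients here are illustrative assumptions, not the study's data or model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-patient predictors (values simulated for illustration):
# percent damage to right posterior STG, age, years of education,
# and total lesion volume; outcome is emotion-identification accuracy.
n = 30
pstg_damage = rng.uniform(0, 60, n)   # % damage to right posterior STG
age = rng.uniform(45, 85, n)          # years
education = rng.uniform(8, 20, n)     # years
lesion_vol = rng.uniform(1, 120, n)   # cm^3

# Simulated outcome: accuracy declines with more pSTG damage and older age,
# mirroring the direction of effects the abstract reports.
accuracy = 0.9 - 0.004 * pstg_damage - 0.003 * age + rng.normal(0, 0.03, n)

# Design matrix with an intercept column, fit by ordinary least squares.
X = np.column_stack([np.ones(n), pstg_damage, age, education, lesion_vol])
beta, *_ = np.linalg.lstsq(X, accuracy, rcond=None)

# Adjusted R^2, the model-fit statistic quoted in the abstract.
resid = accuracy - X @ beta
ss_res = resid @ resid
ss_tot = ((accuracy - accuracy.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - X.shape[1])
```

In practice one would use a statistics package that also reports per-coefficient p-values; the point here is only the structure of the model: one outcome, several continuous lesion and demographic predictors.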


2008
Vol 39 (6)
pp. 927-938
Author(s):
D. R. Bach
K. Buxtorf
D. Grandjean
W. K. Strik

Background: Identification of emotional facial expression and emotional prosody (i.e. speech melody) is often impaired in schizophrenia. For facial emotion identification, a recent study suggested that the relative deficit in schizophrenia is enhanced when the presented emotion is easier to recognize. It is unclear whether this effect is specific to face processing or part of a more general emotion recognition deficit. Method: We used clarity-graded emotional prosodic stimuli without semantic content, and tested 25 in-patients with paranoid schizophrenia, 25 healthy control participants and 25 depressive in-patients on emotional prosody identification. Facial expression identification was used as a control task. Results: Patients with paranoid schizophrenia performed worse than both control groups in identifying emotional prosody, with no specific deficit in any individual emotion category. This deficit was present for high-clarity but not for low-clarity stimuli. Performance in the facial control task was also impaired, and identification of emotional facial expression was a better predictor of emotional prosody identification than illness-related factors; among the latter, negative symptoms emerged as the best predictor. Conclusions: This study suggests a general deficit in identifying high-clarity emotional cues. This finding is in line with the hypothesis that schizophrenia is characterized by high noise in internal representations and by increased fluctuations in cerebral networks.


1989
Vol 57 (1)
pp. 100-108
Author(s):
Sandra E. Duclos
James D. Laird
Eric Schneider
Melissa Sexter
et al.

2020
Vol 10 (3)
pp. 179-194
Author(s):
Rachel L Moline
Kaytlin L Constantin
Megan N Gauthier
Deborah M Powell
C Meghan McMurtry

Aim: Fully illuminating the mechanisms relating parent behaviors to child pain requires examining both verbal and nonverbal communication. We conducted a multimethod investigation into parent nonverbal communication and physiology, and investigated the psychometric properties of the Scheme for Understanding Parent Emotive Responses Scale for assessing the parent nonverbal behaviors that accompany reassurance and distraction. Materials & methods: 23 children (7–12 years of age) completed the cold pressor task with their parent (predominantly mothers). Parent heart rate and heart rate variability were monitored and assessed. The Scheme for Understanding Parent Emotive Responses Scale coding of parent nonverbal behaviors (i.e., vocal cues, facial expressions, posture) was used to detect levels of fear, warmth, disengagement and humor. Results & conclusion: Preliminary evidence for the psychometric properties of the scale is offered. Parent reassurance was associated with more fear, less warmth and less humor compared with distraction.


PeerJ
2018
Vol 6
pp. e5278
Author(s):
Ana R. Gonçalves
Carina Fernandes
Rita Pasion
Fernando Ferreira-Santos
Fernando Barbosa
et al.

Background: Emotion identification is a fundamental component of social cognition. Although it is well established that a general cognitive decline occurs with advancing age, the effects of age on emotion identification are still unclear. A meta-analysis by Ruffman and colleagues (2008) explored this issue, but much research has been published since then, reporting inconsistent findings. Methods: To examine age differences in the identification of facial expressions of emotion, we conducted a meta-analysis of 24 empirical studies (N = 1,033 older adults, N = 1,135 younger adults) published after 2008. Additionally, a meta-regression analysis was conducted to identify potential moderators. Results: Older adults identify facial expressions of anger, sadness, fear, surprise, and happiness less accurately than younger adults, strengthening the results obtained by Ruffman et al. (2008). However, meta-regression analyses indicate that effect sizes are moderated by sample characteristics and stimulus features. Importantly, the estimated effect size for the identification of fear and disgust increased with larger differences in the number of years of formal education between the two groups. Discussion: We discuss several factors that might explain the age-related differences in emotion identification and suggest how brain changes may account for the observed pattern. Furthermore, moderator effects are interpreted and discussed.
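The effect-size pooling behind such a meta-analysis can be sketched as follows. The per-study effect sizes and variances below are invented for illustration, and the DerSimonian-Laird random-effects estimator shown is a standard textbook choice, not necessarily the one these authors used:

```python
import numpy as np

# Hypothetical per-study standardized mean differences (older minus
# younger accuracy, so negative = worse in older adults) and their
# sampling variances; all values are illustrative only.
g = np.array([-0.45, -0.30, -0.62, -0.18, -0.51])
v = np.array([0.040, 0.055, 0.035, 0.060, 0.045])

def dersimonian_laird(g, v):
    """Random-effects pooled estimate via the DerSimonian-Laird method."""
    w = 1.0 / v                              # fixed-effect weights
    g_fixed = (w * g).sum() / w.sum()
    q = (w * (g - g_fixed) ** 2).sum()       # Cochran's Q heterogeneity
    df = len(g) - 1
    c = w.sum() - (w ** 2).sum() / w.sum()
    tau2 = max(0.0, (q - df) / c)            # between-study variance
    w_re = 1.0 / (v + tau2)                  # random-effects weights
    pooled = (w_re * g).sum() / w_re.sum()
    se = (1.0 / w_re.sum()) ** 0.5
    return pooled, se, tau2

pooled, se, tau2 = dersimonian_laird(g, v)
```

A meta-regression then adds study-level moderators (e.g. the group difference in years of education) as predictors of the per-study effect sizes, weighted by the same inverse-variance weights.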


Author(s):
Larysa Gorbolis

The article, based on the play «At the Beginning and At the End of Times» by the contemporary Ukrainian playwright Pavlo Arie and its staging at the Lviv Lesia Ukrainka Academic Drama Theatre, directed by Oleksiy Kravchuk, demonstrates the value of an intermedial approach for examining the features of architectonics and genre originality, and the role of the title, dedication, stage directions, and chronotope in the literary and stage modelling of the heroes, the reflection of the works' general concept, and the visualization of their ideological and thematic direction. It examines the significance of stage directions, details, scenery, music, and video for disclosing the conflict of the work and reflecting the characters' nature and mental state. The vertical and horizontal coordinates of the characters' material and spiritual life are described and analysed. The semantic content of the work and the author's and director's accents are oriented not only toward the person but also toward a culture that is perishing. The heroes of the play find themselves in difficult situations of choice, searching for themselves and rethinking their values. In working on the images and problems, P. Arie and O. Kravchuk skilfully conveyed the volatility of mood and the complex body of the heroes' experiences, appealing to the audience's feelings. With the help of music, details, colours, sounds, emotions, facial expressions, and movements, the director, artist, and actors give emphasis to the characters at all levels of stage realization.


2021
Author(s):
Jennifer Quinde Zlibut
Anabil Munshi
Gautam Biswas
Carissa Cascio

Background: It is unclear whether atypical patterns of facial expression production metrics in autism reflect the dynamic and nuanced nature of facial expressions or a true diagnostic difference. Further, the heterogeneity observed across autism symptomatology suggests a need for more adaptive and personalized social skills programs. For example, a better understanding of the different expressiveness profiles within the autistic population, and of how they differ from those of neurotypical adults, would help in developing systems that train facial expression production and reception. Methods: We used automated facial coding and an unsupervised clustering approach to limit the inter-individual variability in facial expression production that may have obscured group differences in previous studies, allowing an "apples-to-apples" comparison between autistic and neurotypical adults. Specifically, we applied k-means clustering to identify subtypes of facial expressiveness in an autism group (N=27) and a neurotypical control group (N=57) separately. The two most stable clusters from these analyses were then further characterized and compared on the basis of their expressiveness and emotive congruence to emotionally charged stimuli. Results: Our main finding was that autistic adults show heightened spontaneous facial expressions in response to negative emotional images. The group effect did not extend to positive emotional images, and we did not find evidence for greater incongruous (i.e., inappropriate) facial expressions in autism. Conclusion: These findings build on previous work suggesting valence-specific effects of autism on emotional empathy and suggest the need for intervention programs to focus on social skills in the context of both negative and positive emotions.
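The k-means subtyping step can be sketched on simulated data standing in for the automated facial-coding metrics. This is a from-scratch Lloyd's algorithm with a deterministic farthest-point initialization, shown only to illustrate the technique; the study's actual features, preprocessing, and implementation are not specified here:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical feature matrix: one row per participant, columns are
# summary expressiveness metrics (e.g., mean coded-expression activation
# per stimulus category); two simulated expressiveness profiles.
low = rng.normal(0.2, 0.05, size=(20, 4))   # low-expressiveness profile
high = rng.normal(0.7, 0.05, size=(20, 4))  # high-expressiveness profile
X = np.vstack([low, high])

def kmeans(X, k, iters=100):
    """Minimal Lloyd's algorithm: returns labels and cluster centers."""
    # Deterministic farthest-point init: start from the first row, then
    # repeatedly add the point farthest from all chosen centers.
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        # Assign each point to its nearest center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute centers; keep old center if a cluster empties.
        new = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

labels, centers = kmeans(X, k=2)
```

Cluster stability (used in the study to pick the "two most stable clusters") would then be assessed by rerunning the clustering over resampled data and comparing partitions.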


Brain
2019
Vol 142 (9)
pp. 2873-2887
Author(s):
Charles R Marshall
Christopher J D Hardy
Lucy L Russell
Rebecca L Bond
Harri Sivasathiaseelan
et al.

Abstract Impaired processing of emotional signals is a core feature of frontotemporal dementia syndromes, but the underlying neural mechanisms have proved challenging to characterize and measure. Progress in this field may depend on detecting functional changes in the working brain, and disentangling components of emotion processing that include sensory decoding, emotion categorization and emotional contagion. We addressed this using functional MRI of naturalistic, dynamic facial emotion processing with concurrent indices of autonomic arousal, in a cohort of patients representing all major frontotemporal dementia syndromes relative to healthy age-matched individuals. Seventeen patients with behavioural variant frontotemporal dementia [four female; mean (standard deviation) age 64.8 (6.8) years], 12 with semantic variant primary progressive aphasia [four female; 66.9 (7.0) years], nine with non-fluent variant primary progressive aphasia [five female; 67.4 (8.1) years] and 22 healthy controls [12 female; 68.6 (6.8) years] passively viewed videos of universal facial expressions during functional MRI acquisition, with simultaneous heart rate and pupillometric recordings; emotion identification accuracy was assessed in a post-scan behavioural task. Relative to healthy controls, patient groups showed significant impairments (analysis of variance models, all P < 0.05) of facial emotion identification (all syndromes) and cardiac (all syndromes) and pupillary (non-fluent variant only) reactivity. Group-level functional neuroanatomical changes were assessed using statistical parametric mapping, thresholded at P < 0.05 after correction for multiple comparisons over the whole brain or within pre-specified regions of interest. In response to viewing facial expressions, all participant groups showed comparable activation of primary visual cortex while patient groups showed differential hypo-activation of fusiform and posterior temporo-occipital junctional cortices. 
Bi-hemispheric, syndrome-specific activations predicting facial emotion identification performance were identified (behavioural variant, anterior insula and caudate; semantic variant, anterior temporal cortex; non-fluent variant, frontal operculum). The semantic and non-fluent variant groups additionally showed complex profiles of central parasympathetic and sympathetic autonomic involvement that overlapped signatures of emotional visual and categorization processing and extended (in the non-fluent group) to brainstem effector pathways. These findings open a window on the functional cerebral mechanisms underpinning complex socio-emotional phenotypes of frontotemporal dementia, with implications for novel physiological biomarker development.


2017
Vol 46 (7)
pp. 1026-1049
Author(s):
Aurélie De Waele
An-Sofie Claeys
Verolien Cauberghe

Research on crisis communication has mainly focused on verbal aspects of organizational responses. However, the nonverbal cues of the organizational spokesperson communicating about the crisis may also influence stakeholders’ perceptions. This study examines the impact of two vocal cues, voice pitch and speech rate. In addition, the study examines how these cues affect perceptions of organizations depending on the message’s verbal content. A 2 (voice pitch: low vs. high) × 2 (speech rate: slow vs. fast) × 2 (crisis response strategy: deny vs. rebuild) between-subjects experimental design was used. Results show that voice pitch and speech rate affected postcrisis reputation. However, these vocal cues affected perceptions only when the organization applied a rebuild strategy (i.e., apology) and not in the case of a deny strategy. This interaction between verbal and vocal cues was partly mediated by vocal attractiveness.

