Dynamic Facial Expressions Prime the Processing of Emotional Prosody

Author(s): Patricia Garrido-Vásquez, Marc D. Pell, Silke Paulmann, Sonja A. Kotz

2021 · Vol 5 (3) · pp. 13
Author(s): Heting Wang, Vidya Gaddy, James Ross Beveridge, Francisco R. Ortega

The role of affect has long been studied in human–computer interaction. Unlike previous studies that focused on the seven basic emotions, this work introduces an avatar named Diana, who expresses a higher level of emotional intelligence. To adapt to users' varying affects during interaction, Diana simulates emotions with dynamic facial expressions. When two people collaborated to build blocks, their affects were recognized and labeled using the Affdex SDK, and a descriptive analysis was provided. When participants then collaborated with Diana, their subjective responses were collected and their task completion times were recorded. Three modes of Diana were compared: a flat-faced Diana, a Diana that used mimicry facial expressions, and a Diana that used emotionally responsive facial expressions. Twenty-one responses were collected through a five-point Likert scale questionnaire and the NASA TLX. Questionnaire results did not differ statistically across the three modes. However, the emotionally responsive Diana obtained more positive responses, and participants spent the longest time with the mimicry Diana. In post-study comments, most participants perceived the facial expressions on Diana's face as natural, while four mentioned discomfort caused by the Uncanny Valley effect.
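The abstract reports that Likert-scale questionnaire results did not differ statistically across the three Diana modes. As a purely illustrative sketch (not the authors' actual analysis pipeline), a non-parametric comparison of ordinal ratings across three independent conditions could look like the following; the rating values and the choice of a Kruskal-Wallis test are assumptions.

```python
# Illustrative sketch only: comparing hypothetical 5-point Likert ratings across
# the three Diana modes (flat-faced, mimicry, emotionally responsive).
# The data below are invented placeholders, not the study's data.
from scipy import stats

flat       = [3, 4, 3, 2, 4, 3, 3]
mimicry    = [4, 3, 4, 4, 3, 4, 3]
responsive = [4, 5, 4, 4, 5, 4, 4]

# Kruskal-Wallis test: suited to ordinal data from 3+ independent groups
h_stat, p_value = stats.kruskal(flat, mimicry, responsive)
print(f"H = {h_stat:.2f}, p = {p_value:.3f}")

# A p-value above .05 would mirror the reported finding that questionnaire
# results were not statistically different between modes.
```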


2021 · Vol 151 · pp. 107734
Author(s): Katia M. Harlé, Alan N. Simmons, Jessica Bomyea, Andrea D. Spadoni, Charles T. Taylor

2011 · Vol 24 (2) · pp. 149-163
Author(s): Marie Arsalidou, Drew Morris, Margot J. Taylor

2017 · Vol 354 · pp. 64-72
Author(s): Emmanuèle Ambert-Dahan, Anne-Lise Giraud, Halima Mecheri, Olivier Sterkers, Isabelle Mosnier, ...

2018 · Vol 115 (43) · pp. E10013-E10021
Author(s): Chaona Chen, Carlos Crivelli, Oliver G. B. Garrod, Philippe G. Schyns, José-Miguel Fernández-Dols, ...

Real-world studies show that the facial expressions produced during pain and orgasm—two different and intense affective experiences—are virtually indistinguishable. However, this finding is counterintuitive, because facial expressions are widely considered to be a powerful tool for social interaction. Consequently, debate continues as to whether the facial expressions of these extreme positive and negative affective states serve a communicative function. Here, we address this debate from a novel angle by modeling the mental representations of dynamic facial expressions of pain and orgasm in 40 observers in each of two cultures (Western, East Asian) using a data-driven method. Using a complementary approach of machine learning, an information-theoretic analysis, and a human perceptual discrimination task, we show that mental representations of pain and orgasm are physically and perceptually distinct in each culture. Cross-cultural comparisons also revealed that pain is represented by similar face movements across cultures, whereas orgasm showed distinct cultural accents. Together, our data show that mental representations of the facial expressions of pain and orgasm are distinct, which questions their nondiagnosticity and instead suggests they could be used for communicative purposes. Our results also highlight the potential role of cultural and perceptual factors in shaping the mental representation of these facial expressions. We discuss new research directions to further explore their relationship to the production of facial expressions.
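The abstract mentions a machine-learning analysis showing that the modeled representations of pain and orgasm are physically distinct. The sketch below is a minimal illustration of such a classification step, not the paper's actual method: the feature construction (random stand-ins for face-movement descriptors), the logistic-regression classifier, and the cross-validation setup are all assumptions.

```python
# Illustrative sketch only: classifying dynamic facial-expression models as
# "pain" vs "orgasm" from face-movement features. The feature vectors here are
# hypothetical stand-ins (e.g., per-Action-Unit amplitude summaries).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical data: 80 modeled expressions (40 per category), 42 features each.
X_pain   = rng.normal(loc=0.0, scale=1.0, size=(40, 42))
X_orgasm = rng.normal(loc=0.5, scale=1.0, size=(40, 42))
X = np.vstack([X_pain, X_orgasm])
y = np.array([0] * 40 + [1] * 40)  # 0 = pain, 1 = orgasm

# Cross-validated accuracy well above chance (0.5) would indicate that the two
# sets of representations are physically separable in this feature space.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```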


2013 · Vol 27 (8) · pp. 1486-1494
Author(s): Guillermo Recio, Annekathrin Schacht, Werner Sommer
